Every Asian American has been asked this question. A computer gives the best answer.
By Jeff Guo
Oct 21 2016
To be Asian in America is to be quizzed, constantly, about your ethnicity. What are you? Where are you from? No, but where are your parents from?
Nowadays, such questions are more awkward than ominous. But there was a time when this was a national obsession of sorts — when splashy magazines like Life published guides to help readers distinguish between the “parchment yellow” Chinese with their “finely-bridged” noses, and the “earthy yellow” Japanese, with their “massively boned faces.”
Recently, computer scientists at the University of Rochester tried to teach an algorithm to tell the difference between Chinese, Japanese and Korean faces. They wanted to explore how advancements in artificial intelligence have made it easier for computers to interpret pictures in sophisticated ways. But, intentionally or not, their research taps into the uncomfortable history of how Asians have struggled to fit into American life.
The scientists were inspired by a quiz created by Japanese American web designer Dyske Suematsu. Fifteen years ago, Suematsu decided, half-jokingly, to investigate the stereotype that Asians all look alike. He threw a party in New York City and invited Asian friends. He put their portraits on the Internet and asked strangers to guess their ethnicity.
The website was a huge hit, quickly becoming one of the web’s first viral sensations. Suematsu says that millions have registered and taken the test. On average, people identify 7 out of 18 photos correctly — an accuracy rate of about 39 percent. That’s barely better than pure guessing, which would yield an accuracy rate of 33 percent, on average.
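The quiz figures quoted above can be checked with a few lines of arithmetic:

```python
# Arithmetic behind the quiz statistics: 7 correct out of 18 photos,
# versus random guessing among three ethnicities.
correct, total = 7, 18
observed = correct / total   # about 0.389, i.e. "about 39 percent"
chance = 1 / 3               # pure guessing: one in three, about 33 percent
print(f"observed {observed:.1%} vs. chance {chance:.1%}")
# prints: observed 38.9% vs. chance 33.3%
```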
“This is a challenging task even for humans,” said Jiebo Luo, a professor of computer science at the University of Rochester. “I asked some of my students to take the test and they all failed horribly — even though all of them were Asian.”
Luo and his students suspected that a trained artificial intelligence might be able to perform as well, or even better. Recently, they collected hundreds of thousands of pictures of East Asian faces and fed them through an algorithm to figure out just what made Chinese, Japanese and Korean people look different. In a draft report detailing their results, they provide samples of the pictures fed to the computer.
The scientists expected the task to be difficult, so they were surprised to discover that the computer could achieve accuracy rates of over 75 percent — far better than humans performed on Suematsu’s quiz. The computer’s advantage is that it could draw on a vast library of faces, Luo explained. “Our machine has seen far more examples than any living person,” he said.
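The point about example count can be illustrated with a toy experiment. The sketch below is not the Rochester team’s method — their system and features are not described here — but a minimal stand-in: a nearest-centroid classifier on three synthetic, overlapping clusters of points, where each class average becomes a more reliable template as the amount of training data grows.

```python
import numpy as np

# Toy illustration (not the actual study): three overlapping 2-D clusters
# stand in for three hard-to-distinguish classes. A nearest-centroid
# classifier improves as it sees more training examples, because its
# estimate of each class's "average face" gets more reliable.
rng = np.random.default_rng(0)

def make_data(n_per_class):
    # Class means are close together, so the clusters overlap heavily.
    means = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
    X = np.vstack([rng.normal(m, 1.0, size=(n_per_class, 2)) for m in means])
    y = np.repeat(np.arange(3), n_per_class)
    return X, y

def nearest_centroid_accuracy(n_train_per_class):
    Xtr, ytr = make_data(n_train_per_class)
    Xte, yte = make_data(500)  # fixed-size held-out test set
    # Average the training points of each class to get its centroid.
    centroids = np.array([Xtr[ytr == k].mean(axis=0) for k in range(3)])
    # Assign each test point to the nearest centroid.
    dists = np.linalg.norm(Xte[:, None, :] - centroids[None, :, :], axis=2)
    return (dists.argmin(axis=1) == yte).mean()

for n in (5, 5000):
    print(f"{n:>5} examples per class: accuracy {nearest_centroid_accuracy(n):.1%}")
```

With thousands of examples per class, accuracy comfortably beats the one-in-three chance rate even though the clusters overlap; with only a handful, the centroids are noisy and performance is shakier — the same intuition behind Luo’s remark about the machine having seen more examples than any living person.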
Lack of experience is a major reason humans sometimes struggle to tell foreigners apart. Psychologists call it the “cross-race” effect: We are much better at distinguishing members of our own race or ethnicity than members of other races or ethnicities.
Studies suggest that with training, people can improve at recognizing the faces of people from different ethnic backgrounds. As Luo and his colleagues have demonstrated, computers might even be better than we are at noticing some of these subtle distinctions.
It’s not all about physical proportions. When the scientists went to investigate how the computer was making its decisions, they discovered an interesting pattern. Many of the cues that stood out to the algorithm were cultural features, like hairstyles or glasses or facial expressions. This makes sense, since the people of China, Japan and Korea have somewhat shared ancestries, but distinct senses of fashion.