Algorithms can be racist, too. Specifically, Yelp’s, which, as the Tampa Bay Times reports, is spreading harmful Asian stereotypes by listing Korean and Chinese restaurants under the search terms “dog meat” and “cat meat.”
When a user clicks the search term “dog meat,” which may appear when looking for something as innocuous as a pet groomer, the app reveals a list of Korean restaurants. And when the search phrase is changed to “cat meat,” a list of Chinese restaurants appears. The Tampa Bay Times performed these searches in a dozen major cities with nearly identical results.
A spokesperson for Yelp told the publication that the company’s searches rely on “real-world consumer user data and human behavior patterns.” The Asian stereotypes being perpetuated by the app have been learned by Yelp’s algorithm, thanks to keywords in Yelp’s user-submitted restaurant reviews, users’ previous searches, and behaviors on the app.
“To be clear, no human programed these results or matches and we are taking prompt action to remove them from autocomplete and our other systems,” Yelp’s statement said.
“Our small businesses must already combat racist, inflammatory reviews from users on Yelp. These biased search results for dog and cat meat vendors should not be an additional concern,” Ken Lee told the Tampa Bay Times. Lee is the CEO of OCA, a national organization that advocates for the overall well-being of Asian-Americans. “The company has a social and corporate responsibility to their consumers and local businesses to keep their app and website clear of prejudice and misinformation. Algorithmic bias against communities of color continues to plague new technology and online platforms—this incident is another in a long line of incidents.”
This isn’t the first time prejudicial biases have influenced algorithms. The Times points out that Google had to adjust its search engine because the phrases “are Jews,” “are women” or “Islamists” once auto-filled with “evil.” At one point, Google Photos categorized black people as gorillas, and the list of algorithm-biased errors goes on.
“We have many racist stereotypes in the United States that harm Asian Americans, and these kinds of derogatory notions are likely embedded in some of the reviews, but also circulating widely as key phrases in other online spaces outside of Yelp,” Safiya Umoja Noble told the Times. Noble is a professor at USC and author of Algorithms of Oppression: How Search Engines Reinforce Racism. “We need a re-evaluation of the biased data that is used to train artificial intelligence to sort through the volumes of content.”
There’s no doubt that the stereotype of Asian people eating dog or cat meat is harmful, hurtful, and othering. It’s something that my fellow Korean friends and I have been grilled about on countless occasions, and something I believe reflects a decades-old, embedded tendency to stereotype Asian-Americans, even by people who choose to go to our families’ restaurants. People may never stop sharing their biases, but it’s up to companies like Yelp to make sure their algorithms don’t proliferate that nonsense, too.
Read the Tampa Bay Times’ full report here.