
Google Slammed for Suggesting ‘Smelly Arabs’

Organization claims ‘Google Suggest’ feature perpetuates stereotypes of Arabs smelling bad and having big noses.

You’re a white Christian living in a developed Western country and your neighbor is an Arab Muslim. It’s not an uncommon scenario.

Once a year you notice that for an entire month your neighbor doesn’t eat or drink during the day. You hear that this is known as the month of Ramadan. You are curious and worldly, and you would like to know more about your neighbor’s culture and religion. You may not feel comfortable asking your neighbor, so you resort to the next best thing – Google! – that wonderful tool that helps you trawl through monstrous masses of online information to find the most esoteric facts out there.

You may not differentiate between Arabs as an ethnic group and Islam as a religion. You start punching in the words “why do Arabs…”

And here you will be baffled, amused or horrified at what happens next.

The ‘Google Suggest’ feature, a labor-saving device designed to predict queries, will automatically suggest completing your query with ‘why do Arabs stink?’ or ‘why do Arabs have big noses?’

The suggestions are not programmed by Google but rather are based on an algorithm that takes the frequency of Google queries into account.
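In broad strokes, that kind of frequency-driven completion can be sketched in a few lines of code. The example below is purely illustrative: the query log, the suggest function and the ranking rule are invented for the sketch, and Google’s actual system, which is not public, is far more sophisticated.

```python
# Toy illustration of frequency-based query completion (not Google's algorithm):
# count past queries, then offer the most frequent ones that start with what
# the user has typed so far.
from collections import Counter

# Hypothetical query log; in practice this would be aggregated from vast numbers of searches.
query_log = [
    "why do arabs fast during ramadan",
    "why do arabs fast during ramadan",
    "why do arabs wear headscarves",
    "why do arabs write right to left",
]

query_counts = Counter(query_log)

def suggest(prefix, limit=10):
    """Return the most frequent logged queries that begin with the typed prefix."""
    matches = [q for q in query_counts if q.startswith(prefix.lower())]
    return sorted(matches, key=lambda q: query_counts[q], reverse=True)[:limit]

print(suggest("Why do Arabs"))
# ['why do arabs fast during ramadan', 'why do arabs wear headscarves', ...]
```

The point of the sketch is simply that nothing in such a scheme expresses an opinion of its own; the suggestions mirror whatever users have typed most often.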

A number of other top suggestions are yielded by typing “Why do Arabs” into Google’s search box.

It is not hard to understand why Arab interest groups such as the London-based Arab Media Watch (AMW) have started to remonstrate against the suggestions.

“What’s worrying is that these [suggestions] are based on the overall popularity of searches, so if you may not have been looking for that, many other people have,” Guy Gabriel, advisor to AMW told The Media Line. “We’re in a day and age where the Internet is a tool by which we break down barriers and learn more about different communities across the world so it’s alarming to notice on Google that this isn’t the case as it stands.”

The organization, which advocates fair and objective coverage of Arab issues in the British media, says Google is “failing in its aim to avoid offending a large audience of users,” and that the feature not only perpetuates stereotypes but also highlights a worrying trend among Google users.

“I’m not suggesting that Google are aware of this and they are refusing to do anything about it,” Gabriel said. “What it does mean is that they have pledged to try and prevent it in cases where they know about it. Now that it has been flagged, they are in a position to do something about it.”

Similar queries about other ethnic groups suggest that Google users think Jews have long noses and are rude and cheap, Asians smell bad and have bad teeth, and Chinese people have bad breath.

AMW claimed that while searches regarding other ethnic groups produced a similar range of pejorative or stereotypical suggestions, queries about Arabs yielded more offensive results than those about other groups, and a search on ‘Jews’ produced noticeably fewer.

Searching ‘why do Jews’, for example, raised two pejorative suggestions out of a list of 10: ‘why do Jews have big noses’ and ‘why are Jews hated’. The remaining suggestions were more informative, such as ‘why do Jews celebrate Passover.’

“It’s not to say Arabs should feel singled out,” Gabriel said. “There is invective against different people. We’re drawing attention to this fact.”

But Jeff Jarvis, a media expert, blogger and director of the interactive journalism program at the City University of New York’s Graduate School of Journalism, warned against regulating the predictive feature.

“The reflex of censorship and regulation of speech is what I find offensive,” he told The Media Line. “For better or worse, Google is merely reflecting back what people are asking and saying. The result is generally brilliant — this is the insight that powers all of Google — but sometimes unfortunate.”

“If Google puts itself in the position of censoring anything that anyone could find offensive anywhere, then we will be left with a least common denominator of nothing,” Jarvis argued. “That would be the greater tragedy.”

“I suggest that the issue here is not Google but is the larger question of cultural knowledge,” he continued. “I suggest that the answer is not to retreat behind complaints but instead to publish more and connect more people with each other to gain greater understanding and muffle the sound of the bad with the sound of the good.”

A Google spokesperson told The Media Line that suggestions in the Google homepage search box were “based on neutral algorithms,” intended to help users formulate the query, reduce spelling errors, and save keystrokes by choosing from the list of suggestions.

Google said the ‘Google Suggest’ feature uses a combination of signals, such as the overall popularity of various searches, to rank its predictions.

“We try not to suggest queries that could be offending to a large audience of users,” Google said in a statement. “This includes explicit porn words as well as queries that lead to porn sites, dirty words, hate and violence terms.”

“We do remove certain clearly pornographic or hateful or malicious slur terms from Suggest,” the spokesperson continued. “We find that by providing suggestions upfront to the user, we can help make their search experience more efficient and convenient.”

“We are continuously improving the Google search experience,” the spokesperson said. “We have no future plans to announce at this time.”
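Taken together, the statements describe a two-step process: rank candidate completions by signals such as popularity, then strip out those containing terms Google has flagged. A minimal, hypothetical sketch of that filtering step follows; the blocklist and candidate list are invented for illustration and do not reflect Google’s actual terms or code.

```python
# Illustrative sketch only: drop ranked suggestions that contain flagged terms.
# FLAGGED_TERMS is an invented blocklist, not Google's real list.
FLAGGED_TERMS = {"stink", "hated"}

def filter_suggestions(ranked_suggestions):
    """Remove any candidate completion containing a flagged term."""
    return [
        s for s in ranked_suggestions
        if not any(term in s.lower() for term in FLAGGED_TERMS)
    ]

candidates = [
    "why do arabs fast during ramadan",
    "why do arabs stink",              # would be filtered out
    "why do arabs wear headscarves",
]
print(filter_suggestions(candidates))
# ['why do arabs fast during ramadan', 'why do arabs wear headscarves']
```

As the AMW complaint suggests, the practical difficulty lies less in the mechanism than in the list: a term is only removed once it has been flagged.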

The spokesperson referred users to a page where requests can be filed to remove content from the Google Suggest index.

Google Suggest, intended to be a labor-saving device, is described by Google as a tool that helps rest your fingers, catch spelling mistakes, save time and automatically repeat a search that has been made on your computer in the past.

For example, if you type ‘New York’, or even just ‘New Y’, a window opens suggesting New York Times, New York Post, New York University and New York Yankees. You just need to scroll down and pick your query.