Google has responded to claims of anti-Semitism in its search engine by altering autocomplete suggestions.
According to the Observer, Google's autocomplete usually suggested “evil” to follow the phrase “are Jews…”. Searches for “are women…” also produced the suggestion “evil”, while “are Muslims…” suggested “bad”.
By the start of this week Google had removed the autocomplete suggestions for the searches on Jews and women, but “are Muslims…” still produced the same suggestion.
“We took action within hours of being notified on Friday of the autocomplete results,” according to a Google spokesperson. “Our search results are a reflection of the content across the web. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what search results appear for a given query.”
“Autocomplete predictions are algorithmically generated based on users’ search activity and interests. Users search for such a wide range of material on the web – 15% of searches we see every day are new. Because of this, terms that appear in autocomplete may be unexpected or unpleasant,” the spokesperson explained.
This is not the first time Google and others’ autocomplete and search algorithms have caused offence. An auto-suggested photo tag within Google’s Photos service in July 2015 labelled two black teenagers as “Gorillas”. Google apologised and said it was working on “longer term fixes” around the recognition of dark-skinned faces as well as the linguistics of photo labels.
In June this year, Google’s search engine also came under fire when image searches for “three black teenagers” returned criminal mugshots, while searches for “three white teenagers” did not.