What's one thing a potential employer or date will do before hiring you or deciding to meet up for coffee Tuesday night? Answer: Google you. Depending on the algorithm's results, Google's handy autocomplete may bring up a couple of keywords as they start typing your name into the search bar. Here's where things can get sticky if your name is associated with not-so-flattering words or activities.
For a Japanese man, this feature has cost him his job and the chance at several others. For this reason, a Tokyo district court has ordered Google to turn off or disassociate certain terms in autocomplete for this unnamed man (his name was withheld for privacy). PC World reports that the autocomplete function would bring up crimes the man says he didn't commit:
The search giant likely links the man's name to the crime terms because a false story about him containing allegations apparently spread across various sites, which were then indexed by the search giant, Tomita said. The man says he has no knowledge of the types of crimes that appear.
Google did not respond to a request for comment on this case. The company has faced similar cases in other countries and has usually responded with the defense that it is not responsible for the results, as they are automatically generated, though this defense has not always succeeded. The company does screen some terms from its auto-complete feature, including pornographic words.
Last year in Italy, a court ordered that Google filter out search suggestions damaging to individual reputations after a man's name was linked to "con man" and "fraud." The company was fined in France because an insurance company was linked to the word "crook," and has also been the subject of litigation from a hotel in Ireland and individuals in the U.S., according to media reports.
Tokyo lawyer Hiroyuki Tomita, who is representing the man, said some potential employers told his client that he was not brought on because of the autocomplete associating his name with the crimes. ABC News elaborates, stating that when his name is searched, more than 10,000 disparaging words are associated with him, according to Tomita.
Even with the court order, which set a deadline of Sunday for the man's name to be removed from Google's autocomplete, the function remained unchanged as of Monday, according to PC World. ABC News states that Google maintains its autocomplete doesn't violate its privacy policies or the U.S. Communications Decency Act, and that it will therefore not change its system to comply with Japanese law. Here's what Google's autocomplete policy says about how the feature works:
Predicted queries are algorithmically determined based on a number of purely algorithmic factors (including popularity of search terms) without human intervention. The autocomplete data is updated frequently to offer fresh and rising search queries.
The search queries that you see as part of autocomplete are a reflection of the search activity of all web users. Just like the web, the search queries presented may include silly or strange or surprising terms and phrases. While we always strive to neutrally and algorithmically reflect the diversity of content on the web (some good, some objectionable), we also apply a narrow set of removal policies for pornography, violence, hate speech, and terms that are frequently used to find content that infringes copyrights.
AsiaJin reports Google Japan as saying it doesn't have the authority to dissociate these words from the man's name. If Google continues to refuse compliance with the court order, Tomita said his client will seek further legal action for the damages he has incurred.
ZDNet reports Tomita as saying this feature "could lead to irretrievable damage such as a loss of job or bankruptcy" for his client.
What do you think? Should Google comply with the Japanese court order to prevent this man from being associated with disparaging terms? Or is the search giant in the right to deny this request, given that it is headquartered in the United States -- and therefore under U.S. law -- and that its autocomplete policy includes only a "narrow set of removal policies"?