Google’s algorithmic failures can have dreadful consequences, from directing racist search terms to the White House in Google Maps to labeling Black people as gorillas in Google Photos.
This week, the Silicon Valley giant added another algorithmic screw-up to the list: misidentifying a software engineer as a serial killer.
The victim of this latest botch was Hristo Georgiev, an engineer based in Switzerland. Georgiev discovered that a Google search of his name returned a photo of him linked to a Wikipedia entry on a notorious murderer.
“Seems like Google falsely associated a photo of mine with a Wikipedia article of a serial killer. I don’t know if this is hilarious or terrifying.”
— Hristo Georgiev (@hggeorgievcom), June 24, 2021
“My first reaction was that somebody was trying to pull off some sort of an elaborate prank on me, but after opening the Wikipedia article itself, it turned out that there’s no photo of me there whatsoever,” said Georgiev in a blog post.
Georgiev believes the error was caused by Google’s knowledge graph, the system that generates the infoboxes displayed next to search results. He suspects the algorithm matched his picture to the Wikipedia entry because the now-dead killer shared his name.
Georgiev is far from the first victim of the knowledge graph misfiring. The algorithm has previously generated infoboxes that falsely registered actor Paul Campbell as deceased and listed the California Republican Party’s ideology as “Nazism.”
In Georgiev’s case, the issue was swiftly resolved. After he reported the bug to Google, the company removed his image from the killer’s infobox. Georgiev credited the Hacker News community with accelerating the response.
Other victims, however, may not be so lucky. If they never find the error — or struggle to resolve it — the misinformation could have troubling consequences.
I certainly wouldn’t want a potential employer, client, or partner to see my face next to an article about a serial killer.
Story by Thomas Macaulay
Thomas is a senior reporter at TNW. He covers European tech, with a focus on AI, cybersecurity, and government policy.