From an email I received from Mozilla
Google prides itself on the consistency and accuracy of its search engine. But its latest (currently US-only) AI-powered search feature for quick answers, AI Overviews, has been serving up some bizarre – and potentially dangerous – answers:
- “Cheese not sticking to pizza?” Mix about 1/8 cup of non-toxic glue to the sauce.
- “Is poison good for you?” Yes, poison can be good for humans in small quantities.
- “How many rocks should I eat?” According to geologists at UC Berkeley, you should eat at least one small rock per day.1
Google users are already reporting many other examples like these. And while many of them are funny, others could be life-threatening: people have reported that Google’s AI Overviews have told them to add more oil to a cooking fire, to clean with deadly chlorine gas, and to follow false advice about life-threatening diseases.2
Google has spent decades and billions of dollars building its reputation as a source of consistent and accurate information. By prematurely rolling out a harmful AI feature that is clearly neither ready nor equipped to provide users with accurate and safe information, the company is risking not only its reputation but potentially its users’ lives.
Internet sleuths have traced some of the odd answers from Google’s AI to sarcastic replies on Reddit threads and to articles by satirical outlets like The Onion.3,4 It is alarming that the tool would take these at face value and present them as top answers.
Google’s CEO has defended the new search function, noting that it provides valuable “context” but that “there are still times it’s going to get it wrong” – and acknowledging that “hallucinations” are both an “unsolved problem…and an inherent feature” of AI models.5
But these so-called “hallucinations” could have dire consequences – and they are a tangible example of why AI needs to be trustworthy. At Mozilla, we have been working on and advocating for trustworthy AI for years, making sure AI products make our lives easier – instead of threatening them. Together, let’s put pressure on Google and make sure it removes AI Overviews until the tool has been fixed.
Thank you for everything you do for the internet.
Christian Bock
Head of Supporter Engagement
Mozilla
More Information:
- CNET: Glue in Pizza? Eat Rocks? Google’s AI Search Is Mocked for Bizarre Answers. 24 May 2024.