Hilarious gibberish or AI’s fatal flaw? Google Search confidently explains nonsense phrases (Update)
"You can't lick a badger twice" shows why trusting Google Search is getting harder and harder everyday.


- Google Search’s AI-powered summary, called AI Overviews, is generating false definitions for entirely fictional idioms.
- Users found that typing nonsensical phrases into Google paired with the word “meaning” yields fabricated but confidently worded explanations.
- Even though Google labels AI Overviews as experimental, this behavior raises significant concerns about trust and accuracy in Google’s search results.
Update, April 24, 2025 (10:27 AM ET): Through a spokesperson, Google has provided the following statement on AI Overviews’ hallucinations:
> When people do nonsensical or ‘false premise’ searches, our systems will try to find the most relevant results based on the limited web content available. This is true of Search overall, and in some cases, AI Overviews will also trigger in an effort to provide helpful context. AI Overviews are designed to show information backed up by top web results, and their high accuracy rate is on par with other Search features like Featured Snippets.