Google AI Overviews will explain any nonsense phrase you make up

What does 'You can't lick a badger twice' mean? AI Overviews will confidently explain any idiotic saying you invent.

Apr 24, 2025 - 18:13

Google's AI Overviews sometimes acts like a lost man who won't ask for directions: It would rather confidently make a mistake than admit it doesn't know something.

We know this because folks online have noticed you can ask Google about any faux idiom — any random, nonsense saying you make up — and Google AI Overviews will often prescribe its meaning. That's not exactly surprising, as AI has shown a penchant for hallucinating, inventing stuff in an effort to provide answers even when it lacks sufficient data.

In the case of made-up idioms, it's kind of funny to see how Google's AI responds to idiotic sayings like "You can't lick a badger twice." On X, SEO expert Lily Ray dubbed the phenomenon "AI-splaining."

Someone on Threads noticed you can type any random sentence into Google, then add “meaning” afterwards, and you’ll get an AI explanation of a famous idiom or phrase you just made up. Here is mine

— Greg Jenner (@gregjenner.bsky.social) April 23, 2025 at 6:15 AM

Fantastic technology, glad society spent a trillion dollars on this instead of sidewalks.

— Dan Olson (@foldablehuman.bsky.social) April 21, 2025 at 12:01 AM

New game for you all: ask google what a made-up phrase means.

— Crab Man (@crabman.bsky.social) April 18, 2025 at 1:40 AM

I tested the "make up an idiom" trend, too. One phrase — "don't give me homemade ketchup and tell me it's the good stuff" — returned the message "AI Overview is not available for this search." However, my next made-up phrase — "you can't shake hands with an old bear" — did get a response. Apparently Google's AI thinks this phrase suggests the "old bear" is an untrustworthy person.

Google's AI Overview explaining the made-up phrase "you can't shake hands with an old bear."
Credit: Screenshot: Google

In this instance, Google AI Overviews' penchant for making stuff up is kind of funny. In other instances, say, getting the NFL's overtime rules wrong, it can be relatively harmless. Other examples of AI hallucinations are less amusing: when the feature first launched, it was telling folks to eat rocks and put glue on pizza. Keep in mind that Google warns users that AI Overviews can get facts wrong, yet the feature remains at the top of many search results.

So, as the old, time-honored idiom goes: Be wary of search with AI; what you see may be a lie.