
When AI Gets It Wrong: The Truth About AI Hallucinations and Human Misperceptions

ChatGPT was released to the public in November 2022. Within six months, it had landed an attorney in hot water with a judge. The attorney had submitted a brief, written entirely by ChatGPT, that turned out to contain six fictitious citations to other cases and precedents. The judge was not pleased, and the attorney was professionally embarrassed. Everyone involved was a victim of an Artificial Intelligence hallucination.

Perhaps the most public example of an Artificial Intelligence hallucination, this incident provides an excellent illustration of one of the risks of relying too heavily on AI for information. Because AI is a language model, not a logic model, we shouldn’t be surprised that this can happen. We think of hallucinations as sensory misinterpretations; AI “hallucinates” by generating false information. Although they’re different, AI and human hallucinations share common patterns of distortion. Let’s explore this.

What Are AI Hallucinations?

AI hallucinations are incorrect, misleading, or entirely fabricated outputs from an AI system. AI generates its responses probabilistically, based on patterns in its training data. Because AI never gets “real-world experience,” it can’t verify its claims. Training data can contain intended or unintended biases, as I explained here. What’s more, AI is confident in its misinformation. It doesn’t “know” it’s wrong; it simply presents answers as if they were correct. Remember, it’s not a logic model. It’s all about language and prediction.
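To make “language and prediction” concrete, here’s a minimal sketch of how a language model builds a sentence. Everything in it is invented for illustration – the tiny word-probability table stands in for the billions of learned parameters in a real model – but the core move is the same: pick a statistically plausible next word, and never check whether the result is true.

```python
import random

# Toy "language model": for each word, the learned probabilities of the
# next word. These numbers are made up for illustration; a real model has
# billions of parameters, but the principle is identical.
next_word = {
    "the": {"astronauts": 0.5, "court": 0.5},
    "astronauts": {"played": 1.0},
    "played": {"with": 1.0},
    "with": {"cats": 0.6, "the": 0.4},
    "cats": {"on": 1.0},
    "on": {"the": 1.0},
    "court": {"cited": 1.0},
    "cited": {"six": 1.0},
    "six": {"precedents": 1.0},
}

def generate(word, max_words=8):
    """Build a sentence by repeatedly sampling a likely next word.
    Nothing here checks whether the output is TRUE -- only whether it
    resembles the patterns the model was trained on."""
    out = [word]
    while len(out) < max_words and out[-1] in next_word:
        options = next_word[out[-1]]
        word = random.choices(list(options), weights=list(options.values()))[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the astronauts played with cats on the ..."
# Fluent and confident -- and completely unverified.
```

The sketch produces grammatical, confident-sounding output because that is all it is optimized to do; truth never enters the loop.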

In another well-known example of this phenomenon, Meta’s AI data verification system incorrectly labeled the assassination attempt against President Trump as fake news. Google’s AI claimed that astronauts played with cats on the moon, even attributing a fictional quote to Neil Armstrong. These are both examples of hallucinations generated by language probability models.

How Do Humans Experience “Hallucinations” in a Similar Way?

When I was growing up, my mother always told us that my father’s mother had died in an asylum, and for a long time that was all the information we could get. There were no drugs involved, and my mother had no mental illness – the things we usually associate with hallucinations – yet this was a prime example of one. Cognitive hallucinations aren’t just visual or auditory; they also include false memories, biases, and misunderstandings. Yup, you read that right.

Our brains hate “open loops.” They don’t like unfinished business, and they detest not having all the data points. When information is incomplete, the brain will fill in the gaps. As much as our brains hate open loops, they love patterns – and where the brain perceives a pattern, it will generate a conclusion, and sometimes that conclusion is wrong. I wrote a post a while back on the Six Biases, and one of those is Confirmation Bias – the tendency to more readily believe something that supports what we already believe. That’s another hallucination.

Here’s an example of a hallucination called the Mandela Effect, in which large groups of people misremember something in the same way. Back when John McCain was running for President with Sarah Palin as his running mate, Mrs. Palin said during an interview with ABC’s Charles Gibson, “They’re our next-door neighbors, and you can actually see Russia from land here in Alaska, from an island in Alaska.” Not long after that interview, Saturday Night Live produced a parody of it, with Tina Fey playing Mrs. Palin. Ms. Fey’s line was, “I can see Russia from my house.” It doesn’t require a PhD in Geography to know that Mrs. Palin’s home was too far inland for that to be true, yet the Saturday Night Live line became attributed to Mrs. Palin by people who sincerely believed she had said it. Another hallucination, and a classic example of the Mandela Effect.

Comparing AI and Human Hallucinations

Humans and AI have several things in common when it comes to hallucinations. Both rely on pattern recognition, which can lead either one to false conclusions. And both struggle to separate incorrect, incomplete, and inaccurate information from the truth when they receive bad input.

But where AI lacks self-awareness and emotion, humans attach emotions and personal experiences even to false memories. And while AI must be retrained by an external force to reduce and overcome its hallucinations, we humans have the capacity to do our own verification by seeking out external and alternative sources of information.

Real-World Impact of Hallucinations

Both human and AI hallucinations have real effects on our world. AI can spread misinformation – so can humans, for that matter – and people can pick it up and believe it as truth, feeding conspiracy theories that then spread further. People can easily make personal misjudgments based on assumptions rather than facts. And when an AI hallucinates in the medical, legal, or academic fields, the potential for serious consequences is plain to see.

Your Turn

The true story about my father’s mother is that she died of tuberculosis. How do we get from tuberculosis to the story of an asylum? I’m not sure how my mother got it so wrong, because she was a pretty smart woman, but it’s only a couple of degrees of separation. Tuberculosis patients went to sanatoriums. People in the 1920s with mental health problems went to sanitariums. Sanatorium – sanitarium – and from there, it’s a short hop to asylum. There was an open loop that my mom’s brain filled in for her.

Both AI and humans hallucinate in their own ways, but as humans, we have the power to reduce misinformation. That’s the call to action from this post. Well, that, and a request to drop a comment with your best story of a hallucination, either AI or human. (Scroll down past the related posts.)


My photography shops are https://www.oakwoodfineartphotography.com/ and https://oakwoodfineart.etsy.com, my merch shops are https://www.zazzle.com/store/south_fried_shop and https://society6.com/southernfriedyanqui.

Check out my New and Featured page – the latest photos and merch I’ve added to my shops! https://oakwoodexperience.com/new-and-featured/

Curious about safeguarding your digital life without getting lost in the technical weeds? Check out ‘Your Data, Your Devices, and You’—a straightforward guide to understanding and protecting your online presence. Perfect for those who love tech but not the jargon. Available now on Amazon:
https://www.amazon.com/Your-Data-Devices-Easy-Follow-ebook/dp/B0D5287NR3
