AI Hallucination
2 bite-size cards · 60 seconds each
What is AI Hallucination?
AI hallucination occurs when a language model generates text that sounds confident and coherent but is factually wrong: inventing citations that don't exist, misquoting real people, or describing events that never happened. It is one of the most significant limitations of current LLMs and a major barrier to deployment in high-stakes settings.
Hallucination Taxonomy: Types, Causes, and Targeted Mitigations
Not all hallucinations are equal. Factual confabulation, reasoning errors, instruction deviation, and temporal confusion are distinct failure modes with different causes and different mitigations. Understanding the taxonomy lets you choose the right fix for the specific hallucination type your application encounters.
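To make the idea concrete, here is a minimal Python sketch of that taxonomy-to-mitigation mapping. The type names and the suggested mitigations are illustrative assumptions for this card, not a standard library or a canonical classification:

```python
from enum import Enum, auto

class HallucinationType(Enum):
    """Illustrative failure modes; categories follow the taxonomy above."""
    FACTUAL_CONFABULATION = auto()   # invented facts, citations, or quotes
    REASONING_ERROR = auto()         # flawed logical or arithmetic steps
    INSTRUCTION_DEVIATION = auto()   # output ignores or contradicts the prompt
    TEMPORAL_CONFUSION = auto()      # stale or anachronistic information

# Hypothetical mapping from failure mode to a typical first-line mitigation.
MITIGATIONS = {
    HallucinationType.FACTUAL_CONFABULATION: "retrieval-augmented generation plus citation checking",
    HallucinationType.REASONING_ERROR: "step-by-step prompting plus external calculators or verifiers",
    HallucinationType.INSTRUCTION_DEVIATION: "tighter system prompts plus output validation against a schema",
    HallucinationType.TEMPORAL_CONFUSION: "fresh retrieval sources plus explicit knowledge-cutoff notices",
}

def suggest_mitigation(failure: HallucinationType) -> str:
    """Return a first-line mitigation for the given hallucination type."""
    return MITIGATIONS[failure]

if __name__ == "__main__":
    print(suggest_mitigation(HallucinationType.FACTUAL_CONFABULATION))
```

The point of the sketch is the design choice, not the specific strings: diagnosing which failure mode you are seeing comes first, and the mitigation follows from that diagnosis rather than from a one-size-fits-all fix.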