AI Glossary

Hallucination

Definition

Hallucination occurs when a large language model generates plausible-sounding but factually incorrect or fabricated output, often presented with high confidence. It typically stems from gaps in the training data or from overgeneralization, and it is mitigated by grounding techniques such as retrieval-augmented generation (RAG), sketched below.
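
The sketch below illustrates the RAG idea in minimal form: retrieve relevant documents first, then constrain the model to answer only from that retrieved context. All names are illustrative; the toy corpus and keyword-overlap retrieval stand in for a real vector store, and the resulting prompt is assumed to be sent to whatever completion API is available.

```python
"""Minimal RAG-style grounding sketch. Everything here is illustrative:
real systems use embedding-based retrieval over a proper document store."""

from collections import Counter

# Toy in-memory corpus standing in for a real document store.
CORPUS = [
    "The Eiffel Tower is 330 metres tall and located in Paris.",
    "Mount Everest, at 8,849 metres, is Earth's highest mountain.",
    "The Great Wall of China is over 21,000 kilometres long.",
]

def score(query: str, doc: str) -> int:
    """Crude keyword-overlap relevance score (real systems use embeddings)."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k corpus documents most relevant to the query."""
    ranked = sorted(CORPUS, key=lambda doc: score(query, doc), reverse=True)
    return ranked[:k]

def grounded_prompt(query: str) -> str:
    """Build a prompt that confines the model to retrieved evidence,
    shrinking the room it has to fabricate an answer."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return (
        "Answer using ONLY the context below. If the context does not "
        "contain the answer, say \"I don't know.\"\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

if __name__ == "__main__":
    # The prompt would be sent to an LLM; printing it shows the grounding.
    print(grounded_prompt("How tall is the Eiffel Tower?"))
```

The design point is the instruction to answer only from supplied context and to decline otherwise: retrieval supplies verifiable evidence, and the refusal clause gives the model a sanctioned alternative to fabricating an answer.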
