Grounding

Definition

Grounding in LLMs means anchoring model outputs to verifiable external sources, such as retrieved documents or real-time data, to improve factual accuracy and reduce hallucinations. It gives generation a reliable foundation by linking each response to evidence.
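A common way to ground a model is to retrieve relevant evidence and constrain the prompt so the answer must come from, and cite, that evidence. The sketch below is illustrative only: the helper names (`retrieve`, `build_grounded_prompt`), the toy documents, and the keyword-overlap ranking are all assumptions standing in for a real retriever and LLM call.

```python
# Minimal grounding sketch: retrieve evidence, then build a prompt that
# anchors the model's answer to that evidence. The toy corpus and the
# naive keyword-overlap retriever are placeholders for a real system.

documents = {
    "doc1": "The Eiffel Tower is 330 metres tall and located in Paris.",
    "doc2": "Mount Everest is 8,849 metres tall.",
}

def retrieve(query: str, docs: dict, k: int = 1) -> list:
    """Rank documents by naive keyword overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        docs.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_grounded_prompt(query: str, docs: dict) -> str:
    """Anchor the answer to retrieved sources and require citations."""
    evidence = retrieve(query, docs)
    sources = "\n".join(f"[{doc_id}] {text}" for doc_id, text in evidence)
    return (
        "Answer using ONLY the sources below, and cite the source id.\n"
        "If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{sources}\n\n"
        f"Question: {query}\nAnswer:"
    )

prompt = build_grounded_prompt("How tall is the Eiffel Tower?", documents)
print(prompt)
```

The resulting prompt would be sent to the LLM in place of the bare question; because the instructions restrict the model to the cited sources, its output can be checked against the retrieved evidence.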