LLM Reference
AI Glossary

GPT

Definition

GPT (Generative Pre-trained Transformer) is a family of decoder-only Transformer models pretrained on internet-scale text for autoregressive generation; members such as GPT-3 and GPT-4 helped popularize large language models (LLMs). GPTs exemplify foundation models: a single pretrained model is adapted via fine-tuning (including instruction tuning) for chat and task-following, and the series demonstrated the benefits of scale and of few-shot, in-context learning.
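Autoregressive generation means producing text one token at a time, each token conditioned on everything generated so far. A minimal sketch of that decoding loop, using a hypothetical hand-written next-token table as a stand-in for a real Transformer (the vocabulary and probabilities below are invented for illustration):

```python
def next_token_probs(tokens):
    """Stand-in for a language model: maps the most recent token to a
    probability distribution over a tiny vocabulary. A real GPT would
    condition on the full token sequence via a decoder-only Transformer."""
    table = {
        "<s>": {"the": 0.6, "a": 0.4},
        "the": {"cat": 0.5, "dog": 0.5},
        "a":   {"cat": 0.5, "dog": 0.5},
        "cat": {"sat": 0.7, "</s>": 0.3},
        "dog": {"sat": 0.7, "</s>": 0.3},
        "sat": {"</s>": 1.0},
    }
    return table[tokens[-1]]

def generate(max_len=10):
    """Greedy decoding: repeatedly append the most probable next token
    until an end-of-sequence token or the length limit is reached."""
    tokens = ["<s>"]
    for _ in range(max_len):
        probs = next_token_probs(tokens)
        best = max(probs, key=probs.get)  # greedy choice
        tokens.append(best)
        if best == "</s>":
            break
    return tokens

print(generate())  # → ['<s>', 'the', 'cat', 'sat', '</s>']
```

Real GPT models replace the lookup table with learned Transformer layers and typically sample from the distribution (with temperature or nucleus sampling) rather than always taking the greedy maximum.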