AI Glossary
transformer
Definition
A Transformer is a neural network architecture that uses self-attention to process all positions of a sequence in parallel rather than step by step, making long-range dependencies tractable and training highly parallelizable. Transformers form the backbone of modern LLMs; decoder-only variants dominate generative tasks and enable scalable training on huge datasets.
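The core operation the definition refers to can be sketched as single-head scaled dot-product self-attention. This is a minimal NumPy illustration, not a production implementation; the function name and weight matrices are illustrative, and multi-head attention, masking, and feed-forward layers are omitted.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over x with shape (seq_len, d_model)."""
    # Project the input into queries, keys, and values.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    # Pairwise attention logits: every position scores every other position,
    # which is what lets the model capture long-range dependencies in one step.
    scores = q @ k.T / np.sqrt(d_k)
    # Softmax over the key axis turns logits into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted mixture of all value vectors,
    # computed for all positions at once (in parallel).
    return weights @ v

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # one output vector per input position
```

Because the score matrix is computed for all position pairs at once, the whole sequence is processed in a single matrix multiplication rather than a recurrent loop, which is the parallelism the definition highlights.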