LLM Reference

LaMDA 2B

About

LaMDA (Language Model for Dialogue Applications) is a family of conversational large language models developed by Google; this page describes the 2-billion-parameter variant. The model is a decoder-only Transformer, pre-trained on a dataset of 1.56 trillion words drawn from public dialogue data and web documents. LaMDA is designed for open-ended conversation, producing contextually relevant and engaging responses. Its smaller size, however, can limit performance on tasks requiring extensive context compared to larger models such as GPT-4. Earlier versions showed weaknesses in accuracy, safety, and factual grounding, which Google has addressed through fine-tuning and by grounding responses in external knowledge sources.
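The core of a decoder-only Transformer like LaMDA is causal self-attention: each token may attend only to itself and earlier tokens, which is what lets the model generate text left to right. The sketch below is purely illustrative, not LaMDA's actual implementation; the function names, dimensions, and the use of NumPy are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention (illustrative, not LaMDA's code).

    x: (seq_len, d_model) token representations
    Wq, Wk, Wv: (d_model, d_model) projection matrices
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)
    # Causal mask: token t may not attend to positions > t.
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -1e9
    return softmax(scores) @ v
```

Because of the mask, the representation of each position depends only on that position and the ones before it, so earlier outputs are unchanged if a later token is edited.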

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Specifications

Family: LaMDA
Parameters: 2B
Architecture: Decoder-Only
Specialization: General