Pythia 1.4B
About
Pythia 1.4B, developed by EleutherAI, is a transformer-based language model with 1.4 billion parameters. As part of the Pythia Scaling Suite, it is designed to support research on the behavior, capabilities, and limitations of large language models, with a particular focus on interpretability research. It can generate and continue English text and serve as a base model for scientific experiments, but it is not fine-tuned for downstream applications such as chatbots, so its responses may differ from user expectations. Because it was trained on the Pile, a diverse English-language dataset that contains potentially problematic content, it may also produce biased or inappropriate output, and it is limited to English.
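The simplest way to experiment with the model is through the Hugging Face transformers library. The sketch below assumes the checkpoint is published on the Hub as EleutherAI/pythia-1.4b (as the Pythia suite is) and simply continues a short English prompt; it is a minimal illustration, not an official usage recipe.

```python
# Minimal sketch: load Pythia 1.4B and continue a prompt.
# Assumes the checkpoint "EleutherAI/pythia-1.4b" on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-1.4b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# The base model is not instruction-tuned, so frame the prompt as text
# to be continued rather than a question to be answered.
inputs = tokenizer("The Pythia Scaling Suite was built to study", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```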
Capabilities
Pythia 1.4B is a text-only base model: multimodal input, function calling, tool use, and JSON mode are not supported.
Specifications
Family: Pythia
Released: 2023-05-31
Parameters: 1.4B
Architecture: Decoder-only
Specialization: General