Platypus 30B
About
Platypus 30B is a large language model built on the LLaMA transformer architecture, with 33 billion parameters, 60 layers, and 52 attention heads. It is suited to tasks such as text generation, question answering, and instruction following. Training used a curated dataset focused on STEM and logical reasoning, with LoRA applied for parameter-efficient fine-tuning. Like other models of its class, Platypus 30B is subject to bias, a static knowledge cutoff, and difficulty with complex queries, so responsible use is advised.
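The LoRA technique mentioned above freezes the pretrained weights and trains only a low-rank update. A minimal sketch of the idea, using illustrative dimensions rather than the model's real ones (the hidden size, rank, and alpha below are hypothetical):

```python
import numpy as np

# Minimal sketch of LoRA (Low-Rank Adaptation). Instead of updating the
# full weight matrix W, we learn two small factors A and B and use the
# effective weight W + (alpha / rank) * B @ A. All numbers here are
# illustrative, not Platypus 30B's actual configuration.

d_model = 512   # hypothetical hidden size (the real model's is larger)
rank = 8        # LoRA rank r, chosen so that r << d_model
alpha = 16      # LoRA scaling factor

rng = np.random.default_rng(0)
W = rng.standard_normal((d_model, d_model))    # frozen pretrained weight

# Only A and B are trainable during fine-tuning.
A = rng.standard_normal((rank, d_model)) * 0.01
B = np.zeros((d_model, rank))  # B starts at zero, so the update is zero at init

def lora_forward(x):
    # Apply W and the low-rank update without materializing B @ A.
    return x @ W.T + (alpha / rank) * (x @ A.T) @ B.T

x = rng.standard_normal((1, d_model))
full_params = W.size               # parameters a full fine-tune would touch
lora_params = A.size + B.size      # parameters LoRA actually trains
print(f"trainable fraction: {lora_params / full_params:.4f}")
```

Because only `A` and `B` are trained, the number of updated parameters is a small fraction of the full matrix, which is what makes fine-tuning a 33B-parameter model tractable.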
Capabilities
Multimodal, Function Calling, Tool Use, JSON Mode