LLM Reference

Platypus2 70B on Together AI

Platypus2 · garage-bAInd

Serverless

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.90
Output tokens   $0.90
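Since input and output tokens are billed at the same per-million rate, per-request cost is straightforward to estimate. A minimal sketch (the helper name and example token counts are illustrative, not part of the Together AI API):

```python
# Per-1M-token pricing from the table above.
INPUT_PRICE_PER_M = 0.90   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.90  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 1,000-token prompt with a 500-token completion:
# estimate_cost(1000, 500) -> 0.00135 USD
```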

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Platypus2 70B

Platypus2-70B is an auto-regressive language model built on the LLaMA 2 transformer architecture, developed by Cole Hunter and Ariel Lee. It is distinguished by strong performance on STEM and logic tasks, a result of fine-tuning on the Open-Platypus dataset using Low-Rank Adaptation (LoRA) and Parameter-Efficient Fine-Tuning (PEFT), which keeps training computationally inexpensive. Platypus2-70B once held the top spot on Hugging Face's Open LLM Leaderboard, reflecting robust results across benchmark metrics. It supports applications in fields such as education and research, although continued safety and bias testing remains important. The model is primarily English-focused, and quantized versions are available for varied hardware compatibility.

Get Started

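As a serverless model, Platypus2 70B can be queried over Together AI's OpenAI-compatible chat completions API. A minimal sketch of building such a request; the endpoint URL and the model slug below are assumptions (check the model page for the exact identifier), and `API_KEY` is a placeholder for your own credential:

```python
# Assumed OpenAI-compatible endpoint and model slug; verify against the
# Together AI model page before use.
API_URL = "https://api.together.xyz/v1/chat/completions"
MODEL = "garage-bAInd/Platypus2-70B-instruct"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build the JSON payload for a single chat completion request."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# To send it (requires an API key in the Authorization header), e.g.:
#   import requests
#   resp = requests.post(
#       API_URL,
#       json=build_request("Prove that sqrt(2) is irrational."),
#       headers={"Authorization": f"Bearer {API_KEY}"},
#   )
#   print(resp.json()["choices"][0]["message"]["content"])
```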
Model Specs

Released: 2023-12-15
Parameters: 70B
Architecture: Decoder Only