LLM Reference

Platypus2 70B

About

Platypus2-70B is an auto-regressive large language model built on the LLaMA 2 transformer architecture and developed by Cole Hunter and Ariel Lee. The model is noted for strong performance on STEM and logic tasks, which it owes to instruction fine-tuning on the Open-Platypus dataset using Low-Rank Adaptation (LoRA) and Parameter-Efficient Fine-Tuning (PEFT). Because LoRA trains only a small set of adapter weights while the base model stays frozen, this approach reaches competitive quality with far fewer computational resources than full fine-tuning. Platypus2-70B at one point held the top spot on Hugging Face's Open LLM Leaderboard, reflecting solid results across its benchmark suite. The model targets English-language use and supports applications in fields such as education and research, though continued safety and bias testing remains important. Quantized versions are available for broader hardware compatibility.
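As a rough illustration of the LoRA + PEFT recipe described above, the sketch below wires a LoRA adapter onto a LLaMA 2 base model with the Hugging Face peft library. The base repository id, rank, alpha, and target modules are illustrative assumptions, not the Platypus authors' exact configuration.

```python
# Minimal sketch of LoRA fine-tuning setup via peft.
# Hyperparameters and target modules are assumed for illustration only.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Assumed base checkpoint; the actual Platypus2 run starts from LLaMA 2 70B.
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-70b-hf",
    device_map="auto",
)

lora_config = LoraConfig(
    r=16,                # low-rank adapter dimension (assumed)
    lora_alpha=32,       # adapter scaling factor (assumed)
    lora_dropout=0.05,   # assumed
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)

# Only the small adapter matrices are trainable; the frozen 70B base is why
# LoRA fine-tuning needs far fewer resources than full-parameter training.
model.print_trainable_parameters()
```

A standard transformers Trainer (or any training loop) can then be pointed at `model` with the Open-Platypus dataset; only the adapter weights receive gradient updates.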

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (1)

Provider          Input (per 1M tokens)   Output (per 1M tokens)   Type
Together AI API   $0.90                   $0.90                    Serverless
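Since the listed provider is Together AI's serverless API, which exposes an OpenAI-compatible HTTP interface, a minimal request sketch might look like the following. The model identifier string is an assumption and should be checked against Together AI's published model list.

```python
# Minimal sketch of calling Platypus2-70B through Together AI's
# OpenAI-compatible completions endpoint.
import os
import requests

API_URL = "https://api.together.xyz/v1/completions"
MODEL = "garage-bAInd/Platypus2-70B"  # assumed identifier; verify with provider

headers = {
    "Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}",
    "Content-Type": "application/json",
}

# Platypus2's model card documents an Alpaca-style prompt template.
payload = {
    "model": MODEL,
    "prompt": "### Instruction:\nWhat is 17 * 23?\n\n### Response:\n",
    "max_tokens": 128,
    "temperature": 0.2,
}

resp = requests.post(API_URL, headers=headers, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```

At the listed $0.90 per 1M tokens for both input and output, a request like this one (well under a thousand tokens in total) costs a small fraction of a cent.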

Specifications

Family: Platypus2
Parameters: 70B
Architecture: Decoder-only
Specialization: General
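Given the 70B parameter count above, local use generally depends on the quantized variants mentioned in the About section. The sketch below shows one common approach, 4-bit loading with bitsandbytes through transformers; the repository id and quantization settings are assumptions, not official guidance.

```python
# Minimal sketch: loading Platypus2-70B in 4-bit to reduce VRAM requirements.
# Repo id and quantization settings are assumed for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "garage-bAInd/Platypus2-70B"  # assumed Hub repository id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",               # assumed quantization type
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",  # spread layers across available GPUs
)

prompt = "### Instruction:\nExplain LoRA in one sentence.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```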