LLM Reference
Platypus

About

The Platypus family is a series of large language models (LLMs) developed by researchers at Boston University that achieved top performance on Hugging Face's Open LLM Leaderboard. The models are fine-tuned on Open-Platypus, a curated dataset focused on STEM and logic questions, which enables strong results with comparatively little fine-tuning. By training Low-Rank Adaptation (LoRA) modules on top of pre-trained base models, Platypus combines general pre-trained capabilities with domain-specific knowledge. This approach sharply reduces training time and resource usage: a 13B Platypus model can be trained in about 5 hours on a single A100 GPU. The research team also checked the training data for contamination against benchmark test sets to ensure reliable evaluation results.
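The efficiency claim above comes from LoRA's core trick: instead of updating a full weight matrix, it learns a small low-rank correction. The sketch below illustrates that idea in plain numpy with made-up dimensions; it is not the Platypus training code, and all names here (`W`, `A`, `B`, `lora_forward`) are illustrative.

```python
import numpy as np

# LoRA idea: freeze the pre-trained weight W (d_out x d_in) and learn two
# small factors B (d_out x r) and A (r x d_in) with rank r << d.
# The adapted layer computes W @ x + (alpha / r) * B @ A @ x.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 16

W = rng.standard_normal((d_out, d_in))  # frozen pre-trained weight
A = rng.standard_normal((r, d_in))      # trainable low-rank factor
B = np.zeros((d_out, r))                # zero-initialised, so the adapter
                                        # starts out as a no-op

def lora_forward(x):
    # Base projection plus scaled low-rank update.
    return W @ x + (alpha / r) * (B @ (A @ x))

# Trainable parameters drop from d_out * d_in to r * (d_out + d_in).
full_params = d_out * d_in          # 64 in this toy example
lora_params = r * (d_out + d_in)    # 32 in this toy example
```

At realistic dimensions (e.g. 4096x4096 attention projections with r=16) the reduction is far more dramatic, which is why a 13B model can be adapted on a single GPU.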

Models (1)

Details

Researcher: garage-bAInd
Models: 1