LLM Reference
4 models · 2024 · Up to 200K ctx · From $0.2/1M input

About

The Capybara family of large language models (LLMs), developed by Nous Research, is known for its novel training techniques and strong performance. The models handle bilingual tasks well, particularly English and Chinese, owing to the Yi-34B base used for the flagship model. A standout feature is a context window of up to 200,000 tokens, which supports long documents and extended, nuanced dialogues. Training uses a synthesis method called Amplify-Instruct, which combines diverse techniques with instructional data from datasets such as Airoboros, EverythingLM, and Know_Logic, as well as LessWrong posts. The result is a family capable of sophisticated reasoning, multi-turn dialogue, and summarization of complex subjects. Parameter sizes span 3B, 7B, and 34B, with larger variants planned, and a multimodal extension named Obsidian broadens the family's range of functionality.
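Before sending a long prompt to a 200K-context model, it can help to estimate whether the text will fit. A minimal sketch, assuming a rough heuristic of about four characters per English token (an exact count would require the model's actual tokenizer):

```python
def fits_context(text: str, context_window: int = 200_000, chars_per_token: float = 4.0) -> bool:
    """Roughly estimate whether `text` fits in the model's context window.

    Uses the common ~4-characters-per-token heuristic for English text;
    this is an approximation, not a tokenizer-accurate count.
    """
    estimated_tokens = len(text) / chars_per_token
    return estimated_tokens <= context_window

# ~100K estimated tokens: fits in a 200K window.
print(fits_context("a" * 400_000))   # True
# ~225K estimated tokens: does not fit.
print(fits_context("a" * 900_000))   # False
```

The heuristic overestimates for code or non-English text, so treat it as a coarse pre-check rather than a guarantee.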

Specifications (4 models)

Capybara model specifications comparison
Model                    Released  Context  Parameters  Structured Outputs
Nous Capybara 34B        2024-10   200K     34B         No
Nous Capybara 7B         2024-10   -        7B          No
Nous Capybara 7B V1.9    2024-10   -        7B          Yes
Nous Capybara 3B V1.9    2024-10   -        3B          No

Available From (2 providers)

Pricing

Capybara model pricing by provider
Model                    Provider      Input / 1M  Output / 1M  Type
Nous Capybara 7B V1.9    Fireworks AI  $0.2        $0.2         Provisioned
Nous Capybara 7B V1.9    Together AI   $0.2        $0.2         Serverless
Nous Capybara 34B        Fireworks AI  $0.9        $0.9         Provisioned

Frequently Asked Questions

What is Capybara?
Capybara is a family of LLMs developed by Nous Research, trained with the Amplify-Instruct synthesis method and notable for context windows of up to 200,000 tokens and strong English-Chinese bilingual performance. The family spans 3B, 7B, and 34B parameter sizes, with a multimodal extension named Obsidian.
How many models are in the Capybara family?
The Capybara family contains 4 models.
What is the latest Capybara model?
The latest model is Nous Capybara 34B, released in October 2024.
How much does Capybara cost?
Capybara models range from $0.2 to $0.9 per 1M input tokens, depending on the model and provider.

Models (4)