LLM Reference

Nous Capybara 34B

About

Nous Capybara 34B, developed by NousResearch, is a large language model built on the Yi-34B architecture. Its standout feature is a 200K-token context length, which lets it work over very long inputs. The model performs well at text generation, conversational AI, complex summarization, and information recall, despite being trained on a comparatively small dataset of 20,000 examples produced with the Amplify-Instruct synthesis technique. This transformer-based model is distributed in multiple quantization formats to suit different hardware budgets. The small training set may entail accuracy trade-offs, but the model remains a capable choice for chatbot, content generation, and data analysis applications.
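As a sketch of how the 200K context length plays out in practice, the snippet below wraps a message in the USER:/ASSISTANT: single-turn template published on the model card and does a coarse token-budget check. The 4-characters-per-token ratio is only a rough heuristic, not the model's actual tokenizer.

```python
# Rough sketch of preparing a prompt for Nous Capybara 34B.
# The USER:/ASSISTANT: template follows the model card; the
# 4-chars-per-token ratio is a coarse heuristic, not exact.

CONTEXT_WINDOW = 200_000  # advertised context length, in tokens

def build_prompt(user_message: str) -> str:
    """Wrap a message in Capybara's single-turn chat template."""
    return f"USER: {user_message} ASSISTANT:"

def fits_in_context(text: str, max_new_tokens: int = 1024) -> bool:
    """Coarse check that prompt plus generation budget fits in 200K
    tokens. A real tokenizer is needed for an exact count."""
    est_tokens = len(text) // 4
    return est_tokens + max_new_tokens <= CONTEXT_WINDOW

prompt = build_prompt("Summarize the attached report.")
print(prompt)  # USER: Summarize the attached report. ASSISTANT:
print(fits_in_context(prompt))  # True
```

Even with the generous 200K window, a budget check like this is worth running before sending book-length inputs, since the generation budget also has to fit inside the same window.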

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (1)

Provider: Fireworks AI Platform
Input (per 1M): —
Output (per 1M): —
Type: Provisioned
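Fireworks AI exposes an OpenAI-compatible chat completions endpoint, so a request to this model can be sketched as a plain JSON body. The model slug below is an assumption for illustration; check the provider's catalog for the exact identifier.

```python
import json

# Sketch of a request body for Fireworks AI's OpenAI-compatible
# chat completions API. The model slug is an assumed value --
# verify it against the provider's model catalog.

FIREWORKS_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
MODEL = "accounts/fireworks/models/nous-capybara-34b"  # assumed slug

def make_request_body(user_message: str, max_tokens: int = 512) -> str:
    """Serialize a minimal chat-completion request as JSON."""
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
    }
    return json.dumps(body)

body = make_request_body("Explain the Amplify-Instruct technique.")
```

The body would then be POSTed to FIREWORKS_URL with an `Authorization: Bearer <api key>` header; any OpenAI-compatible client library can be pointed at the same base URL.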

Specifications

Family: Capybara
Parameters: 34B
Context: 200K
Architecture: Decoder Only
Specialization: General
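The 34B parameter count above, combined with the quantization formats mentioned earlier, translates into rough memory footprints. The back-of-envelope estimate below counts weight storage only and ignores activation and KV-cache overhead, so treat the figures as lower bounds.

```python
# Back-of-envelope weight-memory estimates for a 34B-parameter model
# at common quantization widths. Activations and KV cache are ignored,
# so real VRAM needs are higher.

PARAMS = 34_000_000_000

def weight_gib(bits_per_param: float) -> float:
    """Approximate weight memory in GiB at a given quantization width."""
    return PARAMS * bits_per_param / 8 / 1024**3

for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{name}: ~{weight_gib(bits):.0f} GiB")
```

This is why the lower-bit quantizations matter for this model: at fp16 the weights alone exceed a single consumer GPU, while 4-bit variants come within reach of high-end workstation cards.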