LLM Reference

Dolphin 2.5 Mixtral 8x7B

About

Dolphin 2.5 Mixtral 8x7B is a large language model designed primarily for coding tasks and known for its proficiency across diverse programming languages, including Kotlin. It is built on the Mixtral-8x7b architecture and was fine-tuned on datasets such as Dolphin-Coder and MagiCoder using qLoRA and Axolotl. The base model supports a 32k context window, while fine-tuning was performed at a 16k context length. The model is uncensored, which allows it to handle a wide range of prompts but also raises ethical considerations. It is available in multiple formats on platforms such as Hugging Face, including GGUF and GPTQ variants at various quantization levels. Given its uncensored nature, users should be mindful of ethical sensitivities and implement their own alignment measures before deploying it publicly.
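Dolphin models are commonly reported to use the ChatML prompt template; the exact format should be verified against the model card on Hugging Face. As a minimal sketch, a ChatML-style prompt for this model could be assembled like so (the system and user strings are illustrative):

```python
def chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt (the template Dolphin models are
    commonly reported to use; confirm against the official model card)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a Kotlin function that reverses a string.",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open for the model to generate its reply in the assistant turn.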

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Providers (1)

Provider | Input (per 1M) | Output (per 1M) | Type
Together AI API | $0.60 | $0.60 | Serverless
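The per-token pricing above makes cost estimation simple arithmetic: multiply input and output token counts by their respective per-1M rates. A minimal sketch, using the Together AI rates from the table (the token counts in the example are illustrative):

```python
INPUT_PER_1M = 0.60   # USD per 1M input tokens (Together AI, from the table)
OUTPUT_PER_1M = 0.60  # USD per 1M output tokens (Together AI, from the table)

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request from per-1M-token rates."""
    return (input_tokens * INPUT_PER_1M + output_tokens * OUTPUT_PER_1M) / 1_000_000

# e.g. a 2,000-token prompt with an 800-token completion
cost = estimate_cost(2_000, 800)  # ≈ $0.00168
```

Since input and output are billed at the same rate here, the total simplifies to (total tokens × $0.60) / 1M; providers with asymmetric pricing require keeping the two counts separate.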

Specifications

Family: Dolphin
Parameters: 8x7B
Architecture: Mixture of Experts
Specialization: general