LLM Reference
Microsoft Foundry

Dolly 2.0 12B on Microsoft Foundry

Dolly 2.0 · Databricks Mosaic

Deployment type: Provisioned

Compare Dolly 2.0 12B Across Providers

Provider            Input (per 1M)   Output (per 1M)
Microsoft Foundry   $0.07            $0.07
Replicate API       $0.10            $0.50

Pricing

Type            Price (per 1M)
Input tokens    $0.07
Output tokens   $0.07
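The per-token rates above translate directly into per-request cost. The sketch below is a hypothetical helper (not part of any provider SDK) that estimates the USD cost of a request from the per-1M-token rates listed in the tables:

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float, output_rate: float) -> float:
    """Return the USD cost of one request, given per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_rate \
         + (output_tokens / 1_000_000) * output_rate

# Example workload: 250k input tokens, 50k output tokens.
# Microsoft Foundry: $0.07 in / $0.07 out per 1M tokens.
foundry = estimate_cost(250_000, 50_000, 0.07, 0.07)
# Replicate API: $0.10 in / $0.50 out per 1M tokens.
replicate = estimate_cost(250_000, 50_000, 0.10, 0.50)
print(f"Foundry: ${foundry:.4f}  Replicate: ${replicate:.4f}")
```

For this workload Foundry's symmetric $0.07 rate works out cheaper than Replicate's, whose $0.50 output rate dominates once responses get long.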

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

About Dolly 2.0 12B

Dolly 2.0 12B is an instruction-following large language model developed by Databricks. It is built on the Pythia-12b architecture with 12 billion parameters and trained on around 15,000 instruction-response pairs crafted by Databricks employees. This model is adept at handling tasks such as brainstorming, classification, open and closed question answering, text generation, information extraction, and summarization. Despite its versatility, Dolly 2.0 12B is not considered state-of-the-art and struggles with more complex prompts, programming, mathematical tasks, factual accuracy, and nuanced tasks like humor. It is notable for its open-source license, permitting commercial use, though it may carry biases from its training data.

Get Started
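Besides hosted endpoints, the model weights are openly licensed, so it can be run locally. A minimal sketch, assuming the public `databricks/dolly-v2-12b` checkpoint on Hugging Face and the `transformers` library (downloading the weights needs roughly 24 GB of disk and a large GPU, so the load is kept behind a function):

```python
def build_prompt(instruction: str) -> str:
    """Dolly-style instruction prompt (the format used by the Dolly 2.0 repo)."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )

def load_generator():
    """Load the model; calling this downloads the ~24 GB checkpoint."""
    import torch
    from transformers import pipeline

    return pipeline(
        model="databricks/dolly-v2-12b",
        torch_dtype=torch.bfloat16,
        trust_remote_code=True,  # uses the model's bundled instruction pipeline
        device_map="auto",
    )
```

Usage, once the checkpoint is available: `generate = load_generator()` and then `generate("Summarize the benefits of open-source LLM licenses.")`. With `trust_remote_code=True` the bundled pipeline applies the instruction format itself; `build_prompt` is only needed when driving the raw model directly.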

Model Specs

Released: 2023-04-12
Parameters: 12B
Architecture: Decoder-only