## Pricing
| Type | Price (per 1M) |
|---|---|
| Input tokens | $0.10 |
| Output tokens | $0.50 |
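As a quick illustration of how the per-million-token rates above translate into a request cost, here is a small sketch; the function name and the example token counts are illustrative, not part of the source.

```python
# Per-million-token rates from the pricing table above.
INPUT_RATE_PER_M = 0.10   # USD per 1M input tokens
OUTPUT_RATE_PER_M = 0.50  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of a single request."""
    return (input_tokens / 1_000_000) * INPUT_RATE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_RATE_PER_M

# Example: a request with 2M input tokens and 500k output tokens
# costs 2 * $0.10 + 0.5 * $0.50 = $0.45.
print(request_cost(2_000_000, 500_000))
```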
## Capabilities
Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution
## About Dolly 2.0 12B
Dolly 2.0 12B is an instruction-following large language model developed by Databricks. It is built on the Pythia-12b architecture, has 12 billion parameters, and was fine-tuned on approximately 15,000 instruction-response pairs written by Databricks employees. The model handles tasks such as brainstorming, classification, open and closed question answering, text generation, information extraction, and summarization. It is not state-of-the-art, however, and struggles with complex prompts, programming, mathematical tasks, factual accuracy, and nuanced tasks like humor. It is notable for its open-source license, which permits commercial use, though it may carry biases from its training data.
## Model Specs
| Spec | Value |
|---|---|
| Released | 2023-04-12 |
| Parameters | 12B |
| Architecture | Decoder-only |