
Falcon 7B on Azure OpenAI

Falcon · Technology Innovation Institute (TII)

Provisioned

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.52
Output tokens   $0.67
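As a minimal sketch of how these per-1M-token rates translate into a per-request cost, the helper below (a hypothetical function, not part of any Azure SDK) multiplies token counts by the listed prices:

```python
def request_cost_usd(input_tokens: int, output_tokens: int,
                     input_price_per_m: float = 0.52,
                     output_price_per_m: float = 0.67) -> float:
    """Estimate one request's cost at the listed per-1M-token rates."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# Example: 1,200 input tokens and 300 output tokens
print(f"${request_cost_usd(1200, 300):.6f}")  # $0.000825
```

Output tokens are priced higher than input tokens, so long completions dominate the bill for generation-heavy workloads.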

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Falcon 7B

Falcon-7B, developed by the Technology Innovation Institute (TII), is a decoder-only large language model with 7 billion parameters. It was trained on 1,500 billion tokens from RefinedWeb, a curated web dataset, which underpins its strong performance on language tasks. The architecture uses FlashAttention and multiquery attention to improve speed and reduce memory usage, and its 32 layers with rotary positional embeddings handle sequence lengths of up to 2,048 tokens. Commonly applied to text generation, summarization, translation, and conversational AI, Falcon-7B is open-source under the Apache 2.0 license and can run on consumer hardware, requiring at least 16 GB of memory for inference.
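The multiquery attention mentioned above can be sketched in a few lines: unlike standard multi-head attention, every query head shares a single key/value head, which shrinks the KV cache during inference. This is an illustrative NumPy sketch of the general technique, not Falcon's actual implementation (shapes, masking value, and sizes here are assumptions for the example):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multiquery_attention(q, k, v):
    """Multiquery attention: q has shape (heads, seq, d); k and v have
    shape (seq, d) and are shared by all query heads."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)            # (heads, seq, seq)
    # Causal mask: each position attends only to itself and earlier tokens.
    mask = np.triu(np.ones(scores.shape[-2:], dtype=bool), k=1)
    scores = np.where(mask, -1e9, scores)
    return softmax(scores) @ v               # (heads, seq, d)

rng = np.random.default_rng(0)
heads, seq, d = 4, 5, 8
out = multiquery_attention(rng.normal(size=(heads, seq, d)),
                           rng.normal(size=(seq, d)),   # one shared K
                           rng.normal(size=(seq, d)))   # one shared V
print(out.shape)  # (4, 5, 8)
```

With h query heads, the shared K/V reduces cached key/value tensors by a factor of h relative to full multi-head attention, at a modest quality cost.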

Model Specs

Released        2023-11-28
Parameters      7B
Architecture    Decoder-only
