LLM Reference

Falcon 7B

About

Falcon-7B, developed by the Technology Innovation Institute (TII), is a decoder-only large language model with 7 billion parameters. It was trained on 1,500 billion tokens of RefinedWeb, a curated web dataset, which improves its performance on language tasks. The model uses FlashAttention and multiquery attention to reduce inference time and memory usage, and with 32 layers and rotary positional embeddings it handles sequence lengths of up to 2048 tokens efficiently. Commonly used for text generation, summarization, translation, and conversational AI, Falcon-7B is open-source under the Apache 2.0 license and can run on consumer hardware, requiring at least 16 GB of memory for inference.
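As a rough sanity check on the 16 GB figure, the model weights alone at half precision occupy about 14 GB; activations and the KV cache account for the rest. A minimal back-of-envelope sketch (the 2-bytes-per-parameter assumption corresponds to fp16/bf16 weights):

```python
# Back-of-envelope memory estimate for Falcon-7B inference.
PARAMS = 7_000_000_000   # 7B parameters
BYTES_PER_PARAM = 2      # fp16/bf16: 2 bytes per parameter

weight_bytes = PARAMS * BYTES_PER_PARAM
weight_gb = weight_bytes / 1e9   # decimal gigabytes

print(f"weights alone: {weight_gb:.0f} GB")
# Activations and the KV cache add a few more GB on top of the
# weights, which is why ~16 GB is quoted as a practical minimum.
```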

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Providers (4)

Provider                 Input (per 1M)   Output (per 1M)   Type
Microsoft Foundry        $0.52            $0.67             Provisioned
GCP Vertex AI            -                -                 Serverless
Cloudflare Workers AI    -                -                 Serverless
Alibaba Cloud PAI-EAS    -                -                 Serverless
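Given the per-million-token prices above, the cost of a request follows directly from its token counts. A small sketch using the Microsoft Foundry rates from the table (the function name and defaults are illustrative):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float = 0.52,
                 output_price_per_m: float = 0.67) -> float:
    """Return USD cost given token counts and per-1M-token prices."""
    return (input_tokens / 1e6) * input_price_per_m \
         + (output_tokens / 1e6) * output_price_per_m

# e.g. a 1M-input / 500k-output workload at Microsoft Foundry rates:
print(round(request_cost(1_000_000, 500_000), 3))  # 0.855
```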


Specifications

Family: Falcon
Released: 2023-11-28
Parameters: 7B
Architecture: Decoder Only
Specialization: general
Training: fine-tuning
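The rotary positional embeddings mentioned in the About section encode position by rotating consecutive pairs of query/key dimensions, so attention scores depend only on the relative distance between tokens. A minimal pure-Python sketch (the dimension-pairing convention here is one common variant, not Falcon's exact implementation):

```python
import math

def rope(x: list[float], pos: int, theta: float = 10000.0) -> list[float]:
    """Rotate consecutive dimension pairs of x by a position-dependent angle."""
    d = len(x)  # must be even
    out = []
    for i in range(0, d, 2):
        angle = pos * theta ** (-i / d)  # lower frequency for later pairs
        c, s = math.cos(angle), math.sin(angle)
        out.append(x[i] * c - x[i + 1] * s)
        out.append(x[i] * s + x[i + 1] * c)
    return out

# Relative-position property: the dot product of a rotated query and key
# depends only on the distance between their positions, not the absolute ones.
q, k = [0.3, -1.2, 0.7, 0.5], [1.1, 0.4, -0.6, 0.9]
dot = lambda a, b: sum(u * v for u, v in zip(a, b))
print(abs(dot(rope(q, 3), rope(k, 7)) - dot(rope(q, 10), rope(k, 14))) < 1e-9)  # True
```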

Created by

Technology Innovation Institute (TII)
Innovative open-source AI for global impact

Abu Dhabi, United Arab Emirates
Founded 2019