LLM Reference

Falcon 180B

About

Falcon 180B is a large language model developed by the Technology Innovation Institute (TII) in the UAE. With 180 billion parameters trained on a dataset of 3.5 trillion tokens, it ranks among the top-performing open LLMs. The model performs well on natural language tasks such as reasoning, coding, and knowledge retrieval, with results comparable to leading closed-source models such as Google's PaLM 2 Large. Falcon 180B is publicly available for research and commercial use under a modified Apache 2.0 license, but deployment requires significant computational resources. Architectural enhancements such as multi-query and multi-group attention improve inference efficiency. The model weights are distributed via platforms such as Hugging Face, enabling wider application integration.
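As a rough illustration of the deployment footprint mentioned above, the memory needed just to hold the weights can be estimated from parameter count and numeric precision. This is a back-of-the-envelope sketch; `weight_memory_gb` is a hypothetical helper, and real deployments also need memory for activations, the KV cache, and framework overhead:

```python
def weight_memory_gb(params_billions: float, bytes_per_param: int) -> float:
    """Estimate memory (in decimal GB) needed to hold model weights.

    1e9 parameters at 1 byte each = 1 GB, so billions x bytes gives GB.
    Ignores activations, KV cache, and framework overhead.
    """
    return params_billions * bytes_per_param

# Falcon 180B at common precisions (weights only):
print(weight_memory_gb(180, 4))  # float32: 720.0 GB
print(weight_memory_gb(180, 2))  # float16/bfloat16: 360.0 GB
print(weight_memory_gb(180, 1))  # int8 quantized: 180.0 GB
```

Even at 8-bit precision the weights alone exceed the memory of any single accelerator, which is one reason the model is typically accessed through hosted serverless endpoints rather than run locally.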

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (2)

Provider                   Input (per 1M)   Output (per 1M)   Type
Alibaba Cloud PAI-EAS      —                —                 Serverless
Scale AI GenAI Platform    —                —                 Serverless

Specifications

Family: Falcon
Released: 2023-11-28
Parameters: 180B
Architecture: Decoder Only
Specialization: general