LLM Reference

Falcon 180B

About

Falcon 180B is a large language model developed by the Technology Innovation Institute (TII) in the UAE, with 180 billion parameters. Trained on a vast dataset of 3.5 trillion tokens, it ranks among the top-performing open LLMs worldwide. The model excels at natural language processing tasks such as reasoning, coding, and knowledge retrieval, with results comparable to leading closed-source models like Google's PaLM 2 Large. While Falcon 180B is publicly available and open for commercial use under a modified Apache 2.0 license, deploying it requires significant computational resources. Architectural enhancements such as multiquery and multigroup attention improve its inference efficiency. Falcon 180B is accessible via platforms such as Hugging Face, fostering collaboration and wider application integration.
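The multiquery attention mentioned above shares a single key/value head across all query heads, which shrinks the key/value cache during inference. A minimal pure-Python sketch of the idea (toy shapes and function names chosen for illustration, not Falcon's actual implementation):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def multiquery_attention(queries, keys, values):
    """Multi-query attention: every query head attends over the SAME
    single key/value head, so the KV cache holds one head instead of H.
    queries: H x T x d (H heads, T positions), keys/values: T x d (shared)."""
    d = len(keys[0])
    outputs = []
    for q_head in queries:            # one pass per query head
        head_out = []
        for q in q_head:              # one pass per query position
            scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                      for k in keys]  # scaled dot-product against shared keys
            w = softmax(scores)
            head_out.append([sum(wi * v[j] for wi, v in zip(w, values))
                             for j in range(d)])
        outputs.append(head_out)
    return outputs

# With identical keys the attention weights are uniform, so each head's
# output is the average of the values:
out = multiquery_attention(
    queries=[[[1.0, 0.0]], [[0.0, 1.0]]],  # 2 heads, 1 query each, d=2
    keys=[[0.0, 0.0], [0.0, 0.0]],
    values=[[1.0, 0.0], [3.0, 2.0]],
)
print(out[0][0])  # → [2.0, 1.0]
```

In standard multi-head attention, keys and values would also be H x T x d; collapsing them to a single shared head is what cuts the cache size by a factor of H.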

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Providers (2)

Provider                   Input (per 1M)   Output (per 1M)   Type
Alibaba Cloud PAI-EAS      n/a              n/a               Serverless
Scale AI GenAI Platform    n/a              n/a               Serverless
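Per-1M-token pricing, as in the table's Input and Output columns, bills input and output tokens at separate rates. A small illustrative helper (the rates in the example are hypothetical; none are listed for Falcon 180B):

```python
def token_cost(input_tokens: int, output_tokens: int,
               in_rate_per_1m: float, out_rate_per_1m: float) -> float:
    """Cost under per-1M-token pricing: each side is billed at its own
    rate per million tokens. Rates here are placeholders, not real prices."""
    return (input_tokens / 1_000_000) * in_rate_per_1m \
         + (output_tokens / 1_000_000) * out_rate_per_1m

# Hypothetical rates: $2.00 per 1M input tokens, $6.00 per 1M output tokens.
cost = token_cost(250_000, 50_000, 2.00, 6.00)
print(f"${cost:.2f}")  # → $0.80
```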

Benchmark Scores (4)

Benchmark                                  Score   Version   Source
Google-Proof Q&A                           58.9    diamond   Open LLM Leaderboard
HellaSwag                                  92.7    10-shot   Open LLM Leaderboard
HumanEval                                  85.1    pass@1    Open LLM Leaderboard
Massive Multitask Language Understanding   84.2    5-shot    Open LLM Leaderboard
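The HumanEval score above is reported as pass@1: the probability that a single sampled completion passes the problem's unit tests. It is commonly computed with the unbiased estimator introduced alongside HumanEval, sketched here:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: probability that at least one of k
    samples passes, given n generations per problem of which c passed.
    pass@k = 1 - C(n - c, k) / C(n, k)."""
    if n - c < k:
        return 1.0  # fewer failures than draws: some draw must pass
    return 1.0 - comb(n - c, k) / comb(n, k)

# 10 generations for a problem, 4 of them correct:
print(pass_at_k(10, 4, 1))  # pass@1 reduces to c/n = 0.4
```

Per-problem estimates are then averaged over the benchmark to give the reported score; for k = 1 the estimator reduces to the fraction of passing samples.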

Specifications

Family: Falcon
Released: 2023-11-28
Parameters: 180B
Architecture: Decoder Only
Specialization: General
Training: Fine-tuning

Created by

Technology Innovation Institute (TII)
Innovative open-source AI for global impact

Abu Dhabi, United Arab Emirates
Founded 2019