LLM Reference

Phi-2

Open Source

About

Phi-2 is a compact 2.7-billion-parameter language model from Microsoft and part of its Phi series. It shows strong reasoning and language-understanding capabilities, matching or outperforming models with up to 25 times more parameters. Phi-2 was trained on a diverse dataset of 1.4 trillion tokens, combining high-quality synthetic data with curated web content to strengthen its common-sense reasoning and general knowledge. Notably, despite not being fine-tuned via reinforcement learning from human feedback (RLHF), it exhibits improved safety behavior and reduced bias, making it a useful asset for natural language processing research and development.

Phi-2 input tokens are priced at $0.05 per 1M, output tokens at $0.25 per 1M.
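The per-token arithmetic is straightforward; a minimal sketch, assuming the rates quoted above ($0.05 per 1M input tokens, $0.25 per 1M output tokens) and a hypothetical `phi2_cost` helper:

```python
def phi2_cost(input_tokens: int, output_tokens: int,
              in_rate: float = 0.05, out_rate: float = 0.25) -> float:
    """Return the USD cost of a request; rates are per 1M tokens."""
    return (input_tokens / 1_000_000) * in_rate + (output_tokens / 1_000_000) * out_rate

# e.g. 200k input tokens and 50k output tokens:
print(round(phi2_cost(200_000, 50_000), 4))  # 0.01 + 0.0125 = 0.0225 USD
```

Output tokens are five times more expensive than input tokens at these rates, so output-heavy workloads dominate the bill.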

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Providers(5)

Provider              | Input (per 1M) | Output (per 1M) | Type
Microsoft Foundry     | $0.07          | $0.07           | Provisioned
Cloudflare Workers AI |                |                 | Serverless
Together AI           | $0.10          | $0.10           | Serverless
Fireworks AI          | $0.10          | $0.10           | Provisioned
Replicate API         | $0.05          | $0.25           | Serverless
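Because input and output rates differ per provider, the cheapest choice depends on the traffic mix. A minimal comparison sketch using the rates from the table above (Cloudflare Workers AI omitted, as no price is listed), assuming an equal 1M input + 1M output workload:

```python
# Per-1M rates (input, output) in USD, taken from the provider table above.
providers = {
    "Microsoft Foundry": (0.07, 0.07),
    "Together AI": (0.10, 0.10),
    "Fireworks AI": (0.10, 0.10),
    "Replicate API": (0.05, 0.25),
}

# Cost of processing 1M input tokens plus 1M output tokens with each provider.
costs = {name: in_rate + out_rate for name, (in_rate, out_rate) in providers.items()}
cheapest = min(costs, key=costs.get)
print(cheapest, round(costs[cheapest], 2))
```

For this balanced workload Microsoft Foundry comes out cheapest, but an input-heavy workload would favor Replicate's $0.05 input rate instead.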

Benchmark Scores(4)

Benchmark                                 | Score | Version | Source
Google-Proof Q&A                          | 41.2  | diamond | research
HellaSwag                                 | 85.9  | 10-shot | research
HumanEval                                 | 59.7  | pass@1  | research
Massive Multitask Language Understanding  | 68.3  | 5-shot  | research

Specifications

Family: Phi-2
Released: 2023-12-12
Parameters: 2.7B
Architecture: Decoder Only
Specialization: general
Training: finetuned

Created by

Microsoft Research
Advancing the state-of-the-art in AI and computing.
Redmond, Washington, United States
Founded 1991