LLM Reference
Microsoft Foundry

Phi-2 on Microsoft Foundry

Phi-2 · Microsoft Research

Provisioned · Open Source

Compare Phi-2 Across Providers

Provider                Input (per 1M)   Output (per 1M)
Microsoft Foundry       $0.07            $0.07
Cloudflare Workers AI   n/a              n/a
Together AI             $0.10            $0.10
Fireworks AI            $0.10            $0.10
Replicate API           $0.05            $0.25
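The table above shows that providers split pricing differently between input and output tokens, so the cheapest choice depends on your token mix. A minimal sketch of that comparison, using only the rates listed above (Cloudflare Workers AI is omitted because the table lists no prices for it):

```python
# Per-1M-token (input, output) prices in USD, taken from the table above.
PRICES = {
    "Microsoft Foundry": (0.07, 0.07),
    "Together AI": (0.10, 0.10),
    "Fireworks AI": (0.10, 0.10),
    "Replicate API": (0.05, 0.25),
}

def cost(provider: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of a workload on one provider."""
    in_rate, out_rate = PRICES[provider]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

def cheapest(input_tokens: int, output_tokens: int) -> str:
    """Provider with the lowest estimated cost for this token mix."""
    return min(PRICES, key=lambda p: cost(p, input_tokens, output_tokens))
```

For a balanced workload (equal input and output tokens) Microsoft Foundry's flat $0.07 rate wins, while for input-heavy workloads Replicate's $0.05 input rate can come out ahead despite its higher output price.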

Pricing

Type            Price (per 1M)
Input tokens    $0.07
Output tokens   $0.07
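Since Phi-2 on Microsoft Foundry charges the same rate for input and output tokens, a per-request cost estimate reduces to total tokens times the per-token rate. A minimal sketch, assuming the $0.07-per-1M rate from the table above:

```python
RATE_PER_1M = 0.07  # USD per 1M tokens, input and output alike (from the table above)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request on Microsoft Foundry's Phi-2 pricing."""
    # Input and output rates are identical, so only the total token count matters.
    return (input_tokens + output_tokens) * RATE_PER_1M / 1_000_000
```

For example, a request with 1,000 input tokens and 500 output tokens costs 1,500 × $0.07 / 1M ≈ $0.000105.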

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

About Phi-2

Phi-2 is a compact 2.7-billion-parameter language model from Microsoft and part of its Phi series. It demonstrates strong reasoning and language-understanding capabilities, outperforming much larger models, including some with up to 25 times more parameters. Phi-2 was trained on 1.4 trillion tokens of diverse data, combining high-quality synthetic data with curated web content to strengthen its common-sense reasoning and general knowledge. Notably, although it was not fine-tuned with reinforcement learning from human feedback (RLHF), it exhibits improved safety behavior and reduced bias, making it a useful model for natural language processing research and development.

Get Started