LLM Reference
NVIDIA NIM

Phi-3 Mini 128K on NVIDIA NIM

Phi-3 · Microsoft Research

Provisioned

Pricing

Type            Price (per 1M)
Input tokens    Free
Output tokens   Free

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Phi-3 Mini 128K

Phi-3 Mini-128K-Instruct, developed by Microsoft, is a lightweight, open-source language model with 3.8 billion parameters. Despite its modest size, it excels at reasoning tasks, particularly math and logic, and shows strong code generation capability. Its standout feature is a 128,000-token context window, which lets it process long documents and codebases efficiently. While it has limitations in factual knowledge and focuses primarily on English, it strikes a balance between performance and efficiency that makes it well suited to resource-constrained environments. The model is available on platforms such as Azure AI Studio and Hugging Face, and was trained on high-quality synthetic and publicly available data, with fine-tuning to improve instruction adherence and safety.

Get Started
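
As a starting point, the sketch below calls the model through an OpenAI-compatible chat-completions endpoint, the interface NVIDIA NIM typically exposes. The endpoint URL, the model ID `microsoft/phi-3-mini-128k-instruct`, and the `NVIDIA_API_KEY` environment variable are assumptions; verify all three against the current NIM documentation before use.

```python
import json
import os
import urllib.request

# Assumed values -- confirm against the NVIDIA NIM docs for this model.
NIM_ENDPOINT = "https://integrate.api.nvidia.com/v1/chat/completions"
MODEL_ID = "microsoft/phi-3-mini-128k-instruct"


def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-style chat-completions payload."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.2,
    }


def complete(prompt: str, api_key: str) -> str:
    """POST the payload and return the assistant's reply text."""
    req = urllib.request.Request(
        NIM_ENDPOINT,
        data=json.dumps(build_request(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    key = os.environ.get("NVIDIA_API_KEY")  # assumed variable name
    if key:
        print(complete("Explain the 128K context window in one sentence.", key))
```

Because the endpoint follows the OpenAI chat-completions shape, the official `openai` client can be pointed at the same base URL instead of hand-rolling the request.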

Model Specs

Released        2024-04-23
Parameters      3.8B
Context         128K
Architecture    Decoder Only