LLM Reference
GroqCloud

Llama 4 Scout 17B-16E Instruct on GroqCloud

Llama 4 · AI at Meta

Serverless · Open Source

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.11
Output tokens   $0.34
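At these rates, the cost of a request is a linear function of its token counts. A minimal sketch in Python (the rates are the published ones above; the example workload is illustrative):

```python
# Published GroqCloud rates for Llama 4 Scout 17B-16E Instruct (USD per 1M tokens).
INPUT_PRICE = 0.11
OUTPUT_PRICE = 0.34

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of a single request in USD."""
    return (input_tokens / 1_000_000) * INPUT_PRICE \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE

# Example: 2M input tokens + 1M output tokens -> $0.22 + $0.34 = $0.56.
print(round(request_cost(2_000_000, 1_000_000), 2))
```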

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution
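GroqCloud exposes an OpenAI-compatible chat-completions API, so JSON mode and tool use are requested through the standard `response_format` and `tools` fields. A sketch of such a request body follows; the model id and exact field support are assumptions to verify against the current GroqCloud docs:

```python
# Sketch of an OpenAI-compatible chat-completions request body exercising
# JSON mode and tool use. The model id is an assumption; check GroqCloud docs.
payload = {
    "model": "meta-llama/llama-4-scout-17b-16e-instruct",
    "messages": [
        {"role": "system", "content": "Reply in JSON."},
        {"role": "user", "content": "What's the weather in Oslo?"},
    ],
    # JSON mode: constrains the model to emit valid JSON.
    "response_format": {"type": "json_object"},
    # Tool use: declare a function the model may choose to call.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}
print(payload["response_format"]["type"])
```

Posting this body to the chat-completions endpoint (with an API key) would return either a JSON answer or a `get_weather` tool call for the application to execute.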

About Llama 4 Scout 17B-16E Instruct

Meta's Llama 4 Scout is a mixture-of-experts model with 17 billion active parameters routed across 16 experts. It is optimized for efficient inference in edge and cloud environments, with strong multi-turn conversation capabilities. Available on GroqCloud.


Model Specs

Released        2025-04-05
Parameters      17B (active)
Context         128K
Architecture    Mixture of Experts
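The mixture-of-experts architecture is why only 17B of the model's parameters are active per token: a learned router scores all 16 experts and forwards each token to the best-scoring one(s). A toy top-1 router sketch (the routing policy and scores are illustrative, not Meta's implementation):

```python
import math
import random

NUM_EXPERTS = 16  # Scout's expert count; everything else here is illustrative.

def softmax(scores):
    """Convert raw router scores into a probability distribution over experts."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def route(token_scores):
    """Top-1 routing: send the token to the highest-probability expert."""
    probs = softmax(token_scores)
    best = max(range(NUM_EXPERTS), key=lambda i: probs[i])
    return best, probs[best]

random.seed(0)
scores = [random.gauss(0, 1) for _ in range(NUM_EXPERTS)]  # fake router logits
expert, weight = route(scores)
print(expert)  # index of the selected expert, in 0..15
```

In a real MoE layer the selected expert's feed-forward block processes the token and its output is scaled by the routing weight, so per-token compute stays near that of a dense 17B model.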