LLM Reference
FriendliAI

Inference Platform · Tier 3
Platform Overview

FriendliAI offers serverless inference endpoints for LLMs with optimized token generation speeds and reduced inference latency. The platform supports various model formats and provides autoscaling, batch processing, and multi-model serving capabilities.
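As a concrete illustration, serverless endpoints like these are typically called over an OpenAI-style chat-completions HTTP API. The sketch below builds such a request payload; the base URL, model name, and token environment variable are assumptions for illustration, not confirmed details of FriendliAI's API — consult the platform's documentation for the actual values.

```python
import json

# Hypothetical base URL -- check the platform's docs for the real endpoint.
BASE_URL = "https://api.friendli.ai/serverless/v1"

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Hypothetical model identifier for illustration only.
payload = build_chat_request("meta-llama-3.1-8b-instruct", "Hello!")

# To actually send it (requires an API key and the `requests` package):
#   import os, requests
#   resp = requests.post(f"{BASE_URL}/chat/completions",
#                        headers={"Authorization": f"Bearer {os.environ['FRIENDLI_TOKEN']}"},
#                        json=payload)
print(json.dumps(payload, indent=2))
```

Because the payload shape follows the widely used chat-completions convention, the same request structure works with most OpenAI-compatible serving platforms by changing only the base URL and model name.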

Platform Details

Type: Inference Platform
Tier: Tier 3
Models: 0

Organization

FriendliAI
Founded: 2022
Seoul, South Korea

FriendliAI develops high-performance inference engines and serverless endpoints for serving large language models efficiently at scale with low latency.
