LLM Reference

CoreWeave

CoreWeave Inc

Inference

Platform

CoreWeave provides GPU cloud infrastructure for AI workloads, including inference-heavy applications. The platform offers GPU instances from NVIDIA and AMD across global data centers with hourly billing, enabling users to deploy and run their own LLM inference workloads using frameworks such as vLLM. CoreWeave does not host pre-trained LLM models itself.

About CoreWeave Inc

CoreWeave is a cloud infrastructure company headquartered in Livingston, New Jersey. The company provides a specialized GPU cloud platform with the tagline "The Essential Cloud for AI™", offering GPU instances, AI storage, and infrastructure optimized for deploying AI workloads at scale.

Company Info

Founded: 2017
Headquarters: Livingston, New Jersey, USA