LLM Reference

LTM-2-mini

ltm-2-mini

Proprietary · Coding

About

LTM-2-mini is Magic's research prototype supporting a 100 million token context window, announced on August 29, 2024. It uses a novel sequence-dimension algorithm that is roughly 1,000× more memory-efficient than transformer attention at this scale, requiring only a fraction of a single H100's HBM, versus the 638 H100s that Llama 3.1 405B would require at the same context length. The model has not been publicly released for API access or self-hosting; Magic stated it was separately training a full LTM-2 model. Specialization: coding/software development. Source: https://magic.dev/blog/100m-token-context-windows

LTM-2-mini has a 100M-token context window.
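To see where the 638-H100 comparison for Llama 3.1 405B plausibly comes from, here is a back-of-the-envelope sketch. This is my own arithmetic, not Magic's published calculation: it assumes the dominant cost is the bf16 KV cache and uses Llama 3.1 405B's published architecture (126 layers, grouped-query attention with 8 KV heads, head dimension 128) and 80 GB of HBM per H100.

```python
# Back-of-the-envelope KV-cache sizing for Llama 3.1 405B at a
# 100M-token context. Architecture numbers are Llama 3.1 405B's
# published specs; the 638-H100 comparison is Magic's figure.

LAYERS = 126            # transformer layers in Llama 3.1 405B
KV_HEADS = 8            # grouped-query attention: 8 key/value heads
HEAD_DIM = 128          # dimension per attention head
BYTES = 2               # bf16 storage per element
CONTEXT = 100_000_000   # 100M tokens
H100_HBM = 80e9         # 80 GB of HBM per H100

# Each token stores one key and one value vector per KV head per layer.
bytes_per_token = 2 * LAYERS * KV_HEADS * HEAD_DIM * BYTES  # 516,096 B

total_bytes = bytes_per_token * CONTEXT   # ~51.6 TB
h100s_needed = total_bytes / H100_HBM     # ~645 GPUs

print(f"KV cache per token: {bytes_per_token / 1e3:.0f} KB")
print(f"Total at 100M tokens: {total_bytes / 1e12:.1f} TB")
print(f"H100s (80 GB each) to hold it: {h100s_needed:.0f}")
```

This lands at roughly 645 H100s, in the same ballpark as the 638 the card quotes; the small gap likely comes from rounding or GB-vs-GiB conventions. The 1,000× claim then corresponds to LTM-2-mini needing on the order of one-thousandth of that memory, i.e. a fraction of a single H100's HBM.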

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution · Prompt Caching · Batch API · Audio · Fine-tuning

API Versions

ltm-2-mini


Specifications

Family: LTM
Released: 2024-08-29
Context: 100M tokens
Architecture: Decoder-only
Specialization: code
License: Proprietary
Training: pretrained

Created by

Magic
100M-token context window innovation

San Francisco, California, United States
Founded 2022
Website: https://magic.dev