ReMM SLERP L2 13B
About
ReMM SLERP L2 13B, developed by Undi95, is a large language model that recreates the original MythoMax-L2-13B with updated merging techniques. Using the SLERP (spherical linear interpolation) merge method, it combines foundational models such as Mythologic and Huginn to boost performance and versatility in natural language processing tasks. Built on the Llama architecture with 13 billion parameters and a 4096-token context window, it handles complex language scenarios effectively. Known for its text generation and instruction-following abilities, the model is distributed in various quantized formats for different hardware, though it still requires significant computational resources, and output quality varies with the quantization method used.
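The SLERP merge mentioned above interpolates between two models' weight tensors along the arc of a hypersphere rather than along a straight line. As a rough illustration only (the actual merge operates per-layer on full model checkpoints, typically via tooling such as mergekit), a minimal sketch of spherical linear interpolation on flattened weight vectors might look like this; the function name and parameters here are illustrative, not part of any published ReMM recipe:

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns a, t=1 returns b; intermediate t follows the great-circle
    arc between the two directions, preserving angular structure that a
    plain linear blend would flatten out.
    """
    a_norm = a / (np.linalg.norm(a) + eps)
    b_norm = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_norm, b_norm), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Vectors are nearly parallel: fall back to linear interpolation.
        return (1 - t) * a + t * b
    return (np.sin((1 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
```

In a real model merge, this interpolation is applied tensor-by-tensor across the two source checkpoints, often with a different `t` per layer group.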
Capabilities
Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution
Providers (2)

| Provider | Input (per 1M) | Output (per 1M) | Type |
|---|---|---|---|
| Together AI | $0.30 | $0.30 | Serverless |
| OpenRouter | $0.45 | $0.65 | Serverless |
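Since provider pricing is quoted per million tokens, the cost of a single request is a simple proportion. A small sketch (the function name is illustrative):

```python
def request_cost(input_tokens, output_tokens, input_price, output_price):
    """Estimate USD cost of one request, given per-1M-token prices."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example using the Together AI rates from the table above
# ($0.30 per 1M input tokens, $0.30 per 1M output tokens):
cost = request_cost(2000, 500, 0.30, 0.30)  # → 0.00075 USD
```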
Specifications

Family: Re:MythoMax
Released: 2023-12-20
Parameters: 13B
Architecture: Decoder Only
Specialization: general
Training: finetuning