LLM Reference

ReMM SLERP L2 13B

About

ReMM SLERP L2 13B, developed by Undi95, is a large language model that recreates the original MythoMax-L2-13B with an updated merge recipe. Using the SLERP (spherical linear interpolation) merge method, it combines foundational models such as MythoLogic and Huginn to improve performance and versatility in natural language processing tasks. Built on the Llama 2 architecture with 13 billion parameters, it handles complex language scenarios within a context length of 4,096 tokens. Known for its text generation and instruction-following ability, the model is distributed in various quantized formats for different hardware budgets, although it still requires significant computational resources and its output quality varies with the quantization method used.
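The SLERP merge mentioned above can be illustrated with a minimal sketch. The function below is a generic spherical linear interpolation over plain Python lists, not Undi95's actual merge script; real merges (e.g. with merge tooling) apply this tensor-by-tensor across two checkpoints, often with per-layer interpolation factors.

```python
import math

def slerp(v0, v1, t):
    """Spherical linear interpolation between two weight vectors.

    Toy illustration of the SLERP merge idea: interpolate along the
    great-circle arc between v0 and v1 instead of along a straight line.
    t = 0 returns v0, t = 1 returns v1.
    """
    # Angle between the two vectors.
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    cos_omega = max(-1.0, min(1.0, dot / (norm0 * norm1)))
    omega = math.acos(cos_omega)

    if omega < 1e-6:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]

    # Standard SLERP coefficients.
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

For example, interpolating halfway between the orthogonal vectors [1, 0] and [0, 1] lands on the arc midpoint [0.7071..., 0.7071...], whereas plain linear interpolation would give [0.5, 0.5] and shrink the vector's norm.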

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Providers (2)

Provider      Input (per 1M)   Output (per 1M)   Type
Together AI   $0.30            $0.30             Serverless
OpenRouter    $0.45            $0.65             Serverless
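Given the per-1M-token prices in the table, the cost of a single request is easy to estimate. The helper below is an illustrative sketch (the function name and token counts are assumptions; the rates are the Together AI row from the table):

```python
def request_cost(input_tokens, output_tokens, in_price_per_1m, out_price_per_1m):
    """Estimate USD cost of one request from per-1M-token prices."""
    return (input_tokens / 1_000_000 * in_price_per_1m
            + output_tokens / 1_000_000 * out_price_per_1m)

# Together AI rates from the table: $0.30 input / $0.30 output per 1M tokens.
# A hypothetical request with a 3,000-token prompt and 1,000-token completion:
cost = request_cost(3000, 1000, 0.30, 0.30)  # → 0.0012 USD
```

At OpenRouter's rates ($0.45 / $0.65) the same request would cost a bit more, since output tokens are billed at a higher rate than input tokens there.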

Specifications

Released: 2023-12-20
Parameters: 13B
Architecture: Decoder-only
Specialization: general
Training: fine-tuning

Created by

Undi95, known for open source collaboration in AI research.
