LLM Reference

ReMM SLERP L2 13B

About

ReMM SLERP L2 13B, developed by Undi95, is a large language model that recreates the original MythoMax-L2-13B with an updated merge recipe. Using the SLERP (spherical linear interpolation) merge method, it combines foundational models such as MythoLogic and Huginn to boost performance and versatility in natural language processing tasks. Built on the Llama architecture with 13 billion parameters and a 4096-token context window, it handles complex language scenarios effectively. Known for strong text generation and instruction following, the model is distributed in various quantized formats for different hardware budgets, though it still requires significant computational resources, and output quality varies with the quantization method used.
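The SLERP merge mentioned above can be sketched as follows. This is a minimal illustration of spherical linear interpolation applied to a pair of weight tensors, not Undi95's actual merge script; the function name, parameters, and per-tensor treatment are assumptions for the sketch.

```python
import numpy as np

def slerp(t, w0, w1, eps=1e-8):
    """Spherically interpolate between two weight tensors at fraction t in [0, 1]."""
    # Compute the angle between the two tensors, treated as flat vectors.
    v0 = w0.flatten() / (np.linalg.norm(w0) + eps)
    v1 = w1.flatten() / (np.linalg.norm(w1) + eps)
    dot = np.clip(np.dot(v0, v1), -1.0, 1.0)
    omega = np.arccos(dot)
    # Nearly parallel tensors: fall back to plain linear interpolation.
    if omega < eps:
        return (1 - t) * w0 + t * w1
    so = np.sin(omega)
    # Standard SLERP formula: weights follow the great-circle arc between the endpoints.
    return (np.sin((1 - t) * omega) / so) * w0 + (np.sin(t * omega) / so) * w1
```

In a real merge this interpolation would be applied tensor-by-tensor across both checkpoints, often with a different `t` per layer; the sketch shows only the core operation.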

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Providers (1)

Provider          Input (per 1M)   Output (per 1M)   Type
Together AI API   $0.30            $0.30             Serverless
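Per-million-token pricing like the table above translates to request cost as follows; this is a generic helper for illustration (the function name and default rates mirror the Together AI API row, and are not an official billing API):

```python
def request_cost(input_tokens, output_tokens, input_rate=0.30, output_rate=0.30):
    """Estimate cost in USD given token counts and per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_rate + (output_tokens / 1_000_000) * output_rate
```

For example, a request that fills the 4096-token context and generates 1000 tokens costs `request_cost(4096, 1000)`, roughly $0.0015 at these rates.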

Specifications

Parameters: 13B
Architecture: Decoder Only
Specialization: General