LLM Reference

ReMM SLERP L2 13B on Together AI

Re:MythoMax · Undi95

Serverless

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.30
Output tokens   $0.30
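Since input and output tokens are billed at the same rate, estimating a request's cost is a simple per-token multiplication. A minimal sketch (the function name and the example token counts are illustrative, not part of Together's API):

```python
# Estimate request cost from the per-1M-token prices listed above.
INPUT_PRICE = 0.30 / 1_000_000   # USD per input token
OUTPUT_PRICE = 0.30 / 1_000_000  # USD per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request."""
    return input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE

# A 1,000-token prompt with a 500-token reply costs about $0.00045.
cost = estimate_cost(1000, 500)
```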

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About ReMM SLERP L2 13B

ReMM SLERP L2 13B, developed by Undi95, is a large language model that recreates the original MythoMax-L2-13B with updated components and merge techniques. It uses the SLERP (spherical linear interpolation) merge method to combine parent models such as Mythologic and Huginn, improving performance and versatility on natural language processing tasks. Built on the Llama architecture with 13 billion parameters and a 4096-token context window, it is known for strong text generation and instruction following. The model is distributed in various quantized formats for running on constrained hardware, though it still requires significant computational resources, and output quality varies with the quantization method used.
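The SLERP merge the description credits can be illustrated on small vectors. In a real model merge the interpolation is applied per weight tensor between the two parent checkpoints; this sketch just shows the formula (the function and example values are illustrative, not the merge tooling actually used):

```python
# Illustrative SLERP (spherical linear interpolation) between two vectors,
# the interpolation scheme used to blend parent model weights.
import math

def slerp(t: float, v0: list, v1: list) -> list:
    """Interpolate between v0 and v1 along the sphere at fraction t."""
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    cos_theta = max(-1.0, min(1.0, dot / (norm0 * norm1)))
    theta = math.acos(cos_theta)
    if theta < 1e-6:  # nearly parallel vectors: fall back to linear interp
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Unlike linear interpolation, the midpoint of two orthogonal unit
# vectors stays on the unit sphere, preserving the weights' magnitude.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

Preserving vector magnitude is the usual motivation for SLERP over plain weight averaging, which shrinks the norm of the blended weights.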

Get Started
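A minimal sketch of calling this model through Together AI's OpenAI-compatible chat completions endpoint, using only the standard library. The endpoint path and the model ID string (`Undi95/ReMM-SLERP-L2-13B`) are assumptions based on Together's published API conventions; check the dashboard for the exact identifier, and set `TOGETHER_API_KEY` in your environment before running:

```python
# Sketch: single-turn chat completion against Together AI's serverless API.
# The endpoint URL and model ID are assumptions; verify them in your account.
import json
import os
import urllib.request

API_URL = "https://api.together.xyz/v1/chat/completions"  # assumed endpoint

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build the JSON payload for a single-turn chat completion."""
    return {
        "model": "Undi95/ReMM-SLERP-L2-13B",  # assumed model ID
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def complete(prompt: str) -> str:
    """POST the request and return the assistant's reply text."""
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(complete("Write a two-line poem about llamas."))
```

Because the endpoint follows the OpenAI chat schema, any OpenAI-compatible client library should also work by pointing its base URL at Together's API.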

Model Specs

Released        2023-12-20
Parameters      13B
Architecture    Decoder Only