LLM Reference

MythoLogic Mini 7B

About

MythoLogic Mini 7B is a large language model for text generation and roleplay built on the Llama-2 architecture. With 7 billion parameters, it combines a Nous Hermes-2 core with enhancements from Stable Beluga and a distilled Kimiko LoRA to improve its linguistic and roleplaying abilities. It supports quantization formats such as GGUF, AWQ, and GPTQ, running efficiently on mid-range GPUs with as little as 3.9 GB of VRAM. The model is known for generating coherent narratives and simulating character interactions, excelling at creative writing and conversational AI, particularly with prompts in the Alpaca format. It handles inputs of up to 4096 tokens, maintaining context over extended dialogues or stories.
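Since the model works best with Alpaca-style prompts, the following sketch shows one common way to build such a prompt in Python. The helper name and the GGUF file path are illustrative, not part of any official API; the commented loading code assumes the llama-cpp-python library.

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format a request in the Alpaca style the model expects."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Describe a mist-covered harbor at dawn.")

# Running a GGUF quantization locally (file name is hypothetical):
# from llama_cpp import Llama
# llm = Llama(model_path="mythologic-mini-7b.Q4_K_M.gguf", n_ctx=4096)
# out = llm(prompt, max_tokens=256, stop=["### Instruction:"])
# print(out["choices"][0]["text"])
```

The `n_ctx=4096` setting matches the model's stated context window.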

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Specifications

Family: Mytho
Parameters: 7B
Architecture: Decoder Only
Specialization: General