Toppy M 7B on Together AI

Toppy · Undi95

Serverless

Pricing

Input tokens: $0.20 per 1M
Output tokens: $0.20 per 1M
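At $0.20 per million tokens for both input and output, the cost of a request is a simple linear function of the token counts. A quick sketch (the function name and example counts are illustrative, not part of the Together AI API):

```python
# Listed rates for Toppy M 7B on Together AI: $0.20 per 1M tokens,
# the same for input and output.
INPUT_PRICE = 0.20 / 1_000_000   # USD per input token
OUTPUT_PRICE = 0.20 / 1_000_000  # USD per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed rates."""
    return input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE

# e.g. a 3,000-token prompt with a 1,000-token reply costs about $0.0008.
```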

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Toppy M 7B

Toppy M 7B is a 7-billion-parameter large language model created by Undi95, built for general natural language processing tasks such as real-time decision-making in AI-driven systems and dynamic content generation, and compatible with leading AI development tools and platforms. The model supports enhanced tokenization and effective handling of special tokens. It is distributed in several quantization formats, including GGUF, which trade model size and memory requirements against output quality, so users should pick the quantization method that fits their hardware; even then, memory and processing power can be limiting. Notably, its Hugging Face repository is tagged "Not-For-All-Audiences" due to potentially sensitive content, so biased or inappropriate outputs are possible.

Get Started
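Serverless models on Together AI are typically reachable through the platform's OpenAI-compatible chat completions endpoint. A minimal sketch using only the standard library; the model identifier `Undi95/Toppy-M-7B` and the endpoint URL are assumptions based on Together's usual conventions, so verify both in your dashboard:

```python
import json
import os
import urllib.request

# Assumed model identifier and endpoint; confirm in the Together AI dashboard.
MODEL_ID = "Undi95/Toppy-M-7B"
API_URL = "https://api.together.xyz/v1/chat/completions"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def complete(prompt: str) -> str:
    """Send one request; expects TOGETHER_API_KEY in the environment."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same payload shape works with any OpenAI-compatible client library by pointing its base URL at the Together endpoint.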

Model Specs

Released: 2023-12-20
Parameters: 7B
Context: 4K tokens
Architecture: Decoder-only
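With only a 4K-token context window, longer chat histories have to be trimmed before each request. A sketch of one common approach, dropping the oldest messages first; the ~4-characters-per-token estimate is a rough heuristic, not the model's actual tokenizer:

```python
CONTEXT_LIMIT = 4096  # Toppy M 7B's 4K-token context window

def rough_token_count(text: str) -> int:
    """Crude estimate: ~4 characters per token (an assumption,
    not the model's real tokenizer)."""
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], reserve: int = 512) -> list[dict]:
    """Drop the oldest messages until the estimated prompt fits in the
    context window, keeping `reserve` tokens free for the reply."""
    budget = CONTEXT_LIMIT - reserve
    kept: list[dict] = []
    total = 0
    # Walk newest-to-oldest so recent turns are preserved.
    for msg in reversed(messages):
        cost = rough_token_count(msg["content"])
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))
```

For production use, counting tokens with the model's own tokenizer gives a tighter fit than the character heuristic.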