
Toppy
About
The Toppy family of LLMs, built around the foundational model Toppy M 7B, is a collection of models produced through model merging. Toppy M 7B is a 7-billion-parameter model created by combining several source models, including openchat/openchat_3.5, NousResearch/Nous-Capybara-7B-V1.9, and HuggingFaceH4/zephyr-7b-beta, using the task_arithmetic merge method from mergekit; several LoRAs (Low-Rank Adaptations) are also folded in during the merge to extend the model's capabilities. The model offers multilingual ability and code generation, making it versatile across applications. The family also includes multiple quantized versions that trade off quality against resource requirements. Note that some repositories associated with Toppy models have been flagged for sensitive content.
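To make the merging process concrete, below is a minimal sketch of what a mergekit task_arithmetic configuration for a merge like this could look like. The base model, weights, and dtype shown here are illustrative assumptions, not the actual recipe used to build Toppy M 7B; only the source model names are taken from the description above.

```yaml
# Hypothetical mergekit config sketch (task_arithmetic method).
# Base model and per-model weights are assumptions for illustration.
merge_method: task_arithmetic
base_model: mistralai/Mistral-7B-v0.1   # assumed common ancestor
models:
  - model: openchat/openchat_3.5
    parameters:
      weight: 0.3
  - model: NousResearch/Nous-Capybara-7B-V1.9
    parameters:
      weight: 0.3
  - model: HuggingFaceH4/zephyr-7b-beta
    parameters:
      weight: 0.3
dtype: float16
```

With task_arithmetic, each model's weights are expressed as a delta from the shared base model; the weighted deltas are summed and added back to the base, which is why all source models must descend from the same base architecture.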