OpenChat V2 W
About
OpenChat V2 W is a 13-billion-parameter large language model based on Llama and trained with weighted behavior cloning. It was fine-tuned on roughly 80,000 cleaned ShareGPT conversations and supports text generation and conditional language modeling. The model has a 2048-token context window and uses a conversation template with special tokens, which must be applied exactly for reliable output. Despite its capabilities, it has known limitations in complex reasoning, mathematics, and coding. Benchmarks show varied performance against models such as GPT-4 and Claude 3.5 Sonnet, with scores ranging from 50.17% to 81.23%. The source code and an inference server are openly available.
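Because the conversation template must be applied exactly, a minimal sketch of prompting the model with Hugging Face transformers may help. The Hub ID openchat/openchat_v2_w, the speaker tags, and the <|end_of_turn|> separator below are assumptions drawn from the OpenChat repository's conventions; verify them against the official documentation before relying on them.

```python
# Minimal sketch: prompting OpenChat V2 W with its conversation template.
# Assumptions (verify against the official OpenChat repo):
#   - Hub ID "openchat/openchat_v2_w"
#   - turns are delimited by the special token <|end_of_turn|>
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "openchat/openchat_v2_w"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

def build_prompt(user_message: str) -> str:
    # Assumed V2-style template: speaker tags plus an end-of-turn token.
    return f"User: {user_message}<|end_of_turn|>Assistant:"

prompt = build_prompt("Summarize weighted behavior cloning in one sentence.")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt + completion within the 2048-token context window.
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    eos_token_id=tokenizer.convert_tokens_to_ids("<|end_of_turn|>"),
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```

Stopping generation on the end-of-turn token rather than the default EOS keeps the model from running into the next simulated turn, which is the main practical consequence of the template requirement noted above.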
Capabilities
Multimodal, Function Calling, Tool Use, JSON Mode