LLM Reference

StarChat2 15B

Open Source

About

StarChat2 15B is a 15-billion-parameter large language model fine-tuned from StarCoder2 to serve as a coding assistant. Built on a GPT-like decoder-only transformer architecture, it supports English and over 600 programming languages. Training combined supervised fine-tuning with Direct Preference Optimization (DPO) on synthetic datasets, giving it a blend of chat and programming ability. It performs strongly on benchmarks such as MT Bench, IFEval, and HumanEval, but it has not undergone reinforcement learning from human feedback, so outputs may be biased and generated code may contain security vulnerabilities. The model is available through Hugging Face and can be run with the pipeline() function from 🤗 Transformers; a quantized version is also available to reduce memory usage.
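A minimal sketch of running the model with the 🤗 Transformers pipeline() mentioned above. The checkpoint id `HuggingFaceH4/starchat2-15b-v0.1`, the sampling parameters, and the system prompt are assumptions for illustration; adjust them to your setup (a GPU with sufficient memory, or the quantized variant, is needed in practice):

```python
# Sketch: one chat turn with StarChat2 via the Transformers pipeline.
# The model id below is an assumed hub checkpoint; swap in the one you use.
import torch
from transformers import pipeline


def chat(prompt: str, max_new_tokens: int = 256) -> str:
    """Send a single user turn to StarChat2 and return the assistant's reply."""
    pipe = pipeline(
        "text-generation",
        model="HuggingFaceH4/starchat2-15b-v0.1",
        torch_dtype=torch.bfloat16,  # half-precision weights to cut memory use
        device_map="auto",           # place layers across available devices
    )
    messages = [
        {"role": "system", "content": "You are StarChat2, a helpful coding assistant."},
        {"role": "user", "content": prompt},
    ]
    out = pipe(
        messages,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
        top_p=0.95,
    )
    # The pipeline returns the full chat transcript; the last message is the reply.
    return out[0]["generated_text"][-1]["content"]


if __name__ == "__main__":
    print(chat("Write a Python function that reverses a string."))
```

The `device_map="auto"` and `torch_dtype=torch.bfloat16` arguments are the usual way to fit a 15B model across available accelerators; for tighter memory budgets, the quantized version noted above can be loaded instead.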

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

Specifications

Family: StarChat2
Released: 2024-07-04
Parameters: 15B
Architecture: Decoder Only
Specialization: general
Training: finetuning

Created by

Hugging Face, a community-driven open-source AI model hub

New York City, New York, United States
Founded 2016