
Orca 2
About
Orca 2 is a family of small language models (SLMs) created by Microsoft Research, specifically engineered to enhance reasoning capabilities in smaller models. Unlike approaches that rely primarily on scaling up parameter counts, Orca 2 demonstrates that smaller models can match or even exceed the performance of much larger models on reasoning tasks through careful training techniques. These models combine instruction tuning with explanation tuning, and use a technique called prompt erasure: the detailed system prompts used to elicit step-by-step reasoning from a teacher model are removed from the training data, so the student model learns to apply the reasoning strategy itself rather than depending on the prompt. Orca 2 is available in 7 billion and 13 billion parameter versions, both fine-tuned from LLaMA 2 base models on high-quality synthetic data. It performs well on reading comprehension, math problem-solving, and text summarization, and is openly available for research. Despite these strengths, it retains the usual limitations of language models, including potential biases and the risk of generating inaccurate content.
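Since the models are openly available for research, a brief sketch of how one might prompt them may be useful. Orca 2's model card on Hugging Face (e.g. microsoft/Orca-2-7b) describes a ChatML-style prompt format; the helper below assembles such a prompt as a plain string. The system message and question here are illustrative examples, not prescribed values.

```python
def build_chatml_prompt(system_message: str, user_message: str) -> str:
    """Assemble a single-turn ChatML-style prompt string of the kind
    the Orca 2 model card describes: a system turn, a user turn, and
    an open assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Illustrative usage: a cautious-reasoning system message paired with a
# short math question, the kind of task Orca 2 is reported to handle well.
prompt = build_chatml_prompt(
    "You are a cautious assistant. Think step by step before answering.",
    "If a train travels 60 km in 45 minutes, what is its average speed in km/h?",
)
print(prompt)
```

The resulting string would typically be passed to a tokenizer and model (for instance via the Hugging Face `transformers` library) rather than used directly.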