Chronos 13B
About
Chronos 13B is a 13-billion-parameter large language model designed primarily for chat, roleplay, and storywriting. Built on the transformer architecture, it generates long, coherent text and excels at creative writing thanks to training on extensive human-generated data. It can handle simple reasoning and coding tasks, but its design emphasizes creative text generation.

The training data consists mainly of English text from diverse web sources, so the model performs best in English and less reliably in other languages. Because it reflects that data, it can exhibit biases and may generate offensive content; it was also trained without human feedback, which can affect output quality and safety.

Variants of the model, such as 4-bit quantized and GGML versions, offer efficient inference on CPU and GPU. Input prompts should follow the Alpaca format for best results.
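As a sketch of the Alpaca formatting mentioned above, the standard Alpaca template can be assembled as follows. The helper name `build_alpaca_prompt` is illustrative, not part of any Chronos tooling:

```python
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Assemble a prompt in the standard Alpaca template."""
    if user_input:
        # Variant with an additional context block.
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{user_input}\n\n"
            "### Response:\n"
        )
    # Instruction-only variant.
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Write a short fantasy story about a clockmaker.")
print(prompt)
```

The resulting string is passed to the model as-is; generation then continues after the `### Response:` marker.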