
Granite Time Series
About
The Granite Time Series models, developed by IBM Research, are a family of compact, pre-trained models for multivariate time-series forecasting. Known as TinyTimeMixers (TTMs), they are distinguished by their small size: some have fewer than 1 million parameters, a stark contrast to forecasting models that require billions. Despite their compactness, TTMs deliver state-of-the-art performance in both zero-shot and few-shot forecasting, surpassing several well-known benchmarks. They are trained on publicly available datasets and distributed under the Apache 2.0 license, so developers are free to use and modify them.

TTM models come in several configurations, each optimized for a particular combination of context length and forecast length; for example, the 512-96 model predicts the next 96 time points from the previous 512. The models are available on Hugging Face and have evolved over time, with the newer TTM-R2 versions trained on larger datasets for improved performance.
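The context-length/forecast-length naming (such as 512-96) can be illustrated with a small windowing sketch. This is a minimal example in NumPy, not the TTM API itself: the `naive_forecast` function is a placeholder that a real TTM model would replace, shown here only to make the input/output shapes concrete.

```python
import numpy as np

CONTEXT_LEN = 512   # the "512" in 512-96: how many past points the model sees
FORECAST_LEN = 96   # the "96" in 512-96: how many future points it predicts

def make_context(series: np.ndarray, context_len: int = CONTEXT_LEN) -> np.ndarray:
    """Return the most recent context_len rows of a (time, channels) array."""
    if series.shape[0] < context_len:
        raise ValueError(f"need >= {context_len} time points, got {series.shape[0]}")
    return series[-context_len:]

def naive_forecast(context: np.ndarray, horizon: int = FORECAST_LEN) -> np.ndarray:
    """Placeholder forecaster: repeat the last observation for each channel.
    A trained TTM would produce the (horizon, channels) output instead."""
    return np.tile(context[-1], (horizon, 1))

# Toy multivariate series: 1000 time steps, 3 channels.
rng = np.random.default_rng(0)
series = rng.normal(size=(1000, 3))

context = make_context(series)     # shape (512, 3): model input window
forecast = naive_forecast(context) # shape (96, 3): forecast horizon
print(context.shape, forecast.shape)
```

The same shape contract applies whatever the channel count, since TTMs handle multivariate inputs; only the time dimension is fixed by the model variant chosen.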