granite-timeseries-ttm-r1

ibm-granite

Introduction

TinyTimeMixers (TTMs) are compact pre-trained models for multivariate time-series forecasting, developed by IBM Research. With fewer than 1 million parameters, they deliver state-of-the-art forecasting accuracy while remaining lightweight and efficient, and they excel at zero-shot and few-shot forecasting with minimal computational resources.

Architecture

TTMs are designed as "focused pre-trained models," each tailored to a specific forecasting setting (context length and forecast length). This focus yields accurate results from small models, facilitating easy deployment. The models are also fast to pre-train, requiring only 3-6 hours on 6 A100 GPUs. TTM supports both zero-shot and fine-tuned forecasting, with capabilities for multivariate targets and for exogenous and categorical inputs.
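The "focused pre-trained model" idea can be sketched as a lookup from a forecasting setting to its own dedicated checkpoint. The variant names below are placeholders for illustration, not the official checkpoint revisions:

```python
# Hypothetical sketch: each supported (context_length, forecast_length) pair
# corresponds to its own focused pre-trained checkpoint.
# The variant names here are placeholders, not official revision names.
TTM_VARIANTS = {
    (512, 96): "ttm-512-96",    # placeholder: 512-step context, 96-step forecast
    (1024, 96): "ttm-1024-96",  # placeholder: 1024-step context, 96-step forecast
}

def pick_variant(context_length: int, forecast_length: int) -> str:
    """Return the focused TTM variant matching a forecasting setting."""
    key = (context_length, forecast_length)
    if key not in TTM_VARIANTS:
        raise ValueError(f"No focused TTM variant for setting {key}")
    return TTM_VARIANTS[key]
```

Because each variant is small and specialized, selecting the right one for your context and horizon matters more than with a single monolithic model.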

Training

The TTM models are pre-trained on a diverse set of public datasets from the Monash Time Series Forecasting repository, including Australian Electricity Demand, Bitcoin, and Solar Power, among others. The models target minutely to hourly resolutions and are not recommended for coarser resolutions such as weekly or monthly unless sufficient context length is available.

Guide: Running Locally

  1. Installation: Clone the repository and install dependencies.
  2. Load Model: Use the from_pretrained method to load the desired TTM variant.
  3. Data Preparation: Scale your data with an external standard scaler, fit on the training split.
  4. Forecasting:
    • Zero-shot: apply the pre-trained model directly to your data.
    • Fine-tuning: fine-tune the model on a subset of your data for better accuracy.
  5. Execution: Models can run on a single GPU or even CPU-only machines, but using cloud GPUs like NVIDIA A100 is recommended for faster training and inference.
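Steps 3-4 can be sketched as follows. The scaling uses scikit-learn's StandardScaler; the model call at the end is shown only in comments because it requires downloading the checkpoint, and the class and argument names there are assumed from the granite-tsfm library rather than verified here:

```python
# Minimal sketch of external standard scaling before a zero-shot forecast.
import numpy as np
from sklearn.preprocessing import StandardScaler

context_length = 512  # TTM variants expect a fixed context length
rng = np.random.default_rng(0)
series = rng.normal(size=(1000, 3)).cumsum(axis=0)  # toy 3-channel multivariate series

train = series[:800]                  # fit the scaler on the training split only
scaler = StandardScaler().fit(train)  # per-channel mean/std

window = scaler.transform(series[-context_length:])  # last 512 steps, scaled
assert window.shape == (context_length, 3)

# The scaled window would then be passed to the model (names assumed, not verified):
#   model = TinyTimeMixerForPrediction.from_pretrained(
#       "ibm-granite/granite-timeseries-ttm-r1")
#   forecast = model(past_values=torch.tensor(window[None], dtype=torch.float32))
# Forecasts are mapped back to the original scale with scaler.inverse_transform.
```

Fitting the scaler only on the training split avoids leaking statistics from the forecast horizon into the inputs.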

License

The TTMs are released under the Apache-2.0 License, which allows for free use, distribution, and modification.
