GRANITE-TIMESERIES-TTM-R2 Model Card

Introduction

The GRANITE-TIMESERIES-TTM-R2 model, developed by IBM Research, belongs to the TinyTimeMixer (TTM) family of compact pre-trained models for multivariate time-series forecasting. TTMs are lightweight, can outperform billion-parameter models in zero-shot and few-shot scenarios, and are designed to run on hardware ranging from laptops to a single GPU. The open-source version supports point forecasting at minutely to hourly resolutions.

Architecture

TTMs are specialized pre-trained models that focus on specific forecasting settings, optimizing accuracy and efficiency. They are pre-trained on extensive datasets and support various context and forecast lengths, ensuring flexibility for different forecasting needs.

Training

The TTM models were trained on datasets such as Australian Electricity Demand, Bitcoin data, and Solar Power, among others. The models are designed to be fine-tuned with minimal training data, leveraging their small size and speed for efficient forecasting.
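To make "minimal training data" concrete: few-shot fine-tuning typically means training on only a small, recent slice of the target series' history. The helper and the 5% fraction below are illustrative assumptions, not part of the model card:

```python
def few_shot_slice(n_rows: int, fraction: float = 0.05) -> slice:
    """Hypothetical helper: select the most recent `fraction` of a
    training history for few-shot fine-tuning (5% is an illustrative
    default, not a value prescribed by the model card)."""
    k = max(1, int(n_rows * fraction))
    return slice(n_rows - k, n_rows)

s = few_shot_slice(10_000)
print(s.start, s.stop)  # 9500 10000
```

The selected rows would then be fed to whatever fine-tuning loop you use; the point is only that the slice is small relative to the full history.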

Guide: Running Locally

  1. Install Dependencies: Ensure you have Python and necessary machine learning libraries installed.
  2. Download Model: Use the get_model utility to fetch the model suited to your context and forecast length.
  3. Select Context and Forecast Lengths: Choose the appropriate model branch based on your requirements (e.g., 512-96-r2 for 512 context length and 96 forecast length).
  4. Run Prediction: Load the model using TinyTimeMixerForPrediction.from_pretrained() and perform zero-shot or fine-tuned forecasting.
  5. Hardware Recommendation: While TTMs can run efficiently on a CPU, using a GPU (e.g., from Google Cloud or AWS) is recommended for faster processing.
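Putting the steps above together, a minimal sketch might look like the following. The branch-name convention (`<context>-<forecast>-r2`) follows the 512-96-r2 example from step 3; the actual loading call is shown commented out because it downloads model weights and assumes the `granite-tsfm` package (which provides `tsfm_public`) is installed:

```python
def ttm_revision(context_length: int, forecast_length: int, release: str = "r2") -> str:
    """Build a TTM branch name, e.g. (512, 96) -> '512-96-r2'."""
    return f"{context_length}-{forecast_length}-{release}"

revision = ttm_revision(512, 96)
print(revision)  # 512-96-r2

# Sketch of the load itself (assumes `pip install granite-tsfm`; not run here):
# from tsfm_public import TinyTimeMixerForPrediction
# model = TinyTimeMixerForPrediction.from_pretrained(
#     "ibm-granite/granite-timeseries-ttm-r2",
#     revision=revision,
# )
# outputs = model(past_values)  # zero-shot point forecast
```

Pinning `revision` to a named branch keeps the context and forecast lengths explicit, so a later change to the repository's default branch cannot silently alter the model you load.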

License

GRANITE-TIMESERIES-TTM-R2 is released under the Apache 2.0 License, which permits open use and modification subject to the license terms.
