Time Series Transformer (Tourism Monthly)
Introduction
The Time Series Transformer model is designed for time-series forecasting and has been specifically trained on the tourism-monthly dataset for 30 epochs. This model utilizes a vanilla encoder-decoder Transformer architecture, similar to models used in machine translation, to predict future data points.
Architecture
The architecture of the Time Series Transformer follows the standard encoder-decoder Transformer model. It processes time-series data to forecast future values by generating samples in an autoregressive manner, predicting one time step at a time.
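To make the encoder-decoder layout concrete, the minimal sketch below builds a small configuration with the TimeSeriesTransformerConfig and TimeSeriesTransformerForPrediction classes from the Transformers library. The hyperparameter values are illustrative placeholders, not the settings used for this checkpoint.

```python
from transformers import TimeSeriesTransformerConfig, TimeSeriesTransformerForPrediction

# Illustrative values only; these are not the hyperparameters of the
# tourism-monthly checkpoint.
config = TimeSeriesTransformerConfig(
    prediction_length=24,  # forecast horizon produced by the decoder
    context_length=48,     # window of past values seen by the encoder
    encoder_layers=2,
    decoder_layers=2,
    d_model=32,
)

model = TimeSeriesTransformerForPrediction(config)
print(model.config.encoder_layers, model.config.decoder_layers)
```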
Training
The model was trained on the Monash Time Series Forecasting (monash_tsf) dataset, specifically focusing on the tourism-monthly data. The training process involved running the model for 30 epochs, optimizing it for accurate time-series predictions.
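For reference, the tourism-monthly subset can be loaded through the Hugging Face datasets library. The configuration name tourism_monthly and the field names in the comments follow the monash_tsf dataset card and should be verified against your installed version.

```python
from datasets import load_dataset

# Monthly tourism subset of the Monash Time Series Forecasting repository.
dataset = load_dataset("monash_tsf", "tourism_monthly")

example = dataset["train"][0]
# Each record holds a start timestamp and the observed target series.
print(example["start"], len(example["target"]))
```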
Guide: Running Locally
- Setup Environment:
  - Ensure you have Python installed.
  - Install the Hugging Face Transformers library using pip:
    pip install transformers
- Download the Model:
  - Use the Hugging Face Model Hub to download the Time Series Transformer model.
- Run Inference:
  - Load the model using the Transformers library and pass your time-series data for prediction (see the example sketch after this list).
- Cloud GPU Recommendation:
  - For enhanced performance, especially with large datasets or multiple predictions, consider using cloud GPUs such as those available on AWS, Google Cloud, or Azure.
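The sketch below shows one way to run inference: it loads the pretrained checkpoint from the Hub, feeds it a batch of preprocessed past values, and averages the sampled trajectories into a point forecast. The checkpoint id huggingface/time-series-transformer-tourism-monthly and the test batch hf-internal-testing/tourism-monthly-batch are assumptions here; in practice, substitute your own preprocessed data.

```python
import torch
from huggingface_hub import hf_hub_download
from transformers import TimeSeriesTransformerForPrediction

# Assumed: a small preprocessed tourism-monthly batch published for testing;
# replace this with your own prepared tensors in practice.
file = hf_hub_download(
    repo_id="hf-internal-testing/tourism-monthly-batch",
    filename="train-batch.pt",
    repo_type="dataset",
)
batch = torch.load(file)

model = TimeSeriesTransformerForPrediction.from_pretrained(
    "huggingface/time-series-transformer-tourism-monthly"
)

# At inference time only past values (plus optional features) are provided;
# the model autoregressively samples future values one step at a time.
outputs = model.generate(
    past_values=batch["past_values"],
    past_time_features=batch["past_time_features"],
    past_observed_mask=batch["past_observed_mask"],
    static_categorical_features=batch["static_categorical_features"],
    static_real_features=batch["static_real_features"],
    future_time_features=batch["future_time_features"],
)

# Average the sampled trajectories to obtain a point forecast.
mean_prediction = outputs.sequences.mean(dim=1)
print(mean_prediction.shape)
```

Because the model outputs samples from a predictive distribution, quantiles or prediction intervals can also be computed from outputs.sequences rather than taking only the mean.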
License
The Time Series Transformer model is released under the MIT License, allowing for open use and modification.