ChatTime-1-7B-Base
ChengsenWang

Introduction
ChatTime is a multimodal time series foundation model that treats time series as a foreign language, enabling unified processing of time series and text. It supports zero-shot forecasting and bimodal input/output across both modalities. Its performance is validated through a range of experiments, and it is trained on four multimodal datasets built to address data gaps.
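To make the "foreign language" idea concrete, the sketch below shows one generic way to map continuous values to discrete tokens and back via uniform binning. This is only an illustration of the concept: the actual ChatTime vocabulary, binning, and scaling are defined by the released model, and the function and token names here are hypothetical.

```python
import numpy as np

def series_to_tokens(values, vmin, vmax, n_bins=1024):
    """Map real values to discrete 'vocabulary' tokens by uniform binning.

    Generic illustration of the 'time series as a foreign language' idea,
    not the exact ChatTime tokenization scheme.
    """
    clipped = np.clip(values, vmin, vmax)
    bins = ((clipped - vmin) / (vmax - vmin) * (n_bins - 1)).round().astype(int)
    return [f"<value_{b}>" for b in bins]

def tokens_to_series(tokens, vmin, vmax, n_bins=1024):
    """Invert the mapping: token index -> approximate real value."""
    idx = np.array([int(t.strip("<>").split("_")[1]) for t in tokens])
    return vmin + idx / (n_bins - 1) * (vmax - vmin)

# A short series becomes a sequence of "foreign-language" tokens and back.
series = np.array([0.1, 0.5, 0.9, 0.4])
tokens = series_to_tokens(series, vmin=0.0, vmax=1.0)
print(tokens)                             # ['<value_102>', '<value_512>', ...]
print(tokens_to_series(tokens, 0.0, 1.0)) # values recovered up to bin width
```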
Architecture
ChatTime-1-7B-Base is built on the LLaMA-2-7B-Base model and pre-trained on the ChengsenWang/ChatTime-1-Pretrain-1M dataset. This pre-training stage teaches the model to handle both time series and textual data, making it a versatile foundation for forecasting and analysis tasks.
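As a rough loading sketch, assuming the checkpoint is published in standard Hugging Face format and is compatible with the transformers auto classes (not confirmed here), the base model could be loaded as follows:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the checkpoint is loadable with the standard transformers API.
model_id = "ChengsenWang/ChatTime-1-7B-Base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a single GPU
    device_map="auto",          # requires the accelerate package
)
```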
Training
The model was pre-trained from the LLaMA-2-7B-Base checkpoint on the ChengsenWang/ChatTime-1-Pretrain-1M dataset. This stage improves the model's understanding and processing of both time-series and textual data, enabling zero-shot forecasting and multimodal analysis.
Guide: Running Locally
- Environment Setup
  - Install the necessary libraries: numpy, pandas, matplotlib, and the ChatTime package.
- Download Dataset
  - Obtain datasets, such as "Traffic" or "PTF", in CSV format.
- Code Execution
  - Use the provided Python scripts for zero-shot forecasting, context-guided forecasting, and time series question answering (see the sketch after this list).
- Model Initialization
  - Load the ChatTime model using the specified model path, e.g., ChengsenWang/ChatTime-1-7B-Chat.
- Run Experiments
  - Experiment with different datasets and modify parameters as needed to explore the model's capabilities.
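The end-to-end flow might look like the sketch below. The chattime package name, the ChatTime class, its predict() signature, the traffic.csv file, and the value column are all assumptions for illustration; refer to the repository's provided scripts for the actual interface.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical API: the ChatTime class and its predict() method are
# assumptions for illustration; see the repository scripts for the real one.
from chattime import ChatTime

# 1. Load a CSV dataset (e.g., the "Traffic" benchmark) with pandas.
df = pd.read_csv("traffic.csv")
history = df["value"].values[-120:]  # last 120 observations as context

# 2. Initialize the model from the published checkpoint.
model = ChatTime(model_path="ChengsenWang/ChatTime-1-7B-Chat")

# 3. Zero-shot forecast of the next 24 steps.
forecast = model.predict(history, pred_len=24)

# 4. Visualize history and forecast with matplotlib.
plt.plot(range(len(history)), history, label="history")
plt.plot(range(len(history), len(history) + 24), forecast, label="forecast")
plt.legend()
plt.show()
```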
Cloud GPU Recommendation
For optimal performance, consider using cloud GPUs such as AWS EC2 P3 instances, Google Cloud TPU, or similar services to handle the computational demands of the model.
License
ChatTime is distributed under the Apache 2.0 license, allowing for broad use and modification with attribution.