lora-alpaca-trading-candles
by mrzlab630

Introduction
The lora-alpaca-trading-candles model is a fine-tuned version of the LLaMA 7B model, designed to identify trading candle patterns such as Four Price Doji, Inverted Hammer, Hammer, Hanging Man, Doji, and more. It was trained with the Alpaca-LoRA method on a trading-candles dataset.
Architecture
The model is built on the LLaMA architecture in its 7B-parameter configuration. It applies LoRA (Low-Rank Adaptation), a parameter-efficient fine-tuning technique that trains small low-rank update matrices while keeping the base weights frozen, to adapt the model to instructions about trading candle patterns.
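This card does not list the adapter's hyperparameters; as a rough illustration, a LoRA adapter for a LLaMA-style model is usually configured with `peft` along these lines (the values shown are common Alpaca-LoRA defaults, not confirmed settings of this model):

```python
from peft import LoraConfig

# Illustrative LoRA settings for a LLaMA-style model; the actual
# hyperparameters used to train lora-alpaca-trading-candles are not
# documented in this card.
lora_config = LoraConfig(
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=16,                        # scaling factor applied to the update
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
```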
Training
The model was fine-tuned from the LLaMA 7B weights, enabling it to recognize and classify various trading candles. It accepts inputs formatted either as open-high-low-close (OHLC) values or as simplified numerical sequences, as in the sketch below.
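The exact prompt template is not shown in this card; Alpaca-style fine-tunes typically wrap a query in an instruction/input template, so a request for a single OHLC candle might look like the following sketch (the template and instruction wording are assumptions):

```python
# Hypothetical Alpaca-style prompt; the exact template this model was
# trained on is not reproduced in this card.
prompt = """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

### Instruction:
Identify the trading candle pattern.

### Input:
open: 102.5, high: 102.5, low: 102.5, close: 102.5

### Response:
"""
```

A candle whose open, high, low, and close are all equal, as in this input, corresponds to the Four Price Doji pattern mentioned above.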
Guide: Running Locally
1. Install Dependencies: make sure Python is installed, along with the `torch`, `transformers`, `gradio`, and `peft` packages:

   ```bash
   pip install torch transformers gradio peft
   ```

2. Set Up Model: import the libraries needed to load the pretrained weights:

   ```python
   import torch
   from transformers import LlamaTokenizer, LlamaForCausalLM
   from peft import PeftModel
   ```

3. Configure Device: check for GPU availability:

   ```python
   device = "cuda" if torch.cuda.is_available() else "cpu"
   ```

4. Load Model: load the base LLaMA 7B weights, then apply the LoRA weights on top:

   ```python
   model = LlamaForCausalLM.from_pretrained("mrzlab630/weights_Llama_7b")
   model = PeftModel.from_pretrained(model, "mrzlab630/lora-alpaca-trading-candles")
   ```

5. Run Interface: use `gradio` to create an interface for input and model interaction (a fuller sketch follows this list):

   ```python
   import gradio as gr

   gr.Interface(fn=evaluate, inputs=[...], outputs=[...]).launch()
   ```

6. Cloud GPUs: for enhanced performance, consider using cloud GPU services such as AWS EC2, Google Cloud, or Azure.
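Putting the steps above together, a minimal end-to-end inference script might look like this; the prompt template, the `evaluate` helper, and the generation settings are assumptions modeled on common Alpaca-LoRA setups, not code taken from the model's repository:

```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM
from peft import PeftModel

device = "cuda" if torch.cuda.is_available() else "cpu"

# Base LLaMA 7B weights plus the candle-pattern LoRA adapter.
tokenizer = LlamaTokenizer.from_pretrained("mrzlab630/weights_Llama_7b")
model = LlamaForCausalLM.from_pretrained("mrzlab630/weights_Llama_7b")
model = PeftModel.from_pretrained(model, "mrzlab630/lora-alpaca-trading-candles")
model.to(device)
model.eval()

def evaluate(instruction: str, input_text: str = "") -> str:
    # Hypothetical Alpaca-style template; the trained template may differ.
    prompt = (
        "Below is an instruction that describes a task, paired with an input "
        "that provides further context. Write a response that appropriately "
        "completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Input:\n{input_text}\n\n"
        "### Response:\n"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        output_ids = model.generate(**inputs, max_new_tokens=128)
    output = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    # Return only the text generated after the response marker.
    return output.split("### Response:")[-1].strip()

print(evaluate("Identify the trading candle pattern.",
               "open: 102.5, high: 102.5, low: 102.5, close: 102.5"))
```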
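The `gr.Interface` call from step 5 could then be filled in along these lines, reusing the `evaluate` helper sketched above (labels and defaults are illustrative):

```python
import gradio as gr

# Simple web UI around the evaluate() helper sketched above.
gr.Interface(
    fn=evaluate,
    inputs=[
        gr.Textbox(label="Instruction", value="Identify the trading candle pattern."),
        gr.Textbox(label="Input (OHLC values or a numerical sequence)"),
    ],
    outputs=gr.Textbox(label="Response"),
    title="lora-alpaca-trading-candles",
).launch()
```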
License
The lora-alpaca-trading-candles model is licensed under the Apache-2.0 License, allowing for open-source usage and modification.