Qwen 2.5 3b Rp lora_model
bunnycore

Introduction
The Qwen-2.5-3b-Rp-lora_model is a fine-tuned version of the base model unsloth/qwen2.5-3b-instruct-bnb-4bit. Developed by bunnycore, the model is designed for text-generation tasks and uses the Unsloth framework and Hugging Face's TRL library to speed up training.
Architecture
- Base Model: unsloth/qwen2.5-3b-instruct-bnb-4bit
- Frameworks Used: Transformers, Safetensors
- Tags: text-generation-inference, transformers, unsloth, qwen2, trl
Training
The model was trained with the Unsloth framework, which the Unsloth project reports can make training up to twice as fast. Hugging Face's TRL library, which provides trainers for supervised fine-tuning and related post-training techniques, was also used.
Guide: Running Locally
- Clone the Repository: use the Hugging Face Hub to clone the model repository.
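For example, the repository can be cloned with git (this assumes git and git-lfs are installed; the URL follows the standard Hugging Face repository pattern):

```shell
# Git LFS is needed because model weights are stored as large files.
git lfs install
git clone https://huggingface.co/bunnycore/Qwen-2.5-3b-Rp-lora_model
```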
- Install Dependencies: ensure you have Python and the required libraries installed. Common libraries include transformers, torch, and safetensors.
- Load the Model: use Hugging Face's Transformers library to load the model with a few lines of code:

      from transformers import AutoModelForCausalLM, AutoTokenizer

      model = AutoModelForCausalLM.from_pretrained("bunnycore/Qwen-2.5-3b-Rp-lora_model")
      tokenizer = AutoTokenizer.from_pretrained("bunnycore/Qwen-2.5-3b-Rp-lora_model")
- Run Inference: generate text by passing prompts to the model and decoding the output:

      input_text = "Your text prompt here"
      inputs = tokenizer(input_text, return_tensors="pt")
      outputs = model.generate(**inputs)
      print(tokenizer.decode(outputs[0], skip_special_tokens=True))
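Since this is a roleplay ("Rp") fine-tune of an instruct model, prompts are usually formatted with the model's chat template rather than passed as raw text. A minimal sketch of that pattern (the system-prompt wording is an illustrative assumption, and running this requires downloading the weights):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bunnycore/Qwen-2.5-3b-Rp-lora_model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build a chat-style prompt; the persona text is an illustrative assumption.
messages = [
    {"role": "system", "content": "You are a friendly tavern keeper in a fantasy town."},
    {"role": "user", "content": "Good evening! What's on the menu tonight?"},
]

# apply_chat_template inserts the Qwen2 chat markup the model was trained on.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```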
- Cloud GPU Recommendation: for optimal performance, especially when fine-tuning or running large-scale inference, consider cloud GPU services such as AWS EC2, Google Cloud Platform, or Azure.
License
The Qwen-2.5-3b-Rp-lora_model is licensed under the Apache-2.0 license, allowing for broad use and distribution with few restrictions.