violet_tarot (huggingtweets)
Introduction
Violet Tarot is a text generation model based on the tweets of the user Violet Witch (@violet_tarot). It was built with the HuggingTweets framework, which lets anyone create a personalized AI bot that generates text in the style of a chosen Twitter user.
Architecture
The Violet Tarot model is built upon the GPT-2 architecture, a transformer-based model known for its capacity to generate coherent and contextually relevant text. It has been fine-tuned specifically on the tweet data of Violet Witch to tailor its output accordingly.
Training
The model was trained on tweets from the Violet Witch account. Initially, 3,250 tweets were downloaded, of which 2,999 were retained after filtering out retweets and overly short tweets. The fine-tuning process adapted a pre-trained GPT-2 model to this dataset, with hyperparameters and metrics tracked in Weights & Biases (W&B) for transparency and reproducibility.
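The filtering step described above can be sketched in plain Python. Note that the helper name, the `"RT @"` check, and the length threshold below are illustrative assumptions, not the actual HuggingTweets preprocessing code:

```python
def filter_tweets(tweets, min_words=3):
    """Keep tweets that are neither retweets nor overly short.

    Illustrative sketch only; the real HuggingTweets pipeline may use
    different rules and thresholds.
    """
    kept = []
    for text in tweets:
        if text.startswith("RT @"):        # drop retweets
            continue
        if len(text.split()) < min_words:  # drop very short tweets
            continue
        kept.append(text)
    return kept

sample = [
    "RT @someone: the moon is in retrograde",
    "ok",
    "today's card is The Star: hope, renewal, quiet faith",
]
print(filter_tweets(sample))  # only the last tweet survives
```

Applied to the full download, a filter like this is what would reduce the 3,250 raw tweets to the 2,999 used for fine-tuning.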
Guide: Running Locally
To run the Violet Tarot model locally, follow these steps:
- Install the Transformers Library: Ensure you have the Hugging Face Transformers library installed.
pip install transformers
- Set Up the Pipeline: Use the Transformers library to set up a text generation pipeline.
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/violet_tarot')
- Generate Text: Generate text using the model with a sample prompt.
generator("My dream is", num_return_sequences=5)
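A text-generation pipeline returns a list of dictionaries, each with a `generated_text` key. A minimal sketch of collecting those strings, shown here with a hard-coded sample result so it runs without downloading the model:

```python
# The pipeline returns results shaped like this list of dicts; the
# actual strings are produced by the model at runtime, e.g. via:
#   results = generator("My dream is", num_return_sequences=5)
sample_results = [
    {"generated_text": "My dream is to read the cards under a full moon"},
    {"generated_text": "My dream is a spread of cups and wands"},
]

# Pull out just the generated strings.
texts = [r["generated_text"] for r in sample_results]
for t in texts:
    print(t)
```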
For better performance, particularly when generating large volumes of text, consider running the model on a GPU, such as those available from AWS, Google Cloud, or Azure.
License
The Violet Tarot model and its code are available under licenses specified in the HuggingTweets GitHub repository. Users should ensure compliance with these licenses when using or modifying the model.