# distilbert-plutchik

Model: `JuliusAlphonso/distilbert-plutchik`
## Introduction
The `distilbert-plutchik` model is a text classification model based on the DistilBERT architecture, fine-tuned to recognize emotions according to Plutchik's model of emotion. It is built on the Transformers library and implemented in PyTorch, allowing for efficient and effective text processing.
## Architecture
The model utilizes DistilBERT, a smaller, faster, and lighter version of BERT: it retains about 97% of BERT's language-understanding performance while being 60% faster and reducing the model size by 40%. Here it is fine-tuned for emotion classification based on Plutchik's model, which organizes emotions into eight primary categories (joy, trust, fear, surprise, sadness, disgust, anger, and anticipation).
## Training
The training process involves fine-tuning DistilBERT on a dataset labeled according to Plutchik's emotional model. The resulting model classifies text into these emotional categories, and combinations of primary emotions can support more nuanced sentiment analysis.
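Fine-tuning for sequence classification requires mapping emotion names to integer label ids. The sketch below shows one way to encode Plutchik-style labels; the exact label set used by the published checkpoint is an assumption here (check `model.config.id2label` for the real mapping).

```python
# Hypothetical label set: Plutchik's eight primary emotions.
# The published checkpoint may use a different subset or ordering.
PLUTCHIK_EMOTIONS = [
    "anger", "anticipation", "disgust", "fear",
    "joy", "sadness", "surprise", "trust",
]

label2id = {label: i for i, label in enumerate(PLUTCHIK_EMOTIONS)}
id2label = {i: label for label, i in label2id.items()}

def encode_labels(examples):
    """Map (text, emotion-name) pairs to (text, label-id) pairs for training."""
    return [(text, label2id[label]) for text, label in examples]

encoded = encode_labels([("I can't wait for the weekend!", "anticipation")])
```

These integer ids are what a `AutoModelForSequenceClassification` head is trained against, and the `id2label`/`label2id` dictionaries can be passed to the model config so predictions decode back to emotion names.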
## Guide: Running Locally
To run the `distilbert-plutchik` model locally, follow these steps:
- **Clone the repository**: clone the model repository from Hugging Face using Git.

  ```bash
  git clone https://huggingface.co/JuliusAlphonso/distilbert-plutchik
  ```
- **Install dependencies**: ensure you have the required libraries installed, primarily `transformers` and `torch`.

  ```bash
  pip install transformers torch
  ```
- **Load the model**: use the Transformers library to load the model and tokenizer.

  ```python
  from transformers import AutoTokenizer, AutoModelForSequenceClassification

  tokenizer = AutoTokenizer.from_pretrained("JuliusAlphonso/distilbert-plutchik")
  model = AutoModelForSequenceClassification.from_pretrained("JuliusAlphonso/distilbert-plutchik")
  ```
- **Run inference**: tokenize your input text and pass it through the model to get emotion logits.

  ```python
  import torch

  inputs = tokenizer("Your text here", return_tensors="pt")
  with torch.no_grad():  # inference only, no gradients needed
      outputs = model(**inputs)
  ```
- **Cloud GPUs**: for faster inference or fine-tuning, consider cloud GPU resources such as AWS EC2 P3 instances or Google Cloud GPU instances.
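The forward pass above returns raw logits rather than probabilities. A minimal post-processing sketch in pure Python is shown below; the label names and logit values are illustrative assumptions (read the real mapping from `model.config.id2label`, and feed in `outputs.logits[0].tolist()`).

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical id2label mapping and logits for a single input.
id2label = {0: "anger", 1: "joy", 2: "sadness", 3: "fear"}
logits = [0.2, 3.1, -0.5, 0.4]

probs = softmax(logits)
predicted = id2label[max(range(len(probs)), key=probs.__getitem__)]
```

For multi-label use (several Plutchik emotions at once), a per-class sigmoid with a threshold would replace the softmax, depending on how the checkpoint was trained.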
## License
The `distilbert-plutchik` model is available under the MIT License, allowing flexible use, modification, and distribution in both personal and commercial applications.