t5-base-finetuned-emotion

mrm8488

Introduction

The t5-base-finetuned-emotion model is a fine-tuned version of Google's T5, targeted at emotion recognition. Following T5's text-to-text framing, it casts classification as text generation, mapping an input sentence to one of the six emotion labels from the emotion dataset: sadness, joy, love, anger, fear, or surprise.
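Concretely, the text-to-text format means both inputs and targets are plain strings: the raw sentence is the input, and the label name is the target. The pairs below illustrate the framing (they are hypothetical examples in the dataset's style, not actual samples):

    # Hypothetical (input, target) pairs in T5's text-to-text format
    pairs = [
        ("i feel like i let everyone down", "sadness"),
        ("im grinning because i feel so happy", "joy"),
        ("i feel a little scared of what comes next", "fear"),
    ]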

Architecture

This model is based on the T5 (Text-to-Text Transfer Transformer) architecture, a unified framework that applies transfer learning by converting every language problem into a text-to-text format. The original T5 model was developed by researchers at Google and achieved state-of-the-art results on many NLP benchmarks.
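Because this checkpoint builds directly on t5-base, its dimensions can be read off the base configuration. A quick sketch using the Transformers library:

    from transformers import T5Config

    # Inspect the configuration of the base checkpoint this model was fine-tuned from
    config = T5Config.from_pretrained("t5-base")
    print(config.num_layers)  # 12 encoder (and 12 decoder) layers
    print(config.num_heads)   # 12 attention heads per layer
    print(config.d_model)     # 768-dimensional hidden states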

Training

The model was fine-tuned on an emotion recognition dataset curated by Elvis Saravia, which labels text with one of six emotions. The fine-tuning procedure was adapted from a Colab notebook by Suraj Patil, reflecting the collaborative, open-source workflow behind the model. The fine-tuned model performs well across all six emotions, with a reported precision of 0.93, recall of 0.92, and F1-score of 0.93.
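The exact notebook is linked from the model card; the following is a minimal sketch of how such a fine-tuning setup can look with the datasets and Transformers libraries. The Hub identifier dair-ai/emotion and the preprocessing details are assumptions, not the author's exact recipe:

    from datasets import load_dataset
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("t5-base")
    model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

    # Emotion dataset: sentences paired with one of six integer class labels
    # (the dair-ai/emotion Hub id is an assumption about the dataset's location)
    dataset = load_dataset("dair-ai/emotion")
    label_names = dataset["train"].features["label"].names

    def preprocess(batch):
        # Cast classification as text-to-text: the label name is the target string
        model_inputs = tokenizer(batch["text"], truncation=True, max_length=128)
        targets = tokenizer(
            text_target=[label_names[i] for i in batch["label"]], max_length=4
        )
        model_inputs["labels"] = targets["input_ids"]
        return model_inputs

    tokenized = dataset.map(preprocess, batched=True)

From there, a standard Seq2SeqTrainer (or a manual PyTorch loop, as in the original Colab) drives the actual training.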

Guide: Running Locally

To run the T5-BASE-FINETUNED-EMOTION model locally:

  1. Installation: Ensure you have Python and PyTorch installed, then install the Hugging Face Transformers library (T5's tokenizer also needs the sentencepiece package):

    pip install transformers sentencepiece
    
  2. Load Model and Tokenizer:

    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    # AutoModelForSeq2SeqLM replaces the deprecated AutoModelWithLMHead class
    tokenizer = AutoTokenizer.from_pretrained("mrm8488/t5-base-finetuned-emotion")
    model = AutoModelForSeq2SeqLM.from_pretrained("mrm8488/t5-base-finetuned-emotion")
    
  3. Define Emotion Recognition Function:

    def get_emotion(text):
        # The tokenizer appends T5's </s> end-of-sequence token automatically,
        # so it no longer needs to be added to the input string by hand
        input_ids = tokenizer.encode(text, return_tensors='pt')
        # max_length=2 allows the decoder's start token plus one label token
        output = model.generate(input_ids=input_ids, max_length=2)
        # skip_special_tokens strips the leading <pad> token from the decoded label
        label = tokenizer.decode(output[0], skip_special_tokens=True).strip()
        return label
    
  4. Example Usage:

    print(get_emotion("I feel as if I haven't blogged in ages.")) # Output: 'joy'
    print(get_emotion("I have a feeling I kinda lost my best friend.")) # Output: 'sadness'
    

For optimal performance, use a cloud GPU service such as AWS EC2, Google Cloud, or Azure.
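When a GPU is available, moving the model and each input tensor onto it speeds up inference considerably. A minimal device-placement sketch with PyTorch:

    import torch

    # Place the model on the GPU when one is available, otherwise fall back to CPU
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)

    # Inputs must live on the same device as the model
    input_ids = tokenizer.encode("I feel fantastic today!", return_tensors="pt").to(device)
    output = model.generate(input_ids=input_ids, max_length=2)
    print(tokenizer.decode(output[0], skip_special_tokens=True))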

License

The model and associated code are released under open-source licenses, allowing for use and modification. Specific license details can be found in the repository's documentation.
