BERT-Base GoEmotion

bhadresh-savani

Introduction

The BERT-BASE-GO-EMOTION model is designed for text classification, specifically emotion recognition in text. It uses the BERT architecture and is built with PyTorch. The model is trained on GoEmotions, a large-scale annotated dataset for emotion classification in English text.

Architecture

The model employs the BERT-BASE architecture, known for its transformer-based approach to natural language processing tasks. BERT (Bidirectional Encoder Representations from Transformers) enables understanding of the context of words in a text, making it suitable for nuanced tasks such as emotion detection.

Training

  • Number of Examples: 169,208
  • Number of Epochs: 3
  • Batch Size per Device: 16
  • Total Train Batch Size: 16
  • Gradient Accumulation Steps: 1
  • Total Optimization Steps: 31,728
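The hyperparameters above map onto the Transformers `TrainingArguments` used for fine-tuning. A minimal sketch, assuming a standard `Trainer` setup (the `output_dir` is illustrative, and values not stated in this card, such as the learning rate, are left at library defaults):

```python
from transformers import TrainingArguments

# Sketch of a fine-tuning configuration matching the card's hyperparameters.
# output_dir is an illustrative path, not taken from the card.
training_args = TrainingArguments(
    output_dir="bert-base-go-emotion",   # illustrative
    num_train_epochs=3,                  # Number of Epochs
    per_device_train_batch_size=16,      # Batch Size per Device
    gradient_accumulation_steps=1,       # Gradient Accumulation Steps
)
# Sanity check against the reported step count: with 169,208 examples and
# an effective batch size of 16, ceil(169208 / 16) = 10,576 steps per epoch,
# and 10,576 * 3 epochs = 31,728 total optimization steps, as listed above.
```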

Training Output:

  • Train Loss: 0.1209

Evaluation Output:

  • Evaluation Accuracy: 0.9615
  • Evaluation Loss: 0.1165

Guide: Running Locally

  1. Setup Environment:

    • Ensure you have Python and PyTorch installed.
    • Install the Hugging Face Transformers library.
  2. Download the Model:

    • Use the transformers library to load the tokenizer and model:
      from transformers import AutoTokenizer, BertForSequenceClassification
      tokenizer = AutoTokenizer.from_pretrained('bhadresh-savani/bert-base-go-emotion')
      model = BertForSequenceClassification.from_pretrained('bhadresh-savani/bert-base-go-emotion')
      
  3. Prepare the Dataset:

    • Obtain the GoEmotions dataset and preprocess it for input to the model.
  4. Run Inference:

    • Use the model to predict emotions from text inputs.
  5. Cloud GPUs:

    • For improved performance, consider using cloud GPUs such as AWS EC2 with NVIDIA GPUs or Google Cloud's AI Platform.
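The inference step above can be sketched with the Transformers `pipeline` API, which wraps tokenization, the forward pass, and label mapping. A minimal example (the example input sentence is illustrative; `top_k=None` asks the pipeline to return a score for every emotion label):

```python
from transformers import pipeline

# Load the model from the Hugging Face Hub into a text-classification pipeline.
classifier = pipeline(
    "text-classification",
    model="bhadresh-savani/bert-base-go-emotion",
    top_k=None,  # return scores for all labels, not just the best one
)

# results is a list with one entry per input text; each entry is a list of
# {"label": ..., "score": ...} dicts sorted by descending score.
results = classifier("I am so happy that this finally works!")
for item in results[0][:3]:
    print(f"{item['label']}: {item['score']:.3f}")
```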

License

The BERT-BASE-GO-EMOTION model is licensed under the Apache 2.0 License, allowing for both personal and commercial use with minimal restrictions.
