BERT-Banking77
Introduction
The BERT-Banking77 model, developed by Hugging Face staff member Phil Schmid, is a text classification model tailored to the banking domain. It was trained with AutoTrain on the BANKING77 dataset, a collection of online banking customer-service queries labeled with 77 fine-grained intents, and performs multi-class intent classification.
Architecture
BERT-Banking77 uses the BERT architecture, a transformer-based encoder developed by Google, with a sequence-classification head on top. It is implemented in PyTorch and fine-tuned for text classification within the banking sector.
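To make the architecture concrete, the following sketch loads the published configuration and inspects its shape; the values noted in the comments are expectations for a BANKING77 fine-tune, not figures taken from the source:

```python
from transformers import AutoConfig, AutoModelForSequenceClassification

# Load the published configuration for the checkpoint.
config = AutoConfig.from_pretrained('philschmid/BERT-Banking77')
print(config.model_type)  # expected: 'bert'
print(config.num_labels)  # expected: 77, one per BANKING77 intent

# The sequence-classification variant adds a linear head over the pooled
# [CLS] representation of the BERT encoder.
model = AutoModelForSequenceClassification.from_pretrained('philschmid/BERT-Banking77')
print(f"{model.num_parameters():,} parameters")
```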
Training
The model was trained with AutoTrain, Hugging Face's tool for automating model training, and evaluated on the BANKING77 dataset with the following key metrics:
- Accuracy: 92.64%
- Macro F1: 92.64%
- Weighted F1: 92.6%
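As a point of reference, these three metrics can be reproduced for any set of predictions with scikit-learn; the sketch below uses small hypothetical prediction and reference lists rather than real model outputs:

```python
from sklearn.metrics import accuracy_score, f1_score

# Hypothetical stand-ins for predicted and gold BANKING77 intent IDs.
predictions = [3, 12, 12, 45, 7]
references = [3, 12, 9, 45, 7]

accuracy = accuracy_score(references, predictions)
macro_f1 = f1_score(references, predictions, average='macro')        # unweighted mean over classes
weighted_f1 = f1_score(references, predictions, average='weighted')  # weighted by class support

print(f"Accuracy: {accuracy:.4f}  Macro F1: {macro_f1:.4f}  Weighted F1: {weighted_f1:.4f}")
```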
Guide: Running Locally
To run the BERT-Banking77 model locally, follow these steps:
- Install Dependencies: Ensure you have Python and the `transformers` library installed; since the model is implemented in PyTorch, `torch` is required as well. You can install both using pip:

  ```bash
  pip install transformers torch
  ```
- Load the Model: Use the following Python script to load the model and tokenizer and classify a text input:

  ```python
  from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

  # Load the fine-tuned checkpoint and its matching tokenizer.
  model_id = 'philschmid/BERT-Banking77'
  tokenizer = AutoTokenizer.from_pretrained(model_id)
  model = AutoModelForSequenceClassification.from_pretrained(model_id)

  # Wrap both in a text-classification pipeline.
  classifier = pipeline('text-classification', tokenizer=tokenizer, model=model)

  result = classifier('What is the base of the exchange rates?')
  print(result)
  ```
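  The pipeline returns a list with the top predicted intent and its confidence, something like `[{'label': 'exchange_rate', 'score': ...}]` for the query above; the exact label string and score depend on the checkpoint's label mapping.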
- Inference via the API: You can also perform inference remotely using cURL against the Hugging Face Inference API (replace `YOUR_API_KEY` with a valid token):

  ```bash
  curl -X POST \
    -H "Authorization: Bearer YOUR_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"inputs": "I love AutoTrain"}' \
    https://api-inference.huggingface.co/models/philschmid/BERT-Banking77
  ```
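For those who prefer Python over cURL, a minimal equivalent of the request above using the `requests` library looks like this, again assuming `YOUR_API_KEY` is a valid Hugging Face token:

```python
import requests

API_URL = 'https://api-inference.huggingface.co/models/philschmid/BERT-Banking77'
headers = {'Authorization': 'Bearer YOUR_API_KEY'}  # substitute a real token

# POST a single query to the hosted endpoint and print the JSON result.
response = requests.post(API_URL, headers=headers, json={'inputs': 'I love AutoTrain'})
print(response.json())
```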
Cloud GPUs: For large-scale deployment or faster inference, consider using cloud GPU services from providers like AWS, Google Cloud, or Azure.
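On a GPU-equipped instance, the local pipeline from the guide above can be placed on the accelerator by passing a device index; this is a minimal sketch assuming a single CUDA device and the same checkpoint:

```python
from transformers import pipeline

# device=0 selects the first CUDA GPU; use device=-1 (the default) for CPU.
classifier = pipeline('text-classification', model='philschmid/BERT-Banking77', device=0)

# Batching amortizes per-call overhead when classifying many queries at once.
queries = ['What is the base of the exchange rates?', 'How do I top up my card?']
print(classifier(queries, batch_size=2))
```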
License
The BERT-Banking77 model and its code are distributed under the terms listed on its Hugging Face model card; users should refer to the model card for detailed licensing information.