mgrella/autonlp-bank-transaction-classification-5521155
Introduction
The autonlp-bank-transaction-classification-5521155 model is designed for text classification, specifically multi-class classification of bank transaction data. It was built with Hugging Face's AutoNLP tool on top of the BERT architecture, accepts Italian-language inputs, and is compatible with Inference Endpoints, allowing for easy deployment and usage.
Architecture
This model uses the BERT architecture, a pre-trained transformer, fine-tuned for the specific task of bank transaction classification. It is built with PyTorch and is available in the SafeTensors format for safe and efficient weight handling.
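As a quick sanity check, the configuration shipped with the checkpoint can be inspected without downloading the weights; a minimal sketch (the reported label names and class count depend on what AutoNLP stored during training):

```python
from transformers import AutoConfig

# Load only the configuration to confirm the architecture and class count.
config = AutoConfig.from_pretrained("mgrella/autonlp-bank-transaction-classification-5521155", use_auth_token=True)

print(config.model_type)   # architecture family (expected to be BERT-based)
print(config.num_labels)   # number of transaction classes
print(config.id2label)     # mapping from class index to label name
```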
Training
The model was trained using Hugging Face's AutoNLP service. Key validation metrics for the model include:
- Loss: 1.317
- Accuracy: 0.8221
- Macro F1 Score: 0.571
- Micro F1 Score: 0.8221
- Weighted F1 Score: 0.8217
- Macro Precision: 0.606
- Micro Precision: 0.8221
- Weighted Precision: 0.8492
- Macro Recall: 0.587
- Micro Recall: 0.8221
- Weighted Recall: 0.8221
Overall accuracy is strong at about 82%, but the markedly lower macro-averaged scores indicate that performance is uneven across classes, with infrequent transaction categories classified less reliably.
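The gap between the micro- and macro-averaged scores is typical of class imbalance: micro averaging weights every example equally, while macro averaging weights every class equally, so rare, poorly classified classes pull the macro score down. A small illustration with scikit-learn on toy labels (not the model's actual data):

```python
from sklearn.metrics import f1_score

# Toy, imbalanced ground truth: class 0 dominates, class 2 is rare.
y_true = [0, 0, 0, 0, 0, 0, 1, 1, 2]
y_pred = [0, 0, 0, 0, 0, 0, 1, 1, 1]  # the single rare example is misclassified

print(f1_score(y_true, y_pred, average="micro"))                   # ~0.89
print(f1_score(y_true, y_pred, average="macro", zero_division=0))  # 0.60
```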
Guide: Running Locally
To run this model locally, follow these steps:
1. Set Up Environment: Install the necessary libraries, including transformers and torch.
2. Authorization: Obtain a Hugging Face access token and make it available in your environment (one option is sketched below).
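A minimal way to provide the token from Python, assuming the huggingface_hub package is installed (setting the token as an environment variable works as well):

```python
from huggingface_hub import login

# Prompts for your Hugging Face access token and stores it locally so that
# from_pretrained(..., use_auth_token=True) can authenticate.
login()
```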
3. Model and Tokenizer Loading:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# use_auth_token=True picks up the access token configured in step 2.
model = AutoModelForSequenceClassification.from_pretrained("mgrella/autonlp-bank-transaction-classification-5521155", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("mgrella/autonlp-bank-transaction-classification-5521155", use_auth_token=True)
```
4. Input Processing and Model Inference:
```python
# Tokenize an example sentence and run a forward pass through the classifier.
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
```
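The forward pass returns raw logits. To turn them into a readable prediction, take the argmax and look it up in the model's label mapping; a minimal sketch continuing from the snippet above (the actual label names depend on what was stored in the checkpoint):

```python
import torch

# Convert logits to probabilities and pick the most likely class.
probs = torch.softmax(outputs.logits, dim=-1)
predicted_id = probs.argmax(dim=-1).item()
print(model.config.id2label[predicted_id], probs.max().item())
```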
For better performance, consider using cloud GPU services such as AWS, Google Cloud, or Azure, which offer scalable resources suited for running large models.
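If a GPU is available, locally or on one of those cloud services, inference can be moved onto it. A minimal sketch, reusing the model and tokenizer loaded above (the input string is just a hypothetical Italian transaction description):

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

# The tokenized inputs must live on the same device as the model.
inputs = tokenizer("Pagamento POS supermercato", return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model(**inputs)
```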
License
The model is distributed under the terms specified on its Hugging Face model page. Users should review the specific license details applicable to this model before use or redistribution.