Financial Sentiment Analysis (Sigma)
Introduction
The Financial Sentiment Analysis model is a fine-tuned version of ahmedrachid/FinancialBERT, optimized for sentiment analysis on the financial_phrasebank dataset. It is tailored to financial text classification and is designed to achieve high accuracy and F1 scores.
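As a quick way to see what the model returns, the sketch below uses the Transformers pipeline API; the model ID Sigma/financial-sentiment-analysis is taken from the loading example later in this card, and the label names depend on the id2label mapping stored with the published checkpoint, so treat both as assumptions to verify.

```python
from transformers import pipeline

# Text-classification pipeline backed by the fine-tuned checkpoint
# (model ID assumed from the loading example later in this card).
classifier = pipeline(
    "text-classification",
    model="Sigma/financial-sentiment-analysis",
)

result = classifier("Operating profit rose compared with the previous year.")
print(result)  # e.g. [{'label': '...', 'score': ...}]; labels come from the model config
```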
Architecture
This model uses the BERT architecture, built with the Hugging Face Transformers library and PyTorch. It is adapted for text classification tasks within the financial domain.
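To make the adaptation concrete, the sketch below loads the checkpoint and inspects the classification head that sits on top of the BERT encoder; the three-class setup (negative, neutral, positive) is an assumption based on the financial_phrasebank labels and should be confirmed from the model config.

```python
from transformers import BertForSequenceClassification

# Load the fine-tuned checkpoint (model ID assumed from this card).
model = BertForSequenceClassification.from_pretrained("Sigma/financial-sentiment-analysis")

# BERT encoder plus a linear classification head over the pooled output.
print(model.classifier)          # e.g. Linear(in_features=768, out_features=3)
print(model.config.num_labels)   # expected: 3 (negative / neutral / positive)
print(model.config.id2label)     # label mapping stored with the checkpoint
```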
Training
Training Procedure
- Dataset: The model is trained on the financial_phrasebank dataset.
- Hyperparameters (see the fine-tuning sketch after this list):
- Learning Rate: 2e-05
- Train Batch Size: 32
- Eval Batch Size: 32
- Seed: 42
- Optimizer: Adam (betas=(0.9, 0.999), epsilon=1e-08)
- LR Scheduler Type: Linear
- Number of Epochs: 5
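A minimal fine-tuning sketch that wires these hyperparameters into Trainer is shown below. It is not the original training script: the financial_phrasebank configuration name (sentences_allagree), the 90/10 train/eval split, and the tokenization settings are assumptions made for illustration.

```python
from datasets import load_dataset
from transformers import (
    BertForSequenceClassification,
    BertTokenizer,
    Trainer,
    TrainingArguments,
)

# Base checkpoint being fine-tuned, per this card.
base = "ahmedrachid/FinancialBERT"
tokenizer = BertTokenizer.from_pretrained(base)
model = BertForSequenceClassification.from_pretrained(base, num_labels=3)

# financial_phrasebank; the "sentences_allagree" configuration is an assumption.
raw = load_dataset("financial_phrasebank", "sentences_allagree")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True, padding="max_length", max_length=128)

tokenized = raw.map(tokenize, batched=True).rename_column("label", "labels")
split = tokenized["train"].train_test_split(test_size=0.1, seed=42)

# Hyperparameters from the Training Procedure section above; the default
# AdamW optimizer uses betas=(0.9, 0.999) and epsilon=1e-08.
args = TrainingArguments(
    output_dir="financial-sentiment-analysis",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=split["train"],
    eval_dataset=split["test"],
    tokenizer=tokenizer,
)
trainer.train()
```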
Framework Versions
- Transformers: 4.19.1
- PyTorch: 1.11.0+cu113
- Datasets: 2.2.1
- Tokenizers: 0.12.1
Guide: Running Locally
- Environment Setup:
  - Ensure Python is installed on your system.
  - Install the required libraries:

    pip install transformers==4.19.1 torch==1.11.0 datasets==2.2.1 tokenizers==0.12.1
- Model Download:
  - Use the transformers library to load the model:

    from transformers import BertForSequenceClassification, BertTokenizer

    model = BertForSequenceClassification.from_pretrained('Sigma/financial-sentiment-analysis')
    tokenizer = BertTokenizer.from_pretrained('Sigma/financial-sentiment-analysis')
- Inference:
  - Prepare your input data and run sentiment analysis with the model (see the sketch after this list for turning the outputs into labels):

    inputs = tokenizer("Your financial text here", return_tensors="pt")
    outputs = model(**inputs)
- Cloud GPU Recommendation:
  - For optimal performance, consider cloud-based GPUs such as AWS EC2 with NVIDIA GPUs, Google Cloud Platform, or Azure.
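The outputs returned in the inference step are raw logits. The sketch below, continuing from the objects created in the guide, converts them into a predicted sentiment label; the exact label names depend on the id2label mapping stored with the checkpoint.

```python
import torch

# Continuing from the guide above: tokenizer and model are already loaded.
inputs = tokenizer("Your financial text here", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert logits to probabilities and pick the most likely class.
probs = torch.softmax(outputs.logits, dim=-1)
pred_id = int(probs.argmax(dim=-1))

# Label names are read from the checkpoint's config (expected:
# negative / neutral / positive for financial_phrasebank).
print(model.config.id2label[pred_id], float(probs[0, pred_id]))
```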
License
The model and its associated code adhere to the licensing terms provided by Hugging Face and the original model creator, ahmedrachid/FinancialBERT. Users must comply with these terms when using the model for any purpose.