Brain_Tumor_Classification (Devarshi)
Introduction
The Brain Tumor Classification model is a fine-tuned version of Microsoft's swin-tiny-patch4-window7-224
transformer, designed for image classification tasks. It operates on the imagefolder
dataset and achieves strong evaluation metrics: accuracy, F1 score, recall, and precision are each 0.9647.
Architecture
This model utilizes the Swin Transformer architecture, specifically the swin-tiny-patch4-window7-224
variant, which is known for its efficiency in processing image data by employing a hierarchical transformer structure.
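The key settings of this variant (4x4 patch embedding, 7x7 attention windows, hierarchical stages) can be inspected programmatically; as a sketch, the default values of the `transformers` library's `SwinConfig` class correspond to swin-tiny-patch4-window7-224:

```python
from transformers import SwinConfig

# Default SwinConfig values match the swin-tiny-patch4-window7-224 variant.
config = SwinConfig()
print(config.patch_size)   # size of the patches used for the initial embedding
print(config.window_size)  # local attention window size
print(config.depths)       # number of transformer blocks per hierarchical stage
```

The hierarchy shows up in `depths`: attention is computed within local windows at each stage, and feature maps are progressively downsampled, which keeps the cost manageable compared with global self-attention.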
Training
The model was trained using the following hyperparameters:
- Learning Rate: 5e-05
- Training Batch Size: 32
- Evaluation Batch Size: 32
- Seed: 42
- Gradient Accumulation Steps: 4
- Total Training Batch Size: 128
- Optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- Learning Rate Scheduler: Linear with a warmup ratio of 0.1
- Number of Epochs: 5
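The hyperparameters above map onto the `transformers` `TrainingArguments` API roughly as follows (a sketch; the `output_dir` name is a placeholder, and the original training script is not published here):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-brain-tumor",   # placeholder path, not from the model card
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # 32 x 4 = 128 total training batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```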
Training results showed a progressive improvement in validation metrics, culminating in a final validation loss of 0.1012 and an accuracy of 0.9647.
Guide: Running Locally
- Install Required Packages:
  - Ensure you have Python installed, then install the necessary packages:
    pip install transformers==4.23.1 torch==1.12.1 datasets==2.6.1 tokenizers==0.13.1
- Clone Repository and Load Model:
  - Clone the model repository or download the model files from Hugging Face.
  - Load the model using the transformers library in your Python environment.
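Loading can be sketched with the Auto classes; this assumes the Hugging Face Hub repository id `Devarshi/Brain_Tumor_Classification` and a network connection (note that the pinned transformers 4.23.1 predates `AutoImageProcessor`, so `AutoFeatureExtractor` is used here):

```python
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

# Assumed repository id; weights are downloaded from the Hugging Face Hub.
model_id = "Devarshi/Brain_Tumor_Classification"
extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()  # switch to inference mode
```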
- Data Preparation:
  - Prepare your dataset in the same format as the imagefolder dataset used for training.
- Run Inference:
  - Feed images to the model and read off the predicted class labels.
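A minimal inference sketch, assuming the model was loaded as in the earlier step (the repository id and image path are assumptions, not confirmed by the model card):

```python
import torch
from PIL import Image
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

model_id = "Devarshi/Brain_Tumor_Classification"  # assumed repository id
extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("scan.jpg").convert("RGB")  # placeholder image path
inputs = extractor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = model.config.id2label[logits.argmax(-1).item()]
print(predicted)
```

The `id2label` mapping stored in the model config translates the argmax over the output logits back into a human-readable class name.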
- Recommended Hardware:
  - For efficient processing, consider using cloud-based GPUs such as those offered by AWS, Google Cloud, or Azure.
License
This project is licensed under the Apache 2.0 License, allowing for both personal and commercial use with proper attribution.