Serverless-Roomsort
Introduction
The Serverless-Roomsort model is a fine-tuned version of microsoft/beit-base-patch16-224-pt22k-ft22k. On its evaluation set it achieves a loss of 0.0394 and an accuracy of 0.9892.
Architecture
This model uses the BEiT architecture, tailored for image classification tasks.
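As a rough sketch of what that tailoring typically looks like with the transformers library (the base checkpoint name comes from this card; the label count below is an illustrative assumption, not the model's actual number of classes):

```python
from transformers import BeitForImageClassification

# Base checkpoint named in this card.
base_checkpoint = "microsoft/beit-base-patch16-224-pt22k-ft22k"

# Swapping the pretrained 21k-class head for a task-specific one is the usual
# way BEiT is adapted to a new classification task. num_labels=2 is an
# illustrative assumption.
model = BeitForImageClassification.from_pretrained(
    base_checkpoint,
    num_labels=2,
    ignore_mismatched_sizes=True,  # allow replacing the original classifier head
)
```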
Training
Training Hyperparameters
- Learning Rate: 2e-05
- Train Batch Size: 16
- Eval Batch Size: 64
- Seed: 42
- Optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- LR Scheduler Type: Linear
- LR Scheduler Warmup Steps: 500
- Number of Epochs: 5
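The hyperparameters listed above map directly onto Hugging Face TrainingArguments. A minimal sketch, assuming the standard Trainer setup (output_dir is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="serverless-roomsort",   # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
    # Adam settings matching the values reported in this card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```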
Training Results
- Epoch 1: Loss 0.7844, Accuracy 0.9791
- Epoch 2: Loss 0.0361, Accuracy 0.9830
- Epoch 3: Loss 0.0149, Accuracy 0.9879
- Epoch 4: Loss 0.0027, Accuracy 0.9892
- Epoch 5: Loss 0.0017, Accuracy 0.9889
Guide: Running Locally
- Setup Environment: Ensure Python and PyTorch are installed.
- Clone Repository: Clone the model from Hugging Face's repository.
- Install Dependencies: Install necessary libraries such as transformers, datasets, and torch.
- Load Model: Use the transformers library to load the model.
- Inference: Run inference on image data using the loaded model, as sketched below.
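A minimal inference sketch with the transformers Auto classes; the repository id and image path below are placeholders, so substitute the model's actual Hub id and a real image file:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repository id; replace with this model's actual Hub id.
model_id = "your-org/serverless-roomsort"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder image path.
image = Image.open("room.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name.
predicted = model.config.id2label[logits.argmax(-1).item()]
print(predicted)
```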
Suggested Cloud GPUs
For optimal performance, consider using cloud GPUs like NVIDIA Tesla V100 or A100, available on platforms such as AWS, GCP, or Azure.
License
This project is licensed under the Apache-2.0 license.