beit-base-patch16-224-RH

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4340
  • Accuracy: 0.8037
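
For reference, a minimal inference sketch is shown below. It assumes the standard transformers image-classification API; the repo id is taken from the model tree at the bottom of this card, and the input image path is a placeholder, since the label set and expected image domain are not documented here.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id taken from the model tree below; label set and expected
# image domain are not documented in this card.
repo_id = "Augusto777/beit-base-patch16-224-RH"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = model.config.id2label[logits.argmax(-1).item()]
print(predicted)
```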

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them to TrainingArguments appears after the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 40
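
As a rough sketch, these values map onto the transformers TrainingArguments as follows; the output_dir, evaluation strategy, and checkpoint selection are assumptions and are not documented in this card.

```python
from transformers import TrainingArguments

# Sketch only: output_dir and the evaluation/checkpointing strategy are
# assumptions; the card documents just the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-RH",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,    # 16 x 4 = 64 effective train batch size
    num_train_epochs=40,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    evaluation_strategy="epoch",      # assumed: the table below reports per-epoch eval
    save_strategy="epoch",            # assumed
    load_best_model_at_end=True,      # assumed: the reported result matches the epoch-29 checkpoint
    metric_for_best_model="accuracy",
)
```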

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 8    | 0.7927          | 0.5888   |
| 0.8183        | 2.0   | 16   | 0.7412          | 0.5888   |
| 0.7414        | 3.0   | 24   | 0.6851          | 0.5888   |
| 0.6837        | 4.0   | 32   | 0.6638          | 0.5888   |
| 0.6621        | 5.0   | 40   | 0.6619          | 0.5981   |
| 0.6621        | 6.0   | 48   | 0.6446          | 0.6262   |
| 0.6538        | 7.0   | 56   | 0.6370          | 0.6729   |
| 0.641         | 8.0   | 64   | 0.6485          | 0.6636   |
| 0.628         | 9.0   | 72   | 0.6393          | 0.6449   |
| 0.6187        | 10.0  | 80   | 0.6409          | 0.5794   |
| 0.6187        | 11.0  | 88   | 0.6360          | 0.5794   |
| 0.6075        | 12.0  | 96   | 0.6209          | 0.6355   |
| 0.6081        | 13.0  | 104  | 0.6377          | 0.6449   |
| 0.5886        | 14.0  | 112  | 0.5931          | 0.6729   |
| 0.5945        | 15.0  | 120  | 0.6108          | 0.6636   |
| 0.5945        | 16.0  | 128  | 0.5846          | 0.7009   |
| 0.5808        | 17.0  | 136  | 0.5945          | 0.6822   |
| 0.5636        | 18.0  | 144  | 0.7402          | 0.6636   |
| 0.5839        | 19.0  | 152  | 0.5661          | 0.6916   |
| 0.5166        | 20.0  | 160  | 0.5360          | 0.6636   |
| 0.5166        | 21.0  | 168  | 0.5621          | 0.6729   |
| 0.5165        | 22.0  | 176  | 0.5509          | 0.7196   |
| 0.5308        | 23.0  | 184  | 0.5602          | 0.7570   |
| 0.4595        | 24.0  | 192  | 0.4735          | 0.7850   |
| 0.4553        | 25.0  | 200  | 0.4696          | 0.7664   |
| 0.4553        | 26.0  | 208  | 0.5306          | 0.7850   |
| 0.4004        | 27.0  | 216  | 0.4819          | 0.7944   |
| 0.3954        | 28.0  | 224  | 0.4831          | 0.7944   |
| 0.3521        | 29.0  | 232  | 0.4340          | 0.8037   |
| 0.3436        | 30.0  | 240  | 0.4790          | 0.7757   |
| 0.3436        | 31.0  | 248  | 0.4720          | 0.7757   |
| 0.34          | 32.0  | 256  | 0.5283          | 0.7850   |
| 0.2995        | 33.0  | 264  | 0.4383          | 0.7944   |
| 0.2951        | 34.0  | 272  | 0.4740          | 0.7944   |
| 0.3094        | 35.0  | 280  | 0.5863          | 0.7664   |
| 0.3094        | 36.0  | 288  | 0.4483          | 0.7850   |
| 0.2963        | 37.0  | 296  | 0.4759          | 0.7944   |
| 0.3045        | 38.0  | 304  | 0.4469          | 0.7944   |
| 0.2739        | 39.0  | 312  | 0.4517          | 0.7850   |
| 0.2717        | 40.0  | 320  | 0.4654          | 0.7944   |
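
The accuracy column above is standard classification accuracy. A minimal compute_metrics sketch is shown below; it assumes the evaluate library, and the actual metric code used for this run is not included in the card.

```python
import numpy as np
import evaluate

# How the accuracy column could have been computed; this is an assumed
# setup, not the card author's confirmed metric code.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred  # the Trainer passes (predictions, label_ids)
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```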

Framework versions

  • Transformers 4.36.2
  • PyTorch 2.1.2+cu118
  • Datasets 2.16.1
  • Tokenizers 0.15.0