smids_1x_deit_small_rms_0001_fold3

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1553
  • Accuracy: 0.7017
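
A minimal inference sketch is shown below. The repository id comes from this card; the image path `example.png` is a placeholder, and the preprocessing/model classes are the standard Transformers image-classification APIs:

```python
# Minimal inference sketch. The repository id matches this card;
# "example.png" is a placeholder path.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "hkivancoral/smids_1x_deit_small_rms_0001_fold3"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = model.config.id2label[logits.argmax(-1).item()]
print(predicted)
```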

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
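
The sketch below shows one way these hyperparameters map onto a Hugging Face `Trainer` setup. Only the values listed above come from this card; the dataset path, train/validation folder layout, preprocessing, and collation are illustrative assumptions:

```python
# Illustrative Trainer setup mirroring the listed hyperparameters.
# Assumptions: an imagefolder dataset with train/ and validation/ subfolders,
# standard DeiT preprocessing, and the Trainer's default AdamW-style optimizer.
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("imagefolder", data_dir="path/to/images")  # assumed layout
processor = AutoImageProcessor.from_pretrained("facebook/deit-small-patch16-224")

labels = dataset["train"].features["label"].names
model = AutoModelForImageClassification.from_pretrained(
    "facebook/deit-small-patch16-224",
    num_labels=len(labels),
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)

def transform(batch):
    # Convert PIL images into the normalized pixel tensors DeiT expects.
    batch["pixel_values"] = processor(
        [img.convert("RGB") for img in batch["image"]], return_tensors="pt"
    )["pixel_values"]
    return batch

dataset = dataset.with_transform(transform)

def collate_fn(examples):
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["label"] for ex in examples]),
    }

args = TrainingArguments(
    output_dir="smids_1x_deit_small_rms_0001_fold3",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",
    remove_unused_columns=False,  # keep the raw "image" column for the transform
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    data_collator=collate_fn,
)
trainer.train()
```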

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.14          | 1.0   | 75   | 1.1120          | 0.335    |
| 1.2072        | 2.0   | 150  | 1.0986          | 0.3333   |
| 0.9539        | 3.0   | 225  | 0.9334          | 0.4917   |
| 0.9512        | 4.0   | 300  | 0.9203          | 0.4983   |
| 0.911         | 5.0   | 375  | 1.0159          | 0.445    |
| 0.9061        | 6.0   | 450  | 0.9432          | 0.5133   |
| 0.8557        | 7.0   | 525  | 0.9707          | 0.5517   |
| 0.796         | 8.0   | 600  | 0.8853          | 0.5633   |
| 0.837         | 9.0   | 675  | 0.8169          | 0.5667   |
| 0.8343        | 10.0  | 750  | 0.8015          | 0.5867   |
| 0.8478        | 11.0  | 825  | 0.8424          | 0.5533   |
| 0.7471        | 12.0  | 900  | 0.8480          | 0.5733   |
| 0.7041        | 13.0  | 975  | 0.8701          | 0.55     |
| 0.7689        | 14.0  | 1050 | 0.7602          | 0.625    |
| 0.6385        | 15.0  | 1125 | 0.8263          | 0.5933   |
| 0.7131        | 16.0  | 1200 | 0.7809          | 0.595    |
| 0.7152        | 17.0  | 1275 | 0.8940          | 0.565    |
| 0.7023        | 18.0  | 1350 | 0.7651          | 0.66     |
| 0.6514        | 19.0  | 1425 | 0.7331          | 0.6783   |
| 0.7116        | 20.0  | 1500 | 0.7305          | 0.6883   |
| 0.6713        | 21.0  | 1575 | 0.7155          | 0.6733   |
| 0.634         | 22.0  | 1650 | 0.7520          | 0.6883   |
| 0.664         | 23.0  | 1725 | 0.7448          | 0.6767   |
| 0.5579        | 24.0  | 1800 | 0.7383          | 0.6967   |
| 0.6505        | 25.0  | 1875 | 0.7438          | 0.69     |
| 0.6223        | 26.0  | 1950 | 0.7719          | 0.65     |
| 0.5322        | 27.0  | 2025 | 0.7151          | 0.7017   |
| 0.5674        | 28.0  | 2100 | 0.7078          | 0.6817   |
| 0.493         | 29.0  | 2175 | 0.7341          | 0.71     |
| 0.585         | 30.0  | 2250 | 0.7150          | 0.6867   |
| 0.534         | 31.0  | 2325 | 0.7507          | 0.6967   |
| 0.458         | 32.0  | 2400 | 0.7455          | 0.6983   |
| 0.512         | 33.0  | 2475 | 0.6902          | 0.6967   |
| 0.5074        | 34.0  | 2550 | 0.6773          | 0.6983   |
| 0.512         | 35.0  | 2625 | 0.6981          | 0.7083   |
| 0.452         | 36.0  | 2700 | 0.7620          | 0.7083   |
| 0.4013        | 37.0  | 2775 | 0.7597          | 0.7033   |
| 0.4319        | 38.0  | 2850 | 0.7472          | 0.705    |
| 0.4551        | 39.0  | 2925 | 0.8012          | 0.7067   |
| 0.4136        | 40.0  | 3000 | 0.7673          | 0.7133   |
| 0.4092        | 41.0  | 3075 | 0.8184          | 0.7067   |
| 0.412         | 42.0  | 3150 | 0.8145          | 0.7183   |
| 0.4199        | 43.0  | 3225 | 0.8148          | 0.725    |
| 0.3632        | 44.0  | 3300 | 0.8661          | 0.69     |
| 0.2849        | 45.0  | 3375 | 0.9491          | 0.7167   |
| 0.3044        | 46.0  | 3450 | 0.9227          | 0.7017   |
| 0.2713        | 47.0  | 3525 | 0.9951          | 0.6983   |
| 0.22          | 48.0  | 3600 | 1.0641          | 0.7017   |
| 0.2276        | 49.0  | 3675 | 1.1632          | 0.6983   |
| 0.2183        | 50.0  | 3750 | 1.1553          | 0.7017   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0