smids_3x_deit_small_sgd_001_fold1

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3034
  • Accuracy: 0.8765
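
The card does not show how to load the checkpoint. Below is a minimal inference sketch, assuming the standard Transformers image-classification API and the repo id hkivancoral/smids_3x_deit_small_sgd_001_fold1; the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id taken from this card; adjust if the checkpoint lives elsewhere.
model_id = "hkivancoral/smids_3x_deit_small_sgd_001_fold1"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder path; substitute an image from the target domain.
image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```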

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the reproduction sketch after the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
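
The card does not include the training script. Assuming the standard Transformers `Trainer` workflow, the hyperparameters above map onto `TrainingArguments` roughly as sketched below; the output directory and the evaluation/save strategies are assumptions, not values from the card.

```python
from transformers import TrainingArguments

# Sketch only: the listed values are copied from the hyperparameter list above.
# The card reports Adam (the Trainer default), so `optim` is left at its default.
training_args = TrainingArguments(
    output_dir="smids_3x_deit_small_sgd_001_fold1",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",   # assumed: metrics are reported once per epoch
    save_strategy="epoch",         # assumed
    load_best_model_at_end=True,   # assumed
    metric_for_best_model="accuracy",
)
```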

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.874         | 1.0   | 226   | 0.8422          | 0.6477   |
| 0.6808        | 2.0   | 452   | 0.6564          | 0.7362   |
| 0.5184        | 3.0   | 678   | 0.5592          | 0.7780   |
| 0.5173        | 4.0   | 904   | 0.5087          | 0.7813   |
| 0.5113        | 5.0   | 1130  | 0.4746          | 0.8047   |
| 0.4469        | 6.0   | 1356  | 0.4445          | 0.8214   |
| 0.3731        | 7.0   | 1582  | 0.4247          | 0.8364   |
| 0.413         | 8.0   | 1808  | 0.4105          | 0.8397   |
| 0.369         | 9.0   | 2034  | 0.3943          | 0.8464   |
| 0.3882        | 10.0  | 2260  | 0.3826          | 0.8548   |
| 0.3525        | 11.0  | 2486  | 0.3772          | 0.8648   |
| 0.3235        | 12.0  | 2712  | 0.3644          | 0.8598   |
| 0.2669        | 13.0  | 2938  | 0.3578          | 0.8715   |
| 0.2463        | 14.0  | 3164  | 0.3531          | 0.8681   |
| 0.3167        | 15.0  | 3390  | 0.3484          | 0.8698   |
| 0.3258        | 16.0  | 3616  | 0.3417          | 0.8715   |
| 0.2311        | 17.0  | 3842  | 0.3383          | 0.8748   |
| 0.2669        | 18.0  | 4068  | 0.3359          | 0.8748   |
| 0.2828        | 19.0  | 4294  | 0.3332          | 0.8731   |
| 0.2409        | 20.0  | 4520  | 0.3289          | 0.8731   |
| 0.3064        | 21.0  | 4746  | 0.3277          | 0.8715   |
| 0.2918        | 22.0  | 4972  | 0.3274          | 0.8731   |
| 0.3068        | 23.0  | 5198  | 0.3218          | 0.8748   |
| 0.2544        | 24.0  | 5424  | 0.3201          | 0.8715   |
| 0.2207        | 25.0  | 5650  | 0.3191          | 0.8781   |
| 0.2923        | 26.0  | 5876  | 0.3153          | 0.8748   |
| 0.2033        | 27.0  | 6102  | 0.3156          | 0.8765   |
| 0.2492        | 28.0  | 6328  | 0.3128          | 0.8731   |
| 0.2136        | 29.0  | 6554  | 0.3157          | 0.8715   |
| 0.2344        | 30.0  | 6780  | 0.3143          | 0.8765   |
| 0.2394        | 31.0  | 7006  | 0.3111          | 0.8781   |
| 0.2259        | 32.0  | 7232  | 0.3105          | 0.8781   |
| 0.2144        | 33.0  | 7458  | 0.3085          | 0.8781   |
| 0.1831        | 34.0  | 7684  | 0.3101          | 0.8748   |
| 0.2542        | 35.0  | 7910  | 0.3071          | 0.8781   |
| 0.211         | 36.0  | 8136  | 0.3058          | 0.8781   |
| 0.2498        | 37.0  | 8362  | 0.3047          | 0.8815   |
| 0.2414        | 38.0  | 8588  | 0.3058          | 0.8765   |
| 0.1832        | 39.0  | 8814  | 0.3053          | 0.8765   |
| 0.1872        | 40.0  | 9040  | 0.3039          | 0.8781   |
| 0.1641        | 41.0  | 9266  | 0.3048          | 0.8765   |
| 0.244         | 42.0  | 9492  | 0.3047          | 0.8748   |
| 0.1877        | 43.0  | 9718  | 0.3040          | 0.8765   |
| 0.1335        | 44.0  | 9944  | 0.3029          | 0.8781   |
| 0.2501        | 45.0  | 10170 | 0.3036          | 0.8781   |
| 0.2161        | 46.0  | 10396 | 0.3032          | 0.8765   |
| 0.222         | 47.0  | 10622 | 0.3033          | 0.8781   |
| 0.2279        | 48.0  | 10848 | 0.3033          | 0.8765   |
| 0.1815        | 49.0  | 11074 | 0.3033          | 0.8765   |
| 0.1776        | 50.0  | 11300 | 0.3034          | 0.8765   |
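
The validation numbers above come from the trainer's per-epoch evaluation loop. The sketch below is one rough way to re-check the final accuracy on a local imagefolder split; the data directory is a placeholder, and the label mapping assumes the folder names match the label names stored in the fine-tuned config.

```python
import torch
from datasets import load_dataset
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "hkivancoral/smids_3x_deit_small_sgd_001_fold1"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder directory; the card only identifies the data as an imagefolder dataset.
ds = load_dataset("imagefolder", data_dir="path/to/eval_split", split="train")
class_names = ds.features["label"].names

correct = 0
for example in ds:
    inputs = processor(images=example["image"].convert("RGB"), return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(-1).item()
    # Map the folder-derived label to the id in the fine-tuned config;
    # assumes folder names match model.config.label2id keys.
    correct += int(pred == model.config.label2id[class_names[example["label"]]])

print(f"accuracy: {correct / len(ds):.4f}")
```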

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2