hushem_1x_deit_small_rms_00001_fold3

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3985
  • Accuracy: 0.8372
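
For reference, here is a minimal sketch of loading this checkpoint for inference with the transformers image-classification pipeline; the repository id is taken from this card, while the example image path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub (repository id from this card).
classifier = pipeline(
    "image-classification",
    model="hkivancoral/hushem_1x_deit_small_rms_00001_fold3",
)

# "example.jpg" is a hypothetical local image; any PIL-readable file works.
predictions = classifier("example.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts, highest score first
```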

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
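
As a rough guide to reproducing this setup, the sketch below maps the hyperparameters above onto transformers TrainingArguments; the output directory and evaluation strategy are assumptions, and the model/dataset wiring for the Trainer is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_deit_small_rms_00001_fold3",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
    evaluation_strategy="epoch",  # assumed: the results table reports one validation row per epoch
)
```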

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---------------|-------|------|-----------------|----------|
| No log        | 1.0   | 6    | 1.1600          | 0.4419   |
| 1.1403        | 2.0   | 12   | 0.8608          | 0.6744   |
| 1.1403        | 3.0   | 18   | 0.6312          | 0.7907   |
| 0.44          | 4.0   | 24   | 0.5560          | 0.7442   |
| 0.137         | 5.0   | 30   | 0.6293          | 0.6977   |
| 0.137         | 6.0   | 36   | 0.5955          | 0.7442   |
| 0.03          | 7.0   | 42   | 0.4797          | 0.8372   |
| 0.03          | 8.0   | 48   | 0.3954          | 0.8140   |
| 0.0079        | 9.0   | 54   | 0.4175          | 0.8372   |
| 0.0043        | 10.0  | 60   | 0.4040          | 0.7907   |
| 0.0043        | 11.0  | 66   | 0.4128          | 0.8372   |
| 0.0029        | 12.0  | 72   | 0.4075          | 0.8372   |
| 0.0029        | 13.0  | 78   | 0.4003          | 0.8372   |
| 0.0022        | 14.0  | 84   | 0.3993          | 0.8140   |
| 0.0018        | 15.0  | 90   | 0.3966          | 0.8140   |
| 0.0018        | 16.0  | 96   | 0.4005          | 0.8372   |
| 0.0015        | 17.0  | 102  | 0.4011          | 0.8372   |
| 0.0015        | 18.0  | 108  | 0.3998          | 0.8372   |
| 0.0013        | 19.0  | 114  | 0.3985          | 0.8372   |
| 0.0012        | 20.0  | 120  | 0.3993          | 0.8140   |
| 0.0012        | 21.0  | 126  | 0.3972          | 0.8372   |
| 0.0011        | 22.0  | 132  | 0.4006          | 0.8605   |
| 0.0011        | 23.0  | 138  | 0.3962          | 0.8372   |
| 0.001         | 24.0  | 144  | 0.3991          | 0.8605   |
| 0.0009        | 25.0  | 150  | 0.3957          | 0.8140   |
| 0.0009        | 26.0  | 156  | 0.3974          | 0.8372   |
| 0.0008        | 27.0  | 162  | 0.3957          | 0.8372   |
| 0.0008        | 28.0  | 168  | 0.3962          | 0.8372   |
| 0.0008        | 29.0  | 174  | 0.3950          | 0.8372   |
| 0.0007        | 30.0  | 180  | 0.3967          | 0.8372   |
| 0.0007        | 31.0  | 186  | 0.3962          | 0.8372   |
| 0.0007        | 32.0  | 192  | 0.3971          | 0.8372   |
| 0.0007        | 33.0  | 198  | 0.3980          | 0.8372   |
| 0.0007        | 34.0  | 204  | 0.3974          | 0.8372   |
| 0.0006        | 35.0  | 210  | 0.3977          | 0.8372   |
| 0.0006        | 36.0  | 216  | 0.3977          | 0.8372   |
| 0.0006        | 37.0  | 222  | 0.3981          | 0.8372   |
| 0.0006        | 38.0  | 228  | 0.3981          | 0.8372   |
| 0.0006        | 39.0  | 234  | 0.3981          | 0.8372   |
| 0.0006        | 40.0  | 240  | 0.3984          | 0.8372   |
| 0.0006        | 41.0  | 246  | 0.3985          | 0.8372   |
| 0.0006        | 42.0  | 252  | 0.3985          | 0.8372   |
| 0.0006        | 43.0  | 258  | 0.3985          | 0.8372   |
| 0.0006        | 44.0  | 264  | 0.3985          | 0.8372   |
| 0.0006        | 45.0  | 270  | 0.3985          | 0.8372   |
| 0.0006        | 46.0  | 276  | 0.3985          | 0.8372   |
| 0.0006        | 47.0  | 282  | 0.3985          | 0.8372   |
| 0.0006        | 48.0  | 288  | 0.3985          | 0.8372   |
| 0.0006        | 49.0  | 294  | 0.3985          | 0.8372   |
| 0.0006        | 50.0  | 300  | 0.3985          | 0.8372   |

Framework versions

  • Transformers 4.35.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.7
  • Tokenizers 0.14.1