hushem_1x_deit_small_sgd_00001_fold4

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4772
  • Accuracy: 0.2857
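
The checkpoint can be loaded for inference with the standard transformers image-classification API. This is a minimal sketch: the repository id is taken from the model name above, and "example.jpg" is a placeholder path for an input image.

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import torch

# Repository id assumed from the model name on this card.
model_id = "hkivancoral/hushem_1x_deit_small_sgd_00001_fold4"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# "example.jpg" is a placeholder; use any RGB image from the target domain.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```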

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
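
A sketch of how these hyperparameters might map onto transformers TrainingArguments; the output directory and evaluation strategy are assumptions, and the Adam betas/epsilon listed above correspond to the Trainer's optimizer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_deit_small_sgd_00001_fold4",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the default optimizer settings.
    evaluation_strategy="epoch",  # assumed; the card reports per-epoch validation metrics
)
```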

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.4838          | 0.2619   |
| 1.5238        | 2.0   | 12   | 1.4835          | 0.2619   |
| 1.5238        | 3.0   | 18   | 1.4831          | 0.2619   |
| 1.5534        | 4.0   | 24   | 1.4828          | 0.2619   |
| 1.5381        | 5.0   | 30   | 1.4825          | 0.2619   |
| 1.5381        | 6.0   | 36   | 1.4822          | 0.2619   |
| 1.5402        | 7.0   | 42   | 1.4819          | 0.2619   |
| 1.5402        | 8.0   | 48   | 1.4816          | 0.2619   |
| 1.5343        | 9.0   | 54   | 1.4813          | 0.2619   |
| 1.5296        | 10.0  | 60   | 1.4811          | 0.2619   |
| 1.5296        | 11.0  | 66   | 1.4808          | 0.2619   |
| 1.5185        | 12.0  | 72   | 1.4805          | 0.2619   |
| 1.5185        | 13.0  | 78   | 1.4803          | 0.2619   |
| 1.5511        | 14.0  | 84   | 1.4801          | 0.2619   |
| 1.5137        | 15.0  | 90   | 1.4798          | 0.2619   |
| 1.5137        | 16.0  | 96   | 1.4796          | 0.2619   |
| 1.5299        | 17.0  | 102  | 1.4794          | 0.2619   |
| 1.5299        | 18.0  | 108  | 1.4792          | 0.2857   |
| 1.4899        | 19.0  | 114  | 1.4790          | 0.2857   |
| 1.5822        | 20.0  | 120  | 1.4789          | 0.2857   |
| 1.5822        | 21.0  | 126  | 1.4787          | 0.2857   |
| 1.5002        | 22.0  | 132  | 1.4786          | 0.2857   |
| 1.5002        | 23.0  | 138  | 1.4784          | 0.2857   |
| 1.5297        | 24.0  | 144  | 1.4783          | 0.2857   |
| 1.5406        | 25.0  | 150  | 1.4781          | 0.2857   |
| 1.5406        | 26.0  | 156  | 1.4780          | 0.2857   |
| 1.5241        | 27.0  | 162  | 1.4779          | 0.2857   |
| 1.5241        | 28.0  | 168  | 1.4778          | 0.2857   |
| 1.5379        | 29.0  | 174  | 1.4777          | 0.2857   |
| 1.5253        | 30.0  | 180  | 1.4776          | 0.2857   |
| 1.5253        | 31.0  | 186  | 1.4775          | 0.2857   |
| 1.549         | 32.0  | 192  | 1.4775          | 0.2857   |
| 1.549         | 33.0  | 198  | 1.4774          | 0.2857   |
| 1.5016        | 34.0  | 204  | 1.4774          | 0.2857   |
| 1.4996        | 35.0  | 210  | 1.4773          | 0.2857   |
| 1.4996        | 36.0  | 216  | 1.4773          | 0.2857   |
| 1.533         | 37.0  | 222  | 1.4772          | 0.2857   |
| 1.533         | 38.0  | 228  | 1.4772          | 0.2857   |
| 1.5136        | 39.0  | 234  | 1.4772          | 0.2857   |
| 1.5288        | 40.0  | 240  | 1.4772          | 0.2857   |
| 1.5288        | 41.0  | 246  | 1.4772          | 0.2857   |
| 1.5195        | 42.0  | 252  | 1.4772          | 0.2857   |
| 1.5195        | 43.0  | 258  | 1.4772          | 0.2857   |
| 1.5432        | 44.0  | 264  | 1.4772          | 0.2857   |
| 1.5238        | 45.0  | 270  | 1.4772          | 0.2857   |
| 1.5238        | 46.0  | 276  | 1.4772          | 0.2857   |
| 1.544         | 47.0  | 282  | 1.4772          | 0.2857   |
| 1.544         | 48.0  | 288  | 1.4772          | 0.2857   |
| 1.5337        | 49.0  | 294  | 1.4772          | 0.2857   |
| 1.5345        | 50.0  | 300  | 1.4772          | 0.2857   |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1
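
A quick way to confirm that a local environment matches the versions listed above; expected values are shown in the comments.

```python
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.35.0
print("PyTorch:", torch.__version__)              # expected 2.1.0+cu118
print("Datasets:", datasets.__version__)          # expected 2.14.6
print("Tokenizers:", tokenizers.__version__)      # expected 0.14.1
```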