hushem_5x_deit_small_sgd_00001_fold1

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a loading sketch follows the list):

  • Loss: 1.6074
  • Accuracy: 0.2222
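
The checkpoint can be loaded with the standard Transformers image-classification classes. This is a minimal inference sketch, assuming the repository id hkivancoral/hushem_5x_deit_small_sgd_00001_fold1 on the Hugging Face Hub; the image path is a placeholder.

```python
# Minimal inference sketch (not part of the original training code).
# Assumes the checkpoint is published on the Hub under the repository id
# below; "example.jpg" is a placeholder input path.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "hkivancoral/hushem_5x_deit_small_sgd_00001_fold1"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted = model.config.id2label[logits.argmax(-1).item()]
print(predicted)
```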

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
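
These values map directly onto transformers.TrainingArguments. The sketch below is an assumption about how they could be passed to the Trainer, not the author's actual training script; output_dir and evaluation_strategy are guesses.

```python
# Sketch only: expressing the hyperparameters above with
# transformers.TrainingArguments. output_dir and evaluation_strategy are
# assumptions, not taken from this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_5x_deit_small_sgd_00001_fold1",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table
    # Optimizer left at the Trainer default (AdamW with betas=(0.9, 0.999),
    # eps=1e-8), which matches the optimizer settings listed above.
)
```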

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4194        | 1.0   | 27   | 1.6162          | 0.2222   |
| 1.4315        | 2.0   | 54   | 1.6158          | 0.2222   |
| 1.4532        | 3.0   | 81   | 1.6154          | 0.2222   |
| 1.4652        | 4.0   | 108  | 1.6150          | 0.2222   |
| 1.4244        | 5.0   | 135  | 1.6147          | 0.2222   |
| 1.4622        | 6.0   | 162  | 1.6143          | 0.2222   |
| 1.4528        | 7.0   | 189  | 1.6140          | 0.2222   |
| 1.4262        | 8.0   | 216  | 1.6136          | 0.2222   |
| 1.4181        | 9.0   | 243  | 1.6133          | 0.2222   |
| 1.4163        | 10.0  | 270  | 1.6130          | 0.2222   |
| 1.4463        | 11.0  | 297  | 1.6127          | 0.2222   |
| 1.4137        | 12.0  | 324  | 1.6124          | 0.2222   |
| 1.4131        | 13.0  | 351  | 1.6121          | 0.2222   |
| 1.4148        | 14.0  | 378  | 1.6118          | 0.2222   |
| 1.444         | 15.0  | 405  | 1.6115          | 0.2222   |
| 1.4135        | 16.0  | 432  | 1.6113          | 0.2222   |
| 1.4356        | 17.0  | 459  | 1.6110          | 0.2222   |
| 1.4146        | 18.0  | 486  | 1.6108          | 0.2222   |
| 1.4096        | 19.0  | 513  | 1.6105          | 0.2222   |
| 1.4038        | 20.0  | 540  | 1.6103          | 0.2222   |
| 1.3926        | 21.0  | 567  | 1.6101          | 0.2222   |
| 1.4332        | 22.0  | 594  | 1.6099          | 0.2222   |
| 1.4214        | 23.0  | 621  | 1.6097          | 0.2222   |
| 1.4083        | 24.0  | 648  | 1.6095          | 0.2222   |
| 1.4271        | 25.0  | 675  | 1.6093          | 0.2222   |
| 1.4496        | 26.0  | 702  | 1.6091          | 0.2222   |
| 1.4117        | 27.0  | 729  | 1.6090          | 0.2222   |
| 1.403         | 28.0  | 756  | 1.6088          | 0.2222   |
| 1.3913        | 29.0  | 783  | 1.6087          | 0.2222   |
| 1.4302        | 30.0  | 810  | 1.6085          | 0.2222   |
| 1.4037        | 31.0  | 837  | 1.6084          | 0.2222   |
| 1.4442        | 32.0  | 864  | 1.6083          | 0.2222   |
| 1.4272        | 33.0  | 891  | 1.6082          | 0.2222   |
| 1.4095        | 34.0  | 918  | 1.6080          | 0.2222   |
| 1.4234        | 35.0  | 945  | 1.6079          | 0.2222   |
| 1.4343        | 36.0  | 972  | 1.6079          | 0.2222   |
| 1.4253        | 37.0  | 999  | 1.6078          | 0.2222   |
| 1.4109        | 38.0  | 1026 | 1.6077          | 0.2222   |
| 1.4096        | 39.0  | 1053 | 1.6076          | 0.2222   |
| 1.3772        | 40.0  | 1080 | 1.6076          | 0.2222   |
| 1.4046        | 41.0  | 1107 | 1.6075          | 0.2222   |
| 1.384         | 42.0  | 1134 | 1.6075          | 0.2222   |
| 1.4202        | 43.0  | 1161 | 1.6075          | 0.2222   |
| 1.3963        | 44.0  | 1188 | 1.6074          | 0.2222   |
| 1.4183        | 45.0  | 1215 | 1.6074          | 0.2222   |
| 1.3888        | 46.0  | 1242 | 1.6074          | 0.2222   |
| 1.4088        | 47.0  | 1269 | 1.6074          | 0.2222   |
| 1.393         | 48.0  | 1296 | 1.6074          | 0.2222   |
| 1.4397        | 49.0  | 1323 | 1.6074          | 0.2222   |
| 1.4472        | 50.0  | 1350 | 1.6074          | 0.2222   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
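
A quick, optional way to confirm that a local environment matches these versions (sketch only, not part of the original card):

```python
# Environment check: print installed library versions alongside the
# versions listed in this card.
import datasets
import tokenizers
import torch
import transformers

for name, module, expected in [
    ("Transformers", transformers, "4.35.2"),
    ("PyTorch", torch, "2.1.0+cu118"),
    ("Datasets", datasets, "2.15.0"),
    ("Tokenizers", tokenizers, "0.15.0"),
]:
    print(f"{name}: installed {module.__version__}, card lists {expected}")
```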