hushem_1x_deit_small_rms_00001_fold4

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

  • Loss: 0.4388
  • Accuracy: 0.8095
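A minimal inference sketch with the Transformers image-classification API is shown below. The checkpoint name is taken from this card; the image path is a placeholder, and the sketch assumes an RGB input image.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

checkpoint = "hkivancoral/hushem_1x_deit_small_rms_00001_fold4"

# Load the image processor and the fine-tuned DeiT classifier from the Hub.
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

# "example.jpg" is a placeholder path; substitute any image from the target domain.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```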

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
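As a rough guide, these settings map onto the Transformers TrainingArguments shown below. This is a hedged sketch rather than the exact training script: the output directory is a placeholder, and the per-epoch evaluation strategy is inferred from the results table rather than stated in the card.

```python
from transformers import TrainingArguments

# Sketch only: maps the listed hyperparameters onto TrainingArguments fields.
# "./hushem_1x_deit_small_rms_00001_fold4" is a placeholder output path.
training_args = TrainingArguments(
    output_dir="./hushem_1x_deit_small_rms_00001_fold4",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: validation metrics are reported once per epoch
)
```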

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.2550          | 0.4286   |
| 1.2114        | 2.0   | 12   | 1.1127          | 0.4762   |
| 1.2114        | 3.0   | 18   | 0.8436          | 0.6667   |
| 0.6039        | 4.0   | 24   | 0.7891          | 0.6429   |
| 0.2289        | 5.0   | 30   | 0.6119          | 0.7143   |
| 0.2289        | 6.0   | 36   | 0.5730          | 0.7381   |
| 0.0572        | 7.0   | 42   | 0.5854          | 0.7381   |
| 0.0572        | 8.0   | 48   | 0.4823          | 0.7619   |
| 0.0155        | 9.0   | 54   | 0.4273          | 0.8095   |
| 0.0057        | 10.0  | 60   | 0.4459          | 0.8095   |
| 0.0057        | 11.0  | 66   | 0.4283          | 0.8333   |
| 0.0036        | 12.0  | 72   | 0.4439          | 0.8333   |
| 0.0036        | 13.0  | 78   | 0.4381          | 0.8333   |
| 0.0028        | 14.0  | 84   | 0.4361          | 0.8095   |
| 0.0022        | 15.0  | 90   | 0.4297          | 0.8095   |
| 0.0022        | 16.0  | 96   | 0.4286          | 0.8333   |
| 0.0018        | 17.0  | 102  | 0.4333          | 0.8333   |
| 0.0018        | 18.0  | 108  | 0.4303          | 0.8333   |
| 0.0015        | 19.0  | 114  | 0.4275          | 0.8095   |
| 0.0014        | 20.0  | 120  | 0.4353          | 0.8095   |
| 0.0014        | 21.0  | 126  | 0.4311          | 0.8095   |
| 0.0012        | 22.0  | 132  | 0.4354          | 0.8095   |
| 0.0012        | 23.0  | 138  | 0.4378          | 0.8095   |
| 0.0011        | 24.0  | 144  | 0.4372          | 0.8095   |
| 0.001         | 25.0  | 150  | 0.4362          | 0.8095   |
| 0.001         | 26.0  | 156  | 0.4357          | 0.8095   |
| 0.0009        | 27.0  | 162  | 0.4417          | 0.8095   |
| 0.0009        | 28.0  | 168  | 0.4425          | 0.8095   |
| 0.0009        | 29.0  | 174  | 0.4408          | 0.8095   |
| 0.0008        | 30.0  | 180  | 0.4402          | 0.8095   |
| 0.0008        | 31.0  | 186  | 0.4406          | 0.8095   |
| 0.0008        | 32.0  | 192  | 0.4385          | 0.8095   |
| 0.0008        | 33.0  | 198  | 0.4397          | 0.8095   |
| 0.0007        | 34.0  | 204  | 0.4393          | 0.8095   |
| 0.0007        | 35.0  | 210  | 0.4395          | 0.8095   |
| 0.0007        | 36.0  | 216  | 0.4391          | 0.8095   |
| 0.0007        | 37.0  | 222  | 0.4387          | 0.8095   |
| 0.0007        | 38.0  | 228  | 0.4386          | 0.8095   |
| 0.0007        | 39.0  | 234  | 0.4388          | 0.8095   |
| 0.0007        | 40.0  | 240  | 0.4387          | 0.8095   |
| 0.0007        | 41.0  | 246  | 0.4388          | 0.8095   |
| 0.0007        | 42.0  | 252  | 0.4388          | 0.8095   |
| 0.0007        | 43.0  | 258  | 0.4388          | 0.8095   |
| 0.0006        | 44.0  | 264  | 0.4388          | 0.8095   |
| 0.0007        | 45.0  | 270  | 0.4388          | 0.8095   |
| 0.0007        | 46.0  | 276  | 0.4388          | 0.8095   |
| 0.0007        | 47.0  | 282  | 0.4388          | 0.8095   |
| 0.0007        | 48.0  | 288  | 0.4388          | 0.8095   |
| 0.0006        | 49.0  | 294  | 0.4388          | 0.8095   |
| 0.0007        | 50.0  | 300  | 0.4388          | 0.8095   |
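The per-epoch validation accuracy above is the kind of metric typically produced by a compute_metrics callback passed to the Trainer. The sketch below uses the evaluate library, which is an assumption; the card does not state how accuracy was computed.

```python
import numpy as np
import evaluate

# Assumption: accuracy is computed with the `evaluate` library's "accuracy" metric;
# the card does not specify how the reported values were obtained.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```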

Framework versions

  • Transformers 4.35.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.7
  • Tokenizers 0.14.1