hushem_1x_deit_small_rms_0001_fold1

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 2.9945
  • Accuracy: 0.5111
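
For reference, below is a minimal inference sketch using the standard transformers image-classification API. The checkpoint ID is taken from this repository; the example image path is a placeholder, and this snippet is an illustrative assumption rather than part of the original card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "hkivancoral/hushem_1x_deit_small_rms_0001_fold1"

# Load the fine-tuned checkpoint and its matching image processor from the Hub.
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# "example.jpg" is a placeholder input image.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```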

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
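
As a rough illustration, the listed hyperparameters correspond to a transformers TrainingArguments configuration along the following lines. The output directory and the evaluation/save strategies are assumptions not stated in this card.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training setup from the hyperparameters above;
# output_dir and evaluation_strategy are assumptions, not taken from the original card.
training_args = TrainingArguments(
    output_dir="hushem_1x_deit_small_rms_0001_fold1",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed from the per-epoch validation results below
)
```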

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.6080          | 0.2444   |
| 2.1035        | 2.0   | 12   | 1.4698          | 0.2444   |
| 2.1035        | 3.0   | 18   | 1.4524          | 0.2444   |
| 1.4518        | 4.0   | 24   | 1.3705          | 0.4444   |
| 1.3954        | 5.0   | 30   | 1.3051          | 0.3333   |
| 1.3954        | 6.0   | 36   | 1.7248          | 0.2444   |
| 1.3016        | 7.0   | 42   | 1.2877          | 0.4444   |
| 1.3016        | 8.0   | 48   | 1.2531          | 0.3111   |
| 1.1687        | 9.0   | 54   | 2.0909          | 0.3333   |
| 1.0636        | 10.0  | 60   | 1.6618          | 0.4000   |
| 1.0636        | 11.0  | 66   | 1.5040          | 0.3778   |
| 0.852         | 12.0  | 72   | 1.6151          | 0.3556   |
| 0.852         | 13.0  | 78   | 1.3188          | 0.4667   |
| 0.6455        | 14.0  | 84   | 1.7023          | 0.4000   |
| 0.4371        | 15.0  | 90   | 1.6246          | 0.4889   |
| 0.4371        | 16.0  | 96   | 1.5721          | 0.5111   |
| 0.2911        | 17.0  | 102  | 1.7046          | 0.4889   |
| 0.2911        | 18.0  | 108  | 2.5082          | 0.4889   |
| 0.0854        | 19.0  | 114  | 2.6259          | 0.4444   |
| 0.0965        | 20.0  | 120  | 2.0620          | 0.6000   |
| 0.0965        | 21.0  | 126  | 2.2433          | 0.4889   |
| 0.0234        | 22.0  | 132  | 1.7971          | 0.5333   |
| 0.0234        | 23.0  | 138  | 2.3712          | 0.4667   |
| 0.039         | 24.0  | 144  | 1.6644          | 0.6000   |
| 0.016         | 25.0  | 150  | 2.8343          | 0.4222   |
| 0.016         | 26.0  | 156  | 2.1232          | 0.5111   |
| 0.0027        | 27.0  | 162  | 2.6607          | 0.4889   |
| 0.0027        | 28.0  | 168  | 2.7233          | 0.4889   |
| 0.0006        | 29.0  | 174  | 2.7692          | 0.4889   |
| 0.0004        | 30.0  | 180  | 2.8123          | 0.5111   |
| 0.0004        | 31.0  | 186  | 2.8470          | 0.5111   |
| 0.0004        | 32.0  | 192  | 2.8799          | 0.5111   |
| 0.0004        | 33.0  | 198  | 2.9035          | 0.5111   |
| 0.0003        | 34.0  | 204  | 2.9241          | 0.5111   |
| 0.0003        | 35.0  | 210  | 2.9413          | 0.5111   |
| 0.0003        | 36.0  | 216  | 2.9570          | 0.5111   |
| 0.0003        | 37.0  | 222  | 2.9705          | 0.5111   |
| 0.0003        | 38.0  | 228  | 2.9803          | 0.5111   |
| 0.0003        | 39.0  | 234  | 2.9868          | 0.5111   |
| 0.0003        | 40.0  | 240  | 2.9915          | 0.5111   |
| 0.0003        | 41.0  | 246  | 2.9938          | 0.5111   |
| 0.0003        | 42.0  | 252  | 2.9945          | 0.5111   |
| 0.0003        | 43.0  | 258  | 2.9945          | 0.5111   |
| 0.0002        | 44.0  | 264  | 2.9945          | 0.5111   |
| 0.0003        | 45.0  | 270  | 2.9945          | 0.5111   |
| 0.0003        | 46.0  | 276  | 2.9945          | 0.5111   |
| 0.0002        | 47.0  | 282  | 2.9945          | 0.5111   |
| 0.0002        | 48.0  | 288  | 2.9945          | 0.5111   |
| 0.0003        | 49.0  | 294  | 2.9945          | 0.5111   |
| 0.0002        | 50.0  | 300  | 2.9945          | 0.5111   |

Framework versions

  • Transformers 4.35.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.7
  • Tokenizers 0.14.1