smids_1x_deit_small_adamax_00001_fold2

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7991
  • Accuracy: 0.8569
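
The card does not include a usage example. Below is a minimal inference sketch using the standard transformers Auto classes; the checkpoint name is this repository's id and the image path is a placeholder.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

ckpt = "hkivancoral/smids_1x_deit_small_adamax_00001_fold2"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForImageClassification.from_pretrained(ckpt)

image = Image.open("example.png")  # placeholder path to an input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```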

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
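
No further dataset details are provided; the description above only names the generic imagefolder loader. As an illustration, such a dataset is typically a directory with one sub-folder per class and is loaded roughly as in the sketch below (the data_dir path is a placeholder, not the actual data location).

```python
from datasets import load_dataset

# Placeholder path; the real data location for this model is not documented.
# The "imagefolder" loader expects one sub-directory per class and infers
# labels from the folder names.
dataset = load_dataset("imagefolder", data_dir="path/to/smids_images")

print(dataset)                                   # DatasetDict with the available splits
print(dataset["train"].features["label"].names)  # class names inferred from folder names
```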

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
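
The training script itself is not part of this card. A rough sketch of how the hyperparameters above map onto transformers TrainingArguments follows; output_dir and the evaluation/logging strategies are assumptions (the results table below suggests one evaluation per epoch), not documented settings.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_1x_deit_small_adamax_00001_fold2",  # assumed output path
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: one evaluation per epoch
    logging_strategy="epoch",     # assumption
)
```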

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
0.5561        | 1.0   | 75   | 0.5421          | 0.7804
0.3586        | 2.0   | 150  | 0.4304          | 0.8286
0.3328        | 3.0   | 225  | 0.3782          | 0.8586
0.266         | 4.0   | 300  | 0.3735          | 0.8536
0.2451        | 5.0   | 375  | 0.3596          | 0.8536
0.2001        | 6.0   | 450  | 0.3408          | 0.8669
0.1658        | 7.0   | 525  | 0.3450          | 0.8552
0.1239        | 8.0   | 600  | 0.3523          | 0.8519
0.0994        | 9.0   | 675  | 0.3736          | 0.8602
0.0972        | 10.0  | 750  | 0.3715          | 0.8636
0.0323        | 11.0  | 825  | 0.4031          | 0.8569
0.0293        | 12.0  | 900  | 0.4332          | 0.8586
0.0328        | 13.0  | 975  | 0.4440          | 0.8636
0.0162        | 14.0  | 1050 | 0.4947          | 0.8552
0.02          | 15.0  | 1125 | 0.5267          | 0.8586
0.014         | 16.0  | 1200 | 0.5586          | 0.8586
0.0039        | 17.0  | 1275 | 0.5843          | 0.8536
0.0174        | 18.0  | 1350 | 0.6241          | 0.8586
0.003         | 19.0  | 1425 | 0.6332          | 0.8502
0.0035        | 20.0  | 1500 | 0.6412          | 0.8569
0.0013        | 21.0  | 1575 | 0.6514          | 0.8519
0.0012        | 22.0  | 1650 | 0.6772          | 0.8502
0.0007        | 23.0  | 1725 | 0.6821          | 0.8502
0.0007        | 24.0  | 1800 | 0.7040          | 0.8569
0.0006        | 25.0  | 1875 | 0.6982          | 0.8569
0.0005        | 26.0  | 1950 | 0.7116          | 0.8519
0.0162        | 27.0  | 2025 | 0.7266          | 0.8602
0.0086        | 28.0  | 2100 | 0.7241          | 0.8536
0.0081        | 29.0  | 2175 | 0.7251          | 0.8602
0.0111        | 30.0  | 2250 | 0.7386          | 0.8602
0.0161        | 31.0  | 2325 | 0.7520          | 0.8586
0.0003        | 32.0  | 2400 | 0.7496          | 0.8519
0.0003        | 33.0  | 2475 | 0.7540          | 0.8552
0.0003        | 34.0  | 2550 | 0.7601          | 0.8602
0.0123        | 35.0  | 2625 | 0.7663          | 0.8519
0.0002        | 36.0  | 2700 | 0.7752          | 0.8586
0.0002        | 37.0  | 2775 | 0.7743          | 0.8502
0.0002        | 38.0  | 2850 | 0.7784          | 0.8519
0.0004        | 39.0  | 2925 | 0.7826          | 0.8536
0.0002        | 40.0  | 3000 | 0.7838          | 0.8536
0.0044        | 41.0  | 3075 | 0.7838          | 0.8586
0.0002        | 42.0  | 3150 | 0.7911          | 0.8552
0.001         | 43.0  | 3225 | 0.7929          | 0.8569
0.0106        | 44.0  | 3300 | 0.7934          | 0.8552
0.0002        | 45.0  | 3375 | 0.7947          | 0.8552
0.0002        | 46.0  | 3450 | 0.7979          | 0.8536
0.001         | 47.0  | 3525 | 0.7983          | 0.8536
0.0002        | 48.0  | 3600 | 0.7989          | 0.8569
0.0022        | 49.0  | 3675 | 0.7991          | 0.8569
0.0004        | 50.0  | 3750 | 0.7991          | 0.8569

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0