# smids_1x_deit_small_sgd_0001_fold3
This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.8065
- Accuracy: 0.6733
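
A minimal inference sketch using the `transformers` image-classification pipeline; the repo id matches this card's name, and the image path is a placeholder:

```python
from transformers import pipeline

# Hypothetical usage: adjust the model id / image path to your own checkpoint and data.
classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_1x_deit_small_sgd_0001_fold3",
)

# Returns a list of {"label": ..., "score": ...} dicts, highest score first.
for prediction in classifier("path/to/image.png"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```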
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch mapping these values appears after the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
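
Assuming the run used the 🤗 Trainer, the values above map onto `TrainingArguments` roughly as follows. This is a sketch, not the recorded configuration: `output_dir` and `evaluation_strategy` are assumptions not listed on this card.

```python
from transformers import TrainingArguments

# Sketch reproducing the listed hyperparameters (Transformers 4.35.x).
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults, so no extra flags are needed.
training_args = TrainingArguments(
    output_dir="smids_1x_deit_small_sgd_0001_fold3",  # assumption
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: validation metrics are reported once per epoch
)
```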
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
1.08 | 1.0 | 75 | 1.0697 | 0.42 |
1.068 | 2.0 | 150 | 1.0540 | 0.4567 |
1.0267 | 3.0 | 225 | 1.0400 | 0.4817 |
0.9978 | 4.0 | 300 | 1.0279 | 0.5117 |
1.0069 | 5.0 | 375 | 1.0166 | 0.5167 |
0.9928 | 6.0 | 450 | 1.0059 | 0.5333 |
0.9852 | 7.0 | 525 | 0.9957 | 0.535 |
0.9289 | 8.0 | 600 | 0.9859 | 0.54 |
0.9567 | 9.0 | 675 | 0.9764 | 0.5533 |
0.9517 | 10.0 | 750 | 0.9672 | 0.5583 |
0.9256 | 11.0 | 825 | 0.9583 | 0.565 |
0.9357 | 12.0 | 900 | 0.9498 | 0.565 |
0.9377 | 13.0 | 975 | 0.9415 | 0.57 |
0.9369 | 14.0 | 1050 | 0.9334 | 0.575 |
0.8836 | 15.0 | 1125 | 0.9259 | 0.5667 |
0.922 | 16.0 | 1200 | 0.9184 | 0.575 |
0.8518 | 17.0 | 1275 | 0.9112 | 0.5833 |
0.8741 | 18.0 | 1350 | 0.9043 | 0.5883 |
0.8604 | 19.0 | 1425 | 0.8978 | 0.5933 |
0.8822 | 20.0 | 1500 | 0.8914 | 0.5983 |
0.8613 | 21.0 | 1575 | 0.8855 | 0.61 |
0.8383 | 22.0 | 1650 | 0.8797 | 0.6117 |
0.8466 | 23.0 | 1725 | 0.8742 | 0.615 |
0.8373 | 24.0 | 1800 | 0.8689 | 0.6217 |
0.8379 | 25.0 | 1875 | 0.8638 | 0.6283 |
0.8435 | 26.0 | 1950 | 0.8590 | 0.6217 |
0.7992 | 27.0 | 2025 | 0.8544 | 0.6267 |
0.8242 | 28.0 | 2100 | 0.8500 | 0.6317 |
0.8405 | 29.0 | 2175 | 0.8460 | 0.635 |
0.8059 | 30.0 | 2250 | 0.8421 | 0.6383 |
0.8193 | 31.0 | 2325 | 0.8384 | 0.6383 |
0.8107 | 32.0 | 2400 | 0.8351 | 0.64 |
0.777 | 33.0 | 2475 | 0.8318 | 0.6433 |
0.799 | 34.0 | 2550 | 0.8288 | 0.6483 |
0.7972 | 35.0 | 2625 | 0.8260 | 0.6533 |
0.8308 | 36.0 | 2700 | 0.8234 | 0.6533 |
0.761 | 37.0 | 2775 | 0.8210 | 0.6567 |
0.8092 | 38.0 | 2850 | 0.8187 | 0.6567 |
0.8047 | 39.0 | 2925 | 0.8167 | 0.66 |
0.7661 | 40.0 | 3000 | 0.8149 | 0.6633 |
0.7897 | 41.0 | 3075 | 0.8132 | 0.6633 |
0.7801 | 42.0 | 3150 | 0.8117 | 0.665 |
0.8176 | 43.0 | 3225 | 0.8105 | 0.6683 |
0.7701 | 44.0 | 3300 | 0.8093 | 0.6717 |
0.8017 | 45.0 | 3375 | 0.8084 | 0.6717 |
0.7892 | 46.0 | 3450 | 0.8077 | 0.6717 |
0.7778 | 47.0 | 3525 | 0.8071 | 0.6717 |
0.7798 | 48.0 | 3600 | 0.8068 | 0.6733 |
0.7777 | 49.0 | 3675 | 0.8066 | 0.6733 |
0.7532 | 50.0 | 3750 | 0.8065 | 0.6733 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
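
To verify that a local environment matches these pins, a quick (assumed) sanity check:

```python
import datasets
import tokenizers
import torch
import transformers

# Compare the installed versions against the list above.
print("Transformers:", transformers.__version__)  # 4.35.2
print("PyTorch:", torch.__version__)              # 2.1.0+cu118
print("Datasets:", datasets.__version__)          # 2.15.0
print("Tokenizers:", tokenizers.__version__)      # 0.15.0
```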