smids_5x_beit_base_sgd_00001_fold3

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1248
  • Accuracy: 0.3967
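
The original card does not include usage code. The following is a minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub under hkivancoral/smids_5x_beit_base_sgd_00001_fold3 and that "example.png" is a placeholder for a local image; it is not part of the original training or evaluation pipeline.

```python
# Minimal inference sketch (assumed usage, not from the original card).
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "hkivancoral/smids_5x_beit_base_sgd_00001_fold3"  # repo name from this card
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = model.config.id2label[logits.argmax(-1).item()]
print(predicted)
```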

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
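
As a rough illustration, the sketch below shows how these settings map onto the transformers Trainer API. The output_dir is a hypothetical placeholder, the dataset and model setup are omitted, and the default AdamW optimizer (betas=(0.9, 0.999), epsilon=1e-08) corresponds to the optimizer listed above; this is not the original training script.

```python
# Hypothetical reconstruction of the listed hyperparameters (not the original script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_5x_beit_base_sgd_00001_fold3",  # placeholder output path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the results table reports one evaluation per epoch
    logging_strategy="epoch",
)
```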

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.2551        | 1.0   | 375   | 1.3211          | 0.3167   |
| 1.2561        | 2.0   | 750   | 1.3119          | 0.3200   |
| 1.2134        | 3.0   | 1125  | 1.3028          | 0.3250   |
| 1.2260        | 4.0   | 1500  | 1.2942          | 0.3250   |
| 1.1635        | 5.0   | 1875  | 1.2859          | 0.3267   |
| 1.2304        | 6.0   | 2250  | 1.2778          | 0.3333   |
| 1.1734        | 7.0   | 2625  | 1.2702          | 0.3383   |
| 1.1724        | 8.0   | 3000  | 1.2625          | 0.3417   |
| 1.1336        | 9.0   | 3375  | 1.2554          | 0.3467   |
| 1.1266        | 10.0  | 3750  | 1.2486          | 0.3517   |
| 1.1276        | 11.0  | 4125  | 1.2419          | 0.3550   |
| 1.1538        | 12.0  | 4500  | 1.2355          | 0.3550   |
| 1.1425        | 13.0  | 4875  | 1.2292          | 0.3567   |
| 1.1463        | 14.0  | 5250  | 1.2233          | 0.3600   |
| 1.1661        | 15.0  | 5625  | 1.2174          | 0.3633   |
| 1.1118        | 16.0  | 6000  | 1.2118          | 0.3650   |
| 1.1230        | 17.0  | 6375  | 1.2063          | 0.3667   |
| 1.1065        | 18.0  | 6750  | 1.2010          | 0.3667   |
| 1.1074        | 19.0  | 7125  | 1.1959          | 0.3650   |
| 1.0742        | 20.0  | 7500  | 1.1911          | 0.3717   |
| 1.0616        | 21.0  | 7875  | 1.1865          | 0.3717   |
| 1.0745        | 22.0  | 8250  | 1.1820          | 0.3717   |
| 1.0871        | 23.0  | 8625  | 1.1777          | 0.3717   |
| 1.0310        | 24.0  | 9000  | 1.1737          | 0.3717   |
| 1.0843        | 25.0  | 9375  | 1.1697          | 0.3750   |
| 1.0616        | 26.0  | 9750  | 1.1660          | 0.3767   |
| 1.0414        | 27.0  | 10125 | 1.1624          | 0.3783   |
| 1.0303        | 28.0  | 10500 | 1.1590          | 0.3783   |
| 0.9887        | 29.0  | 10875 | 1.1558          | 0.3800   |
| 1.0267        | 30.0  | 11250 | 1.1528          | 0.3800   |
| 1.0792        | 31.0  | 11625 | 1.1499          | 0.3833   |
| 1.0736        | 32.0  | 12000 | 1.1472          | 0.3883   |
| 1.0868        | 33.0  | 12375 | 1.1446          | 0.3900   |
| 1.0257        | 34.0  | 12750 | 1.1422          | 0.3883   |
| 1.0237        | 35.0  | 13125 | 1.1400          | 0.3900   |
| 1.0201        | 36.0  | 13500 | 1.1379          | 0.3900   |
| 1.0769        | 37.0  | 13875 | 1.1360          | 0.3917   |
| 1.0320        | 38.0  | 14250 | 1.1343          | 0.3933   |
| 1.0317        | 39.0  | 14625 | 1.1327          | 0.3950   |
| 1.0402        | 40.0  | 15000 | 1.1312          | 0.3950   |
| 0.9570        | 41.0  | 15375 | 1.1300          | 0.3950   |
| 1.0445        | 42.0  | 15750 | 1.1288          | 0.3950   |
| 1.0399        | 43.0  | 16125 | 1.1278          | 0.3950   |
| 1.0323        | 44.0  | 16500 | 1.1270          | 0.3967   |
| 1.0444        | 45.0  | 16875 | 1.1263          | 0.3967   |
| 0.9983        | 46.0  | 17250 | 1.1257          | 0.3967   |
| 1.0420        | 47.0  | 17625 | 1.1253          | 0.3967   |
| 1.0685        | 48.0  | 18000 | 1.1250          | 0.3967   |
| 1.0486        | 49.0  | 18375 | 1.1249          | 0.3967   |
| 1.0457        | 50.0  | 18750 | 1.1248          | 0.3967   |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2
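
As an optional sanity check (not part of the original card), the installed library versions can be compared against the list above:

```python
# Print installed versions to compare with the ones listed on this card.
import transformers, torch, datasets, tokenizers

print(transformers.__version__)  # expected: 4.32.1
print(torch.__version__)         # expected: 2.1.0+cu121
print(datasets.__version__)      # expected: 2.12.0
print(tokenizers.__version__)    # expected: 0.13.2
```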