BEiT-TO-DA

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset (a usage sketch follows the results below). It achieves the following results on the evaluation set:

  • Loss: 0.3226
  • Accuracy: 0.9032
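
As a minimal inference sketch using the standard Transformers image-classification API: the repository id Augusto777/BEiT-TO-DA is taken from the model page, and the local image path is only a placeholder to adapt to your own data.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "Augusto777/BEiT-TO-DA"  # repo id from the model page
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```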

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (they are mirrored in the sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 40
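
For reference only, a sketch of how these values map onto transformers.TrainingArguments; the output directory is a placeholder and the Trainer/dataset wiring is omitted, since the actual training script is not part of this card.

```python
from transformers import TrainingArguments

# Sketch only: output_dir is hypothetical; dataset and Trainer setup are omitted.
training_args = TrainingArguments(
    output_dir="BEiT-TO-DA",            # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,      # effective train batch size: 16 * 4 = 64
    num_train_epochs=40,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,                     # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```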

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5028 | 0.97 | 14 | 1.3862 | 0.1452 |
| 1.3358 | 2.0 | 29 | 1.0831 | 0.8065 |
| 0.8919 | 2.97 | 43 | 0.8097 | 0.8226 |
| 0.7328 | 4.0 | 58 | 0.6546 | 0.7742 |
| 0.4881 | 4.97 | 72 | 0.4860 | 0.8226 |
| 0.4367 | 6.0 | 87 | 0.4770 | 0.8548 |
| 0.3343 | 6.97 | 101 | 0.3912 | 0.8548 |
| 0.2794 | 8.0 | 116 | 0.3226 | 0.9032 |
| 0.2424 | 8.97 | 130 | 0.6426 | 0.7903 |
| 0.2875 | 10.0 | 145 | 0.4604 | 0.8710 |
| 0.226 | 10.97 | 159 | 0.3026 | 0.8548 |
| 0.1819 | 12.0 | 174 | 0.3875 | 0.8710 |
| 0.2354 | 12.97 | 188 | 0.3413 | 0.9032 |
| 0.2264 | 14.0 | 203 | 0.3948 | 0.8871 |
| 0.1652 | 14.97 | 217 | 0.3650 | 0.8710 |
| 0.1449 | 16.0 | 232 | 0.3611 | 0.8871 |
| 0.0993 | 16.97 | 246 | 0.4574 | 0.8710 |
| 0.1566 | 18.0 | 261 | 0.3924 | 0.8871 |
| 0.1399 | 18.97 | 275 | 0.4828 | 0.8548 |
| 0.1025 | 20.0 | 290 | 0.5377 | 0.8710 |
| 0.0855 | 20.97 | 304 | 0.4958 | 0.8548 |
| 0.1419 | 22.0 | 319 | 0.6156 | 0.8387 |
| 0.117 | 22.97 | 333 | 0.4915 | 0.8710 |
| 0.0905 | 24.0 | 348 | 0.5897 | 0.8710 |
| 0.1199 | 24.97 | 362 | 0.4871 | 0.8710 |
| 0.1246 | 26.0 | 377 | 0.4824 | 0.8548 |
| 0.0967 | 26.97 | 391 | 0.7484 | 0.8065 |
| 0.1025 | 28.0 | 406 | 0.6974 | 0.8387 |
| 0.1112 | 28.97 | 420 | 0.6391 | 0.8226 |
| 0.0715 | 30.0 | 435 | 0.6585 | 0.8226 |
| 0.085 | 30.97 | 449 | 0.7087 | 0.8065 |
| 0.1032 | 32.0 | 464 | 0.6094 | 0.8387 |
| 0.0836 | 32.97 | 478 | 0.5578 | 0.8065 |
| 0.0716 | 34.0 | 493 | 0.5497 | 0.8710 |
| 0.069 | 34.97 | 507 | 0.5093 | 0.8710 |
| 0.0577 | 36.0 | 522 | 0.5189 | 0.8710 |
| 0.0882 | 36.97 | 536 | 0.6531 | 0.8226 |
| 0.0563 | 38.0 | 551 | 0.6661 | 0.8226 |
| 0.0841 | 38.62 | 560 | 0.6616 | 0.8226 |

Framework versions

  • Transformers 4.36.2
  • PyTorch 2.1.2+cu118
  • Datasets 2.16.1
  • Tokenizers 0.15.0