---
base_model: google/vit-base-patch16-224-in21k
library_name: peft
license: apache-2.0
metrics:
  - accuracy
tags:
  - generated_from_trainer
model-index:
  - name: vit-base-patch16-224-in21k-lora
    results: []
---


# vit-base-patch16-224-in21k-lora

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the metrics):

- Loss: 0.2697
- Accuracy: 0.9228
- Pca Loss: 1.3062
- Pca Accuracy: 0.6503
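
Since the card provides no usage code, the following is a minimal sketch of loading the adapter for inference with `peft` and `transformers`. The adapter repository id and the number of classes are assumptions (neither is recorded in this card); substitute the real values for your setup.

```python
# Hedged sketch: load the LoRA adapter on top of the base ViT for inference.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification
from peft import PeftModel

BASE_ID = "google/vit-base-patch16-224-in21k"
ADAPTER_ID = "sajjadi/vit-base-patch16-224-in21k-lora"  # assumption: adapter repo id
NUM_CLASSES = 10                                        # assumption: your dataset's class count

processor = AutoImageProcessor.from_pretrained(BASE_ID)
base = AutoModelForImageClassification.from_pretrained(BASE_ID, num_labels=NUM_CLASSES)
model = PeftModel.from_pretrained(base, ADAPTER_ID)  # attaches LoRA weights (and saved head, if any)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # any RGB image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted class index:", logits.argmax(-1).item())
```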

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 0.002
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
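
The card does not record the LoRA configuration itself (rank, alpha, target modules). As a hedged sketch, the listed hyperparameters map onto `TrainingArguments` roughly as below; the `LoraConfig` values are placeholders, not the author's actual settings. Note that the total train batch size of 512 is the per-device batch size of 128 times the 4 gradient-accumulation steps.

```python
# Hedged sketch of a setup matching the listed hyperparameters.
# LoRA rank/alpha/target_modules are NOT recorded in this card; placeholders only.
from transformers import TrainingArguments
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                                # placeholder rank
    lora_alpha=16,                       # placeholder scaling
    target_modules=["query", "value"],   # a common choice for ViT attention layers
    modules_to_save=["classifier"],      # assumption: the new head is trained fully
)

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-in21k-lora",
    learning_rate=2e-3,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=4,       # 128 * 4 = total train batch size of 512
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                           # "Native AMP" mixed precision
)
# Wrap the base model before passing it to Trainer:
#   from peft import get_peft_model
#   model = get_peft_model(base_model, lora_config)
```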

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | Pca Loss | Pca Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:--------:|:------------:|
| 0.94          | 0.9923  | 97   | 0.4913          | 0.8846   | 1.2038   | 0.7736       |
| 0.817         | 1.9949  | 195  | 0.3755          | 0.8983   | 0.9604   | 0.795        |
| 0.8026        | 2.9974  | 293  | 0.3390          | 0.9051   | 0.9770   | 0.7767       |
| 0.7858        | 4.0     | 391  | 0.3173          | 0.908    | 0.9143   | 0.7867       |
| 0.6296        | 4.9923  | 488  | 0.3041          | 0.9123   | 0.9831   | 0.7605       |
| 0.8307        | 5.9949  | 586  | 0.2981          | 0.9097   | 0.9779   | 0.7575       |
| 0.7709        | 6.9974  | 684  | 0.2921          | 0.9141   | 0.9452   | 0.7604       |
| 0.658         | 8.0     | 782  | 0.2853          | 0.9166   | 1.0252   | 0.7402       |
| 0.6807        | 8.9923  | 879  | 0.2803          | 0.9179   | 1.0353   | 0.734        |
| 0.6216        | 9.9949  | 977  | 0.2814          | 0.9159   | 1.0342   | 0.7342       |
| 0.7122        | 10.9974 | 1075 | 0.2810          | 0.9179   | 1.1119   | 0.7147       |
| 0.5949        | 12.0    | 1173 | 0.2786          | 0.9171   | 1.1367   | 0.7071       |
| 0.6387        | 12.9923 | 1270 | 0.2773          | 0.9185   | 1.1188   | 0.7103       |
| 0.5631        | 13.9949 | 1368 | 0.2769          | 0.9198   | 1.0743   | 0.7207       |
| 0.5733        | 14.9974 | 1466 | 0.2763          | 0.919    | 1.1199   | 0.7109       |
| 0.576         | 16.0    | 1564 | 0.2719          | 0.9214   | 1.1115   | 0.7105       |
| 0.5544        | 16.9923 | 1661 | 0.2712          | 0.9213   | 1.0589   | 0.724        |
| 0.503         | 17.9949 | 1759 | 0.2718          | 0.9222   | 1.0503   | 0.7269       |
| 0.4921        | 18.9974 | 1857 | 0.2755          | 0.9205   | 1.0790   | 0.717        |
| 0.4738        | 20.0    | 1955 | 0.2707          | 0.9213   | 1.1020   | 0.7124       |
| 0.4823        | 20.9923 | 2052 | 0.2700          | 0.9215   | 1.2160   | 0.6829       |
| 0.5269        | 21.9949 | 2150 | 0.2709          | 0.9202   | 1.2518   | 0.6735       |
| 0.5386        | 22.9974 | 2248 | 0.2700          | 0.9198   | 1.2396   | 0.6723       |
| 0.5236        | 24.0    | 2346 | 0.2710          | 0.9206   | 1.2457   | 0.6728       |
| 0.4937        | 24.9923 | 2443 | 0.2701          | 0.9208   | 1.1680   | 0.6898       |
| 0.585         | 25.9949 | 2541 | 0.2707          | 0.9188   | 1.2159   | 0.6769       |
| 0.5391        | 26.9974 | 2639 | 0.2737          | 0.9199   | 1.2199   | 0.6747       |
| 0.4635        | 28.0    | 2737 | 0.2710          | 0.9186   | 1.2106   | 0.68         |
| 0.538         | 28.9923 | 2834 | 0.2698          | 0.9223   | 1.2144   | 0.6782       |
| 0.5182        | 29.9949 | 2932 | 0.2706          | 0.9219   | 1.2069   | 0.6808       |
| 0.4368        | 30.9974 | 3030 | 0.2715          | 0.921    | 1.2384   | 0.6728       |
| 0.5249        | 32.0    | 3128 | 0.2691          | 0.9202   | 1.2571   | 0.6685       |
| 0.5122        | 32.9923 | 3225 | 0.2710          | 0.9213   | 1.2628   | 0.6653       |
| 0.553         | 33.9949 | 3323 | 0.2734          | 0.9209   | 1.2588   | 0.6656       |
| 0.4843        | 34.9974 | 3421 | 0.2702          | 0.9217   | 1.2575   | 0.6667       |
| 0.5083        | 36.0    | 3519 | 0.2710          | 0.923    | 1.2574   | 0.6655       |
| 0.4537        | 36.9923 | 3616 | 0.2701          | 0.9218   | 1.2657   | 0.6635       |
| 0.485         | 37.9949 | 3714 | 0.2708          | 0.923    | 1.2852   | 0.6579       |
| 0.4307        | 38.9974 | 3812 | 0.2735          | 0.9209   | 1.2672   | 0.6611       |
| 0.4878        | 40.0    | 3910 | 0.2720          | 0.922    | 1.2981   | 0.652        |
| 0.4936        | 40.9923 | 4007 | 0.2717          | 0.9213   | 1.3003   | 0.6531       |
| 0.4256        | 41.9949 | 4105 | 0.2720          | 0.9209   | 1.2996   | 0.6525       |
| 0.4439        | 42.9974 | 4203 | 0.2709          | 0.9226   | 1.2975   | 0.6537       |
| 0.4468        | 44.0    | 4301 | 0.2703          | 0.9224   | 1.2981   | 0.6533       |
| 0.4269        | 44.9923 | 4398 | 0.2701          | 0.9222   | 1.3000   | 0.6528       |
| 0.4386        | 45.9949 | 4496 | 0.2696          | 0.9223   | 1.3100   | 0.6503       |
| 0.4434        | 46.9974 | 4594 | 0.2699          | 0.9225   | 1.3121   | 0.6493       |
| 0.473         | 48.0    | 4692 | 0.2698          | 0.923    | 1.3071   | 0.6502       |
| 0.4997        | 48.9923 | 4789 | 0.2698          | 0.9227   | 1.3061   | 0.6507       |
| 0.3989        | 49.6164 | 4850 | 0.2697          | 0.9228   | 1.3062   | 0.6503       |

### Framework versions

- PEFT 0.13.0
- Transformers 4.45.1
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
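
To reproduce this environment, the versions above can be pinned at install time. This is a suggested command, not one from the original card:

```bash
pip install peft==0.13.0 transformers==4.45.1 torch==2.4.1 datasets==3.0.1 tokenizers==0.20.0
```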