---
library_name: transformers
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: swin-brain-tumor-type-classification
    results: []
---

# swin-brain-tumor-type-classification

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2385
- Accuracy: 0.9328
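
The card does not yet include a usage snippet, so here is a minimal inference sketch for an image-classification checkpoint like this one. The repository id and image path are assumptions, and the label names depend on the (unspecified) training dataset.

```python
from transformers import pipeline

# Assumed repo id; replace with the actual checkpoint location if it differs.
classifier = pipeline(
    "image-classification",
    model="bombshelll/swin-brain-tumor-type-classification",
)

# Placeholder image path; any RGB image the processor can load will work.
predictions = classifier("path/to/mri_slice.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.3f}")
```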

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 35
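
For reproducibility, a sketch of the corresponding `TrainingArguments` is shown below. The output directory, evaluation/logging strategies, and Trainer wiring are assumptions rather than part of the original card, and the total train batch size of 128 presumes a single device (32 per device × 4 accumulation steps).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-brain-tumor-type-classification",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 * 4 = total train batch size of 128
    num_train_epochs=35,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",           # assumed; not stated in the card
    logging_strategy="epoch",        # assumed; the results table logs one row per epoch
)
```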

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6739 | 1.0 | 19 | 2.4239 | 0.1978 |
| 2.2793 | 2.0 | 38 | 2.1011 | 0.3619 |
| 1.9273 | 3.0 | 57 | 1.6876 | 0.4963 |
| 1.5549 | 4.0 | 76 | 1.3626 | 0.5709 |
| 1.2386 | 5.0 | 95 | 1.1268 | 0.6269 |
| 1.1004 | 6.0 | 114 | 0.9445 | 0.7015 |
| 0.9008 | 7.0 | 133 | 0.8701 | 0.7313 |
| 0.8023 | 8.0 | 152 | 0.7917 | 0.7425 |
| 0.6566 | 9.0 | 171 | 0.6990 | 0.7612 |
| 0.6691 | 10.0 | 190 | 0.6204 | 0.7985 |
| 0.5605 | 11.0 | 209 | 0.5511 | 0.8321 |
| 0.5472 | 12.0 | 228 | 0.4945 | 0.8358 |
| 0.5098 | 13.0 | 247 | 0.4302 | 0.8619 |
| 0.4362 | 14.0 | 266 | 0.4027 | 0.8843 |
| 0.416 | 15.0 | 285 | 0.3956 | 0.8657 |
| 0.4095 | 16.0 | 304 | 0.3605 | 0.8881 |
| 0.3577 | 17.0 | 323 | 0.3339 | 0.8918 |
| 0.3624 | 18.0 | 342 | 0.3883 | 0.8694 |
| 0.304 | 19.0 | 361 | 0.3496 | 0.8769 |
| 0.2784 | 20.0 | 380 | 0.3275 | 0.8806 |
| 0.2763 | 21.0 | 399 | 0.3721 | 0.8806 |
| 0.2824 | 22.0 | 418 | 0.3156 | 0.8955 |
| 0.2453 | 23.0 | 437 | 0.3155 | 0.8843 |
| 0.2438 | 24.0 | 456 | 0.2928 | 0.9030 |
| 0.2285 | 25.0 | 475 | 0.2667 | 0.9216 |
| 0.2478 | 26.0 | 494 | 0.2816 | 0.9142 |
| 0.2242 | 27.0 | 513 | 0.2768 | 0.8993 |
| 0.2 | 28.0 | 532 | 0.2815 | 0.9142 |
| 0.2076 | 29.0 | 551 | 0.2443 | 0.9216 |
| 0.1978 | 30.0 | 570 | 0.2381 | 0.9216 |
| 0.1821 | 31.0 | 589 | 0.2563 | 0.9216 |
| 0.1786 | 32.0 | 608 | 0.2449 | 0.9254 |
| 0.1809 | 33.0 | 627 | 0.2385 | 0.9328 |
| 0.1812 | 34.0 | 646 | 0.2448 | 0.9291 |
| 0.1688 | 35.0 | 665 | 0.2446 | 0.9291 |
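
The Accuracy column is the per-epoch evaluation metric reported by the Trainer. The training script is not included in the card, but a `compute_metrics` hook that produces such a value typically looks like the following sketch.

```python
import numpy as np
import evaluate

# Illustrative accuracy hook for the Trainer; this is an assumption, not the
# original training code.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```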

### Framework versions

- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0