---
license: apache-2.0
base_model: facebook/deit-base-distilled-patch16-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: deit-base-distilled-patch16-224-hasta-55-fold5
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.6388888888888888
---
|
|
|
|
|
|
# deit-base-distilled-patch16-224-hasta-55-fold5 |
|
|
|
This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. |
|
It achieves the following results on the evaluation set:
- Loss: 0.9258
- Accuracy: 0.6389
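
A minimal usage sketch with the Transformers `image-classification` pipeline is shown below. The repo id and image path are placeholders for wherever this checkpoint is hosted, and the label set depends on the (undocumented) training data.

```python
from transformers import pipeline

# Placeholder repo id: point this at the actual location of the checkpoint.
classifier = pipeline(
    "image-classification",
    model="<namespace>/deit-base-distilled-patch16-224-hasta-55-fold5",
)

# Accepts a local path, URL, or PIL image; returns a list of
# {"label": ..., "score": ...} dicts sorted by score.
print(classifier("path/to/image.jpg"))
```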
|
|
|
## Model description |
|
|
|
This checkpoint fine-tunes [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224), a distilled Data-efficient Image Transformer (DeiT) that processes 224×224 images as 16×16 patches, for image classification. The `fold5` suffix in the name suggests this checkpoint is one fold of a cross-validation run.
|
|
|
## Intended uses & limitations |
|
|
|
More information needed. Note that the best reported evaluation accuracy is 0.6389 on a small evaluation set (apparently 36 images; see below), so the checkpoint should be validated before downstream use.
|
|
|
## Training and evaluation data |
|
|
|
More information needed. Per the card metadata, training used a local `imagefolder` dataset (train split). Every accuracy in the results table below is a multiple of 1/36, which suggests an evaluation set of 36 images.
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
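
For reference, the settings above correspond roughly to the following `transformers.TrainingArguments` sketch. This is a reconstruction under assumptions, not the original training script: the model, dataset, and metric wiring are omitted, `output_dir` is a placeholder, and the effective batch size of 128 assumes a single device (32 per device × 4 accumulation steps).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deit-base-distilled-patch16-224-hasta-55-fold5",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,  # x 4 accumulation steps = 128 effective
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```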
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log | 0.5714 | 1 | 1.1479 | 0.2778 |
| No log | 1.7143 | 3 | 1.1506 | 0.4444 |
| No log | 2.8571 | 5 | 1.2127 | 0.3889 |
| No log | 4.0 | 7 | 1.0517 | 0.4167 |
| No log | 4.5714 | 8 | 1.0113 | 0.5833 |
| 1.0752 | 5.7143 | 10 | 1.0014 | 0.4722 |
| 1.0752 | 6.8571 | 12 | 0.9754 | 0.5556 |
| 1.0752 | 8.0 | 14 | 0.9684 | 0.6111 |
| 1.0752 | 8.5714 | 15 | 0.9598 | 0.5 |
| 1.0752 | 9.7143 | 17 | 0.9350 | 0.5278 |
| 1.0752 | 10.8571 | 19 | 1.0848 | 0.5 |
| 0.8688 | 12.0 | 21 | 0.9479 | 0.5 |
| 0.8688 | 12.5714 | 22 | 0.9568 | 0.4722 |
| 0.8688 | 13.7143 | 24 | 0.9859 | 0.5833 |
| 0.8688 | 14.8571 | 26 | 1.0116 | 0.5278 |
| 0.8688 | 16.0 | 28 | 1.0542 | 0.4722 |
| 0.8688 | 16.5714 | 29 | 0.9993 | 0.5 |
| 0.7636 | 17.7143 | 31 | 0.9896 | 0.5278 |
| 0.7636 | 18.8571 | 33 | 0.9693 | 0.5278 |
| 0.7636 | 20.0 | 35 | 0.9705 | 0.5556 |
| 0.7636 | 20.5714 | 36 | 0.9990 | 0.5833 |
| 0.7636 | 21.7143 | 38 | 1.0442 | 0.5833 |
| 0.5868 | 22.8571 | 40 | 1.0241 | 0.5 |
| 0.5868 | 24.0 | 42 | 0.9458 | 0.5 |
| 0.5868 | 24.5714 | 43 | 0.9391 | 0.5556 |
| 0.5868 | 25.7143 | 45 | 0.9626 | 0.5278 |
| 0.5868 | 26.8571 | 47 | 0.9699 | 0.5278 |
| 0.5868 | 28.0 | 49 | 0.9504 | 0.5 |
| 0.4816 | 28.5714 | 50 | 0.9226 | 0.4722 |
| 0.4816 | 29.7143 | 52 | 0.9353 | 0.5278 |
| 0.4816 | 30.8571 | 54 | 0.9030 | 0.5556 |
| 0.4816 | 32.0 | 56 | 0.8948 | 0.5278 |
| 0.4816 | 32.5714 | 57 | 0.9272 | 0.5278 |
| 0.4816 | 33.7143 | 59 | 0.9202 | 0.5278 |
| 0.3909 | 34.8571 | 61 | 0.9052 | 0.5833 |
| 0.3909 | 36.0 | 63 | 0.9258 | 0.6389 |
| 0.3909 | 36.5714 | 64 | 0.9267 | 0.5833 |
| 0.3909 | 37.7143 | 66 | 0.9902 | 0.5556 |
| 0.3909 | 38.8571 | 68 | 1.0495 | 0.5278 |
| 0.3124 | 40.0 | 70 | 0.9900 | 0.5278 |
| 0.3124 | 40.5714 | 71 | 0.9510 | 0.5278 |
| 0.3124 | 41.7143 | 73 | 0.9531 | 0.5833 |
| 0.3124 | 42.8571 | 75 | 0.9439 | 0.5278 |
| 0.3124 | 44.0 | 77 | 0.9521 | 0.5278 |
| 0.3124 | 44.5714 | 78 | 0.9531 | 0.5278 |
| 0.3225 | 45.7143 | 80 | 0.9551 | 0.5 |
| 0.3225 | 46.8571 | 82 | 0.9520 | 0.5 |
| 0.3225 | 48.0 | 84 | 0.9464 | 0.5278 |
| 0.3225 | 48.5714 | 85 | 0.9409 | 0.5278 |
| 0.3225 | 49.7143 | 87 | 0.9471 | 0.5833 |
| 0.3225 | 50.8571 | 89 | 0.9646 | 0.5833 |
| 0.2829 | 52.0 | 91 | 0.9805 | 0.5833 |
| 0.2829 | 52.5714 | 92 | 0.9747 | 0.5833 |
| 0.2829 | 53.7143 | 94 | 0.9646 | 0.5833 |
| 0.2829 | 54.8571 | 96 | 0.9659 | 0.5833 |
| 0.2829 | 56.0 | 98 | 0.9644 | 0.5833 |
| 0.2829 | 56.5714 | 99 | 0.9646 | 0.5833 |
| 0.272 | 57.1429 | 100 | 0.9648 | 0.5833 |
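
The headline figures at the top of this card (loss 0.9258, accuracy 0.6389) match the epoch-36 row above, i.e. the best checkpoint by evaluation accuracy. The accuracy column is the kind of value produced by a `compute_metrics` hook such as the sketch below, which uses the `evaluate` library; this is a common pattern and an assumption, not necessarily the exact function used for this run.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes (logits, labels) for the evaluation set.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=preds, references=labels)
```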
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
|
|