---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-small_tobacco3482_og_simkd_
  results: []
---

# vit-small_tobacco3482_og_simkd_

This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an image classification dataset that was not recorded in the training metadata (the model name suggests Tobacco3482).
It achieves the following results on the evaluation set:
- Loss: 212.9178
- Accuracy: 0.855
- Brier Loss: 0.2563
- NLL (negative log-likelihood): 1.4722
- F1 Micro: 0.855
- F1 Macro: 0.8333
- ECE (expected calibration error): 0.1253
- AURC (area under the risk-coverage curve): 0.0422
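
If the checkpoint was saved in the standard `transformers` image-classification format, it can be loaded as sketched below. This is a minimal, untested sketch: the checkpoint path, the `example_page.png` input file, and the availability of `AutoImageProcessor` (Transformers >= 4.26) are assumptions, not details taken from this card.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hub repo id or local path of this checkpoint -- adjust to where the model is stored.
checkpoint = "vit-small_tobacco3482_og_simkd_"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

# Classify a single document image (placeholder file name).
image = Image.open("example_page.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```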

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
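
These settings map roughly onto the `TrainingArguments` sketch below. It is a reconstruction for reference, not the original training script: the distillation-specific parts (the SimKD objective and teacher model implied by the model name) are not captured, and the per-epoch evaluation setting is inferred from the validation rows in the results table.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the listed configuration (Transformers 4.26 argument names).
# The SimKD distillation loss and teacher model used during training are not reproduced here.
training_args = TrainingArguments(
    output_dir="vit-small_tobacco3482_og_simkd_",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # inferred from the per-epoch validation rows below
)
```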

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 25   | 219.2297        | 0.145    | 0.8895     | 6.5562 | 0.145    | 0.0567   | 0.2109 | 0.7661 |
| No log        | 2.0   | 50   | 217.9786        | 0.49     | 0.6839     | 2.2035 | 0.49     | 0.4113   | 0.3237 | 0.3097 |
| No log        | 3.0   | 75   | 216.6085        | 0.565    | 0.5671     | 1.7658 | 0.565    | 0.4471   | 0.2471 | 0.2239 |
| No log        | 4.0   | 100  | 216.0210        | 0.68     | 0.4722     | 1.8682 | 0.68     | 0.5586   | 0.2322 | 0.1557 |
| No log        | 5.0   | 125  | 215.5695        | 0.68     | 0.4668     | 1.9385 | 0.68     | 0.5570   | 0.2289 | 0.1440 |
| No log        | 6.0   | 150  | 215.3762        | 0.745    | 0.3963     | 2.1043 | 0.745    | 0.6608   | 0.1949 | 0.0976 |
| No log        | 7.0   | 175  | 214.8964        | 0.745    | 0.3675     | 1.7226 | 0.745    | 0.6693   | 0.1765 | 0.0949 |
| No log        | 8.0   | 200  | 215.0440        | 0.735    | 0.3838     | 1.9180 | 0.735    | 0.6935   | 0.2056 | 0.0958 |
| No log        | 9.0   | 225  | 214.7017        | 0.775    | 0.3466     | 1.5816 | 0.775    | 0.6897   | 0.1756 | 0.0661 |
| No log        | 10.0  | 250  | 214.6309        | 0.775    | 0.3505     | 1.6245 | 0.775    | 0.7604   | 0.1828 | 0.0763 |
| No log        | 11.0  | 275  | 214.6275        | 0.735    | 0.4314     | 2.3367 | 0.735    | 0.7342   | 0.2234 | 0.1203 |
| No log        | 12.0  | 300  | 214.5664        | 0.75     | 0.3769     | 1.7889 | 0.75     | 0.7420   | 0.1873 | 0.1070 |
| No log        | 13.0  | 325  | 214.6764        | 0.735    | 0.4425     | 2.3533 | 0.735    | 0.7404   | 0.2267 | 0.1508 |
| No log        | 14.0  | 350  | 214.5261        | 0.805    | 0.3093     | 1.8504 | 0.805    | 0.7870   | 0.1732 | 0.0580 |
| No log        | 15.0  | 375  | 214.4932        | 0.79     | 0.3255     | 1.4649 | 0.79     | 0.7575   | 0.1796 | 0.0543 |
| No log        | 16.0  | 400  | 214.3134        | 0.85     | 0.2467     | 1.4769 | 0.85     | 0.8388   | 0.1149 | 0.0513 |
| No log        | 17.0  | 425  | 214.3825        | 0.82     | 0.2845     | 1.4858 | 0.82     | 0.8014   | 0.1445 | 0.0540 |
| No log        | 18.0  | 450  | 214.2077        | 0.85     | 0.2681     | 1.4891 | 0.85     | 0.8406   | 0.1462 | 0.0684 |
| No log        | 19.0  | 475  | 214.1675        | 0.845    | 0.2623     | 1.5311 | 0.845    | 0.8329   | 0.1414 | 0.0485 |
| 220.0633      | 20.0  | 500  | 214.1433        | 0.84     | 0.2663     | 1.5269 | 0.8400   | 0.8182   | 0.1302 | 0.0562 |
| 220.0633      | 21.0  | 525  | 214.0829        | 0.805    | 0.3293     | 2.0021 | 0.805    | 0.8019   | 0.1710 | 0.0833 |
| 220.0633      | 22.0  | 550  | 213.9282        | 0.84     | 0.2586     | 1.5127 | 0.8400   | 0.8205   | 0.1397 | 0.0453 |
| 220.0633      | 23.0  | 575  | 213.9303        | 0.87     | 0.2260     | 1.4450 | 0.87     | 0.8552   | 0.1205 | 0.0365 |
| 220.0633      | 24.0  | 600  | 213.9140        | 0.84     | 0.2620     | 1.5244 | 0.8400   | 0.8161   | 0.1462 | 0.0490 |
| 220.0633      | 25.0  | 625  | 213.7616        | 0.86     | 0.2306     | 1.5288 | 0.8600   | 0.8409   | 0.1215 | 0.0361 |
| 220.0633      | 26.0  | 650  | 213.7738        | 0.845    | 0.2431     | 1.5303 | 0.845    | 0.8271   | 0.1335 | 0.0443 |
| 220.0633      | 27.0  | 675  | 213.8470        | 0.85     | 0.2427     | 1.3459 | 0.85     | 0.8275   | 0.1296 | 0.0445 |
| 220.0633      | 28.0  | 700  | 213.7198        | 0.85     | 0.2381     | 1.3868 | 0.85     | 0.8328   | 0.1267 | 0.0424 |
| 220.0633      | 29.0  | 725  | 213.6302        | 0.855    | 0.2293     | 1.4191 | 0.855    | 0.8361   | 0.1157 | 0.0394 |
| 220.0633      | 30.0  | 750  | 213.6385        | 0.85     | 0.2424     | 1.5410 | 0.85     | 0.8334   | 0.1339 | 0.0464 |
| 220.0633      | 31.0  | 775  | 213.6397        | 0.865    | 0.2234     | 1.4012 | 0.865    | 0.8464   | 0.1226 | 0.0402 |
| 220.0633      | 32.0  | 800  | 213.6658        | 0.86     | 0.2271     | 1.3863 | 0.8600   | 0.8470   | 0.1164 | 0.0354 |
| 220.0633      | 33.0  | 825  | 213.6526        | 0.85     | 0.2448     | 1.5357 | 0.85     | 0.8292   | 0.1214 | 0.0397 |
| 220.0633      | 34.0  | 850  | 213.5407        | 0.855    | 0.2282     | 1.3470 | 0.855    | 0.8405   | 0.1245 | 0.0393 |
| 220.0633      | 35.0  | 875  | 213.6166        | 0.83     | 0.2624     | 1.4288 | 0.83     | 0.8102   | 0.1415 | 0.0458 |
| 220.0633      | 36.0  | 900  | 213.5887        | 0.84     | 0.2613     | 1.3928 | 0.8400   | 0.8135   | 0.1298 | 0.0442 |
| 220.0633      | 37.0  | 925  | 213.4976        | 0.845    | 0.2338     | 1.3784 | 0.845    | 0.8244   | 0.1319 | 0.0355 |
| 220.0633      | 38.0  | 950  | 213.4554        | 0.85     | 0.2374     | 1.3680 | 0.85     | 0.8323   | 0.1192 | 0.0385 |
| 220.0633      | 39.0  | 975  | 213.4758        | 0.845    | 0.2319     | 1.4895 | 0.845    | 0.8274   | 0.1185 | 0.0385 |
| 217.7609      | 40.0  | 1000 | 213.4440        | 0.845    | 0.2432     | 1.3737 | 0.845    | 0.8265   | 0.1310 | 0.0415 |
| 217.7609      | 41.0  | 1025 | 213.4492        | 0.845    | 0.2385     | 1.4970 | 0.845    | 0.8297   | 0.1207 | 0.0373 |
| 217.7609      | 42.0  | 1050 | 213.4319        | 0.85     | 0.2384     | 1.3580 | 0.85     | 0.8276   | 0.1250 | 0.0383 |
| 217.7609      | 43.0  | 1075 | 213.3094        | 0.855    | 0.2287     | 1.4375 | 0.855    | 0.8334   | 0.1188 | 0.0353 |
| 217.7609      | 44.0  | 1100 | 213.3809        | 0.845    | 0.2514     | 1.4969 | 0.845    | 0.8250   | 0.1318 | 0.0435 |
| 217.7609      | 45.0  | 1125 | 213.3981        | 0.85     | 0.2478     | 1.6052 | 0.85     | 0.8287   | 0.1268 | 0.0408 |
| 217.7609      | 46.0  | 1150 | 213.3004        | 0.86     | 0.2292     | 1.3632 | 0.8600   | 0.8430   | 0.1180 | 0.0355 |
| 217.7609      | 47.0  | 1175 | 213.3041        | 0.86     | 0.2363     | 1.3407 | 0.8600   | 0.8444   | 0.1235 | 0.0359 |
| 217.7609      | 48.0  | 1200 | 213.2955        | 0.845    | 0.2462     | 1.5071 | 0.845    | 0.8179   | 0.1253 | 0.0396 |
| 217.7609      | 49.0  | 1225 | 213.2531        | 0.85     | 0.2433     | 1.2946 | 0.85     | 0.8277   | 0.1270 | 0.0392 |
| 217.7609      | 50.0  | 1250 | 213.2612        | 0.845    | 0.2378     | 1.2852 | 0.845    | 0.8193   | 0.1281 | 0.0361 |
| 217.7609      | 51.0  | 1275 | 213.2246        | 0.855    | 0.2370     | 1.5829 | 0.855    | 0.8393   | 0.1234 | 0.0357 |
| 217.7609      | 52.0  | 1300 | 213.1795        | 0.845    | 0.2431     | 1.4923 | 0.845    | 0.8300   | 0.1280 | 0.0372 |
| 217.7609      | 53.0  | 1325 | 213.2721        | 0.855    | 0.2467     | 1.5096 | 0.855    | 0.8333   | 0.1248 | 0.0385 |
| 217.7609      | 54.0  | 1350 | 213.1976        | 0.85     | 0.2453     | 1.4167 | 0.85     | 0.8275   | 0.1240 | 0.0384 |
| 217.7609      | 55.0  | 1375 | 213.2822        | 0.845    | 0.2430     | 1.4438 | 0.845    | 0.8193   | 0.1283 | 0.0396 |
| 217.7609      | 56.0  | 1400 | 213.1443        | 0.85     | 0.2479     | 1.5246 | 0.85     | 0.8277   | 0.1304 | 0.0389 |
| 217.7609      | 57.0  | 1425 | 213.1679        | 0.85     | 0.2455     | 1.4468 | 0.85     | 0.8291   | 0.1224 | 0.0387 |
| 217.7609      | 58.0  | 1450 | 213.1116        | 0.85     | 0.2467     | 1.4372 | 0.85     | 0.8287   | 0.1269 | 0.0378 |
| 217.7609      | 59.0  | 1475 | 213.1005        | 0.85     | 0.2490     | 1.4214 | 0.85     | 0.8271   | 0.1316 | 0.0392 |
| 217.1217      | 60.0  | 1500 | 213.1516        | 0.855    | 0.2425     | 1.4600 | 0.855    | 0.8343   | 0.1316 | 0.0369 |
| 217.1217      | 61.0  | 1525 | 213.1205        | 0.855    | 0.2458     | 1.4436 | 0.855    | 0.8303   | 0.1197 | 0.0409 |
| 217.1217      | 62.0  | 1550 | 213.1318        | 0.85     | 0.2488     | 1.4405 | 0.85     | 0.8275   | 0.1304 | 0.0378 |
| 217.1217      | 63.0  | 1575 | 213.0243        | 0.855    | 0.2521     | 1.5810 | 0.855    | 0.8328   | 0.1341 | 0.0447 |
| 217.1217      | 64.0  | 1600 | 213.1191        | 0.84     | 0.2567     | 1.4478 | 0.8400   | 0.8185   | 0.1292 | 0.0436 |
| 217.1217      | 65.0  | 1625 | 213.0329        | 0.855    | 0.2528     | 1.3910 | 0.855    | 0.8333   | 0.1311 | 0.0404 |
| 217.1217      | 66.0  | 1650 | 212.9868        | 0.85     | 0.2525     | 1.4652 | 0.85     | 0.8275   | 0.1226 | 0.0408 |
| 217.1217      | 67.0  | 1675 | 213.0856        | 0.84     | 0.2561     | 1.4601 | 0.8400   | 0.8178   | 0.1367 | 0.0419 |
| 217.1217      | 68.0  | 1700 | 213.0379        | 0.845    | 0.2544     | 1.5222 | 0.845    | 0.8216   | 0.1362 | 0.0426 |
| 217.1217      | 69.0  | 1725 | 213.0535        | 0.835    | 0.2606     | 1.5085 | 0.835    | 0.8093   | 0.1346 | 0.0445 |
| 217.1217      | 70.0  | 1750 | 213.0247        | 0.85     | 0.2530     | 1.4349 | 0.85     | 0.8274   | 0.1373 | 0.0427 |
| 217.1217      | 71.0  | 1775 | 213.0161        | 0.855    | 0.2510     | 1.4529 | 0.855    | 0.8333   | 0.1212 | 0.0411 |
| 217.1217      | 72.0  | 1800 | 213.0249        | 0.845    | 0.2494     | 1.4511 | 0.845    | 0.8229   | 0.1358 | 0.0412 |
| 217.1217      | 73.0  | 1825 | 213.0014        | 0.85     | 0.2548     | 1.4435 | 0.85     | 0.8264   | 0.1277 | 0.0390 |
| 217.1217      | 74.0  | 1850 | 213.0011        | 0.845    | 0.2527     | 1.3719 | 0.845    | 0.8206   | 0.1360 | 0.0379 |
| 217.1217      | 75.0  | 1875 | 213.0240        | 0.845    | 0.2576     | 1.4072 | 0.845    | 0.8221   | 0.1284 | 0.0425 |
| 217.1217      | 76.0  | 1900 | 212.9793        | 0.845    | 0.2534     | 1.4026 | 0.845    | 0.8212   | 0.1241 | 0.0404 |
| 217.1217      | 77.0  | 1925 | 212.9800        | 0.85     | 0.2514     | 1.5023 | 0.85     | 0.8271   | 0.1279 | 0.0407 |
| 217.1217      | 78.0  | 1950 | 212.9125        | 0.845    | 0.2564     | 1.4258 | 0.845    | 0.8211   | 0.1298 | 0.0427 |
| 217.1217      | 79.0  | 1975 | 212.9454        | 0.85     | 0.2527     | 1.5227 | 0.85     | 0.8271   | 0.1279 | 0.0423 |
| 216.765       | 80.0  | 2000 | 212.9475        | 0.845    | 0.2551     | 1.5025 | 0.845    | 0.8206   | 0.1311 | 0.0423 |
| 216.765       | 81.0  | 2025 | 212.9739        | 0.84     | 0.2567     | 1.5305 | 0.8400   | 0.8162   | 0.1294 | 0.0431 |
| 216.765       | 82.0  | 2050 | 212.9351        | 0.855    | 0.2526     | 1.5373 | 0.855    | 0.8339   | 0.1277 | 0.0401 |
| 216.765       | 83.0  | 2075 | 213.0053        | 0.845    | 0.2560     | 1.4724 | 0.845    | 0.8228   | 0.1341 | 0.0417 |
| 216.765       | 84.0  | 2100 | 212.9326        | 0.845    | 0.2568     | 1.5217 | 0.845    | 0.8206   | 0.1303 | 0.0472 |
| 216.765       | 85.0  | 2125 | 212.9555        | 0.855    | 0.2537     | 1.5265 | 0.855    | 0.8339   | 0.1233 | 0.0416 |
| 216.765       | 86.0  | 2150 | 212.9121        | 0.85     | 0.2534     | 1.5224 | 0.85     | 0.8280   | 0.1283 | 0.0398 |
| 216.765       | 87.0  | 2175 | 212.8850        | 0.845    | 0.2551     | 1.4480 | 0.845    | 0.8221   | 0.1328 | 0.0412 |
| 216.765       | 88.0  | 2200 | 212.9121        | 0.855    | 0.2518     | 1.5069 | 0.855    | 0.8339   | 0.1234 | 0.0404 |
| 216.765       | 89.0  | 2225 | 212.9327        | 0.845    | 0.2517     | 1.4532 | 0.845    | 0.8206   | 0.1231 | 0.0401 |
| 216.765       | 90.0  | 2250 | 212.9305        | 0.85     | 0.2542     | 1.4506 | 0.85     | 0.8271   | 0.1374 | 0.0398 |
| 216.765       | 91.0  | 2275 | 212.9274        | 0.85     | 0.2567     | 1.5045 | 0.85     | 0.8280   | 0.1297 | 0.0419 |
| 216.765       | 92.0  | 2300 | 212.8962        | 0.85     | 0.2545     | 1.4956 | 0.85     | 0.8280   | 0.1261 | 0.0405 |
| 216.765       | 93.0  | 2325 | 212.9133        | 0.845    | 0.2567     | 1.5274 | 0.845    | 0.8212   | 0.1291 | 0.0431 |
| 216.765       | 94.0  | 2350 | 212.8708        | 0.85     | 0.2576     | 1.4410 | 0.85     | 0.8280   | 0.1302 | 0.0410 |
| 216.765       | 95.0  | 2375 | 212.9661        | 0.855    | 0.2546     | 1.3988 | 0.855    | 0.8339   | 0.1248 | 0.0404 |
| 216.765       | 96.0  | 2400 | 212.9099        | 0.855    | 0.2547     | 1.5096 | 0.855    | 0.8333   | 0.1256 | 0.0402 |
| 216.765       | 97.0  | 2425 | 212.9668        | 0.85     | 0.2549     | 1.5337 | 0.85     | 0.8271   | 0.1289 | 0.0390 |
| 216.765       | 98.0  | 2450 | 212.9587        | 0.845    | 0.2545     | 1.5161 | 0.845    | 0.8215   | 0.1304 | 0.0412 |
| 216.765       | 99.0  | 2475 | 212.9395        | 0.855    | 0.2554     | 1.4606 | 0.855    | 0.8333   | 0.1253 | 0.0410 |
| 216.6085      | 100.0 | 2500 | 212.9178        | 0.855    | 0.2563     | 1.4722 | 0.855    | 0.8333   | 0.1253 | 0.0422 |


### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2