# wav2vec2-xls-r-1b-faroese-100h-60-epochs_20250108_v2
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1007
- Wer: 17.8878
- Cer: 3.7595
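The Wer and Cer figures above (and in the results table below) are percentages: word- and character-level edit distance against the reference transcript, normalized by reference length. The following is an illustrative re-implementation of that computation, not the exact evaluation script used for this model:

```python
# WER/CER sketch: Levenshtein (edit) distance over words or characters,
# reported as a percentage of the reference length.

def edit_distance(ref, hyp):
    """Minimum number of substitutions, insertions, and deletions (single-row DP)."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, start=1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,        # deletion from reference
                        dp[j - 1] + 1,    # insertion into hypothesis
                        prev + (r != h))  # substitution (free if tokens match)
            prev = cur
    return dp[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate in percent (spaces ignored for simplicity)."""
    ref_chars = list(reference.replace(" ", ""))
    hyp_chars = list(hypothesis.replace(" ", ""))
    return 100.0 * edit_distance(ref_chars, hyp_chars) / len(ref_chars)
```

For example, one substituted word out of three gives a WER of 33.33; the final row of the table below corresponds to roughly 18 word errors per 100 reference words.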
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 6000
- num_epochs: 60
- mixed_precision_training: Native AMP
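The hyperparameters above imply an effective batch size of 16 × 2 = 32 per optimizer step, and the results table below ends at step 123,000 after 60 epochs (about 2,050 steps per epoch). A minimal sketch of the linear warmup/decay schedule these settings describe (the total step count is read from the table, not from the training script):

```python
# Linear LR schedule with warmup, matching the reported hyperparameters:
# peak learning_rate = 1e-4, warmup_steps = 6000, ~123_000 total steps.

LEARNING_RATE = 1e-4
WARMUP_STEPS = 6_000
TOTAL_STEPS = 123_000  # approximate: last row of the results table

# Effective batch size = per-device train batch * gradient accumulation steps.
EFFECTIVE_BATCH = 16 * 2  # matches total_train_batch_size: 32

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step under linear warmup then decay."""
    if step < WARMUP_STEPS:
        # ramp linearly from 0 up to the peak over the warmup phase
        return LEARNING_RATE * step / WARMUP_STEPS
    # decay linearly from the peak down to 0 at TOTAL_STEPS
    remaining = max(0, TOTAL_STEPS - step)
    return LEARNING_RATE * remaining / (TOTAL_STEPS - WARMUP_STEPS)
```

Note that with 6,000 warmup steps the peak learning rate is only reached early in epoch 3, which is consistent with the validation loss still falling steeply over the first few epochs of the table.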
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|---|---|---|---|---|---|
2.1701 | 0.4877 | 1000 | 0.5709 | 57.8975 | 16.8346 |
1.5743 | 0.9754 | 2000 | 0.2891 | 35.4716 | 9.5767 |
1.6826 | 1.4628 | 3000 | 0.2146 | 31.1935 | 7.9814 |
0.8048 | 1.9505 | 4000 | 0.1991 | 30.5723 | 7.6334 |
0.6164 | 2.4379 | 5000 | 0.1955 | 29.4444 | 7.2926 |
0.6634 | 2.9256 | 6000 | 0.2030 | 30.5062 | 7.6958 |
0.6219 | 3.4131 | 7000 | 0.1829 | 28.6249 | 7.1664 |
0.5801 | 3.9008 | 8000 | 0.1784 | 27.6248 | 6.8776 |
0.5412 | 4.3882 | 9000 | 0.1776 | 27.2988 | 6.8658 |
0.5376 | 4.8759 | 10000 | 0.1672 | 26.7128 | 6.5746 |
0.4783 | 5.3633 | 11000 | 0.1622 | 26.3603 | 6.5194 |
0.4606 | 5.8510 | 12000 | 0.1589 | 25.6245 | 6.3332 |
0.3984 | 6.3385 | 13000 | 0.1458 | 25.2148 | 6.1423 |
0.402 | 6.8261 | 14000 | 0.1441 | 24.9548 | 6.0973 |
0.3887 | 7.3136 | 15000 | 0.1483 | 24.8711 | 6.0918 |
0.3614 | 7.8013 | 16000 | 0.1456 | 24.4570 | 5.8843 |
0.3017 | 8.2887 | 17000 | 0.1444 | 24.2191 | 5.7754 |
0.3353 | 8.7764 | 18000 | 0.1461 | 24.7522 | 5.9237 |
0.2625 | 9.2638 | 19000 | 0.1484 | 24.1398 | 5.7296 |
0.2877 | 9.7515 | 20000 | 0.1371 | 23.8005 | 5.6720 |
0.264 | 10.2390 | 21000 | 0.1393 | 23.5141 | 5.6483 |
0.2624 | 10.7267 | 22000 | 0.1295 | 23.3643 | 5.5103 |
0.253 | 11.2141 | 23000 | 0.1373 | 23.6155 | 5.6065 |
0.2437 | 11.7018 | 24000 | 0.1323 | 23.0295 | 5.4069 |
0.2239 | 12.1892 | 25000 | 0.1388 | 22.9854 | 5.4716 |
0.2214 | 12.6769 | 26000 | 0.1283 | 22.8268 | 5.3241 |
0.2106 | 13.1644 | 27000 | 0.1247 | 22.6418 | 5.2799 |
0.2085 | 13.6520 | 28000 | 0.1257 | 22.3642 | 5.2610 |
0.1856 | 14.1395 | 29000 | 0.1339 | 22.6638 | 5.3051 |
0.1815 | 14.6272 | 30000 | 0.1277 | 22.1351 | 5.1844 |
0.1811 | 15.1146 | 31000 | 0.1284 | 21.8839 | 5.0803 |
0.1737 | 15.6023 | 32000 | 0.1326 | 22.4611 | 5.2862 |
0.1534 | 16.0897 | 33000 | 0.1224 | 21.7297 | 5.0116 |
0.1512 | 16.5774 | 34000 | 0.1322 | 21.9324 | 5.1174 |
0.1466 | 17.0649 | 35000 | 0.1306 | 21.6152 | 5.0637 |
0.1664 | 17.5525 | 36000 | 0.1278 | 22.0866 | 5.1229 |
0.1421 | 18.0400 | 37000 | 0.1358 | 21.6857 | 5.0235 |
0.1283 | 18.5277 | 38000 | 0.1299 | 21.6108 | 4.9801 |
0.1338 | 19.0151 | 39000 | 0.1286 | 21.3508 | 4.8917 |
0.1273 | 19.5028 | 40000 | 0.1286 | 21.3332 | 4.9201 |
0.1376 | 19.9905 | 41000 | 0.1193 | 21.3641 | 4.9075 |
0.1469 | 20.4779 | 42000 | 0.1240 | 21.2715 | 4.8878 |
0.1301 | 20.9656 | 43000 | 0.1238 | 21.3685 | 4.9469 |
0.1106 | 21.4531 | 44000 | 0.1253 | 20.9807 | 4.7726 |
0.1579 | 21.9407 | 45000 | 0.1157 | 20.9279 | 4.7742 |
0.1045 | 22.4282 | 46000 | 0.1338 | 21.0380 | 4.8680 |
0.1142 | 22.9159 | 47000 | 0.1170 | 20.4520 | 4.6369 |
0.1096 | 23.4033 | 48000 | 0.1214 | 20.5842 | 4.6471 |
0.1229 | 23.8910 | 49000 | 0.1178 | 20.9058 | 4.7197 |
0.119 | 24.3784 | 50000 | 0.1193 | 20.7560 | 4.7102 |
0.1006 | 24.8661 | 51000 | 0.1217 | 20.5446 | 4.6929 |
0.1178 | 25.3536 | 52000 | 0.1132 | 20.3419 | 4.5872 |
0.1138 | 25.8413 | 53000 | 0.1135 | 20.5842 | 4.6645 |
0.1001 | 26.3287 | 54000 | 0.1200 | 20.3155 | 4.6100 |
0.0924 | 26.8164 | 55000 | 0.1189 | 20.2538 | 4.5532 |
0.0872 | 27.3038 | 56000 | 0.1188 | 20.2890 | 4.5722 |
0.1047 | 27.7915 | 57000 | 0.1219 | 20.2362 | 4.5793 |
0.1034 | 28.2790 | 58000 | 0.1139 | 20.1965 | 4.5438 |
0.0979 | 28.7666 | 59000 | 0.1160 | 19.8440 | 4.4278 |
0.1101 | 29.2541 | 60000 | 0.1212 | 20.1084 | 4.5509 |
0.1103 | 29.7418 | 61000 | 0.1159 | 19.8925 | 4.4586 |
0.0994 | 30.2292 | 62000 | 0.1199 | 20.0070 | 4.5020 |
0.1081 | 30.7169 | 63000 | 0.1146 | 19.9894 | 4.4317 |
0.0941 | 31.2043 | 64000 | 0.1151 | 19.8617 | 4.4388 |
0.0999 | 31.6920 | 65000 | 0.1140 | 19.6766 | 4.3883 |
0.1039 | 32.1795 | 66000 | 0.1184 | 19.8132 | 4.4262 |
0.0808 | 32.6672 | 67000 | 0.1171 | 19.6414 | 4.3726 |
0.0995 | 33.1546 | 68000 | 0.1191 | 19.8088 | 4.4183 |
0.0779 | 33.6423 | 69000 | 0.1087 | 19.3770 | 4.2455 |
0.0681 | 34.1297 | 70000 | 0.1162 | 19.6590 | 4.3236 |
0.1139 | 34.6174 | 71000 | 0.1150 | 19.7207 | 4.3947 |
0.0836 | 35.1049 | 72000 | 0.1166 | 19.6017 | 4.3915 |
0.0905 | 35.5925 | 73000 | 0.1159 | 19.5048 | 4.3244 |
0.0846 | 36.0800 | 74000 | 0.1173 | 19.6766 | 4.3513 |
0.0936 | 36.5677 | 75000 | 0.1112 | 19.5621 | 4.2771 |
0.0844 | 37.0551 | 76000 | 0.1127 | 19.4916 | 4.2739 |
0.0768 | 37.5428 | 77000 | 0.1098 | 19.3418 | 4.2937 |
0.0843 | 38.0302 | 78000 | 0.1137 | 19.4167 | 4.2219 |
0.0782 | 38.5179 | 79000 | 0.1124 | 19.3285 | 4.1966 |
0.0876 | 39.0054 | 80000 | 0.1088 | 19.2008 | 4.1635 |
0.0875 | 39.4931 | 81000 | 0.1082 | 19.1567 | 4.1808 |
0.0758 | 39.9807 | 82000 | 0.1122 | 19.2581 | 4.2203 |
0.0789 | 40.4682 | 83000 | 0.1084 | 19.0334 | 4.1406 |
0.0634 | 40.9559 | 84000 | 0.1117 | 18.9761 | 4.1296 |
0.0915 | 41.4433 | 85000 | 0.1106 | 19.0289 | 4.1785 |
0.0691 | 41.9310 | 86000 | 0.1145 | 19.1567 | 4.1516 |
0.0798 | 42.4184 | 87000 | 0.1059 | 19.0245 | 4.1114 |
0.0843 | 42.9061 | 88000 | 0.1082 | 18.8351 | 4.0775 |
0.0697 | 43.3936 | 89000 | 0.1114 | 18.9761 | 4.1130 |
0.0691 | 43.8812 | 90000 | 0.1107 | 18.9981 | 4.1185 |
0.0903 | 44.3687 | 91000 | 0.1104 | 18.9100 | 4.1335 |
0.058 | 44.8564 | 92000 | 0.1092 | 18.8703 | 4.0885 |
0.0798 | 45.3438 | 93000 | 0.1037 | 18.6897 | 4.0420 |
0.0702 | 45.8315 | 94000 | 0.1059 | 18.8703 | 4.1011 |
0.0608 | 46.3189 | 95000 | 0.1111 | 18.7910 | 4.0601 |
0.0606 | 46.8066 | 96000 | 0.1124 | 18.7382 | 4.0530 |
0.0616 | 47.2941 | 97000 | 0.1095 | 18.6236 | 4.0554 |
0.0462 | 47.7818 | 98000 | 0.1091 | 18.6192 | 4.0238 |
0.0557 | 48.2692 | 99000 | 0.1049 | 18.5002 | 3.9757 |
0.0517 | 48.7569 | 100000 | 0.1071 | 18.4386 | 3.9883 |
0.0481 | 49.2443 | 101000 | 0.1038 | 18.4694 | 3.9362 |
0.0726 | 49.7320 | 102000 | 0.1031 | 18.3328 | 3.9307 |
0.0578 | 50.2195 | 103000 | 0.1050 | 18.3945 | 3.9402 |
0.0388 | 50.7071 | 104000 | 0.1087 | 18.4650 | 3.9607 |
0.0545 | 51.1946 | 105000 | 0.1062 | 18.3020 | 3.9071 |
0.0497 | 51.6823 | 106000 | 0.1051 | 18.3989 | 3.9331 |
0.0375 | 52.1697 | 107000 | 0.1044 | 18.2139 | 3.8779 |
0.0354 | 52.6574 | 108000 | 0.1024 | 18.0861 | 3.8297 |
0.0421 | 53.1448 | 109000 | 0.1062 | 18.1346 | 3.8431 |
0.0318 | 53.6325 | 110000 | 0.1037 | 18.1037 | 3.8368 |
0.0406 | 54.1200 | 111000 | 0.1008 | 18.1169 | 3.8250 |
0.0393 | 54.6077 | 112000 | 0.1010 | 18.0641 | 3.8108 |
0.0433 | 55.0951 | 113000 | 0.1011 | 17.9803 | 3.7895 |
0.0398 | 55.5828 | 114000 | 0.1023 | 18.0905 | 3.8211 |
0.0372 | 56.0702 | 115000 | 0.0991 | 17.9936 | 3.7990 |
0.0376 | 56.5579 | 116000 | 0.1001 | 18.0420 | 3.7942 |
0.0375 | 57.0454 | 117000 | 0.1002 | 18.0332 | 3.7863 |
0.0277 | 57.5330 | 118000 | 0.1030 | 17.9892 | 3.7714 |
0.0307 | 58.0205 | 119000 | 0.1028 | 17.9055 | 3.7572 |
0.0244 | 58.5082 | 120000 | 0.1021 | 17.9319 | 3.7706 |
0.0336 | 58.9959 | 121000 | 0.1012 | 17.9363 | 3.7666 |
0.0286 | 59.4833 | 122000 | 0.1005 | 17.9275 | 3.7611 |
0.0277 | 59.9710 | 123000 | 0.1007 | 17.8878 | 3.7595 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
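To reproduce this environment, the pinned versions above can be installed with pip; this assumes the package names match their PyPI distributions, and the CUDA 12.4 build of PyTorch (the `+cu124` suffix) comes from the official PyTorch wheel index:

```shell
# Pin the exact versions listed under "Framework versions".
pip install transformers==4.47.1 datasets==3.2.0 tokenizers==0.21.0
# PyTorch 2.5.1 built against CUDA 12.4, from the PyTorch wheel index.
pip install torch==2.5.1 --index-url https://download.pytorch.org/whl/cu124
```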