# wav2vec2-xlsr-ln-50hr-v2
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5435
- Model Preparation Time: 0.0042
- Wer: 0.1734
- Cer: 0.0514
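The Wer and Cer figures above are word and character error rates: Levenshtein edit distance between hypothesis and reference, normalized by reference length. A minimal pure-Python sketch of the metric (the actual evaluation presumably used a library such as `jiwer` or `evaluate`; this is illustrative only):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (words or characters)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def wer(ref, hyp):
    """Word error rate: edit distance over word sequences."""
    ref_words = ref.split()
    return edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate: edit distance over character sequences."""
    return edit_distance(ref, hyp) / len(ref)
```

So a Wer of 0.1734 means roughly 17 word-level edits per 100 reference words.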
## Model description

More information needed
## Intended uses & limitations

More information needed
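Pending proper documentation, a hypothetical usage sketch for transcription (the `-ln-` tag suggests Lingala). It assumes the checkpoint follows the standard wav2vec2 CTC layout; the function name and audio-loading choices are illustrative, not from the author:

```python
def transcribe(audio_path, model_id="KasuleTrevor/wav2vec2-xlsr-ln-50hr-v2"):
    """Transcribe an audio file with the fine-tuned checkpoint (sketch)."""
    # Imports kept local so the sketch is importable without the heavy deps.
    import torch
    import torchaudio
    from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    waveform, sr = torchaudio.load(audio_path)
    if sr != 16_000:  # wav2vec2 models expect 16 kHz mono input
        waveform = torchaudio.functional.resample(waveform, sr, 16_000)

    inputs = processor(waveform.squeeze().numpy(),
                       sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(ids)[0]
```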
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 100
- mixed_precision_training: Native AMP
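As a quick sanity check, the per-device batch size and gradient accumulation reproduce the stated total train batch size, and the ~362 optimizer steps per epoch logged in the results table let us estimate the training-set size (an estimate, not stated by the author):

```python
# Effective batch size implied by the hyperparameters above.
train_batch_size = 16
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 32

# ~362 optimizer steps per epoch implies roughly this many training
# utterances per epoch (each step consumes one effective batch).
steps_per_epoch = 362
approx_train_examples = steps_per_epoch * total_train_batch_size  # ~11,600
```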
### Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Wer | Cer |
|:-------------|:------|:-----|:----------------|:-----------------------|:----|:----|
5.7126 | 0.9986 | 362 | 2.7144 | 0.0042 | 1.0 | 1.0 |
0.6867 | 2.0 | 725 | 0.4112 | 0.0042 | 0.3065 | 0.0896 |
0.232 | 2.9986 | 1087 | 0.3286 | 0.0042 | 0.2171 | 0.0670 |
0.1672 | 4.0 | 1450 | 0.2839 | 0.0042 | 0.1959 | 0.0573 |
0.1332 | 4.9986 | 1812 | 0.2715 | 0.0042 | 0.1884 | 0.0557 |
0.1104 | 6.0 | 2175 | 0.2864 | 0.0042 | 0.1778 | 0.0530 |
0.0955 | 6.9986 | 2537 | 0.2714 | 0.0042 | 0.1663 | 0.0498 |
0.0803 | 8.0 | 2900 | 0.2877 | 0.0042 | 0.1638 | 0.0495 |
0.0706 | 8.9986 | 3262 | 0.2848 | 0.0042 | 0.1689 | 0.0501 |
0.0628 | 10.0 | 3625 | 0.2903 | 0.0042 | 0.1651 | 0.0497 |
0.0582 | 10.9986 | 3987 | 0.2902 | 0.0042 | 0.1629 | 0.0490 |
0.0511 | 12.0 | 4350 | 0.3011 | 0.0042 | 0.1590 | 0.0482 |
0.046 | 12.9986 | 4712 | 0.3197 | 0.0042 | 0.1623 | 0.0483 |
0.0428 | 14.0 | 5075 | 0.3105 | 0.0042 | 0.1716 | 0.0499 |
0.0412 | 14.9986 | 5437 | 0.2977 | 0.0042 | 0.1554 | 0.0464 |
0.0373 | 16.0 | 5800 | 0.3222 | 0.0042 | 0.1516 | 0.0464 |
0.0337 | 16.9986 | 6162 | 0.3458 | 0.0042 | 0.1651 | 0.0517 |
0.0339 | 18.0 | 6525 | 0.3331 | 0.0042 | 0.1560 | 0.0467 |
0.0307 | 18.9986 | 6887 | 0.3184 | 0.0042 | 0.1519 | 0.0468 |
0.0299 | 20.0 | 7250 | 0.3591 | 0.0042 | 0.1592 | 0.0484 |
0.0275 | 20.9986 | 7612 | 0.3524 | 0.0042 | 0.1528 | 0.0474 |
0.0269 | 22.0 | 7975 | 0.3516 | 0.0042 | 0.1463 | 0.0450 |
0.0249 | 22.9986 | 8337 | 0.3370 | 0.0042 | 0.1533 | 0.0455 |
0.024 | 24.0 | 8700 | 0.3617 | 0.0042 | 0.1528 | 0.0456 |
0.0226 | 24.9986 | 9062 | 0.3661 | 0.0042 | 0.1520 | 0.0456 |
0.0218 | 26.0 | 9425 | 0.3470 | 0.0042 | 0.1523 | 0.0455 |
0.0196 | 26.9986 | 9787 | 0.3338 | 0.0042 | 0.1484 | 0.0449 |
0.0199 | 28.0 | 10150 | 0.3502 | 0.0042 | 0.1412 | 0.0444 |
0.0189 | 28.9986 | 10512 | 0.3429 | 0.0042 | 0.1446 | 0.0440 |
0.0187 | 30.0 | 10875 | 0.3500 | 0.0042 | 0.1452 | 0.0449 |
0.0181 | 30.9986 | 11237 | 0.3633 | 0.0042 | 0.1434 | 0.0438 |
0.0169 | 32.0 | 11600 | 0.3727 | 0.0042 | 0.1493 | 0.0451 |
0.0164 | 32.9986 | 11962 | 0.3554 | 0.0042 | 0.1567 | 0.0459 |
0.0168 | 34.0 | 12325 | 0.3715 | 0.0042 | 0.1435 | 0.0440 |
0.0152 | 34.9986 | 12687 | 0.3579 | 0.0042 | 0.1409 | 0.0431 |
0.0143 | 36.0 | 13050 | 0.3873 | 0.0042 | 0.1429 | 0.0431 |
0.0148 | 36.9986 | 13412 | 0.3598 | 0.0042 | 0.1364 | 0.0421 |
0.0143 | 38.0 | 13775 | 0.3667 | 0.0042 | 0.1418 | 0.0440 |
0.0139 | 38.9986 | 14137 | 0.3864 | 0.0042 | 0.1330 | 0.0418 |
0.0139 | 40.0 | 14500 | 0.3431 | 0.0042 | 0.1368 | 0.0422 |
0.0125 | 40.9986 | 14862 | 0.3624 | 0.0042 | 0.1399 | 0.0430 |
0.0125 | 42.0 | 15225 | 0.3741 | 0.0042 | 0.1361 | 0.0432 |
0.0126 | 42.9986 | 15587 | 0.3678 | 0.0042 | 0.1402 | 0.0429 |
0.013 | 44.0 | 15950 | 0.3641 | 0.0042 | 0.1338 | 0.0412 |
0.012 | 44.9986 | 16312 | 0.3481 | 0.0042 | 0.1380 | 0.0422 |
0.0106 | 46.0 | 16675 | 0.3587 | 0.0042 | 0.1309 | 0.0404 |
0.0096 | 46.9986 | 17037 | 0.3667 | 0.0042 | 0.1305 | 0.0407 |
0.0101 | 48.0 | 17400 | 0.3897 | 0.0042 | 0.1319 | 0.0412 |
0.0109 | 48.9986 | 17762 | 0.3541 | 0.0042 | 0.1272 | 0.0409 |
0.0104 | 50.0 | 18125 | 0.3593 | 0.0042 | 0.1303 | 0.0397 |
0.0103 | 50.9986 | 18487 | 0.3572 | 0.0042 | 0.1300 | 0.0402 |
0.0097 | 52.0 | 18850 | 0.3740 | 0.0042 | 0.1305 | 0.0404 |
0.0092 | 52.9986 | 19212 | 0.3798 | 0.0042 | 0.1305 | 0.0403 |
0.0076 | 54.0 | 19575 | 0.3913 | 0.0042 | 0.1284 | 0.0400 |
0.0081 | 54.9986 | 19937 | 0.3684 | 0.0042 | 0.1335 | 0.0416 |
0.0084 | 56.0 | 20300 | 0.3895 | 0.0042 | 0.1277 | 0.0402 |
0.0079 | 56.9986 | 20662 | 0.3683 | 0.0042 | 0.1253 | 0.0394 |
0.0076 | 58.0 | 21025 | 0.3857 | 0.0042 | 0.1275 | 0.0401 |
0.0069 | 58.9986 | 21387 | 0.3922 | 0.0042 | 0.1243 | 0.0391 |
0.0069 | 60.0 | 21750 | 0.3913 | 0.0042 | 0.1278 | 0.0401 |
0.0065 | 60.9986 | 22112 | 0.4005 | 0.0042 | 0.1229 | 0.0396 |
0.0064 | 62.0 | 22475 | 0.3922 | 0.0042 | 0.1264 | 0.0396 |
0.0062 | 62.9986 | 22837 | 0.3959 | 0.0042 | 0.1261 | 0.0396 |
0.0059 | 64.0 | 23200 | 0.4036 | 0.0042 | 0.1233 | 0.0395 |
0.0063 | 64.9986 | 23562 | 0.4068 | 0.0042 | 0.1237 | 0.0392 |
0.0058 | 66.0 | 23925 | 0.4060 | 0.0042 | 0.1242 | 0.0391 |
0.0053 | 66.9986 | 24287 | 0.4147 | 0.0042 | 0.1220 | 0.0386 |
0.0057 | 68.0 | 24650 | 0.4002 | 0.0042 | 0.1217 | 0.0393 |
0.0051 | 68.9986 | 25012 | 0.4244 | 0.0042 | 0.1227 | 0.0393 |
0.005 | 70.0 | 25375 | 0.4146 | 0.0042 | 0.1221 | 0.0393 |
0.0051 | 70.9986 | 25737 | 0.4254 | 0.0042 | 0.1210 | 0.0391 |
0.0052 | 72.0 | 26100 | 0.4073 | 0.0042 | 0.1221 | 0.0391 |
0.005 | 72.9986 | 26462 | 0.4121 | 0.0042 | 0.1209 | 0.0388 |
0.0048 | 74.0 | 26825 | 0.4128 | 0.0042 | 0.1203 | 0.0389 |
0.0047 | 74.9986 | 27187 | 0.4114 | 0.0042 | 0.1209 | 0.0392 |
0.0044 | 76.0 | 27550 | 0.4068 | 0.0042 | 0.1218 | 0.0387 |
0.0042 | 76.9986 | 27912 | 0.4159 | 0.0042 | 0.1204 | 0.0384 |
0.0041 | 78.0 | 28275 | 0.4162 | 0.0042 | 0.1186 | 0.0383 |
0.004 | 78.9986 | 28637 | 0.4121 | 0.0042 | 0.1183 | 0.0379 |
0.0039 | 80.0 | 29000 | 0.4101 | 0.0042 | 0.1186 | 0.0381 |
0.0038 | 80.9986 | 29362 | 0.4109 | 0.0042 | 0.1174 | 0.0376 |
0.0037 | 82.0 | 29725 | 0.4115 | 0.0042 | 0.1183 | 0.0381 |
0.0035 | 82.9986 | 30087 | 0.4161 | 0.0042 | 0.1176 | 0.0379 |
0.0034 | 84.0 | 30450 | 0.4086 | 0.0042 | 0.1173 | 0.0379 |
0.0037 | 84.9986 | 30812 | 0.4133 | 0.0042 | 0.1180 | 0.0380 |
0.0033 | 86.0 | 31175 | 0.4163 | 0.0042 | 0.1176 | 0.0376 |
0.0036 | 86.9986 | 31537 | 0.4149 | 0.0042 | 0.1181 | 0.0378 |
0.0033 | 88.0 | 31900 | 0.4148 | 0.0042 | 0.1184 | 0.0378 |
0.0034 | 88.9986 | 32262 | 0.4142 | 0.0042 | 0.1169 | 0.0375 |
0.0032 | 90.0 | 32625 | 0.4198 | 0.0042 | 0.1168 | 0.0378 |
0.0036 | 90.9986 | 32987 | 0.4162 | 0.0042 | 0.1164 | 0.0376 |
0.0032 | 92.0 | 33350 | 0.4151 | 0.0042 | 0.1166 | 0.0377 |
0.0031 | 92.9986 | 33712 | 0.4175 | 0.0042 | 0.1167 | 0.0377 |
0.0033 | 94.0 | 34075 | 0.4170 | 0.0042 | 0.1167 | 0.0377 |
0.0031 | 94.9986 | 34437 | 0.4179 | 0.0042 | 0.1169 | 0.0377 |
0.003 | 96.0 | 34800 | 0.4179 | 0.0042 | 0.1170 | 0.0377 |
0.0032 | 96.9986 | 35162 | 0.4180 | 0.0042 | 0.1167 | 0.0377 |
0.0033 | 98.0 | 35525 | 0.4180 | 0.0042 | 0.1167 | 0.0377 |
0.0034 | 98.9986 | 35887 | 0.4179 | 0.0042 | 0.1169 | 0.0377 |
0.0032 | 99.8621 | 36200 | 0.4179 | 0.0042 | 0.1169 | 0.0377 |
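The learning-rate schedule above (cosine with 500 warmup steps over 36,200 total steps) can be sketched as follows. This reproduces the shape only; the run itself would have used transformers' `get_cosine_schedule_with_warmup`:

```python
import math

def cosine_lr(step, base_lr=1e-4, warmup_steps=500, total_steps=36200):
    """Linear warmup followed by cosine decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

Peak learning rate is reached at step 500 and decays smoothly toward zero by the final step.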
### Framework versions
- Transformers 4.43.3
- Pytorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1