
asr-africa/wav2vec2-xls-r-wolof-mixed-75-hours

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the fleurs dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0600
  • Wer: 0.4136
  • Cer: 0.1413
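For reference, WER and CER are normalized edit distances at the word and character level. The card does not say which library computed the metrics above (typically `evaluate` or `jiwer`); a minimal self-contained sketch of the same computation:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences, O(len(hyp)) memory."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # prev holds dp[i-1][j-1]; dp[j] still holds dp[i-1][j]
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (r != h))  # substitution
    return dp[len(hyp)]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, `wer("the cat sat", "the cat sit")` is 1/3: one substituted word out of three reference words.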

Model description

wav2vec2-xls-r-wolof-mixed-75-hours is a ~315M-parameter wav2vec2 XLS-R checkpoint (float32 weights) fine-tuned for Wolof automatic speech recognition. Like other wav2vec2 CTC models, it expects 16 kHz mono audio and emits character-level transcriptions. Further details (vocabulary, preprocessing, any language-model fusion) have not been documented.

Intended uses & limitations

The model is intended for transcribing Wolof speech. Its limitations (robustness to background noise, dialect variation, and out-of-domain audio) have not been documented; evaluate it on your own data before relying on it.

Training and evaluation data

The card header states the model was fine-tuned on the fleurs dataset, and the repository name indicates a mixed Wolof corpus of roughly 75 hours; the exact composition of the training and evaluation splits has not been documented. The metrics above are reported on the evaluation set.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 36
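Two of the values above are derived rather than set directly: the effective batch size is train_batch_size × gradient_accumulation_steps, and with warmup_ratio = 0.1 the linear scheduler warms up over the first 10% of optimizer steps. A sketch of the arithmetic; the total step count is estimated from the log below (step 41200 at epoch ~35.83, so roughly 1150 steps per epoch), not a logged value:

```python
train_batch_size = 16
gradient_accumulation_steps = 2
# Effective examples consumed per optimizer update
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32

# Estimated schedule length: ~1150 optimizer steps/epoch x 36 epochs
steps_per_epoch = 1150  # estimate derived from the training log
num_epochs = 36
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(0.1 * total_steps)  # linear warmup portion
print(total_steps, warmup_steps)  # 41400 4140
```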

Training results

Training Loss Epoch Step Validation Loss Wer Cer
15.2262 0.3478 400 3.9480 1.0 1.0
6.2701 0.6957 800 3.1176 1.0 1.0
4.3901 1.0435 1200 1.1108 0.7271 0.2440
2.3527 1.3913 1600 0.8827 0.5767 0.1992
2.0303 1.7391 2000 0.7998 0.5375 0.1872
1.8695 2.0870 2400 0.7999 0.5131 0.1835
1.7196 2.4348 2800 0.7045 0.5181 0.1821
1.7028 2.7826 3200 0.7431 0.4909 0.1770
1.6103 3.1304 3600 0.7454 0.4996 0.1794
1.5956 3.4783 4000 0.7233 0.5003 0.1761
1.5677 3.8261 4400 0.6952 0.5044 0.1782
1.4969 4.1739 4800 0.7432 0.5196 0.1815
1.4505 4.5217 5200 0.6875 0.4674 0.1657
1.4267 4.8696 5600 0.6290 0.4924 0.1717
1.3537 5.2174 6000 0.6311 0.4858 0.1718
1.3208 5.5652 6400 0.6676 0.4720 0.1659
1.3139 5.9130 6800 0.6329 0.5023 0.1729
1.2517 6.2609 7200 0.6445 0.4987 0.1753
1.2352 6.6087 7600 0.6670 0.4629 0.1645
1.2174 6.9565 8000 0.6350 0.4776 0.1645
1.1423 7.3043 8400 0.6674 0.4537 0.1584
1.1593 7.6522 8800 0.5977 0.4508 0.1586
1.1651 8.0 9200 0.6381 0.4581 0.1598
1.0786 8.3478 9600 0.6538 0.4459 0.1573
1.0895 8.6957 10000 0.6510 0.4759 0.1635
1.0842 9.0435 10400 0.6333 0.4545 0.1587
0.9992 9.3913 10800 0.6645 0.4452 0.1547
1.0316 9.7391 11200 0.5950 0.4535 0.1582
1.0176 10.0870 11600 0.6691 0.4538 0.1591
0.9826 10.4348 12000 0.6146 0.4576 0.1612
0.9581 10.7826 12400 0.6362 0.4366 0.1518
0.9315 11.1304 12800 0.6494 0.4671 0.1598
0.9231 11.4783 13200 0.6859 0.4285 0.1516
0.9268 11.8261 13600 0.6601 0.4287 0.1528
0.9014 12.1739 14000 0.6848 0.4398 0.1536
0.8824 12.5217 14400 0.6399 0.4296 0.1505
0.8893 12.8696 14800 0.6754 0.4327 0.1555
0.8332 13.2174 15200 0.6464 0.4399 0.1555
0.8197 13.5652 15600 0.6878 0.4390 0.1533
0.8641 13.9130 16000 0.6685 0.4387 0.1536
0.8037 14.2609 16400 0.6413 0.4451 0.1521
0.8075 14.6087 16800 0.6770 0.4314 0.1501
0.7989 14.9565 17200 0.6403 0.4527 0.1578
0.777 15.3043 17600 0.6591 0.4323 0.1515
0.7647 15.6522 18000 0.6981 0.4357 0.1540
0.7668 16.0 18400 0.6699 0.4248 0.1477
0.7258 16.3478 18800 0.6743 0.4424 0.1512
0.735 16.6957 19200 0.6855 0.4392 0.1519
0.7163 17.0435 19600 0.7157 0.4334 0.1515
0.7059 17.3913 20000 0.7152 0.4391 0.1519
0.6953 17.7391 20400 0.6298 0.4291 0.1504
0.6923 18.0870 20800 0.7449 0.4373 0.1517
0.6654 18.4348 21200 0.6600 0.4243 0.1487
0.6776 18.7826 21600 0.7403 0.4291 0.1484
0.659 19.1304 22000 0.7134 0.4368 0.1509
0.6427 19.4783 22400 0.7312 0.4296 0.1491
0.6386 19.8261 22800 0.7457 0.4403 0.1538
0.6375 20.1739 23200 0.7278 0.4417 0.1524
0.6202 20.5217 23600 0.7500 0.4336 0.1493
0.6212 20.8696 24000 0.7423 0.4229 0.1480
0.6119 21.2174 24400 0.7775 0.4273 0.1485
0.588 21.5652 24800 0.7921 0.4313 0.1484
0.6015 21.9130 25200 0.7427 0.4240 0.1460
0.5641 22.2609 25600 0.8377 0.4322 0.1491
0.5718 22.6087 26000 0.7952 0.4293 0.1499
0.5936 22.9565 26400 0.8305 0.4365 0.1496
0.5587 23.3043 26800 0.8194 0.4386 0.1513
0.5488 23.6522 27200 0.7958 0.4288 0.1470
0.5532 24.0 27600 0.8417 0.4227 0.1451
0.5325 24.3478 28000 0.8885 0.4282 0.1475
0.5233 24.6957 28400 0.8220 0.4334 0.1489
0.5247 25.0435 28800 0.9235 0.4281 0.1466
0.5185 25.3913 29200 0.8981 0.4330 0.1493
0.5099 25.7391 29600 0.8012 0.4311 0.1479
0.5066 26.0870 30000 0.9317 0.4199 0.1446
0.489 26.4348 30400 0.9119 0.4260 0.1454
0.4989 26.7826 30800 0.8681 0.4209 0.1447
0.4855 27.1304 31200 0.8741 0.4313 0.1467
0.4724 27.4783 31600 0.8952 0.4205 0.1426
0.4715 27.8261 32000 0.8722 0.4275 0.1457
0.4654 28.1739 32400 0.9719 0.4321 0.1472
0.4678 28.5217 32800 0.9647 0.4250 0.1453
0.4599 28.8696 33200 0.9411 0.4300 0.1462
0.4374 29.2174 33600 0.9919 0.4231 0.1447
0.4486 29.5652 34000 0.9928 0.4220 0.1432
0.438 29.9130 34400 0.9597 0.4276 0.1453
0.435 30.2609 34800 0.9759 0.4273 0.1455
0.4252 30.6087 35200 0.9845 0.4267 0.1459
0.4173 30.9565 35600 1.0023 0.4255 0.1447
0.4281 31.3043 36000 1.0498 0.4223 0.1443
0.4079 31.6522 36400 1.0335 0.4206 0.1434
0.4176 32.0 36800 0.9958 0.4217 0.1435
0.4013 32.3478 37200 1.0534 0.4209 0.1439
0.401 32.6957 37600 1.0248 0.4145 0.1416
0.4081 33.0435 38000 1.0508 0.4165 0.1418
0.3951 33.3913 38400 1.0494 0.4182 0.1422
0.3852 33.7391 38800 1.0373 0.4138 0.1411
0.3985 34.0870 39200 1.0426 0.4168 0.1424
0.3823 34.4348 39600 1.0586 0.4169 0.1423
0.3859 34.7826 40000 1.0514 0.4179 0.1422
0.3948 35.1304 40400 1.0770 0.4145 0.1414
0.3637 35.4783 40800 1.0568 0.4145 0.1411
0.371 35.8261 41200 1.0600 0.4136 0.1413
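Two trends stand out in the log: validation loss bottoms out early (0.5950 at step 11200) and then climbs back above 1.0, while WER and CER keep improving to the very end. Selecting a checkpoint by WER rather than by loss therefore picks a much later step. A sketch over a few rows copied from the table above:

```python
# (step, val_loss, wer, cer) rows taken verbatim from the training log
evals = [
    (11200, 0.5950, 0.4535, 0.1582),  # lowest validation loss in the log
    (38800, 1.0373, 0.4138, 0.1411),
    (41200, 1.0600, 0.4136, 0.1413),  # final evaluation
]

best_by_loss = min(evals, key=lambda r: r[1])
best_by_wer = min(evals, key=lambda r: r[2])
print(best_by_loss[0])  # 11200
print(best_by_wer[0])   # 41200
```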

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.5.1+cu124
  • Datasets 2.17.0
  • Tokenizers 0.20.3
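To reproduce this environment, the listed versions can be pinned directly. A sketch assuming pip on Linux; the `+cu124` PyTorch build may require the extra index URL shown, depending on platform:

```shell
# Pin the framework versions listed above (the torch index URL is an assumption
# about how the +cu124 wheel was obtained)
pip install "torch==2.5.1" --index-url https://download.pytorch.org/whl/cu124
pip install "transformers==4.46.1" "datasets==2.17.0" "tokenizers==0.20.3"
```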
Inference Providers

At the time of writing, this model is not available through any supported third-party Inference Provider and is not deployed on the HF Inference API; to use it, run it locally.
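Since no hosted endpoint is available, the model has to be run locally. A minimal sketch using the transformers ASR pipeline; `audio.wav` is a placeholder path, and wav2vec2/XLS-R models expect 16 kHz mono input. The heavy import and the download sit under a main guard so the module can be loaded without network access:

```python
MODEL_ID = "asr-africa/wav2vec2-xls-r-wolof-mixed-75-hours"
SAMPLING_RATE = 16_000  # wav2vec2 / XLS-R models are trained on 16 kHz audio

if __name__ == "__main__":
    # Downloads ~1.2 GB of float32 weights (315M params) on first run.
    from transformers import pipeline

    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    # "audio.wav" is a placeholder for a local Wolof recording.
    result = asr("audio.wav")
    print(result["text"])
```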
