
wav2vec2-xls-r-300m-lg-CV-Fleurs-5hrs-v10

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The training dataset is not recorded in this auto-generated card (it appears as "None"); the repository name suggests roughly 5 hours of Luganda (lg) Common Voice and FLEURS audio. It achieves the following results on the evaluation set:

  • Loss: 1.4188
  • WER: 0.6262
  • CER: 0.1538
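For reference, WER and CER are word- and character-level edit-distance rates: (substitutions + insertions + deletions) divided by the reference length. The card does not say which metric library was used (the `evaluate`/`jiwer` packages are a common choice, but that is an assumption); a minimal self-contained sketch of the computation:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences via one-row dynamic programming."""
    n = len(hyp)
    dp = list(range(n + 1))  # distances for the empty reference prefix
    for i in range(1, len(ref) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            # deletion, insertion, or substitution/match (bool counts as 0/1)
            dp[j] = min(dp[j] + 1, dp[j - 1] + 1, prev + (ref[i - 1] != hyp[j - 1]))
            prev = cur
    return dp[n]

def wer(refs, hyps):
    """Word error rate: token-level edits over total reference words."""
    errors = sum(edit_distance(r.split(), h.split()) for r, h in zip(refs, hyps))
    return errors / sum(len(r.split()) for r in refs)

def cer(refs, hyps):
    """Character error rate: character-level edits over total reference characters."""
    errors = sum(edit_distance(list(r), list(h)) for r, h in zip(refs, hyps))
    return errors / sum(len(r) for r in refs)
```

So the reported WER of 0.6262 means roughly 63 word edits per 100 reference words.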

Model description

More information needed

Intended uses & limitations

More information needed
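Although the card documents no usage, a wav2vec2 CTC checkpoint like this one can typically be loaded with the standard `transformers` ASR pipeline. A sketch, assuming the hub id from this repository (the audio path is hypothetical, and input audio should be 16 kHz to match XLS-R pretraining):

```python
MODEL_ID = "asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-5hrs-v10"

def transcribe(audio_path: str) -> str:
    """Transcribe a 16 kHz Luganda audio file with the fine-tuned checkpoint."""
    # Imported lazily so the module loads even without transformers installed.
    from transformers import pipeline
    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    return asr(audio_path)["text"]

# Hypothetical usage (downloads ~1.2 GB of weights on first call):
# print(transcribe("sample.wav"))
```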

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
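The listed values imply an effective train batch size of 8 × 2 = 16, which matches the reported total_train_batch_size. A sketch of the configuration as a plain dict (key names mirror `transformers.TrainingArguments` fields; this is a reconstruction from the list above, not the author's actual training script):

```python
# Hyperparameters as listed in the card (reconstructed, not the training script).
hparams = {
    "learning_rate": 3e-4,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 4,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
    "fp16": True,  # "Native AMP" mixed precision
}

# Effective (total) train batch size = per-device batch * accumulation steps.
effective_batch = (hparams["per_device_train_batch_size"]
                   * hparams["gradient_accumulation_steps"])
```

The results table shows 163 optimizer steps per epoch, so each epoch covers roughly 163 × 16 = 2608 training examples.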

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:---:|:---:|:---:|:---:|:---:|:---:|
| 4.0858 | 1.0 | 163 | 2.9982 | 1.0 | 1.0 |
| 2.9847 | 2.0 | 326 | 2.8777 | 1.0 | 0.8974 |
| 2.6893 | 3.0 | 489 | 2.3977 | 1.0 | 0.8788 |
| 1.9846 | 4.0 | 652 | 1.5579 | 0.9969 | 0.4456 |
| 1.4038 | 5.0 | 815 | 1.2232 | 0.9731 | 0.3389 |
| 1.1532 | 6.0 | 978 | 1.0792 | 0.9346 | 0.2837 |
| 0.9962 | 7.0 | 1141 | 0.9944 | 0.9139 | 0.2624 |
| 0.8902 | 8.0 | 1304 | 0.9521 | 0.8855 | 0.2485 |
| 0.7932 | 9.0 | 1467 | 0.9181 | 0.8505 | 0.2324 |
| 0.7048 | 10.0 | 1630 | 0.8911 | 0.8354 | 0.2244 |
| 0.6321 | 11.0 | 1793 | 0.8618 | 0.8210 | 0.2164 |
| 0.5768 | 12.0 | 1956 | 0.9057 | 0.7999 | 0.2123 |
| 0.5301 | 13.0 | 2119 | 0.8889 | 0.7935 | 0.2085 |
| 0.4776 | 14.0 | 2282 | 0.9119 | 0.7774 | 0.2065 |
| 0.4523 | 15.0 | 2445 | 0.9176 | 0.7592 | 0.1995 |
| 0.4156 | 16.0 | 2608 | 0.9182 | 0.7537 | 0.1981 |
| 0.3819 | 17.0 | 2771 | 0.9575 | 0.7564 | 0.1964 |
| 0.3576 | 18.0 | 2934 | 0.9811 | 0.7468 | 0.1913 |
| 0.3412 | 19.0 | 3097 | 0.9944 | 0.7456 | 0.1920 |
| 0.3156 | 20.0 | 3260 | 0.9957 | 0.7393 | 0.1940 |
| 0.3093 | 21.0 | 3423 | 1.0222 | 0.7368 | 0.1921 |
| 0.2801 | 22.0 | 3586 | 1.0590 | 0.7453 | 0.1903 |
| 0.2645 | 23.0 | 3749 | 1.0664 | 0.7311 | 0.1880 |
| 0.2633 | 24.0 | 3912 | 1.1002 | 0.7381 | 0.1878 |
| 0.2497 | 25.0 | 4075 | 1.1172 | 0.7254 | 0.1866 |
| 0.2428 | 26.0 | 4238 | 1.1193 | 0.7122 | 0.1817 |
| 0.2271 | 27.0 | 4401 | 1.1254 | 0.7134 | 0.1832 |
| 0.2188 | 28.0 | 4564 | 1.1712 | 0.7110 | 0.1838 |
| 0.2082 | 29.0 | 4727 | 1.1919 | 0.7098 | 0.1811 |
| 0.2101 | 30.0 | 4890 | 1.1307 | 0.7019 | 0.1801 |
| 0.1996 | 31.0 | 5053 | 1.1777 | 0.7123 | 0.1817 |
| 0.1999 | 32.0 | 5216 | 1.1846 | 0.7040 | 0.1813 |
| 0.1878 | 33.0 | 5379 | 1.2185 | 0.6989 | 0.1796 |
| 0.1777 | 34.0 | 5542 | 1.1912 | 0.6959 | 0.1783 |
| 0.1739 | 35.0 | 5705 | 1.1820 | 0.6807 | 0.1737 |
| 0.1688 | 36.0 | 5868 | 1.2557 | 0.6891 | 0.1766 |
| 0.1669 | 37.0 | 6031 | 1.2223 | 0.6986 | 0.1777 |
| 0.1629 | 38.0 | 6194 | 1.2708 | 0.6779 | 0.1738 |
| 0.1598 | 39.0 | 6357 | 1.2525 | 0.6866 | 0.1737 |
| 0.1537 | 40.0 | 6520 | 1.2775 | 0.6864 | 0.1771 |
| 0.1565 | 41.0 | 6683 | 1.1535 | 0.6815 | 0.1725 |
| 0.1469 | 42.0 | 6846 | 1.2959 | 0.6876 | 0.1734 |
| 0.1473 | 43.0 | 7009 | 1.2196 | 0.6861 | 0.1724 |
| 0.1445 | 44.0 | 7172 | 1.2406 | 0.6888 | 0.1738 |
| 0.1415 | 45.0 | 7335 | 1.3067 | 0.6719 | 0.1700 |
| 0.1372 | 46.0 | 7498 | 1.3071 | 0.6826 | 0.1717 |
| 0.1367 | 47.0 | 7661 | 1.2246 | 0.6723 | 0.1672 |
| 0.1329 | 48.0 | 7824 | 1.2453 | 0.6734 | 0.1675 |
| 0.1326 | 49.0 | 7987 | 1.2906 | 0.6705 | 0.1673 |
| 0.1274 | 50.0 | 8150 | 1.2884 | 0.6672 | 0.1658 |
| 0.1278 | 51.0 | 8313 | 1.3130 | 0.6668 | 0.1658 |
| 0.121 | 52.0 | 8476 | 1.3030 | 0.6677 | 0.1660 |
| 0.1272 | 53.0 | 8639 | 1.3259 | 0.6703 | 0.1677 |
| 0.1187 | 54.0 | 8802 | 1.3041 | 0.6594 | 0.1645 |
| 0.1113 | 55.0 | 8965 | 1.2921 | 0.6580 | 0.1647 |
| 0.1099 | 56.0 | 9128 | 1.2852 | 0.6562 | 0.1654 |
| 0.1065 | 57.0 | 9291 | 1.3163 | 0.6595 | 0.1653 |
| 0.1073 | 58.0 | 9454 | 1.3684 | 0.6610 | 0.1657 |
| 0.1072 | 59.0 | 9617 | 1.3373 | 0.6531 | 0.1640 |
| 0.1112 | 60.0 | 9780 | 1.3282 | 0.6663 | 0.1661 |
| 0.1006 | 61.0 | 9943 | 1.3321 | 0.6589 | 0.1646 |
| 0.1028 | 62.0 | 10106 | 1.3799 | 0.6630 | 0.1656 |
| 0.1009 | 63.0 | 10269 | 1.3363 | 0.6496 | 0.1628 |
| 0.1027 | 64.0 | 10432 | 1.3147 | 0.6505 | 0.1617 |
| 0.0978 | 65.0 | 10595 | 1.3652 | 0.6514 | 0.1614 |
| 0.097 | 66.0 | 10758 | 1.3613 | 0.6584 | 0.1630 |
| 0.0922 | 67.0 | 10921 | 1.3973 | 0.6517 | 0.1628 |
| 0.0938 | 68.0 | 11084 | 1.3727 | 0.6516 | 0.1642 |
| 0.0945 | 69.0 | 11247 | 1.3770 | 0.6465 | 0.1604 |
| 0.0902 | 70.0 | 11410 | 1.3821 | 0.6463 | 0.1616 |
| 0.0923 | 71.0 | 11573 | 1.3953 | 0.6542 | 0.1603 |
| 0.0836 | 72.0 | 11736 | 1.3591 | 0.6439 | 0.1597 |
| 0.0834 | 73.0 | 11899 | 1.3937 | 0.6445 | 0.1608 |
| 0.0865 | 74.0 | 12062 | 1.3349 | 0.6404 | 0.1583 |
| 0.0862 | 75.0 | 12225 | 1.3709 | 0.6476 | 0.1598 |
| 0.0834 | 76.0 | 12388 | 1.3554 | 0.6431 | 0.1595 |
| 0.0801 | 77.0 | 12551 | 1.3893 | 0.6472 | 0.1582 |
| 0.0763 | 78.0 | 12714 | 1.3692 | 0.6399 | 0.1573 |
| 0.0776 | 79.0 | 12877 | 1.3976 | 0.6441 | 0.1582 |
| 0.0766 | 80.0 | 13040 | 1.3997 | 0.6423 | 0.1574 |
| 0.0769 | 81.0 | 13203 | 1.3624 | 0.6427 | 0.1577 |
| 0.0766 | 82.0 | 13366 | 1.3666 | 0.6349 | 0.1565 |
| 0.0733 | 83.0 | 13529 | 1.3526 | 0.6382 | 0.1562 |
| 0.0724 | 84.0 | 13692 | 1.3692 | 0.6369 | 0.1560 |
| 0.068 | 85.0 | 13855 | 1.3989 | 0.6289 | 0.1547 |
| 0.0706 | 86.0 | 14018 | 1.4132 | 0.6309 | 0.1549 |
| 0.069 | 87.0 | 14181 | 1.3883 | 0.6302 | 0.1551 |
| 0.0658 | 88.0 | 14344 | 1.3883 | 0.6313 | 0.1542 |
| 0.0666 | 89.0 | 14507 | 1.3797 | 0.6259 | 0.1542 |
| 0.0633 | 90.0 | 14670 | 1.4137 | 0.6290 | 0.1545 |
| 0.0623 | 91.0 | 14833 | 1.4279 | 0.6300 | 0.1552 |
| 0.0679 | 92.0 | 14996 | 1.4158 | 0.6279 | 0.1547 |
| 0.0618 | 93.0 | 15159 | 1.4112 | 0.6263 | 0.1534 |
| 0.0669 | 94.0 | 15322 | 1.4296 | 0.6276 | 0.1539 |
| 0.065 | 95.0 | 15485 | 1.4423 | 0.6292 | 0.1541 |
| 0.0619 | 96.0 | 15648 | 1.4188 | 0.6286 | 0.1539 |
| 0.0615 | 97.0 | 15811 | 1.4217 | 0.6260 | 0.1535 |
| 0.0635 | 98.0 | 15974 | 1.4130 | 0.6261 | 0.1537 |
| 0.0595 | 99.0 | 16137 | 1.4169 | 0.6267 | 0.1537 |
| 0.0584 | 100.0 | 16300 | 1.4188 | 0.6262 | 0.1538 |

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
Model size: 315M parameters (F32, safetensors)

Full model id: asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-5hrs-v10 (fine-tuned from facebook/wav2vec2-xls-r-300m)