
wav2vec2-xls-r-300m-lg-CV-Fleurs-313hrs-v10

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m (the auto-generated card does not name the fine-tuning dataset). It achieves the following results on the evaluation set:

  • Loss: 0.4601
  • Wer: 0.2549
  • Cer: 0.0587
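WER (word error rate) and CER (character error rate) are edit-distance metrics: the number of substitutions, insertions, and deletions needed to turn the hypothesis into the reference, divided by the reference length. A minimal plain-Python illustration (toy strings, not actual model output):

```python
def edit_distance(ref, hyp):
    """Classic dynamic-programming Levenshtein distance over two sequences."""
    m, n = len(ref), len(hyp)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i  # deleting i items from the reference
    for j in range(n + 1):
        d[0][j] = j  # inserting j items into an empty reference
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + cost, # substitution (or match)
            )
    return d[m][n]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / number of reference words."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

So a WER of 0.2549 means roughly one word-level error per four reference words; in practice a library such as `jiwer` computes the same quantity.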

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
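The hyperparameters above map naturally onto a Hugging Face `TrainingArguments` configuration. This is an illustrative sketch, not the authors' actual training script; `output_dir` is a placeholder, and `fp16=True` is an assumption based on "Native AMP":

```python
from transformers import TrainingArguments

# Illustrative mapping of the reported hyperparameters (placeholder output_dir).
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-lg",  # placeholder path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```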

Training results

| Training Loss | Epoch | Step    | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-------:|:---------------:|:------:|:------:|
| 0.4811        | 1     | 15243   | 0.3677          | 0.4179 | 0.0980 |
| 0.2902        | 2     | 30487   | 0.3308          | 0.3884 | 0.0908 |
| 0.2556        | 3     | 45730   | 0.3040          | 0.3639 | 0.0846 |
| 0.2322        | 4     | 60974   | 0.2930          | 0.3630 | 0.0842 |
| 0.2132        | 5     | 76217   | 0.2876          | 0.3444 | 0.0812 |
| 0.1984        | 6     | 91461   | 0.2855          | 0.3463 | 0.0791 |
| 0.1870        | 7     | 106704  | 0.2762          | 0.3391 | 0.0782 |
| 0.1775        | 8     | 121948  | 0.2709          | 0.3268 | 0.0757 |
| 0.1684        | 9     | 137191  | 0.2638          | 0.3347 | 0.0769 |
| 0.1599        | 10    | 152435  | 0.2634          | 0.3194 | 0.0737 |
| 0.1526        | 11    | 167678  | 0.2595          | 0.3318 | 0.0767 |
| 0.1461        | 12    | 182922  | 0.2522          | 0.3455 | 0.0774 |
| 0.1394        | 13    | 198165  | 0.2669          | 0.3176 | 0.0736 |
| 0.1338        | 14    | 213409  | 0.2505          | 0.3090 | 0.0716 |
| 0.1278        | 15    | 228652  | 0.2492          | 0.3018 | 0.0687 |
| 0.1222        | 16    | 243896  | 0.2582          | 0.3069 | 0.0711 |
| 0.1168        | 17    | 259139  | 0.2489          | 0.3038 | 0.0691 |
| 0.1111        | 18    | 274383  | 0.2557          | 0.3056 | 0.0687 |
| 0.1068        | 19    | 289626  | 0.2539          | 0.3100 | 0.0695 |
| 0.1020        | 20    | 304870  | 0.2665          | 0.2968 | 0.0680 |
| 0.0970        | 21    | 320113  | 0.2671          | 0.2946 | 0.0677 |
| 0.0930        | 22    | 335357  | 0.2599          | 0.3002 | 0.0676 |
| 0.0886        | 23    | 350600  | 0.2600          | 0.2949 | 0.0672 |
| 0.0852        | 24    | 365844  | 0.2518          | 0.2996 | 0.0687 |
| 0.0819        | 25    | 381087  | 0.2684          | 0.2837 | 0.0644 |
| 0.0792        | 26    | 396331  | 0.2594          | 0.2904 | 0.0668 |
| 0.0757        | 27    | 411574  | 0.2652          | 0.2951 | 0.0679 |
| 0.0729        | 28    | 426818  | 0.2712          | 0.2843 | 0.0647 |
| 0.0703        | 29    | 442061  | 0.2807          | 0.2846 | 0.0657 |
| 0.0677        | 30    | 457305  | 0.2913          | 0.2882 | 0.0655 |
| 0.0648        | 31    | 472548  | 0.2877          | 0.2894 | 0.0664 |
| 0.0627        | 32    | 487792  | 0.2811          | 0.2867 | 0.0661 |
| 0.0610        | 33    | 503035  | 0.2746          | 0.2836 | 0.0649 |
| 0.0589        | 34    | 518279  | 0.2912          | 0.2925 | 0.0658 |
| 0.0573        | 35    | 533522  | 0.2948          | 0.2802 | 0.0650 |
| 0.0553        | 36    | 548766  | 0.2751          | 0.2911 | 0.0670 |
| 0.0538        | 37    | 564009  | 0.3105          | 0.2822 | 0.0645 |
| 0.0518        | 38    | 579253  | 0.3066          | 0.2881 | 0.0654 |
| 0.0506        | 39    | 594496  | 0.3258          | 0.2775 | 0.0650 |
| 0.0494        | 40    | 609740  | 0.2869          | 0.2778 | 0.0643 |
| 0.0479        | 41    | 624983  | 0.3182          | 0.2795 | 0.0643 |
| 0.0468        | 42    | 640227  | 0.3191          | 0.2742 | 0.0643 |
| 0.0450        | 43    | 655470  | 0.2913          | 0.2760 | 0.0637 |
| 0.0442        | 44    | 670714  | 0.3118          | 0.2679 | 0.0626 |
| 0.0429        | 45    | 685957  | 0.3220          | 0.2800 | 0.0648 |
| 0.0419        | 46    | 701201  | 0.3214          | 0.2740 | 0.0636 |
| 0.0404        | 47    | 716444  | 0.3310          | 0.2781 | 0.0652 |
| 0.0399        | 48    | 731688  | 0.3278          | 0.2735 | 0.0638 |
| 0.0389        | 49    | 746931  | 0.3374          | 0.2689 | 0.0629 |
| 0.0377        | 50    | 762175  | 0.3290          | 0.2688 | 0.0613 |
| 0.0367        | 51    | 777418  | 0.3499          | 0.2734 | 0.0628 |
| 0.0358        | 52    | 792662  | 0.3411          | 0.2737 | 0.0623 |
| 0.0348        | 53    | 807905  | 0.3417          | 0.2696 | 0.0618 |
| 0.0343        | 54    | 823149  | 0.3700          | 0.2705 | 0.0617 |
| 0.0337        | 55    | 838392  | 0.3585          | 0.2699 | 0.0623 |
| 0.0330        | 56    | 853636  | 0.3379          | 0.2634 | 0.0613 |
| 0.0316        | 57    | 868879  | 0.3648          | 0.2644 | 0.0614 |
| 0.0314        | 58    | 884123  | 0.3614          | 0.2677 | 0.0621 |
| 0.0303        | 59    | 899366  | 0.3511          | 0.2636 | 0.0612 |
| 0.0297        | 60    | 914610  | 0.3631          | 0.2662 | 0.0614 |
| 0.0292        | 61    | 929853  | 0.3805          | 0.2628 | 0.0608 |
| 0.0284        | 62    | 945097  | 0.3762          | 0.2683 | 0.0615 |
| 0.0276        | 63    | 960340  | 0.3928          | 0.2545 | 0.0593 |
| 0.0272        | 64    | 975584  | 0.3900          | 0.2613 | 0.0605 |
| 0.0264        | 65    | 990827  | 0.3899          | 0.2614 | 0.0602 |
| 0.0257        | 66    | 1006071 | 0.4071          | 0.2549 | 0.0592 |
| 0.0252        | 67    | 1021314 | 0.3988          | 0.2518 | 0.0586 |
| 0.0247        | 68    | 1036558 | 0.4146          | 0.2553 | 0.0590 |
| 0.0239        | 69    | 1051801 | 0.4296          | 0.2582 | 0.0595 |
| 0.0234        | 70    | 1067045 | 0.4171          | 0.2589 | 0.0595 |
| 0.0229        | 71    | 1082288 | 0.4320          | 0.2558 | 0.0597 |
| 0.0225        | 72    | 1097532 | 0.4148          | 0.2604 | 0.0604 |
| 0.0217        | 73    | 1112775 | 0.4315          | 0.2607 | 0.0596 |
| 0.0213        | 74    | 1128019 | 0.4083          | 0.2568 | 0.0589 |
| 0.0205        | 75    | 1143262 | 0.4253          | 0.2545 | 0.0589 |
| 0.0199        | 76    | 1158506 | 0.4413          | 0.2554 | 0.0597 |
| 0.0198        | 77    | 1173749 | 0.4320          | 0.2567 | 0.0587 |
| 0.0191        | 78    | 1188993 | 0.4595          | 0.2511 | 0.0587 |
| 0.0189        | 79    | 1204236 | 0.4601          | 0.2549 | 0.0587 |
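Note that validation loss bottoms out around epoch 15-17 and then climbs while WER and CER keep improving, so selecting a checkpoint by loss and by WER would disagree here. A tiny helper over a few (epoch, validation_loss, WER) rows copied from the table above makes the difference concrete:

```python
# A few (epoch, validation_loss, wer) rows copied from the results table
rows = [
    (15, 0.2492, 0.3018),
    (63, 0.3928, 0.2545),
    (67, 0.3988, 0.2518),
    (78, 0.4595, 0.2511),
    (79, 0.4601, 0.2549),
]

best_by_wer = min(rows, key=lambda r: r[2])   # lowest word error rate -> epoch 78
best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss -> epoch 15
```

For an ASR model, selecting on WER (or CER) is usually the more meaningful criterion than validation loss.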

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
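To approximate the training environment, the reported library versions can be pinned; the exact PyTorch wheel depends on your CUDA setup (the cu118 index below matches the `2.1.0+cu118` build listed above):

```shell
pip install "transformers==4.46.3" "datasets==3.1.0" "tokenizers==0.20.3"
pip install "torch==2.1.0" --index-url https://download.pytorch.org/whl/cu118
```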