wav2vec2-xls-r-300m-lg-CV-Fleurs-100hrs-v11

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on 100 hours of Luganda (lg) speech drawn from the Common Voice and FLEURS datasets (as indicated by the model name). It achieves the following results on the evaluation set:

  • Loss: 0.6212
  • WER: 0.2948
  • CER: 0.0651
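A minimal transcription sketch, assuming the checkpoint is published under the repository id shown in the model tree below and that your audio is 16 kHz mono; the processor/CTC-argmax-decode pattern below is the standard wav2vec2 usage in transformers, not code taken from this card:

```python
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

# Repo id taken from the model tree section of this card; adjust if needed.
MODEL_ID = "asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-100hrs-v11"

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

def transcribe(speech, sampling_rate=16_000):
    # `speech` must be a 1-D float array sampled at 16 kHz (XLS-R's expected rate).
    inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    pred_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(pred_ids)[0]
```

Greedy argmax decoding is the simplest option; a language-model-backed beam-search decoder would typically lower WER further.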

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 80
  • mixed_precision_training: Native AMP
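The effective batch size and learning-rate trajectory follow from the numbers above; a small pure-Python sketch (the zero-warmup linear decay is an assumption, since the card does not list warmup steps, and `steps_per_epoch` is read off the training-results table):

```python
# Effective batch size: per-device batch x gradient-accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 16

# Linear LR decay (transformers' `linear` scheduler, assuming zero warmup).
base_lr = 3e-4
steps_per_epoch = 4599            # from the training-results table
total_steps = 80 * steps_per_epoch

def lr_at(step):
    # LR falls linearly from base_lr at step 0 to 0 at total_steps.
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

Halfway through training this schedule puts the learning rate at exactly half its initial value, which matches the steadily shrinking training loss in the table below.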

Training results

Training Loss Epoch Step Validation Loss WER CER
1.5487 1.0 4599 0.4155 0.4712 0.1121
0.8135 2.0 9198 0.3650 0.4371 0.1048
0.7127 3.0 13797 0.3460 0.4146 0.0959
0.6495 4.0 18396 0.3361 0.3995 0.0932
0.6054 5.0 22995 0.3249 0.3915 0.0936
0.5634 6.0 27594 0.3077 0.3835 0.0884
0.5317 7.0 32193 0.3028 0.3811 0.0869
0.503 8.0 36792 0.3054 0.3637 0.0832
0.4765 9.0 41391 0.3085 0.3720 0.0843
0.4522 10.0 45990 0.3065 0.3592 0.0826
0.4319 11.0 50589 0.3052 0.3513 0.0817
0.4162 12.0 55188 0.2994 0.3586 0.0820
0.3951 13.0 59787 0.3037 0.3566 0.0820
0.3767 14.0 64386 0.3007 0.3443 0.0789
0.3596 15.0 68985 0.3114 0.3480 0.0786
0.342 16.0 73584 0.3060 0.3561 0.0809
0.3296 17.0 78183 0.3142 0.3505 0.0791
0.3134 18.0 82782 0.3151 0.3466 0.0791
0.3007 19.0 87381 0.3024 0.3450 0.0774
0.288 20.0 91980 0.3217 0.3490 0.0791
0.2739 21.0 96579 0.3262 0.3471 0.0784
0.263 22.0 101178 0.3330 0.3424 0.0778
0.2514 23.0 105777 0.3410 0.3395 0.0765
0.2386 24.0 110376 0.3486 0.3419 0.0775
0.2281 25.0 114975 0.3733 0.3422 0.0773
0.22 26.0 119574 0.3612 0.3432 0.0789
0.2119 27.0 124173 0.3607 0.3332 0.0754
0.2026 28.0 128772 0.3799 0.3289 0.0755
0.1948 29.0 133371 0.3843 0.3315 0.0758
0.1882 30.0 137970 0.3924 0.3323 0.0761
0.1799 31.0 142569 0.4123 0.3340 0.0769
0.1749 32.0 147168 0.4131 0.3294 0.0748
0.1675 33.0 151767 0.4262 0.3285 0.0747
0.1608 34.0 156366 0.4185 0.3320 0.0750
0.1569 35.0 160965 0.4290 0.3327 0.0763
0.1523 36.0 165564 0.4362 0.3323 0.0746
0.148 37.0 170163 0.4294 0.3344 0.0755
0.1441 38.0 174762 0.4307 0.3279 0.0741
0.1408 39.0 179361 0.4561 0.3182 0.0717
0.1341 40.0 183960 0.4516 0.3278 0.0748
0.1306 41.0 188559 0.4271 0.3230 0.0731
0.1268 42.0 193158 0.4647 0.3166 0.0724
0.1231 43.0 197757 0.4642 0.3141 0.0719
0.1211 44.0 202356 0.4817 0.3141 0.0719
0.1163 45.0 206955 0.4745 0.3172 0.0715
0.1125 46.0 211554 0.4971 0.3141 0.0709
0.1087 47.0 216153 0.4876 0.3140 0.0712
0.1068 48.0 220752 0.4946 0.3115 0.0706
0.1037 49.0 225351 0.5059 0.3133 0.0712
0.1017 50.0 229950 0.4973 0.3150 0.0711
0.098 51.0 234549 0.5247 0.3164 0.0714
0.0959 52.0 239148 0.5186 0.3094 0.0699
0.0918 53.0 243747 0.5141 0.3098 0.0696
0.0898 54.0 248346 0.5227 0.3113 0.0700
0.0876 55.0 252945 0.5191 0.3039 0.0685
0.0853 56.0 257544 0.5197 0.3057 0.0697
0.0826 57.0 262143 0.5355 0.3072 0.0694
0.0807 58.0 266742 0.5214 0.3097 0.0694
0.0781 59.0 271341 0.5354 0.3066 0.0687
0.0767 60.0 275940 0.5264 0.3034 0.0685
0.0737 61.0 280539 0.5649 0.3028 0.0682
0.0719 62.0 285138 0.5492 0.3077 0.0686
0.07 63.0 289737 0.5510 0.3002 0.0669
0.0681 64.0 294336 0.5730 0.3069 0.0682
0.0662 65.0 298935 0.5789 0.3047 0.0688
0.0651 66.0 303534 0.5612 0.2977 0.0668
0.0624 67.0 308133 0.5640 0.2947 0.0660
0.0614 68.0 312732 0.5807 0.2970 0.0662
0.06 69.0 317331 0.5735 0.2981 0.0670
0.0586 70.0 321930 0.6100 0.3002 0.0670
0.0566 71.0 326529 0.5831 0.2964 0.0665
0.0538 72.0 331128 0.6040 0.2996 0.0666
0.0528 73.0 335727 0.6159 0.2983 0.0670
0.0517 74.0 340326 0.6104 0.2996 0.0665
0.0508 75.0 344925 0.6106 0.2984 0.0663
0.0496 76.0 349524 0.6138 0.2990 0.0658
0.0486 77.0 354123 0.6097 0.2967 0.0657
0.0473 78.0 358722 0.6213 0.2954 0.0653
0.0479 79.0 363321 0.6212 0.2948 0.0651
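The WER and CER columns are both normalized edit distances, computed over words and characters respectively. A minimal illustration of the metric (not the evaluation script used for this model):

```python
def edit_distance(ref, hyp):
    # Levenshtein distance via a single-row dynamic program.
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # dp[j] (old) = delete, dp[j-1] (new) = insert, prev = substitute/match
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[-1]

def wer(reference, hypothesis):
    # Word error rate: word-level edits divided by reference word count.
    ref = reference.split()
    return edit_distance(ref, hypothesis.split()) / len(ref)

def cer(reference, hypothesis):
    # Character error rate: character-level edits divided by reference length.
    return edit_distance(reference, hypothesis) / len(reference)
```

Under this definition, the final row's WER of 0.2948 corresponds to roughly 29 word-level errors per 100 reference words.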

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.1.0+cu118
  • Datasets 3.2.0
  • Tokenizers 0.21.0
Model size: 315M parameters (safetensors, F32)

Model tree for asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-100hrs-v11

Fine-tuned from facebook/wav2vec2-xls-r-300m.