
wav2vec2-xls-r-300m-lg-CV-Fleurs-20hrs-v10

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The training dataset is not named in the card metadata; the model name suggests a 20-hour Luganda (lg) split drawn from Common Voice and FLEURS. It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

  • Loss: 0.9297
  • Wer: 0.4181
  • Cer: 0.0971
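
A minimal transcription sketch using the Transformers ASR pipeline. The repository ID is taken from this card; the audio path is a placeholder, and the 16 kHz mono input is the usual wav2vec 2.0 assumption:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub
# (repo ID from this card).
asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-20hrs-v10",
)

# "speech.wav" is a placeholder path; wav2vec 2.0 models expect
# 16 kHz mono audio (the pipeline decodes and resamples files via ffmpeg).
result = asr("speech.wav")
print(result["text"])
```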

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
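
For reference, a sketch of how these settings map onto transformers.TrainingArguments; output_dir is a placeholder, and fp16=True is the assumed equivalent of "Native AMP" on CUDA:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; "output_dir" is a
# placeholder, and fp16=True is the assumed Native AMP equivalent.
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-lg-CV-Fleurs-20hrs-v10",
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # mixed-precision training (Native AMP)
)
```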

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 2.9527        | 0.9992  | 664   | 1.5500          | 0.9976 | 0.4212 |
| 1.1957        | 2.0     | 1329  | 0.9051          | 0.8748 | 0.2407 |
| 0.8321        | 2.9992  | 1993  | 0.7001          | 0.7484 | 0.1890 |
| 0.6696        | 4.0     | 2658  | 0.6063          | 0.6676 | 0.1674 |
| 0.5776        | 4.9992  | 3322  | 0.5716          | 0.6176 | 0.1537 |
| 0.5096        | 6.0     | 3987  | 0.5550          | 0.5928 | 0.1433 |
| 0.4566        | 6.9992  | 4651  | 0.5552          | 0.5846 | 0.1439 |
| 0.4179        | 8.0     | 5316  | 0.5060          | 0.5598 | 0.1325 |
| 0.3822        | 8.9992  | 5980  | 0.5055          | 0.5411 | 0.1287 |
| 0.3529        | 10.0    | 6645  | 0.5259          | 0.5496 | 0.1327 |
| 0.3252        | 10.9992 | 7309  | 0.5198          | 0.5383 | 0.1278 |
| 0.3062        | 12.0    | 7974  | 0.5317          | 0.5188 | 0.1231 |
| 0.2883        | 12.9992 | 8638  | 0.5425          | 0.5104 | 0.1224 |
| 0.2692        | 14.0    | 9303  | 0.5573          | 0.5144 | 0.1239 |
| 0.2527        | 14.9992 | 9967  | 0.5541          | 0.5278 | 0.1250 |
| 0.24          | 16.0    | 10632 | 0.5480          | 0.5177 | 0.1243 |
| 0.2265        | 16.9992 | 11296 | 0.5731          | 0.5219 | 0.1220 |
| 0.2172        | 18.0    | 11961 | 0.5930          | 0.5034 | 0.1195 |
| 0.2066        | 18.9992 | 12625 | 0.5541          | 0.5043 | 0.1194 |
| 0.1954        | 20.0    | 13290 | 0.6169          | 0.4938 | 0.1167 |
| 0.1882        | 20.9992 | 13954 | 0.6457          | 0.5084 | 0.1200 |
| 0.1803        | 22.0    | 14619 | 0.6437          | 0.5041 | 0.1185 |
| 0.1738        | 22.9992 | 15283 | 0.6632          | 0.4930 | 0.1164 |
| 0.1703        | 24.0    | 15948 | 0.6391          | 0.4989 | 0.1196 |
| 0.1653        | 24.9992 | 16612 | 0.6890          | 0.4939 | 0.1156 |
| 0.1584        | 26.0    | 17277 | 0.6615          | 0.4827 | 0.1145 |
| 0.1516        | 26.9992 | 17941 | 0.6689          | 0.4767 | 0.1135 |
| 0.145         | 28.0    | 18606 | 0.7077          | 0.4856 | 0.1162 |
| 0.1406        | 28.9992 | 19270 | 0.7201          | 0.4829 | 0.1156 |
| 0.1392        | 30.0    | 19935 | 0.6945          | 0.4887 | 0.1160 |
| 0.1352        | 30.9992 | 20599 | 0.7176          | 0.4817 | 0.1142 |
| 0.1301        | 32.0    | 21264 | 0.7248          | 0.4786 | 0.1125 |
| 0.1281        | 32.9992 | 21928 | 0.7114          | 0.4817 | 0.1136 |
| 0.1242        | 34.0    | 22593 | 0.7242          | 0.4773 | 0.1137 |
| 0.121         | 34.9992 | 23257 | 0.7525          | 0.4774 | 0.1142 |
| 0.1216        | 36.0    | 23922 | 0.7138          | 0.4782 | 0.1124 |
| 0.1165        | 36.9992 | 24586 | 0.7584          | 0.4717 | 0.1118 |
| 0.1127        | 38.0    | 25251 | 0.7633          | 0.4701 | 0.1121 |
| 0.1087        | 38.9992 | 25915 | 0.7608          | 0.4680 | 0.1109 |
| 0.1078        | 40.0    | 26580 | 0.7820          | 0.4781 | 0.1120 |
| 0.1068        | 40.9992 | 27244 | 0.7711          | 0.4677 | 0.1097 |
| 0.1033        | 42.0    | 27909 | 0.7520          | 0.4646 | 0.1097 |
| 0.1003        | 42.9992 | 28573 | 0.7726          | 0.4675 | 0.1098 |
| 0.0973        | 44.0    | 29238 | 0.8112          | 0.4712 | 0.1134 |
| 0.1032        | 44.9992 | 29902 | 0.7629          | 0.4631 | 0.1088 |
| 0.0954        | 46.0    | 30567 | 0.7807          | 0.4561 | 0.1076 |
| 0.0968        | 46.9992 | 31231 | 0.7622          | 0.4613 | 0.1084 |
| 0.0908        | 48.0    | 31896 | 0.7806          | 0.4626 | 0.1093 |
| 0.0883        | 48.9992 | 32560 | 0.7897          | 0.4625 | 0.1114 |
| 0.0862        | 50.0    | 33225 | 0.7869          | 0.4582 | 0.1077 |
| 0.0856        | 50.9992 | 33889 | 0.7758          | 0.4553 | 0.1080 |
| 0.0832        | 52.0    | 34554 | 0.8347          | 0.4588 | 0.1079 |
| 0.0816        | 52.9992 | 35218 | 0.8016          | 0.4531 | 0.1069 |
| 0.0811        | 54.0    | 35883 | 0.8038          | 0.4517 | 0.1074 |
| 0.0777        | 54.9992 | 36547 | 0.8117          | 0.4502 | 0.1066 |
| 0.0777        | 56.0    | 37212 | 0.8471          | 0.4467 | 0.1062 |
| 0.0752        | 56.9992 | 37876 | 0.8348          | 0.4572 | 0.1084 |
| 0.0754        | 58.0    | 38541 | 0.7963          | 0.4582 | 0.1069 |
| 0.0739        | 58.9992 | 39205 | 0.7946          | 0.4529 | 0.1065 |
| 0.0709        | 60.0    | 39870 | 0.8774          | 0.4609 | 0.1081 |
| 0.0693        | 60.9992 | 40534 | 0.8420          | 0.4543 | 0.1066 |
| 0.0699        | 62.0    | 41199 | 0.8493          | 0.4523 | 0.1062 |
| 0.0656        | 62.9992 | 41863 | 0.8125          | 0.4478 | 0.1055 |
| 0.0648        | 64.0    | 42528 | 0.8573          | 0.4442 | 0.1053 |
| 0.0629        | 64.9992 | 43192 | 0.8419          | 0.4484 | 0.1056 |
| 0.062         | 66.0    | 43857 | 0.8429          | 0.4408 | 0.1046 |
| 0.061         | 66.9992 | 44521 | 0.8422          | 0.4395 | 0.1042 |
| 0.0635        | 68.0    | 45186 | 0.8596          | 0.4424 | 0.1033 |
| 0.0589        | 68.9992 | 45850 | 0.8556          | 0.4466 | 0.1044 |
| 0.0561        | 70.0    | 46515 | 0.8385          | 0.4379 | 0.1024 |
| 0.0582        | 70.9992 | 47179 | 0.8513          | 0.4381 | 0.1026 |
| 0.0551        | 72.0    | 47844 | 0.8944          | 0.4412 | 0.1034 |
| 0.054         | 72.9992 | 48508 | 0.8613          | 0.4401 | 0.1030 |
| 0.0518        | 74.0    | 49173 | 0.8743          | 0.4296 | 0.1009 |
| 0.0523        | 74.9992 | 49837 | 0.8648          | 0.4383 | 0.1016 |
| 0.0519        | 76.0    | 50502 | 0.8697          | 0.4349 | 0.0999 |
| 0.0495        | 76.9992 | 51166 | 0.8764          | 0.4299 | 0.1002 |
| 0.05          | 78.0    | 51831 | 0.8644          | 0.4338 | 0.0996 |
| 0.0477        | 78.9992 | 52495 | 0.8854          | 0.4369 | 0.1008 |
| 0.0471        | 80.0    | 53160 | 0.8584          | 0.4282 | 0.1000 |
| 0.0459        | 80.9992 | 53824 | 0.8885          | 0.4306 | 0.0997 |
| 0.045         | 82.0    | 54489 | 0.8808          | 0.4255 | 0.0996 |
| 0.0446        | 82.9992 | 55153 | 0.8665          | 0.4302 | 0.0997 |
| 0.0433        | 84.0    | 55818 | 0.8970          | 0.4293 | 0.0993 |
| 0.0425        | 84.9992 | 56482 | 0.8902          | 0.4246 | 0.0994 |
| 0.0416        | 86.0    | 57147 | 0.9117          | 0.4293 | 0.0991 |
| 0.0408        | 86.9992 | 57811 | 0.9017          | 0.4255 | 0.0990 |
| 0.0414        | 88.0    | 58476 | 0.8938          | 0.4244 | 0.0977 |
| 0.0399        | 88.9992 | 59140 | 0.9081          | 0.4252 | 0.0983 |
| 0.0397        | 90.0    | 59805 | 0.9035          | 0.4215 | 0.0978 |
| 0.037         | 90.9992 | 60469 | 0.9233          | 0.4241 | 0.0983 |
| 0.037         | 92.0    | 61134 | 0.9321          | 0.4246 | 0.0981 |
| 0.0371        | 92.9992 | 61798 | 0.9207          | 0.4246 | 0.0982 |
| 0.0361        | 94.0    | 62463 | 0.9219          | 0.4193 | 0.0975 |
| 0.0351        | 94.9992 | 63127 | 0.9208          | 0.4194 | 0.0977 |
| 0.0354        | 96.0    | 63792 | 0.9235          | 0.4226 | 0.0978 |
| 0.0341        | 96.9992 | 64456 | 0.9353          | 0.4189 | 0.0973 |
| 0.0332        | 98.0    | 65121 | 0.9400          | 0.4167 | 0.0971 |
| 0.0341        | 98.9992 | 65785 | 0.9314          | 0.4181 | 0.0972 |
| 0.0342        | 99.9248 | 66400 | 0.9297          | 0.4181 | 0.0971 |
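
The Wer and Cer columns above are word and character error rates. A small sketch of how such numbers can be computed from reference/hypothesis pairs with the jiwer package (the example strings are illustrative, not drawn from the actual evaluation set):

```python
import jiwer

# Illustrative reference/hypothesis pairs; a real evaluation would use
# the decoded transcripts of the held-out set.
references = ["omusajja agenda mu kibuga"]
hypotheses = ["omusajja agenda mukibuga"]

wer = jiwer.wer(references, hypotheses)  # word error rate
cer = jiwer.cer(references, hypotheses)  # character error rate
print(f"WER: {wer:.4f}, CER: {cer:.4f}")
```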

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3