
wav2vec2-xls-r-300m-lg-CV-Fleurs-10hrs-v10

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m; per the model name, it was trained on a 10-hour Luganda (lg) dataset drawn from Common Voice and FLEURS. It achieves the following results on the evaluation set:

  • Loss: 1.1102
  • WER: 0.4982
  • CER: 0.1179
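The card ships without usage code. Below is a minimal, hedged inference sketch using the standard transformers CTC API; the audio path is a placeholder, and 16 kHz mono input is assumed (the usual requirement for wav2vec2 checkpoints).

```python
# Minimal inference sketch: load the checkpoint and greedily decode one file.
# Assumptions: 16 kHz mono audio; "sample.wav" is a placeholder path.
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-10hrs-v10"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load and resample the audio to the 16 kHz rate wav2vec2 expects.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```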

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
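For readers who want to reproduce the run, the list above maps onto transformers TrainingArguments roughly as sketched below. This is a hedged reconstruction, not the original training script; output_dir is an assumption, and fp16=True stands in for "Native AMP".

```python
# Hedged reconstruction of the hyperparameters above as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-lg-CV-Fleurs-10hrs-v10",  # assumed name
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```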

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 3.5679 | 1.0 | 323 | 2.9514 | 1.0 | 1.0 |
| 2.5224 | 2.0 | 646 | 1.7053 | 0.9999 | 0.4763 |
| 1.4334 | 3.0 | 969 | 1.1452 | 0.9583 | 0.3166 |
| 1.0482 | 4.0 | 1292 | 0.9311 | 0.9226 | 0.2494 |
| 0.8469 | 5.0 | 1615 | 0.8079 | 0.8120 | 0.2106 |
| 0.7281 | 6.0 | 1938 | 0.7611 | 0.7556 | 0.1940 |
| 0.6241 | 7.0 | 2261 | 0.7285 | 0.7212 | 0.1837 |
| 0.5515 | 8.0 | 2584 | 0.7075 | 0.6836 | 0.1712 |
| 0.5013 | 9.0 | 2907 | 0.6871 | 0.6607 | 0.1663 |
| 0.4455 | 10.0 | 3230 | 0.6718 | 0.6645 | 0.1683 |
| 0.402 | 11.0 | 3553 | 0.6832 | 0.6510 | 0.1664 |
| 0.3728 | 12.0 | 3876 | 0.7118 | 0.6209 | 0.1555 |
| 0.343 | 13.0 | 4199 | 0.6664 | 0.6202 | 0.1539 |
| 0.32 | 14.0 | 4522 | 0.7267 | 0.6122 | 0.1532 |
| 0.2951 | 15.0 | 4845 | 0.7404 | 0.5956 | 0.1503 |
| 0.2775 | 16.0 | 5168 | 0.7633 | 0.6080 | 0.1498 |
| 0.2615 | 17.0 | 5491 | 0.7444 | 0.5962 | 0.1496 |
| 0.244 | 18.0 | 5814 | 0.7332 | 0.6077 | 0.1493 |
| 0.2299 | 19.0 | 6137 | 0.7597 | 0.5839 | 0.1443 |
| 0.2179 | 20.0 | 6460 | 0.7834 | 0.5972 | 0.1482 |
| 0.2108 | 21.0 | 6783 | 0.7985 | 0.5922 | 0.1502 |
| 0.2008 | 22.0 | 7106 | 0.8097 | 0.5736 | 0.1414 |
| 0.1957 | 23.0 | 7429 | 0.8080 | 0.5935 | 0.1462 |
| 0.1883 | 24.0 | 7752 | 0.8837 | 0.5755 | 0.1434 |
| 0.1812 | 25.0 | 8075 | 0.8384 | 0.5750 | 0.1398 |
| 0.173 | 26.0 | 8398 | 0.8560 | 0.5639 | 0.1378 |
| 0.1667 | 27.0 | 8721 | 0.8805 | 0.5814 | 0.1442 |
| 0.1591 | 28.0 | 9044 | 0.8186 | 0.5746 | 0.1416 |
| 0.1549 | 29.0 | 9367 | 0.8560 | 0.5685 | 0.1412 |
| 0.1501 | 30.0 | 9690 | 0.9306 | 0.5635 | 0.1392 |
| 0.1471 | 31.0 | 10013 | 0.8634 | 0.5599 | 0.1378 |
| 0.146 | 32.0 | 10336 | 0.8246 | 0.5636 | 0.1380 |
| 0.1395 | 33.0 | 10659 | 0.8922 | 0.5602 | 0.1374 |
| 0.1344 | 34.0 | 10982 | 0.8653 | 0.5624 | 0.1375 |
| 0.1325 | 35.0 | 11305 | 0.9303 | 0.5558 | 0.1377 |
| 0.1295 | 36.0 | 11628 | 0.9432 | 0.5611 | 0.1382 |
| 0.1285 | 37.0 | 11951 | 0.8843 | 0.5589 | 0.1372 |
| 0.1247 | 38.0 | 12274 | 0.9276 | 0.5576 | 0.1353 |
| 0.1219 | 39.0 | 12597 | 0.9278 | 0.5604 | 0.1389 |
| 0.1188 | 40.0 | 12920 | 0.9765 | 0.5487 | 0.1349 |
| 0.1169 | 41.0 | 13243 | 0.9452 | 0.5519 | 0.1331 |
| 0.1111 | 42.0 | 13566 | 0.9585 | 0.5444 | 0.1320 |
| 0.1115 | 43.0 | 13889 | 0.9163 | 0.5455 | 0.1329 |
| 0.1073 | 44.0 | 14212 | 0.9804 | 0.5525 | 0.1334 |
| 0.1039 | 45.0 | 14535 | 0.9906 | 0.5379 | 0.1299 |
| 0.104 | 46.0 | 14858 | 0.9769 | 0.5407 | 0.1312 |
| 0.1024 | 47.0 | 15181 | 0.9502 | 0.5441 | 0.1328 |
| 0.1003 | 48.0 | 15504 | 0.9524 | 0.5397 | 0.1310 |
| 0.0993 | 49.0 | 15827 | 0.9993 | 0.5400 | 0.1318 |
| 0.0941 | 50.0 | 16150 | 0.9701 | 0.5401 | 0.1306 |
| 0.0939 | 51.0 | 16473 | 1.0024 | 0.5431 | 0.1319 |
| 0.0913 | 52.0 | 16796 | 0.9767 | 0.5284 | 0.1281 |
| 0.0914 | 53.0 | 17119 | 0.9617 | 0.5415 | 0.1328 |
| 0.0889 | 54.0 | 17442 | 0.9570 | 0.5445 | 0.1312 |
| 0.0832 | 55.0 | 17765 | 1.0253 | 0.5332 | 0.1299 |
| 0.0856 | 56.0 | 18088 | 1.0115 | 0.5333 | 0.1298 |
| 0.0833 | 57.0 | 18411 | 1.0273 | 0.5381 | 0.1291 |
| 0.0845 | 58.0 | 18734 | 1.0109 | 0.5344 | 0.1297 |
| 0.0814 | 59.0 | 19057 | 0.9471 | 0.5290 | 0.1278 |
| 0.0786 | 60.0 | 19380 | 1.0011 | 0.5271 | 0.1282 |
| 0.0776 | 61.0 | 19703 | 0.9817 | 0.5217 | 0.1255 |
| 0.0763 | 62.0 | 20026 | 0.9942 | 0.5214 | 0.1260 |
| 0.0724 | 63.0 | 20349 | 1.0208 | 0.5252 | 0.1262 |
| 0.0704 | 64.0 | 20672 | 1.0465 | 0.5195 | 0.1242 |
| 0.0707 | 65.0 | 20995 | 1.0340 | 0.5197 | 0.1259 |
| 0.0722 | 66.0 | 21318 | 1.0156 | 0.5212 | 0.1254 |
| 0.0703 | 67.0 | 21641 | 1.0830 | 0.5181 | 0.1257 |
| 0.067 | 68.0 | 21964 | 1.0453 | 0.5195 | 0.1248 |
| 0.0657 | 69.0 | 22287 | 1.0392 | 0.5207 | 0.1249 |
| 0.062 | 70.0 | 22610 | 1.0112 | 0.5139 | 0.1234 |
| 0.0626 | 71.0 | 22933 | 1.0672 | 0.5164 | 0.1243 |
| 0.0598 | 72.0 | 23256 | 1.0807 | 0.5132 | 0.1244 |
| 0.0599 | 73.0 | 23579 | 1.0566 | 0.5103 | 0.1228 |
| 0.0595 | 74.0 | 23902 | 1.0711 | 0.5096 | 0.1222 |
| 0.058 | 75.0 | 24225 | 1.0465 | 0.5085 | 0.1223 |
| 0.0558 | 76.0 | 24548 | 1.0750 | 0.5073 | 0.1228 |
| 0.0586 | 77.0 | 24871 | 1.0189 | 0.5048 | 0.1211 |
| 0.0553 | 78.0 | 25194 | 1.0529 | 0.5080 | 0.1223 |
| 0.0549 | 79.0 | 25517 | 1.0568 | 0.5137 | 0.1238 |
| 0.056 | 80.0 | 25840 | 1.0637 | 0.5087 | 0.1225 |
| 0.054 | 81.0 | 26163 | 1.0778 | 0.5098 | 0.1219 |
| 0.0509 | 82.0 | 26486 | 1.0557 | 0.5087 | 0.1206 |
| 0.0511 | 83.0 | 26809 | 1.0986 | 0.5011 | 0.1198 |
| 0.0516 | 84.0 | 27132 | 1.0862 | 0.5045 | 0.1201 |
| 0.0481 | 85.0 | 27455 | 1.1018 | 0.5008 | 0.1196 |
| 0.0469 | 86.0 | 27778 | 1.0797 | 0.5022 | 0.1196 |
| 0.0456 | 87.0 | 28101 | 1.0978 | 0.5006 | 0.1190 |
| 0.0456 | 88.0 | 28424 | 1.0978 | 0.5027 | 0.1200 |
| 0.0463 | 89.0 | 28747 | 1.0884 | 0.4985 | 0.1190 |
| 0.0444 | 90.0 | 29070 | 1.0855 | 0.5039 | 0.1198 |
| 0.0431 | 91.0 | 29393 | 1.1051 | 0.5006 | 0.1193 |
| 0.0438 | 92.0 | 29716 | 1.0819 | 0.5010 | 0.1187 |
| 0.0421 | 93.0 | 30039 | 1.0824 | 0.5014 | 0.1192 |
| 0.043 | 94.0 | 30362 | 1.0961 | 0.4978 | 0.1184 |
| 0.0409 | 95.0 | 30685 | 1.1035 | 0.4997 | 0.1182 |
| 0.0403 | 96.0 | 31008 | 1.1039 | 0.5005 | 0.1182 |
| 0.0393 | 97.0 | 31331 | 1.1045 | 0.4997 | 0.1181 |
| 0.0411 | 98.0 | 31654 | 1.1138 | 0.4985 | 0.1181 |
| 0.0398 | 99.0 | 31977 | 1.1106 | 0.4994 | 0.1182 |
| 0.0403 | 100.0 | 32300 | 1.1102 | 0.4982 | 0.1179 |
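WER and CER in the table above are standard edit-distance metrics over words and characters, respectively. A minimal sketch of computing them with the Hugging Face evaluate library (the example strings are illustrative, not from the evaluation set):

```python
# Computing WER and CER with the `evaluate` library (requires `jiwer`).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Illustrative strings only; real scores compare model transcripts to references.
predictions = ["webale nyo"]
references = ["webale nnyo"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```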

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
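Results from this recipe can shift across library versions, so a quick sanity check against the pinned versions above may help; a minimal sketch:

```python
# Verify the local environment matches the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.46.3",
    "torch": "2.1.0+cu118",
    "datasets": "3.1.0",
    "tokenizers": "0.20.3",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    note = "OK" if installed[name] == want else f"differs: {installed[name]}"
    print(f"{name}: expected {want} -> {note}")
```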