facebook mms-1b-all Afrikaans - Beijuka Bruno

This model is a fine-tuned version of facebook/mms-1b-all on the NCHLT_speech_corpus/Afrikaans dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6359
  • Model Preparation Time: 0.0118
  • Wer: 0.6849
  • Cer: 0.1502
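
Wer (word error rate) and Cer (character error rate) are normalized edit distances: the number of substitutions, insertions, and deletions needed to turn a hypothesis into the reference, divided by the reference length in words or characters. Below is a minimal sketch of how such scores are computed with the Hugging Face evaluate library; the example strings are illustrative, not drawn from the corpus.

```python
# Illustrative WER/CER computation with made-up Afrikaans strings,
# not actual model output. Requires: pip install evaluate jiwer
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["die kat sit op die mat"]
predictions = ["die kat sit op mat"]  # one word dropped

# One deletion over six reference words -> WER ≈ 0.167
print(wer_metric.compute(predictions=predictions, references=references))
print(cer_metric.compute(predictions=predictions, references=references))
```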

Model description

More information needed

Intended uses & limitations

More information needed
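
As a placeholder until the card is filled in, here is a hedged usage sketch: the checkpoint is a CTC acoustic model, so it can be loaded with Wav2Vec2ForCTC and decoded greedily. The file sample.wav is a stand-in for any 16 kHz mono Afrikaans recording, which is the input format MMS models expect.

```python
# Minimal transcription sketch; sample.wav is a hypothetical input file.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "asr-africa/mms-1B_all_NCHLT_AFRIKAANS_1hr_v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

speech, _ = librosa.load("sample.wav", sr=16_000)  # resample to 16 kHz mono
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

ids = torch.argmax(logits, dim=-1)      # greedy CTC decoding
print(processor.batch_decode(ids)[0])
```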

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100
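
For reference, these values map onto Hugging Face TrainingArguments roughly as follows. This is a sketch, not the authors' training script; output_dir is a placeholder, and any setting not listed above is left at its default.

```python
# Sketch of TrainingArguments matching the hyperparameters above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-all-afrikaans",  # placeholder path
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,      # 4 * 8 = effective batch size 32
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_steps=100,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```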

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 130.0625 | 1.0 | 39 | 11.2286 | 0.0118 | 6.1049 | 1.7167 |
| 54.5199 | 2.0 | 78 | 3.0118 | 0.0118 | 1.0 | 0.9810 |
| 12.1633 | 3.0 | 117 | 0.3363 | 0.0118 | 0.4967 | 0.0774 |
| 3.4303 | 4.0 | 156 | 0.2186 | 0.0118 | 0.3785 | 0.0544 |
| 2.6944 | 5.0 | 195 | 0.1856 | 0.0118 | 0.3473 | 0.0487 |
| 2.4226 | 6.0 | 234 | 0.1697 | 0.0118 | 0.3265 | 0.0450 |
| 2.2073 | 7.0 | 273 | 0.1595 | 0.0118 | 0.3118 | 0.0425 |
| 2.0856 | 8.0 | 312 | 0.1544 | 0.0118 | 0.3059 | 0.0409 |
| 2.0081 | 9.0 | 351 | 0.1517 | 0.0118 | 0.2985 | 0.0403 |
| 1.9099 | 10.0 | 390 | 0.1475 | 0.0118 | 0.2847 | 0.0391 |
| 1.7942 | 11.0 | 429 | 0.1479 | 0.0118 | 0.2834 | 0.0393 |
| 1.8086 | 12.0 | 468 | 0.1497 | 0.0118 | 0.2863 | 0.0393 |
| 1.7132 | 13.0 | 507 | 0.1456 | 0.0118 | 0.2880 | 0.0389 |
| 1.67 | 14.0 | 546 | 0.1420 | 0.0118 | 0.2693 | 0.0373 |
| 1.5467 | 15.0 | 585 | 0.1390 | 0.0118 | 0.2695 | 0.0366 |
| 1.5461 | 16.0 | 624 | 0.1377 | 0.0118 | 0.2657 | 0.0358 |
| 1.5588 | 17.0 | 663 | 0.1397 | 0.0118 | 0.2622 | 0.0355 |
| 1.5061 | 18.0 | 702 | 0.1344 | 0.0118 | 0.2552 | 0.0352 |
| 1.4278 | 19.0 | 741 | 0.1330 | 0.0118 | 0.2584 | 0.0350 |
| 1.3602 | 20.0 | 780 | 0.1315 | 0.0118 | 0.2459 | 0.0333 |
| 1.3764 | 21.0 | 819 | 0.1304 | 0.0118 | 0.2476 | 0.0329 |
| 1.3472 | 22.0 | 858 | 0.1353 | 0.0118 | 0.2638 | 0.0347 |
| 1.3776 | 23.0 | 897 | 0.1343 | 0.0118 | 0.2546 | 0.0335 |
| 1.3292 | 24.0 | 936 | 0.1298 | 0.0118 | 0.2411 | 0.0325 |
| 1.234 | 25.0 | 975 | 0.1316 | 0.0118 | 0.2503 | 0.0345 |
| 1.2623 | 26.0 | 1014 | 0.1286 | 0.0118 | 0.2413 | 0.0326 |
| 1.2282 | 27.0 | 1053 | 0.1261 | 0.0118 | 0.2354 | 0.0320 |
| 1.2085 | 28.0 | 1092 | 0.1256 | 0.0118 | 0.2362 | 0.0320 |
| 1.1564 | 29.0 | 1131 | 0.1269 | 0.0118 | 0.2310 | 0.0318 |
| 1.147 | 30.0 | 1170 | 0.1305 | 0.0118 | 0.2289 | 0.0315 |
| 1.1448 | 31.0 | 1209 | 0.1271 | 0.0118 | 0.2270 | 0.0310 |
| 1.1173 | 32.0 | 1248 | 0.1262 | 0.0118 | 0.2367 | 0.0314 |
| 1.095 | 33.0 | 1287 | 0.1270 | 0.0118 | 0.2302 | 0.0312 |
| 1.143 | 34.0 | 1326 | 0.1292 | 0.0118 | 0.2248 | 0.0315 |
| 1.0139 | 35.0 | 1365 | 0.1300 | 0.0118 | 0.2259 | 0.0316 |
| 1.0125 | 36.0 | 1404 | 0.1287 | 0.0118 | 0.2229 | 0.0311 |
| 1.0034 | 37.0 | 1443 | 0.1265 | 0.0118 | 0.2194 | 0.0302 |
| 1.0737 | 38.0 | 1482 | 0.1274 | 0.0118 | 0.2253 | 0.0307 |
| 1.0448 | 39.0 | 1521 | 0.1259 | 0.0118 | 0.2248 | 0.0304 |
| 0.9256 | 40.0 | 1560 | 0.1261 | 0.0118 | 0.2120 | 0.0293 |
| 0.9715 | 41.0 | 1599 | 0.1278 | 0.0118 | 0.2131 | 0.0295 |
| 1.0504 | 42.0 | 1638 | 0.1261 | 0.0118 | 0.2188 | 0.0299 |
| 0.8878 | 43.0 | 1677 | 0.1269 | 0.0118 | 0.2099 | 0.0296 |
| 0.9358 | 44.0 | 1716 | 0.1275 | 0.0118 | 0.2101 | 0.0298 |
| 0.9141 | 45.0 | 1755 | 0.1278 | 0.0118 | 0.2367 | 0.0312 |
| 0.9163 | 46.0 | 1794 | 0.1234 | 0.0118 | 0.2183 | 0.0300 |
| 0.9371 | 47.0 | 1833 | 0.1274 | 0.0118 | 0.2101 | 0.0296 |
| 0.8757 | 48.0 | 1872 | 0.1288 | 0.0118 | 0.2061 | 0.0287 |
| 0.8129 | 49.0 | 1911 | 0.1279 | 0.0118 | 0.2120 | 0.0293 |
| 0.9089 | 50.0 | 1950 | 0.1265 | 0.0118 | 0.2050 | 0.0288 |
| 0.8816 | 51.0 | 1989 | 0.1288 | 0.0118 | 0.2270 | 0.0307 |
| 0.8614 | 52.0 | 2028 | 0.1288 | 0.0118 | 0.2194 | 0.0306 |
| 0.8606 | 53.0 | 2067 | 0.1261 | 0.0118 | 0.2223 | 0.0296 |
| 0.8022 | 54.0 | 2106 | 0.1266 | 0.0118 | 0.2007 | 0.0281 |
| 0.7914 | 55.0 | 2145 | 0.1272 | 0.0118 | 0.2055 | 0.0284 |
| 0.7952 | 56.0 | 2184 | 0.1251 | 0.0118 | 0.2096 | 0.0289 |
| 0.7639 | 57.0 | 2223 | 0.1255 | 0.0118 | 0.2034 | 0.0282 |
| 0.831 | 58.0 | 2262 | 0.1255 | 0.0118 | 0.2025 | 0.0284 |
| 0.8013 | 59.0 | 2301 | 0.1258 | 0.0118 | 0.1982 | 0.0279 |
| 0.8245 | 60.0 | 2340 | 0.1268 | 0.0118 | 0.2137 | 0.0287 |
| 0.76 | 61.0 | 2379 | 0.1256 | 0.0118 | 0.1996 | 0.0273 |
| 0.8253 | 62.0 | 2418 | 0.1278 | 0.0118 | 0.2153 | 0.0281 |
| 0.7767 | 63.0 | 2457 | 0.1257 | 0.0118 | 0.1944 | 0.0269 |
| 0.7583 | 64.0 | 2496 | 0.1264 | 0.0118 | 0.1933 | 0.0274 |
| 0.7451 | 65.0 | 2535 | 0.1279 | 0.0118 | 0.2074 | 0.0280 |
| 0.7453 | 66.0 | 2574 | 0.1290 | 0.0118 | 0.1998 | 0.0278 |
| 0.706 | 67.0 | 2613 | 0.1298 | 0.0118 | 0.2053 | 0.0281 |
| 0.7925 | 68.0 | 2652 | 0.1270 | 0.0118 | 0.2036 | 0.0283 |
| 0.698 | 69.0 | 2691 | 0.1271 | 0.0118 | 0.2007 | 0.0273 |
| 0.699 | 70.0 | 2730 | 0.1286 | 0.0118 | 0.2004 | 0.0269 |
| 0.7131 | 71.0 | 2769 | 0.1272 | 0.0118 | 0.1998 | 0.0271 |
| 0.7582 | 72.0 | 2808 | 0.1251 | 0.0118 | 0.1988 | 0.0272 |
| 0.6473 | 73.0 | 2847 | 0.1259 | 0.0118 | 0.1950 | 0.0271 |
| 0.7017 | 74.0 | 2886 | 0.1261 | 0.0118 | 0.1944 | 0.0272 |
| 0.655 | 75.0 | 2925 | 0.1277 | 0.0118 | 0.1979 | 0.0275 |
| 0.6369 | 76.0 | 2964 | 0.1274 | 0.0118 | 0.1966 | 0.0269 |
| 0.691 | 77.0 | 3003 | 0.1275 | 0.0118 | 0.1979 | 0.0271 |
| 0.7352 | 78.0 | 3042 | 0.1268 | 0.0118 | 0.1952 | 0.0269 |
| 0.6745 | 79.0 | 3081 | 0.1266 | 0.0118 | 0.1917 | 0.0267 |
| 0.6987 | 80.0 | 3120 | 0.1273 | 0.0118 | 0.1917 | 0.0264 |
| 0.6736 | 81.0 | 3159 | 0.1276 | 0.0118 | 0.1925 | 0.0266 |
| 0.7108 | 82.0 | 3198 | 0.1256 | 0.0118 | 0.1914 | 0.0261 |
| 0.6158 | 83.0 | 3237 | 0.1255 | 0.0118 | 0.1925 | 0.0266 |
| 0.6384 | 84.0 | 3276 | 0.1260 | 0.0118 | 0.1939 | 0.0267 |
| 0.6876 | 85.0 | 3315 | 0.1254 | 0.0118 | 0.1950 | 0.0264 |
| 0.658 | 86.0 | 3354 | 0.1252 | 0.0118 | 0.1947 | 0.0267 |
| 0.5993 | 87.0 | 3393 | 0.1250 | 0.0118 | 0.1936 | 0.0263 |
| 0.7441 | 88.0 | 3432 | 0.1248 | 0.0118 | 0.1939 | 0.0265 |
| 0.684 | 89.0 | 3471 | 0.1240 | 0.0118 | 0.1933 | 0.0263 |
| 0.7157 | 90.0 | 3510 | 0.1244 | 0.0118 | 0.1901 | 0.0261 |
| 0.5917 | 91.0 | 3549 | 0.1240 | 0.0118 | 0.1906 | 0.0260 |
| 0.637 | 92.0 | 3588 | 0.1245 | 0.0118 | 0.1920 | 0.0262 |
| 0.6691 | 93.0 | 3627 | 0.1235 | 0.0118 | 0.1912 | 0.0261 |
| 0.6616 | 94.0 | 3666 | 0.1239 | 0.0118 | 0.1895 | 0.0258 |
| 0.6641 | 95.0 | 3705 | 0.1232 | 0.0118 | 0.1890 | 0.0258 |
| 0.638 | 96.0 | 3744 | 0.1234 | 0.0118 | 0.1895 | 0.0257 |
| 0.6668 | 97.0 | 3783 | 0.1232 | 0.0118 | 0.1890 | 0.0257 |
| 0.5726 | 97.4459 | 3800 | 0.1233 | 0.0118 | 0.1895 | 0.0258 |

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.1.0+cu118
  • Datasets 3.2.0
  • Tokenizers 0.21.0

Model size

  • 965M parameters (F32, Safetensors)
