
facebook/mms-1b-all Afrikaans - Beijuka Bruno

This model is a fine-tuned version of facebook/mms-1b-all on the NCHLT_speech_corpus/Afrikaans dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the metrics):

  • Loss: 0.9067
  • Model Preparation Time: 0.0116
  • WER: 0.7652
  • CER: 0.1872
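
A minimal transcription sketch with Hugging Face Transformers is shown below. It is illustrative only, not the authors' evaluation script: it assumes 16 kHz mono audio, and the path `sample.wav` is a hypothetical placeholder.

```python
# Minimal transcription sketch; assumes 16 kHz mono audio.
# "sample.wav" is a hypothetical path, not part of this repository.
import torch
import torchaudio
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "asr-africa/mms-1B_all_NCHLT_AFRIKAANS_5hr_v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

waveform, sr = torchaudio.load("sample.wav")
if sr != 16_000:  # MMS checkpoints expect 16 kHz input
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding
print(processor.batch_decode(logits.argmax(dim=-1))[0])
```

Note that the repository is gated, so you may need to accept its access conditions and authenticate with the Hugging Face Hub before the checkpoint can be downloaded.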

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows this list):

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100
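
As a rough reconstruction, the values above map onto Transformers' `TrainingArguments` as sketched below. This is not the authors' training script; `output_dir` is a hypothetical placeholder.

```python
# Hedged reconstruction of the hyperparameters listed above as
# transformers.TrainingArguments; not the authors' original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-all-afrikaans",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,      # 4 x 8 = total train batch size 32
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100,
)
```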

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 63.0832 | 1.0 | 188 | 3.3494 | 0.0116 | 1.0 | 0.9088 |
| 16.9756 | 2.0 | 376 | 0.1944 | 0.0116 | 0.3835 | 0.0495 |
| 2.4562 | 3.0 | 564 | 0.1394 | 0.0116 | 0.2949 | 0.0362 |
| 2.0978 | 4.0 | 752 | 0.1276 | 0.0116 | 0.2703 | 0.0332 |
| 1.9554 | 5.0 | 940 | 0.1213 | 0.0116 | 0.2540 | 0.0309 |
| 1.7927 | 6.0 | 1128 | 0.1096 | 0.0116 | 0.2319 | 0.0275 |
| 1.6971 | 7.0 | 1316 | 0.1029 | 0.0116 | 0.2229 | 0.0265 |
| 1.6156 | 8.0 | 1504 | 0.0998 | 0.0116 | 0.2088 | 0.0249 |
| 1.5138 | 9.0 | 1692 | 0.0976 | 0.0116 | 0.1998 | 0.0237 |
| 1.4701 | 10.0 | 1880 | 0.0938 | 0.0116 | 0.1843 | 0.0229 |
| 1.41 | 11.0 | 2068 | 0.0887 | 0.0116 | 0.1854 | 0.0229 |
| 1.3588 | 12.0 | 2256 | 0.0844 | 0.0116 | 0.1727 | 0.0215 |
| 1.262 | 13.0 | 2444 | 0.0853 | 0.0116 | 0.1656 | 0.0208 |
| 1.2938 | 14.0 | 2632 | 0.0789 | 0.0116 | 0.1504 | 0.0194 |
| 1.2513 | 15.0 | 2820 | 0.0767 | 0.0116 | 0.1543 | 0.0192 |
| 1.1791 | 16.0 | 3008 | 0.0757 | 0.0116 | 0.1493 | 0.0186 |
| 1.1515 | 17.0 | 3196 | 0.0751 | 0.0116 | 0.1450 | 0.0184 |
| 1.1314 | 18.0 | 3384 | 0.0723 | 0.0116 | 0.1315 | 0.0171 |
| 1.0842 | 19.0 | 3572 | 0.0721 | 0.0116 | 0.1267 | 0.0170 |
| 1.0916 | 20.0 | 3760 | 0.0696 | 0.0116 | 0.1239 | 0.0163 |
| 1.0455 | 21.0 | 3948 | 0.0677 | 0.0116 | 0.1185 | 0.0156 |
| 1.0347 | 22.0 | 4136 | 0.0657 | 0.0116 | 0.1126 | 0.0151 |
| 0.9926 | 23.0 | 4324 | 0.0668 | 0.0116 | 0.1165 | 0.0153 |
| 0.9646 | 24.0 | 4512 | 0.0690 | 0.0116 | 0.1253 | 0.0162 |
| 0.9332 | 25.0 | 4700 | 0.0676 | 0.0116 | 0.1067 | 0.0149 |
| 0.8994 | 26.0 | 4888 | 0.0671 | 0.0116 | 0.1109 | 0.0147 |
| 0.8807 | 27.0 | 5076 | 0.0670 | 0.0116 | 0.1061 | 0.0142 |
| 0.8967 | 28.0 | 5264 | 0.0648 | 0.0116 | 0.0982 | 0.0135 |
| 0.8737 | 29.0 | 5452 | 0.0679 | 0.0116 | 0.1084 | 0.0149 |
| 0.8672 | 30.0 | 5640 | 0.0635 | 0.0116 | 0.0990 | 0.0136 |
| 0.8384 | 31.0 | 5828 | 0.0617 | 0.0116 | 0.0971 | 0.0136 |
| 0.8448 | 32.0 | 6016 | 0.0624 | 0.0116 | 0.0897 | 0.0131 |
| 0.8264 | 33.0 | 6204 | 0.0651 | 0.0116 | 0.0993 | 0.0135 |
| 0.8036 | 34.0 | 6392 | 0.0628 | 0.0116 | 0.0942 | 0.0127 |
| 0.7973 | 35.0 | 6580 | 0.0598 | 0.0116 | 0.0878 | 0.0127 |
| 0.7788 | 36.0 | 6768 | 0.0617 | 0.0116 | 0.0883 | 0.0124 |
| 0.7736 | 37.0 | 6956 | 0.0603 | 0.0116 | 0.0863 | 0.0120 |
| 0.7692 | 38.0 | 7144 | 0.0608 | 0.0116 | 0.0934 | 0.0126 |
| 0.766 | 39.0 | 7332 | 0.0598 | 0.0116 | 0.0886 | 0.0122 |
| 0.7546 | 40.0 | 7520 | 0.0620 | 0.0116 | 0.0835 | 0.0120 |
| 0.7156 | 41.0 | 7708 | 0.0620 | 0.0116 | 0.0920 | 0.0126 |
| 0.7393 | 42.0 | 7896 | 0.0608 | 0.0116 | 0.0835 | 0.0119 |
| 0.6706 | 43.0 | 8084 | 0.0632 | 0.0116 | 0.0841 | 0.0120 |
| 0.6853 | 44.0 | 8272 | 0.0622 | 0.0116 | 0.0861 | 0.0116 |
| 0.6753 | 45.0 | 8460 | 0.0619 | 0.0116 | 0.0900 | 0.0124 |
| 0.6779 | 46.0 | 8648 | 0.0612 | 0.0116 | 0.0824 | 0.0117 |
| 0.6508 | 47.0 | 8836 | 0.0617 | 0.0116 | 0.0838 | 0.0116 |
| 0.6847 | 48.0 | 9024 | 0.0627 | 0.0116 | 0.0849 | 0.0121 |
| 0.6454 | 49.0 | 9212 | 0.0614 | 0.0116 | 0.0776 | 0.0113 |
| 0.621 | 50.0 | 9400 | 0.0614 | 0.0116 | 0.0821 | 0.0114 |
| 0.6292 | 51.0 | 9588 | 0.0614 | 0.0116 | 0.0852 | 0.0118 |
| 0.6046 | 52.0 | 9776 | 0.0598 | 0.0116 | 0.0745 | 0.0108 |
| 0.6067 | 53.0 | 9964 | 0.0598 | 0.0116 | 0.0725 | 0.0106 |
| 0.5985 | 54.0 | 10152 | 0.0621 | 0.0116 | 0.0776 | 0.0110 |
| 0.5966 | 55.0 | 10340 | 0.0630 | 0.0116 | 0.0711 | 0.0102 |
| 0.5716 | 56.0 | 10528 | 0.0599 | 0.0116 | 0.0711 | 0.0105 |
| 0.6006 | 57.0 | 10716 | 0.0624 | 0.0116 | 0.0711 | 0.0107 |
| 0.5841 | 58.0 | 10904 | 0.0620 | 0.0116 | 0.0703 | 0.0105 |
| 0.5744 | 59.0 | 11092 | 0.0617 | 0.0116 | 0.0691 | 0.0104 |
| 0.5675 | 60.0 | 11280 | 0.0607 | 0.0116 | 0.0697 | 0.0103 |
| 0.5684 | 61.0 | 11468 | 0.0619 | 0.0116 | 0.0705 | 0.0104 |
| 0.5112 | 62.0 | 11656 | 0.0649 | 0.0116 | 0.0720 | 0.0103 |
| 0.545 | 63.0 | 11844 | 0.0603 | 0.0116 | 0.0674 | 0.0102 |
| 0.52 | 64.0 | 12032 | 0.0624 | 0.0116 | 0.0674 | 0.0098 |
| 0.5356 | 65.0 | 12220 | 0.0600 | 0.0116 | 0.0669 | 0.0100 |
| 0.5584 | 66.0 | 12408 | 0.0595 | 0.0116 | 0.0688 | 0.0103 |
| 0.5113 | 67.0 | 12596 | 0.0615 | 0.0116 | 0.0725 | 0.0104 |
| 0.5201 | 68.0 | 12784 | 0.0618 | 0.0116 | 0.0663 | 0.0100 |
| 0.5409 | 69.0 | 12972 | 0.0613 | 0.0116 | 0.0691 | 0.0100 |
| 0.5275 | 70.0 | 13160 | 0.0613 | 0.0116 | 0.0672 | 0.0100 |
| 0.5243 | 71.0 | 13348 | 0.0618 | 0.0116 | 0.0660 | 0.0098 |
| 0.5296 | 72.0 | 13536 | 0.0604 | 0.0116 | 0.0646 | 0.0098 |
| 0.5201 | 73.0 | 13724 | 0.0611 | 0.0116 | 0.0677 | 0.0098 |
| 0.5056 | 74.0 | 13912 | 0.0619 | 0.0116 | 0.0635 | 0.0094 |
| 0.4908 | 75.0 | 14100 | 0.0606 | 0.0116 | 0.0638 | 0.0096 |
| 0.4855 | 76.0 | 14288 | 0.0627 | 0.0116 | 0.0609 | 0.0093 |
| 0.4869 | 77.0 | 14476 | 0.0629 | 0.0116 | 0.0632 | 0.0095 |
| 0.5004 | 78.0 | 14664 | 0.0613 | 0.0116 | 0.0626 | 0.0096 |
| 0.5174 | 79.0 | 14852 | 0.0610 | 0.0116 | 0.0643 | 0.0094 |
| 0.4838 | 80.0 | 15040 | 0.0616 | 0.0116 | 0.0638 | 0.0096 |
| 0.4717 | 81.0 | 15228 | 0.0621 | 0.0116 | 0.0629 | 0.0096 |
| 0.4783 | 82.0 | 15416 | 0.0635 | 0.0116 | 0.0649 | 0.0095 |
| 0.4708 | 83.0 | 15604 | 0.0617 | 0.0116 | 0.0621 | 0.0096 |
| 0.479 | 84.0 | 15792 | 0.0620 | 0.0116 | 0.0618 | 0.0093 |
| 0.4879 | 85.0 | 15980 | 0.0624 | 0.0116 | 0.0609 | 0.0093 |
| 0.4643 | 86.0 | 16168 | 0.0610 | 0.0116 | 0.0590 | 0.0090 |
| 0.4786 | 87.0 | 16356 | 0.0602 | 0.0116 | 0.0601 | 0.0091 |
| 0.4807 | 88.0 | 16544 | 0.0604 | 0.0116 | 0.0609 | 0.0092 |
| 0.4657 | 89.0 | 16732 | 0.0603 | 0.0116 | 0.0618 | 0.0090 |
| 0.4612 | 90.0 | 16920 | 0.0596 | 0.0116 | 0.0618 | 0.0089 |
| 0.4465 | 91.0 | 17108 | 0.0609 | 0.0116 | 0.0615 | 0.0089 |
| 0.4387 | 92.0 | 17296 | 0.0614 | 0.0116 | 0.0593 | 0.0088 |
| 0.4587 | 93.0 | 17484 | 0.0606 | 0.0116 | 0.0587 | 0.0087 |
| 0.4578 | 94.0 | 17672 | 0.0603 | 0.0116 | 0.0573 | 0.0086 |
| 0.4605 | 95.0 | 17860 | 0.0607 | 0.0116 | 0.0615 | 0.0088 |
| 0.4698 | 96.0 | 18048 | 0.0603 | 0.0116 | 0.0590 | 0.0086 |
| 0.4472 | 97.0 | 18236 | 0.0603 | 0.0116 | 0.0607 | 0.0088 |
| 0.4374 | 98.0 | 18424 | 0.0606 | 0.0116 | 0.0607 | 0.0089 |
| 0.4434 | 99.0 | 18612 | 0.0605 | 0.0116 | 0.0590 | 0.0086 |
| 0.4094 | 99.4700 | 18700 | 0.0607 | 0.0116 | 0.0595 | 0.0088 |
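
The WER and CER columns above can be computed with the Hugging Face `evaluate` library, as in the sketch below; the reference and prediction strings are hypothetical examples, not corpus data.

```python
# Computing WER and CER with the Hugging Face `evaluate` library.
# The strings below are hypothetical examples, not corpus data.
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

references = ["dit is 'n toets"]   # ground-truth transcript
predictions = ["dit is n toets"]   # model output

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```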

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.1.0+cu118
  • Datasets 3.2.0
  • Tokenizers 0.21.0
Model size

  • 965M parameters (Safetensors, F32 tensors)

Model tree

This model (asr-africa/mms-1B_all_NCHLT_AFRIKAANS_5hr_v1) is fine-tuned from facebook/mms-1b-all.