
W2V2-Bert_nchlt_speech_corpus_Fleurs_ZULU_63hr_v1

This model is a fine-tuned version of facebook/w2v-bert-2.0. The training data is not documented in this card, though the model name suggests the NCHLT Speech Corpus and FLEURS Zulu data (roughly 63 hours). It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.2162
  • WER (word error rate): 0.2210
  • CER (character error rate): 0.0466
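
As a minimal inference sketch, assuming the repository ships a CTC head and processor compatible with the Wav2Vec2-BERT classes in Transformers (the audio file name is a placeholder):

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "asr-africa/W2V2-Bert_nchlt_speech_corpus_Fleurs_ZULU_63hr_v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# "zulu_sample.wav" is a placeholder; the model expects 16 kHz mono audio.
speech, _ = librosa.load("zulu_sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: most likely token per frame, collapsed by the tokenizer.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```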

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.01
  • num_epochs: 100
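
A minimal sketch of how these values map onto transformers.TrainingArguments; the output directory is hypothetical, and the model, datasets, and data collator are omitted:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters reported above. "w2v-bert-zulu"
# is a hypothetical output directory.
training_args = TrainingArguments(
    output_dir="w2v-bert-zulu",
    learning_rate=5e-5,
    per_device_train_batch_size=4,   # train_batch_size
    per_device_eval_batch_size=8,    # eval_batch_size
    gradient_accumulation_steps=4,   # 4 x 4 = 16 total (assuming one device)
    seed=42,
    optim="adamw_torch",             # AdamW defaults: betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
)
```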

Training results

| Training Loss | Epoch   | Step   | Validation Loss | WER    | CER    |
|:-------------:|:-------:|:------:|:---------------:|:------:|:------:|
| 0.6638        | 0.9999  | 2754   | 0.4826          | 0.6109 | 0.1148 |
| 0.1709        | 1.9998  | 5508   | 0.3841          | 0.4173 | 0.0855 |
| 0.1297        | 2.9997  | 8262   | 0.4018          | 0.4364 | 0.0840 |
| 0.1064        | 4.0     | 11017  | 0.3691          | 0.3775 | 0.0728 |
| 0.0911        | 4.9999  | 13771  | 0.3308          | 0.3633 | 0.0724 |
| 0.0784        | 5.9998  | 16525  | 0.3812          | 0.4171 | 0.0815 |
| 0.0675        | 6.9997  | 19279  | 0.3797          | 0.3679 | 0.0766 |
| 0.0586        | 8.0     | 22034  | 0.3910          | 0.3906 | 0.0795 |
| 0.0506        | 8.9999  | 24788  | 0.3661          | 0.3666 | 0.0759 |
| 0.0441        | 9.9998  | 27542  | 0.3398          | 0.3642 | 0.0735 |
| 0.0383        | 10.9997 | 30296  | 0.3580          | 0.3572 | 0.0710 |
| 0.033         | 12.0    | 33051  | 0.4075          | 0.3692 | 0.0787 |
| 0.0287        | 12.9999 | 35805  | 0.3768          | 0.3740 | 0.0759 |
| 0.0249        | 13.9998 | 38559  | 0.3911          | 0.3500 | 0.0706 |
| 0.0227        | 14.9997 | 41313  | 0.4081          | 0.3406 | 0.0694 |
| 0.0202        | 16.0    | 44068  | 0.3703          | 0.3465 | 0.0695 |
| 0.0184        | 16.9999 | 46822  | 0.4362          | 0.3548 | 0.0761 |
| 0.0164        | 17.9998 | 49576  | 0.4163          | 0.3554 | 0.0738 |
| 0.0157        | 18.9997 | 52330  | 0.3918          | 0.3495 | 0.0719 |
| 0.0142        | 20.0    | 55085  | 0.4352          | 0.3714 | 0.0744 |
| 0.0129        | 20.9999 | 57839  | 0.3833          | 0.3506 | 0.0726 |
| 0.0121        | 21.9998 | 60593  | 0.4778          | 0.3736 | 0.0812 |
| 0.0113        | 22.9997 | 63347  | 0.3956          | 0.3388 | 0.0696 |
| 0.0101        | 24.0    | 66102  | 0.4480          | 0.3456 | 0.0706 |
| 0.01          | 24.9999 | 68856  | 0.4253          | 0.3572 | 0.0747 |
| 0.0093        | 25.9998 | 71610  | 0.4884          | 0.3506 | 0.0728 |
| 0.0085        | 26.9997 | 74364  | 0.5257          | 0.3720 | 0.0768 |
| 0.0079        | 28.0    | 77119  | 0.4684          | 0.3548 | 0.0731 |
| 0.0076        | 28.9999 | 79873  | 0.4596          | 0.3438 | 0.0730 |
| 0.0069        | 29.9998 | 82627  | 0.4860          | 0.3441 | 0.0734 |
| 0.0067        | 30.9997 | 85381  | 0.5379          | 0.3574 | 0.0756 |
| 0.0064        | 32.0    | 88136  | 0.5630          | 0.3670 | 0.0794 |
| 0.0056        | 32.9999 | 90890  | 0.5131          | 0.3373 | 0.0719 |
| 0.0057        | 33.9998 | 93644  | 0.5058          | 0.3408 | 0.0705 |
| 0.0048        | 34.9997 | 96398  | 0.5383          | 0.3458 | 0.0737 |
| 0.0049        | 36.0    | 99153  | 0.5094          | 0.3364 | 0.0679 |
| 0.0044        | 36.9999 | 101907 | 0.4981          | 0.3303 | 0.0695 |
| 0.0044        | 37.9998 | 104661 | 0.5671          | 0.3517 | 0.0727 |
| 0.0038        | 38.9997 | 107415 | 0.4956          | 0.3349 | 0.0686 |
| 0.0038        | 40.0    | 110170 | 0.5233          | 0.3521 | 0.0742 |
| 0.0035        | 40.9999 | 112924 | 0.5516          | 0.3340 | 0.0720 |
| 0.0035        | 41.9998 | 115678 | 0.5727          | 0.3469 | 0.0775 |
| 0.003         | 42.9997 | 118432 | 0.5540          | 0.3430 | 0.0730 |
| 0.0032        | 44.0    | 121187 | 0.5262          | 0.3327 | 0.0697 |
| 0.0031        | 44.9999 | 123941 | 0.5168          | 0.3145 | 0.0644 |
| 0.0026        | 45.9998 | 126695 | 0.5278          | 0.3235 | 0.0691 |
| 0.0023        | 46.9997 | 129449 | 0.6085          | 0.3346 | 0.0748 |
| 0.0023        | 48.0    | 132204 | 0.6030          | 0.3333 | 0.0713 |
| 0.0022        | 48.9999 | 134958 | 0.5425          | 0.3333 | 0.0693 |
| 0.002         | 49.9998 | 137712 | 0.5830          | 0.3657 | 0.0743 |
| 0.0019        | 50.9997 | 140466 | 0.5350          | 0.3222 | 0.0664 |
| 0.0018        | 52.0    | 143221 | 0.5682          | 0.3274 | 0.0701 |
| 0.0016        | 52.9999 | 145975 | 0.5415          | 0.3285 | 0.0684 |
| 0.0019        | 53.9998 | 148729 | 0.5133          | 0.3301 | 0.0694 |
| 0.0016        | 54.9997 | 151483 | 0.5581          | 0.3298 | 0.0680 |
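
The WER and CER columns are word and character error rates. As a hedged sketch, such metrics can be computed with the Hugging Face evaluate library (which wraps jiwer); the predictions and references below are illustrative dummies:

```python
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Illustrative dummies; in practice these come from model decoding and the
# reference transcripts of the evaluation set.
predictions = ["ngiyabonga kakhulu"]
references = ["ngiyabonga kakhulu"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```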

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3