
whisper-small-CV-Fleurs-lg-100hrs-v1

This model is a fine-tuned version of openai/whisper-small. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

  • Loss: 1.0078
  • Wer: 0.3224
  • Cer: 0.0836
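
The checkpoint can be loaded with the Transformers ASR pipeline using the repository id shown on this page. The snippet below is a minimal sketch, not part of the original card; the audio file path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; repo id taken from this model card.
asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/whisper-small-CV-Fleurs-lg-100hrs-v1",
)

# "audio.wav" is a placeholder path to a speech recording.
result = asr("audio.wav")
print(result["text"])
```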

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reconstructed configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_hf with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
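
The list above maps onto a Seq2SeqTrainingArguments configuration roughly as follows. This is a hedged reconstruction rather than the original training script: output_dir and any options not listed above are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Hedged reconstruction of the listed hyperparameters; output_dir and any
# unlisted options are assumptions, not taken from the original training script.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-CV-Fleurs-lg-100hrs-v1",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,   # effective train batch size: 4 * 2 = 8
    seed=42,
    optim="adamw_hf",                # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```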

Training results

| Training Loss | Epoch   | Step   | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:------:|:---------------:|:------:|:------:|
| 1.5058        | 0.9999  | 9196   | 0.7558          | 0.8238 | 0.2941 |
| 0.506         | 2.0     | 18393  | 0.5478          | 0.8525 | 0.3442 |
| 0.3425        | 2.9999  | 27589  | 0.4752          | 0.8348 | 0.3556 |
| 0.2494        | 4.0     | 36786  | 0.4521          | 0.8533 | 0.4533 |
| 0.1797        | 4.9999  | 45982  | 0.4541          | 0.5214 | 0.1893 |
| 0.1245        | 6.0     | 55179  | 0.4823          | 0.4265 | 0.1191 |
| 0.0847        | 6.9999  | 64375  | 0.4999          | 0.4024 | 0.1176 |
| 0.0602        | 8.0     | 73572  | 0.5410          | 0.4060 | 0.1113 |
| 0.0476        | 8.9999  | 82768  | 0.5580          | 0.3866 | 0.0969 |
| 0.0423        | 10.0    | 91965  | 0.5833          | 0.3644 | 0.0919 |
| 0.037         | 10.9999 | 101161 | 0.5972          | 0.3721 | 0.1008 |
| 0.0292        | 12.0    | 110358 | 0.6086          | 0.3502 | 0.0933 |
| 0.0237        | 12.9999 | 119554 | 0.6345          | 0.3637 | 0.0949 |
| 0.02          | 14.0    | 128751 | 0.6531          | 0.3577 | 0.0891 |
| 0.0171        | 14.9999 | 137947 | 0.6821          | 0.3470 | 0.0873 |
| 0.0149        | 16.0    | 147144 | 0.6936          | 0.3549 | 0.0882 |
| 0.0128        | 16.9999 | 156340 | 0.6914          | 0.3419 | 0.0849 |
| 0.0111        | 18.0    | 165537 | 0.7349          | 0.3468 | 0.0924 |
| 0.0102        | 18.9999 | 174733 | 0.7507          | 0.3427 | 0.0874 |
| 0.0092        | 20.0    | 183930 | 0.7615          | 0.3390 | 0.0876 |
| 0.0084        | 20.9999 | 193126 | 0.7623          | 0.3422 | 0.0878 |
| 0.0071        | 22.0    | 202323 | 0.7776          | 0.3391 | 0.0860 |
| 0.0071        | 22.9999 | 211519 | 0.7843          | 0.3401 | 0.0874 |
| 0.0065        | 24.0    | 220716 | 0.8050          | 0.3309 | 0.0841 |
| 0.0059        | 24.9999 | 229912 | 0.7960          | 0.3362 | 0.0883 |
| 0.0056        | 26.0    | 239109 | 0.8172          | 0.3377 | 0.0867 |
| 0.0052        | 26.9999 | 248305 | 0.8298          | 0.3303 | 0.0843 |
| 0.0047        | 28.0    | 257502 | 0.8464          | 0.3402 | 0.0884 |
| 0.0044        | 28.9999 | 266698 | 0.8559          | 0.3341 | 0.0861 |
| 0.0043        | 30.0    | 275895 | 0.8570          | 0.3407 | 0.0886 |
| 0.0039        | 30.9999 | 285091 | 0.8790          | 0.3363 | 0.0869 |
| 0.0037        | 32.0    | 294288 | 0.8917          | 0.3328 | 0.0882 |
| 0.0034        | 32.9999 | 303484 | 0.8798          | 0.3279 | 0.0867 |
| 0.0033        | 34.0    | 312681 | 0.8820          | 0.3346 | 0.0868 |
| 0.0031        | 34.9999 | 321877 | 0.9076          | 0.3316 | 0.0866 |
| 0.003         | 36.0    | 331074 | 0.9223          | 0.3363 | 0.0876 |
| 0.003         | 36.9999 | 340270 | 0.8977          | 0.3318 | 0.0889 |
| 0.0027        | 38.0    | 349467 | 0.9163          | 0.3262 | 0.0849 |
| 0.0026        | 38.9999 | 358663 | 0.9140          | 0.3269 | 0.0847 |
| 0.0024        | 40.0    | 367860 | 0.9232          | 0.3361 | 0.0897 |
| 0.0022        | 40.9999 | 377056 | 0.9202          | 0.3287 | 0.0864 |
| 0.0022        | 42.0    | 386253 | 0.9276          | 0.3308 | 0.0851 |
| 0.0022        | 42.9999 | 395449 | 0.9449          | 0.3206 | 0.0845 |
| 0.0018        | 44.0    | 404646 | 0.9530          | 0.3249 | 0.0852 |
| 0.0018        | 44.9999 | 413842 | 0.9613          | 0.3315 | 0.0856 |
| 0.0016        | 46.0    | 423039 | 0.9330          | 0.3224 | 0.0844 |
| 0.0018        | 46.9999 | 432235 | 0.9742          | 0.3258 | 0.0856 |
| 0.0016        | 48.0    | 441432 | 0.9634          | 0.3230 | 0.0877 |
| 0.0015        | 48.9999 | 450628 | 0.9506          | 0.3248 | 0.0864 |
| 0.0015        | 50.0    | 459825 | 0.9814          | 0.3229 | 0.0848 |
| 0.0014        | 50.9999 | 469021 | 0.9798          | 0.3300 | 0.0851 |
| 0.0013        | 52.0    | 478218 | 0.9893          | 0.3264 | 0.0858 |
| 0.0013        | 52.9999 | 487414 | 0.9635          | 0.3256 | 0.0851 |
| 0.0011        | 54.0    | 496611 | 0.9968          | 0.3217 | 0.0827 |
| 0.0011        | 54.9999 | 505807 | 1.0078          | 0.3224 | 0.0836 |
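
The Wer and Cer columns above are word and character error rates. A minimal sketch of computing these metrics with the Hugging Face evaluate library follows; the reference and prediction strings are placeholders, not actual model output.

```python
import evaluate

# Standard word- and character-error-rate metrics from the evaluate library.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder strings purely for illustration; not taken from this model's output.
references = ["this is a reference transcript"]
predictions = ["this is a reference transcription"]

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```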

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3