
whisper-small-CV-Fleurs-lg-20hrs-v1

This model is a fine-tuned version of openai/whisper-small on Common Voice and FLEURS Luganda (lg) data (approximately 20 hours, as indicated by the model name). It achieves the following results on the evaluation set:

  • Loss: 1.2618
  • Wer: 0.4400
  • Cer: 0.1123
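WER and CER are normalized edit distances at the word and character level. A minimal sketch of the computation in pure Python (the training run itself would typically use a library such as `evaluate` or `jiwer`, so this is illustrative, not the exact implementation):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences."""
    m, n = len(ref), len(hyp)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[m][n]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

A WER of 0.4400 therefore means that, on average, 44% of reference words require an insertion, deletion, or substitution to match the hypothesis.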

Model description

More information needed

Intended uses & limitations

More information needed
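Although the card does not document usage, a minimal transcription sketch with the Transformers `pipeline` API would look like the following (the repository id is taken from the model name; `audio.wav` is a placeholder for your own 16 kHz audio file):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an ASR pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/whisper-small-CV-Fleurs-lg-20hrs-v1",
)

# Transcribe a local audio file (path is a placeholder).
result = asr("audio.wav")
print(result["text"])
```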

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_hf with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
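For reference, a linear schedule with lr_scheduler_warmup_ratio 0.1 ramps the learning rate from 0 up to 1e-05 over the first 10% of training steps, then decays it linearly back to 0. A minimal sketch of that shape (illustrative, not the exact Transformers scheduler implementation):

```python
def linear_warmup_lr(step, total_steps, base_lr=1e-05, warmup_ratio=0.1):
    """Linear warmup to base_lr, then linear decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Effective batch size: train_batch_size * gradient_accumulation_steps = 4 * 2 = 8,
# matching total_train_batch_size above.
effective_batch_size = 4 * 2
```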

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|---------------|---------|-------|-----------------|--------|--------|
| 2.5547        | 0.9996  | 1328  | 1.2661          | 0.9857 | 0.3628 |
| 0.9846        | 2.0     | 2657  | 0.8858          | 0.8386 | 0.3015 |
| 0.6894        | 2.9996  | 3985  | 0.7434          | 0.9135 | 0.3641 |
| 0.5035        | 4.0     | 5314  | 0.6831          | 0.9614 | 0.4073 |
| 0.3571        | 4.9996  | 6642  | 0.6695          | 0.8254 | 0.3086 |
| 0.2366        | 6.0     | 7971  | 0.6937          | 0.9208 | 0.3723 |
| 0.1445        | 6.9996  | 9299  | 0.7372          | 0.7235 | 0.2559 |
| 0.0874        | 8.0     | 10628 | 0.7655          | 0.7067 | 0.2404 |
| 0.0567        | 8.9996  | 11956 | 0.7885          | 0.6037 | 0.1878 |
| 0.0425        | 10.0    | 13285 | 0.8165          | 0.5342 | 0.1466 |
| 0.0357        | 10.9996 | 14613 | 0.8397          | 0.5465 | 0.1489 |
| 0.0277        | 12.0    | 15942 | 0.8629          | 0.5385 | 0.1465 |
| 0.0226        | 12.9996 | 17270 | 0.8759          | 0.5037 | 0.1241 |
| 0.0179        | 14.0    | 18599 | 0.8823          | 0.4803 | 0.1167 |
| 0.0155        | 14.9996 | 19927 | 0.9146          | 0.4826 | 0.1232 |
| 0.0122        | 16.0    | 21256 | 0.9274          | 0.4763 | 0.1176 |
| 0.0105        | 16.9996 | 22584 | 0.9454          | 0.4767 | 0.1224 |
| 0.0097        | 18.0    | 23913 | 0.9583          | 0.4667 | 0.1167 |
| 0.0087        | 18.9996 | 25241 | 0.9675          | 0.4774 | 0.1230 |
| 0.0074        | 20.0    | 26570 | 0.9834          | 0.4589 | 0.1151 |
| 0.0064        | 20.9996 | 27898 | 1.0245          | 0.4691 | 0.1137 |
| 0.0062        | 22.0    | 29227 | 1.0370          | 0.4575 | 0.1108 |
| 0.0055        | 22.9996 | 30555 | 1.0165          | 0.4573 | 0.1122 |
| 0.0048        | 24.0    | 31884 | 1.0460          | 0.4607 | 0.1143 |
| 0.005         | 24.9996 | 33212 | 1.0521          | 0.4551 | 0.1123 |
| 0.004         | 26.0    | 34541 | 1.0622          | 0.4542 | 0.1119 |
| 0.0037        | 26.9996 | 35869 | 1.0688          | 0.4525 | 0.1149 |
| 0.0039        | 28.0    | 37198 | 1.0643          | 0.4528 | 0.1122 |
| 0.0042        | 28.9996 | 38526 | 1.0815          | 0.4539 | 0.1136 |
| 0.0034        | 30.0    | 39855 | 1.1070          | 0.4519 | 0.1109 |
| 0.0036        | 30.9996 | 41183 | 1.1133          | 0.4542 | 0.1118 |
| 0.0034        | 32.0    | 42512 | 1.1313          | 0.4530 | 0.1120 |
| 0.0036        | 32.9996 | 43840 | 1.0893          | 0.4470 | 0.1092 |
| 0.0025        | 34.0    | 45169 | 1.1114          | 0.4538 | 0.1116 |
| 0.0028        | 34.9996 | 46497 | 1.1134          | 0.4454 | 0.1110 |
| 0.0028        | 36.0    | 47826 | 1.1380          | 0.4514 | 0.1143 |
| 0.0025        | 36.9996 | 49154 | 1.1434          | 0.4412 | 0.1126 |
| 0.0023        | 38.0    | 50483 | 1.1544          | 0.4499 | 0.1147 |
| 0.0025        | 38.9996 | 51811 | 1.1741          | 0.4475 | 0.1130 |
| 0.0019        | 40.0    | 53140 | 1.1619          | 0.4399 | 0.1110 |
| 0.0016        | 40.9996 | 54468 | 1.1668          | 0.4353 | 0.1084 |
| 0.0019        | 42.0    | 55797 | 1.2003          | 0.4388 | 0.1100 |
| 0.0021        | 42.9996 | 57125 | 1.1919          | 0.4432 | 0.1085 |
| 0.0016        | 44.0    | 58454 | 1.1745          | 0.4374 | 0.1101 |
| 0.0014        | 44.9996 | 59782 | 1.2128          | 0.4413 | 0.1098 |
| 0.0018        | 46.0    | 61111 | 1.2135          | 0.4470 | 0.1127 |
| 0.002         | 46.9996 | 62439 | 1.2009          | 0.4490 | 0.1116 |
| 0.0016        | 48.0    | 63768 | 1.1846          | 0.4439 | 0.1111 |
| 0.0014        | 48.9996 | 65096 | 1.2140          | 0.4465 | 0.1114 |
| 0.0009        | 50.0    | 66425 | 1.2184          | 0.4368 | 0.1078 |
| 0.0011        | 50.9996 | 67753 | 1.2220          | 0.4416 | 0.1110 |
| 0.0013        | 52.0    | 69082 | 1.2325          | 0.4361 | 0.1096 |
| 0.0011        | 52.9996 | 70410 | 1.2618          | 0.4400 | 0.1123 |

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3