# Moroccan-Darija-STT-small-v1.6.9
This model is a fine-tuned version of openai/whisper-small on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5000
- WER: 76.9746
- CER: 32.8683
## Model description

More information needed
## Intended uses & limitations

More information needed
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.25e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 6
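The linear scheduler with 10 warmup steps can be sketched as below. Note the total step count of 1224 is not in the config above; it is inferred from the training table (~204 steps per epoch × 6 epochs), so treat it as an assumption:

```python
def linear_lr(step, base_lr=1.25e-5, warmup_steps=10, total_steps=1224):
    """Linear warmup for `warmup_steps` steps, then linear decay to 0 at `total_steps`.

    `total_steps` is inferred from the training table (~204 steps/epoch x 6 epochs),
    not read from the training config.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, total_steps - step) / (total_steps - warmup_steps)
```

This mirrors the shape of the `linear` schedule in `transformers`: the learning rate ramps from 0 to 1.25e-05 over the first 10 steps, then decays linearly back to 0 by the final step.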
### Training results
Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
1.6255 | 0.0490 | 10 | 0.6187 | 129.3173 | 96.4275 |
1.2024 | 0.0980 | 20 | 0.4552 | 107.2205 | 61.3145 |
1.0577 | 0.1471 | 30 | 0.4214 | 109.9900 | 65.9409 |
0.9265 | 0.1961 | 40 | 0.4036 | 94.7205 | 49.8590 |
0.894 | 0.2451 | 50 | 0.3948 | 93.6412 | 49.8184 |
0.8693 | 0.2941 | 60 | 0.3842 | 85.6007 | 42.4674 |
0.7916 | 0.3431 | 70 | 0.3865 | 86.8725 | 41.6246 |
0.7468 | 0.3922 | 80 | 0.3781 | 84.9063 | 39.9524 |
0.7425 | 0.4412 | 90 | 0.3751 | 82.4799 | 37.6653 |
0.7395 | 0.4902 | 100 | 0.3704 | 82.1703 | 38.3004 |
0.7108 | 0.5392 | 110 | 0.3694 | 80.3380 | 35.8445 |
0.7128 | 0.5882 | 120 | 0.3696 | 84.4629 | 40.6010 |
0.6539 | 0.6373 | 130 | 0.3619 | 80.3046 | 36.9255 |
0.6301 | 0.6863 | 140 | 0.3658 | 82.1118 | 38.9457 |
0.5786 | 0.7353 | 150 | 0.3587 | 78.4806 | 33.1588 |
0.6295 | 0.7843 | 160 | 0.3609 | 80.5974 | 35.8918 |
0.5858 | 0.8333 | 170 | 0.3616 | 79.9531 | 35.4543 |
0.6411 | 0.8824 | 180 | 0.3694 | 81.7018 | 37.4964 |
0.5431 | 0.9314 | 190 | 0.3678 | 76.8574 | 31.7771 |
0.5682 | 0.9804 | 200 | 0.3681 | 78.1208 | 32.8919 |
0.5126 | 1.0294 | 210 | 0.3669 | 77.9451 | 33.1047 |
0.5202 | 1.0784 | 220 | 0.3688 | 76.6232 | 32.6538 |
0.5111 | 1.1275 | 230 | 0.3692 | 78.7734 | 34.6283 |
0.5223 | 1.1765 | 240 | 0.3714 | 75.8952 | 31.2636 |
0.4977 | 1.2255 | 250 | 0.3719 | 77.2172 | 33.7044 |
0.4655 | 1.2745 | 260 | 0.3707 | 77.4515 | 32.6554 |
0.4684 | 1.3235 | 270 | 0.3694 | 76.8407 | 32.2264 |
0.4748 | 1.3725 | 280 | 0.3713 | 76.1128 | 33.3986 |
0.4635 | 1.4216 | 290 | 0.3722 | 77.1168 | 33.7179 |
0.4614 | 1.4706 | 300 | 0.3767 | 75.8534 | 31.5660 |
0.4606 | 1.5196 | 310 | 0.3769 | 77.0750 | 32.5406 |
0.4196 | 1.5686 | 320 | 0.3821 | 76.3554 | 32.1487 |
0.4293 | 1.6176 | 330 | 0.3860 | 76.5897 | 32.8716 |
0.4296 | 1.6667 | 340 | 0.3821 | 77.8531 | 33.8496 |
0.4167 | 1.7157 | 350 | 0.3825 | 75.8952 | 31.6791 |
0.4425 | 1.7647 | 360 | 0.3818 | 76.7403 | 32.7213 |
0.4075 | 1.8137 | 370 | 0.3825 | 75.8534 | 32.9983 |
0.405 | 1.8627 | 380 | 0.3859 | 78.3886 | 34.3868 |
0.3995 | 1.9118 | 390 | 0.3869 | 76.0040 | 33.3784 |
0.408 | 1.9608 | 400 | 0.3859 | 75.3012 | 31.2721 |
0.3829 | 2.0098 | 410 | 0.3880 | 77.9869 | 33.3345 |
0.3408 | 2.0588 | 420 | 0.3926 | 76.1212 | 32.7331 |
0.325 | 2.1078 | 430 | 0.3974 | 74.9582 | 31.4410 |
0.3469 | 2.1569 | 440 | 0.3968 | 78.9993 | 33.8463 |
0.3022 | 2.2059 | 450 | 0.4074 | 75.9454 | 32.3919 |
0.3341 | 2.2549 | 460 | 0.3989 | 75.5689 | 31.6859 |
0.3297 | 2.3039 | 470 | 0.4032 | 76.4893 | 32.9037 |
0.311 | 2.3529 | 480 | 0.4084 | 75.0837 | 31.5035 |
0.308 | 2.4020 | 490 | 0.4096 | 76.5646 | 32.8294 |
0.3081 | 2.4510 | 500 | 0.4110 | 75.9538 | 31.8126 |
0.2796 | 2.5 | 510 | 0.4094 | 76.8491 | 32.5896 |
0.3051 | 2.5490 | 520 | 0.4045 | 75.7112 | 31.4460 |
0.2914 | 2.5980 | 530 | 0.4143 | 76.5311 | 32.7923 |
0.291 | 2.6471 | 540 | 0.4136 | 76.4140 | 32.7923 |
0.3095 | 2.6961 | 550 | 0.4186 | 76.4140 | 32.2264 |
0.3122 | 2.7451 | 560 | 0.4191 | 75.4434 | 31.7366 |
0.301 | 2.7941 | 570 | 0.4218 | 76.7068 | 33.1470 |
0.2895 | 2.8431 | 580 | 0.4261 | 76.7152 | 32.6740 |
0.3104 | 2.8922 | 590 | 0.4254 | 75.2426 | 32.4561 |
0.2962 | 2.9412 | 600 | 0.4213 | 75.5773 | 31.5204 |
0.2781 | 2.9902 | 610 | 0.4242 | 78.1124 | 33.3581 |
0.2504 | 3.0392 | 620 | 0.4309 | 76.2467 | 32.5946 |
0.2145 | 3.0882 | 630 | 0.4386 | 75.8869 | 32.1926 |
0.2129 | 3.1373 | 640 | 0.4356 | 76.9411 | 32.5456 |
0.2105 | 3.1863 | 650 | 0.4441 | 76.6232 | 31.9004 |
0.2188 | 3.2353 | 660 | 0.4420 | 76.1714 | 32.2230 |
0.2162 | 3.2843 | 670 | 0.4484 | 77.2005 | 32.6149 |
0.2187 | 3.3333 | 680 | 0.4488 | 76.9495 | 32.6217 |
0.2243 | 3.3824 | 690 | 0.4460 | 76.1212 | 31.9528 |
0.2041 | 3.4314 | 700 | 0.4473 | 76.3052 | 32.1622 |
0.207 | 3.4804 | 710 | 0.4498 | 75.6108 | 31.6842 |
0.2067 | 3.5294 | 720 | 0.4521 | 75.9120 | 31.8041 |
0.2196 | 3.5784 | 730 | 0.4535 | 76.0626 | 32.1588 |
0.197 | 3.6275 | 740 | 0.4521 | 76.4307 | 32.2298 |
0.2072 | 3.6765 | 750 | 0.4552 | 76.3136 | 32.2720 |
0.2083 | 3.7255 | 760 | 0.4531 | 78.3300 | 33.9290 |
0.2064 | 3.7745 | 770 | 0.4576 | 76.1212 | 32.7230 |
0.1939 | 3.8235 | 780 | 0.4597 | 76.4893 | 32.4865 |
0.2062 | 3.8725 | 790 | 0.4598 | 76.6483 | 32.1014 |
0.2083 | 3.9216 | 800 | 0.4628 | 76.5897 | 33.2061 |
0.1715 | 3.9706 | 810 | 0.4627 | 76.1379 | 32.5203 |
0.1797 | 4.0196 | 820 | 0.4663 | 76.0207 | 32.1268 |
0.1605 | 4.0686 | 830 | 0.4707 | 76.3387 | 32.0896 |
0.1711 | 4.1176 | 840 | 0.4699 | 76.1881 | 32.1268 |
0.1597 | 4.1667 | 850 | 0.4755 | 76.5144 | 32.5406 |
0.1449 | 4.2157 | 860 | 0.4762 | 76.4642 | 32.2889 |
0.1512 | 4.2647 | 870 | 0.4782 | 76.5730 | 32.2416 |
0.1336 | 4.3137 | 880 | 0.4786 | 76.3973 | 32.3632 |
0.1658 | 4.3627 | 890 | 0.4761 | 76.7068 | 32.4629 |
0.1545 | 4.4118 | 900 | 0.4805 | 77.0750 | 32.9781 |
0.1492 | 4.4608 | 910 | 0.4766 | 76.5646 | 32.4139 |
0.1604 | 4.5098 | 920 | 0.4800 | 76.7152 | 32.6402 |
0.1556 | 4.5588 | 930 | 0.4841 | 76.6064 | 32.3007 |
0.1471 | 4.6078 | 940 | 0.4839 | 76.6315 | 32.4916 |
0.146 | 4.6569 | 950 | 0.4829 | 76.5311 | 32.2044 |
0.137 | 4.7059 | 960 | 0.4859 | 76.6315 | 32.3159 |
0.1441 | 4.7549 | 970 | 0.4861 | 76.5646 | 32.1301 |
0.145 | 4.8039 | 980 | 0.4864 | 77.2005 | 33.6436 |
0.1439 | 4.8529 | 990 | 0.4875 | 76.2969 | 32.2484 |
0.1393 | 4.9020 | 1000 | 0.4867 | 77.2172 | 33.0169 |
0.1377 | 4.9510 | 1010 | 0.4897 | 76.3554 | 32.5237 |
0.1524 | 5.0 | 1020 | 0.4892 | 76.6148 | 32.6977 |
0.1293 | 5.0490 | 1030 | 0.4928 | 76.9244 | 32.9595 |
0.1166 | 5.0980 | 1040 | 0.4934 | 76.6901 | 32.7162 |
0.1127 | 5.1471 | 1050 | 0.4933 | 76.9495 | 33.6723 |
0.1123 | 5.1961 | 1060 | 0.4953 | 77.0331 | 33.1571 |
0.118 | 5.2451 | 1070 | 0.4956 | 77.2925 | 33.5507 |
0.1188 | 5.2941 | 1080 | 0.4964 | 76.7236 | 32.5372 |
0.1132 | 5.3431 | 1090 | 0.4975 | 76.6985 | 32.4730 |
0.1171 | 5.3922 | 1100 | 0.4963 | 76.5646 | 32.3852 |
0.1147 | 5.4412 | 1110 | 0.4982 | 76.6064 | 32.5372 |
0.1191 | 5.4902 | 1120 | 0.4984 | 76.7738 | 32.6858 |
0.1212 | 5.5392 | 1130 | 0.4994 | 76.7319 | 32.5406 |
0.1204 | 5.5882 | 1140 | 0.4983 | 76.4977 | 32.3936 |
0.1225 | 5.6373 | 1150 | 0.4977 | 76.9495 | 32.7534 |
0.1069 | 5.6863 | 1160 | 0.4991 | 76.4642 | 32.3531 |
0.1143 | 5.7353 | 1170 | 0.5007 | 76.4809 | 32.4207 |
0.1084 | 5.7843 | 1180 | 0.5010 | 76.5981 | 32.5034 |
0.1179 | 5.8333 | 1190 | 0.4999 | 76.8323 | 32.6385 |
0.1223 | 5.8824 | 1200 | 0.5000 | 76.5646 | 32.3936 |
0.1172 | 5.9314 | 1210 | 0.4999 | 76.6064 | 32.4882 |
0.1137 | 5.9804 | 1220 | 0.5000 | 76.9746 | 32.8683 |
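The WER and CER columns above are percentages derived from edit distance at the word and character level. The evaluation most likely used a library such as `evaluate` or `jiwer`, so the following is an illustrative, self-contained sketch only:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences
    (counts insertions, deletions, and substitutions)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    """Word error rate as a percentage, as in the WER column."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate as a percentage, as in the CER column."""
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)
```

Because insertions are counted against the reference length, WER can exceed 100%, which explains the early-training rows above (e.g. 129.3 at step 10).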
### Framework versions
- Transformers 4.48.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
## Base model

BounharAbdelaziz/Moroccan-Darija-STT-small-v1.6.9 is fine-tuned from openai/whisper-small.