Whisper Small ko

This model is a PEFT adapter fine-tuned from openai/whisper-large-v3-turbo on a custom dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0820
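
Because the checkpoint is published as a PEFT adapter rather than full model weights, it must be loaded on top of the base model. Below is a minimal, hedged sketch of one way to do that, assuming the adapter repo id from this card (nomnoos37/stt-turbo-1227-v1.1-peft-eng-1k-1e-4); the `sample.wav` path and the `language="ko"` setting are illustrative placeholders, not confirmed by the card.

```python
# Minimal sketch: load the PEFT adapter on top of the base Whisper model
# and transcribe one file. Repo ids are from this card; sample.wav and the
# language setting are placeholders.
import torch
import librosa
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

BASE = "openai/whisper-large-v3-turbo"
ADAPTER = "nomnoos37/stt-turbo-1227-v1.1-peft-eng-1k-1e-4"

processor = WhisperProcessor.from_pretrained(BASE)
base_model = WhisperForConditionalGeneration.from_pretrained(
    BASE, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, ADAPTER)
model.eval()

# Whisper expects 16 kHz mono input; librosa resamples on load.
speech, _ = librosa.load("sample.wav", sr=16000)
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
input_features = inputs.input_features.to(model.device, dtype=torch.float16)

with torch.no_grad():
    # language="ko" follows the card title; adjust to match your data.
    ids = model.generate(input_features=input_features, language="ko", task="transcribe")
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```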

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a training-arguments sketch mirroring them follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 256
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • training_steps: 2000
  • mixed_precision_training: Native AMP
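
For reference, here is a hedged sketch of how these values map onto transformers Seq2SeqTrainingArguments. The output_dir is a placeholder, and the card does not state whether Native AMP ran in fp16 or bf16, so fp16 is assumed.

```python
# Sketch of training arguments reproducing the hyperparameters listed above.
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="./whisper-turbo-peft",  # placeholder, not from the card
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=2000,
    fp16=True,  # "Native AMP"; bf16 not ruled out by the card
)
```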

Training results

| Training Loss | Epoch   | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| 0.9031        | 0.0813  | 10   | 1.5890          |
| 0.919         | 0.1626  | 20   | 1.5737          |
| 0.8656        | 0.2439  | 30   | 1.5449          |
| 0.8302        | 0.3252  | 40   | 1.4914          |
| 0.7353        | 0.4065  | 50   | 1.3898          |
| 0.5881        | 0.4878  | 60   | 1.1693          |
| 0.35          | 0.5691  | 70   | 0.9472          |
| 0.2397        | 0.6504  | 80   | 0.8734          |
| 0.2272        | 0.7317  | 90   | 0.8072          |
| 0.1772        | 0.8130  | 100  | 0.7618          |
| 0.1426        | 0.8943  | 110  | 0.7191          |
| 0.1226        | 0.9756  | 120  | 0.6701          |
| 0.1022        | 1.0569  | 130  | 0.6356          |
| 0.0866        | 1.1382  | 140  | 0.6036          |
| 0.0796        | 1.2195  | 150  | 0.5758          |
| 0.0886        | 1.3008  | 160  | 0.5459          |
| 0.0648        | 1.3821  | 170  | 0.5246          |
| 0.0716        | 1.4634  | 180  | 0.5128          |
| 0.0571        | 1.5447  | 190  | 0.5002          |
| 0.0861        | 1.6260  | 200  | 0.4762          |
| 0.0594        | 1.7073  | 210  | 0.4489          |
| 0.0494        | 1.7886  | 220  | 0.4278          |
| 0.0414        | 1.8699  | 230  | 0.4159          |
| 0.0457        | 1.9512  | 240  | 0.4106          |
| 0.0408        | 2.0325  | 250  | 0.4002          |
| 0.0469        | 2.1138  | 260  | 0.3972          |
| 0.0588        | 2.1951  | 270  | 0.3853          |
| 0.0397        | 2.2764  | 280  | 0.3816          |
| 0.0459        | 2.3577  | 290  | 0.3806          |
| 0.0394        | 2.4390  | 300  | 0.3644          |
| 0.0376        | 2.5203  | 310  | 0.3562          |
| 0.0376        | 2.6016  | 320  | 0.3461          |
| 0.0321        | 2.6829  | 330  | 0.3337          |
| 0.037         | 2.7642  | 340  | 0.3301          |
| 0.0377        | 2.8455  | 350  | 0.3240          |
| 0.0245        | 2.9268  | 360  | 0.3185          |
| 0.0361        | 3.0081  | 370  | 0.3179          |
| 0.0279        | 3.0894  | 380  | 0.3130          |
| 0.0338        | 3.1707  | 390  | 0.3066          |
| 0.0344        | 3.2520  | 400  | 0.3010          |
| 0.0279        | 3.3333  | 410  | 0.2959          |
| 0.0243        | 3.4146  | 420  | 0.2886          |
| 0.0346        | 3.4959  | 430  | 0.2899          |
| 0.0251        | 3.5772  | 440  | 0.2851          |
| 0.0475        | 3.6585  | 450  | 0.2755          |
| 0.0272        | 3.7398  | 460  | 0.2714          |
| 0.0242        | 3.8211  | 470  | 0.2719          |
| 0.0238        | 3.9024  | 480  | 0.2707          |
| 0.0235        | 3.9837  | 490  | 0.2688          |
| 0.0293        | 4.0650  | 500  | 0.2677          |
| 0.0211        | 4.1463  | 510  | 0.2641          |
| 0.0205        | 4.2276  | 520  | 0.2601          |
| 0.0228        | 4.3089  | 530  | 0.2553          |
| 0.0216        | 4.3902  | 540  | 0.2538          |
| 0.0247        | 4.4715  | 550  | 0.2522          |
| 0.0413        | 4.5528  | 560  | 0.2484          |
| 0.0195        | 4.6341  | 570  | 0.2399          |
| 0.0211        | 4.7154  | 580  | 0.2392          |
| 0.0176        | 4.7967  | 590  | 0.2457          |
| 0.0251        | 4.8780  | 600  | 0.2409          |
| 0.0213        | 4.9593  | 610  | 0.2332          |
| 0.0236        | 5.0407  | 620  | 0.2354          |
| 0.0181        | 5.1220  | 630  | 0.2347          |
| 0.0194        | 5.2033  | 640  | 0.2353          |
| 0.0174        | 5.2846  | 650  | 0.2340          |
| 0.0169        | 5.3659  | 660  | 0.2296          |
| 0.0193        | 5.4472  | 670  | 0.2252          |
| 0.0161        | 5.5285  | 680  | 0.2230          |
| 0.02          | 5.6098  | 690  | 0.2254          |
| 0.0185        | 5.6911  | 700  | 0.2251          |
| 0.0185        | 5.7724  | 710  | 0.2211          |
| 0.0141        | 5.8537  | 720  | 0.2191          |
| 0.0198        | 5.9350  | 730  | 0.2226          |
| 0.0427        | 6.0163  | 740  | 0.2130          |
| 0.0147        | 6.0976  | 750  | 0.2478          |
| 0.0139        | 6.1789  | 760  | 0.2450          |
| 0.0149        | 6.2602  | 770  | 0.2423          |
| 0.0161        | 6.3415  | 780  | 0.2384          |
| 0.0148        | 6.4228  | 790  | 0.2347          |
| 0.0179        | 6.5041  | 800  | 0.2346          |
| 0.0138        | 6.5854  | 810  | 0.2311          |
| 0.0172        | 6.6667  | 820  | 0.2293          |
| 0.043         | 6.7480  | 830  | 0.1678          |
| 0.0147        | 6.8293  | 840  | 0.1679          |
| 0.0167        | 6.9106  | 850  | 0.1645          |
| 0.017         | 6.9919  | 860  | 0.1646          |
| 0.0118        | 7.0732  | 870  | 0.1650          |
| 0.0125        | 7.1545  | 880  | 0.1635          |
| 0.0194        | 7.2358  | 890  | 0.1617          |
| 0.0092        | 7.3171  | 900  | 0.1600          |
| 0.0105        | 7.3984  | 910  | 0.1594          |
| 0.014         | 7.4797  | 920  | 0.1589          |
| 0.0168        | 7.5610  | 930  | 0.1562          |
| 0.0094        | 7.6423  | 940  | 0.1554          |
| 0.0114        | 7.7236  | 950  | 0.1546          |
| 0.0116        | 7.8049  | 960  | 0.1531          |
| 0.0295        | 7.8862  | 970  | 0.1472          |
| 0.0155        | 7.9675  | 980  | 0.1497          |
| 0.0144        | 8.0488  | 990  | 0.1507          |
| 0.0109        | 8.1301  | 1000 | 0.1503          |
| 0.0257        | 8.2114  | 1010 | 0.1361          |
| 0.0092        | 8.2927  | 1020 | 0.1399          |
| 0.0122        | 8.3740  | 1030 | 0.1411          |
| 0.011         | 8.4553  | 1040 | 0.1422          |
| 0.0108        | 8.5366  | 1050 | 0.1404          |
| 0.0088        | 8.6179  | 1060 | 0.1403          |
| 0.0128        | 8.6992  | 1070 | 0.1417          |
| 0.0146        | 8.7805  | 1080 | 0.1399          |
| 0.0109        | 8.8618  | 1090 | 0.1370          |
| 0.013         | 8.9431  | 1100 | 0.1367          |
| 0.0083        | 9.0244  | 1110 | 0.1363          |
| 0.0113        | 9.1057  | 1120 | 0.1379          |
| 0.0212        | 9.1870  | 1130 | 0.1272          |
| 0.0086        | 9.2683  | 1140 | 0.1267          |
| 0.0131        | 9.3496  | 1150 | 0.1281          |
| 0.0093        | 9.4309  | 1160 | 0.1274          |
| 0.0092        | 9.5122  | 1170 | 0.1260          |
| 0.0115        | 9.5935  | 1180 | 0.1253          |
| 0.0093        | 9.6748  | 1190 | 0.1227          |
| 0.0114        | 9.7561  | 1200 | 0.1224          |
| 0.0101        | 9.8374  | 1210 | 0.1218          |
| 0.0129        | 9.9187  | 1220 | 0.1227          |
| 0.0084        | 10.0    | 1230 | 0.1225          |
| 0.0203        | 10.0813 | 1240 | 0.1173          |
| 0.0086        | 10.1626 | 1250 | 0.1115          |
| 0.009         | 10.2439 | 1260 | 0.1112          |
| 0.0082        | 10.3252 | 1270 | 0.1139          |
| 0.0076        | 10.4065 | 1280 | 0.1124          |
| 0.0081        | 10.4878 | 1290 | 0.1120          |
| 0.0074        | 10.5691 | 1300 | 0.1103          |
| 0.0089        | 10.6504 | 1310 | 0.1083          |
| 0.0088        | 10.7317 | 1320 | 0.1079          |
| 0.0084        | 10.8130 | 1330 | 0.1079          |
| 0.0114        | 10.8943 | 1340 | 0.1059          |
| 0.0112        | 10.9756 | 1350 | 0.1068          |
| 0.0094        | 11.0569 | 1360 | 0.1051          |
| 0.0101        | 11.1382 | 1370 | 0.1050          |
| 0.0076        | 11.2195 | 1380 | 0.1037          |
| 0.0077        | 11.3008 | 1390 | 0.1032          |
| 0.0085        | 11.3821 | 1400 | 0.1021          |
| 0.009         | 11.4634 | 1410 | 0.1021          |
| 0.0082        | 11.5447 | 1420 | 0.1017          |
| 0.0179        | 11.6260 | 1430 | 0.0983          |
| 0.0073        | 11.7073 | 1440 | 0.0973          |
| 0.0066        | 11.7886 | 1450 | 0.0984          |
| 0.0091        | 11.8699 | 1460 | 0.0983          |
| 0.0065        | 11.9512 | 1470 | 0.0974          |
| 0.0081        | 12.0325 | 1480 | 0.1009          |
| 0.0086        | 12.1138 | 1490 | 0.0971          |
| 0.006         | 12.1951 | 1500 | 0.0975          |
| 0.0069        | 12.2764 | 1510 | 0.0972          |
| 0.0087        | 12.3577 | 1520 | 0.0974          |
| 0.0064        | 12.4390 | 1530 | 0.0978          |
| 0.0086        | 12.5203 | 1540 | 0.0957          |
| 0.0059        | 12.6016 | 1550 | 0.0971          |
| 0.0075        | 12.6829 | 1560 | 0.0951          |
| 0.0085        | 12.7642 | 1570 | 0.0953          |
| 0.0167        | 12.8455 | 1580 | 0.0933          |
| 0.0077        | 12.9268 | 1590 | 0.0914          |
| 0.0064        | 13.0081 | 1600 | 0.0912          |
| 0.0081        | 13.0894 | 1610 | 0.0905          |
| 0.0088        | 13.1707 | 1620 | 0.0919          |
| 0.0155        | 13.2520 | 1630 | 0.0894          |
| 0.0066        | 13.3333 | 1640 | 0.0875          |
| 0.0066        | 13.4146 | 1650 | 0.0874          |
| 0.0065        | 13.4959 | 1660 | 0.0874          |
| 0.0061        | 13.5772 | 1670 | 0.0870          |
| 0.0056        | 13.6585 | 1680 | 0.0867          |
| 0.0063        | 13.7398 | 1690 | 0.0870          |
| 0.0075        | 13.8211 | 1700 | 0.0869          |
| 0.0087        | 13.9024 | 1710 | 0.0863          |
| 0.0055        | 13.9837 | 1720 | 0.0861          |
| 0.007         | 14.0650 | 1730 | 0.0862          |
| 0.0064        | 14.1463 | 1740 | 0.0859          |
| 0.0069        | 14.2276 | 1750 | 0.0857          |
| 0.015         | 14.3089 | 1760 | 0.0850          |
| 0.0069        | 14.3902 | 1770 | 0.0845          |
| 0.0078        | 14.4715 | 1780 | 0.0846          |
| 0.0052        | 14.5528 | 1790 | 0.0855          |
| 0.0067        | 14.6341 | 1800 | 0.0844          |
| 0.0065        | 14.7154 | 1810 | 0.0839          |
| 0.0067        | 14.7967 | 1820 | 0.0838          |
| 0.0053        | 14.8780 | 1830 | 0.0840          |
| 0.0062        | 14.9593 | 1840 | 0.0839          |
| 0.0063        | 15.0407 | 1850 | 0.0834          |
| 0.006         | 15.1220 | 1860 | 0.0832          |
| 0.0068        | 15.2033 | 1870 | 0.0830          |
| 0.0056        | 15.2846 | 1880 | 0.0830          |
| 0.0056        | 15.3659 | 1890 | 0.0827          |
| 0.0069        | 15.4472 | 1900 | 0.0825          |
| 0.0061        | 15.5285 | 1910 | 0.0825          |
| 0.0129        | 15.6098 | 1920 | 0.0824          |
| 0.0072        | 15.6911 | 1930 | 0.0822          |
| 0.0062        | 15.7724 | 1940 | 0.0822          |
| 0.0066        | 15.8537 | 1950 | 0.0822          |
| 0.0076        | 15.9350 | 1960 | 0.0821          |
| 0.0067        | 16.0163 | 1970 | 0.0821          |
| 0.0138        | 16.0976 | 1980 | 0.0820          |
| 0.0068        | 16.1789 | 1990 | 0.0820          |
| 0.0053        | 16.2602 | 2000 | 0.0820          |

Framework versions

  • PEFT 0.14.0
  • Transformers 4.47.1
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0
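
If you want to confirm your environment matches the versions above, a small sanity-check snippet follows; the package names are assumed to be the standard PyPI ones.

```python
# Print installed versions next to the versions this adapter was trained with.
import datasets
import peft
import tokenizers
import torch
import transformers

expected = {
    "peft": "0.14.0",
    "transformers": "4.47.1",
    "torch": "2.5.1+cu124",
    "datasets": "3.2.0",
    "tokenizers": "0.21.0",
}
installed = {
    "peft": peft.__version__,
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name in expected:
    flag = "" if installed[name] == expected[name] else "  <-- differs"
    print(f"{name}: installed {installed[name]}, trained with {expected[name]}{flag}")
```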
Model tree for nomnoos37/stt-turbo-1227-v1.1-peft-eng-1k-1e-4

This model is a PEFT adapter for openai/whisper-large-v3-turbo.