
Whisper Small NSC part 1,2,3 (2000 steps) - Jarrett Er

This model is a PEFT adapter fine-tuned from openai/whisper-small on parts 1, 2 and 3 of the NSC dataset. It achieves the following results on the evaluation set (a short loading sketch follows the list):

  • Loss: 0.0702
  • WER: 2.7673
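
Since the repository ships a PEFT adapter rather than full model weights, it has to be loaded on top of the openai/whisper-small base model. Below is a minimal, hedged sketch of how this adapter (repo id Thecoder3281f/whisper-small-hi-nscpart123-2000) could be used for inference with transformers and peft; the silent dummy waveform is only a placeholder for real 16 kHz speech.

```python
import numpy as np
import torch
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

# Load the base model, then attach this repository's adapter on top of it.
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")
model = PeftModel.from_pretrained(base, "Thecoder3281f/whisper-small-hi-nscpart123-2000")
model.eval()
processor = WhisperProcessor.from_pretrained("openai/whisper-small")

# Placeholder input: one second of silence at 16 kHz (substitute real speech here).
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    generated_ids = model.generate(input_features=inputs.input_features)

print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```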

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of the equivalent training arguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • training_steps: 2000
  • mixed_precision_training: Native AMP
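
For context, these settings map onto a Seq2SeqTrainingArguments configuration roughly as sketched below. This is a hedged reconstruction, not the actual training script: the output directory and the evaluation/logging cadence are assumptions, and the listed Adam betas/epsilon are the library defaults.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the listed hyperparameters. Values the card
# does not state (output_dir, eval/logging cadence) are assumptions.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-nsc-part123",  # placeholder name
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=2000,
    fp16=True,              # "Native AMP" mixed-precision training
    eval_strategy="steps",
    eval_steps=50,          # the results table reports metrics every 50 steps
    logging_steps=50,
)
```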

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER      |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.45          | 0.2008 | 50   | 1.2406          | 152.8748 |
| 0.674         | 0.4016 | 100  | 0.5503          | 37.9635  |
| 0.5474        | 0.6024 | 150  | 0.4958          | 20.0161  |
| 0.5325        | 0.8032 | 200  | 0.5054          | 18.4041  |
| 0.6039        | 1.0040 | 250  | 0.5225          | 18.9952  |
| 0.4602        | 1.2048 | 300  | 0.4821          | 18.0011  |
| 0.492         | 1.4056 | 350  | 0.4468          | 16.7920  |
| 0.45          | 1.6064 | 400  | 0.4248          | 16.0129  |
| 0.435         | 1.8072 | 450  | 0.3914          | 14.9382  |
| 0.4395        | 2.0080 | 500  | 0.3767          | 13.7829  |
| 0.2993        | 2.2088 | 550  | 0.3492          | 13.2724  |
| 0.3069        | 2.4096 | 600  | 0.3280          | 12.4127  |
| 0.2861        | 2.6104 | 650  | 0.2930          | 12.0903  |
| 0.2507        | 2.8112 | 700  | 0.2764          | 11.4723  |
| 0.2301        | 3.0120 | 750  | 0.2562          | 11.0962  |
| 0.1827        | 3.2129 | 800  | 0.2455          | 10.1827  |
| 0.1877        | 3.4137 | 850  | 0.2174          | 8.9468   |
| 0.1805        | 3.6145 | 900  | 0.2072          | 8.7587   |
| 0.1535        | 3.8153 | 950  | 0.2028          | 8.4095   |
| 0.1502        | 4.0161 | 1000 | 0.1726          | 7.1736   |
| 0.1175        | 4.2169 | 1050 | 0.1639          | 6.7706   |
| 0.1088        | 4.4177 | 1100 | 0.1600          | 6.7437   |
| 0.129         | 4.6185 | 1150 | 0.1489          | 6.1795   |
| 0.1103        | 4.8193 | 1200 | 0.1331          | 6.2063   |
| 0.0905        | 5.0201 | 1250 | 0.1309          | 5.0510   |
| 0.0709        | 5.2209 | 1300 | 0.1191          | 5.1854   |
| 0.0733        | 5.4217 | 1350 | 0.1109          | 4.1644   |
| 0.0704        | 5.6225 | 1400 | 0.1047          | 3.9226   |
| 0.0636        | 5.8233 | 1450 | 0.1044          | 3.7883   |
| 0.0572        | 6.0241 | 1500 | 0.0977          | 3.9495   |
| 0.0447        | 6.2249 | 1550 | 0.0931          | 3.9226   |
| 0.0408        | 6.4257 | 1600 | 0.0871          | 3.3047   |
| 0.0438        | 6.6265 | 1650 | 0.0861          | 3.5196   |
| 0.0396        | 6.8273 | 1700 | 0.0818          | 3.3584   |
| 0.038         | 7.0281 | 1750 | 0.0794          | 3.0360   |
| 0.0304        | 7.2289 | 1800 | 0.0761          | 2.7942   |
| 0.0332        | 7.4297 | 1850 | 0.0731          | 2.6061   |
| 0.0459        | 7.6305 | 1900 | 0.0717          | 2.5793   |
| 0.0288        | 7.8313 | 1950 | 0.0706          | 2.6599   |
| 0.0224        | 8.0321 | 2000 | 0.0702          | 2.7673   |
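
The WER column appears to be reported as a percentage (early values above 100 are possible because insertions count as errors). A minimal sketch of how such a score is typically computed with the evaluate library follows; the transcripts are made up for illustration, and the actual evaluation script is not part of this card.

```python
import evaluate

# Word error rate metric from the evaluate library.
wer_metric = evaluate.load("wer")

predictions = ["the cat sat on the mat"]  # hypothetical model output
references = ["the cat sat on a mat"]     # hypothetical ground truth

# compute() returns a fraction; multiply by 100 to match the percentages above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```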

Framework versions

  • PEFT 0.14.0
  • Transformers 4.45.2
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.1.dev0
  • Tokenizers 0.20.3