wav2vec2-xlsr-ln-5hr-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5624
  • Model Preparation Time: 0.0057 s
  • WER: 0.2571
  • CER: 0.0734
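
WER is the word error rate and CER the character error rate on the evaluation set. As a reference for how these numbers are obtained, the sketch below computes both with the `evaluate` library; the reference/prediction strings are hypothetical placeholders, not data from this model's evaluation set.

```python
import evaluate

# Standard WER and CER metrics from the Hugging Face evaluate library.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical transcript pair, for illustration only.
references = ["bato bazali koloba lingala"]
predictions = ["bato bazali kolobaka lingala"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```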

Model description

More information needed

Intended uses & limitations

More information needed
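
Pending fuller documentation, the model can presumably be used for automatic speech recognition like any other wav2vec 2.0 CTC checkpoint. A minimal inference sketch follows; it assumes 16 kHz mono audio (the standard input for XLS-R models), and the audio path is a placeholder.

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "KasuleTrevor/wav2vec2-xlsr-ln-5hr-v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load and resample the audio to 16 kHz ("sample.wav" is a placeholder path).
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```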

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 100
  • mixed_precision_training: Native AMP
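
These settings map onto a Transformers `TrainingArguments` configuration roughly as sketched below, assuming the standard `Trainer` API was used; the output directory is a placeholder, and the Adam betas/epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-ln-5hr-v1",  # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=100,
    fp16=True,  # native AMP mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```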

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time (s) | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:--------------------------:|:------:|:------:|
| 16.386        | 1.0   | 69   | 12.2905         | 0.0057                     | 1.0    | 1.0    |
| 5.5203        | 2.0   | 138  | 3.9569          | 0.0057                     | 1.0    | 1.0    |
| 3.4           | 3.0   | 207  | 3.1451          | 0.0057                     | 1.0    | 1.0    |
| 2.9614        | 4.0   | 276  | 2.8816          | 0.0057                     | 1.0    | 1.0    |
| 2.8304        | 5.0   | 345  | 2.8508          | 0.0057                     | 1.0    | 1.0    |
| 2.3503        | 6.0   | 414  | 1.4634          | 0.0057                     | 0.9609 | 0.3336 |
| 0.9982        | 7.0   | 483  | 0.7307          | 0.0057                     | 0.4987 | 0.1411 |
| 0.6755        | 8.0   | 552  | 0.6046          | 0.0057                     | 0.4915 | 0.1328 |
| 0.5294        | 9.0   | 621  | 0.5178          | 0.0057                     | 0.4908 | 0.1236 |
| 0.4127        | 10.0  | 690  | 0.4592          | 0.0057                     | 0.3672 | 0.1024 |
| 0.351         | 11.0  | 759  | 0.4412          | 0.0057                     | 0.3898 | 0.1115 |
| 0.2951        | 12.0  | 828  | 0.4443          | 0.0057                     | 0.3696 | 0.1158 |
| 0.2652        | 13.0  | 897  | 0.4260          | 0.0057                     | 0.3569 | 0.0966 |
| 0.2294        | 14.0  | 966  | 0.3866          | 0.0057                     | 0.3242 | 0.0929 |
| 0.2015        | 15.0  | 1035 | 0.4026          | 0.0057                     | 0.3110 | 0.0881 |
| 0.1806        | 16.0  | 1104 | 0.3866          | 0.0057                     | 0.3048 | 0.0869 |
| 0.1593        | 17.0  | 1173 | 0.4070          | 0.0057                     | 0.3133 | 0.0899 |
| 0.1427        | 18.0  | 1242 | 0.4013          | 0.0057                     | 0.3108 | 0.0877 |
| 0.1285        | 19.0  | 1311 | 0.4124          | 0.0057                     | 0.2983 | 0.0861 |
| 0.1223        | 20.0  | 1380 | 0.4172          | 0.0057                     | 0.3173 | 0.0919 |
| 0.109         | 21.0  | 1449 | 0.4232          | 0.0057                     | 0.2934 | 0.0844 |
| 0.109         | 22.0  | 1518 | 0.4238          | 0.0057                     | 0.2952 | 0.0879 |
| 0.0935        | 23.0  | 1587 | 0.4662          | 0.0057                     | 0.2797 | 0.0834 |
| 0.095         | 24.0  | 1656 | 0.4323          | 0.0057                     | 0.2780 | 0.0824 |
| 0.0797        | 25.0  | 1725 | 0.4363          | 0.0057                     | 0.2809 | 0.0823 |
| 0.0795        | 26.0  | 1794 | 0.4421          | 0.0057                     | 0.2925 | 0.0835 |
| 0.0718        | 27.0  | 1863 | 0.4394          | 0.0057                     | 0.2887 | 0.0848 |
| 0.071         | 28.0  | 1932 | 0.4397          | 0.0057                     | 0.2773 | 0.0810 |
| 0.066         | 29.0  | 2001 | 0.4639          | 0.0057                     | 0.2755 | 0.0817 |
| 0.0598        | 30.0  | 2070 | 0.4674          | 0.0057                     | 0.2844 | 0.0828 |
| 0.0609        | 31.0  | 2139 | 0.4663          | 0.0057                     | 0.2771 | 0.0836 |
| 0.0532        | 32.0  | 2208 | 0.4848          | 0.0057                     | 0.2860 | 0.0836 |
| 0.0549        | 33.0  | 2277 | 0.4718          | 0.0057                     | 0.2701 | 0.0797 |
| 0.0523        | 34.0  | 2346 | 0.4618          | 0.0057                     | 0.2699 | 0.0792 |
| 0.049         | 35.0  | 2415 | 0.4866          | 0.0057                     | 0.2679 | 0.0785 |
| 0.052         | 36.0  | 2484 | 0.4683          | 0.0057                     | 0.2643 | 0.0781 |
| 0.0439        | 37.0  | 2553 | 0.4814          | 0.0057                     | 0.2627 | 0.0794 |
| 0.0411        | 38.0  | 2622 | 0.4893          | 0.0057                     | 0.2703 | 0.0786 |
| 0.0411        | 39.0  | 2691 | 0.4894          | 0.0057                     | 0.2724 | 0.0788 |
| 0.0408        | 40.0  | 2760 | 0.4604          | 0.0057                     | 0.2619 | 0.0781 |
| 0.0411        | 41.0  | 2829 | 0.4763          | 0.0057                     | 0.2572 | 0.0778 |
| 0.0391        | 42.0  | 2898 | 0.4911          | 0.0057                     | 0.2650 | 0.0778 |
| 0.0402        | 43.0  | 2967 | 0.4649          | 0.0057                     | 0.2650 | 0.0786 |
| 0.0363        | 44.0  | 3036 | 0.4865          | 0.0057                     | 0.2589 | 0.0784 |
| 0.0318        | 45.0  | 3105 | 0.5020          | 0.0057                     | 0.2585 | 0.0784 |
| 0.0319        | 46.0  | 3174 | 0.4973          | 0.0057                     | 0.2683 | 0.0785 |
| 0.0292        | 47.0  | 3243 | 0.5007          | 0.0057                     | 0.2585 | 0.0784 |
| 0.0294        | 48.0  | 3312 | 0.5059          | 0.0057                     | 0.2587 | 0.0768 |
| 0.0305        | 49.0  | 3381 | 0.5039          | 0.0057                     | 0.2665 | 0.0783 |
| 0.0309        | 50.0  | 3450 | 0.4968          | 0.0057                     | 0.2641 | 0.0783 |
| 0.0277        | 51.0  | 3519 | 0.5287          | 0.0057                     | 0.2659 | 0.0771 |
| 0.024         | 52.0  | 3588 | 0.4998          | 0.0057                     | 0.2558 | 0.0758 |
| 0.0243        | 53.0  | 3657 | 0.4997          | 0.0057                     | 0.2603 | 0.0760 |
| 0.0255        | 54.0  | 3726 | 0.4989          | 0.0057                     | 0.2563 | 0.0764 |
| 0.0233        | 55.0  | 3795 | 0.5191          | 0.0057                     | 0.2587 | 0.0768 |
| 0.0247        | 56.0  | 3864 | 0.5036          | 0.0057                     | 0.2487 | 0.0753 |
| 0.0224        | 57.0  | 3933 | 0.5060          | 0.0057                     | 0.2489 | 0.0750 |
| 0.024         | 58.0  | 4002 | 0.4976          | 0.0057                     | 0.2482 | 0.0755 |
| 0.0259        | 59.0  | 4071 | 0.5163          | 0.0057                     | 0.2587 | 0.0754 |
| 0.0217        | 60.0  | 4140 | 0.5191          | 0.0057                     | 0.2496 | 0.0747 |
| 0.0218        | 61.0  | 4209 | 0.5035          | 0.0057                     | 0.2502 | 0.0748 |
| 0.0199        | 62.0  | 4278 | 0.5277          | 0.0057                     | 0.2460 | 0.0741 |
| 0.0176        | 63.0  | 4347 | 0.5205          | 0.0057                     | 0.2442 | 0.0747 |
| 0.0189        | 64.0  | 4416 | 0.5152          | 0.0057                     | 0.2451 | 0.0744 |
| 0.0184        | 65.0  | 4485 | 0.5061          | 0.0057                     | 0.2399 | 0.0734 |
| 0.018         | 66.0  | 4554 | 0.5243          | 0.0057                     | 0.2422 | 0.0732 |
| 0.0186        | 67.0  | 4623 | 0.5180          | 0.0057                     | 0.2480 | 0.0751 |
| 0.0158        | 68.0  | 4692 | 0.5269          | 0.0057                     | 0.2449 | 0.0743 |
| 0.0159        | 69.0  | 4761 | 0.5240          | 0.0057                     | 0.2431 | 0.0750 |
| 0.0149        | 70.0  | 4830 | 0.5255          | 0.0057                     | 0.2413 | 0.0738 |
| 0.0132        | 71.0  | 4899 | 0.5365          | 0.0057                     | 0.2442 | 0.0744 |
| 0.0135        | 72.0  | 4968 | 0.5250          | 0.0057                     | 0.2433 | 0.0735 |
| 0.014         | 73.0  | 5037 | 0.5241          | 0.0057                     | 0.2442 | 0.0735 |
| 0.0144        | 74.0  | 5106 | 0.5267          | 0.0057                     | 0.2466 | 0.0735 |
| 0.0141        | 75.0  | 5175 | 0.5254          | 0.0057                     | 0.2471 | 0.0730 |
| 0.0125        | 76.0  | 5244 | 0.5269          | 0.0057                     | 0.2440 | 0.0730 |
| 0.0116        | 77.0  | 5313 | 0.5310          | 0.0057                     | 0.2424 | 0.0734 |
| 0.015         | 78.0  | 5382 | 0.5198          | 0.0057                     | 0.2419 | 0.0732 |
| 0.0113        | 79.0  | 5451 | 0.5275          | 0.0057                     | 0.2415 | 0.0733 |
| 0.0128        | 80.0  | 5520 | 0.5222          | 0.0057                     | 0.2431 | 0.0736 |
| 0.0114        | 81.0  | 5589 | 0.5249          | 0.0057                     | 0.2384 | 0.0731 |
| 0.01          | 82.0  | 5658 | 0.5235          | 0.0057                     | 0.2375 | 0.0726 |
| 0.0117        | 83.0  | 5727 | 0.5226          | 0.0057                     | 0.2408 | 0.0730 |
| 0.0105        | 84.0  | 5796 | 0.5205          | 0.0057                     | 0.2428 | 0.0738 |
| 0.0106        | 85.0  | 5865 | 0.5238          | 0.0057                     | 0.2384 | 0.0729 |
| 0.0109        | 86.0  | 5934 | 0.5210          | 0.0057                     | 0.2379 | 0.0727 |
| 0.01          | 87.0  | 6003 | 0.5256          | 0.0057                     | 0.2397 | 0.0731 |
| 0.01          | 88.0  | 6072 | 0.5305          | 0.0057                     | 0.2415 | 0.0735 |
| 0.0088        | 89.0  | 6141 | 0.5329          | 0.0057                     | 0.2390 | 0.0731 |
| 0.0098        | 90.0  | 6210 | 0.5328          | 0.0057                     | 0.2393 | 0.0738 |
| 0.0089        | 91.0  | 6279 | 0.5348          | 0.0057                     | 0.2393 | 0.0736 |
| 0.0084        | 92.0  | 6348 | 0.5358          | 0.0057                     | 0.2377 | 0.0732 |
| 0.0087        | 93.0  | 6417 | 0.5356          | 0.0057                     | 0.2388 | 0.0734 |
| 0.0104        | 94.0  | 6486 | 0.5338          | 0.0057                     | 0.2381 | 0.0732 |
| 0.01          | 95.0  | 6555 | 0.5344          | 0.0057                     | 0.2384 | 0.0734 |
| 0.0103        | 96.0  | 6624 | 0.5343          | 0.0057                     | 0.2381 | 0.0734 |
| 0.0091        | 97.0  | 6693 | 0.5346          | 0.0057                     | 0.2381 | 0.0734 |
| 0.0092        | 98.0  | 6762 | 0.5347          | 0.0057                     | 0.2381 | 0.0734 |
| 0.0098        | 99.0  | 6831 | 0.5348          | 0.0057                     | 0.2379 | 0.0734 |
| 0.0099        | 100.0 | 6900 | 0.5348          | 0.0057                     | 0.2381 | 0.0734 |

Framework versions

  • Transformers 4.43.1
  • Pytorch 2.2.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1