wav2vec2-E_

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7138
  • CER: 15.1116
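CER (character error rate) is the character-level Levenshtein edit distance between the predicted and reference transcripts, divided by the reference length, reported here as a percentage. A minimal sketch of the computation (pure Python, for illustration; in practice a library such as `jiwer` or the `evaluate` package is typically used):

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: Levenshtein distance over reference length, in percent."""
    ref, hyp = list(reference), list(hypothesis)
    # Classic dynamic-programming edit distance over characters.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            curr.append(min(
                prev[j] + 1,              # deletion
                curr[j - 1] + 1,          # insertion
                prev[j - 1] + (r != h),   # substitution
            ))
        prev = curr
    return 100.0 * prev[-1] / max(len(ref), 1)

print(round(cer("hello world", "helo world"), 2))  # one deletion over 11 chars -> 9.09
```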

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 3
  • mixed_precision_training: Native AMP
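The linear scheduler with 50 warmup steps ramps the learning rate from 0 up to 1e-4 over the first 50 optimizer steps, then decays it linearly back to 0 at the final step. A pure-Python sketch of that schedule (the `total_steps` value is approximate, taken from the last logged step in the table below):

```python
def linear_lr(step: int, base_lr: float = 1e-4,
              warmup_steps: int = 50, total_steps: int = 4600) -> float:
    """Linear warmup followed by linear decay, mirroring lr_scheduler_type='linear'."""
    if step < warmup_steps:
        # Warmup: ramp linearly from 0 to base_lr.
        return base_lr * step / warmup_steps
    # Decay: linearly from base_lr at the end of warmup to 0 at total_steps.
    remaining = max(total_steps - step, 0)
    return base_lr * remaining / max(total_steps - warmup_steps, 1)
```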

Training results

| Training Loss | Epoch  | Step | Validation Loss | CER     |
|---------------|--------|------|-----------------|---------|
| 29.9637       | 0.1289 | 200  | 4.9791          | 100.0   |
| 4.6123        | 0.2579 | 400  | 4.6053          | 100.0   |
| 4.546         | 0.3868 | 600  | 4.5594          | 97.9671 |
| 4.49          | 0.5158 | 800  | 4.5282          | 97.9671 |
| 4.4469        | 0.6447 | 1000 | 4.4909          | 97.9671 |
| 4.3231        | 0.7737 | 1200 | 4.3157          | 89.9412 |
| 3.7783        | 0.9026 | 1400 | 3.3710          | 58.4195 |
| 2.7113        | 1.0316 | 1600 | 2.4292          | 42.7321 |
| 2.0971        | 1.1605 | 1800 | 2.0556          | 38.2197 |
| 1.7684        | 1.2895 | 2000 | 1.6070          | 29.8472 |
| 1.5046        | 1.4184 | 2200 | 1.4542          | 27.2268 |
| 1.3672        | 1.5474 | 2400 | 1.2693          | 24.2656 |
| 1.2279        | 1.6763 | 2600 | 1.1761          | 23.5253 |
| 1.1456        | 1.8053 | 2800 | 1.0789          | 21.2338 |
| 1.0516        | 1.9342 | 3000 | 0.9792          | 19.9471 |
| 0.9609        | 2.0632 | 3200 | 0.9519          | 19.6592 |
| 0.8758        | 2.1921 | 3400 | 0.8677          | 18.0082 |
| 0.8289        | 2.3211 | 3600 | 0.8783          | 18.0082 |
| 0.783         | 2.4500 | 3800 | 0.7990          | 17.0682 |
| 0.7719        | 2.5790 | 4000 | 0.7492          | 15.7168 |
| 0.748         | 2.7079 | 4200 | 0.7483          | 16.0400 |
| 0.7377        | 2.8369 | 4400 | 0.7291          | 15.4877 |
| 0.7188        | 2.9658 | 4600 | 0.7138          | 15.1116 |

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.5.1+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.3