wav2vec2-E30_speed_pause

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6351
  • CER: 61.7540
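CER (character error rate) is the character-level edit distance between the reference transcript and the model output, divided by the reference length, here reported as a percentage. A minimal sketch of the metric (a plain Levenshtein implementation, not the exact code used during evaluation):

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate in percent: edit distance / reference length * 100."""
    # Standard dynamic-programming edit distance over characters.
    prev = list(range(len(hypothesis) + 1))
    for i, rc in enumerate(reference, 1):
        cur = [i]
        for j, hc in enumerate(hypothesis, 1):
            cur.append(min(
                prev[j] + 1,              # deletion
                cur[j - 1] + 1,           # insertion
                prev[j - 1] + (rc != hc)  # substitution (or match)
            ))
        prev = cur
    return 100.0 * prev[-1] / len(reference)
```

A CER above 100 is possible when the hypothesis contains more insertions than the reference has characters, which is why the early training checkpoints below plateau at 100.0.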

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 3
  • mixed_precision_training: Native AMP
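The linear scheduler with 50 warmup steps ramps the learning rate from 0 to 1e-4 over the first 50 steps, then decays it linearly toward 0 over the remaining steps. A minimal sketch of that schedule (the total step count of 4600 is taken from the results table below; the exact decay endpoint used by the trainer may differ slightly):

```python
def linear_warmup_lr(step: int,
                     base_lr: float = 1e-4,
                     warmup_steps: int = 50,
                     total_steps: int = 4600) -> float:
    """Linear warmup followed by linear decay, as in the 'linear' scheduler."""
    if step < warmup_steps:
        # Ramp up proportionally during warmup.
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr at end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```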

Training results

| Training Loss | Epoch  | Step | Validation Loss | CER     |
|---------------|--------|------|-----------------|---------|
| 32.976        | 0.1289 | 200  | 5.1291          | 100.0   |
| 4.9784        | 0.2579 | 400  | 4.7411          | 100.0   |
| 4.8371        | 0.3868 | 600  | 4.7057          | 100.0   |
| 4.7941        | 0.5158 | 800  | 4.7901          | 100.0   |
| 4.7063        | 0.6447 | 1000 | 4.7698          | 100.0   |
| 4.6822        | 0.7737 | 1200 | 4.6661          | 100.0   |
| 4.654         | 0.9026 | 1400 | 4.5412          | 100.0   |
| 4.5661        | 1.0316 | 1600 | 4.4702          | 99.9765 |
| 4.4285        | 1.1605 | 1800 | 4.3777          | 98.7253 |
| 4.1372        | 1.2895 | 2000 | 4.1695          | 97.1452 |
| 3.8557        | 1.4184 | 2200 | 3.8556          | 73.7664 |
| 3.5494        | 1.5474 | 2400 | 3.6468          | 70.5298 |
| 3.2487        | 1.6763 | 2600 | 3.3393          | 68.9086 |
| 3.0583        | 1.8053 | 2800 | 3.1815          | 67.4636 |
| 2.9404        | 1.9342 | 3000 | 3.1664          | 67.4283 |
| 2.8           | 2.0632 | 3200 | 2.9573          | 64.5266 |
| 2.6974        | 2.1921 | 3400 | 2.8951          | 63.7747 |
| 2.6229        | 2.3211 | 3600 | 2.8758          | 64.7909 |
| 2.5551        | 2.4500 | 3800 | 2.8143          | 63.4633 |
| 2.4771        | 2.5790 | 4000 | 2.7618          | 63.8628 |
| 2.4412        | 2.7079 | 4200 | 2.7518          | 64.2094 |
| 2.375         | 2.8369 | 4400 | 2.6586          | 62.5235 |
| 2.3835        | 2.9658 | 4600 | 2.6351          | 61.7540 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.19.1