SpeechT5 TTS Igbo Yoruba

This model is a fine-tuned version of microsoft/speecht5_tts on the naija_tts_concatenated dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4441

Model description

This is a SpeechT5 text-to-speech model (about 144M parameters, stored as float32 safetensors) fine-tuned from microsoft/speecht5_tts for Igbo and Yoruba speech synthesis on the naija_tts_concatenated dataset. Further details about the fine-tuning setup have not been provided.

Intended uses & limitations

The model is intended for text-to-speech generation in Igbo and Yoruba; a usage sketch follows below. Its limitations (speaker coverage, domain robustness, pronunciation quality) have not been documented.
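
Below is a minimal inference sketch using the standard transformers SpeechT5 pipeline. The speaker x-vector is a stand-in taken from the CMU ARCTIC set (Matthijs/cmu-arctic-xvectors), since the speaker embeddings used during fine-tuning are not documented in this card.

```python
import torch
import soundfile as sf
from datasets import load_dataset
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

model_id = "ccibeekeoc42/speecht5_finetuned_naija_ig_yo_2024-12-20"
processor = SpeechT5Processor.from_pretrained(model_id)
model = SpeechT5ForTextToSpeech.from_pretrained(model_id)
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# SpeechT5 conditions on a speaker embedding. This x-vector is a stand-in;
# the embeddings used for fine-tuning are not documented in this card.
xvectors = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = torch.tensor(xvectors[7306]["xvector"]).unsqueeze(0)

inputs = processor(text="Kedu ka i mere?", return_tensors="pt")  # Igbo: "How are you?"
speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)

# SpeechT5 generates 16 kHz audio.
sf.write("output.wav", speech.numpy(), samplerate=16000)
```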

Training and evaluation data

The model was fine-tuned and evaluated on the naija_tts_concatenated dataset; details about its composition, size, and splits have not been provided.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 4000
  • mixed_precision_training: Native AMP
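
As a reference, the sketch below shows how these hyperparameters map onto transformers' Seq2SeqTrainingArguments. The output_dir and the 100-step evaluation cadence (taken from the results table below) are assumptions; the model, data collator, and dataset wiring of the original run are not reproduced here.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_finetuned_naija_ig_yo",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # total train batch size: 16 * 2 = 32
    warmup_steps=500,
    max_steps=4000,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    optim="adamw_torch",             # betas=(0.9, 0.999) and eps=1e-8 are the defaults
    eval_strategy="steps",
    eval_steps=100,                  # matches the 100-step cadence in the results table
    logging_steps=100,
)
```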

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.5732        | 0.1576 | 100  | 0.7186          |
| 1.4622        | 0.3152 | 200  | 0.6618          |
| 1.3725        | 0.4728 | 300  | 0.5930          |
| 1.19          | 0.6304 | 400  | 0.5175          |
| 1.1514        | 0.7880 | 500  | 0.5009          |
| 1.0837        | 0.9456 | 600  | 0.4875          |
| 1.0747        | 1.1024 | 700  | 0.4842          |
| 1.102         | 1.2600 | 800  | 0.4790          |
| 1.0613        | 1.4177 | 900  | 0.4723          |
| 1.0619        | 1.5753 | 1000 | 0.4685          |
| 1.0256        | 1.7329 | 1100 | 0.4646          |
| 1.0282        | 1.8905 | 1200 | 0.4637          |
| 1.0212        | 2.0473 | 1300 | 0.4633          |
| 1.001         | 2.2049 | 1400 | 0.4606          |
| 1.0224        | 2.3625 | 1500 | 0.4573          |
| 0.9996        | 2.5201 | 1600 | 0.4551          |
| 0.9838        | 2.6777 | 1700 | 0.4553          |
| 0.978         | 2.8353 | 1800 | 0.4547          |
| 0.9915        | 2.9929 | 1900 | 0.4525          |
| 0.9887        | 3.1497 | 2000 | 0.4526          |
| 1.0082        | 3.3073 | 2100 | 0.4523          |
| 0.9833        | 3.4649 | 2200 | 0.4485          |
| 0.9776        | 3.6225 | 2300 | 0.4484          |
| 0.9854        | 3.7801 | 2400 | 0.4492          |
| 0.9959        | 3.9377 | 2500 | 0.4493          |
| 0.9612        | 4.0946 | 2600 | 0.4467          |
| 0.9797        | 4.2522 | 2700 | 0.4467          |
| 0.9482        | 4.4098 | 2800 | 0.4458          |
| 0.9651        | 4.5674 | 2900 | 0.4443          |
| 0.9519        | 4.7250 | 3000 | 0.4455          |
| 0.9695        | 4.8826 | 3100 | 0.4446          |
| 0.9726        | 5.0394 | 3200 | 0.4439          |
| 0.9644        | 5.1970 | 3300 | 0.4454          |
| 0.9719        | 5.3546 | 3400 | 0.4440          |
| 0.9711        | 5.5122 | 3500 | 0.4437          |
| 0.9367        | 5.6698 | 3600 | 0.4437          |
| 0.961         | 5.8274 | 3700 | 0.4424          |
| 0.9524        | 5.9850 | 3800 | 0.4435          |
| 0.9732        | 6.1418 | 3900 | 0.4432          |
| 0.9854        | 6.2994 | 4000 | 0.4441          |

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0