wav2vec2-large-xlsr-coraa-exp-11

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set (see the usage sketch after this list):

  • Loss: 8.9926
  • WER: 0.9866
  • CER: 0.9323
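
For context, a minimal transcription sketch is shown below, assuming the checkpoint follows the standard transformers Wav2Vec2 CTC interface; the repository id and the audio path "example.wav" are placeholders, not values from this card. Given the reported WER near 0.99, transcripts from this checkpoint should be expected to be largely incorrect.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repository id; substitute the actual model location.
model_id = "wav2vec2-large-xlsr-coraa-exp-11"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate XLSR models expect.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder file
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame, then
# collapse repeats and drop blanks via the tokenizer.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```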

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP
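
As a hedged reconstruction (not the original training script), these settings map onto the transformers Trainer API roughly as follows; the output directory is a placeholder, and model, dataset, and data-collator setup are omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xlsr-coraa-exp-11",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 * 2 = 32 total train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,  # "Native AMP" mixed-precision training
)
```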

Training results

Training Loss   Epoch   Step   Validation Loss   WER      CER
38.5161         1.0     14     34.2489           1.0      0.9510
38.5161         2.0     28     23.3869           1.0      0.9510
38.5161         3.0     42     19.6721           1.0      0.9510
38.5161         4.0     56     18.3735           1.0      0.9510
38.5161         5.0     70     17.5507           1.0026   0.9496
38.5161         6.0     84     16.9340           1.0738   0.9688
38.5161         7.0     98     17.3229           1.0004   0.9511
17.5323         8.0     112    16.4594           1.0156   0.9314
17.5323         9.0     126    12.4451           1.0299   0.9352
17.5323         10.0    140    10.0922           1.0      0.9619
17.5323         11.0    154    9.5186            0.9998   0.9618
17.5323         12.0    168    8.9926            0.9866   0.9323
17.5323         13.0    182    9.0185            0.9839   0.9167
17.5323         14.0    196    9.1242            0.9837   0.9216
6.6506          15.0    210    9.0501            0.9880   0.8844
6.6506          16.0    224    9.1892            0.9777   0.9022
6.6506          17.0    238    9.1733            0.9799   0.8847
6.6506          18.0    252    9.3033            0.9799   0.8733
6.6506          19.0    266    9.2853            0.9746   0.8990
6.6506          20.0    280    9.4380            0.9748   0.9086
6.6506          21.0    294    9.5132            0.9750   0.8900
3.6568          22.0    308    9.6268            0.9817   0.8811
3.6568          23.0    322    9.6989            1.0043   0.8847
3.6568          24.0    336    9.6113            0.9789   0.8963
3.6568          25.0    350    9.7947            0.9807   0.8924
3.6568          26.0    364    9.8381            0.9795   0.8979
3.6568          27.0    378    10.0306           0.9789   0.8952
3.6568          28.0    392    9.9950            0.9793   0.8947
3.316           29.0    406    10.1488           0.9781   0.8979
3.316           30.0    420    10.1934           0.9809   0.9092
3.316           31.0    434    10.2146           0.9880   0.9299
3.316           32.0    448    10.2985           0.9998   0.9593
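
The WER and CER columns can be reproduced with the Hugging Face evaluate library. A small sketch follows; the reference and prediction strings are hypothetical stand-ins for ground-truth transcripts and decoded model outputs.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical examples; in practice these come from the evaluation set
# and from decoding the fine-tuned model's predictions.
references = ["o gato subiu no telhado"]
predictions = ["o gato subiu telhado"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```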

Framework versions

  • Transformers 4.28.0
  • PyTorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3