Whisper Large-V3-Turbo Basque

This model is a fine-tuned version of openai/whisper-large-v3-turbo on the Basque (eu) subset of the mozilla-foundation/common_voice_17_0 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2907
  • WER: 7.3546

Model description

This is openai/whisper-large-v3-turbo fine-tuned for Basque automatic speech recognition on Common Voice 17.0. The architecture and tokenizer are unchanged from the base model; only the weights were updated during fine-tuning.

Intended uses & limitations

The model is intended for transcribing Basque speech. Since it was fine-tuned on read speech from Common Voice, accuracy may degrade on spontaneous, noisy, or otherwise out-of-domain audio, and on languages other than Basque.
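
The snippet below is a minimal usage sketch, assuming the transformers pipeline API and the checkpoint id shown on this page; the audio path is a placeholder.

```python
import torch
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint for Basque ASR.
asr = pipeline(
    "automatic-speech-recognition",
    model="zuazo/whisper-large-v3-turbo-eu-cv17_0",
    device="cuda:0" if torch.cuda.is_available() else "cpu",
)

# "audio.wav" is a placeholder path to a local recording.
result = asr("audio.wav", generate_kwargs={"language": "basque", "task": "transcribe"})
print(result["text"])
```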

Training and evaluation data

The model was trained and evaluated on the Basque (eu) subset of mozilla-foundation/common_voice_17_0.
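
A minimal sketch of loading the same Common Voice split with the datasets library. Note that Common Voice requires accepting its terms of use on the Hugging Face Hub, and that older datasets versions may additionally require trust_remote_code=True for its script-based loader.

```python
from datasets import load_dataset

# Load the Basque test split of Common Voice 17.0 (terms acceptance required).
common_voice = load_dataset("mozilla-foundation/common_voice_17_0", "eu", split="test")

# Each example carries the reference transcript under "sentence".
print(common_voice[0]["sentence"])
```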

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a corresponding configuration sketch follows the list):

  • learning_rate: 3.75e-05
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 40000
  • mixed_precision_training: Native AMP
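
For reference, here is a sketch of how these hyperparameters could be expressed as transformers Seq2SeqTrainingArguments. The output_dir is an illustrative placeholder, not taken from the original training script.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: maps the hyperparameters listed above onto training arguments.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v3-turbo-eu",  # placeholder
    learning_rate=3.75e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 64
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=40000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```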

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER     |
|---------------|---------|-------|-----------------|---------|
| 0.1123        | 2.3474  | 1000  | 0.2163          | 13.2370 |
| 0.067         | 4.6948  | 2000  | 0.2017          | 11.0319 |
| 0.0327        | 7.0423  | 3000  | 0.2156          | 10.7736 |
| 0.0243        | 9.3897  | 4000  | 0.2262          | 10.4190 |
| 0.0198        | 11.7371 | 5000  | 0.2324          | 10.5500 |
| 0.0129        | 14.0845 | 6000  | 0.2427          | 10.0645 |
| 0.0133        | 16.4319 | 7000  | 0.2425          | 9.9225  |
| 0.0123        | 18.7793 | 8000  | 0.2540          | 10.3851 |
| 0.0085        | 21.1268 | 9000  | 0.2441          | 9.5423  |
| 0.0091        | 23.4742 | 10000 | 0.2557          | 9.7585  |
| 0.0076        | 25.8216 | 11000 | 0.2545          | 9.6660  |
| 0.0062        | 28.1690 | 12000 | 0.2590          | 9.5139  |
| 0.0044        | 30.5164 | 13000 | 0.2605          | 9.2290  |
| 0.0046        | 32.8638 | 14000 | 0.2680          | 9.5588  |
| 0.0057        | 35.2113 | 15000 | 0.2734          | 9.8859  |
| 0.0041        | 37.5587 | 16000 | 0.2703          | 9.3087  |
| 0.0057        | 39.9061 | 17000 | 0.2607          | 9.1566  |
| 0.002         | 42.2535 | 18000 | 0.2685          | 8.8516  |
| 0.0034        | 44.6009 | 19000 | 0.2780          | 9.5964  |
| 0.0023        | 46.9484 | 20000 | 0.2801          | 9.3298  |
| 0.0022        | 49.2958 | 21000 | 0.2728          | 9.0174  |
| 0.0015        | 51.6432 | 22000 | 0.2747          | 8.9532  |
| 0.001         | 53.9906 | 23000 | 0.2745          | 8.4649  |
| 0.0018        | 56.3380 | 24000 | 0.2807          | 9.1026  |
| 0.0013        | 58.6854 | 25000 | 0.2686          | 8.4915  |
| 0.0009        | 61.0329 | 26000 | 0.2767          | 8.3010  |
| 0.0012        | 63.3803 | 27000 | 0.2850          | 8.8241  |
| 0.0012        | 65.7277 | 28000 | 0.2830          | 8.6573  |
| 0.0002        | 68.0751 | 29000 | 0.2806          | 8.3165  |
| 0.0007        | 70.4225 | 30000 | 0.2802          | 8.5419  |
| 0.0006        | 72.7700 | 31000 | 0.2823          | 8.3532  |
| 0.0002        | 75.1174 | 32000 | 0.2774          | 8.2231  |
| 0.0003        | 77.4648 | 33000 | 0.2807          | 8.3120  |
| 0.0001        | 79.8122 | 34000 | 0.2834          | 7.9043  |
| 0.0           | 82.1596 | 35000 | 0.2821          | 7.7192  |
| 0.0           | 84.5070 | 36000 | 0.2841          | 7.5442  |
| 0.0           | 86.8545 | 37000 | 0.2865          | 7.4645  |
| 0.0           | 89.2019 | 38000 | 0.2885          | 7.4306  |
| 0.0           | 91.5493 | 39000 | 0.2900          | 7.3665  |
| 0.0           | 93.8967 | 40000 | 0.2907          | 7.3546  |
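
The WER figures above are percentages. Below is a minimal sketch of how such a score can be computed with the evaluate library; the example sentences are hypothetical.

```python
import evaluate

# The "wer" metric returns a 0-1 ratio; multiply by 100 to match the table.
wer_metric = evaluate.load("wer")

predictions = ["kaixo mundua"]       # hypothetical model transcription
references = ["kaixo mundu guztia"]  # hypothetical reference transcript

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```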

Framework versions

  • Transformers 4.52.3
  • PyTorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1