whisper-medium-ml-exp2

This model is a fine-tuned version of openai/whisper-medium on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2220
  • WER: 57.6922
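A minimal sketch of loading this checkpoint for transcription with the 🤗 Transformers ASR pipeline. The repo id `adalat-ai/whisper-medium-ml-exp2` is this card's model; `sample.wav` is a placeholder path, and the `chunk_length_s` setting is an assumption for long-form audio rather than a documented part of this model's training setup.

```python
def transcribe(audio_path: str,
               model_id: str = "adalat-ai/whisper-medium-ml-exp2") -> str:
    """Transcribe an audio file with this fine-tuned Whisper checkpoint."""
    # Import lazily so the function can be defined without transformers installed.
    from transformers import pipeline  # requires transformers + torch

    asr = pipeline(
        "automatic-speech-recognition",
        model=model_id,
        chunk_length_s=30,  # long-form audio is processed in 30 s chunks
    )
    return asr(audio_path)["text"]

# Example (downloads the model weights on first run):
# print(transcribe("sample.wav"))
```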

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 15000
  • mixed_precision_training: Native AMP
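The hyperparameters above can be expressed as keyword arguments in the Hugging Face `Trainer` convention (e.g. for `transformers.Seq2SeqTrainingArguments(**training_args)`). The exact training script for this run is not published, so this is a reconstruction, not the authors' code; `fp16=True` is assumed to correspond to the "Native AMP" entry.

```python
# Reconstruction of the reported hyperparameters in Trainer-style argument names.
training_args = {
    "learning_rate": 1e-5,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 32,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "lr_scheduler_type": "linear",
    "warmup_steps": 500,
    "max_steps": 15000,
    "fp16": True,  # "Native AMP" mixed-precision training
}

# The reported total_train_batch_size (32) is derived, not set directly:
# per-device batch size x gradient accumulation steps (single device assumed).
total_train_batch_size = (
    training_args["per_device_train_batch_size"]
    * training_args["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 32
```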

Training results

Training Loss Epoch Step Validation Loss WER
0.3119 0.0333 500 0.4193 79.0917
0.0405 0.0667 1000 0.4335 73.5830
0.0355 0.1 1500 0.4332 77.2785
0.126 0.1333 2000 0.1967 58.4021
0.0519 0.1667 2500 0.1861 58.5671
0.0439 0.2 3000 0.1942 57.4274
0.0534 1.0112 3500 0.1936 61.1497
0.0214 1.0445 4000 0.2253 59.7816
0.0129 1.0779 4500 0.2630 61.0614
0.048 1.1112 5000 0.1780 56.3606
0.047 1.1445 5500 0.1638 52.9951
0.0325 1.1779 6000 0.1683 54.5512
0.0293 1.2112 6500 0.1689 57.2451
0.028 2.0224 7000 0.2145 56.5237
0.009 2.0557 7500 0.2227 56.3068
0.0076 2.0891 8000 0.2750 66.0540
0.0385 2.1224 8500 0.2178 54.4514
0.0245 2.1557 9000 0.1721 52.0031
0.0226 2.1891 9500 0.1741 53.7511
0.0212 3.0003 10000 0.2001 56.1495
0.0121 3.0336 10500 0.2322 55.4722
0.0042 3.0669 11000 0.2403 57.6864
0.0059 3.1003 11500 0.2953 64.0067
0.0248 3.1336 12000 0.1744 51.3412
0.0172 3.1669 12500 0.1872 53.5324
0.015 3.2003 13000 0.1930 54.7028
0.0158 4.0115 13500 0.2173 60.9636
0.0028 4.0448 14000 0.2330 53.4921
0.0028 4.0781 14500 0.2415 53.4767
0.0194 4.1115 15000 0.2220 57.6922
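The WER column above is the word error rate: the word-level edit distance (substitutions + deletions + insertions) divided by the number of reference words, times 100. A self-contained sketch of that computation (the actual evaluation likely used a library such as `evaluate` or `jiwer`, which also normalize text; this illustration skips normalization):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length, in percent."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words, one rolling row.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(
                d[j] + 1,                               # deletion
                d[j - 1] + 1,                           # insertion
                prev + (ref[i - 1] != hyp[j - 1]),      # substitution (or match)
            )
            prev = cur
    return 100.0 * d[len(hyp)] / len(ref)

# One substitution ("sat" -> "sit") and one deletion ("the") over 6 words:
print(round(wer("the cat sat on the mat", "the cat sit on mat"), 2))  # 33.33
```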

Framework versions

  • Transformers 4.51.1
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1