whisper-small-yue

This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3647
  • Wer: 73.2309
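Wer here is the word error rate in percent; it can exceed 100 when the hypothesis contains more insertions than the reference has words, which is why some early-training rows in the results table are above 100. A minimal word-level sketch of the metric (for Cantonese, evaluation is often done at the character level instead; this helper is illustrative, not the card's actual evaluation code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference length, in percent."""
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution or match
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)
```

For example, a one-word reference against a three-word hypothesis yields a WER of 200%, since two insertions are counted against a single reference word.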

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 4000
  • mixed_precision_training: Native AMP
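With the linear scheduler and 500 warmup steps, the learning rate ramps from 0 to the 1e-05 peak over the first 500 steps, then decays linearly to 0 at step 4000. A small sketch of that shape (mirroring the behavior of Hugging Face's get_linear_schedule_with_warmup; the function name below is illustrative):

```python
def linear_warmup_lr(step: int, peak_lr: float = 1e-05,
                     warmup_steps: int = 500, total_steps: int = 4000) -> float:
    """Linear warmup to peak_lr, then linear decay to zero at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

At step 250 the rate is half the peak (5e-06), at step 500 it reaches 1e-05, and by step 4000 it has decayed back to zero.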

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer      |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.9266        | 0.15  | 50   | 2.6136          | 142.9438 |
| 1.0635        | 0.3   | 100  | 0.5730          | 216.4982 |
| 0.373         | 0.45  | 150  | 0.3679          | 319.4501 |
| 0.3004        | 0.6   | 200  | 0.3362          | 170.2386 |
| 0.3078        | 0.76  | 250  | 0.3206          | 118.7626 |
| 0.2687        | 0.91  | 300  | 0.3075          | 93.4897  |
| 0.204         | 1.06  | 350  | 0.3017          | 74.0801  |
| 0.1954        | 1.21  | 400  | 0.3015          | 77.5981  |
| 0.1857        | 1.36  | 450  | 0.2981          | 76.9106  |
| 0.1833        | 1.51  | 500  | 0.3029          | 75.3336  |
| 0.198         | 1.66  | 550  | 0.2972          | 76.8702  |
| 0.1902        | 1.81  | 600  | 0.2883          | 68.8233  |
| 0.196         | 1.96  | 650  | 0.2854          | 68.7829  |
| 0.0732        | 2.11  | 700  | 0.2895          | 69.9151  |
| 0.07          | 2.27  | 750  | 0.2939          | 72.2604  |
| 0.0853        | 2.42  | 800  | 0.3137          | 74.4440  |
| 0.0788        | 2.57  | 850  | 0.3133          | 71.3304  |
| 0.0713        | 2.72  | 900  | 0.2974          | 70.6025  |
| 0.0773        | 2.87  | 950  | 0.2884          | 75.2527  |
| 0.072         | 3.02  | 1000 | 0.2923          | 69.8342  |
| 0.0418        | 3.17  | 1050 | 0.2948          | 73.5544  |
| 0.0318        | 3.32  | 1100 | 0.2965          | 74.2822  |
| 0.0298        | 3.47  | 1150 | 0.3002          | 69.4703  |
| 0.036         | 3.63  | 1200 | 0.3004          | 70.1577  |
| 0.0307        | 3.78  | 1250 | 0.3015          | 74.3631  |
| 0.0403        | 3.93  | 1300 | 0.3000          | 71.4517  |
| 0.0152        | 4.08  | 1350 | 0.3117          | 68.7829  |
| 0.0168        | 4.23  | 1400 | 0.3140          | 71.3708  |
| 0.0136        | 4.38  | 1450 | 0.3225          | 68.1359  |
| 0.0145        | 4.53  | 1500 | 0.3244          | 70.6025  |
| 0.0188        | 4.68  | 1550 | 0.3217          | 72.0178  |
| 0.0202        | 4.83  | 1600 | 0.3087          | 70.8047  |
| 0.0184        | 4.98  | 1650 | 0.3225          | 72.6648  |
| 0.0082        | 5.14  | 1700 | 0.3285          | 70.2386  |
| 0.0066        | 5.29  | 1750 | 0.3288          | 71.0473  |
| 0.0091        | 5.44  | 1800 | 0.3370          | 68.2572  |
| 0.0063        | 5.59  | 1850 | 0.3299          | 71.5730  |
| 0.0093        | 5.74  | 1900 | 0.3311          | 69.3490  |
| 0.0079        | 5.89  | 1950 | 0.3456          | 75.4953  |
| 0.0087        | 6.04  | 2000 | 0.3357          | 77.5576  |
| 0.0052        | 6.19  | 2050 | 0.3364          | 79.6199  |
| 0.005         | 6.34  | 2100 | 0.3376          | 75.1314  |
| 0.0032        | 6.5   | 2150 | 0.3336          | 75.3740  |
| 0.004         | 6.65  | 2200 | 0.3441          | 74.7271  |
| 0.003         | 6.8   | 2250 | 0.3446          | 72.8265  |
| 0.0029        | 6.95  | 2300 | 0.3483          | 70.5216  |
| 0.002         | 7.1   | 2350 | 0.3532          | 74.2822  |
| 0.0014        | 7.25  | 2400 | 0.3469          | 73.5948  |
| 0.0015        | 7.4   | 2450 | 0.3494          | 75.9806  |
| 0.001         | 7.55  | 2500 | 0.3549          | 75.1314  |
| 0.0053        | 7.7   | 2550 | 0.3545          | 73.2309  |
| 0.0025        | 7.85  | 2600 | 0.3563          | 72.7457  |
| 0.001         | 8.01  | 2650 | 0.3562          | 73.5140  |
| 0.001         | 8.16  | 2700 | 0.3594          | 75.1719  |
| 0.0012        | 8.31  | 2750 | 0.3610          | 72.7052  |
| 0.0007        | 8.46  | 2800 | 0.3611          | 72.9478  |
| 0.0016        | 8.61  | 2850 | 0.3629          | 72.8265  |
| 0.0007        | 8.76  | 2900 | 0.3633          | 73.7161  |
| 0.0015        | 8.91  | 2950 | 0.3647          | 73.2309  |

Framework versions

  • Transformers 4.37.0.dev0
  • Pytorch 1.12.1
  • Datasets 2.15.0
  • Tokenizers 0.15.0

Model size

  • 242M params (F32, Safetensors)

Model tree for Chenxi-Chelsea-Liu/whisper-small-yue

  • Finetuned from openai/whisper-small