whisper_input_decoder_shift_r_labels_no_force__0010

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. At the final training epoch it achieves the following results on the training and validation sets (a minimal usage sketch follows the list):

  • Train Loss: 3.6757
  • Train Accuracy: 0.0136
  • Train Wermet: 0.7548
  • Validation Loss: 3.3141
  • Validation Accuracy: 0.0116
  • Validation Wermet: 0.8400
  • Epoch: 9
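
The card does not include a usage example, so the following is a minimal, hedged inference sketch. It assumes the checkpoint loads with the TensorFlow Whisper classes from Transformers and that input audio is a mono 16 kHz float array; the silent array and the decoding call are placeholders, not the author's evaluation code.

```python
# Minimal inference sketch (not part of the original card). Assumes the
# checkpoint loads with the TensorFlow Whisper classes and that input audio
# is a mono float array sampled at 16 kHz; the silent array is a placeholder.
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "bigmorning/whisper_input_decoder_shift_r_labels_no_force__0010"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

audio = np.zeros(16000, dtype=np.float32)  # placeholder: 1 second of silence

inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```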

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative reconstruction of the optimizer follows the list):

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
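
The optimizer entry above is a raw Keras config dict. As a rough guide only, it can be rebuilt with the AdamWeightDecay class from Transformers' TensorFlow utilities; the original training script is not published, so treat this as an assumption rather than the author's code.

```python
# Illustrative reconstruction of the logged optimizer config; the original
# training script is not published, so this mapping is an assumption.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# With a TF model this would typically be passed to model.compile(optimizer=optimizer).
```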

Training results

Train Loss  Train Accuracy  Train Wermet  Validation Loss  Validation Accuracy  Validation Wermet  Epoch
5.6348      0.0091          1.5865        4.2935           0.0093               0.9579             0
4.9212      0.0099          0.9054        4.1262           0.0097               0.9390             1
4.6819      0.0107          0.8319        3.9071           0.0103               0.8966             2
4.4443      0.0114          0.8310        3.7367           0.0106               0.8939             3
4.2479      0.0119          0.8226        3.6101           0.0109               0.8696             4
4.0911      0.0124          0.8103        3.5364           0.0110               0.8946             5
3.9590      0.0127          0.7913        3.4556           0.0113               0.8388             6
3.8513      0.0130          0.7794        3.4106           0.0114               0.8515             7
3.7607      0.0133          0.7657        3.3507           0.0115               0.8261             8
3.6757      0.0136          0.7548        3.3141           0.0116               0.8400             9
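
The "Wermet" columns track a word-error-rate style metric, but the exact implementation used during training is not documented in this card. For reference, the sketch below shows an equivalent WER computation with the `evaluate` library; the predictions and references are placeholders.

```python
# Reference WER computation; "Wermet" in the table is assumed to be a
# word-error-rate metric, but the training-time implementation is unknown.
import evaluate

wer = evaluate.load("wer")
predictions = ["hello world"]        # placeholder: model transcriptions
references = ["hello there world"]   # placeholder: reference transcripts
print(wer.compute(predictions=predictions, references=references))
```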

Framework versions

  • Transformers 4.34.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3