---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-large-v3
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Moroccan-Darija-STT-large-v1.6.9
  results: []
---
# Moroccan-Darija-STT-large-v1.6.9
This model is a fine-tuned version of openai/whisper-large-v3 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4739
- Wer: 100.3430
- Cer: 60.7571
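A WER above 100% is possible because the metric counts substitutions, deletions, and insertions against the reference length, so a hypothesis with many inserted words can exceed 100. A minimal stdlib-only sketch of the metric (the example strings are hypothetical, not from the evaluation set):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard Levenshtein edit-distance DP over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)

# Three inserted words against a two-word reference -> 150% WER.
print(wer("hello world", "ah hello there world yes"))  # 150.0
```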
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 6
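The reported total train batch size follows from the per-device batch size and gradient accumulation (a single device is assumed here, matching the reported total of 256); a quick check of the arithmetic:

```python
# Effective batch size = per-device train batch size x gradient accumulation steps.
train_batch_size = 16
gradient_accumulation_steps = 16
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 256
```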
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer      | Cer      |
|:-------------:|:------:|:----:|:---------------:|:--------:|:--------:|
| 0.8087        | 0.2947 | 30   | 0.3760          | 224.4562 | 178.8136 |
| 0.6175        | 0.5893 | 60   | 0.3120          | 81.7436  | 39.5368  |
| 0.5598        | 0.8840 | 90   | 0.3042          | 78.0037  | 36.8174  |
| 0.4367        | 1.1866 | 120  | 0.3054          | 73.2597  | 32.8649  |
| 0.405         | 1.4813 | 150  | 0.3193          | 79.3507  | 38.3612  |
| 0.3715        | 1.7759 | 180  | 0.3233          | 84.3624  | 43.0400  |
| 0.3028        | 2.0786 | 210  | 0.3385          | 88.1108  | 47.6260  |
| 0.2854        | 2.3732 | 240  | 0.3553          | 81.4592  | 41.0249  |
| 0.2514        | 2.6679 | 270  | 0.3605          | 92.2440  | 52.2186  |
| 0.2282        | 2.9626 | 300  | 0.3705          | 90.5120  | 50.7120  |
| 0.1739        | 3.2652 | 330  | 0.3970          | 80.0954  | 41.1854  |
| 0.1539        | 3.5599 | 360  | 0.4029          | 101.3805 | 58.6862  |
| 0.137         | 3.8545 | 390  | 0.4099          | 93.1141  | 52.6663  |
| 0.1125        | 4.1572 | 420  | 0.4345          | 95.6242  | 56.3637  |
| 0.1025        | 4.4518 | 450  | 0.4402          | 87.0231  | 47.2088  |
| 0.0927        | 4.7465 | 480  | 0.4484          | 100.4853 | 60.9023  |
| 0.0825        | 5.0491 | 510  | 0.4642          | 102.8029 | 62.0914  |
| 0.0722        | 5.3438 | 540  | 0.4702          | 104.8109 | 64.8869  |
| 0.0711        | 5.6384 | 570  | 0.4733          | 96.8039  | 57.2995  |
| 0.0693        | 5.9331 | 600  | 0.4739          | 100.3430 | 60.7571  |
### Framework versions
- Transformers 4.48.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0