# wav2vec2-xls-r-1b-scandinavian-E4-100h-30-epochs-20250207_v8
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: nan
- Wer: 100.0
- Cer: 99.9985

Note that the final loss is NaN and the WER is 100%: as the training log below shows, training diverged around step 16000 (epoch ~12.5), after which the training loss collapsed to 0.0 and the model no longer produced usable transcriptions. The best result before the divergence was a WER of 21.04 / CER of 5.87 at step 8000.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 30
- mixed_precision_training: Native AMP
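The hyperparameters above roughly correspond to a `transformers.TrainingArguments` configuration like the one sketched below. This is an illustrative reconstruction, not the actual training script (which is not included in this card); the `output_dir` name is a placeholder.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-1b-scandinavian-E4-100h",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="cosine",
    warmup_steps=5000,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```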
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|---|---|---|---|---|---|
1.1587 | 0.7819 | 1000 | inf | 38.7257 | 10.7992 |
0.5254 | 1.5637 | 2000 | inf | 24.8106 | 6.8757 |
0.3998 | 2.3456 | 3000 | inf | 22.7575 | 6.3378 |
0.5036 | 3.1274 | 4000 | inf | 21.6043 | 6.0747 |
0.4676 | 3.9093 | 5000 | inf | 21.7003 | 6.1273 |
0.2668 | 4.6912 | 6000 | inf | 21.4577 | 6.0100 |
0.4587 | 5.4730 | 7000 | inf | 21.7436 | 6.1104 |
0.2215 | 6.2549 | 8000 | inf | 21.0409 | 5.8718 |
0.3593 | 7.0367 | 9000 | inf | 23.7819 | 6.6481 |
0.7166 | 7.8186 | 10000 | inf | 42.9122 | 11.9715 |
0.397 | 8.6005 | 11000 | inf | 21.5579 | 6.0382 |
0.2827 | 9.3823 | 12000 | inf | 21.2118 | 5.9241 |
0.4691 | 10.1642 | 13000 | inf | 21.9599 | 6.0617 |
0.5788 | 10.9461 | 14000 | inf | 24.1385 | 6.6962 |
0.6354 | 11.7279 | 15000 | inf | 28.7291 | 8.1150 |
0.0 | 12.5098 | 16000 | nan | 100.0 | 99.9985 |
0.0 | 13.2916 | 17000 | nan | 100.0 | 99.9985 |
0.0 | 14.0735 | 18000 | nan | 100.0 | 99.9985 |
0.0 | 14.8554 | 19000 | nan | 100.0 | 99.9985 |
0.0 | 15.6372 | 20000 | nan | 100.0 | 99.9985 |
0.0 | 16.4191 | 21000 | nan | 100.0 | 99.9985 |
0.0 | 17.2009 | 22000 | nan | 100.0 | 99.9985 |
0.0 | 17.9828 | 23000 | nan | 100.0 | 99.9985 |
0.0 | 18.7647 | 24000 | nan | 100.0 | 99.9985 |
0.0 | 19.5465 | 25000 | nan | 100.0 | 99.9985 |
0.0 | 20.3284 | 26000 | nan | 100.0 | 99.9985 |
0.0 | 21.1102 | 27000 | nan | 100.0 | 99.9985 |
0.0 | 21.8921 | 28000 | nan | 100.0 | 99.9985 |
0.0 | 22.6740 | 29000 | nan | 100.0 | 99.9985 |
0.0 | 23.4558 | 30000 | nan | 100.0 | 99.9985 |
0.0 | 24.2377 | 31000 | nan | 100.0 | 99.9985 |
0.0 | 25.0195 | 32000 | nan | 100.0 | 99.9985 |
0.0 | 25.8014 | 33000 | nan | 100.0 | 99.9985 |
0.0 | 26.5833 | 34000 | nan | 100.0 | 99.9985 |
0.0 | 27.3651 | 35000 | nan | 100.0 | 99.9985 |
0.0 | 28.1470 | 36000 | nan | 100.0 | 99.9985 |
0.0 | 28.9289 | 37000 | nan | 100.0 | 99.9985 |
0.0 | 29.7107 | 38000 | nan | 100.0 | 99.9985 |
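For reference, the WER and CER columns are edit-distance-based metrics, reported here as percentages: the Levenshtein distance between reference and hypothesis (over words for WER, over characters for CER), divided by the reference length. A minimal pure-Python sketch follows; real evaluations typically use a library such as `jiwer` or `evaluate` instead.

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over two sequences.
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # insertion, deletion, or substitution (cost 0 if tokens match)
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[len(hyp)]

def wer(reference, hypothesis):
    # Word error rate as a percentage of reference words.
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    # Character error rate as a percentage of reference characters.
    return 100.0 * edit_distance(reference, hypothesis) / len(reference)

print(round(wer("en katt sitter", "en katt sitter her"), 2))  # 33.33: one insertion over 3 reference words
```

This also shows why a degenerate model that emits nothing scores a WER of exactly 100: every reference word counts as a deletion.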
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0