# wav2vec2-xls-r-1b-scandinavian-E2-100h-30-epochs-20250123
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1348
- WER: 21.6270
- CER: 4.5904
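WER and CER are edit-distance metrics, reported here in percent. The exact evaluation script used for this model is not documented; the following is only a minimal sketch of how these metrics are conventionally computed (errors over the number of reference words or characters):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (one-row DP)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # old d[j] = delete, new d[j-1] = insert, prev = substitute/match
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[-1]

def wer(ref, hyp):
    """Word error rate in percent: word-level edits over reference word count."""
    ref_words = ref.split()
    return 100 * edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate in percent: character-level edits over reference length."""
    return 100 * edit_distance(ref, hyp) / len(ref)
```

For example, `wer("a b c d", "a b x d")` counts one substitution out of four reference words, giving 25.0.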
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (PyTorch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 6000
- num_epochs: 30
- mixed_precision_training: Native AMP
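The learning-rate schedule (cosine decay after 6 000 linear warmup steps) can be sketched as below. The total step count of roughly 51 450 is not stated on the card; it is estimated from the last row of the results table (step 51 000 at epoch ~29.74):

```python
import math

LEARNING_RATE = 1e-4   # learning_rate above
WARMUP_STEPS = 6000    # lr_scheduler_warmup_steps above

def lr_at(step, total_steps):
    """Linear warmup from 0 to LEARNING_RATE, then cosine decay to 0."""
    if step < WARMUP_STEPS:
        return LEARNING_RATE * step / WARMUP_STEPS
    progress = (step - WARMUP_STEPS) / (total_steps - WARMUP_STEPS)
    return LEARNING_RATE * 0.5 * (1 + math.cos(math.pi * progress))
```

This mirrors the shape of `transformers`' cosine schedule with warmup: the rate peaks at 1e-4 exactly when warmup ends and decays to 0 at the final step.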
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER (%) | CER (%) |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
1.0569 | 0.5831 | 1000 | 0.5118 | 63.8799 | 16.0725 |
0.4643 | 1.1662 | 2000 | 0.2647 | 38.1428 | 9.2101 |
0.3797 | 1.7493 | 3000 | 0.2158 | 33.0390 | 7.8553 |
0.3558 | 2.3324 | 4000 | 0.1996 | 31.8993 | 7.4365 |
0.3784 | 2.9155 | 5000 | 0.2039 | 31.7718 | 7.5216 |
0.4332 | 3.4985 | 6000 | 0.2181 | 33.0704 | 7.8748 |
0.3112 | 4.0816 | 7000 | 0.2070 | 32.4627 | 7.6624 |
0.3189 | 4.6647 | 8000 | 0.1970 | 31.6000 | 7.4421 |
0.2768 | 5.2478 | 9000 | 0.1998 | 31.2380 | 7.4350 |
0.2924 | 5.8309 | 10000 | 0.1871 | 30.2479 | 7.0815 |
0.2971 | 6.4140 | 11000 | 0.1874 | 30.3957 | 7.2106 |
0.3326 | 6.9971 | 12000 | 0.1860 | 29.7972 | 7.0494 |
0.2833 | 7.5802 | 13000 | 0.1908 | 30.0964 | 7.1554 |
0.2283 | 8.1633 | 14000 | 0.1775 | 29.1211 | 6.8593 |
0.2145 | 8.7464 | 15000 | 0.1768 | 28.5836 | 6.6787 |
0.2556 | 9.3294 | 16000 | 0.1785 | 28.7831 | 6.7674 |
0.2567 | 9.9125 | 17000 | 0.1738 | 28.3009 | 6.5702 |
0.2418 | 10.4956 | 18000 | 0.1719 | 28.5466 | 6.6845 |
0.1733 | 11.0787 | 19000 | 0.1619 | 27.0467 | 6.2287 |
0.1734 | 11.6618 | 20000 | 0.1624 | 26.6052 | 6.1501 |
0.2026 | 12.2449 | 21000 | 0.1614 | 26.8417 | 6.1794 |
0.1898 | 12.8280 | 22000 | 0.1541 | 26.6902 | 6.1292 |
0.2149 | 13.4111 | 23000 | 0.1570 | 26.3780 | 6.0525 |
0.1751 | 13.9942 | 24000 | 0.1494 | 25.8756 | 5.8931 |
0.1488 | 14.5773 | 25000 | 0.1468 | 25.5819 | 5.7224 |
0.1283 | 15.1603 | 26000 | 0.1477 | 25.3380 | 5.6983 |
0.1363 | 15.7434 | 27000 | 0.1463 | 24.8430 | 5.5717 |
0.1324 | 16.3265 | 28000 | 0.1453 | 24.9095 | 5.5905 |
0.143 | 16.9096 | 29000 | 0.1434 | 24.6915 | 5.5245 |
0.1241 | 17.4927 | 30000 | 0.1436 | 24.1965 | 5.3843 |
0.0976 | 18.0758 | 31000 | 0.1475 | 24.1429 | 5.3458 |
0.0945 | 18.6589 | 32000 | 0.1405 | 23.4871 | 5.1769 |
0.1085 | 19.2420 | 33000 | 0.1393 | 23.6515 | 5.1972 |
0.1165 | 19.8251 | 34000 | 0.1350 | 23.2599 | 5.1165 |
0.1195 | 20.4082 | 35000 | 0.1435 | 23.1196 | 5.0626 |
0.101 | 20.9913 | 36000 | 0.1366 | 22.9219 | 4.9904 |
0.0764 | 21.5743 | 37000 | 0.1359 | 22.6892 | 4.9498 |
0.0806 | 22.1574 | 38000 | 0.1372 | 22.5506 | 4.8653 |
0.084 | 22.7405 | 39000 | 0.1350 | 22.3530 | 4.8049 |
0.0825 | 23.3236 | 40000 | 0.1380 | 22.1812 | 4.7661 |
0.0788 | 23.9067 | 41000 | 0.1407 | 22.1221 | 4.7596 |
0.0632 | 24.4898 | 42000 | 0.1385 | 22.0463 | 4.7328 |
0.0502 | 25.0729 | 43000 | 0.1393 | 21.9761 | 4.6974 |
0.0482 | 25.6560 | 44000 | 0.1384 | 21.9946 | 4.6878 |
0.0612 | 26.2391 | 45000 | 0.1364 | 21.7619 | 4.6330 |
0.0708 | 26.8222 | 46000 | 0.1346 | 21.7378 | 4.6234 |
0.0609 | 27.4052 | 47000 | 0.1353 | 21.7101 | 4.6142 |
0.0632 | 27.9883 | 48000 | 0.1361 | 21.7064 | 4.6071 |
0.0528 | 28.5714 | 49000 | 0.1350 | 21.6769 | 4.6064 |
0.0696 | 29.1545 | 50000 | 0.1347 | 21.6381 | 4.5898 |
0.0564 | 29.7376 | 51000 | 0.1348 | 21.6270 | 4.5904 |
### Framework versions
- Transformers 4.48.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0