# wav2vec2-xls-r-300m-pre_trained-converted-faroese-100h-30-epochs_2025-07-10
This model was fine-tuned from a pre-trained wav2vec2 XLS-R 300M checkpoint on the Ravnursson 100h Faroese dataset. It achieves the following results on the test set:
- Loss: 0.0986
- WER: 8.21%
- CER: 2.28%
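A minimal inference sketch with the 🤗 Transformers `automatic-speech-recognition` pipeline (the repository id and audio file below are assumptions inferred from the model name, not confirmed by this card):

```python
import torch
from transformers import pipeline

# Assumed repository id, taken from the model name; replace with the real hub path.
MODEL_ID = "wav2vec2-xls-r-300m-pre_trained-converted-faroese-100h-30-epochs_2025-07-10"

# wav2vec2 CTC checkpoints are served by the automatic-speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model=MODEL_ID,
    device=0 if torch.cuda.is_available() else -1,
)

# Input audio should be 16 kHz mono, matching the wav2vec2 feature extractor.
print(asr("sample_faroese.wav")["text"])  # hypothetical local file
```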
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (mirrored in the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 30
- mixed_precision_training: Native AMP
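For reproducibility, a sketch of the matching 🤗 `TrainingArguments`; the `output_dir` is a placeholder, and anything not listed above is left at its default:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-300m-faroese-100h",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=5000,
    num_train_epochs=30,
    fp16=True,  # native AMP mixed precision
)
```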
### Training results
Training Loss | Epoch | Step | Validation Loss | WER (%) | CER (%) |
---|---|---|---|---|---|
3.3152 | 0.4877 | 1000 | 3.2279 | 100.0 | 99.4935 |
0.9943 | 0.9754 | 2000 | 0.6195 | 53.4035 | 15.3503 |
0.5408 | 1.4628 | 3000 | 0.2845 | 36.1149 | 9.5203 |
0.4766 | 1.9505 | 4000 | 0.2440 | 33.9825 | 8.6137 |
0.3776 | 2.4379 | 5000 | 0.1999 | 30.5283 | 7.6093 |
0.3464 | 2.9256 | 6000 | 0.1867 | 28.9289 | 7.1225 |
0.2626 | 3.4131 | 7000 | 0.1706 | 27.6248 | 6.7627 |
0.2949 | 3.9008 | 8000 | 0.1612 | 26.5454 | 6.4439 |
0.2243 | 4.3882 | 9000 | 0.1570 | 25.9858 | 6.3421 |
0.2466 | 4.8759 | 10000 | 0.1505 | 25.3866 | 6.0762 |
0.1965 | 5.3633 | 11000 | 0.1538 | 25.3470 | 6.0786 |
0.2083 | 5.8510 | 12000 | 0.1383 | 24.6068 | 5.8821 |
0.1897 | 6.3385 | 13000 | 0.1318 | 24.1398 | 5.7077 |
0.1919 | 6.8261 | 14000 | 0.1301 | 24.2587 | 5.6943 |
0.1547 | 7.3136 | 15000 | 0.1318 | 23.4965 | 5.5634 |
0.1603 | 7.8013 | 16000 | 0.1234 | 22.7607 | 5.2880 |
0.1525 | 8.2887 | 17000 | 0.1323 | 23.1572 | 5.3811 |
0.1476 | 8.7764 | 18000 | 0.1262 | 22.8621 | 5.3077 |
0.1356 | 9.2638 | 19000 | 0.1211 | 22.6814 | 5.2856 |
0.1319 | 9.7515 | 20000 | 0.1246 | 22.3289 | 5.1878 |
0.1285 | 10.2390 | 21000 | 0.1200 | 22.2100 | 5.1089 |
0.1194 | 10.7267 | 22000 | 0.1179 | 22.0822 | 5.0892 |
0.1044 | 11.2141 | 23000 | 0.1192 | 21.9985 | 5.0734 |
0.1034 | 11.7018 | 24000 | 0.1210 | 21.9544 | 5.0560 |
0.1072 | 12.1892 | 25000 | 0.1158 | 21.6989 | 4.9629 |
0.0933 | 12.6769 | 26000 | 0.1140 | 21.9016 | 5.0395 |
0.0973 | 13.1644 | 27000 | 0.1154 | 21.3068 | 4.8556 |
0.0974 | 13.6520 | 28000 | 0.1114 | 21.5447 | 4.8840 |
0.0834 | 14.1395 | 29000 | 0.1084 | 21.3332 | 4.8193 |
0.0987 | 14.6272 | 30000 | 0.1062 | 21.2099 | 4.7736 |
0.0834 | 15.1146 | 31000 | 0.1088 | 20.9323 | 4.7270 |
0.0785 | 15.6023 | 32000 | 0.1059 | 21.0336 | 4.7183 |
0.0857 | 16.0897 | 33000 | 0.1021 | 20.7913 | 4.6449 |
0.0815 | 16.5774 | 34000 | 0.1043 | 20.4917 | 4.5889 |
0.0656 | 17.0649 | 35000 | 0.1143 | 20.7957 | 4.6292 |
0.0578 | 17.5525 | 36000 | 0.1070 | 20.5622 | 4.5747 |
0.0687 | 18.0400 | 37000 | 0.1054 | 20.5005 | 4.5408 |
0.0592 | 18.5277 | 38000 | 0.1100 | 20.2185 | 4.4927 |
0.065 | 19.0151 | 39000 | 0.1052 | 20.2317 | 4.4990 |
0.059 | 19.5028 | 40000 | 0.1032 | 20.3066 | 4.4832 |
0.0508 | 19.9905 | 41000 | 0.1080 | 20.2846 | 4.4856 |
0.0553 | 20.4779 | 42000 | 0.1074 | 20.3287 | 4.4808 |
0.0469 | 20.9656 | 43000 | 0.0979 | 19.8749 | 4.3285 |
0.0447 | 21.4531 | 44000 | 0.1020 | 20.0070 | 4.3885 |
0.0695 | 21.9407 | 45000 | 0.1017 | 19.9321 | 4.3491 |
0.0576 | 22.4282 | 46000 | 0.1026 | 19.8837 | 4.3585 |
0.0642 | 22.9159 | 47000 | 0.1034 | 19.8132 | 4.3388 |
0.0597 | 23.4033 | 48000 | 0.1019 | 19.7427 | 4.2899 |
0.0473 | 23.8910 | 49000 | 0.1054 | 19.7471 | 4.3285 |
0.0544 | 24.3784 | 50000 | 0.1001 | 19.6370 | 4.2859 |
0.0479 | 24.8661 | 51000 | 0.1018 | 19.5973 | 4.2575 |
0.0467 | 25.3536 | 52000 | 0.0991 | 19.6017 | 4.2662 |
0.0443 | 25.8413 | 53000 | 0.0983 | 19.5136 | 4.2441 |
0.0483 | 26.3287 | 54000 | 0.0997 | 19.5444 | 4.2410 |
0.0436 | 26.8164 | 55000 | 0.1007 | 19.5532 | 4.2425 |
0.0459 | 27.3038 | 56000 | 0.0992 | 19.5268 | 4.2418 |
0.0508 | 27.7915 | 57000 | 0.0987 | 19.5004 | 4.2299 |
0.0526 | 28.2790 | 58000 | 0.0991 | 19.4828 | 4.2291 |
0.0417 | 28.7666 | 59000 | 0.0989 | 19.4960 | 4.2268 |
0.0517 | 29.2541 | 60000 | 0.0985 | 19.4960 | 4.2276 |
0.057 | 29.7418 | 61000 | 0.0986 | 19.4916 | 4.2276 |
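The WER and CER columns above follow the usual edit-distance definitions over words and characters, reported as percentages. A quick way to reproduce the metric computation is the `jiwer` package, shown here with made-up strings (the actual evaluation data is the Ravnursson test split, not reproduced in this card):

```python
import jiwer

# Hypothetical reference/hypothesis pair for illustration only.
reference = "hetta er ein royndarsetningur"
hypothesis = "hetta er ein roynda setningur"

# jiwer returns rates as fractions; the table above reports percentages.
print(f"WER: {jiwer.wer(reference, hypothesis) * 100:.2f}")
print(f"CER: {jiwer.cer(reference, hypothesis) * 100:.2f}")
```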
### Framework versions
- Transformers 4.53.1
- PyTorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.2