# wav2vec2-xls-r-300m-converted-faroese-100h-30-epochs_2025-07-10_v2
This model is a fine-tuned version of wav2vec2-xls-r-300m (per the model name) on the Ravnursson Faroese speech dataset. It achieves the following results on the test set:
- Loss: 0.0990
- WER: 7.72
- CER: 2.15
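WER (word error rate) and CER (character error rate) are edit-distance metrics, reported here as percentages. As a minimal sketch of how such scores can be computed (using the `jiwer` library; the transcript strings below are made up for illustration):

```python
# pip install jiwer
import jiwer

reference = "hann fór til Havnar í gjár"   # hypothetical ground-truth transcript
hypothesis = "hann fór til havnar í gjar"  # hypothetical model output

# jiwer returns fractions; multiply by 100 to match the percentages above.
print(f"WER: {100 * jiwer.wer(reference, hypothesis):.2f}%")
print(f"CER: {100 * jiwer.cer(reference, hypothesis):.2f}%")
```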
## Model description
More information needed
## Intended uses & limitations
More information needed
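Pending an official description, the model is a wav2vec 2.0 CTC speech recognizer for Faroese, so transcription with 🤗 Transformers would follow the usual pattern. A minimal sketch (the Hub model id is assumed to match the title above, and `example.wav` is a placeholder for a mono recording):

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "wav2vec2-xls-r-300m-converted-faroese-100h-30-epochs_2025-07-10_v2"  # assumed Hub id

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load audio at 16 kHz, the sampling rate wav2vec 2.0 models expect.
speech, sr = librosa.load("example.wav", sr=16_000)  # "example.wav" is a placeholder

inputs = processor(speech, sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame, then collapse.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```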
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 30
- mixed_precision_training: Native AMP
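For reference, these settings map onto a 🤗 Transformers `TrainingArguments` configuration roughly as follows. This is a hedged reconstruction from the list above, not the original training script; `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-faroese",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,  # x 2 accumulation steps = effective batch of 32
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,
    seed=42,
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-8 (the defaults)
    lr_scheduler_type="cosine",
    warmup_steps=5000,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```

The effective batch size of 32 comes from 16 samples per device times 2 gradient-accumulation steps, matching `total_train_batch_size` above.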
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER (%) | CER (%) |
|:-------------:|:-------:|:-----:|:---------------:|:-------:|:-------:|
| 3.3186 | 0.4877 | 1000 | 3.2374 | 100.0 | 99.9992 |
| 1.261 | 0.9754 | 2000 | 0.8817 | 71.1239 | 21.7903 |
| 0.6199 | 1.4628 | 3000 | 0.3426 | 41.8690 | 11.1985 |
| 0.4976 | 1.9505 | 4000 | 0.2694 | 36.4365 | 9.4027 |
| 0.3769 | 2.4379 | 5000 | 0.2169 | 31.5108 | 7.8902 |
| 0.3502 | 2.9256 | 6000 | 0.1904 | 29.9731 | 7.3221 |
| 0.2701 | 3.4131 | 7000 | 0.1674 | 27.8495 | 6.7232 |
| 0.275 | 3.9008 | 8000 | 0.1534 | 26.8802 | 6.4518 |
| 0.2219 | 4.3882 | 9000 | 0.1527 | 26.5850 | 6.3855 |
| 0.2399 | 4.8759 | 10000 | 0.1408 | 25.2985 | 6.0305 |
| 0.1945 | 5.3633 | 11000 | 0.1464 | 25.4747 | 6.0305 |
| 0.1953 | 5.8510 | 12000 | 0.1311 | 24.7654 | 5.7724 |
| 0.1626 | 6.3385 | 13000 | 0.1321 | 24.3292 | 5.7212 |
| 0.1752 | 6.8261 | 14000 | 0.1313 | 23.7829 | 5.5594 |
| 0.1493 | 7.3136 | 15000 | 0.1320 | 23.4260 | 5.4616 |
| 0.1526 | 7.8013 | 16000 | 0.1254 | 22.9898 | 5.3606 |
| 0.14 | 8.2887 | 17000 | 0.1223 | 22.8532 | 5.2351 |
| 0.1454 | 8.7764 | 18000 | 0.1251 | 22.4171 | 5.1531 |
| 0.1218 | 9.2638 | 19000 | 0.1152 | 22.2012 | 5.0789 |
| 0.1239 | 9.7515 | 20000 | 0.1223 | 22.0337 | 5.0442 |
| 0.1133 | 10.2390 | 21000 | 0.1155 | 22.1835 | 5.0331 |
| 0.1073 | 10.7267 | 22000 | 0.1159 | 22.2496 | 5.0544 |
| 0.0949 | 11.2141 | 23000 | 0.1169 | 21.7253 | 4.9243 |
| 0.093 | 11.7018 | 24000 | 0.1157 | 21.7430 | 4.9456 |
| 0.0963 | 12.1892 | 25000 | 0.1128 | 21.6504 | 4.9337 |
| 0.0913 | 12.6769 | 26000 | 0.1122 | 21.4786 | 4.8698 |
| 0.0853 | 13.1644 | 27000 | 0.1133 | 21.2407 | 4.8162 |
| 0.0824 | 13.6520 | 28000 | 0.1084 | 21.1746 | 4.7467 |
| 0.074 | 14.1395 | 29000 | 0.1127 | 21.0160 | 4.7089 |
| 0.0888 | 14.6272 | 30000 | 0.1070 | 21.0248 | 4.7057 |
| 0.0757 | 15.1146 | 31000 | 0.1128 | 20.8706 | 4.6702 |
| 0.0699 | 15.6023 | 32000 | 0.1045 | 20.8398 | 4.6244 |
| 0.0728 | 16.0897 | 33000 | 0.1077 | 20.6988 | 4.5929 |
| 0.0678 | 16.5774 | 34000 | 0.1064 | 20.5137 | 4.5242 |
| 0.0592 | 17.0649 | 35000 | 0.1040 | 20.4476 | 4.5021 |
| 0.0545 | 17.5525 | 36000 | 0.1100 | 20.4476 | 4.5084 |
| 0.0652 | 18.0400 | 37000 | 0.1025 | 20.2978 | 4.4437 |
| 0.0538 | 18.5277 | 38000 | 0.1044 | 20.0423 | 4.3901 |
| 0.0578 | 19.0151 | 39000 | 0.1043 | 19.9674 | 4.3767 |
| 0.0537 | 19.5028 | 40000 | 0.1060 | 19.9189 | 4.3506 |
| 0.0493 | 19.9905 | 41000 | 0.1017 | 19.8837 | 4.3199 |
| 0.0544 | 20.4779 | 42000 | 0.0990 | 19.8176 | 4.3017 |
| 0.0429 | 20.9656 | 43000 | 0.1009 | 19.7163 | 4.2694 |
| 0.0389 | 21.4531 | 44000 | 0.0994 | 19.6678 | 4.2354 |
| 0.0651 | 21.9407 | 45000 | 0.0988 | 19.6986 | 4.2394 |
| 0.0544 | 22.4282 | 46000 | 0.0978 | 19.5797 | 4.1991 |
| 0.0577 | 22.9159 | 47000 | 0.0993 | 19.6193 | 4.2181 |
| 0.0555 | 23.4033 | 48000 | 0.0971 | 19.6590 | 4.1984 |
| 0.0418 | 23.8910 | 49000 | 0.0997 | 19.4034 | 4.1739 |
| 0.0524 | 24.3784 | 50000 | 0.0976 | 19.4739 | 4.1660 |
| 0.0424 | 24.8661 | 51000 | 0.0980 | 19.4079 | 4.1526 |
| 0.0412 | 25.3536 | 52000 | 0.0992 | 19.3021 | 4.1400 |
| 0.0394 | 25.8413 | 53000 | 0.0992 | 19.3506 | 4.1621 |
| 0.0459 | 26.3287 | 54000 | 0.0978 | 19.3021 | 4.1337 |
| 0.0391 | 26.8164 | 55000 | 0.0987 | 19.2536 | 4.1179 |
| 0.0434 | 27.3038 | 56000 | 0.0985 | 19.2492 | 4.1037 |
| 0.0494 | 27.7915 | 57000 | 0.0982 | 19.2228 | 4.0982 |
| 0.0417 | 28.2790 | 58000 | 0.0997 | 19.2360 | 4.1084 |
| 0.0417 | 28.7666 | 59000 | 0.0991 | 19.2360 | 4.1076 |
| 0.0509 | 29.2541 | 60000 | 0.0990 | 19.2492 | 4.1092 |
| 0.0498 | 29.7418 | 61000 | 0.0990 | 19.2404 | 4.1084 |
### Framework versions
- Transformers 4.53.1
- PyTorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.2