# wav2vec2-large-xlsr-coraa-exp-15
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):
- Loss: 0.5572
- Wer: 0.3509
- Cer: 0.1800
- Per: 0.3419
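The reported metrics are word, character, and phoneme error rates (WER, CER, PER) on the evaluation set. As a hedged illustration (the card does not state which tooling was used to score the model), WER and CER can be computed with the Hugging Face `evaluate` library; PER is the same edit-distance computation applied to phoneme sequences instead of words. The transcripts below are made-up placeholders.

```python
import evaluate

# Placeholder reference transcripts and model predictions, for illustration only.
references = ["o gato dorme no sofá", "ela foi ao mercado"]
predictions = ["o gato dorme no sofa", "ela foi au mercado"]

wer_metric = evaluate.load("wer")  # word-level edit distance / reference word count
cer_metric = evaluate.load("cer")  # character-level edit distance / reference character count

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```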
## Model description
More information needed
## Intended uses & limitations
More information needed
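Although no usage guidance is provided, this is a CTC-based wav2vec2 checkpoint fine-tuned from a Portuguese model, so transcription would follow the usual `Wav2Vec2ForCTC` pattern. The sketch below is an assumption about how the checkpoint is meant to be used: the repo id and audio path are placeholders, and 16 kHz mono input is assumed.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xlsr-coraa-exp-15"  # placeholder; use the actual Hub repo id

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file and resample to the 16 kHz rate expected by XLSR models.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder path
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```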
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch in code follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 150
- mixed_precision_training: Native AMP
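As a sketch of how these values map onto `transformers.TrainingArguments` (the card does not include the training script, so the use of the `Trainer` API and the output directory name are assumptions):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-exp-15",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=150,
    fp16=True,                       # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",     # the results table reports one evaluation per epoch
)
```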
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer | Per |
---|---|---|---|---|---|---|
38.389 | 1.0 | 14 | 41.3800 | 1.2235 | 1.2272 | 1.2229 |
38.389 | 2.0 | 28 | 9.4651 | 1.0 | 0.9619 | 1.0 |
38.389 | 3.0 | 42 | 4.3683 | 1.0 | 0.9619 | 1.0 |
38.389 | 4.0 | 56 | 3.7985 | 1.0 | 0.9619 | 1.0 |
38.389 | 5.0 | 70 | 3.5499 | 1.0 | 0.9619 | 1.0 |
38.389 | 6.0 | 84 | 3.3330 | 1.0 | 0.9619 | 1.0 |
38.389 | 7.0 | 98 | 3.2145 | 1.0 | 0.9619 | 1.0 |
11.0275 | 8.0 | 112 | 3.1199 | 1.0 | 0.9619 | 1.0 |
11.0275 | 9.0 | 126 | 3.0737 | 1.0 | 0.9619 | 1.0 |
11.0275 | 10.0 | 140 | 3.0931 | 1.0 | 0.9619 | 1.0 |
11.0275 | 11.0 | 154 | 3.0363 | 1.0 | 0.9619 | 1.0 |
11.0275 | 12.0 | 168 | 3.0763 | 1.0 | 0.9619 | 1.0 |
11.0275 | 13.0 | 182 | 3.0298 | 1.0 | 0.9619 | 1.0 |
11.0275 | 14.0 | 196 | 3.0064 | 1.0 | 0.9619 | 1.0 |
2.9809 | 15.0 | 210 | 3.0108 | 1.0 | 0.9619 | 1.0 |
2.9809 | 16.0 | 224 | 3.0049 | 1.0 | 0.9619 | 1.0 |
2.9809 | 17.0 | 238 | 3.0058 | 1.0 | 0.9619 | 1.0 |
2.9809 | 18.0 | 252 | 3.0002 | 1.0 | 0.9619 | 1.0 |
2.9809 | 19.0 | 266 | 2.9971 | 1.0 | 0.9619 | 1.0 |
2.9809 | 20.0 | 280 | 3.0035 | 1.0 | 0.9619 | 1.0 |
2.9809 | 21.0 | 294 | 2.9971 | 1.0 | 0.9619 | 1.0 |
2.9263 | 22.0 | 308 | 2.9916 | 1.0 | 0.9619 | 1.0 |
2.9263 | 23.0 | 322 | 2.9800 | 1.0 | 0.9619 | 1.0 |
2.9263 | 24.0 | 336 | 2.9640 | 1.0 | 0.9619 | 1.0 |
2.9263 | 25.0 | 350 | 2.8945 | 1.0 | 0.9619 | 1.0 |
2.9263 | 26.0 | 364 | 2.7927 | 1.0 | 0.9619 | 1.0 |
2.9263 | 27.0 | 378 | 2.6844 | 1.0 | 0.9589 | 1.0 |
2.9263 | 28.0 | 392 | 2.4282 | 1.0 | 0.8397 | 1.0 |
2.7713 | 29.0 | 406 | 1.9364 | 1.0 | 0.5755 | 1.0 |
2.7713 | 30.0 | 420 | 1.4716 | 1.0 | 0.4155 | 1.0 |
2.7713 | 31.0 | 434 | 1.1548 | 0.9970 | 0.3730 | 0.9970 |
2.7713 | 32.0 | 448 | 0.9997 | 0.9726 | 0.3450 | 0.9707 |
2.7713 | 33.0 | 462 | 0.9343 | 0.5624 | 0.2351 | 0.5414 |
2.7713 | 34.0 | 476 | 0.7747 | 0.4933 | 0.2163 | 0.4695 |
2.7713 | 35.0 | 490 | 0.7341 | 0.4307 | 0.2048 | 0.4094 |
1.1643 | 36.0 | 504 | 0.6947 | 0.4102 | 0.1987 | 0.3913 |
1.1643 | 37.0 | 518 | 0.7270 | 0.4273 | 0.2054 | 0.4082 |
1.1643 | 38.0 | 532 | 0.6677 | 0.4092 | 0.1979 | 0.3941 |
1.1643 | 39.0 | 546 | 0.6832 | 0.4130 | 0.1993 | 0.3986 |
1.1643 | 40.0 | 560 | 0.6691 | 0.3994 | 0.1978 | 0.3846 |
1.1643 | 41.0 | 574 | 0.6585 | 0.3941 | 0.1962 | 0.3818 |
1.1643 | 42.0 | 588 | 0.6199 | 0.3992 | 0.1942 | 0.3856 |
0.493 | 43.0 | 602 | 0.6268 | 0.3970 | 0.1929 | 0.3854 |
0.493 | 44.0 | 616 | 0.5925 | 0.3842 | 0.1902 | 0.3679 |
0.493 | 45.0 | 630 | 0.6071 | 0.3797 | 0.1908 | 0.3645 |
0.493 | 46.0 | 644 | 0.6037 | 0.3840 | 0.1902 | 0.3690 |
0.493 | 47.0 | 658 | 0.5819 | 0.3728 | 0.1879 | 0.3576 |
0.493 | 48.0 | 672 | 0.5930 | 0.3671 | 0.1862 | 0.3509 |
0.493 | 49.0 | 686 | 0.6292 | 0.3748 | 0.1914 | 0.3598 |
0.3417 | 50.0 | 700 | 0.6298 | 0.3777 | 0.1908 | 0.3623 |
0.3417 | 51.0 | 714 | 0.6183 | 0.3677 | 0.1881 | 0.3539 |
0.3417 | 52.0 | 728 | 0.6200 | 0.3728 | 0.1891 | 0.3590 |
0.3417 | 53.0 | 742 | 0.6202 | 0.3681 | 0.1883 | 0.3539 |
0.3417 | 54.0 | 756 | 0.5683 | 0.3663 | 0.1842 | 0.3537 |
0.3417 | 55.0 | 770 | 0.5832 | 0.3625 | 0.1849 | 0.3501 |
0.3417 | 56.0 | 784 | 0.5890 | 0.3655 | 0.1846 | 0.3529 |
0.3417 | 57.0 | 798 | 0.5770 | 0.3663 | 0.1843 | 0.3547 |
0.2811 | 58.0 | 812 | 0.5655 | 0.3606 | 0.1844 | 0.3488 |
0.2811 | 59.0 | 826 | 0.5698 | 0.3553 | 0.1818 | 0.3458 |
0.2811 | 60.0 | 840 | 0.5964 | 0.3633 | 0.1849 | 0.3545 |
0.2811 | 61.0 | 854 | 0.5919 | 0.3608 | 0.1859 | 0.3533 |
0.2811 | 62.0 | 868 | 0.5771 | 0.3549 | 0.1826 | 0.3472 |
0.2811 | 63.0 | 882 | 0.6055 | 0.3555 | 0.1858 | 0.3476 |
0.2811 | 64.0 | 896 | 0.5833 | 0.3588 | 0.1832 | 0.3488 |
0.2169 | 65.0 | 910 | 0.5864 | 0.3582 | 0.1841 | 0.3490 |
0.2169 | 66.0 | 924 | 0.5833 | 0.3578 | 0.1830 | 0.3490 |
0.2169 | 67.0 | 938 | 0.5663 | 0.3531 | 0.1811 | 0.3442 |
0.2169 | 68.0 | 952 | 0.5572 | 0.3509 | 0.1800 | 0.3419 |
0.2169 | 69.0 | 966 | 0.5641 | 0.3533 | 0.1800 | 0.3448 |
0.2169 | 70.0 | 980 | 0.5700 | 0.3525 | 0.1790 | 0.3446 |
0.2169 | 71.0 | 994 | 0.5835 | 0.3549 | 0.1813 | 0.3468 |
0.1834 | 72.0 | 1008 | 0.5718 | 0.3499 | 0.1811 | 0.3417 |
0.1834 | 73.0 | 1022 | 0.5938 | 0.3539 | 0.1837 | 0.3454 |
0.1834 | 74.0 | 1036 | 0.5955 | 0.3560 | 0.1833 | 0.3466 |
0.1834 | 75.0 | 1050 | 0.5658 | 0.3519 | 0.1819 | 0.3438 |
0.1834 | 76.0 | 1064 | 0.5671 | 0.3497 | 0.1806 | 0.3415 |
0.1834 | 77.0 | 1078 | 0.5772 | 0.3541 | 0.1827 | 0.3454 |
0.1834 | 78.0 | 1092 | 0.5744 | 0.3507 | 0.1806 | 0.3442 |
0.1877 | 79.0 | 1106 | 0.5727 | 0.3476 | 0.1796 | 0.3403 |
0.1877 | 80.0 | 1120 | 0.5696 | 0.3468 | 0.1797 | 0.3393 |
0.1877 | 81.0 | 1134 | 0.5846 | 0.3482 | 0.1805 | 0.3395 |
0.1877 | 82.0 | 1148 | 0.5943 | 0.3511 | 0.1799 | 0.3432 |
0.1877 | 83.0 | 1162 | 0.5738 | 0.3456 | 0.1791 | 0.3389 |
0.1877 | 84.0 | 1176 | 0.6163 | 0.3553 | 0.1834 | 0.3478 |
0.1877 | 85.0 | 1190 | 0.5756 | 0.3488 | 0.1801 | 0.3417 |
0.1684 | 86.0 | 1204 | 0.6208 | 0.3547 | 0.1831 | 0.3480 |
0.1684 | 87.0 | 1218 | 0.5924 | 0.3484 | 0.1811 | 0.3413 |
0.1684 | 88.0 | 1232 | 0.5953 | 0.3509 | 0.1813 | 0.3436 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.13.3