wav2vec2-large-xlsr-coraa-exp-10

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set (an illustrative inference sketch follows this list):

  • Loss: 0.5470
  • Wer: 0.3417
  • Cer: 0.1785
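
The card does not include a usage example, so the snippet below is a minimal inference sketch using the standard transformers CTC classes. The checkpoint id, the audio path, and the assumption of a 16 kHz mono recording are placeholders for illustration, not details from the original card.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder model id; replace with the actual Hub repository or local path.
MODEL_ID = "wav2vec2-large-xlsr-coraa-exp-10"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load a mono audio file and resample to the 16 kHz rate expected by XLSR models.
waveform, sample_rate = torchaudio.load("audio.wav")  # placeholder path
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token at each frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```

Greedy argmax decoding is the simplest option; the card does not say whether an external language model was used when computing the reported metrics.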

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP
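
For readers who want to reproduce the setup, here is a minimal sketch of how these values map onto transformers.TrainingArguments. The output directory is a placeholder, and the per-epoch evaluation/logging strategy is inferred from the results table below rather than stated in the card.

```python
from transformers import TrainingArguments

# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
# Effective train batch size = 16 per device * 2 gradient accumulation steps = 32.
training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xlsr-coraa-exp-10",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,
    num_train_epochs=150,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                    # "Native AMP" mixed-precision training
    adam_beta1=0.9,               # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: metrics in the table are reported once per epoch
    logging_strategy="epoch",
)
```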

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 37.9224 | 1.0 | 14 | 24.6578 | 1.0 | 0.9618 |
| 37.9224 | 2.0 | 28 | 7.1453 | 1.0 | 0.9619 |
| 37.9224 | 3.0 | 42 | 4.4391 | 1.0 | 0.9619 |
| 37.9224 | 4.0 | 56 | 3.9092 | 1.0 | 0.9619 |
| 37.9224 | 5.0 | 70 | 3.6835 | 1.0 | 0.9619 |
| 37.9224 | 6.0 | 84 | 3.5223 | 1.0 | 0.9619 |
| 37.9224 | 7.0 | 98 | 3.3716 | 1.0 | 0.9619 |
| 9.0651 | 8.0 | 112 | 3.2723 | 1.0 | 0.9619 |
| 9.0651 | 9.0 | 126 | 3.1860 | 1.0 | 0.9619 |
| 9.0651 | 10.0 | 140 | 3.1461 | 1.0 | 0.9619 |
| 9.0651 | 11.0 | 154 | 3.1368 | 1.0 | 0.9619 |
| 9.0651 | 12.0 | 168 | 3.0961 | 1.0 | 0.9619 |
| 9.0651 | 13.0 | 182 | 3.0767 | 1.0 | 0.9619 |
| 9.0651 | 14.0 | 196 | 3.0509 | 1.0 | 0.9619 |
| 3.0601 | 15.0 | 210 | 3.0871 | 1.0 | 0.9619 |
| 3.0601 | 16.0 | 224 | 3.0415 | 1.0 | 0.9619 |
| 3.0601 | 17.0 | 238 | 3.0330 | 1.0 | 0.9619 |
| 3.0601 | 18.0 | 252 | 3.0192 | 1.0 | 0.9619 |
| 3.0601 | 19.0 | 266 | 3.0266 | 1.0 | 0.9619 |
| 3.0601 | 20.0 | 280 | 3.0243 | 1.0 | 0.9619 |
| 3.0601 | 21.0 | 294 | 3.0106 | 1.0 | 0.9619 |
| 2.9552 | 22.0 | 308 | 3.0053 | 1.0 | 0.9619 |
| 2.9552 | 23.0 | 322 | 2.9986 | 1.0 | 0.9619 |
| 2.9552 | 24.0 | 336 | 3.0030 | 1.0 | 0.9619 |
| 2.9552 | 25.0 | 350 | 2.9950 | 1.0 | 0.9619 |
| 2.9552 | 26.0 | 364 | 3.0058 | 1.0 | 0.9619 |
| 2.9552 | 27.0 | 378 | 2.9943 | 1.0 | 0.9619 |
| 2.9552 | 28.0 | 392 | 2.9845 | 1.0 | 0.9619 |
| 2.9213 | 29.0 | 406 | 2.9713 | 1.0 | 0.9619 |
| 2.9213 | 30.0 | 420 | 2.9485 | 1.0 | 0.9619 |
| 2.9213 | 31.0 | 434 | 2.9415 | 1.0 | 0.9619 |
| 2.9213 | 32.0 | 448 | 2.8913 | 1.0 | 0.9619 |
| 2.9213 | 33.0 | 462 | 2.8057 | 1.0 | 0.9612 |
| 2.9213 | 34.0 | 476 | 2.6984 | 1.0 | 0.9599 |
| 2.9213 | 35.0 | 490 | 2.5785 | 1.0 | 0.9067 |
| 2.7804 | 36.0 | 504 | 2.3545 | 1.0 | 0.7929 |
| 2.7804 | 37.0 | 518 | 2.0433 | 1.0 | 0.5933 |
| 2.7804 | 38.0 | 532 | 1.7438 | 1.0 | 0.4701 |
| 2.7804 | 39.0 | 546 | 1.4659 | 1.0 | 0.4139 |
| 2.7804 | 40.0 | 560 | 1.2873 | 0.9929 | 0.3840 |
| 2.7804 | 41.0 | 574 | 1.1588 | 0.9315 | 0.3387 |
| 2.7804 | 42.0 | 588 | 1.0163 | 0.7395 | 0.2706 |
| 1.6517 | 43.0 | 602 | 0.9399 | 0.5331 | 0.2258 |
| 1.6517 | 44.0 | 616 | 0.9131 | 0.4929 | 0.2167 |
| 1.6517 | 45.0 | 630 | 0.8352 | 0.4770 | 0.2114 |
| 1.6517 | 46.0 | 644 | 0.8115 | 0.4555 | 0.2084 |
| 1.6517 | 47.0 | 658 | 0.7850 | 0.4403 | 0.2038 |
| 1.6517 | 48.0 | 672 | 0.7574 | 0.4356 | 0.2019 |
| 1.6517 | 49.0 | 686 | 0.7238 | 0.4248 | 0.1989 |
| 0.7966 | 50.0 | 700 | 0.7132 | 0.4130 | 0.1960 |
| 0.7966 | 51.0 | 714 | 0.7054 | 0.4128 | 0.1963 |
| 0.7966 | 52.0 | 728 | 0.7119 | 0.4134 | 0.1990 |
| 0.7966 | 53.0 | 742 | 0.6793 | 0.3990 | 0.1945 |
| 0.7966 | 54.0 | 756 | 0.6718 | 0.3944 | 0.1932 |
| 0.7966 | 55.0 | 770 | 0.6718 | 0.4013 | 0.1949 |
| 0.7966 | 56.0 | 784 | 0.6831 | 0.3976 | 0.1965 |
| 0.7966 | 57.0 | 798 | 0.6400 | 0.3870 | 0.1916 |
| 0.5799 | 58.0 | 812 | 0.6423 | 0.3844 | 0.1906 |
| 0.5799 | 59.0 | 826 | 0.6394 | 0.3834 | 0.1908 |
| 0.5799 | 60.0 | 840 | 0.6574 | 0.3785 | 0.1924 |
| 0.5799 | 61.0 | 854 | 0.6321 | 0.3816 | 0.1918 |
| 0.5799 | 62.0 | 868 | 0.6306 | 0.3801 | 0.1913 |
| 0.5799 | 63.0 | 882 | 0.6433 | 0.3799 | 0.1916 |
| 0.5799 | 64.0 | 896 | 0.6342 | 0.3811 | 0.1896 |
| 0.445 | 65.0 | 910 | 0.6212 | 0.3811 | 0.1904 |
| 0.445 | 66.0 | 924 | 0.6164 | 0.3789 | 0.1895 |
| 0.445 | 67.0 | 938 | 0.6006 | 0.3732 | 0.1872 |
| 0.445 | 68.0 | 952 | 0.6054 | 0.3746 | 0.1891 |
| 0.445 | 69.0 | 966 | 0.6245 | 0.3722 | 0.1894 |
| 0.445 | 70.0 | 980 | 0.6090 | 0.3688 | 0.1878 |
| 0.445 | 71.0 | 994 | 0.6073 | 0.3669 | 0.1876 |
| 0.3746 | 72.0 | 1008 | 0.5989 | 0.3708 | 0.1889 |
| 0.3746 | 73.0 | 1022 | 0.5968 | 0.3681 | 0.1874 |
| 0.3746 | 74.0 | 1036 | 0.5946 | 0.3659 | 0.1870 |
| 0.3746 | 75.0 | 1050 | 0.5874 | 0.3623 | 0.1864 |
| 0.3746 | 76.0 | 1064 | 0.5928 | 0.3639 | 0.1870 |
| 0.3746 | 77.0 | 1078 | 0.5889 | 0.3681 | 0.1882 |
| 0.3746 | 78.0 | 1092 | 0.5723 | 0.3683 | 0.1864 |
| 0.3543 | 79.0 | 1106 | 0.5928 | 0.3657 | 0.1863 |
| 0.3543 | 80.0 | 1120 | 0.5832 | 0.3649 | 0.1855 |
| 0.3543 | 81.0 | 1134 | 0.5785 | 0.3645 | 0.1849 |
| 0.3543 | 82.0 | 1148 | 0.5877 | 0.3580 | 0.1842 |
| 0.3543 | 83.0 | 1162 | 0.5870 | 0.3627 | 0.1853 |
| 0.3543 | 84.0 | 1176 | 0.5738 | 0.3618 | 0.1846 |
| 0.3543 | 85.0 | 1190 | 0.5641 | 0.3576 | 0.1816 |
| 0.3207 | 86.0 | 1204 | 0.5728 | 0.3566 | 0.1821 |
| 0.3207 | 87.0 | 1218 | 0.5706 | 0.3560 | 0.1817 |
| 0.3207 | 88.0 | 1232 | 0.5607 | 0.3570 | 0.1813 |
| 0.3207 | 89.0 | 1246 | 0.5644 | 0.3557 | 0.1817 |
| 0.3207 | 90.0 | 1260 | 0.5660 | 0.3582 | 0.1824 |
| 0.3207 | 91.0 | 1274 | 0.5688 | 0.3566 | 0.1829 |
| 0.3207 | 92.0 | 1288 | 0.5635 | 0.3541 | 0.1807 |
| 0.2984 | 93.0 | 1302 | 0.5663 | 0.3503 | 0.1814 |
| 0.2984 | 94.0 | 1316 | 0.5515 | 0.3543 | 0.1807 |
| 0.2984 | 95.0 | 1330 | 0.5563 | 0.3517 | 0.1803 |
| 0.2984 | 96.0 | 1344 | 0.5618 | 0.3509 | 0.1809 |
| 0.2984 | 97.0 | 1358 | 0.5554 | 0.3517 | 0.1807 |
| 0.2984 | 98.0 | 1372 | 0.5606 | 0.3529 | 0.1809 |
| 0.2984 | 99.0 | 1386 | 0.5597 | 0.3511 | 0.1813 |
| 0.2622 | 100.0 | 1400 | 0.5628 | 0.3505 | 0.1812 |
| 0.2622 | 101.0 | 1414 | 0.5564 | 0.3495 | 0.1799 |
| 0.2622 | 102.0 | 1428 | 0.5626 | 0.3484 | 0.1811 |
| 0.2622 | 103.0 | 1442 | 0.5556 | 0.3470 | 0.1800 |
| 0.2622 | 104.0 | 1456 | 0.5603 | 0.3464 | 0.1799 |
| 0.2622 | 105.0 | 1470 | 0.5571 | 0.3454 | 0.1799 |
| 0.2622 | 106.0 | 1484 | 0.5618 | 0.3466 | 0.1799 |
| 0.2622 | 107.0 | 1498 | 0.5519 | 0.3440 | 0.1787 |
| 0.2519 | 108.0 | 1512 | 0.5541 | 0.3440 | 0.1790 |
| 0.2519 | 109.0 | 1526 | 0.5574 | 0.3464 | 0.1794 |
| 0.2519 | 110.0 | 1540 | 0.5590 | 0.3454 | 0.1801 |
| 0.2519 | 111.0 | 1554 | 0.5530 | 0.3448 | 0.1796 |
| 0.2519 | 112.0 | 1568 | 0.5501 | 0.3438 | 0.1792 |
| 0.2519 | 113.0 | 1582 | 0.5595 | 0.3448 | 0.1799 |
| 0.2519 | 114.0 | 1596 | 0.5536 | 0.3446 | 0.1801 |
| 0.245 | 115.0 | 1610 | 0.5480 | 0.3432 | 0.1789 |
| 0.245 | 116.0 | 1624 | 0.5623 | 0.3486 | 0.1798 |
| 0.245 | 117.0 | 1638 | 0.5496 | 0.3427 | 0.1790 |
| 0.245 | 118.0 | 1652 | 0.5552 | 0.3421 | 0.1789 |
| 0.245 | 119.0 | 1666 | 0.5558 | 0.3438 | 0.1787 |
| 0.245 | 120.0 | 1680 | 0.5524 | 0.3425 | 0.1783 |
| 0.245 | 121.0 | 1694 | 0.5582 | 0.3421 | 0.1786 |
| 0.2322 | 122.0 | 1708 | 0.5534 | 0.3425 | 0.1786 |
| 0.2322 | 123.0 | 1722 | 0.5596 | 0.3464 | 0.1801 |
| 0.2322 | 124.0 | 1736 | 0.5486 | 0.3432 | 0.1790 |
| 0.2322 | 125.0 | 1750 | 0.5581 | 0.3425 | 0.1792 |
| 0.2322 | 126.0 | 1764 | 0.5470 | 0.3417 | 0.1785 |
| 0.2322 | 127.0 | 1778 | 0.5544 | 0.3413 | 0.1781 |
| 0.2322 | 128.0 | 1792 | 0.5501 | 0.3436 | 0.1781 |
| 0.2324 | 129.0 | 1806 | 0.5518 | 0.3440 | 0.1782 |
| 0.2324 | 130.0 | 1820 | 0.5511 | 0.3389 | 0.1775 |
| 0.2324 | 131.0 | 1834 | 0.5584 | 0.3417 | 0.1782 |
| 0.2324 | 132.0 | 1848 | 0.5493 | 0.3373 | 0.1775 |
| 0.2324 | 133.0 | 1862 | 0.5506 | 0.3395 | 0.1777 |
| 0.2324 | 134.0 | 1876 | 0.5543 | 0.3409 | 0.1782 |
| 0.2324 | 135.0 | 1890 | 0.5589 | 0.3399 | 0.1781 |
| 0.2077 | 136.0 | 1904 | 0.5556 | 0.3391 | 0.1778 |
| 0.2077 | 137.0 | 1918 | 0.5555 | 0.3407 | 0.1778 |
| 0.2077 | 138.0 | 1932 | 0.5501 | 0.3391 | 0.1774 |
| 0.2077 | 139.0 | 1946 | 0.5544 | 0.3375 | 0.1772 |
| 0.2077 | 140.0 | 1960 | 0.5554 | 0.3387 | 0.1773 |
| 0.2077 | 141.0 | 1974 | 0.5504 | 0.3381 | 0.1772 |
| 0.2077 | 142.0 | 1988 | 0.5484 | 0.3383 | 0.1770 |
| 0.2089 | 143.0 | 2002 | 0.5519 | 0.3385 | 0.1772 |
| 0.2089 | 144.0 | 2016 | 0.5532 | 0.3391 | 0.1772 |
| 0.2089 | 145.0 | 2030 | 0.5530 | 0.3397 | 0.1775 |
| 0.2089 | 146.0 | 2044 | 0.5551 | 0.3397 | 0.1775 |
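
The Wer and Cer columns are word and character error rates on the validation set. The card does not state which library computed them; as an illustration only, the jiwer package yields the same style of metrics from reference and predicted transcripts (the example strings below are invented):

```python
import jiwer

references = ["o gato dorme no sofá"]  # ground-truth transcripts (toy example)
hypotheses = ["o gato dorme no sofa"]  # model predictions (toy example)

wer = jiwer.wer(references, hypotheses)  # word error rate: word-level edits / reference words
cer = jiwer.cer(references, hypotheses)  # character error rate: char-level edits / reference chars
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```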

Framework versions

  • Transformers 4.28.0
  • Pytorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3