wav2vec2-large-xlsr-coraa-exp-12

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5650
  • WER (word error rate): 0.3527
  • CER (character error rate): 0.1823
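
The card does not include a usage section, so here is a minimal transcription sketch. It assumes the checkpoint keeps the standard Wav2Vec2-CTC layout of its base model, Edresson/wav2vec2-large-xlsr-coraa-portuguese; the repo id below needs its hub namespace, and the audio path is a placeholder.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumption: replace with the full Hugging Face Hub repo id of this checkpoint.
model_id = "wav2vec2-large-xlsr-coraa-exp-12"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLSR-based wav2vec2 models expect 16 kHz mono audio.
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```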

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP
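
As a hedged illustration, the list above maps onto a transformers TrainingArguments object roughly as follows. The output_dir is a placeholder and fp16 stands in for "Native AMP"; the card does not include the actual training script.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration implied by the
# hyperparameters listed above (argument names as of transformers 4.28).
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-exp-12",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 * 2 = total train batch size of 32
    lr_scheduler_type="linear",
    num_train_epochs=150,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # native automatic mixed precision
)
```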

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 37.6216 | 1.0 | 14 | 23.2071 | 1.0 | 0.9619 |
| 37.6216 | 2.0 | 28 | 6.9366 | 1.0 | 0.9619 |
| 37.6216 | 3.0 | 42 | 4.4250 | 1.0 | 0.9619 |
| 37.6216 | 4.0 | 56 | 3.9154 | 1.0 | 0.9619 |
| 37.6216 | 5.0 | 70 | 3.6849 | 1.0 | 0.9619 |
| 37.6216 | 6.0 | 84 | 3.5283 | 1.0 | 0.9619 |
| 37.6216 | 7.0 | 98 | 3.3716 | 1.0 | 0.9619 |
| 8.823 | 8.0 | 112 | 3.2657 | 1.0 | 0.9619 |
| 8.823 | 9.0 | 126 | 3.1796 | 1.0 | 0.9619 |
| 8.823 | 10.0 | 140 | 3.1568 | 1.0 | 0.9619 |
| 8.823 | 11.0 | 154 | 3.1071 | 1.0 | 0.9619 |
| 8.823 | 12.0 | 168 | 3.0891 | 1.0 | 0.9619 |
| 8.823 | 13.0 | 182 | 3.0588 | 1.0 | 0.9619 |
| 8.823 | 14.0 | 196 | 3.0422 | 1.0 | 0.9619 |
| 3.0574 | 15.0 | 210 | 3.0388 | 1.0 | 0.9619 |
| 3.0574 | 16.0 | 224 | 3.0324 | 1.0 | 0.9619 |
| 3.0574 | 17.0 | 238 | 3.0253 | 1.0 | 0.9619 |
| 3.0574 | 18.0 | 252 | 3.0100 | 1.0 | 0.9619 |
| 3.0574 | 19.0 | 266 | 3.0079 | 1.0 | 0.9619 |
| 3.0574 | 20.0 | 280 | 3.0150 | 1.0 | 0.9619 |
| 3.0574 | 21.0 | 294 | 3.0033 | 1.0 | 0.9619 |
| 2.95 | 22.0 | 308 | 2.9999 | 1.0 | 0.9619 |
| 2.95 | 23.0 | 322 | 2.9940 | 1.0 | 0.9619 |
| 2.95 | 24.0 | 336 | 2.9982 | 1.0 | 0.9619 |
| 2.95 | 25.0 | 350 | 3.0212 | 1.0 | 0.9619 |
| 2.95 | 26.0 | 364 | 2.9951 | 1.0 | 0.9619 |
| 2.95 | 27.0 | 378 | 2.9893 | 1.0 | 0.9619 |
| 2.95 | 28.0 | 392 | 2.9907 | 1.0 | 0.9619 |
| 2.9233 | 29.0 | 406 | 2.9889 | 1.0 | 0.9619 |
| 2.9233 | 30.0 | 420 | 2.9813 | 1.0 | 0.9619 |
| 2.9233 | 31.0 | 434 | 2.9795 | 1.0 | 0.9619 |
| 2.9233 | 32.0 | 448 | 2.9633 | 1.0 | 0.9619 |
| 2.9233 | 33.0 | 462 | 2.9653 | 1.0 | 0.9585 |
| 2.9233 | 34.0 | 476 | 2.9050 | 1.0 | 0.9619 |
| 2.9233 | 35.0 | 490 | 2.8806 | 1.0 | 0.9619 |
| 2.8852 | 36.0 | 504 | 2.8230 | 1.0 | 0.9619 |
| 2.8852 | 37.0 | 518 | 2.7805 | 1.0 | 0.9619 |
| 2.8852 | 38.0 | 532 | 2.7044 | 1.0 | 0.9572 |
| 2.8852 | 39.0 | 546 | 2.6561 | 1.0 | 0.9559 |
| 2.8852 | 40.0 | 560 | 2.5475 | 1.0 | 0.9254 |
| 2.8852 | 41.0 | 574 | 2.3336 | 1.0 | 0.7458 |
| 2.8852 | 42.0 | 588 | 2.0696 | 1.0 | 0.5468 |
| 2.5339 | 43.0 | 602 | 1.7760 | 1.0 | 0.4971 |
| 2.5339 | 44.0 | 616 | 1.5433 | 1.0 | 0.4546 |
| 2.5339 | 45.0 | 630 | 1.3529 | 1.0 | 0.4067 |
| 2.5339 | 46.0 | 644 | 1.2149 | 0.9998 | 0.3834 |
| 2.5339 | 47.0 | 658 | 1.0925 | 0.9943 | 0.3578 |
| 2.5339 | 48.0 | 672 | 1.0236 | 0.8954 | 0.3129 |
| 2.5339 | 49.0 | 686 | 0.9525 | 0.7062 | 0.2623 |
| 1.3395 | 50.0 | 700 | 0.8922 | 0.5063 | 0.2201 |
| 1.3395 | 51.0 | 714 | 0.8068 | 0.4774 | 0.2115 |
| 1.3395 | 52.0 | 728 | 0.7932 | 0.4553 | 0.2076 |
| 1.3395 | 53.0 | 742 | 0.7726 | 0.4453 | 0.2066 |
| 1.3395 | 54.0 | 756 | 0.7551 | 0.4340 | 0.2027 |
| 1.3395 | 55.0 | 770 | 0.7420 | 0.4305 | 0.2039 |
| 1.3395 | 56.0 | 784 | 0.7146 | 0.4212 | 0.2008 |
| 1.3395 | 57.0 | 798 | 0.6768 | 0.4096 | 0.1957 |
| 0.7419 | 58.0 | 812 | 0.6767 | 0.4080 | 0.1962 |
| 0.7419 | 59.0 | 826 | 0.6709 | 0.4069 | 0.1971 |
| 0.7419 | 60.0 | 840 | 0.6791 | 0.4025 | 0.1967 |
| 0.7419 | 61.0 | 854 | 0.6560 | 0.4029 | 0.1938 |
| 0.7419 | 62.0 | 868 | 0.6474 | 0.3976 | 0.1939 |
| 0.7419 | 63.0 | 882 | 0.6584 | 0.3982 | 0.1941 |
| 0.7419 | 64.0 | 896 | 0.6619 | 0.3960 | 0.1938 |
| 0.5254 | 65.0 | 910 | 0.6514 | 0.3923 | 0.1936 |
| 0.5254 | 66.0 | 924 | 0.6363 | 0.3874 | 0.1915 |
| 0.5254 | 67.0 | 938 | 0.6173 | 0.3797 | 0.1900 |
| 0.5254 | 68.0 | 952 | 0.6284 | 0.3887 | 0.1918 |
| 0.5254 | 69.0 | 966 | 0.6153 | 0.3767 | 0.1897 |
| 0.5254 | 70.0 | 980 | 0.6084 | 0.3736 | 0.1879 |
| 0.5254 | 71.0 | 994 | 0.6196 | 0.3773 | 0.1900 |
| 0.4219 | 72.0 | 1008 | 0.6075 | 0.3730 | 0.1899 |
| 0.4219 | 73.0 | 1022 | 0.6017 | 0.3712 | 0.1884 |
| 0.4219 | 74.0 | 1036 | 0.5947 | 0.3694 | 0.1872 |
| 0.4219 | 75.0 | 1050 | 0.5975 | 0.3696 | 0.1889 |
| 0.4219 | 76.0 | 1064 | 0.6020 | 0.3728 | 0.1887 |
| 0.4219 | 77.0 | 1078 | 0.5994 | 0.3704 | 0.1892 |
| 0.4219 | 78.0 | 1092 | 0.5822 | 0.3716 | 0.1877 |
| 0.385 | 79.0 | 1106 | 0.6073 | 0.3742 | 0.1893 |
| 0.385 | 80.0 | 1120 | 0.6029 | 0.3728 | 0.1874 |
| 0.385 | 81.0 | 1134 | 0.5961 | 0.3700 | 0.1868 |
| 0.385 | 82.0 | 1148 | 0.6032 | 0.3702 | 0.1870 |
| 0.385 | 83.0 | 1162 | 0.6115 | 0.3722 | 0.1889 |
| 0.385 | 84.0 | 1176 | 0.6018 | 0.3690 | 0.1883 |
| 0.385 | 85.0 | 1190 | 0.5824 | 0.3665 | 0.1855 |
| 0.3463 | 86.0 | 1204 | 0.5985 | 0.3669 | 0.1866 |
| 0.3463 | 87.0 | 1218 | 0.5833 | 0.3669 | 0.1861 |
| 0.3463 | 88.0 | 1232 | 0.5775 | 0.3637 | 0.1862 |
| 0.3463 | 89.0 | 1246 | 0.5747 | 0.3606 | 0.1850 |
| 0.3463 | 90.0 | 1260 | 0.5784 | 0.3639 | 0.1851 |
| 0.3463 | 91.0 | 1274 | 0.5841 | 0.3604 | 0.1858 |
| 0.3463 | 92.0 | 1288 | 0.5762 | 0.3655 | 0.1850 |
| 0.3237 | 93.0 | 1302 | 0.5836 | 0.3598 | 0.1854 |
| 0.3237 | 94.0 | 1316 | 0.5761 | 0.3588 | 0.1841 |
| 0.3237 | 95.0 | 1330 | 0.5822 | 0.3596 | 0.1848 |
| 0.3237 | 96.0 | 1344 | 0.5886 | 0.3592 | 0.1850 |
| 0.3237 | 97.0 | 1358 | 0.5696 | 0.3574 | 0.1830 |
| 0.3237 | 98.0 | 1372 | 0.5794 | 0.3588 | 0.1836 |
| 0.3237 | 99.0 | 1386 | 0.5768 | 0.3570 | 0.1837 |
| 0.2799 | 100.0 | 1400 | 0.5837 | 0.3578 | 0.1844 |
| 0.2799 | 101.0 | 1414 | 0.5697 | 0.3525 | 0.1826 |
| 0.2799 | 102.0 | 1428 | 0.5796 | 0.3566 | 0.1834 |
| 0.2799 | 103.0 | 1442 | 0.5712 | 0.3549 | 0.1825 |
| 0.2799 | 104.0 | 1456 | 0.5796 | 0.3555 | 0.1829 |
| 0.2799 | 105.0 | 1470 | 0.5759 | 0.3553 | 0.1835 |
| 0.2799 | 106.0 | 1484 | 0.5750 | 0.3562 | 0.1831 |
| 0.2799 | 107.0 | 1498 | 0.5650 | 0.3527 | 0.1823 |
| 0.2674 | 108.0 | 1512 | 0.5677 | 0.3499 | 0.1823 |
| 0.2674 | 109.0 | 1526 | 0.5699 | 0.3541 | 0.1826 |
| 0.2674 | 110.0 | 1540 | 0.5779 | 0.3555 | 0.1837 |
| 0.2674 | 111.0 | 1554 | 0.5792 | 0.3551 | 0.1834 |
| 0.2674 | 112.0 | 1568 | 0.5697 | 0.3574 | 0.1829 |
| 0.2674 | 113.0 | 1582 | 0.5852 | 0.3590 | 0.1839 |
| 0.2674 | 114.0 | 1596 | 0.5735 | 0.3537 | 0.1829 |
| 0.2611 | 115.0 | 1610 | 0.5774 | 0.3545 | 0.1832 |
| 0.2611 | 116.0 | 1624 | 0.5836 | 0.3555 | 0.1841 |
| 0.2611 | 117.0 | 1638 | 0.5750 | 0.3517 | 0.1832 |
| 0.2611 | 118.0 | 1652 | 0.5772 | 0.3521 | 0.1825 |
| 0.2611 | 119.0 | 1666 | 0.5793 | 0.3521 | 0.1831 |
| 0.2611 | 120.0 | 1680 | 0.5756 | 0.3517 | 0.1828 |
| 0.2611 | 121.0 | 1694 | 0.5794 | 0.3517 | 0.1830 |
| 0.2476 | 122.0 | 1708 | 0.5719 | 0.3521 | 0.1827 |
| 0.2476 | 123.0 | 1722 | 0.5804 | 0.3543 | 0.1830 |
| 0.2476 | 124.0 | 1736 | 0.5729 | 0.3539 | 0.1825 |
| 0.2476 | 125.0 | 1750 | 0.5874 | 0.3519 | 0.1832 |
| 0.2476 | 126.0 | 1764 | 0.5777 | 0.3533 | 0.1826 |
| 0.2476 | 127.0 | 1778 | 0.5762 | 0.3531 | 0.1822 |

Framework versions

  • Transformers 4.28.0
  • PyTorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3