---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-large-xlsr-coraa-exp-17
  results: []
---


# wav2vec2-large-xlsr-coraa-exp-17

This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set (metric definitions and a computation sketch follow the list):
- Loss: 0.5486
- Wer: 0.3511
- Cer: 0.1797
- Per: 0.3387
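
WER and CER are edit-distance error rates computed over words and characters, respectively; PER is the analogous rate over phoneme sequences, which requires a grapheme-to-phoneme step not shown here. As a sketch, WER and CER can be computed with the `jiwer` package (an assumption; the card does not record which implementation produced the numbers above):

```python
# Hedged sketch: WER/CER with jiwer (not necessarily the implementation
# behind the figures reported in this card).
import jiwer

reference = "o gato subiu no telhado"
hypothesis = "o gato subiu telhado"

print(f"WER: {jiwer.wer(reference, hypothesis):.4f}")  # word-level edit rate
print(f"CER: {jiwer.cer(reference, hypothesis):.4f}")  # character-level edit rate
```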

## Model description

This checkpoint continues fine-tuning [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese), a wav2vec2 XLSR model for Portuguese automatic speech recognition, with CTC training configured for 150 epochs (the results table stops at epoch 117, suggesting early stopping). Performance is reported as word error rate (WER), character error rate (CER), and phoneme error rate (PER).

## Intended uses & limitations

The model is intended for Portuguese automatic speech recognition: it transcribes 16 kHz mono audio to text. Because the fine-tuning data is not documented, behavior on accents, domains, and recording conditions outside the (unknown) training distribution has not been characterized. A minimal inference sketch follows.
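
The sketch below is an illustration, assuming the checkpoint follows the standard Wav2Vec2 CTC layout; the model id and the audio file name are placeholders to adjust:

```python
# Minimal inference sketch (assumptions: standard Wav2Vec2 CTC checkpoint
# layout; "audio.wav" is a placeholder mono input file).
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "wav2vec2-large-xlsr-coraa-exp-17"  # replace with the full hub path

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("audio.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000,
                   return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```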

## Training and evaluation data

Not documented; the Trainer recorded only that the model was fine-tuned and evaluated on an unknown dataset.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a code sketch reproducing them follows the list):
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 150
- mixed_precision_training: Native AMP
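
The Adam betas and epsilon above match the Transformers defaults, so no optimizer overrides are needed. Below is a sketch of a `TrainingArguments` configuration mirroring this list (Transformers 4.28 API; the model, dataset, data collator, and metric wiring are omitted, and the output directory is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-exp-17",  # placeholder path
    learning_rate=4e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # 16 * 2 = total train batch size of 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.01,
    num_train_epochs=150,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # the results table reports per-epoch eval
)
```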

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    | Per    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| 38.4208       | 1.0   | 14   | 41.8095         | 1.0057 | 1.2146 | 1.0057 |
| 38.4208       | 2.0   | 28   | 12.2873         | 1.0    | 0.9619 | 1.0    |
| 38.4208       | 3.0   | 42   | 4.8093          | 1.0    | 0.9619 | 1.0    |
| 38.4208       | 4.0   | 56   | 3.9738          | 1.0    | 0.9619 | 1.0    |
| 38.4208       | 5.0   | 70   | 3.6684          | 1.0    | 0.9619 | 1.0    |
| 38.4208       | 6.0   | 84   | 3.5007          | 1.0    | 0.9619 | 1.0    |
| 38.4208       | 7.0   | 98   | 3.3854          | 1.0    | 0.9619 | 1.0    |
| 11.8009       | 8.0   | 112  | 3.4506          | 1.0    | 0.9619 | 1.0    |
| 11.8009       | 9.0   | 126  | 3.1789          | 1.0    | 0.9619 | 1.0    |
| 11.8009       | 10.0  | 140  | 3.1274          | 1.0    | 0.9619 | 1.0    |
| 11.8009       | 11.0  | 154  | 3.1624          | 1.0    | 0.9619 | 1.0    |
| 11.8009       | 12.0  | 168  | 3.1066          | 1.0    | 0.9619 | 1.0    |
| 11.8009       | 13.0  | 182  | 3.0580          | 1.0    | 0.9619 | 1.0    |
| 11.8009       | 14.0  | 196  | 3.0477          | 1.0    | 0.9619 | 1.0    |
| 3.0395        | 15.0  | 210  | 3.0519          | 1.0    | 0.9619 | 1.0    |
| 3.0395        | 16.0  | 224  | 3.0364          | 1.0    | 0.9619 | 1.0    |
| 3.0395        | 17.0  | 238  | 3.0152          | 1.0    | 0.9619 | 1.0    |
| 3.0395        | 18.0  | 252  | 3.0167          | 1.0    | 0.9619 | 1.0    |
| 3.0395        | 19.0  | 266  | 3.0130          | 1.0    | 0.9619 | 1.0    |
| 3.0395        | 20.0  | 280  | 3.0103          | 1.0    | 0.9619 | 1.0    |
| 3.0395        | 21.0  | 294  | 2.9994          | 1.0    | 0.9619 | 1.0    |
| 2.9424        | 22.0  | 308  | 2.9999          | 1.0    | 0.9619 | 1.0    |
| 2.9424        | 23.0  | 322  | 3.0009          | 1.0    | 0.9619 | 1.0    |
| 2.9424        | 24.0  | 336  | 3.0024          | 1.0    | 0.9619 | 1.0    |
| 2.9424        | 25.0  | 350  | 3.0001          | 1.0    | 0.9619 | 1.0    |
| 2.9424        | 26.0  | 364  | 2.9891          | 1.0    | 0.9619 | 1.0    |
| 2.9424        | 27.0  | 378  | 2.9881          | 1.0    | 0.9619 | 1.0    |
| 2.9424        | 28.0  | 392  | 2.9703          | 1.0    | 0.9619 | 1.0    |
| 2.9154        | 29.0  | 406  | 2.9531          | 1.0    | 0.9619 | 1.0    |
| 2.9154        | 30.0  | 420  | 2.9208          | 1.0    | 0.9619 | 1.0    |
| 2.9154        | 31.0  | 434  | 2.8981          | 1.0    | 0.9619 | 1.0    |
| 2.9154        | 32.0  | 448  | 2.8321          | 1.0    | 0.9619 | 1.0    |
| 2.9154        | 33.0  | 462  | 2.7583          | 1.0    | 0.9619 | 1.0    |
| 2.9154        | 34.0  | 476  | 2.6405          | 1.0    | 0.9616 | 1.0    |
| 2.9154        | 35.0  | 490  | 2.5072          | 1.0    | 0.8832 | 1.0    |
| 2.7552        | 36.0  | 504  | 2.1547          | 1.0    | 0.6144 | 1.0    |
| 2.7552        | 37.0  | 518  | 1.7565          | 1.0    | 0.4996 | 1.0    |
| 2.7552        | 38.0  | 532  | 1.4602          | 1.0    | 0.4065 | 1.0    |
| 2.7552        | 39.0  | 546  | 1.2269          | 0.9896 | 0.3658 | 0.9892 |
| 2.7552        | 40.0  | 560  | 1.0906          | 0.8881 | 0.3205 | 0.8834 |
| 2.7552        | 41.0  | 574  | 0.9941          | 0.6772 | 0.2631 | 0.6603 |
| 2.7552        | 42.0  | 588  | 0.9133          | 0.5423 | 0.2322 | 0.5154 |
| 1.4599        | 43.0  | 602  | 0.8487          | 0.5142 | 0.2241 | 0.4882 |
| 1.4599        | 44.0  | 616  | 0.8211          | 0.4898 | 0.2207 | 0.4626 |
| 1.4599        | 45.0  | 630  | 0.7672          | 0.4803 | 0.2140 | 0.4518 |
| 1.4599        | 46.0  | 644  | 0.7432          | 0.4707 | 0.2092 | 0.4445 |
| 1.4599        | 47.0  | 658  | 0.7390          | 0.4492 | 0.2059 | 0.4262 |
| 1.4599        | 48.0  | 672  | 0.6994          | 0.4348 | 0.2011 | 0.4106 |
| 1.4599        | 49.0  | 686  | 0.6999          | 0.4230 | 0.1991 | 0.3998 |
| 0.7585        | 50.0  | 700  | 0.6738          | 0.4122 | 0.1959 | 0.3883 |
| 0.7585        | 51.0  | 714  | 0.6697          | 0.4094 | 0.1963 | 0.3858 |
| 0.7585        | 52.0  | 728  | 0.6707          | 0.4163 | 0.1996 | 0.3954 |
| 0.7585        | 53.0  | 742  | 0.6397          | 0.4031 | 0.1942 | 0.3832 |
| 0.7585        | 54.0  | 756  | 0.6293          | 0.4039 | 0.1939 | 0.3836 |
| 0.7585        | 55.0  | 770  | 0.6479          | 0.4027 | 0.1946 | 0.3852 |
| 0.7585        | 56.0  | 784  | 0.6307          | 0.3982 | 0.1934 | 0.3822 |
| 0.7585        | 57.0  | 798  | 0.6166          | 0.3844 | 0.1908 | 0.3673 |
| 0.5473        | 58.0  | 812  | 0.6099          | 0.3860 | 0.1906 | 0.3708 |
| 0.5473        | 59.0  | 826  | 0.6007          | 0.3868 | 0.1904 | 0.3730 |
| 0.5473        | 60.0  | 840  | 0.6191          | 0.3885 | 0.1928 | 0.3744 |
| 0.5473        | 61.0  | 854  | 0.6015          | 0.3885 | 0.1892 | 0.3732 |
| 0.5473        | 62.0  | 868  | 0.5965          | 0.3838 | 0.1902 | 0.3688 |
| 0.5473        | 63.0  | 882  | 0.5926          | 0.3826 | 0.1904 | 0.3667 |
| 0.5473        | 64.0  | 896  | 0.6188          | 0.3921 | 0.1921 | 0.3765 |
| 0.443         | 65.0  | 910  | 0.5835          | 0.3830 | 0.1892 | 0.3690 |
| 0.443         | 66.0  | 924  | 0.5914          | 0.3870 | 0.1903 | 0.3722 |
| 0.443         | 67.0  | 938  | 0.5828          | 0.3779 | 0.1876 | 0.3627 |
| 0.443         | 68.0  | 952  | 0.5745          | 0.3722 | 0.1857 | 0.3576 |
| 0.443         | 69.0  | 966  | 0.5786          | 0.3795 | 0.1882 | 0.3633 |
| 0.443         | 70.0  | 980  | 0.5869          | 0.3751 | 0.1884 | 0.3604 |
| 0.443         | 71.0  | 994  | 0.5923          | 0.3753 | 0.1888 | 0.3596 |
| 0.3564        | 72.0  | 1008 | 0.5707          | 0.3714 | 0.1859 | 0.3578 |
| 0.3564        | 73.0  | 1022 | 0.5733          | 0.3700 | 0.1857 | 0.3551 |
| 0.3564        | 74.0  | 1036 | 0.5731          | 0.3706 | 0.1854 | 0.3566 |
| 0.3564        | 75.0  | 1050 | 0.5644          | 0.3669 | 0.1847 | 0.3531 |
| 0.3564        | 76.0  | 1064 | 0.5661          | 0.3702 | 0.1852 | 0.3555 |
| 0.3564        | 77.0  | 1078 | 0.5705          | 0.3675 | 0.1847 | 0.3513 |
| 0.3564        | 78.0  | 1092 | 0.5631          | 0.3671 | 0.1835 | 0.3527 |
| 0.3456        | 79.0  | 1106 | 0.5675          | 0.3651 | 0.1831 | 0.3503 |
| 0.3456        | 80.0  | 1120 | 0.5697          | 0.3645 | 0.1846 | 0.3507 |
| 0.3456        | 81.0  | 1134 | 0.5644          | 0.3631 | 0.1841 | 0.3492 |
| 0.3456        | 82.0  | 1148 | 0.5657          | 0.3627 | 0.1843 | 0.3480 |
| 0.3456        | 83.0  | 1162 | 0.5831          | 0.3679 | 0.1876 | 0.3523 |
| 0.3456        | 84.0  | 1176 | 0.5824          | 0.3659 | 0.1862 | 0.3523 |
| 0.3456        | 85.0  | 1190 | 0.5567          | 0.3653 | 0.1833 | 0.3509 |
| 0.3073        | 86.0  | 1204 | 0.5755          | 0.3649 | 0.1852 | 0.3507 |
| 0.3073        | 87.0  | 1218 | 0.5590          | 0.3586 | 0.1829 | 0.3450 |
| 0.3073        | 88.0  | 1232 | 0.5663          | 0.3610 | 0.1835 | 0.3480 |
| 0.3073        | 89.0  | 1246 | 0.5734          | 0.3618 | 0.1851 | 0.3468 |
| 0.3073        | 90.0  | 1260 | 0.5657          | 0.3602 | 0.1830 | 0.3458 |
| 0.3073        | 91.0  | 1274 | 0.5651          | 0.3578 | 0.1828 | 0.3442 |
| 0.3073        | 92.0  | 1288 | 0.5608          | 0.3557 | 0.1820 | 0.3415 |
| 0.2836        | 93.0  | 1302 | 0.5505          | 0.3525 | 0.1807 | 0.3389 |
| 0.2836        | 94.0  | 1316 | 0.5495          | 0.3501 | 0.1798 | 0.3375 |
| 0.2836        | 95.0  | 1330 | 0.5693          | 0.3557 | 0.1816 | 0.3432 |
| 0.2836        | 96.0  | 1344 | 0.5638          | 0.3564 | 0.1822 | 0.3417 |
| 0.2836        | 97.0  | 1358 | 0.5486          | 0.3511 | 0.1797 | 0.3387 |
| 0.2836        | 98.0  | 1372 | 0.5618          | 0.3545 | 0.1810 | 0.3415 |
| 0.2836        | 99.0  | 1386 | 0.5637          | 0.3515 | 0.1800 | 0.3399 |
| 0.2502        | 100.0 | 1400 | 0.5658          | 0.3555 | 0.1810 | 0.3438 |
| 0.2502        | 101.0 | 1414 | 0.5527          | 0.3525 | 0.1795 | 0.3411 |
| 0.2502        | 102.0 | 1428 | 0.5701          | 0.3562 | 0.1807 | 0.3440 |
| 0.2502        | 103.0 | 1442 | 0.5543          | 0.3497 | 0.1794 | 0.3389 |
| 0.2502        | 104.0 | 1456 | 0.5660          | 0.3509 | 0.1803 | 0.3399 |
| 0.2502        | 105.0 | 1470 | 0.5543          | 0.3501 | 0.1795 | 0.3399 |
| 0.2502        | 106.0 | 1484 | 0.5742          | 0.3547 | 0.1817 | 0.3432 |
| 0.2502        | 107.0 | 1498 | 0.5527          | 0.3454 | 0.1789 | 0.3350 |
| 0.2368        | 108.0 | 1512 | 0.5577          | 0.3497 | 0.1789 | 0.3379 |
| 0.2368        | 109.0 | 1526 | 0.5539          | 0.3452 | 0.1789 | 0.3356 |
| 0.2368        | 110.0 | 1540 | 0.5700          | 0.3517 | 0.1802 | 0.3417 |
| 0.2368        | 111.0 | 1554 | 0.5627          | 0.3501 | 0.1794 | 0.3397 |
| 0.2368        | 112.0 | 1568 | 0.5622          | 0.3497 | 0.1797 | 0.3405 |
| 0.2368        | 113.0 | 1582 | 0.5708          | 0.3495 | 0.1801 | 0.3403 |
| 0.2368        | 114.0 | 1596 | 0.5733          | 0.3511 | 0.1805 | 0.3401 |
| 0.2288        | 115.0 | 1610 | 0.5615          | 0.3486 | 0.1795 | 0.3387 |
| 0.2288        | 116.0 | 1624 | 0.5741          | 0.3497 | 0.1809 | 0.3397 |
| 0.2288        | 117.0 | 1638 | 0.5610          | 0.3460 | 0.1796 | 0.3373 |


### Framework versions

- Transformers 4.28.0
- PyTorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.13.3