End of training
README.md
CHANGED
@@ -1,199 +1,116 @@

(Removed: the previous README.md, the default unfilled model card template. Its sections (Testing Data, Factors, Metrics, Results, Model Examination, Environmental Impact, Technical Specifications, Citation, Glossary, More Information, Model Card Authors, Model Card Contact) contained only "[More Information Needed]" placeholders. The updated model card follows.)

---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-1b
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-1b-scandinavian-E2-100h-30-epochs-20250123
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-xls-r-1b-scandinavian-E2-100h-30-epochs-20250123

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on an unknown dataset (the model name suggests roughly 100 hours of Scandinavian speech).
It achieves the following results on the evaluation set:
- Loss: 0.1348
- Wer: 21.6270
- Cer: 4.5904

## Model description

More information needed

## Intended uses & limitations

More information needed
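
As a rough usage sketch (not part of the original card): a wav2vec2 CTC checkpoint like this one can be loaded for transcription with the transformers ASR pipeline. The repo id and audio path below are placeholders, not details taken from this card.

```python
# Minimal inference sketch. The model id is a hypothetical placeholder for
# wherever this checkpoint is hosted; any 16 kHz mono audio file should work.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-namespace/wav2vec2-xls-r-1b-scandinavian-E2-100h-30-epochs-20250123",  # hypothetical repo id
)

print(asr("sample_16khz.wav")["text"])  # path to a local audio file (assumed)
```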

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 6000
- num_epochs: 30
- mixed_precision_training: Native AMP
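
For reference, these settings map roughly onto a transformers `TrainingArguments` configuration like the sketch below. This is a reconstruction from the list above, not the authors' training script; `output_dir` and anything not listed is an assumption.

```python
from transformers import TrainingArguments

# Approximate mapping of the listed hyperparameters (illustrative only).
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-1b-scandinavian-E2-100h-30-epochs-20250123",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    optim="adamw_torch",            # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    warmup_steps=6000,
    num_train_epochs=30,
    fp16=True,                      # mixed precision ("Native AMP")
)
```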

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-------:|:-----:|:---------------:|:-------:|:-------:|
| 1.0569 | 0.5831 | 1000 | 0.5118 | 63.8799 | 16.0725 |
| 0.4643 | 1.1662 | 2000 | 0.2647 | 38.1428 | 9.2101 |
| 0.3797 | 1.7493 | 3000 | 0.2158 | 33.0390 | 7.8553 |
| 0.3558 | 2.3324 | 4000 | 0.1996 | 31.8993 | 7.4365 |
| 0.3784 | 2.9155 | 5000 | 0.2039 | 31.7718 | 7.5216 |
| 0.4332 | 3.4985 | 6000 | 0.2181 | 33.0704 | 7.8748 |
| 0.3112 | 4.0816 | 7000 | 0.2070 | 32.4627 | 7.6624 |
| 0.3189 | 4.6647 | 8000 | 0.1970 | 31.6000 | 7.4421 |
| 0.2768 | 5.2478 | 9000 | 0.1998 | 31.2380 | 7.4350 |
| 0.2924 | 5.8309 | 10000 | 0.1871 | 30.2479 | 7.0815 |
| 0.2971 | 6.4140 | 11000 | 0.1874 | 30.3957 | 7.2106 |
| 0.3326 | 6.9971 | 12000 | 0.1860 | 29.7972 | 7.0494 |
| 0.2833 | 7.5802 | 13000 | 0.1908 | 30.0964 | 7.1554 |
| 0.2283 | 8.1633 | 14000 | 0.1775 | 29.1211 | 6.8593 |
| 0.2145 | 8.7464 | 15000 | 0.1768 | 28.5836 | 6.6787 |
| 0.2556 | 9.3294 | 16000 | 0.1785 | 28.7831 | 6.7674 |
| 0.2567 | 9.9125 | 17000 | 0.1738 | 28.3009 | 6.5702 |
| 0.2418 | 10.4956 | 18000 | 0.1719 | 28.5466 | 6.6845 |
| 0.1733 | 11.0787 | 19000 | 0.1619 | 27.0467 | 6.2287 |
| 0.1734 | 11.6618 | 20000 | 0.1624 | 26.6052 | 6.1501 |
| 0.2026 | 12.2449 | 21000 | 0.1614 | 26.8417 | 6.1794 |
| 0.1898 | 12.8280 | 22000 | 0.1541 | 26.6902 | 6.1292 |
| 0.2149 | 13.4111 | 23000 | 0.1570 | 26.3780 | 6.0525 |
| 0.1751 | 13.9942 | 24000 | 0.1494 | 25.8756 | 5.8931 |
| 0.1488 | 14.5773 | 25000 | 0.1468 | 25.5819 | 5.7224 |
| 0.1283 | 15.1603 | 26000 | 0.1477 | 25.3380 | 5.6983 |
| 0.1363 | 15.7434 | 27000 | 0.1463 | 24.8430 | 5.5717 |
| 0.1324 | 16.3265 | 28000 | 0.1453 | 24.9095 | 5.5905 |
| 0.143 | 16.9096 | 29000 | 0.1434 | 24.6915 | 5.5245 |
| 0.1241 | 17.4927 | 30000 | 0.1436 | 24.1965 | 5.3843 |
| 0.0976 | 18.0758 | 31000 | 0.1475 | 24.1429 | 5.3458 |
| 0.0945 | 18.6589 | 32000 | 0.1405 | 23.4871 | 5.1769 |
| 0.1085 | 19.2420 | 33000 | 0.1393 | 23.6515 | 5.1972 |
| 0.1165 | 19.8251 | 34000 | 0.1350 | 23.2599 | 5.1165 |
| 0.1195 | 20.4082 | 35000 | 0.1435 | 23.1196 | 5.0626 |
| 0.101 | 20.9913 | 36000 | 0.1366 | 22.9219 | 4.9904 |
| 0.0764 | 21.5743 | 37000 | 0.1359 | 22.6892 | 4.9498 |
| 0.0806 | 22.1574 | 38000 | 0.1372 | 22.5506 | 4.8653 |
| 0.084 | 22.7405 | 39000 | 0.1350 | 22.3530 | 4.8049 |
| 0.0825 | 23.3236 | 40000 | 0.1380 | 22.1812 | 4.7661 |
| 0.0788 | 23.9067 | 41000 | 0.1407 | 22.1221 | 4.7596 |
| 0.0632 | 24.4898 | 42000 | 0.1385 | 22.0463 | 4.7328 |
| 0.0502 | 25.0729 | 43000 | 0.1393 | 21.9761 | 4.6974 |
| 0.0482 | 25.6560 | 44000 | 0.1384 | 21.9946 | 4.6878 |
| 0.0612 | 26.2391 | 45000 | 0.1364 | 21.7619 | 4.6330 |
| 0.0708 | 26.8222 | 46000 | 0.1346 | 21.7378 | 4.6234 |
| 0.0609 | 27.4052 | 47000 | 0.1353 | 21.7101 | 4.6142 |
| 0.0632 | 27.9883 | 48000 | 0.1361 | 21.7064 | 4.6071 |
| 0.0528 | 28.5714 | 49000 | 0.1350 | 21.6769 | 4.6064 |
| 0.0696 | 29.1545 | 50000 | 0.1347 | 21.6381 | 4.5898 |
| 0.0564 | 29.7376 | 51000 | 0.1348 | 21.6270 | 4.5904 |
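
The Wer and Cer columns appear to be percentages. Below is a short sketch of how such scores are typically computed with the `evaluate` library; the transcripts are invented examples, not data from this run.

```python
# Hedged sketch: compute WER/CER for a batch of transcriptions.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")  # requires the jiwer package

predictions = ["hej verden"]       # model outputs (made-up example)
references = ["hej hele verden"]   # reference transcripts (made-up example)

print("WER (%):", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER (%):", 100 * cer_metric.compute(predictions=predictions, references=references))
```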

### Framework versions

- Transformers 4.48.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0