# w2v2_ablation_200epoch-with_ling_head-0drop-0load_best-best_on_tp0.025_tl10_fp0.001_fl16
This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vietnamese-250h](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vietnamese-250h) on an unknown dataset.
It achieves the following results on the evaluation set (a WER computation sketch follows the list):
- Loss: 0.5364
- Wer: 0.1808
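
Loss is the evaluation loss and Wer the word error rate. As a quick reference, WER can be computed with the `evaluate` library (a sketch with invented placeholder strings; in practice the predictions come from decoding the evaluation set, and `evaluate` needs the `jiwer` package installed):

```python
import evaluate

# Placeholder hypothesis/reference pairs -- in practice these come from
# decoding the evaluation set with the fine-tuned model.
predictions = ["xin chào các bạn"]
references = ["xin chào các bạn nhé"]

wer_metric = evaluate.load("wer")
# One deletion over five reference words -> 0.2
print(wer_metric.compute(predictions=predictions, references=references))
```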
## Model description
More information needed
## Intended uses & limitations
More information needed
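
Pending fuller documentation, here is a minimal transcription sketch. It assumes the checkpoint loads with the stock `Wav2Vec2ForCTC` class (the `ling_head` in the run name may require the author's custom code instead) and expects 16 kHz mono audio, as the wav2vec2-base Vietnamese backbone does; `sample.wav` is a placeholder file name:

```python
import soundfile as sf
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Repo id taken from this card's title.
MODEL_ID = "tuanio/w2v2_ablation_200epoch-with_ling_head-0drop-0load_best-best_on_tp0.025_tl10_fp0.001_fl16"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# "sample.wav" is a placeholder: any 16 kHz mono Vietnamese clip.
speech, sample_rate = sf.read("sample.wav")
assert sample_rate == 16_000, "resample to 16 kHz before running the model"

inputs = processor(speech, sampling_rate=sample_rate, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```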
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 128
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 200
- mixed_precision_training: Native AMP
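
For reference, the list above maps onto Hugging Face `TrainingArguments` roughly as below. This is a reconstruction rather than the author's actual script: `output_dir` is a placeholder, and the 4-GPU launch (e.g. via `torchrun`) is implied by the reported totals (train 8 × 4 = 32, eval 32 × 4 = 128):

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above; output_dir is a placeholder.
# Per-device batch sizes assume 4 GPUs, matching the reported totals of 32/128.
training_args = TrainingArguments(
    output_dir="w2v2_ablation",   # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=200,
    fp16=True,                    # "Native AMP" mixed precision
)
```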
### Training results
| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 5.0922        | 4.72   | 500   | 5.2296          | 1.0    |
| 4.3263        | 9.43   | 1000  | 5.4260          | 1.0    |
| 1.9702        | 14.15  | 1500  | 1.6761          | 0.4369 |
| 0.7013        | 18.87  | 2000  | 0.6799          | 0.2360 |
| 0.4391        | 23.58  | 2500  | 0.5237          | 0.1964 |
| 0.3015        | 28.3   | 3000  | 0.4437          | 0.1849 |
| 0.2416        | 33.02  | 3500  | 0.4311          | 0.2081 |
| 0.2057        | 37.74  | 4000  | 0.4202          | 0.1697 |
| 0.1714        | 42.45  | 4500  | 0.4270          | 0.1738 |
| 0.1812        | 47.17  | 5000  | 0.4467          | 0.1600 |
| 0.1498        | 51.89  | 5500  | 0.4322          | 0.2197 |
| 0.1255        | 56.6   | 6000  | 0.4408          | 0.1696 |
| 0.1148        | 61.32  | 6500  | 0.4531          | 0.1765 |
| 0.1112        | 66.04  | 7000  | 0.4572          | 0.2148 |
| 0.1038        | 70.75  | 7500  | 0.4648          | 0.1894 |
| 0.0923        | 75.47  | 8000  | 0.4812          | 0.1558 |
| 0.086         | 80.19  | 8500  | 0.4882          | 0.1894 |
| 0.0872        | 84.91  | 9000  | 0.4662          | 0.1744 |
| 0.0778        | 89.62  | 9500  | 0.4800          | 0.1750 |
| 0.0709        | 94.34  | 10000 | 0.5077          | 0.1960 |
| 0.0703        | 99.06  | 10500 | 0.5038          | 0.1740 |
| 0.0721        | 103.77 | 11000 | 0.5131          | 0.1763 |
| 0.0717        | 108.49 | 11500 | 0.5091          | 0.1896 |
| 0.0818        | 113.21 | 12000 | 0.5173          | 0.1908 |
| 0.0626        | 117.92 | 12500 | 0.5158          | 0.1865 |
| 0.0749        | 122.64 | 13000 | 0.5208          | 0.1865 |
| 0.0592        | 127.36 | 13500 | 0.5244          | 0.1781 |
| 0.055         | 132.08 | 14000 | 0.5303          | 0.1810 |
| 0.0487        | 136.79 | 14500 | 0.5264          | 0.1739 |
| 0.0486        | 141.51 | 15000 | 0.5225          | 0.1814 |
| 0.0478        | 146.23 | 15500 | 0.5316          | 0.1870 |
| 0.0453        | 150.94 | 16000 | 0.5270          | 0.1776 |
| 0.0449        | 155.66 | 16500 | 0.5318          | 0.1821 |
| 0.0585        | 160.38 | 17000 | 0.5332          | 0.1775 |
| 0.0481        | 165.09 | 17500 | 0.5373          | 0.1784 |
| 0.0459        | 169.81 | 18000 | 0.5335          | 0.1756 |
| 0.0473        | 174.53 | 18500 | 0.5360          | 0.1808 |
| 0.0512        | 179.25 | 19000 | 0.5347          | 0.1791 |
| 0.046         | 183.96 | 19500 | 0.5367          | 0.1778 |
| 0.048         | 188.68 | 20000 | 0.5354          | 0.1783 |
| 0.0471        | 193.4  | 20500 | 0.5366          | 0.1814 |
| 0.0419        | 198.11 | 21000 | 0.5364          | 0.1808 |
### Framework versions
- Transformers 4.35.2
- Pytorch 1.13.1+cu117
- Datasets 2.12.0
- Tokenizers 0.14.1