ft_0124_korean_1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4593
  • CER: 0.1067
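CER is the character error rate: the character-level edit distance between the predicted and reference transcripts, divided by the reference length. A minimal sketch of the computation (a plain Levenshtein-distance implementation, not the exact metric code used during this training run):

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance (insertions, deletions, substitutions) between a and b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def cer(prediction: str, reference: str) -> float:
    """Character error rate of a prediction against a reference transcript."""
    return levenshtein(prediction, reference) / len(reference)
```

A CER of 0.1067 therefore means roughly one character in nine differs from the reference transcript.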

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
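The linear scheduler with warmup ramps the learning rate from 0 to the 1e-4 peak over the first 1,000 steps, then decays it linearly to 0. A sketch of the resulting schedule; the total step count is an assumption (~22,640, extrapolated from the last logged step, 22,500, at epoch 19.88):

```python
def linear_schedule_lr(step: int, peak: float = 1e-4,
                       warmup_steps: int = 1000,
                       total_steps: int = 22640) -> float:
    """Learning rate at a given step under linear warmup + linear decay."""
    if step < warmup_steps:
        return peak * step / warmup_steps  # ramp 0 -> peak
    # decay peak -> 0 over the remaining steps
    return peak * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

This mirrors what `transformers.get_linear_schedule_with_warmup` produces for `lr_scheduler_type: linear` with `lr_scheduler_warmup_steps: 1000`.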

Training results

| Training Loss | Epoch | Step  | Validation Loss | CER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 33.156        | 0.44  | 500   | 10.0563         | 1.0    |
| 4.9299        | 0.88  | 1000  | 4.8856          | 1.0    |
| 4.6283        | 1.33  | 1500  | 4.5959          | 1.0    |
| 4.4245        | 1.77  | 2000  | 4.2900          | 0.9513 |
| 3.8155        | 2.21  | 2500  | 2.7733          | 0.5324 |
| 2.6597        | 2.65  | 3000  | 2.0091          | 0.4216 |
| 2.1347        | 3.09  | 3500  | 1.5842          | 0.3535 |
| 1.7847        | 3.53  | 4000  | 1.3425          | 0.3124 |
| 1.6031        | 3.98  | 4500  | 1.1478          | 0.2750 |
| 1.3867        | 4.42  | 5000  | 0.9914          | 0.2466 |
| 1.2552        | 4.86  | 5500  | 0.8959          | 0.2258 |
| 1.1442        | 5.3   | 6000  | 0.8326          | 0.2123 |
| 1.0747        | 5.74  | 6500  | 0.7708          | 0.2053 |
| 0.985         | 6.18  | 7000  | 0.7137          | 0.1864 |
| 0.921         | 6.63  | 7500  | 0.6822          | 0.1818 |
| 0.8817        | 7.07  | 8000  | 0.6435          | 0.1716 |
| 0.8043        | 7.51  | 8500  | 0.6338          | 0.1692 |
| 0.7938        | 7.95  | 9000  | 0.6075          | 0.1613 |
| 0.7296        | 8.39  | 9500  | 0.5844          | 0.1578 |
| 0.7061        | 8.83  | 10000 | 0.5695          | 0.1533 |
| 0.6566        | 9.28  | 10500 | 0.5695          | 0.1478 |
| 0.6452        | 9.72  | 11000 | 0.5346          | 0.1439 |
| 0.6178        | 10.16 | 11500 | 0.5184          | 0.1404 |
| 0.5887        | 10.6  | 12000 | 0.5152          | 0.1360 |
| 0.5739        | 11.04 | 12500 | 0.5062          | 0.1356 |
| 0.5338        | 11.48 | 13000 | 0.5135          | 0.1321 |
| 0.5391        | 11.93 | 13500 | 0.5021          | 0.1316 |
| 0.4964        | 12.37 | 14000 | 0.4924          | 0.1269 |
| 0.4959        | 12.81 | 14500 | 0.4860          | 0.1262 |
| 0.4731        | 13.25 | 15000 | 0.4893          | 0.1227 |
| 0.4651        | 13.69 | 15500 | 0.4718          | 0.1204 |
| 0.4446        | 14.13 | 16000 | 0.4815          | 0.1180 |
| 0.4175        | 14.58 | 16500 | 0.4780          | 0.1189 |
| 0.4249        | 15.02 | 17000 | 0.4678          | 0.1163 |
| 0.4073        | 15.46 | 17500 | 0.4599          | 0.1141 |
| 0.3948        | 15.9  | 18000 | 0.4676          | 0.1136 |
| 0.3795        | 16.34 | 18500 | 0.4656          | 0.1119 |
| 0.3807        | 16.78 | 19000 | 0.4642          | 0.1100 |
| 0.3675        | 17.23 | 19500 | 0.4661          | 0.1108 |
| 0.3609        | 17.67 | 20000 | 0.4589          | 0.1086 |
| 0.3454        | 18.11 | 20500 | 0.4645          | 0.1088 |
| 0.3451        | 18.55 | 21000 | 0.4570          | 0.1076 |
| 0.3496        | 18.99 | 21500 | 0.4555          | 0.1072 |
| 0.3327        | 19.43 | 22000 | 0.4619          | 0.1075 |
| 0.334         | 19.88 | 22500 | 0.4593          | 0.1067 |
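The Step and Epoch columns above also imply the approximate training-set size. A back-of-envelope sketch, assuming `train_batch_size = 8` with no gradient accumulation (gradient accumulation is not listed in the hyperparameters):

```python
# Values taken from the last row of the training log above.
last_step, last_epoch, batch_size = 22500, 19.88, 8

steps_per_epoch = last_step / last_epoch          # roughly 1,130 steps/epoch
approx_examples = steps_per_epoch * batch_size    # roughly 9,000 training examples
```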

Framework versions

  • Transformers 4.36.2
  • Pytorch 2.1.2+cu118
  • Datasets 2.16.1
  • Tokenizers 0.15.0