indicwav2vec_outputs

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how the CER/WER metrics can be computed follows the list):

  • Loss: nan
  • CER: 1.0
  • WER: 1.0

Note that a NaN loss together with a CER and WER of 1.0 means every evaluation prediction was wrong; the training log below shows the run diverging around epoch 17.
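
For reference, a minimal sketch of computing CER and WER with the `jiwer` library (the transcripts below are illustrative placeholders, not taken from the evaluation set):

```python
# Minimal sketch: computing WER and CER with jiwer.
# The reference/hypothesis pair is an illustrative placeholder.
import jiwer

reference = "the quick brown fox"
hypothesis = "the quick brown box"

wer = jiwer.wer(reference, hypothesis)  # word error rate: 1 of 4 words wrong -> 0.25
cer = jiwer.cer(reference, hypothesis)  # character error rate: 1 substituted character
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```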

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 1011
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 2000
  • num_epochs: 35.0
  • mixed_precision_training: Native AMP
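
These settings map onto `transformers.TrainingArguments` roughly as follows; a minimal sketch, assuming the Trainer API was used (the `output_dir` value is assumed from the model name, and unlisted arguments keep their defaults):

```python
# Minimal sketch of the corresponding transformers.TrainingArguments.
# output_dir is an assumption; unlisted arguments keep their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="indicwav2vec_outputs",  # assumption, taken from the model name
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=1011,
    gradient_accumulation_steps=2,  # 16 x 2 = total train batch size 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=35.0,
    fp16=True,  # Native AMP mixed-precision training
)
```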

Training results

| Training Loss | Epoch | Step | Validation Loss | CER | WER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 3.8795 | 0.3028 | 500 | 3.7869 | 0.9860 | 1.0 |
| 1.8805 | 0.6057 | 1000 | 2.0423 | 0.4124 | 0.6416 |
| 1.5823 | 0.9085 | 1500 | 1.7622 | 0.3701 | 0.5792 |
| 2.2702 | 1.2114 | 2000 | 2.0595 | 0.5233 | 0.8442 |
| 2.7429 | 1.5142 | 2500 | 2.9181 | 0.8706 | 0.9792 |
| 3.1077 | 1.8171 | 3000 | 3.0393 | 0.9061 | 0.9898 |
| 2.9896 | 2.1199 | 3500 | 2.8581 | 0.8528 | 0.9778 |
| 3.2643 | 2.4228 | 4000 | 3.0456 | 0.8025 | 0.9649 |
| 3.6542 | 2.7256 | 4500 | 3.4606 | 0.8008 | 0.9658 |
| 3.7622 | 3.0285 | 5000 | 3.6476 | 0.8315 | 0.9835 |
| 3.8614 | 3.3313 | 5500 | 3.8326 | 0.8628 | 0.9924 |
| 3.9769 | 3.6342 | 6000 | 4.0055 | 0.8808 | 0.9953 |
| 4.1241 | 3.9370 | 6500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1261 | 4.2399 | 7000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1009 | 4.5427 | 7500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1698 | 4.8455 | 8000 | 4.1374 | 0.8920 | 0.9965 |
| 4.129 | 5.1484 | 8500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1413 | 5.4512 | 9000 | 4.1374 | 0.8920 | 0.9965 |
| 4.122 | 5.7541 | 9500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1652 | 6.0569 | 10000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1801 | 6.3598 | 10500 | 4.1374 | 0.8920 | 0.9965 |
| 4.092 | 6.6626 | 11000 | 4.1374 | 0.8920 | 0.9965 |
| 4.0204 | 6.9655 | 11500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1036 | 7.2683 | 12000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1918 | 7.5712 | 12500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1059 | 7.8740 | 13000 | 4.1374 | 0.8920 | 0.9965 |
| 4.0833 | 8.1769 | 13500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1278 | 8.4797 | 14000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1365 | 8.7826 | 14500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1201 | 9.0854 | 15000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1476 | 9.3882 | 15500 | 4.1374 | 0.8920 | 0.9965 |
| 4.0935 | 9.6911 | 16000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1109 | 9.9939 | 16500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1389 | 10.2968 | 17000 | 4.1374 | 0.8920 | 0.9965 |
| 4.0907 | 10.5996 | 17500 | 4.1374 | 0.8920 | 0.9965 |
| 4.0825 | 10.9025 | 18000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1094 | 11.2053 | 18500 | 4.1374 | 0.8920 | 0.9965 |
| 4.0689 | 11.5082 | 19000 | 4.1374 | 0.8920 | 0.9965 |
| 4.0984 | 11.8110 | 19500 | 4.1374 | 0.8920 | 0.9965 |
| 4.0569 | 12.1139 | 20000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1462 | 12.4167 | 20500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1554 | 12.7196 | 21000 | 4.1374 | 0.8920 | 0.9965 |
| 4.2207 | 13.0224 | 21500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1518 | 13.3253 | 22000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1521 | 13.6281 | 22500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1367 | 13.9310 | 23000 | 4.1374 | 0.8920 | 0.9965 |
| 4.0904 | 14.2338 | 23500 | 4.1374 | 0.8920 | 0.9965 |
| 4.0813 | 14.5366 | 24000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1001 | 14.8395 | 24500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1333 | 15.1423 | 25000 | 4.1374 | 0.8920 | 0.9965 |
| 4.0785 | 15.4452 | 25500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1651 | 15.7480 | 26000 | 4.1374 | 0.8920 | 0.9965 |
| 4.0987 | 16.0509 | 26500 | 4.1374 | 0.8920 | 0.9965 |
| 4.1327 | 16.3537 | 27000 | 4.1374 | 0.8920 | 0.9965 |
| 4.1128 | 16.6566 | 27500 | 4.1374 | 0.8920 | 0.9965 |
| 4.0694 | 16.9594 | 28000 | 4.1374 | 0.8920 | 0.9965 |
| 5.946 | 17.2623 | 28500 | nan | 1.0 | 1.0 |
| 0.0 | 17.5651 | 29000 | nan | 1.0 | 1.0 |
| 0.0 | 17.8680 | 29500 | nan | 1.0 | 1.0 |
| 0.0 | 18.1708 | 30000 | nan | 1.0 | 1.0 |
| 0.0 | 18.4737 | 30500 | nan | 1.0 | 1.0 |
| 0.0 | 18.7765 | 31000 | nan | 1.0 | 1.0 |
| 0.0 | 19.0793 | 31500 | nan | 1.0 | 1.0 |
| 0.0 | 19.3822 | 32000 | nan | 1.0 | 1.0 |
| 0.0 | 19.6850 | 32500 | nan | 1.0 | 1.0 |
| 0.0 | 19.9879 | 33000 | nan | 1.0 | 1.0 |
| 0.0 | 20.2907 | 33500 | nan | 1.0 | 1.0 |
| 0.0 | 20.5936 | 34000 | nan | 1.0 | 1.0 |
| 0.0 | 20.8964 | 34500 | nan | 1.0 | 1.0 |
| 0.0 | 21.1993 | 35000 | nan | 1.0 | 1.0 |
| 0.0 | 21.5021 | 35500 | nan | 1.0 | 1.0 |
| 0.0 | 21.8050 | 36000 | nan | 1.0 | 1.0 |
| 0.0 | 22.1078 | 36500 | nan | 1.0 | 1.0 |
| 0.0 | 22.4107 | 37000 | nan | 1.0 | 1.0 |
| 0.0 | 22.7135 | 37500 | nan | 1.0 | 1.0 |
| 0.0 | 23.0164 | 38000 | nan | 1.0 | 1.0 |
| 0.0 | 23.3192 | 38500 | nan | 1.0 | 1.0 |
| 0.0 | 23.6220 | 39000 | nan | 1.0 | 1.0 |
| 0.0 | 23.9249 | 39500 | nan | 1.0 | 1.0 |
| 0.0 | 24.2277 | 40000 | nan | 1.0 | 1.0 |
| 0.0 | 24.5306 | 40500 | nan | 1.0 | 1.0 |
| 0.0 | 24.8334 | 41000 | nan | 1.0 | 1.0 |
| 0.0 | 25.1363 | 41500 | nan | 1.0 | 1.0 |
| 0.0 | 25.4391 | 42000 | nan | 1.0 | 1.0 |
| 0.0 | 25.7420 | 42500 | nan | 1.0 | 1.0 |
| 0.0 | 26.0448 | 43000 | nan | 1.0 | 1.0 |
| 0.0 | 26.3477 | 43500 | nan | 1.0 | 1.0 |
| 0.0 | 26.6505 | 44000 | nan | 1.0 | 1.0 |
| 0.0 | 26.9534 | 44500 | nan | 1.0 | 1.0 |
| 0.0 | 27.2562 | 45000 | nan | 1.0 | 1.0 |
| 0.0 | 27.5591 | 45500 | nan | 1.0 | 1.0 |
| 0.0 | 27.8619 | 46000 | nan | 1.0 | 1.0 |
| 0.0 | 28.1647 | 46500 | nan | 1.0 | 1.0 |
| 0.0 | 28.4676 | 47000 | nan | 1.0 | 1.0 |
| 0.0 | 28.7704 | 47500 | nan | 1.0 | 1.0 |
| 0.0 | 29.0733 | 48000 | nan | 1.0 | 1.0 |
| 0.0 | 29.3761 | 48500 | nan | 1.0 | 1.0 |
| 0.0 | 29.6790 | 49000 | nan | 1.0 | 1.0 |
| 0.0 | 29.9818 | 49500 | nan | 1.0 | 1.0 |
| 0.0 | 30.2847 | 50000 | nan | 1.0 | 1.0 |
| 0.0 | 30.5875 | 50500 | nan | 1.0 | 1.0 |
| 0.0 | 30.8904 | 51000 | nan | 1.0 | 1.0 |
| 0.0 | 31.1932 | 51500 | nan | 1.0 | 1.0 |
| 0.0 | 31.4961 | 52000 | nan | 1.0 | 1.0 |
| 0.0 | 31.7989 | 52500 | nan | 1.0 | 1.0 |
| 0.0 | 32.1018 | 53000 | nan | 1.0 | 1.0 |
| 0.0 | 32.4046 | 53500 | nan | 1.0 | 1.0 |
| 0.0 | 32.7075 | 54000 | nan | 1.0 | 1.0 |
| 0.0 | 33.0103 | 54500 | nan | 1.0 | 1.0 |
| 0.0 | 33.3131 | 55000 | nan | 1.0 | 1.0 |
| 0.0 | 33.6160 | 55500 | nan | 1.0 | 1.0 |
| 0.0 | 33.9188 | 56000 | nan | 1.0 | 1.0 |
| 0.0 | 34.2217 | 56500 | nan | 1.0 | 1.0 |
| 0.0 | 34.5245 | 57000 | nan | 1.0 | 1.0 |
| 0.0 | 34.8274 | 57500 | nan | 1.0 | 1.0 |
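
The log shows the run failing to converge: the validation loss plateaus at 4.1374 from step 7000, becomes NaN at step 28500 (epoch ~17.3), and the training loss collapses to 0.0 for the remainder of the run. For wav2vec2-style CTC fine-tuning, one common guard against NaN/inf CTC losses is the `ctc_zero_infinity` config flag; a minimal sketch (the checkpoint path is a placeholder, the actual base model for this run is not stated):

```python
# Sketch: load a wav2vec2 CTC model with infinite CTC losses zeroed out,
# a common mitigation for NaN training losses. The checkpoint path below
# is a placeholder assumption.
from transformers import Wav2Vec2ForCTC

model = Wav2Vec2ForCTC.from_pretrained(
    "path/to/base-checkpoint",  # placeholder
    ctc_zero_infinity=True,     # replace inf CTC losses with 0 instead of propagating NaN
)
```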

Framework versions

  • Transformers 4.43.1
  • PyTorch 2.4.0
  • Datasets 2.20.0
  • Tokenizers 0.19.1
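
A minimal sketch for checking that a local environment matches these pinned versions (assuming the standard PyPI distributions of the listed libraries):

```python
# Sketch: assert the environment matches the versions listed above.
import transformers, torch, datasets, tokenizers

assert transformers.__version__ == "4.43.1"
assert torch.__version__.startswith("2.4.0")  # may carry a local suffix, e.g. "+cu121"
assert datasets.__version__ == "2.20.0"
assert tokenizers.__version__ == "0.19.1"
```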