PyLaia Training Metrics

Training Summary

  • Total epochs: 76 (epochs 0–75)
  • Best validation CER (character error rate): 0.005937 (epoch 65)
  • Best validation WER (word error rate): 0.016457 (epoch 62)
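CER and WER are edit-distance ratios: the number of character-level (or word-level) insertions, deletions, and substitutions needed to turn the model output into the reference transcription, divided by the reference length. The sketch below shows the conventional computation; it is illustrative only (PyLaia computes these metrics internally during training and validation), and the example strings are made up.

```python
# Illustrative CER/WER computation (not PyLaia's internal code).

def levenshtein(ref, hyp):
    """Edit distance (insertions, deletions, substitutions) between two sequences."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            curr.append(min(
                prev[j] + 1,              # deletion
                curr[j - 1] + 1,          # insertion
                prev[j - 1] + (r != h),   # substitution (cost 0 if equal)
            ))
        prev = curr
    return prev[-1]

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character edit distance / reference length."""
    return levenshtein(list(reference), list(hypothesis)) / max(len(reference), 1)

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word edit distance / number of reference words."""
    ref_words, hyp_words = reference.split(), hypothesis.split()
    return levenshtein(ref_words, hyp_words) / max(len(ref_words), 1)

if __name__ == "__main__":
    # Hypothetical example: one wrong character in a two-word Hebrew line.
    reference = "שלום עולם"
    hypothesis = "שלום עולט"
    print(f"CER = {cer(reference, hypothesis):.4f}")   # 1 error / 9 chars ≈ 0.1111
    print(f"WER = {wer(reference, hypothesis):.4f}")   # 1 wrong word / 2 words = 0.5000
```

At this scale, the best validation CER of 0.005937 corresponds to roughly 6 character errors per 1,000 reference characters.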

Full Metrics Table

Column key: tr_* = training-set metrics, va_* = validation-set metrics; CER/WER are character/word error rates. train_loss_epoch is the epoch-averaged training loss, while train_loss_step is a single logged step (batch) loss, which accounts for its high variance.

epoch tr_cer tr_wer train_loss_epoch train_loss_step va_cer va_wer val_loss
0 0.129245 0.31886 39.4692 12.3066 0.016895 0.052086 5.70541
1 0.027248 0.109056 8.92865 11.5946 0.012966 0.043093 4.50416
2 0.02189 0.086051 7.20965 2.85532 0.0108 0.029745 3.50374
3 0.01887 0.074292 6.18997 7.53472 0.009848 0.028535 3.24674
4 0.017496 0.067933 5.74712 2.76993 0.01269 0.031484 4.38984
5 0.016812 0.063935 5.5235 4.21415 0.00976 0.0278 3.24001
6 0.015686 0.060124 5.10744 1.70046 0.008806 0.02412 2.912
7 0.01477 0.056549 4.81694 0.78812 0.008375 0.022697 2.72714
8 0.014652 0.055727 4.73216 1.51364 0.009163 0.02446 2.85291
9 0.013993 0.053153 4.51401 8.49484 0.008506 0.022889 2.64098
10 0.013825 0.051805 4.46963 17.547 0.009276 0.024612 2.9348
11 0.013776 0.050979 4.4778 1.46055 0.007802 0.02094 2.4691
12 0.013263 0.04988 4.29078 12.1256 0.008381 0.021463 2.56431
13 0.012958 0.048507 4.16489 0.517483 0.009516 0.025627 2.99656
14 0.012718 0.047834 4.08675 15.1593 0.007643 0.020674 2.38766
15 0.012782 0.047346 4.14667 11.5345 0.008414 0.022718 2.53537
16 0.013121 0.04756 4.28663 0.704873 0.008986 0.022636 2.85818
17 0.012989 0.047326 4.20438 0.226486 0.007613 0.019826 2.36213
18 0.012409 0.045543 3.986 8.26527 0.008039 0.021331 2.44117
19 0.012467 0.045933 4.01241 1.33168 0.007981 0.020712 2.5643
20 0.012641 0.046513 4.09855 1.48213 0.008311 0.021949 2.48846
21 0.012292 0.045097 3.96557 1.26379 0.007863 0.020584 2.38992
22 0.012018 0.044462 3.863 23.7959 0.008187 0.020854 2.48543
23 0.011232 0.040438 3.53586 0.883133 0.007534 0.020268 2.27983
24 0.010547 0.037563 3.28549 0.167232 0.007306 0.020027 2.25233
25 0.010414 0.037 3.27071 0.371634 0.007236 0.020221 2.22727
26 0.010176 0.036015 3.17962 1.32554 0.007038 0.019737 2.18342
27 0.010005 0.035514 3.13593 1.54786 0.006995 0.019352 2.15701
28 0.009769 0.0347 3.03312 3.60537 0.006651 0.018249 2.08475
29 0.009681 0.034254 3.00317 1.65505 0.006729 0.018778 2.072
30 0.009502 0.033844 2.94229 0.635581 0.00678 0.018843 2.10965
31 0.009577 0.033859 2.94446 0.213813 0.006994 0.01945 2.14548
32 0.00963 0.033855 2.95764 0.406756 0.006863 0.01937 2.09923
33 0.009445 0.033263 2.95371 0.665865 0.006856 0.018825 2.08154
34 0.009255 0.032772 2.86192 0.996484 0.006557 0.018038 2.00478
35 0.009224 0.032645 2.8533 9.40311 0.006938 0.019332 2.10517
36 0.009375 0.03285 2.89019 2.9548 0.006606 0.018241 2.02943
37 0.009225 0.032457 2.84777 12.7469 0.007089 0.019089 2.14697
38 0.009138 0.031929 2.8019 0.16169 0.006444 0.017973 1.96117
39 0.009054 0.031969 2.79038 14.2437 0.006723 0.018281 1.99717
40 0.009104 0.031877 2.79368 8.42458 0.006686 0.018588 2.0263
41 0.009042 0.031731 2.79296 0.014788 0.006775 0.019004 2.04582
42 0.008958 0.031392 2.75503 0.895635 0.006447 0.018211 1.9721
43 0.008843 0.031237 2.73262 5.40016 0.006335 0.017754 1.94061
44 0.0089 0.031457 2.73986 0.01812 0.00648 0.017889 1.96678
45 0.008895 0.031266 2.7319 0.235641 0.006433 0.017983 1.95497
46 0.008668 0.030706 2.68418 3.64022 0.006313 0.01754 1.91343
47 0.008705 0.030696 2.68547 9.00642 0.006597 0.018216 1.96827
48 0.008727 0.030832 2.68516 0.309562 0.006333 0.017886 1.94401
49 0.008538 0.030283 2.64099 0.014653 0.006258 0.017683 1.92426
50 0.008553 0.030119 2.62904 0.243859 0.006267 0.01731 1.91222
51 0.00859 0.030244 2.64838 0.788468 0.006256 0.017584 1.9147
52 0.008542 0.030204 2.61293 1.36889 0.006164 0.017457 1.90145
53 0.008511 0.030127 2.60048 3.81542 0.006275 0.017553 1.90049
54 0.008507 0.030023 2.58761 0.250174 0.006136 0.017044 1.87604
55 0.008361 0.029653 2.57442 0.036428 0.006122 0.017115 1.86923
56 0.008348 0.029541 2.57007 0.167346 0.006051 0.017036 1.84211
57 0.00835 0.029681 2.60374 0.025574 0.005967 0.01684 1.83604
58 0.008356 0.029606 2.56192 0.150598 0.006249 0.017518 1.89512
59 0.008439 0.029808 2.57937 0.088004 0.006153 0.017248 1.89009
60 0.008303 0.029524 2.55604 11.6615 0.006058 0.017358 1.86274
61 0.008235 0.029226 2.53185 0.484483 0.006156 0.017041 1.90813
62 0.008354 0.029538 2.56148 9.77606 0.005976 0.016457 1.8516
63 0.008138 0.028867 2.49159 0.092906 0.006003 0.016879 1.84359
64 0.008134 0.028773 2.47939 11.7722 0.005976 0.016716 1.83542
65 0.008106 0.028679 2.48362 8.10345 0.005937 0.016689 1.82221
66 0.008114 0.028817 2.46834 0.496928 0.005938 0.016746 1.82858
67 0.008079 0.02861 2.47971 2.91834 0.00596 0.016791 1.82926
68 0.00809 0.028662 2.47953 4.82707 0.005963 0.016743 1.82742
69 0.008074 0.028644 2.47506 1.08142 0.005979 0.016915 1.82871
70 0.008035 0.028538 2.45257 0.458501 0.005944 0.016721 1.83211
71 0.00808 0.028751 2.46476 0.604646 0.005949 0.016764 1.83054
72 0.008029 0.028533 2.45826 9.35829 0.005937 0.016716 1.82579
73 0.008038 0.028501 2.44883 1.12761 0.005954 0.016793 1.82871
74 0.008088 0.028739 2.46324 0.010121 0.005958 0.016824 1.82953
75 0.008065 0.028638 2.45915 0.812815 0.005949 0.016781 1.82817
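The table is essentially a per-epoch dump of the training log, one row per epoch. A minimal sketch for re-deriving the summary figures from such a log is below; the file name metrics.csv and the assumption that it carries the same column names as the table header are assumptions for illustration, not a file shipped in this repository.

```python
# Sketch: recompute the best-epoch figures from a CSV export of the metrics
# above. Assumes a file "metrics.csv" whose columns match the table header
# (epoch, va_cer, va_wer, ...); adapt the path and column names to your log.
import pandas as pd

df = pd.read_csv("metrics.csv")

# One row per epoch, keeping only rows where validation metrics were logged.
val = df.dropna(subset=["va_cer", "va_wer"]).groupby("epoch").last()

best_cer_epoch = val["va_cer"].idxmin()
best_wer_epoch = val["va_wer"].idxmin()
print(f"Best validation CER: {val.loc[best_cer_epoch, 'va_cer']:.6f} (epoch {best_cer_epoch})")
print(f"Best validation WER: {val.loc[best_wer_epoch, 'va_wer']:.6f} (epoch {best_wer_epoch})")
```

Run against the table above, this should reproduce the summary figures: CER 0.005937 at epoch 65 (idxmin returns the first of the tied epochs 65 and 72) and WER 0.016457 at epoch 62.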