w2v-bert-2.0-naijavoices-hausa-v0.0

This model is a fine-tuned version of facebook/w2v-bert-2.0 on the NaijaVoices Hausa dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):

  • Loss: 0.5015
  • WER: 0.0977
  • CER: 0.0316
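
For quick experimentation, the sketch below loads the checkpoint through Transformers and greedily decodes one file. It assumes the repository ships a CTC head and a processor config, as is typical for w2v-bert-2.0 fine-tunes; the audio filename is a placeholder.

```python
# Minimal inference sketch (assumptions: the checkpoint exposes a CTC head and
# an AutoProcessor config; "hausa_sample.wav" is a hypothetical input file).
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "asr-africa/w2v-bert-2.0-naijavoices-hausa-v0.0"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# w2v-bert-2.0 expects 16 kHz mono audio.
speech, _ = librosa.load("hausa_sample.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary, then the tokenizer
# collapses repeats and blanks during batch_decode.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```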

Model description

A fine-tune of the facebook/w2v-bert-2.0 speech encoder (roughly 606M parameters, stored as F32 safetensors) for Hausa automatic speech recognition, evaluated with word error rate (WER) and character error rate (CER).

Intended uses & limitations

Intended for transcribing Hausa speech; the base encoder expects 16 kHz mono audio. Behavior on accents, domains, or recording conditions outside the evaluation set has not been characterized.

Training and evaluation data

Trained and evaluated on Hausa speech from the NaijaVoices dataset; exact splits and preprocessing are not documented here.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 9e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.025
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
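
For reproducibility, these values map onto Transformers TrainingArguments roughly as sketched below. Only the listed values come from this card; the output directory and the single-GPU assumption behind the 128 effective batch size are guesses.

```python
# Hedged reconstruction of the TrainingArguments implied by the list above.
# output_dir is hypothetical, and fp16=True is one plausible reading of
# "Native AMP" (bf16 would be the other).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-naijavoices-hausa-v0.0",  # hypothetical path
    learning_rate=9e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,  # 64 x 2 = 128 effective train batch size
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.025,
    num_train_epochs=100.0,
    fp16=True,
)
```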

Training results

| Training Loss | Epoch | Step   | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 0.4727        | 1.0   | 4339   | 0.2880          | 0.2519 | 0.0646 |
| 0.2148        | 2.0   | 8678   | 0.2837          | 0.2513 | 0.0656 |
| 0.214         | 3.0   | 13017  | 0.2773          | 0.2588 | 0.0676 |
| 0.2014        | 4.0   | 17356  | 0.2588          | 0.2358 | 0.0612 |
| 0.1919        | 5.0   | 21695  | 0.2652          | 0.2419 | 0.0637 |
| 0.185         | 6.0   | 26034  | 0.2586          | 0.2398 | 0.0621 |
| 0.1771        | 7.0   | 30373  | 0.2595          | 0.2345 | 0.0605 |
| 0.1731        | 8.0   | 34712  | 0.2442          | 0.2298 | 0.0597 |
| 0.1673        | 9.0   | 39051  | 0.2487          | 0.2319 | 0.0599 |
| 0.1617        | 10.0  | 43390  | 0.2402          | 0.2241 | 0.0581 |
| 0.1553        | 11.0  | 47729  | 0.2371          | 0.2236 | 0.0577 |
| 0.1488        | 12.0  | 52068  | 0.2364          | 0.2179 | 0.0566 |
| 0.1426        | 13.0  | 56407  | 0.2456          | 0.2166 | 0.0566 |
| 0.1357        | 14.0  | 60746  | 0.2399          | 0.2144 | 0.0564 |
| 0.1284        | 15.0  | 65085  | 0.2326          | 0.2046 | 0.0537 |
| 0.1203        | 16.0  | 69424  | 0.2305          | 0.2021 | 0.0533 |
| 0.114         | 17.0  | 73763  | 0.2321          | 0.1928 | 0.0509 |
| 0.1063        | 18.0  | 78102  | 0.2246          | 0.1918 | 0.0512 |
| 0.1005        | 19.0  | 82441  | 0.2280          | 0.1840 | 0.0492 |
| 0.0939        | 20.0  | 86780  | 0.2217          | 0.1765 | 0.0479 |
| 0.0872        | 21.0  | 91119  | 0.2240          | 0.1739 | 0.0475 |
| 0.0824        | 22.0  | 95458  | 0.2369          | 0.1696 | 0.0467 |
| 0.0781        | 23.0  | 99797  | 0.2266          | 0.1627 | 0.0449 |
| 0.0713        | 24.0  | 104136 | 0.2203          | 0.1579 | 0.0442 |
| 0.0655        | 25.0  | 108475 | 0.2384          | 0.1597 | 0.0451 |
| 0.0616        | 26.0  | 112814 | 0.2373          | 0.1486 | 0.0422 |
| 0.0581        | 27.0  | 117153 | 0.2577          | 0.1481 | 0.0419 |
| 0.0529        | 28.0  | 121492 | 0.2549          | 0.1470 | 0.0423 |
| 0.0508        | 29.0  | 125831 | 0.2395          | 0.1451 | 0.0417 |
| 0.0462        | 30.0  | 130170 | 0.2447          | 0.1396 | 0.0405 |
| 0.0426        | 31.0  | 134509 | 0.2511          | 0.1355 | 0.0397 |
| 0.0399        | 32.0  | 138848 | 0.2582          | 0.1379 | 0.0406 |
| 0.0373        | 33.0  | 143187 | 0.2499          | 0.1374 | 0.0402 |
| 0.0348        | 34.0  | 147526 | 0.2643          | 0.1327 | 0.0393 |
| 0.0324        | 35.0  | 151865 | 0.2710          | 0.1320 | 0.0390 |
| 0.0317        | 36.0  | 156204 | 0.2673          | 0.1337 | 0.0395 |
| 0.0291        | 37.0  | 160543 | 0.2943          | 0.1263 | 0.0377 |
| 0.0269        | 38.0  | 164882 | 0.2997          | 0.1259 | 0.0380 |
| 0.0253        | 39.0  | 169221 | 0.2998          | 0.1217 | 0.0369 |
| 0.0239        | 40.0  | 173560 | 0.2799          | 0.1267 | 0.0382 |
| 0.0229        | 41.0  | 177899 | 0.2898          | 0.1203 | 0.0364 |
| 0.0211        | 42.0  | 182238 | 0.3049          | 0.1203 | 0.0365 |
| 0.0201        | 43.0  | 186577 | 0.2963          | 0.1210 | 0.0369 |
| 0.019         | 44.0  | 190916 | 0.3006          | 0.1210 | 0.0369 |
| 0.0181        | 45.0  | 195255 | 0.2990          | 0.1212 | 0.0368 |
| 0.0171        | 46.0  | 199594 | 0.3176          | 0.1180 | 0.0363 |
| 0.0153        | 47.0  | 203933 | 0.3190          | 0.1173 | 0.0361 |
| 0.015         | 48.0  | 208272 | 0.3325          | 0.1163 | 0.0356 |
| 0.0144        | 49.0  | 212611 | 0.3454          | 0.1148 | 0.0356 |
| 0.0138        | 50.0  | 216950 | 0.3399          | 0.1114 | 0.0345 |
| 0.0128        | 51.0  | 221289 | 0.3601          | 0.1115 | 0.0347 |
| 0.012         | 52.0  | 225628 | 0.3493          | 0.1115 | 0.0347 |
| 0.0114        | 53.0  | 229967 | 0.3350          | 0.1142 | 0.0355 |
| 0.0109        | 54.0  | 234306 | 0.3564          | 0.1103 | 0.0343 |
| 0.0101        | 55.0  | 238645 | 0.3654          | 0.1103 | 0.0343 |
| 0.0095        | 56.0  | 242984 | 0.3583          | 0.1080 | 0.0339 |
| 0.0091        | 57.0  | 247323 | 0.3467          | 0.1119 | 0.0347 |
| 0.0084        | 58.0  | 251662 | 0.3738          | 0.1089 | 0.0342 |
| 0.0082        | 59.0  | 256001 | 0.3751          | 0.1082 | 0.0341 |
| 0.0078        | 60.0  | 260340 | 0.3638          | 0.1085 | 0.0341 |
| 0.0072        | 61.0  | 264679 | 0.3883          | 0.1073 | 0.0336 |
| 0.0068        | 62.0  | 269018 | 0.3815          | 0.1073 | 0.0338 |
| 0.0065        | 63.0  | 273357 | 0.3882          | 0.1080 | 0.0340 |
| 0.0061        | 64.0  | 277696 | 0.3902          | 0.1067 | 0.0335 |
| 0.0061        | 65.0  | 282035 | 0.3948          | 0.1044 | 0.0331 |
| 0.0054        | 66.0  | 286374 | 0.3917          | 0.1064 | 0.0335 |
| 0.0053        | 67.0  | 290713 | 0.4028          | 0.1046 | 0.0331 |
| 0.0049        | 68.0  | 295052 | 0.4127          | 0.1027 | 0.0325 |
| 0.0046        | 69.0  | 299391 | 0.4085          | 0.1064 | 0.0335 |
| 0.0047        | 70.0  | 303730 | 0.4076          | 0.1030 | 0.0328 |
| 0.0043        | 71.0  | 308069 | 0.4098          | 0.1033 | 0.0327 |
| 0.0039        | 72.0  | 312408 | 0.4299          | 0.1017 | 0.0325 |
| 0.0035        | 73.0  | 316747 | 0.4302          | 0.1037 | 0.0328 |
| 0.0033        | 74.0  | 321086 | 0.4271          | 0.1038 | 0.0329 |
| 0.0031        | 75.0  | 325425 | 0.4347          | 0.1022 | 0.0326 |
| 0.003         | 76.0  | 329764 | 0.4419          | 0.0995 | 0.0319 |
| 0.0029        | 77.0  | 334103 | 0.4482          | 0.1004 | 0.0322 |
| 0.0025        | 78.0  | 338442 | 0.4601          | 0.0992 | 0.0318 |
| 0.0026        | 79.0  | 342781 | 0.4657          | 0.0986 | 0.0316 |
| 0.0022        | 80.0  | 347120 | 0.4636          | 0.0993 | 0.0318 |
| 0.002         | 81.0  | 351459 | 0.4621          | 0.0992 | 0.0317 |
| 0.0019        | 82.0  | 355798 | 0.4720          | 0.0990 | 0.0318 |
| 0.0017        | 83.0  | 360137 | 0.4917          | 0.0983 | 0.0316 |
| 0.0016        | 84.0  | 364476 | 0.4931          | 0.0974 | 0.0315 |
| 0.0014        | 85.0  | 368815 | 0.4973          | 0.0990 | 0.0319 |
| 0.0013        | 86.0  | 373154 | 0.5015          | 0.0977 | 0.0316 |
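
The table stops at epoch 86 of the configured 100, and validation loss rises after roughly epoch 20 even as WER and CER continue to fall. The WER and CER columns are the standard word and character error rates; a sketch of how they are typically computed with the evaluate library follows (toy strings, not this model's outputs).

```python
# Sketch of computing WER/CER with the evaluate library (toy strings only;
# these are not outputs of this model).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["ina kwana", "na gode sosai"]
predictions = ["ina kwana", "na gode sosa"]

print(wer_metric.compute(predictions=predictions, references=references))
print(cer_metric.compute(predictions=predictions, references=references))
```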

Framework versions

  • Transformers 4.51.3
  • PyTorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1