
Wav2Vec2-BERT Hausa - Alvin Nahabwe

This model is a fine-tuned version of facebook/w2v-bert-2.0 on the NaijaVoices dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2522
  • WER: 0.0818
  • CER: 0.0247
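The card does not specify which library computed these metrics, but WER and CER are conventionally defined as Levenshtein edit distance (word-level and character-level, respectively) normalized by reference length. A minimal pure-Python sketch; the Hausa phrase is purely illustrative:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (single-row DP)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # min(deletion, insertion, substitution-or-match)
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[len(hyp)]

def wer(reference, hypothesis):
    """Word Error Rate: word-level edits / number of reference words."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character Error Rate: character-level edits / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)

# One substituted word out of three reference words -> WER of 1/3.
print(wer("ina son shinkafa", "ina so shinkafa"))
```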

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 9e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • distributed_type: multi-GPU
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.025
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
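The effective batch size follows from the per-device batch size and gradient accumulation, and the warmup length from the ratio. A quick arithmetic sanity check; the steps-per-epoch figure (~4752) is read off the results table, and the GPU count is not stated, so treat the warmup estimate as approximate:

```python
train_batch_size = 64               # per-device batch size from the card
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

# Warmup length implied by lr_scheduler_warmup_ratio = 0.025:
steps_per_epoch = 4752              # assumption: read off the results table
num_epochs = 100
warmup_steps = round(0.025 * steps_per_epoch * num_epochs)
print(total_train_batch_size, warmup_steps)  # 128, 11880
```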

Training results

| Training Loss | Epoch   | Step   | Validation Loss | WER    | CER    |
|:-------------:|:-------:|:------:|:---------------:|:------:|:------:|
| 0.277         | 0.9999  | 4752   | 0.2384          | 0.2385 | 0.0590 |
| 0.2236        | 2.0     | 9505   | 0.2280          | 0.2329 | 0.0572 |
| 0.212         | 2.9999  | 14257  | 0.2135          | 0.2233 | 0.0546 |
| 0.2014        | 4.0     | 19010  | 0.2108          | 0.2255 | 0.0553 |
| 0.1938        | 4.9999  | 23762  | 0.1977          | 0.2137 | 0.0523 |
| 0.1856        | 6.0     | 28515  | 0.1980          | 0.2125 | 0.0520 |
| 0.1796        | 6.9999  | 33267  | 0.1918          | 0.2066 | 0.0499 |
| 0.1723        | 8.0     | 38020  | 0.1899          | 0.2038 | 0.0498 |
| 0.1661        | 8.9999  | 42772  | 0.1897          | 0.2089 | 0.0508 |
| 0.16          | 10.0    | 47525  | 0.1818          | 0.1956 | 0.0470 |
| 0.1541        | 10.9999 | 52277  | 0.1768          | 0.1921 | 0.0460 |
| 0.1474        | 12.0    | 57030  | 0.1811          | 0.1896 | 0.0460 |
| 0.1376        | 12.9999 | 61782  | 0.1693          | 0.1812 | 0.0437 |
| 0.1312        | 14.0    | 66535  | 0.1710          | 0.1764 | 0.0431 |
| 0.1249        | 14.9999 | 71287  | 0.1673          | 0.1686 | 0.0417 |
| 0.1175        | 16.0    | 76040  | 0.1626          | 0.1641 | 0.0403 |
| 0.1103        | 16.9999 | 80792  | 0.1557          | 0.1561 | 0.0385 |
| 0.1028        | 18.0    | 85545  | 0.1594          | 0.1481 | 0.0372 |
| 0.0958        | 18.9999 | 90297  | 0.1557          | 0.1445 | 0.0367 |
| 0.0901        | 20.0    | 95050  | 0.1576          | 0.1417 | 0.0365 |
| 0.0845        | 20.9999 | 99802  | 0.1554          | 0.1354 | 0.0352 |
| 0.0779        | 22.0    | 104555 | 0.1572          | 0.1365 | 0.0355 |
| 0.0737        | 22.9999 | 109307 | 0.1570          | 0.1317 | 0.0350 |
| 0.0686        | 24.0    | 114060 | 0.1607          | 0.1234 | 0.0332 |
| 0.0633        | 24.9999 | 118812 | 0.1584          | 0.1247 | 0.0336 |
| 0.06          | 26.0    | 123565 | 0.1511          | 0.1156 | 0.0314 |
| 0.0548        | 26.9999 | 128317 | 0.1594          | 0.1102 | 0.0302 |
| 0.0509        | 28.0    | 133070 | 0.1673          | 0.1090 | 0.0301 |
| 0.0475        | 28.9999 | 137822 | 0.1632          | 0.1084 | 0.0300 |
| 0.0454        | 30.0    | 142575 | 0.1654          | 0.1088 | 0.0305 |
| 0.0419        | 30.9999 | 147327 | 0.1614          | 0.1102 | 0.0309 |
| 0.0397        | 32.0    | 152080 | 0.1692          | 0.1030 | 0.0291 |
| 0.0371        | 32.9999 | 156832 | 0.1657          | 0.1034 | 0.0293 |
| 0.0345        | 34.0    | 161585 | 0.1693          | 0.0995 | 0.0282 |
| 0.0324        | 34.9999 | 166337 | 0.1755          | 0.0997 | 0.0284 |
| 0.0302        | 36.0    | 171090 | 0.1845          | 0.0988 | 0.0283 |
| 0.0294        | 36.9999 | 175842 | 0.1811          | 0.0962 | 0.0277 |
| 0.0273        | 38.0    | 180595 | 0.1822          | 0.0959 | 0.0278 |
| 0.0259        | 38.9999 | 185347 | 0.1864          | 0.0967 | 0.0281 |
| 0.0244        | 40.0    | 190100 | 0.1910          | 0.0953 | 0.0278 |
| 0.0228        | 40.9999 | 194852 | 0.1797          | 0.0959 | 0.0280 |
| 0.0221        | 42.0    | 199605 | 0.1823          | 0.0946 | 0.0276 |
| 0.0202        | 42.9999 | 204357 | 0.1931          | 0.0908 | 0.0265 |
| 0.0194        | 44.0    | 209110 | 0.2000          | 0.0916 | 0.0268 |
| 0.0179        | 44.9999 | 213862 | 0.1907          | 0.0925 | 0.0270 |
| 0.0173        | 46.0    | 218615 | 0.1866          | 0.0954 | 0.0280 |
| 0.0164        | 46.9999 | 223367 | 0.2048          | 0.0965 | 0.0290 |
| 0.0154        | 48.0    | 228120 | 0.2041          | 0.0887 | 0.0261 |
| 0.0149        | 48.9999 | 232872 | 0.2147          | 0.0876 | 0.0259 |
| 0.0138        | 50.0    | 237625 | 0.2003          | 0.0950 | 0.0277 |
| 0.0134        | 50.9999 | 242377 | 0.2165          | 0.0903 | 0.0269 |
| 0.0127        | 52.0    | 247130 | 0.2093          | 0.0911 | 0.0268 |
| 0.0122        | 52.9999 | 251882 | 0.2130          | 0.0868 | 0.0260 |
| 0.0114        | 54.0    | 256635 | 0.2100          | 0.0917 | 0.0276 |
| 0.0106        | 54.9999 | 261387 | 0.2189          | 0.0913 | 0.0276 |
| 0.0102        | 56.0    | 266140 | 0.2227          | 0.0877 | 0.0264 |
| 0.0098        | 56.9999 | 270892 | 0.2305          | 0.0873 | 0.0266 |
| 0.0091        | 58.0    | 275645 | 0.2332          | 0.0813 | 0.0246 |
| 0.0086        | 58.9999 | 280397 | 0.2289          | 0.0843 | 0.0256 |
| 0.0082        | 60.0    | 285150 | 0.2365          | 0.0823 | 0.0249 |
| 0.008         | 60.9999 | 289902 | 0.2464          | 0.0851 | 0.0258 |
| 0.0075        | 62.0    | 294655 | 0.2314          | 0.0848 | 0.0259 |
| 0.0069        | 62.9999 | 299407 | 0.2434          | 0.0827 | 0.0249 |
| 0.0068        | 64.0    | 304160 | 0.2362          | 0.0818 | 0.0250 |
| 0.0062        | 64.9999 | 308912 | 0.2337          | 0.0821 | 0.0251 |
| 0.0058        | 66.0    | 313665 | 0.2555          | 0.0836 | 0.0254 |
| 0.0055        | 66.9999 | 318417 | 0.2501          | 0.0836 | 0.0256 |
| 0.0054        | 68.0    | 323170 | 0.2522          | 0.0818 | 0.0247 |
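At inference time, transcripts are produced from the model's frame-level logits by greedy CTC decoding (assuming the standard CTC head used when fine-tuning w2v-bert-2.0 for ASR): take the argmax label per frame, collapse consecutive repeats, and drop blanks. A minimal sketch with a toy vocabulary; the real vocabulary and blank id come from the model's tokenizer:

```python
def ctc_greedy_decode(frame_ids, vocab, blank_id=0):
    """Collapse repeated per-frame labels, then drop CTC blanks."""
    out = []
    prev = None
    for i in frame_ids:
        if i != prev and i != blank_id:
            out.append(vocab[i])
        prev = i
    return "".join(out)

# Toy example: per-frame argmax ids, blank_id = 0.
vocab = {1: "h", 2: "a", 3: "i"}
frames = [1, 1, 0, 2, 2, 2, 0, 0, 3, 3]
print(ctc_greedy_decode(frames, vocab))  # -> "hai"
```

Note that a blank between two identical labels separates them, which is how CTC represents doubled characters.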

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1