
xlsr-a-nomi-ag

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3211
  • WER: 0.3369
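
For orientation, below is a minimal inference sketch. It assumes this checkpoint follows the standard Wav2Vec2ForCTC layout used by fine-tunes of facebook/wav2vec2-large-xlsr-53 and that a processor is bundled with the repository; neither is confirmed by the card.

```python
# Minimal inference sketch -- assumes this checkpoint uses the standard
# Wav2Vec2ForCTC head and ships a processor, as typical XLSR fine-tunes do.
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "susmitabhatt/xlsr-a-nomi-ag"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

def transcribe(speech, sampling_rate=16_000):
    """Transcribe a 1-D float waveform sampled at 16 kHz (XLSR's expected rate)."""
    inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```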

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough configuration sketch follows the list):

  • learning_rate: 0.0004
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 132
  • num_epochs: 50
  • mixed_precision_training: Native AMP
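
The hyperparameters above map roughly onto the TrainingArguments sketch below. This is a reconstruction for illustration only, not the author's training script; the output directory, evaluation cadence, and anything not listed above are assumptions.

```python
# Rough reconstruction of the listed hyperparameters as TrainingArguments.
# output_dir, eval cadence, and anything not listed above are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-a-nomi-ag",
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size of 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=50,
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-8 (defaults)
    fp16=True,                       # Native AMP mixed precision
    seed=42,
    eval_strategy="steps",
    eval_steps=200,                  # matches the 200-step cadence in the results table below
)
```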

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 4.5146        | 2.2727  | 200  | 2.5284          | 1.0    |
| 1.4558        | 4.5455  | 400  | 0.4417          | 0.6139 |
| 0.253         | 6.8182  | 600  | 0.2623          | 0.4254 |
| 0.1064        | 9.0909  | 800  | 0.3151          | 0.4013 |
| 0.0715        | 11.3636 | 1000 | 0.2867          | 0.3744 |
| 0.0575        | 13.6364 | 1200 | 0.3176          | 0.3584 |
| 0.0445        | 15.9091 | 1400 | 0.3323          | 0.3432 |
| 0.029         | 18.1818 | 1600 | 0.3549          | 0.3467 |
| 0.0291        | 20.4545 | 1800 | 0.2824          | 0.3458 |
| 0.0196        | 22.7273 | 2000 | 0.3145          | 0.3414 |
| 0.0205        | 25.0    | 2200 | 0.3013          | 0.3396 |
| 0.0168        | 27.2727 | 2400 | 0.3207          | 0.3387 |
| 0.0136        | 29.5455 | 2600 | 0.3306          | 0.3333 |
| 0.0109        | 31.8182 | 2800 | 0.3197          | 0.3324 |
| 0.0077        | 34.0909 | 3000 | 0.3511          | 0.3378 |
| 0.0082        | 36.3636 | 3200 | 0.3408          | 0.3342 |
| 0.0088        | 38.6364 | 3400 | 0.3354          | 0.3369 |
| 0.0066        | 40.9091 | 3600 | 0.3246          | 0.3360 |
| 0.0035        | 43.1818 | 3800 | 0.3295          | 0.3369 |
| 0.0044        | 45.4545 | 4000 | 0.3220          | 0.3369 |
| 0.0049        | 47.7273 | 4200 | 0.3196          | 0.3378 |
| 0.0029        | 50.0    | 4400 | 0.3211          | 0.3369 |
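
The WER column is the word error rate on the evaluation set. The card does not state which metric implementation was used; as a hedged illustration, WER is commonly computed with the evaluate library:

```python
# Illustrative WER computation with the `evaluate` library; the exact metric
# implementation used for this card is not stated, so treat this as a sketch.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["hello world"]          # hypothetical model transcriptions
references = ["hello there world"]     # hypothetical reference transcripts

print(wer_metric.compute(predictions=predictions, references=references))  # 1 deletion / 3 words ≈ 0.333
```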

Framework versions

  • Transformers 4.47.0.dev0
  • Pytorch 2.4.0
  • Datasets 3.0.1
  • Tokenizers 0.20.0