wav2vec2-large-xls-r-1b-malay-colab

This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2224
  • WER: 0.3983
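A minimal inference sketch with the Transformers ASR pipeline follows; the audio file name is a placeholder, and the pipeline call is an assumed typical usage rather than a setup documented in this card:

```python
# Minimal inference sketch (assumes `transformers` and `torch` are installed,
# plus ffmpeg for audio decoding; "sample.wav" is a placeholder file).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="imhakim1/wav2vec2-large-xls-r-1b-malay-colab",
)

# Wav2Vec2 models expect 16 kHz audio; the pipeline resamples automatically.
result = asr("sample.wav")
print(result["text"])
```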

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
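
For reproducibility, the list above maps onto Hugging Face TrainingArguments roughly as in the sketch below; the output directory is a placeholder and fp16=True is an assumed translation of "Native AMP":

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir is a placeholder; model and dataset setup are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-1b-malay-colab",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 32
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                       # assumed equivalent of "Native AMP"
)
```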

Training results

Training Loss | Epoch   | Step | Validation Loss | WER
------------- | ------- | ---- | --------------- | ------
3.0949        | 4.2553  | 400  | 1.2956          | 0.7778
1.2835        | 8.5106  | 800  | 1.0625          | 0.6145
0.8461        | 12.7660 | 1200 | 0.8611          | 0.4936
0.5795        | 17.0213 | 1600 | 0.9722          | 0.4649
0.3936        | 21.2766 | 2000 | 1.0780          | 0.4255
0.2668        | 25.5319 | 2400 | 1.1468          | 0.4036
0.1717        | 29.7872 | 2800 | 1.2224          | 0.3983
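
The WER column is the word error rate on the validation set. Below is a small sketch of how WER can be computed with the Hugging Face evaluate library; the two transcripts are made-up placeholders, not data from this run:

```python
# Word error rate (WER), as reported in the table above, computed with the
# `evaluate` library. The sentences below are illustrative placeholders.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["saya suka makan nasi"]
references = ["saya suka makan nasi lemak"]

# WER = (substitutions + insertions + deletions) / reference word count;
# here one deleted word out of five gives 0.2.
print(wer_metric.compute(predictions=predictions, references=references))
```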

Framework versions

  • Transformers 4.49.0.dev0
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.1.dev0
  • Tokenizers 0.21.0
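
Since .dev versions are typically installed from source rather than PyPI, exact matches may not be available; a quick sketch for checking a local environment against the versions above:

```python
# Print installed versions to compare against the ones listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```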