base_sami_22k_ftpseudo_ftlabelled_sami_parliament

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 220.5877
  • WER: 0.4061
  • CER: 0.1238
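For reference, WER and CER are edit-distance rates over words and characters respectively: (substitutions + deletions + insertions) divided by reference length. Below is a minimal sketch of how such scores are typically computed, using the jiwer library; this is an assumption, since the card does not say which tool produced the numbers above, and the example sentences are made up.

```python
# Hedged example: jiwer is one common choice for WER/CER, not necessarily
# the tool used for this card. The sentences below are invented.
import jiwer

reference = "a hypothetical reference transcript"
hypothesis = "a hypotetical reference transcript"  # one misspelled word

print(f"WER: {jiwer.wer(reference, hypothesis):.4f}")  # 1 substitution / 4 words = 0.25
print(f"CER: {jiwer.cer(reference, hypothesis):.4f}")  # character-level edit rate
```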

Model description

More information needed

Intended uses & limitations

More information needed
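The card gives no usage details, but the WER/CER metrics on Sámi parliament speech indicate an automatic-speech-recognition checkpoint. Below is a minimal inference sketch, assuming the model exposes the standard Transformers ASR pipeline interface; the repo id and audio path are placeholders, not confirmed values.

```python
# Hypothetical inference sketch -- the repo id and file path are placeholders.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="<org>/base_sami_22k_ftpseudo_ftlabelled_sami_parliament",  # placeholder repo id
)

result = asr("sami_parliament_clip.wav")  # illustrative local audio file
print(result["text"])
```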

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.25
  • num_epochs: 60.0
  • mixed_precision_training: Native AMP
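These settings map directly onto transformers.TrainingArguments. Below is a minimal sketch under that assumption (single-device training, so train_batch_size maps to per_device_train_batch_size); the output_dir is a placeholder, and anything not listed above is left at its default.

```python
# Hypothetical reconstruction of the configuration above; output_dir is a
# placeholder and all unlisted settings stay at their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="base_sami_22k_ftpseudo_ftlabelled_sami_parliament",  # placeholder
    learning_rate=5e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",        # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.25,
    num_train_epochs=60.0,
    fp16=True,                  # "Native AMP"; fp16 vs. bf16 is an assumption
)
```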

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|---------------|-------|------|-----------------|--------|--------|
| 1059.4936     | 1.0   | 446  | 233.5363        | 0.4193 | 0.1316 |
| 868.2499      | 2.0   | 892  | 220.7794        | 0.4036 | 0.1232 |
| 820.7753      | 3.0   | 1338 | 256.3383        | 0.4162 | 0.1329 |
| 844.8047      | 4.0   | 1784 | 253.8045        | 0.4216 | 0.1430 |
| 792.2914      | 5.0   | 2230 | 250.6644        | 0.4473 | 0.1485 |
| 825.7003      | 6.0   | 2676 | 307.4147        | 0.4676 | 0.1611 |
| 840.0486      | 7.0   | 3122 | 304.3511        | 0.4777 | 0.1686 |

Framework versions

  • Transformers 4.48.3
  • PyTorch 2.5.1
  • Datasets 3.2.0
  • Tokenizers 0.21.0