---
license: apache-2.0
base_model: jonatasgrosman/wav2vec2-large-xlsr-53-english
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: wav2vec2-large-xlsr-53-english-finetuned-ravdess-v6
  results: []
---

# wav2vec2-large-xlsr-53-english-finetuned-ravdess-v6

This model is a fine-tuned version of [jonatasgrosman/wav2vec2-large-xlsr-53-english](https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-english) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1552
- Accuracy: 0.5660

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.079         | 0.07  | 10   | 2.0767          | 0.1667   |
| 2.0728        | 0.14  | 20   | 2.0719          | 0.1389   |
| 2.0713        | 0.21  | 30   | 2.0576          | 0.1562   |
| 2.056         | 0.28  | 40   | 2.0382          | 0.1181   |
| 2.0759        | 0.35  | 50   | 2.0160          | 0.2778   |
| 2.0117        | 0.42  | 60   | 1.9332          | 0.2778   |
| 1.8598        | 0.49  | 70   | 1.8759          | 0.2882   |
| 1.9277        | 0.56  | 80   | 1.8321          | 0.2812   |
| 1.7897        | 0.62  | 90   | 1.7278          | 0.3819   |
| 1.8157        | 0.69  | 100  | 1.7270          | 0.3646   |
| 1.9104        | 0.76  | 110  | 1.6997          | 0.3021   |
| 1.8557        | 0.83  | 120  | 1.6664          | 0.4271   |
| 1.8803        | 0.9   | 130  | 1.7943          | 0.3021   |
| 1.7548        | 0.97  | 140  | 1.8016          | 0.3021   |
| 1.7166        | 1.04  | 150  | 1.6303          | 0.3785   |
| 1.7237        | 1.11  | 160  | 1.6330          | 0.4132   |
| 1.7228        | 1.18  | 170  | 1.5905          | 0.4306   |
| 1.5683        | 1.25  | 180  | 1.5216          | 0.4340   |
| 1.716         | 1.32  | 190  | 1.4973          | 0.4306   |
| 1.562         | 1.39  | 200  | 1.5994          | 0.3715   |
| 1.5617        | 1.46  | 210  | 1.5699          | 0.4236   |
| 1.6539        | 1.53  | 220  | 1.5024          | 0.3993   |
| 1.58          | 1.6   | 230  | 1.4787          | 0.4132   |
| 1.5107        | 1.67  | 240  | 1.4252          | 0.4444   |
| 1.5934        | 1.74  | 250  | 1.4125          | 0.4444   |
| 1.54          | 1.81  | 260  | 1.4032          | 0.4236   |
| 1.4717        | 1.88  | 270  | 1.3636          | 0.4896   |
| 1.5257        | 1.94  | 280  | 1.5080          | 0.4306   |
| 1.4537        | 2.01  | 290  | 1.3346          | 0.4757   |
| 1.356         | 2.08  | 300  | 1.3636          | 0.4653   |
| 1.3572        | 2.15  | 310  | 1.3122          | 0.4757   |
| 1.2657        | 2.22  | 320  | 1.2927          | 0.5174   |
| 1.4931        | 2.29  | 330  | 1.3161          | 0.5382   |
| 1.3314        | 2.36  | 340  | 1.3248          | 0.5      |
| 1.375         | 2.43  | 350  | 1.2859          | 0.5521   |
| 1.3316        | 2.5   | 360  | 1.2747          | 0.5556   |
| 1.1443        | 2.57  | 370  | 1.2243          | 0.5625   |
| 1.3866        | 2.64  | 380  | 1.2122          | 0.5590   |
| 1.3274        | 2.71  | 390  | 1.2192          | 0.5174   |
| 1.1248        | 2.78  | 400  | 1.1993          | 0.5278   |
| 1.1337        | 2.85  | 410  | 1.1746          | 0.5556   |
| 1.1394        | 2.92  | 420  | 1.1603          | 0.5625   |
| 1.2199        | 2.99  | 430  | 1.1553          | 0.5660   |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
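
For reference, the training hyperparameters listed above roughly translate into the following `TrainingArguments`. This is a minimal sketch, not the exact training script: the `output_dir`, evaluation cadence (`eval_steps=10`, inferred from the results log), and logging settings are assumptions; the Adam betas and epsilon match the Transformers defaults.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters from this card; values not listed on the card
# (output_dir, evaluation/logging cadence) are assumptions for illustration.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-53-english-finetuned-ravdess-v6",  # hypothetical
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,   # effective total train batch size: 8
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",     # assumption: evaluation every 10 steps, as in the log above
    eval_steps=10,
    logging_steps=10,
)
```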
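
Since the card does not yet include a usage example, here is a minimal, hedged sketch of loading the checkpoint for audio classification with 🤗 Transformers. The model id and audio file path below are placeholders (assumptions based on this card's name), and the emotion label set is whatever the checkpoint's `id2label` mapping contains.

```python
from transformers import pipeline

# Hypothetical repository id; replace with the actual path to this checkpoint.
classifier = pipeline(
    "audio-classification",
    model="wav2vec2-large-xlsr-53-english-finetuned-ravdess-v6",
)

# Run inference on a local 16 kHz speech recording (path is illustrative).
predictions = classifier("speech_sample.wav")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.3f}")
```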