# clapAI/xlm-roberta-base-ViHSD-ep50
This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the ViHSD (Vietnamese Hate Speech Detection) dataset. It achieves the following results on the evaluation set:
- Loss: 0.4267
- Micro F1: 91.6981
- Micro Precision: 91.6981
- Micro Recall: 91.6981
- Macro F1: 88.7494
- Macro Precision: 90.1507
- Macro Recall: 87.5713
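The checkpoint can be loaded as a standard sequence-classification model. A minimal inference sketch follows; it assumes the usual `AutoModelForSequenceClassification` interface, and the exact label names come from the checkpoint's `id2label` config, which depends on how the ViHSD labels were encoded during fine-tuning:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "clapAI/xlm-roberta-base-ViHSD-ep50"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder Vietnamese input; replace with real text to classify.
text = "Ví dụ một câu tiếng Việt cần phân loại."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(dim=-1).item()
# Label names are taken from the checkpoint config (assumption:
# the fine-tuning run stored meaningful id2label entries).
print(model.config.id2label[pred_id])
```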
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 50.0
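These settings map directly onto the Hugging Face `Trainer` API. The sketch below is a non-authoritative reconstruction of the `TrainingArguments`; the dataset, model, and metric wiring are omitted, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Hyperparameters mirrored from the list above. The effective batch
# size is 64 (per device) x 4 (gradient accumulation) = 256, matching
# the reported total_train_batch_size.
training_args = TrainingArguments(
    output_dir="xlm-roberta-base-ViHSD-ep50",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,
    num_train_epochs=50,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
)
```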
### Training results
| Training Loss | Epoch | Step | Validation Loss | Micro F1 | Micro Precision | Micro Recall | Macro F1 | Macro Precision | Macro Recall |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
0.3566 | 1.0 | 56 | 0.3159 | 87.4214 | 87.4214 | 87.4214 | 81.0490 | 88.3189 | 77.6058 |
0.1918 | 2.0 | 112 | 0.2780 | 89.6226 | 89.6226 | 89.6226 | 86.5601 | 86.0956 | 87.0629 |
0.1995 | 3.0 | 168 | 0.2644 | 91.1321 | 91.1321 | 91.1321 | 87.5953 | 90.6506 | 85.4180 |
0.1661 | 4.0 | 224 | 0.2735 | 90.4403 | 90.4403 | 90.4403 | 86.4870 | 90.0802 | 84.0667 |
0.1304 | 5.0 | 280 | 0.2908 | 90.5660 | 90.5660 | 90.5660 | 87.2153 | 88.5626 | 86.0852 |
0.1223 | 6.0 | 336 | 0.2799 | 91.5094 | 91.5094 | 91.5094 | 88.6410 | 89.4330 | 87.9280 |
0.0858 | 7.0 | 392 | 0.2820 | 90.7547 | 90.7547 | 90.7547 | 87.6522 | 88.3510 | 87.0179 |
0.0834 | 8.0 | 448 | 0.3349 | 90.3774 | 90.3774 | 90.3774 | 87.0384 | 88.0992 | 86.1196 |
0.0757 | 9.0 | 504 | 0.4175 | 90.6918 | 90.6918 | 90.6918 | 87.4077 | 88.6792 | 86.3309 |
0.0559 | 10.0 | 560 | 0.4314 | 91.0692 | 91.0692 | 91.0692 | 87.5886 | 90.2648 | 85.6175 |
0.0502 | 11.0 | 616 | 0.4291 | 90.6918 | 90.6918 | 90.6918 | 87.2966 | 88.9900 | 85.9280 |
0.0413 | 12.0 | 672 | 0.4433 | 90.6289 | 90.6289 | 90.6289 | 87.1305 | 89.1346 | 85.5634 |
0.0379 | 13.0 | 728 | 0.4141 | 91.4465 | 91.4465 | 91.4465 | 88.4287 | 89.7340 | 87.3216 |
0.0319 | 14.0 | 784 | 0.4593 | 90.5660 | 90.5660 | 90.5660 | 86.9136 | 89.4415 | 85.0376 |
0.0357 | 15.0 | 840 | 0.4267 | 91.6981 | 91.6981 | 91.6981 | 88.7494 | 90.1507 | 87.5713 |
0.0335 | 16.0 | 896 | 0.4645 | 91.0692 | 91.0692 | 91.0692 | 88.1221 | 88.6493 | 87.6321 |
0.0204 | 17.0 | 952 | 0.4793 | 91.6352 | 91.6352 | 91.6352 | 88.7133 | 89.9027 | 87.6902 |
0.0184 | 18.0 | 1008 | 0.4481 | 91.6981 | 91.6981 | 91.6981 | 88.7494 | 90.1507 | 87.5713 |
0.0274 | 19.0 | 1064 | 0.5104 | 90.3145 | 90.3145 | 90.3145 | 87.1821 | 87.4953 | 86.8832 |
0.0171 | 20.0 | 1120 | 0.5093 | 90.8176 | 90.8176 | 90.8176 | 87.7671 | 88.3600 | 87.2213 |
0.0201 | 21.0 | 1176 | 0.5293 | 90.9434 | 90.9434 | 90.9434 | 87.7052 | 89.1543 | 86.5000 |
0.0106 | 22.0 | 1232 | 0.6152 | 90.9434 | 90.9434 | 90.9434 | 87.3904 | 90.1587 | 85.3718 |
0.0144 | 23.0 | 1288 | 0.6013 | 90.7547 | 90.7547 | 90.7547 | 87.4600 | 88.8578 | 86.2926 |
0.0052 | 24.0 | 1344 | 0.7308 | 91.1950 | 91.1950 | 91.1950 | 87.5538 | 91.2457 | 85.0574 |
0.0009 | 25.0 | 1400 | 0.6435 | 91.3208 | 91.3208 | 91.3208 | 88.0046 | 90.3981 | 86.1895 |
0.0084 | 26.0 | 1456 | 0.7136 | 91.3836 | 91.3836 | 91.3836 | 87.9922 | 90.8570 | 85.9094 |
0.0069 | 27.0 | 1512 | 0.6588 | 91.0063 | 91.0063 | 91.0063 | 87.5586 | 89.9755 | 85.7364 |
0.0044 | 28.0 | 1568 | 0.6204 | 90.9434 | 90.9434 | 90.9434 | 87.7480 | 89.0308 | 86.6612 |
0.0104 | 29.0 | 1624 | 0.6941 | 91.3208 | 91.3208 | 91.3208 | 87.6602 | 91.7369 | 84.9807 |
0.0014 | 30.0 | 1680 | 0.6836 | 91.5723 | 91.5723 | 91.5723 | 88.5590 | 90.0399 | 87.3256 |
0.0018 | 31.0 | 1736 | 0.7542 | 91.5723 | 91.5723 | 91.5723 | 88.4156 | 90.5348 | 86.7615 |
0.0001 | 32.0 | 1792 | 0.7364 | 91.4465 | 91.4465 | 91.4465 | 88.3473 | 89.9964 | 86.9993 |
0.0007 | 33.0 | 1848 | 0.7620 | 91.0063 | 91.0063 | 91.0063 | 87.8436 | 89.0884 | 86.7840 |
0.012 | 34.0 | 1904 | 0.7525 | 91.3836 | 91.3836 | 91.3836 | 87.9922 | 90.8570 | 85.9094 |
0.0001 | 35.0 | 1960 | 0.7146 | 91.1950 | 91.1950 | 91.1950 | 87.8750 | 90.0623 | 86.1855 |
0.0013 | 36.0 | 2016 | 0.7613 | 90.6289 | 90.6289 | 90.6289 | 87.3336 | 88.5620 | 86.2887 |
0.0006 | 37.0 | 2072 | 0.7886 | 91.6352 | 91.6352 | 91.6352 | 88.3428 | 91.2327 | 86.2396 |
0.0001 | 38.0 | 2128 | 0.8067 | 91.0063 | 91.0063 | 91.0063 | 87.7798 | 89.2746 | 86.5423 |
0.0 | 39.0 | 2184 | 0.8097 | 90.8176 | 90.8176 | 90.8176 | 87.6420 | 88.6803 | 86.7378 |
0.0 | 40.0 | 2240 | 0.8077 | 90.9434 | 90.9434 | 90.9434 | 87.7692 | 88.9708 | 86.7417 |
0.0004 | 41.0 | 2296 | 0.8075 | 91.0692 | 91.0692 | 91.0692 | 87.8332 | 89.4610 | 86.5039 |
0.0001 | 42.0 | 2352 | 0.8013 | 90.8805 | 90.8805 | 90.8805 | 87.6306 | 89.0346 | 86.4577 |
0.0001 | 43.0 | 2408 | 0.8036 | 91.0692 | 91.0692 | 91.0692 | 87.7681 | 89.6645 | 86.2622 |
0.0005 | 44.0 | 2464 | 0.8019 | 91.2579 | 91.2579 | 91.2579 | 88.0585 | 89.8306 | 86.6307 |
0.0045 | 45.0 | 2520 | 0.8000 | 91.2579 | 91.2579 | 91.2579 | 88.0372 | 89.9001 | 86.5501 |
0.0 | 46.0 | 2576 | 0.8028 | 91.1950 | 91.1950 | 91.1950 | 87.9833 | 89.7067 | 86.5885 |
0.0001 | 47.0 | 2632 | 0.8077 | 91.0692 | 91.0692 | 91.0692 | 87.8332 | 89.4610 | 86.5039 |
0.0 | 48.0 | 2688 | 0.8071 | 91.1321 | 91.1321 | 91.1321 | 87.9082 | 89.5835 | 86.5462 |
0.0002 | 49.0 | 2744 | 0.8076 | 91.0692 | 91.0692 | 91.0692 | 87.8332 | 89.4610 | 86.5039 |
0.0007 | 50.0 | 2800 | 0.8071 | 91.1950 | 91.1950 | 91.1950 | 88.0045 | 89.6394 | 86.6690 |
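Note that the micro F1, precision, and recall columns are identical at every step: for single-label classification, all three micro-averaged metrics reduce to plain accuracy, while the macro columns average per class and so diverge. A minimal scikit-learn sketch with placeholder labels illustrates the two averaging modes used in the table:

```python
from sklearn.metrics import precision_recall_fscore_support

# Placeholder labels for illustration only, not from the ViHSD eval set.
y_true = [0, 1, 1, 0, 1]
y_pred = [0, 1, 0, 0, 1]

for avg in ("micro", "macro"):
    p, r, f1, _ = precision_recall_fscore_support(y_true, y_pred, average=avg)
    # With average="micro" on single-label data, p == r == f1 == accuracy.
    print(f"{avg}: precision={p:.4f} recall={r:.4f} f1={f1:.4f}")
```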
### Framework versions
- Transformers 4.50.0
- Pytorch 2.4.0+cu121
- Datasets 2.15.0
- Tokenizers 0.21.1