# gbert-base-defakts-fake-binary
This model is fine-tuned for sequence classification (a binary fake-news classification task) on the German DeFaktS dataset. It achieves the following results on the evaluation set:
- Loss: 0.3441
- Accuracy: 0.8526
- F1: 0.8413
- Precision: 0.8545
- Recall: 0.8337
## Model description
This model is a gbert-base checkpoint fine-tuned for sequence classification, i.e. binary fake-news detection on German text.
## Dataset
Trained on the DeFaktS dataset (https://github.com/caisa-lab/DeFaktS-Dataset-Disinformaton-Detection), using the `catposfake`/`catneutral` features as the fake/neutral labels.
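The card only names these two category features; the following is a hypothetical sketch of collapsing them into a single binary label, assuming a CSV export with one row per text and the column names taken from the card. The actual file layout in the DeFaktS repository may differ.

```python
import pandas as pd

# Hypothetical file name and layout; the actual DeFaktS repository may
# organize its files differently. catposfake/catneutral are treated as
# mutually exclusive indicator columns, per the card's description.
df = pd.read_csv("defakts.csv")
df["label"] = (df["catposfake"] == 1).astype(int)  # 1 = fake, 0 = neutral
```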
## Intended uses & limitations
Binary fake-news classification of German-language text.
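A minimal inference sketch using the `transformers` pipeline API. The model id below is a placeholder, since the full Hub repository id (user namespace) is not shown in this card, and the output label names depend on whether `id2label` was configured during fine-tuning.

```python
from transformers import pipeline

# Placeholder model id: substitute the actual Hub repository id (the
# user namespace is not shown in this card) or a local checkpoint path.
classifier = pipeline(
    "text-classification",
    model="<namespace>/gbert-base-defakts-fake-binary",
)

print(classifier("Die Regierung verheimlicht angeblich eine geheime Studie."))
# e.g. [{'label': 'LABEL_1', 'score': 0.93}]; labels are LABEL_0/LABEL_1
# unless id2label was set during fine-tuning.
```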
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch with `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
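A sketch of how these settings map onto `transformers` `TrainingArguments`, assuming deepset/gbert-base as the base checkpoint (inferred from the model name) and a tokenized DeFaktS split. The dummy dataset below only stands in for the real data.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "deepset/gbert-base"  # assumed base checkpoint, inferred from the model name
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)

# Dummy rows standing in for the tokenized DeFaktS splits.
raw = Dataset.from_dict(
    {"text": ["Beispieltext eins.", "Beispieltext zwei."], "label": [0, 1]}
)
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
)

args = TrainingArguments(
    output_dir="gbert-base-defakts-fake-binary",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=6,
    lr_scheduler_type="linear",
    seed=42,
    eval_strategy="steps",
    eval_steps=50,  # matches the 50-step evaluation cadence in the results table
)

# The listed Adam betas/epsilon are the Trainer defaults, so no explicit
# optimizer configuration is needed here.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,
)
trainer.train()
```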
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy@de | F1@de | Precision@de | Recall@de | Loss@de |
|---|---|---|---|---|---|---|---|---|
0.6236 | 0.0888 | 50 | 0.5162 | 0.7226 | 0.7219 | 0.7377 | 0.7460 | 0.5162 |
0.4355 | 0.1776 | 100 | 0.4000 | 0.8201 | 0.8112 | 0.8116 | 0.8107 | 0.4000 |
0.3937 | 0.2664 | 150 | 0.3916 | 0.8316 | 0.8181 | 0.8323 | 0.8106 | 0.3917 |
0.409 | 0.3552 | 200 | 0.3726 | 0.8331 | 0.8216 | 0.8299 | 0.8163 | 0.3727 |
0.3579 | 0.4440 | 250 | 0.3572 | 0.8386 | 0.8285 | 0.8339 | 0.8246 | 0.3573 |
0.366 | 0.5329 | 300 | 0.3519 | 0.8456 | 0.8370 | 0.8396 | 0.8349 | 0.3521 |
0.3735 | 0.6217 | 350 | 0.4180 | 0.7991 | 0.7976 | 0.8038 | 0.8179 | 0.4180 |
0.3849 | 0.7105 | 400 | 0.3605 | 0.8386 | 0.8198 | 0.8592 | 0.8064 | 0.3607 |
0.3752 | 0.7993 | 450 | 0.3379 | 0.8581 | 0.8532 | 0.8499 | 0.8586 | 0.3379 |
0.3547 | 0.8881 | 500 | 0.3441 | 0.8526 | 0.8413 | 0.8545 | 0.8337 | 0.3442 |
0.344 | 0.9769 | 550 | 0.3439 | 0.8461 | 0.8320 | 0.8540 | 0.8218 | 0.3440 |
0.2853 | 1.0657 | 600 | 0.3525 | 0.8446 | 0.8386 | 0.8361 | 0.8421 | 0.3526 |
0.2505 | 1.1545 | 650 | 0.3427 | 0.8626 | 0.8512 | 0.8686 | 0.8419 | 0.3429 |
0.2358 | 1.2433 | 700 | 0.3537 | 0.8626 | 0.8556 | 0.8565 | 0.8547 | 0.3537 |
0.257 | 1.3321 | 750 | 0.3258 | 0.8621 | 0.8559 | 0.8548 | 0.8572 | 0.3259 |
0.2369 | 1.4210 | 800 | 0.3810 | 0.8551 | 0.8505 | 0.8469 | 0.8571 | 0.3811 |
0.248 | 1.5098 | 850 | 0.3576 | 0.8721 | 0.8641 | 0.8701 | 0.8596 | 0.3578 |
0.2612 | 1.5986 | 900 | 0.3273 | 0.8686 | 0.8583 | 0.8729 | 0.8500 | 0.3275 |
0.2532 | 1.6874 | 950 | 0.3235 | 0.8636 | 0.8567 | 0.8575 | 0.8560 | 0.3236 |
0.2225 | 1.7762 | 1000 | 0.3513 | 0.8666 | 0.8567 | 0.8690 | 0.8492 | 0.3515 |
0.2497 | 1.8650 | 1050 | 0.3497 | 0.8711 | 0.8625 | 0.8705 | 0.8570 | 0.3498 |
0.2291 | 1.9538 | 1100 | 0.3395 | 0.8761 | 0.8685 | 0.8740 | 0.8643 | 0.3396 |
0.1675 | 2.0426 | 1150 | 0.3944 | 0.8671 | 0.8583 | 0.8660 | 0.8530 | 0.3946 |
0.1182 | 2.1314 | 1200 | 0.4743 | 0.8626 | 0.8532 | 0.8621 | 0.8473 | 0.4746 |
0.1453 | 2.2202 | 1250 | 0.4977 | 0.8646 | 0.8531 | 0.8719 | 0.8433 | 0.4981 |
0.1288 | 2.3091 | 1300 | 0.4345 | 0.8696 | 0.8631 | 0.8637 | 0.8625 | 0.4347 |
0.1399 | 2.3979 | 1350 | 0.4128 | 0.8751 | 0.8665 | 0.8758 | 0.8603 | 0.4132 |
0.1384 | 2.4867 | 1400 | 0.3688 | 0.8776 | 0.8713 | 0.8723 | 0.8704 | 0.3690 |
0.1292 | 2.5755 | 1450 | 0.4154 | 0.8781 | 0.8710 | 0.8749 | 0.8679 | 0.4157 |
0.112 | 2.6643 | 1500 | 0.4399 | 0.8661 | 0.8592 | 0.8603 | 0.8583 | 0.4401 |
0.108 | 2.7531 | 1550 | 0.4439 | 0.8731 | 0.8659 | 0.8692 | 0.8631 | 0.4442 |
0.1153 | 2.8419 | 1600 | 0.4476 | 0.8676 | 0.8590 | 0.8662 | 0.8539 | 0.4479 |
0.1228 | 2.9307 | 1650 | 0.4933 | 0.8691 | 0.8589 | 0.8733 | 0.8506 | 0.4936 |
0.0955 | 3.0195 | 1700 | 0.5272 | 0.8806 | 0.8726 | 0.8810 | 0.8668 | 0.5275 |
0.0502 | 3.1083 | 1750 | 0.6531 | 0.8661 | 0.8554 | 0.8708 | 0.8468 | 0.6537 |
0.0604 | 3.1972 | 1800 | 0.6515 | 0.8721 | 0.8635 | 0.8719 | 0.8578 | 0.6520 |
0.068 | 3.2860 | 1850 | 0.6422 | 0.8756 | 0.8656 | 0.8820 | 0.8564 | 0.6427 |
0.0505 | 3.3748 | 1900 | 0.6262 | 0.8681 | 0.8606 | 0.8639 | 0.8579 | 0.6266 |
0.065 | 3.4636 | 1950 | 0.6342 | 0.8681 | 0.8614 | 0.8623 | 0.8606 | 0.6345 |
0.0645 | 3.5524 | 2000 | 0.6472 | 0.8696 | 0.8623 | 0.8653 | 0.8598 | 0.6476 |
0.0951 | 3.6412 | 2050 | 0.6048 | 0.8661 | 0.8576 | 0.8641 | 0.8529 | 0.6051 |
0.0336 | 3.7300 | 2100 | 0.6603 | 0.8736 | 0.8661 | 0.8705 | 0.8626 | 0.6608 |
0.0551 | 3.8188 | 2150 | 0.6932 | 0.8716 | 0.8638 | 0.8689 | 0.8599 | 0.6938 |
0.0595 | 3.9076 | 2200 | 0.6379 | 0.8756 | 0.8687 | 0.8715 | 0.8663 | 0.6384 |
0.0681 | 3.9964 | 2250 | 0.6327 | 0.8751 | 0.8661 | 0.8774 | 0.8589 | 0.6332 |
0.0273 | 4.0853 | 2300 | 0.6414 | 0.8731 | 0.8656 | 0.8701 | 0.8620 | 0.6419 |
0.0261 | 4.1741 | 2350 | 0.6590 | 0.8761 | 0.8699 | 0.8705 | 0.8692 | 0.6594 |
0.0173 | 4.2629 | 2400 | 0.7341 | 0.8776 | 0.8703 | 0.8750 | 0.8666 | 0.7347 |
0.024 | 4.3517 | 2450 | 0.7647 | 0.8706 | 0.8639 | 0.8652 | 0.8626 | 0.7651 |
0.0324 | 4.4405 | 2500 | 0.7651 | 0.8741 | 0.8657 | 0.8738 | 0.8601 | 0.7658 |
0.0144 | 4.5293 | 2550 | 0.7918 | 0.8691 | 0.8599 | 0.8698 | 0.8535 | 0.7925 |
0.0402 | 4.6181 | 2600 | 0.7661 | 0.8691 | 0.8604 | 0.8684 | 0.8549 | 0.7667 |
0.0311 | 4.7069 | 2650 | 0.7688 | 0.8706 | 0.8619 | 0.8701 | 0.8564 | 0.7694 |
0.0127 | 4.7957 | 2700 | 0.7880 | 0.8691 | 0.8607 | 0.8675 | 0.8558 | 0.7886 |
0.03 | 4.8845 | 2750 | 0.7616 | 0.8736 | 0.8672 | 0.8680 | 0.8665 | 0.7620 |
0.0431 | 4.9734 | 2800 | 0.7842 | 0.8716 | 0.8631 | 0.8710 | 0.8576 | 0.7848 |
0.0243 | 5.0622 | 2850 | 0.7620 | 0.8701 | 0.8612 | 0.8704 | 0.8550 | 0.7626 |
0.0135 | 5.1510 | 2900 | 0.7799 | 0.8711 | 0.8624 | 0.8710 | 0.8565 | 0.7805 |
0.0177 | 5.2398 | 2950 | 0.7644 | 0.8746 | 0.8672 | 0.8716 | 0.8637 | 0.7649 |
0.0126 | 5.3286 | 3000 | 0.7826 | 0.8736 | 0.8656 | 0.8721 | 0.8608 | 0.7832 |
0.011 | 5.4174 | 3050 | 0.7951 | 0.8761 | 0.8681 | 0.8751 | 0.8631 | 0.7957 |
0.0214 | 5.5062 | 3100 | 0.7953 | 0.8741 | 0.8657 | 0.8738 | 0.8601 | 0.7960 |
0.0099 | 5.5950 | 3150 | 0.7855 | 0.8746 | 0.8667 | 0.8729 | 0.8621 | 0.7861 |
0.0193 | 5.6838 | 3200 | 0.7967 | 0.8746 | 0.8661 | 0.8747 | 0.8603 | 0.7974 |
0.0171 | 5.7726 | 3250 | 0.7956 | 0.8751 | 0.8669 | 0.8744 | 0.8616 | 0.7962 |
0.013 | 5.8615 | 3300 | 0.7972 | 0.8741 | 0.8658 | 0.8736 | 0.8604 | 0.7978 |
0.0176 | 5.9503 | 3350 | 0.8003 | 0.8736 | 0.8651 | 0.8734 | 0.8595 | 0.8009 |
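The card does not show how the Accuracy/F1/Precision/Recall columns were computed. Below is a plausible `compute_metrics` implementation using scikit-learn; the averaging mode is an explicit assumption, not something the card states.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Macro averaging is an assumption; the card does not state which
    # averaging mode produced the reported F1/precision/recall.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```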
### Framework versions
- Transformers 4.45.2
- Pytorch 2.3.1+cu121
- Tokenizers 0.20.3