ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7791
  • Qwk: 0.0236
  • Mse: 0.7791
  • Rmse: 0.8827
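For reference, Qwk is the quadratic weighted kappa, Mse is the mean squared error, and Rmse is its square root. A minimal sketch of how these metrics can be computed with scikit-learn and NumPy (the score arrays below are illustrative, not the model's actual predictions):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative integer organization scores, not outputs of this model.
y_true = np.array([0, 1, 2, 2, 1, 0, 3])
y_pred = np.array([0, 1, 1, 2, 2, 0, 2])

# Quadratic weighted kappa: agreement measure that penalizes large
# disagreements quadratically; 1.0 is perfect, 0.0 is chance level.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # RMSE is simply the square root of MSE

print(qwk, mse, rmse)
```

Note that in the results above Mse equals the reported Loss, which suggests the model was trained as a regressor with an MSE objective.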

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
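With lr_scheduler_type "linear", the learning rate decays linearly from its initial value to zero over the total number of training steps. A minimal sketch, assuming no warmup and an illustrative total step count (not taken from this run):

```python
def linear_lr(step, base_lr=2e-5, total_steps=1000):
    """Learning rate at `step` under a linear decay schedule with no warmup.

    Decays from `base_lr` at step 0 to 0.0 at `total_steps` and beyond.
    """
    remaining = max(0.0, float(total_steps - step) / float(total_steps))
    return base_lr * remaining

print(linear_lr(0))     # full learning rate at the start: 2e-05
print(linear_lr(500))   # halfway through: 1e-05
print(linear_lr(1000))  # fully decayed: 0.0
```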

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0204 2 3.5928 -0.0154 3.5928 1.8955
No log 0.0408 4 1.8765 0.0191 1.8765 1.3699
No log 0.0612 6 1.5376 0.0685 1.5376 1.2400
No log 0.0816 8 1.0385 -0.1609 1.0385 1.0191
No log 0.1020 10 0.9592 -0.0285 0.9592 0.9794
No log 0.1224 12 0.9561 0.0046 0.9561 0.9778
No log 0.1429 14 0.6609 0.0555 0.6609 0.8129
No log 0.1633 16 0.6561 0.0 0.6561 0.8100
No log 0.1837 18 0.6853 0.1444 0.6853 0.8278
No log 0.2041 20 0.7706 0.0191 0.7706 0.8778
No log 0.2245 22 0.7892 -0.0331 0.7892 0.8884
No log 0.2449 24 0.7664 -0.0331 0.7664 0.8754
No log 0.2653 26 0.7523 0.1379 0.7523 0.8674
No log 0.2857 28 0.8250 0.0318 0.8250 0.9083
No log 0.3061 30 1.0575 -0.0157 1.0575 1.0284
No log 0.3265 32 1.0189 -0.0424 1.0189 1.0094
No log 0.3469 34 0.9202 0.1228 0.9202 0.9593
No log 0.3673 36 1.0071 0.1028 1.0071 1.0035
No log 0.3878 38 1.0006 0.1584 1.0006 1.0003
No log 0.4082 40 0.9044 0.1475 0.9044 0.9510
No log 0.4286 42 0.9139 0.1198 0.9139 0.9560
No log 0.4490 44 1.0795 0.0437 1.0795 1.0390
No log 0.4694 46 1.0758 0.0631 1.0758 1.0372
No log 0.4898 48 1.0274 0.0775 1.0274 1.0136
No log 0.5102 50 1.0965 -0.0007 1.0965 1.0472
No log 0.5306 52 0.9954 -0.0256 0.9954 0.9977
No log 0.5510 54 1.4192 0.0636 1.4192 1.1913
No log 0.5714 56 1.5287 0.0399 1.5287 1.2364
No log 0.5918 58 0.9841 0.0659 0.9841 0.9920
No log 0.6122 60 1.1634 -0.0937 1.1634 1.0786
No log 0.6327 62 1.1412 -0.0320 1.1412 1.0683
No log 0.6531 64 0.8109 0.1135 0.8109 0.9005
No log 0.6735 66 1.4188 0.0874 1.4188 1.1911
No log 0.6939 68 1.6762 0.0882 1.6762 1.2947
No log 0.7143 70 1.3187 0.0610 1.3187 1.1483
No log 0.7347 72 0.8202 0.1646 0.8202 0.9057
No log 0.7551 74 1.2282 0.0462 1.2282 1.1082
No log 0.7755 76 1.3194 0.0919 1.3194 1.1487
No log 0.7959 78 0.8934 0.0719 0.8934 0.9452
No log 0.8163 80 0.8396 0.0238 0.8396 0.9163
No log 0.8367 82 1.0380 0.1634 1.0380 1.0188
No log 0.8571 84 0.9793 0.1652 0.9793 0.9896
No log 0.8776 86 0.7687 0.2118 0.7687 0.8768
No log 0.8980 88 1.0344 0.0440 1.0344 1.0171
No log 0.9184 90 1.3916 0.0967 1.3916 1.1797
No log 0.9388 92 1.1115 0.0042 1.1115 1.0543
No log 0.9592 94 0.7921 0.1009 0.7921 0.8900
No log 0.9796 96 0.7867 0.0851 0.7867 0.8869
No log 1.0 98 0.8062 0.1400 0.8062 0.8979
No log 1.0204 100 0.8226 0.1782 0.8226 0.9069
No log 1.0408 102 0.9278 0.1358 0.9278 0.9632
No log 1.0612 104 0.9223 0.1329 0.9223 0.9603
No log 1.0816 106 0.8248 0.1359 0.8248 0.9082
No log 1.1020 108 0.8538 0.1621 0.8538 0.9240
No log 1.1224 110 0.7948 0.1196 0.7948 0.8915
No log 1.1429 112 0.8232 0.0119 0.8232 0.9073
No log 1.1633 114 0.8397 0.0913 0.8397 0.9163
No log 1.1837 116 0.9017 0.1513 0.9017 0.9496
No log 1.2041 118 1.2682 0.0324 1.2682 1.1262
No log 1.2245 120 1.1390 0.1045 1.1390 1.0672
No log 1.2449 122 1.0237 0.1370 1.0237 1.0118
No log 1.2653 124 1.3970 0.1528 1.3970 1.1820
No log 1.2857 126 1.3462 0.1593 1.3462 1.1603
No log 1.3061 128 0.9045 0.0974 0.9045 0.9511
No log 1.3265 130 0.8684 0.1065 0.8684 0.9319
No log 1.3469 132 1.0761 0.1469 1.0761 1.0374
No log 1.3673 134 0.9010 0.0719 0.9010 0.9492
No log 1.3878 136 0.7657 0.1048 0.7657 0.8751
No log 1.4082 138 0.7888 0.1299 0.7888 0.8881
No log 1.4286 140 0.8606 0.1304 0.8606 0.9277
No log 1.4490 142 0.7990 0.0327 0.7990 0.8939
No log 1.4694 144 0.8271 0.1379 0.8271 0.9095
No log 1.4898 146 0.8647 0.1758 0.8647 0.9299
No log 1.5102 148 0.8109 0.0639 0.8109 0.9005
No log 1.5306 150 0.7733 0.0680 0.7733 0.8794
No log 1.5510 152 0.8251 0.0902 0.8251 0.9084
No log 1.5714 154 0.8457 0.0905 0.8457 0.9196
No log 1.5918 156 0.7947 0.0741 0.7947 0.8914
No log 1.6122 158 0.8304 0.0961 0.8304 0.9112
No log 1.6327 160 0.8112 0.0257 0.8112 0.9007
No log 1.6531 162 0.8594 0.0944 0.8594 0.9270
No log 1.6735 164 0.8565 -0.0051 0.8565 0.9255
No log 1.6939 166 0.9052 0.0880 0.9052 0.9514
No log 1.7143 168 1.0757 0.1044 1.0757 1.0371
No log 1.7347 170 1.0485 0.1110 1.0485 1.0240
No log 1.7551 172 1.1544 0.0952 1.1544 1.0744
No log 1.7755 174 0.9012 0.0920 0.9012 0.9493
No log 1.7959 176 0.8745 -0.0459 0.8745 0.9351
No log 1.8163 178 0.9319 0.0627 0.9319 0.9653
No log 1.8367 180 0.8660 -0.0851 0.8660 0.9306
No log 1.8571 182 0.8464 0.1003 0.8464 0.9200
No log 1.8776 184 0.9662 0.0596 0.9662 0.9829
No log 1.8980 186 0.9064 0.1329 0.9064 0.9520
No log 1.9184 188 0.8741 0.0101 0.8741 0.9350
No log 1.9388 190 1.0920 0.1369 1.0920 1.0450
No log 1.9592 192 1.1949 0.0810 1.1949 1.0931
No log 1.9796 194 0.9294 -0.0167 0.9294 0.9641
No log 2.0 196 1.1448 0.1353 1.1448 1.0700
No log 2.0204 198 1.4939 0.0027 1.4939 1.2222
No log 2.0408 200 1.2611 0.1758 1.2611 1.1230
No log 2.0612 202 0.8045 0.1495 0.8045 0.8969
No log 2.0816 204 0.7438 -0.0062 0.7438 0.8625
No log 2.1020 206 0.8548 0.0617 0.8548 0.9245
No log 2.1224 208 0.8656 0.0592 0.8656 0.9304
No log 2.1429 210 0.7959 -0.0118 0.7959 0.8921
No log 2.1633 212 0.8184 0.1048 0.8184 0.9046
No log 2.1837 214 0.8517 0.0837 0.8517 0.9229
No log 2.2041 216 0.7890 0.1048 0.7890 0.8883
No log 2.2245 218 0.7939 0.1434 0.7939 0.8910
No log 2.2449 220 0.7759 0.0503 0.7759 0.8808
No log 2.2653 222 0.7449 0.0914 0.7449 0.8631
No log 2.2857 224 0.7872 0.1495 0.7872 0.8873
No log 2.3061 226 0.8632 0.0876 0.8632 0.9291
No log 2.3265 228 0.8065 0.1423 0.8065 0.8981
No log 2.3469 230 0.7903 0.0821 0.7903 0.8890
No log 2.3673 232 0.8099 0.1095 0.8099 0.8999
No log 2.3878 234 0.7860 0.1095 0.7860 0.8866
No log 2.4082 236 0.8229 0.1440 0.8229 0.9071
No log 2.4286 238 0.8830 0.1235 0.8830 0.9397
No log 2.4490 240 0.9044 0.1235 0.9044 0.9510
No log 2.4694 242 0.8551 0.0725 0.8551 0.9247
No log 2.4898 244 1.0424 0.1076 1.0424 1.0210
No log 2.5102 246 1.1698 0.1103 1.1698 1.0816
No log 2.5306 248 0.9782 0.1077 0.9782 0.9890
No log 2.5510 250 0.8220 0.1541 0.8220 0.9066
No log 2.5714 252 1.0890 0.0915 1.0890 1.0436
No log 2.5918 254 1.1801 0.0653 1.1801 1.0863
No log 2.6122 256 0.8957 0.1235 0.8957 0.9464
No log 2.6327 258 0.8248 0.0441 0.8248 0.9082
No log 2.6531 260 0.9410 0.0048 0.9410 0.9701
No log 2.6735 262 0.8312 -0.1045 0.8312 0.9117
No log 2.6939 264 0.7042 0.1318 0.7042 0.8392
No log 2.7143 266 0.8934 0.1107 0.8934 0.9452
No log 2.7347 268 1.1271 -0.0047 1.1271 1.0617
No log 2.7551 270 1.0198 0.0025 1.0198 1.0098
No log 2.7755 272 0.7734 0.1047 0.7734 0.8794
No log 2.7959 274 0.7837 0.0481 0.7837 0.8853
No log 2.8163 276 0.8318 -0.0956 0.8318 0.9120
No log 2.8367 278 0.8075 -0.0661 0.8075 0.8986
No log 2.8571 280 0.7280 0.0454 0.7280 0.8532
No log 2.8776 282 0.6990 0.2339 0.6990 0.8361
No log 2.8980 284 0.7805 0.1329 0.7805 0.8835
No log 2.9184 286 0.7611 0.2288 0.7611 0.8724
No log 2.9388 288 0.7587 0.1800 0.7587 0.8710
No log 2.9592 290 0.7272 0.1553 0.7272 0.8528
No log 2.9796 292 0.7357 0.1047 0.7357 0.8577
No log 3.0 294 0.7069 0.1902 0.7069 0.8408
No log 3.0204 296 0.7315 -0.0541 0.7315 0.8553
No log 3.0408 298 0.7644 -0.0958 0.7644 0.8743
No log 3.0612 300 0.7778 0.0 0.7778 0.8819
No log 3.0816 302 0.8504 0.1758 0.8504 0.9222
No log 3.1020 304 0.8906 0.1758 0.8906 0.9437
No log 3.1224 306 0.8215 -0.0156 0.8215 0.9063
No log 3.1429 308 0.8297 -0.0785 0.8297 0.9109
No log 3.1633 310 0.7651 -0.0118 0.7651 0.8747
No log 3.1837 312 0.7396 0.0628 0.7396 0.8600
No log 3.2041 314 0.7219 0.0318 0.7219 0.8496
No log 3.2245 316 0.7358 0.0 0.7358 0.8578
No log 3.2449 318 0.7462 -0.0958 0.7462 0.8638
No log 3.2653 320 0.7285 -0.0062 0.7285 0.8535
No log 3.2857 322 0.7405 0.1627 0.7405 0.8605
No log 3.3061 324 0.7787 0.1097 0.7787 0.8824
No log 3.3265 326 0.8438 0.0799 0.8438 0.9186
No log 3.3469 328 0.7957 0.1003 0.7957 0.8920
No log 3.3673 330 0.7843 0.1529 0.7843 0.8856
No log 3.3878 332 0.7926 0.1585 0.7926 0.8903
No log 3.4082 334 0.8478 0.0842 0.8478 0.9208
No log 3.4286 336 0.8156 0.1281 0.8156 0.9031
No log 3.4490 338 0.7593 0.0834 0.7593 0.8714
No log 3.4694 340 0.7541 0.0089 0.7541 0.8684
No log 3.4898 342 0.7094 0.0436 0.7094 0.8422
No log 3.5102 344 0.7122 0.2258 0.7122 0.8439
No log 3.5306 346 0.7416 0.2544 0.7416 0.8612
No log 3.5510 348 0.7953 0.1585 0.7953 0.8918
No log 3.5714 350 0.8854 0.1251 0.8854 0.9410
No log 3.5918 352 0.8901 0.1483 0.8901 0.9434
No log 3.6122 354 0.8673 0.0810 0.8673 0.9313
No log 3.6327 356 0.8588 0.0452 0.8588 0.9267
No log 3.6531 358 0.8158 0.0490 0.8158 0.9032
No log 3.6735 360 0.7738 0.0650 0.7738 0.8797
No log 3.6939 362 0.7791 0.1506 0.7791 0.8826
No log 3.7143 364 0.7500 0.1675 0.7500 0.8660
No log 3.7347 366 0.7476 0.0869 0.7476 0.8647
No log 3.7551 368 0.7775 0.0723 0.7775 0.8818
No log 3.7755 370 0.8238 0.1817 0.8238 0.9076
No log 3.7959 372 0.7490 0.0814 0.7490 0.8654
No log 3.8163 374 0.7562 0.1148 0.7562 0.8696
No log 3.8367 376 0.7252 0.1444 0.7252 0.8516
No log 3.8571 378 0.7404 -0.0406 0.7404 0.8604
No log 3.8776 380 0.7416 -0.0406 0.7416 0.8612
No log 3.8980 382 0.7105 0.0914 0.7105 0.8429
No log 3.9184 384 0.7483 0.1097 0.7483 0.8650
No log 3.9388 386 0.7704 0.1096 0.7704 0.8777
No log 3.9592 388 0.8388 0.0580 0.8388 0.9158
No log 3.9796 390 0.9252 0.1300 0.9252 0.9619
No log 4.0 392 0.8687 0.0476 0.8687 0.9320
No log 4.0204 394 0.8223 0.0961 0.8223 0.9068
No log 4.0408 396 0.8025 0.1139 0.8025 0.8958
No log 4.0612 398 0.7882 0.0709 0.7882 0.8878
No log 4.0816 400 0.8075 0.0917 0.8075 0.8986
No log 4.1020 402 0.8032 0.0917 0.8031 0.8962
No log 4.1224 404 0.8069 0.0917 0.8069 0.8983
No log 4.1429 406 0.7979 0.0959 0.7979 0.8932
No log 4.1633 408 0.7667 0.0289 0.7667 0.8756
No log 4.1837 410 0.7720 0.0357 0.7720 0.8786
No log 4.2041 412 0.7891 0.0574 0.7891 0.8883
No log 4.2245 414 0.7901 0.0622 0.7901 0.8889
No log 4.2449 416 0.7714 0.0828 0.7714 0.8783
No log 4.2653 418 0.7772 -0.0488 0.7772 0.8816
No log 4.2857 420 0.7563 -0.0958 0.7563 0.8697
No log 4.3061 422 0.7185 0.0436 0.7185 0.8476
No log 4.3265 424 0.8389 0.0909 0.8389 0.9159
No log 4.3469 426 1.0658 0.0717 1.0658 1.0324
No log 4.3673 428 1.0005 0.0224 1.0005 1.0003
No log 4.3878 430 0.8253 0.1395 0.8253 0.9085
No log 4.4082 432 0.7798 -0.0488 0.7798 0.8831
No log 4.4286 434 0.8948 0.1078 0.8948 0.9459
No log 4.4490 436 0.9207 0.1078 0.9207 0.9595
No log 4.4694 438 0.7936 -0.0488 0.7936 0.8908
No log 4.4898 440 0.7859 0.1047 0.7859 0.8865
No log 4.5102 442 0.8957 0.0719 0.8957 0.9464
No log 4.5306 444 0.8450 0.0409 0.8450 0.9193
No log 4.5510 446 0.7292 0.1097 0.7292 0.8539
No log 4.5714 448 0.7348 0.0414 0.7348 0.8572
No log 4.5918 450 0.8003 0.0545 0.8003 0.8946
No log 4.6122 452 0.8141 0.0559 0.8141 0.9023
No log 4.6327 454 0.8055 0.1423 0.8055 0.8975
No log 4.6531 456 0.9211 0.1385 0.9211 0.9598
No log 4.6735 458 0.9913 0.0576 0.9913 0.9956
No log 4.6939 460 0.8411 0.1475 0.8411 0.9171
No log 4.7143 462 0.7647 0.1097 0.7647 0.8745
No log 4.7347 464 0.7704 0.1047 0.7704 0.8777
No log 4.7551 466 0.7621 0.1047 0.7621 0.8730
No log 4.7755 468 0.7605 0.1097 0.7605 0.8721
No log 4.7959 470 0.7796 0.0639 0.7796 0.8829
No log 4.8163 472 0.7730 0.0639 0.7730 0.8792
No log 4.8367 474 0.7801 0.1003 0.7801 0.8832
No log 4.8571 476 0.7623 0.1146 0.7623 0.8731
No log 4.8776 478 0.7581 0.0600 0.7581 0.8707
No log 4.8980 480 0.7661 0.0600 0.7661 0.8753
No log 4.9184 482 0.7781 0.0600 0.7781 0.8821
No log 4.9388 484 0.8411 0.1342 0.8411 0.9171
No log 4.9592 486 0.8162 0.0562 0.8162 0.9034
No log 4.9796 488 0.7945 0.0357 0.7945 0.8913
No log 5.0 490 0.8302 0.0580 0.8302 0.9111
No log 5.0204 492 0.8022 0.0930 0.8022 0.8956
No log 5.0408 494 0.8151 0.0959 0.8151 0.9029
No log 5.0612 496 1.0330 0.1396 1.0330 1.0164
No log 5.0816 498 1.0893 0.0666 1.0893 1.0437
0.3214 5.1020 500 0.9070 0.0377 0.9070 0.9523
0.3214 5.1224 502 0.7443 0.1097 0.7443 0.8628
0.3214 5.1429 504 0.7350 0.0914 0.7350 0.8573
0.3214 5.1633 506 0.7599 0.0449 0.7599 0.8717
0.3214 5.1837 508 0.7717 0.1196 0.7717 0.8785
0.3214 5.2041 510 0.8452 0.1467 0.8452 0.9193
0.3214 5.2245 512 0.8857 0.1467 0.8857 0.9411
0.3214 5.2449 514 0.8426 0.0509 0.8426 0.9179
0.3214 5.2653 516 0.8526 0.1718 0.8526 0.9234
0.3214 5.2857 518 0.8781 0.0123 0.8781 0.9371
0.3214 5.3061 520 0.8539 0.1050 0.8539 0.9241
0.3214 5.3265 522 0.8836 0.1467 0.8836 0.9400
0.3214 5.3469 524 0.9024 0.1846 0.9024 0.9499
0.3214 5.3673 526 0.8239 0.1701 0.8239 0.9077
0.3214 5.3878 528 0.7887 0.0257 0.7887 0.8881
0.3214 5.4082 530 0.7959 0.0889 0.7959 0.8922
0.3214 5.4286 532 0.7873 0.1340 0.7873 0.8873
0.3214 5.4490 534 0.7659 0.0879 0.7659 0.8751
0.3214 5.4694 536 0.7708 0.0639 0.7708 0.8780
0.3214 5.4898 538 0.8265 0.1395 0.8265 0.9091
0.3214 5.5102 540 0.8160 0.0959 0.8160 0.9033
0.3214 5.5306 542 0.7699 0.0690 0.7699 0.8774
0.3214 5.5510 544 0.7655 0.1815 0.7655 0.8749
0.3214 5.5714 546 0.7483 0.1304 0.7483 0.8650
0.3214 5.5918 548 0.7930 0.0999 0.7930 0.8905
0.3214 5.6122 550 1.0539 0.0309 1.0539 1.0266
0.3214 5.6327 552 1.1108 0.0162 1.1108 1.0540
0.3214 5.6531 554 1.0294 0.0366 1.0294 1.0146
0.3214 5.6735 556 0.8007 0.1395 0.8007 0.8948
0.3214 5.6939 558 0.7603 0.0432 0.7603 0.8720
0.3214 5.7143 560 0.8295 0.0081 0.8295 0.9108
0.3214 5.7347 562 0.8056 0.0056 0.8056 0.8976
0.3214 5.7551 564 0.7510 0.0914 0.7510 0.8666
0.3214 5.7755 566 0.7482 0.1047 0.7482 0.8650
0.3214 5.7959 568 0.7373 0.1047 0.7373 0.8587
0.3214 5.8163 570 0.7303 0.0914 0.7303 0.8546
0.3214 5.8367 572 0.7294 0.0914 0.7294 0.8540
0.3214 5.8571 574 0.7434 0.0355 0.7434 0.8622
0.3214 5.8776 576 0.7593 0.0247 0.7593 0.8714
0.3214 5.8980 578 0.7945 0.0956 0.7945 0.8914
0.3214 5.9184 580 0.7814 0.0562 0.7814 0.8840
0.3214 5.9388 582 0.7888 0.0269 0.7888 0.8881
0.3214 5.9592 584 0.8184 -0.0082 0.8184 0.9047
0.3214 5.9796 586 0.8164 0.0226 0.8164 0.9036
0.3214 6.0 588 0.8004 0.0152 0.8004 0.8946
0.3214 6.0204 590 0.7791 0.0236 0.7791 0.8827

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1