ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k12_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9776
  • Qwk (quadratic weighted kappa): -0.0486
  • Mse (mean squared error): 0.9776
  • Rmse (root mean squared error): 0.9887
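The evaluation code itself is not part of this card, but the three reported metrics are standard. A minimal pure-Python sketch of how they are typically computed for integer essay scores (function names are illustrative, not taken from the training script):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic weights (the 'Qwk' column):
    1 means perfect agreement, 0 chance-level, negative worse than chance."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    # Observed agreement matrix over the rating range [lo, hi].
    observed = [[0.0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - lo][p - lo] += 1
    num_items = float(len(y_true))
    hist_t = Counter(t - lo for t in y_true)
    hist_p = Counter(p - lo for p in y_pred)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2) if n > 1 else 0.0
            expected = hist_t[i] * hist_p[j] / num_items  # independence baseline
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den if den else 1.0

def mse(y_true, y_pred):
    """Mean squared error (the 'Mse' column)."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error (the 'Rmse' column)."""
    return mse(y_true, y_pred) ** 0.5
```

Note that a near-zero or negative Qwk, as reported here, indicates the model's rank agreement with the gold scores is at or below chance level, even though the MSE is moderate.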

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
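The linear lr_scheduler_type decays the learning rate from its peak to zero over the total number of optimizer steps. Judging from the results table (the epoch column reaches 1.0 at step 30), 100 epochs corresponds to roughly 3,000 steps; that step count is an estimate inferred from the table, not stated in the card. A sketch of the schedule, assuming zero warmup steps (the Trainer default):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps (none by default),
    then decay linearly from base_lr to 0 at the final step."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

With base_lr=2e-05 and ~3,000 total steps, the learning rate at the last logged checkpoint (step 542 of the table below) would still be above 1.6e-05, so most of the decay never took effect before the reported final epoch.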

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 3.5131 -0.0047 3.5131 1.8743
No log 0.1333 4 2.4381 -0.0136 2.4381 1.5614
No log 0.2 6 1.8738 -0.0015 1.8738 1.3689
No log 0.2667 8 1.6415 0.0213 1.6415 1.2812
No log 0.3333 10 1.0148 0.0493 1.0148 1.0074
No log 0.4 12 0.7753 -0.1241 0.7753 0.8805
No log 0.4667 14 0.8139 -0.1241 0.8139 0.9021
No log 0.5333 16 0.8538 -0.1257 0.8538 0.9240
No log 0.6 18 1.1198 -0.0234 1.1198 1.0582
No log 0.6667 20 1.1956 -0.0234 1.1956 1.0934
No log 0.7333 22 1.1666 -0.0468 1.1666 1.0801
No log 0.8 24 1.3179 -0.0490 1.3179 1.1480
No log 0.8667 26 1.2304 -0.0479 1.2304 1.1092
No log 0.9333 28 1.0196 0.1120 1.0196 1.0097
No log 1.0 30 0.9136 -0.0852 0.9136 0.9558
No log 1.0667 32 0.9139 -0.1257 0.9139 0.9560
No log 1.1333 34 1.0957 -0.0117 1.0957 1.0468
No log 1.2 36 1.1909 -0.0728 1.1909 1.0913
No log 1.2667 38 1.2048 -0.0490 1.2048 1.0976
No log 1.3333 40 1.1212 -0.0207 1.1212 1.0589
No log 1.4 42 1.0794 -0.0207 1.0794 1.0389
No log 1.4667 44 1.0006 -0.0117 1.0006 1.0003
No log 1.5333 46 0.9367 -0.0031 0.9367 0.9678
No log 1.6 48 0.9499 -0.0982 0.9499 0.9746
No log 1.6667 50 0.9470 -0.0949 0.9470 0.9731
No log 1.7333 52 0.9841 -0.0972 0.9841 0.9920
No log 1.8 54 0.8885 -0.0870 0.8885 0.9426
No log 1.8667 56 0.8609 -0.0833 0.8609 0.9278
No log 1.9333 58 0.9142 -0.0474 0.9142 0.9561
No log 2.0 60 1.2423 0.0080 1.2423 1.1146
No log 2.0667 62 1.5006 -0.0247 1.5006 1.2250
No log 2.1333 64 0.9827 -0.0923 0.9827 0.9913
No log 2.2 66 0.9473 -0.1270 0.9473 0.9733
No log 2.2667 68 1.1466 -0.0149 1.1466 1.0708
No log 2.3333 70 1.2618 -0.0468 1.2618 1.1233
No log 2.4 72 1.4064 -0.0490 1.4064 1.1859
No log 2.4667 74 1.1259 -0.0648 1.1259 1.0611
No log 2.5333 76 0.9266 -0.0056 0.9266 0.9626
No log 2.6 78 0.8329 -0.1738 0.8329 0.9127
No log 2.6667 80 0.8766 -0.2201 0.8766 0.9363
No log 2.7333 82 1.2317 0.0107 1.2317 1.1098
No log 2.8 84 1.0827 -0.0097 1.0827 1.0405
No log 2.8667 86 0.7884 -0.1765 0.7884 0.8879
No log 2.9333 88 0.7640 -0.1765 0.7640 0.8741
No log 3.0 90 0.8561 -0.0371 0.8561 0.9253
No log 3.0667 92 0.9791 -0.0143 0.9791 0.9895
No log 3.1333 94 1.1213 -0.0236 1.1213 1.0589
No log 3.2 96 0.8628 0.0099 0.8628 0.9289
No log 3.2667 98 0.8022 0.0282 0.8022 0.8956
No log 3.3333 100 0.9561 0.0748 0.9561 0.9778
No log 3.4 102 1.5883 -0.0157 1.5883 1.2603
No log 3.4667 104 1.4914 -0.0367 1.4914 1.2212
No log 3.5333 106 0.8354 0.1047 0.8354 0.9140
No log 3.6 108 0.7572 -0.0069 0.7572 0.8701
No log 3.6667 110 0.7803 -0.1158 0.7803 0.8833
No log 3.7333 112 0.8736 0.0191 0.8736 0.9347
No log 3.8 114 0.9521 0.0099 0.9521 0.9758
No log 3.8667 116 0.8729 -0.0264 0.8729 0.9343
No log 3.9333 118 0.9282 0.0099 0.9282 0.9634
No log 4.0 120 0.9595 0.0071 0.9595 0.9795
No log 4.0667 122 0.9910 0.0476 0.9910 0.9955
No log 4.1333 124 1.0764 0.1064 1.0764 1.0375
No log 4.2 126 1.1015 0.0587 1.1015 1.0495
No log 4.2667 128 1.0262 -0.0076 1.0262 1.0130
No log 4.3333 130 0.8181 -0.0690 0.8181 0.9045
No log 4.4 132 0.8146 -0.0628 0.8146 0.9025
No log 4.4667 134 0.8775 0.0956 0.8775 0.9367
No log 4.5333 136 0.9801 0.0651 0.9801 0.9900
No log 4.6 138 1.0972 -0.0597 1.0972 1.0475
No log 4.6667 140 0.9882 0.0200 0.9882 0.9941
No log 4.7333 142 0.8248 0.1495 0.8248 0.9082
No log 4.8 144 0.8245 0.1001 0.8245 0.9080
No log 4.8667 146 0.7699 0.0863 0.7699 0.8774
No log 4.9333 148 0.8749 0.0525 0.8749 0.9354
No log 5.0 150 1.0273 0.0304 1.0273 1.0136
No log 5.0667 152 0.8164 -0.0228 0.8164 0.9035
No log 5.1333 154 0.7985 -0.1081 0.7985 0.8936
No log 5.2 156 0.8151 -0.1100 0.8151 0.9028
No log 5.2667 158 0.9998 0.0424 0.9998 0.9999
No log 5.3333 160 1.1507 -0.0877 1.1507 1.0727
No log 5.4 162 1.2389 -0.1848 1.2389 1.1131
No log 5.4667 164 1.0366 -0.0031 1.0366 1.0181
No log 5.5333 166 0.8563 -0.1535 0.8563 0.9254
No log 5.6 168 0.8931 -0.1208 0.8931 0.9450
No log 5.6667 170 0.8559 -0.1964 0.8559 0.9251
No log 5.7333 172 0.9774 0.0456 0.9774 0.9886
No log 5.8 174 1.0261 0.0727 1.0261 1.0129
No log 5.8667 176 0.9066 -0.0170 0.9066 0.9521
No log 5.9333 178 0.8845 -0.0992 0.8845 0.9405
No log 6.0 180 0.8732 -0.1964 0.8732 0.9344
No log 6.0667 182 0.9828 0.0831 0.9828 0.9914
No log 6.1333 184 1.2192 -0.1240 1.2192 1.1042
No log 6.2 186 1.0906 0.0175 1.0906 1.0443
No log 6.2667 188 0.8674 -0.0204 0.8674 0.9313
No log 6.3333 190 0.8596 0.0282 0.8596 0.9271
No log 6.4 192 0.9030 0.1001 0.9030 0.9503
No log 6.4667 194 0.9673 0.0304 0.9673 0.9835
No log 6.5333 196 0.8846 -0.0583 0.8846 0.9405
No log 6.6 198 0.9039 -0.0163 0.9039 0.9508
No log 6.6667 200 1.0723 -0.0151 1.0723 1.0355
No log 6.7333 202 1.2864 -0.1508 1.2864 1.1342
No log 6.8 204 1.2213 -0.1201 1.2213 1.1051
No log 6.8667 206 0.9545 -0.0209 0.9545 0.9770
No log 6.9333 208 0.9092 -0.1529 0.9092 0.9535
No log 7.0 210 0.9356 -0.0173 0.9356 0.9672
No log 7.0667 212 0.8927 -0.1397 0.8927 0.9448
No log 7.1333 214 1.1073 -0.0877 1.1073 1.0523
No log 7.2 216 1.6210 -0.0961 1.6210 1.2732
No log 7.2667 218 1.5771 -0.0435 1.5771 1.2558
No log 7.3333 220 1.1479 -0.0936 1.1479 1.0714
No log 7.4 222 0.9211 -0.0658 0.9211 0.9597
No log 7.4667 224 0.8975 -0.2126 0.8975 0.9474
No log 7.5333 226 1.0027 -0.0373 1.0027 1.0013
No log 7.6 228 1.0472 -0.0818 1.0472 1.0233
No log 7.6667 230 0.9830 -0.0767 0.9830 0.9915
No log 7.7333 232 0.9349 -0.1668 0.9349 0.9669
No log 7.8 234 1.0282 0.0016 1.0282 1.0140
No log 7.8667 236 1.0299 0.0392 1.0299 1.0148
No log 7.9333 238 0.9387 -0.1334 0.9387 0.9689
No log 8.0 240 0.9171 -0.2326 0.9171 0.9577
No log 8.0667 242 0.9730 0.0586 0.9731 0.9864
No log 8.1333 244 1.1650 -0.1186 1.1650 1.0793
No log 8.2 246 1.1539 -0.0482 1.1539 1.0742
No log 8.2667 248 1.0349 -0.1152 1.0349 1.0173
No log 8.3333 250 0.9398 -0.1675 0.9398 0.9695
No log 8.4 252 0.9130 -0.1233 0.9130 0.9555
No log 8.4667 254 0.9593 -0.0766 0.9593 0.9794
No log 8.5333 256 0.9160 -0.1233 0.9160 0.9571
No log 8.6 258 0.9095 -0.1233 0.9095 0.9537
No log 8.6667 260 0.9293 -0.0240 0.9293 0.9640
No log 8.7333 262 0.9654 -0.0753 0.9654 0.9825
No log 8.8 264 1.0399 0.0099 1.0399 1.0198
No log 8.8667 266 0.9838 -0.1180 0.9838 0.9919
No log 8.9333 268 0.9702 -0.2649 0.9702 0.9850
No log 9.0 270 0.9467 -0.2108 0.9467 0.9730
No log 9.0667 272 0.8985 -0.1538 0.8985 0.9479
No log 9.1333 274 0.9196 -0.1662 0.9196 0.9590
No log 9.2 276 1.1337 -0.1236 1.1337 1.0648
No log 9.2667 278 1.2378 -0.0966 1.2378 1.1125
No log 9.3333 280 1.0888 -0.0892 1.0888 1.0435
No log 9.4 282 0.9050 -0.1172 0.9050 0.9513
No log 9.4667 284 0.8758 -0.2884 0.8758 0.9358
No log 9.5333 286 0.8855 -0.2997 0.8855 0.9410
No log 9.6 288 0.8558 -0.2067 0.8558 0.9251
No log 9.6667 290 0.8522 -0.1163 0.8522 0.9232
No log 9.7333 292 1.1085 -0.1232 1.1085 1.0528
No log 9.8 294 1.2754 -0.0961 1.2754 1.1293
No log 9.8667 296 1.2014 -0.1238 1.2014 1.0961
No log 9.9333 298 1.0478 -0.0471 1.0478 1.0236
No log 10.0 300 0.9106 -0.1111 0.9106 0.9543
No log 10.0667 302 0.8955 -0.1531 0.8955 0.9463
No log 10.1333 304 0.8917 -0.1535 0.8917 0.9443
No log 10.2 306 0.9311 -0.0316 0.9311 0.9649
No log 10.2667 308 0.9587 0.0476 0.9587 0.9791
No log 10.3333 310 0.8815 0.0680 0.8815 0.9389
No log 10.4 312 0.8335 -0.2144 0.8335 0.9130
No log 10.4667 314 0.8314 -0.2564 0.8314 0.9118
No log 10.5333 316 0.8088 -0.1697 0.8088 0.8993
No log 10.6 318 0.8758 0.1148 0.8758 0.9358
No log 10.6667 320 1.0104 -0.0558 1.0104 1.0052
No log 10.7333 322 0.9818 0.0642 0.9818 0.9908
No log 10.8 324 0.8745 -0.0252 0.8745 0.9351
No log 10.8667 326 0.8403 -0.1706 0.8403 0.9167
No log 10.9333 328 0.8489 -0.1706 0.8489 0.9213
No log 11.0 330 0.8710 -0.0228 0.8710 0.9333
No log 11.0667 332 0.9946 -0.1214 0.9946 0.9973
No log 11.1333 334 1.0534 -0.1276 1.0534 1.0263
No log 11.2 336 0.9664 -0.0033 0.9664 0.9831
No log 11.2667 338 0.8673 0.0225 0.8673 0.9313
No log 11.3333 340 0.8289 -0.0240 0.8289 0.9104
No log 11.4 342 0.8358 0.0670 0.8358 0.9142
No log 11.4667 344 0.8397 0.0628 0.8397 0.9164
No log 11.5333 346 0.8100 -0.0188 0.8100 0.9000
No log 11.6 348 0.7887 -0.1223 0.7887 0.8881
No log 11.6667 350 0.7983 -0.1158 0.7983 0.8935
No log 11.7333 352 0.8266 0.0723 0.8266 0.9091
No log 11.8 354 0.8676 0.0956 0.8676 0.9315
No log 11.8667 356 0.9649 0.0233 0.9649 0.9823
No log 11.9333 358 1.0372 -0.0558 1.0372 1.0184
No log 12.0 360 0.9375 0.0676 0.9375 0.9682
No log 12.0667 362 0.8667 0.0953 0.8667 0.9310
No log 12.1333 364 0.8363 -0.0690 0.8363 0.9145
No log 12.2 366 0.8260 -0.1675 0.8260 0.9089
No log 12.2667 368 0.8511 -0.0690 0.8511 0.9225
No log 12.3333 370 0.9107 0.0152 0.9107 0.9543
No log 12.4 372 0.9321 0.0490 0.9321 0.9654
No log 12.4667 374 0.8635 -0.0690 0.8635 0.9292
No log 12.5333 376 0.8385 -0.1168 0.8385 0.9157
No log 12.6 378 0.8506 -0.0228 0.8506 0.9223
No log 12.6667 380 0.9131 0.0123 0.9131 0.9556
No log 12.7333 382 1.0454 0.0233 1.0454 1.0225
No log 12.8 384 1.0903 -0.0583 1.0903 1.0442
No log 12.8667 386 1.0069 -0.0182 1.0069 1.0035
No log 12.9333 388 0.8618 0.1449 0.8618 0.9283
No log 13.0 390 0.8220 0.1148 0.8220 0.9066
No log 13.0667 392 0.7961 -0.0660 0.7961 0.8922
No log 13.1333 394 0.8033 -0.1158 0.8033 0.8963
No log 13.2 396 0.8689 0.1096 0.8689 0.9322
No log 13.2667 398 1.0573 -0.0513 1.0573 1.0283
No log 13.3333 400 1.0554 -0.0513 1.0554 1.0273
No log 13.4 402 1.0422 -0.0118 1.0422 1.0209
No log 13.4667 404 0.9958 -0.0052 0.9958 0.9979
No log 13.5333 406 0.9793 0.0438 0.9793 0.9896
No log 13.6 408 1.0005 0.0016 1.0005 1.0003
No log 13.6667 410 1.0218 -0.0425 1.0218 1.0108
No log 13.7333 412 1.0467 -0.0513 1.0467 1.0231
No log 13.8 414 1.0040 -0.0118 1.0040 1.0020
No log 13.8667 416 0.9532 0.0016 0.9532 0.9763
No log 13.9333 418 0.9559 -0.0008 0.9559 0.9777
No log 14.0 420 1.0258 -0.0200 1.0258 1.0128
No log 14.0667 422 1.0561 -0.0200 1.0561 1.0277
No log 14.1333 424 1.0139 -0.0441 1.0139 1.0069
No log 14.2 426 0.9686 -0.0408 0.9686 0.9842
No log 14.2667 428 0.9869 -0.0076 0.9869 0.9934
No log 14.3333 430 0.9903 0.0362 0.9903 0.9952
No log 14.4 432 0.9468 0.0490 0.9468 0.9730
No log 14.4667 434 0.9366 0.0490 0.9366 0.9678
No log 14.5333 436 0.9002 0.0095 0.9002 0.9488
No log 14.6 438 0.8666 -0.1656 0.8666 0.9309
No log 14.6667 440 0.8570 -0.1172 0.8570 0.9257
No log 14.7333 442 0.8645 -0.1176 0.8645 0.9298
No log 14.8 444 0.8718 -0.1176 0.8718 0.9337
No log 14.8667 446 0.9087 0.0867 0.9087 0.9533
No log 14.9333 448 0.9223 0.0748 0.9223 0.9604
No log 15.0 450 0.9028 0.0442 0.9028 0.9501
No log 15.0667 452 0.8608 0.0191 0.8608 0.9278
No log 15.1333 454 0.8544 -0.0690 0.8544 0.9243
No log 15.2 456 0.8816 0.0670 0.8816 0.9389
No log 15.2667 458 0.9061 -0.0295 0.9061 0.9519
No log 15.3333 460 0.9230 -0.0755 0.9230 0.9607
No log 15.4 462 0.9615 -0.0425 0.9615 0.9806
No log 15.4667 464 1.0388 -0.0143 1.0388 1.0192
No log 15.5333 466 1.0613 -0.0558 1.0613 1.0302
No log 15.6 468 0.9523 -0.0122 0.9523 0.9758
No log 15.6667 470 0.8562 0.0129 0.8562 0.9253
No log 15.7333 472 0.8286 -0.0179 0.8286 0.9103
No log 15.8 474 0.8430 -0.0179 0.8430 0.9182
No log 15.8667 476 0.8688 -0.0704 0.8688 0.9321
No log 15.9333 478 0.8951 0.0600 0.8951 0.9461
No log 16.0 480 0.9302 0.0456 0.9302 0.9645
No log 16.0667 482 0.9860 0.0304 0.9860 0.9930
No log 16.1333 484 0.9396 -0.0031 0.9396 0.9693
No log 16.2 486 0.8675 0.0600 0.8675 0.9314
No log 16.2667 488 0.8016 -0.0179 0.8016 0.8953
No log 16.3333 490 0.7812 -0.1697 0.7812 0.8839
No log 16.4 492 0.7698 -0.1223 0.7698 0.8774
No log 16.4667 494 0.8518 0.1047 0.8518 0.9229
No log 16.5333 496 1.0567 -0.0916 1.0567 1.0279
No log 16.6 498 1.1061 -0.1281 1.1061 1.0517
0.2931 16.6667 500 1.0393 -0.0143 1.0393 1.0195
0.2931 16.7333 502 0.9329 0.0525 0.9329 0.9659
0.2931 16.8 504 0.9059 0.0600 0.9059 0.9518
0.2931 16.8667 506 0.9510 0.0562 0.9510 0.9752
0.2931 16.9333 508 1.0373 -0.0076 1.0373 1.0185
0.2931 17.0 510 1.1550 -0.0595 1.1550 1.0747
0.2931 17.0667 512 1.2322 -0.1282 1.2322 1.1100
0.2931 17.1333 514 1.1344 -0.0943 1.1344 1.0651
0.2931 17.2 516 0.9585 0.0377 0.9585 0.9790
0.2931 17.2667 518 0.8701 0.0600 0.8701 0.9328
0.2931 17.3333 520 0.8325 0.0680 0.8325 0.9124
0.2931 17.4 522 0.8413 0.1096 0.8413 0.9172
0.2931 17.4667 524 0.8374 0.0680 0.8374 0.9151
0.2931 17.5333 526 0.8357 -0.0550 0.8357 0.9142
0.2931 17.6 528 0.8365 -0.1040 0.8365 0.9146
0.2931 17.6667 530 0.8296 0.0690 0.8296 0.9108
0.2931 17.7333 532 0.9083 0.0913 0.9083 0.9531
0.2931 17.8 534 1.0396 -0.0885 1.0396 1.0196
0.2931 17.8667 536 1.1124 -0.0936 1.1124 1.0547
0.2931 17.9333 538 1.0843 -0.0936 1.0843 1.0413
0.2931 18.0 540 1.0450 -0.0583 1.0450 1.0223
0.2931 18.0667 542 0.9776 -0.0486 0.9776 0.9887

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 135M params (F32, Safetensors)
