ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (the final logged checkpoint, step 516; a metric-computation sketch follows the list):

  • Loss: 0.9081
  • QWK (quadratic weighted kappa): 0.0670
  • MSE: 0.9081
  • RMSE: 0.9529
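Loss equals MSE here, which is consistent with a single-output regression head trained with MSELoss. A minimal sketch of how these metrics can be computed from model predictions, assuming integer-valued gold scores and using scikit-learn's cohen_kappa_score for QWK:

```python
# Hedged sketch: assumes ordinal integer gold scores and real-valued
# predictions that are rounded onto the same scale for QWK.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    return {
        # QWK compares rounded predictions to gold labels on an ordinal scale
        "qwk": cohen_kappa_score(
            labels.round().astype(int),
            preds.round().astype(int),
            weights="quadratic",
        ),
        "mse": float(mse),
        "rmse": float(np.sqrt(mse)),
    }
```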

Model description

More information needed

Intended uses & limitations

More information needed
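No usage guidance is provided in the card. The following is a hypothetical inference sketch, assuming the checkpoint exposes a single-output regression head (consistent with Loss == MSE above) and is loaded by its Hub repo ID:

```python
# Hypothetical usage sketch; the regression-head assumption is inferred
# from the metrics above, not stated by the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```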

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
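A minimal sketch reproducing these settings with the Hugging Face Trainer; the dataset, model head, and metric function are assumed and elided. The eval and logging intervals are inferred from the results table below (evaluation at every 2 steps, first training loss logged at step 500), not stated in the card:

```python
# Reproduction sketch under the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",   # inferred: the log below evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,       # inferred: "Training Loss" reads "No log" before step 500
)
```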

Training results

Training Loss   Epoch    Step   Validation Loss   QWK       MSE       RMSE
No log 0.0488 2 3.8651 0.0104 3.8651 1.9660
No log 0.0976 4 2.1298 0.0213 2.1298 1.4594
No log 0.1463 6 1.2484 0.0 1.2484 1.1173
No log 0.1951 8 1.1012 -0.0720 1.1012 1.0494
No log 0.2439 10 0.7365 0.0460 0.7365 0.8582
No log 0.2927 12 0.7494 -0.0101 0.7494 0.8657
No log 0.3415 14 0.7442 -0.0662 0.7442 0.8627
No log 0.3902 16 1.0377 -0.0686 1.0377 1.0187
No log 0.4390 18 1.6287 0.0172 1.6287 1.2762
No log 0.4878 20 1.5361 0.0172 1.5361 1.2394
No log 0.5366 22 1.0249 -0.0117 1.0249 1.0124
No log 0.5854 24 0.8673 0.0486 0.8673 0.9313
No log 0.6341 26 0.8726 0.1316 0.8726 0.9341
No log 0.6829 28 1.2429 0.0585 1.2429 1.1148
No log 0.7317 30 1.2144 0.0852 1.2144 1.1020
No log 0.7805 32 1.1029 0.0193 1.1029 1.0502
No log 0.8293 34 1.1001 -0.0146 1.1001 1.0489
No log 0.8780 36 1.0166 -0.0746 1.0166 1.0083
No log 0.9268 38 0.9808 -0.0781 0.9808 0.9904
No log 0.9756 40 1.0573 0.0142 1.0573 1.0282
No log 1.0244 42 0.9536 0.0042 0.9536 0.9765
No log 1.0732 44 0.9885 0.0336 0.9885 0.9942
No log 1.1220 46 0.8140 0.1212 0.8140 0.9022
No log 1.1707 48 0.9662 -0.0927 0.9662 0.9829
No log 1.2195 50 0.8738 0.0392 0.8738 0.9348
No log 1.2683 52 0.7635 -0.0316 0.7635 0.8738
No log 1.3171 54 0.8573 -0.0355 0.8573 0.9259
No log 1.3659 56 0.8222 -0.0465 0.8222 0.9068
No log 1.4146 58 0.9239 -0.1145 0.9239 0.9612
No log 1.4634 60 0.9714 -0.0497 0.9714 0.9856
No log 1.5122 62 0.9861 0.1520 0.9861 0.9930
No log 1.5610 64 0.9636 0.1468 0.9636 0.9816
No log 1.6098 66 0.9731 0.1341 0.9731 0.9865
No log 1.6585 68 0.9271 0.1754 0.9271 0.9629
No log 1.7073 70 1.0480 0.1012 1.0480 1.0237
No log 1.7561 72 0.9559 0.1309 0.9559 0.9777
No log 1.8049 74 0.8245 0.0633 0.8245 0.9080
No log 1.8537 76 0.8352 0.1471 0.8352 0.9139
No log 1.9024 78 1.0849 0.1013 1.0849 1.0416
No log 1.9512 80 1.1001 0.1277 1.1001 1.0489
No log 2.0 82 1.0098 0.0431 1.0098 1.0049
No log 2.0488 84 1.0310 0.0906 1.0310 1.0154
No log 2.0976 86 1.0700 0.1013 1.0700 1.0344
No log 2.1463 88 1.0058 0.1013 1.0058 1.0029
No log 2.1951 90 0.8956 0.0822 0.8956 0.9464
No log 2.2439 92 0.8002 0.0449 0.8002 0.8945
No log 2.2927 94 0.7716 -0.0030 0.7716 0.8784
No log 2.3415 96 0.7965 0.0978 0.7965 0.8925
No log 2.3902 98 0.8653 0.0930 0.8653 0.9302
No log 2.4390 100 0.9427 -0.1093 0.9427 0.9709
No log 2.4878 102 0.9672 -0.1501 0.9672 0.9835
No log 2.5366 104 0.9495 -0.0164 0.9495 0.9744
No log 2.5854 106 0.9484 -0.0794 0.9484 0.9738
No log 2.6341 108 1.0152 -0.1119 1.0152 1.0076
No log 2.6829 110 0.9707 0.0178 0.9707 0.9852
No log 2.7317 112 1.2795 -0.0023 1.2795 1.1311
No log 2.7805 114 1.0988 0.0083 1.0988 1.0483
No log 2.8293 116 1.0719 0.0775 1.0719 1.0353
No log 2.8780 118 1.0085 0.0161 1.0085 1.0042
No log 2.9268 120 1.0834 -0.0595 1.0834 1.0409
No log 2.9756 122 1.1516 0.0542 1.1516 1.0731
No log 3.0244 124 0.8849 0.0998 0.8849 0.9407
No log 3.0732 126 1.0663 -0.0090 1.0663 1.0326
No log 3.1220 128 0.9799 0.0363 0.9799 0.9899
No log 3.1707 130 0.8397 0.1443 0.8397 0.9163
No log 3.2195 132 0.9611 -0.0253 0.9611 0.9804
No log 3.2683 134 0.8508 0.0617 0.8508 0.9224
No log 3.3171 136 0.9441 0.0041 0.9441 0.9717
No log 3.3659 138 0.8669 -0.0262 0.8669 0.9311
No log 3.4146 140 0.8317 0.0030 0.8317 0.9120
No log 3.4634 142 0.8919 -0.0116 0.8919 0.9444
No log 3.5122 144 0.8703 -0.0350 0.8703 0.9329
No log 3.5610 146 0.8473 -0.0118 0.8473 0.9205
No log 3.6098 148 0.8859 -0.0195 0.8859 0.9412
No log 3.6585 150 0.8839 -0.0030 0.8839 0.9402
No log 3.7073 152 0.8900 0.0058 0.8900 0.9434
No log 3.7561 154 0.8931 -0.1121 0.8931 0.9450
No log 3.8049 156 0.8986 -0.0086 0.8986 0.9480
No log 3.8537 158 0.9592 0.0307 0.9592 0.9794
No log 3.9024 160 0.9539 -0.1126 0.9539 0.9767
No log 3.9512 162 0.9974 -0.1093 0.9974 0.9987
No log 4.0 164 0.9897 -0.1126 0.9897 0.9948
No log 4.0488 166 1.0040 -0.0470 1.0040 1.0020
No log 4.0976 168 0.9774 -0.0238 0.9774 0.9886
No log 4.1463 170 0.8991 -0.0583 0.8991 0.9482
No log 4.1951 172 1.1009 -0.0877 1.1009 1.0492
No log 4.2439 174 1.1482 -0.0245 1.1482 1.0715
No log 4.2927 176 0.8712 -0.0252 0.8712 0.9334
No log 4.3415 178 0.8553 0.0528 0.8553 0.9248
No log 4.3902 180 1.0396 -0.0204 1.0396 1.0196
No log 4.4390 182 0.9727 0.0665 0.9727 0.9863
No log 4.4878 184 0.9703 -0.0643 0.9703 0.9851
No log 4.5366 186 0.9934 -0.0717 0.9934 0.9967
No log 4.5854 188 0.8963 0.0376 0.8963 0.9468
No log 4.6341 190 0.8748 0.0248 0.8748 0.9353
No log 4.6829 192 0.8617 -0.0761 0.8617 0.9283
No log 4.7317 194 0.7674 0.0922 0.7674 0.8760
No log 4.7805 196 0.7625 0.0414 0.7625 0.8732
No log 4.8293 198 0.7710 0.0414 0.7710 0.8781
No log 4.8780 200 0.8008 0.0879 0.8008 0.8949
No log 4.9268 202 0.8498 0.0633 0.8498 0.9219
No log 4.9756 204 0.9241 0.1697 0.9241 0.9613
No log 5.0244 206 0.9210 0.1290 0.9210 0.9597
No log 5.0732 208 0.9518 0.0113 0.9518 0.9756
No log 5.1220 210 0.9437 0.0866 0.9437 0.9714
No log 5.1707 212 0.9266 0.0851 0.9266 0.9626
No log 5.2195 214 0.9343 -0.0322 0.9343 0.9666
No log 5.2683 216 0.8440 0.0471 0.8440 0.9187
No log 5.3171 218 0.8393 0.0557 0.8393 0.9161
No log 5.3659 220 0.8057 -0.0032 0.8057 0.8976
No log 5.4146 222 0.8507 -0.0699 0.8507 0.9223
No log 5.4634 224 0.8313 -0.0240 0.8313 0.9118
No log 5.5122 226 0.8708 0.0664 0.8708 0.9332
No log 5.5610 228 1.0088 -0.0820 1.0088 1.0044
No log 5.6098 230 0.8836 0.0654 0.8836 0.9400
No log 5.6585 232 0.8719 0.0574 0.8719 0.9337
No log 5.7073 234 0.8645 0.0574 0.8645 0.9298
No log 5.7561 236 0.8021 0.0 0.8021 0.8956
No log 5.8049 238 0.9537 -0.0685 0.9537 0.9766
No log 5.8537 240 0.9047 -0.0442 0.9047 0.9512
No log 5.9024 242 0.8147 -0.0195 0.8147 0.9026
No log 5.9512 244 1.2255 -0.0030 1.2255 1.1070
No log 6.0 246 1.3494 0.0112 1.3494 1.1616
No log 6.0488 248 1.0098 -0.0157 1.0098 1.0049
No log 6.0976 250 0.7877 0.0414 0.7877 0.8875
No log 6.1463 252 0.8525 -0.0451 0.8525 0.9233
No log 6.1951 254 0.8923 -0.0551 0.8923 0.9446
No log 6.2439 256 0.8020 -0.0451 0.8020 0.8955
No log 6.2927 258 0.8663 -0.0336 0.8663 0.9307
No log 6.3415 260 0.9907 -0.0870 0.9907 0.9953
No log 6.3902 262 0.8991 -0.0355 0.8991 0.9482
No log 6.4390 264 0.8898 0.0376 0.8898 0.9433
No log 6.4878 266 0.9060 0.0883 0.9060 0.9518
No log 6.5366 268 0.8491 0.0846 0.8491 0.9215
No log 6.5854 270 0.8158 0.0412 0.8158 0.9032
No log 6.6341 272 0.7877 0.0922 0.7877 0.8875
No log 6.6829 274 0.7793 0.0922 0.7793 0.8828
No log 6.7317 276 0.7924 0.1675 0.7924 0.8902
No log 6.7805 278 0.8443 0.2078 0.8443 0.9188
No log 6.8293 280 0.8587 0.0889 0.8587 0.9267
No log 6.8780 282 0.8876 0.0537 0.8876 0.9421
No log 6.9268 284 0.8872 0.0441 0.8872 0.9419
No log 6.9756 286 0.9035 0.1882 0.9035 0.9505
No log 7.0244 288 0.8778 0.1004 0.8778 0.9369
No log 7.0732 290 0.8037 0.0874 0.8037 0.8965
No log 7.1220 292 0.8300 0.0557 0.8300 0.9110
No log 7.1707 294 0.7803 0.0973 0.7803 0.8833
No log 7.2195 296 0.7977 0.0680 0.7977 0.8931
No log 7.2683 298 0.8179 0.0680 0.8179 0.9044
No log 7.3171 300 0.8268 0.1298 0.8268 0.9093
No log 7.3659 302 0.9186 0.0248 0.9186 0.9584
No log 7.4146 304 0.9516 -0.0035 0.9516 0.9755
No log 7.4634 306 0.8476 0.1734 0.8476 0.9206
No log 7.5122 308 0.8272 0.0214 0.8272 0.9095
No log 7.5610 310 0.8001 0.0214 0.8001 0.8945
No log 7.6098 312 0.7554 0.0454 0.7554 0.8691
No log 7.6585 314 0.7957 -0.0449 0.7957 0.8920
No log 7.7073 316 0.7832 0.0031 0.7832 0.8850
No log 7.7561 318 0.7920 -0.0218 0.7920 0.8899
No log 7.8049 320 0.9869 0.0224 0.9869 0.9934
No log 7.8537 322 1.0158 0.0224 1.0158 1.0079
No log 7.9024 324 0.8640 -0.0699 0.8640 0.9295
No log 7.9512 326 0.9016 -0.0180 0.9016 0.9495
No log 8.0 328 0.9617 -0.1236 0.9617 0.9807
No log 8.0488 330 0.8587 -0.0195 0.8587 0.9267
No log 8.0976 332 0.7640 0.0869 0.7640 0.8740
No log 8.1463 334 0.9722 -0.0163 0.9722 0.9860
No log 8.1951 336 1.2178 0.0695 1.2178 1.1035
No log 8.2439 338 1.1585 0.0536 1.1585 1.0764
No log 8.2927 340 0.8978 -0.0441 0.8978 0.9475
No log 8.3415 342 0.7403 0.1371 0.7403 0.8604
No log 8.3902 344 0.8166 0.0116 0.8166 0.9036
No log 8.4390 346 0.8587 -0.0949 0.8587 0.9267
No log 8.4878 348 0.8136 0.0973 0.8136 0.9020
No log 8.5366 350 0.7978 0.0821 0.7978 0.8932
No log 8.5854 352 0.8414 0.0956 0.8414 0.9173
No log 8.6341 354 0.8151 0.1097 0.8151 0.9028
No log 8.6829 356 0.7458 0.1311 0.7458 0.8636
No log 8.7317 358 0.7264 0.0821 0.7264 0.8523
No log 8.7805 360 0.7510 0.0869 0.7510 0.8666
No log 8.8293 362 0.7847 0.0821 0.7847 0.8858
No log 8.8780 364 0.8340 -0.0218 0.8340 0.9133
No log 8.9268 366 0.8621 0.0175 0.8621 0.9285
No log 8.9756 368 0.8039 0.0269 0.8039 0.8966
No log 9.0244 370 0.7960 0.0987 0.7960 0.8922
No log 9.0732 372 0.8310 -0.0133 0.8310 0.9116
No log 9.1220 374 0.7685 0.0518 0.7685 0.8767
No log 9.1707 376 0.7661 0.1347 0.7661 0.8753
No log 9.2195 378 0.7933 0.0840 0.7933 0.8907
No log 9.2683 380 0.8528 0.0732 0.8528 0.9235
No log 9.3171 382 0.9049 0.0456 0.9049 0.9513
No log 9.3659 384 0.8319 0.1096 0.8319 0.9121
No log 9.4146 386 0.7594 0.0821 0.7594 0.8715
No log 9.4634 388 0.7648 0.0973 0.7648 0.8746
No log 9.5122 390 0.7555 0.0414 0.7555 0.8692
No log 9.5610 392 0.7590 0.0821 0.7590 0.8712
No log 9.6098 394 0.7753 0.1254 0.7753 0.8805
No log 9.6585 396 0.8160 0.1553 0.8160 0.9033
No log 9.7073 398 0.8326 0.0269 0.8326 0.9125
No log 9.7561 400 0.8418 0.1298 0.8418 0.9175
No log 9.8049 402 0.8827 0.0964 0.8827 0.9395
No log 9.8537 404 0.8857 0.0236 0.8857 0.9411
No log 9.9024 406 0.9011 0.0861 0.9011 0.9493
No log 9.9512 408 0.8667 0.0856 0.8667 0.9310
No log 10.0 410 0.8303 0.0509 0.8303 0.9112
No log 10.0488 412 0.7566 0.0031 0.7566 0.8698
No log 10.0976 414 0.6928 -0.0065 0.6928 0.8323
No log 10.1463 416 0.7110 0.1444 0.7110 0.8432
No log 10.1951 418 0.7660 0.1740 0.7660 0.8752
No log 10.2439 420 0.7923 0.1249 0.7923 0.8901
No log 10.2927 422 0.8422 0.0509 0.8422 0.9177
No log 10.3415 424 0.8950 -0.0148 0.8950 0.9461
No log 10.3902 426 0.8452 0.1292 0.8452 0.9194
No log 10.4390 428 0.8944 -0.0262 0.8944 0.9457
No log 10.4878 430 0.9044 0.0095 0.9044 0.9510
No log 10.5366 432 0.8247 0.0247 0.8247 0.9082
No log 10.5854 434 0.7806 0.0768 0.7806 0.8835
No log 10.6341 436 0.7760 0.0768 0.7760 0.8809
No log 10.6829 438 0.8149 0.0183 0.8149 0.9027
No log 10.7317 440 0.8155 0.0152 0.8155 0.9031
No log 10.7805 442 0.7966 -0.0195 0.7966 0.8925
No log 10.8293 444 0.8219 0.1347 0.8219 0.9066
No log 10.8780 446 0.8399 0.1292 0.8399 0.9165
No log 10.9268 448 0.8352 0.0303 0.8352 0.9139
No log 10.9756 450 0.8578 0.1001 0.8578 0.9261
No log 11.0244 452 0.8797 0.0016 0.8797 0.9379
No log 11.0732 454 0.9542 -0.0076 0.9542 0.9768
No log 11.1220 456 0.9305 -0.0008 0.9305 0.9646
No log 11.1707 458 0.8813 -0.0209 0.8813 0.9388
No log 11.2195 460 0.8841 0.0846 0.8841 0.9403
No log 11.2683 462 0.9050 0.0580 0.9050 0.9513
No log 11.3171 464 0.8506 0.1340 0.8506 0.9223
No log 11.3659 466 0.8758 -0.0008 0.8758 0.9358
No log 11.4146 468 0.9247 -0.0143 0.9247 0.9616
No log 11.4634 470 0.8740 0.0316 0.8740 0.9349
No log 11.5122 472 0.7679 0.1311 0.7679 0.8763
No log 11.5610 474 0.7872 0.0 0.7872 0.8872
No log 11.6098 476 0.8255 0.0030 0.8255 0.9085
No log 11.6585 478 0.8296 0.1354 0.8296 0.9108
No log 11.7073 480 0.9201 -0.0441 0.9201 0.9592
No log 11.7561 482 0.9484 -0.0837 0.9484 0.9739
No log 11.8049 484 0.8664 0.0041 0.8664 0.9308
No log 11.8537 486 0.8033 0.1254 0.8033 0.8963
No log 11.9024 488 0.7775 0.1371 0.7775 0.8818
No log 11.9512 490 0.7813 0.1311 0.7813 0.8839
No log 12.0 492 0.8335 0.0525 0.8335 0.9130
No log 12.0488 494 0.8646 0.0016 0.8646 0.9299
No log 12.0976 496 0.8313 0.0196 0.8313 0.9118
No log 12.1463 498 0.8326 0.1287 0.8326 0.9125
0.2918 12.1951 500 0.8529 0.1277 0.8529 0.9235
0.2918 12.2439 502 0.8448 0.1277 0.8448 0.9191
0.2918 12.2927 504 0.8272 0.1333 0.8272 0.9095
0.2918 12.3415 506 0.8160 0.1340 0.8160 0.9033
0.2918 12.3902 508 0.8345 0.0893 0.8345 0.9135
0.2918 12.4390 510 0.8714 -0.0238 0.8714 0.9335
0.2918 12.4878 512 0.8888 0.1277 0.8888 0.9427
0.2918 12.5366 514 0.9271 0.0633 0.9271 0.9628
0.2918 12.5854 516 0.9081 0.0670 0.9081 0.9529

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1