ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not documented in this card. It achieves the following results on the evaluation set:

  • Loss: 0.7201
  • Qwk: -0.0532
  • Mse: 0.7201
  • Rmse: 0.8486
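
The reported Qwk (quadratic weighted kappa), Mse, and Rmse can be reproduced with scikit-learn. A minimal sketch, assuming integer organization scores as labels (the function name `eval_metrics` is illustrative, not from the training script):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Compute the three evaluation metrics reported above."""
    # Quadratic weighted kappa over integer score labels
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```

If the model uses a regression head, predictions would typically be rounded to the nearest valid score before computing QWK, while MSE/RMSE are computed on the raw outputs.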

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
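
A minimal sketch of how these hyperparameters map onto the Hugging Face `TrainingArguments`; `output_dir` and the `Trainer` wiring are placeholders, not taken from this card:

```python
from transformers import TrainingArguments

# The hyperparameters above, mapped onto TrainingArguments
# (output_dir is a placeholder name)
args = TrainingArguments(
    output_dir="arabert_task3_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
# A Trainer would then be built from this config plus the model and datasets,
# e.g. Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
```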

Training results

Training loss is only logged every 500 steps, so it appears as "No log" for the rows before step 500 below.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0645 2 3.4821 0.0141 3.4821 1.8660
No log 0.1290 4 1.6542 -0.0276 1.6542 1.2861
No log 0.1935 6 1.3267 0.0 1.3267 1.1518
No log 0.2581 8 0.9717 -0.0345 0.9717 0.9857
No log 0.3226 10 0.8341 -0.0309 0.8341 0.9133
No log 0.3871 12 0.8454 0.0525 0.8454 0.9194
No log 0.4516 14 1.2386 -0.0170 1.2386 1.1129
No log 0.5161 16 1.1888 -0.0424 1.1888 1.0903
No log 0.5806 18 1.3351 0.1149 1.3351 1.1555
No log 0.6452 20 1.7678 0.1042 1.7678 1.3296
No log 0.7097 22 1.1318 0.0431 1.1318 1.0639
No log 0.7742 24 0.7371 0.0460 0.7371 0.8586
No log 0.8387 26 0.7519 -0.0571 0.7519 0.8671
No log 0.9032 28 0.7362 -0.1148 0.7362 0.8580
No log 0.9677 30 0.7422 -0.0131 0.7422 0.8615
No log 1.0323 32 1.0683 0.0710 1.0683 1.0336
No log 1.0968 34 1.2847 0.0411 1.2847 1.1335
No log 1.1613 36 1.0628 0.1223 1.0628 1.0309
No log 1.2258 38 0.8484 0.0873 0.8484 0.9211
No log 1.2903 40 0.6777 0.1259 0.6777 0.8232
No log 1.3548 42 0.6651 0.0555 0.6651 0.8155
No log 1.4194 44 0.7570 -0.0513 0.7570 0.8701
No log 1.4839 46 0.8008 -0.0660 0.8008 0.8949
No log 1.5484 48 0.8237 -0.0264 0.8237 0.9076
No log 1.6129 50 0.8826 -0.0852 0.8826 0.9395
No log 1.6774 52 0.8708 -0.0408 0.8708 0.9332
No log 1.7419 54 0.8709 -0.1135 0.8709 0.9332
No log 1.8065 56 0.8509 -0.0629 0.8509 0.9225
No log 1.8710 58 0.8787 0.1231 0.8787 0.9374
No log 1.9355 60 0.8972 0.0378 0.8972 0.9472
No log 2.0 62 0.8648 0.0119 0.8648 0.9299
No log 2.0645 64 1.0878 0.0355 1.0878 1.0430
No log 2.1290 66 0.9707 -0.0014 0.9707 0.9852
No log 2.1935 68 0.7685 0.1539 0.7685 0.8766
No log 2.2581 70 0.9541 0.0182 0.9541 0.9768
No log 2.3226 72 0.8046 0.0407 0.8046 0.8970
No log 2.3871 74 0.7996 0.1277 0.7996 0.8942
No log 2.4516 76 0.9916 -0.0981 0.9916 0.9958
No log 2.5161 78 0.9996 -0.0311 0.9996 0.9998
No log 2.5806 80 0.9427 0.0300 0.9427 0.9709
No log 2.6452 82 1.0035 0.0583 1.0035 1.0017
No log 2.7097 84 1.0941 0.0353 1.0941 1.0460
No log 2.7742 86 1.4153 0.0868 1.4153 1.1897
No log 2.8387 88 1.3055 0.0582 1.3055 1.1426
No log 2.9032 90 1.0009 0.0138 1.0009 1.0004
No log 2.9677 92 0.8478 -0.0735 0.8478 0.9208
No log 3.0323 94 1.3674 0.0887 1.3674 1.1694
No log 3.0968 96 1.3072 0.0709 1.3072 1.1433
No log 3.1613 98 0.9087 0.0545 0.9087 0.9533
No log 3.2258 100 0.7090 -0.0131 0.7090 0.8420
No log 3.2903 102 0.7512 0.0031 0.7512 0.8667
No log 3.3548 104 0.8227 0.0179 0.8227 0.9070
No log 3.4194 106 0.8197 -0.0573 0.8197 0.9054
No log 3.4839 108 0.8867 0.0870 0.8867 0.9416
No log 3.5484 110 0.9315 0.0206 0.9315 0.9652
No log 3.6129 112 1.3002 0.1394 1.3002 1.1402
No log 3.6774 114 1.4854 0.0602 1.4854 1.2188
No log 3.7419 116 1.3536 0.0308 1.3536 1.1634
No log 3.8065 118 1.1168 0.1578 1.1168 1.0568
No log 3.8710 120 0.8682 0.0627 0.8682 0.9318
No log 3.9355 122 0.8918 0.0421 0.8918 0.9444
No log 4.0 124 0.8122 0.0670 0.8122 0.9012
No log 4.0645 126 0.8723 0.0549 0.8723 0.9340
No log 4.1290 128 0.9288 0.0405 0.9288 0.9637
No log 4.1935 130 0.8604 0.0285 0.8604 0.9276
No log 4.2581 132 0.7932 0.0361 0.7932 0.8906
No log 4.3226 134 0.8611 -0.0291 0.8611 0.9280
No log 4.3871 136 0.8363 0.0633 0.8363 0.9145
No log 4.4516 138 0.9312 0.0721 0.9312 0.9650
No log 4.5161 140 1.1628 0.0531 1.1628 1.0783
No log 4.5806 142 1.0405 -0.0120 1.0405 1.0200
No log 4.6452 144 0.8132 0.1034 0.8132 0.9018
No log 4.7097 146 0.8408 0.1504 0.8408 0.9170
No log 4.7742 148 0.8570 0.0682 0.8570 0.9258
No log 4.8387 150 0.8840 0.1079 0.8840 0.9402
No log 4.9032 152 1.0297 -0.0245 1.0297 1.0147
No log 4.9677 154 1.0656 0.0416 1.0656 1.0323
No log 5.0323 156 0.9828 0.1039 0.9828 0.9914
No log 5.0968 158 0.9129 0.0239 0.9129 0.9554
No log 5.1613 160 0.8991 0.0214 0.8991 0.9482
No log 5.2258 162 0.8595 0.0784 0.8595 0.9271
No log 5.2903 164 0.8566 0.1093 0.8566 0.9255
No log 5.3548 166 0.8501 0.0437 0.8501 0.9220
No log 5.4194 168 1.0332 0.0431 1.0332 1.0165
No log 5.4839 170 1.0869 0.0476 1.0869 1.0425
No log 5.5484 172 0.9421 -0.0089 0.9421 0.9706
No log 5.6129 174 0.8184 0.1267 0.8184 0.9046
No log 5.6774 176 0.7970 0.1267 0.7970 0.8927
No log 5.7419 178 0.8182 0.0866 0.8182 0.9046
No log 5.8065 180 0.7996 0.1228 0.7996 0.8942
No log 5.8710 182 0.8262 0.1539 0.8262 0.9090
No log 5.9355 184 0.8045 0.1224 0.8045 0.8969
No log 6.0 186 0.8947 0.0319 0.8947 0.9459
No log 6.0645 188 0.9650 0.0125 0.9650 0.9823
No log 6.1290 190 0.8516 -0.0089 0.8516 0.9228
No log 6.1935 192 0.7821 0.1277 0.7821 0.8843
No log 6.2581 194 0.8045 0.1224 0.8045 0.8969
No log 6.3226 196 0.8711 0.0613 0.8711 0.9333
No log 6.3871 198 0.9613 0.1078 0.9613 0.9804
No log 6.4516 200 0.9058 0.1037 0.9058 0.9517
No log 6.5161 202 0.9969 0.1471 0.9969 0.9984
No log 6.5806 204 1.0158 0.1155 1.0158 1.0079
No log 6.6452 206 0.8521 0.0535 0.8521 0.9231
No log 6.7097 208 0.7990 0.1272 0.7990 0.8938
No log 6.7742 210 0.8209 0.0140 0.8209 0.9060
No log 6.8387 212 0.7926 0.0140 0.7926 0.8903
No log 6.9032 214 0.7757 0.1827 0.7757 0.8807
No log 6.9677 216 0.9929 0.0159 0.9929 0.9964
No log 7.0323 218 1.0742 -0.0120 1.0742 1.0364
No log 7.0968 220 0.9142 0.0767 0.9142 0.9561
No log 7.1613 222 0.7867 0.1714 0.7867 0.8870
No log 7.2258 224 0.7732 0.1267 0.7732 0.8793
No log 7.2903 226 0.7888 0.0376 0.7888 0.8881
No log 7.3548 228 0.8089 0.1267 0.8089 0.8994
No log 7.4194 230 0.8461 0.2036 0.8461 0.9198
No log 7.4839 232 0.8420 0.2015 0.8420 0.9176
No log 7.5484 234 0.7988 0.2057 0.7988 0.8937
No log 7.6129 236 0.7900 0.1224 0.7900 0.8888
No log 7.6774 238 0.7808 0.0376 0.7808 0.8836
No log 7.7419 240 0.7704 0.0289 0.7704 0.8777
No log 7.8065 242 0.7623 0.1144 0.7623 0.8731
No log 7.8710 244 0.7446 0.0465 0.7446 0.8629
No log 7.9355 246 0.7467 0.1387 0.7467 0.8641
No log 8.0 248 0.7571 0.0690 0.7571 0.8701
No log 8.0645 250 0.8259 -0.0054 0.8259 0.9088
No log 8.1290 252 0.7477 0.0175 0.7477 0.8647
No log 8.1935 254 0.7783 0.1810 0.7783 0.8822
No log 8.2581 256 0.7657 0.0606 0.7657 0.8750
No log 8.3226 258 0.7662 0.0606 0.7662 0.8753
No log 8.3871 260 0.7145 0.0532 0.7145 0.8453
No log 8.4516 262 0.6959 0.1347 0.6959 0.8342
No log 8.5161 264 0.7389 0.0639 0.7389 0.8596
No log 8.5806 266 0.7672 0.0909 0.7672 0.8759
No log 8.6452 268 0.6995 0.0840 0.6995 0.8364
No log 8.7097 270 0.8496 0.1120 0.8496 0.9217
No log 8.7742 272 0.9738 0.0815 0.9738 0.9868
No log 8.8387 274 0.8483 0.1122 0.8483 0.9210
No log 8.9032 276 0.7214 0.0889 0.7214 0.8494
No log 8.9677 278 0.7100 0.0879 0.7100 0.8426
No log 9.0323 280 0.7313 0.1379 0.7313 0.8551
No log 9.0968 282 0.8100 0.1762 0.8100 0.9000
No log 9.1613 284 0.8627 0.1355 0.8627 0.9288
No log 9.2258 286 0.7938 0.2430 0.7938 0.8910
No log 9.2903 288 0.8282 0.1748 0.8282 0.9101
No log 9.3548 290 0.7743 0.2132 0.7743 0.8800
No log 9.4194 292 0.7294 0.0357 0.7294 0.8541
No log 9.4839 294 0.7295 -0.0029 0.7295 0.8541
No log 9.5484 296 0.7808 0.1451 0.7808 0.8836
No log 9.6129 298 0.7656 0.1460 0.7656 0.8750
No log 9.6774 300 0.7170 0.0303 0.7170 0.8468
No log 9.7419 302 0.7723 0.0095 0.7723 0.8788
No log 9.8065 304 0.7528 0.1146 0.7528 0.8676
No log 9.8710 306 0.7990 0.0944 0.7990 0.8939
No log 9.9355 308 0.8733 0.0248 0.8733 0.9345
No log 10.0 310 0.7956 0.0940 0.7956 0.8919
No log 10.0645 312 0.8020 0.0456 0.8020 0.8956
No log 10.1290 314 0.8574 0.0333 0.8574 0.9260
No log 10.1935 316 0.7759 -0.0355 0.7759 0.8808
No log 10.2581 318 0.7746 0.2258 0.7746 0.8801
No log 10.3226 320 0.8601 0.1165 0.8601 0.9274
No log 10.3871 322 0.8221 0.0693 0.8221 0.9067
No log 10.4516 324 0.7392 0.0926 0.7392 0.8597
No log 10.5161 326 0.7403 -0.0145 0.7403 0.8604
No log 10.5806 328 0.7561 0.0357 0.7561 0.8696
No log 10.6452 330 0.7793 0.1272 0.7793 0.8828
No log 10.7097 332 0.7863 0.1321 0.7863 0.8867
No log 10.7742 334 0.7917 0.1425 0.7917 0.8898
No log 10.8387 336 0.8410 -0.0138 0.8410 0.9171
No log 10.9032 338 0.7835 0.0987 0.7835 0.8852
No log 10.9677 340 0.7456 0.0303 0.7456 0.8635
No log 11.0323 342 0.7433 0.0236 0.7433 0.8621
No log 11.0968 344 0.7381 0.0783 0.7381 0.8591
No log 11.1613 346 0.7594 0.0898 0.7594 0.8714
No log 11.2258 348 0.7457 0.0893 0.7457 0.8635
No log 11.2903 350 0.7417 0.1282 0.7417 0.8612
No log 11.3548 352 0.7388 0.0893 0.7388 0.8595
No log 11.4194 354 0.7875 0.0129 0.7875 0.8874
No log 11.4839 356 0.8085 0.0249 0.8085 0.8992
No log 11.5484 358 0.8004 0.0151 0.8004 0.8947
No log 11.6129 360 0.7952 0.0024 0.7952 0.8917
No log 11.6774 362 0.8194 0.1051 0.8194 0.9052
No log 11.7419 364 0.8576 0.0916 0.8576 0.9261
No log 11.8065 366 0.9821 0.1390 0.9821 0.9910
No log 11.8710 368 0.9040 0.1435 0.9040 0.9508
No log 11.9355 370 0.7838 0.1184 0.7838 0.8853
No log 12.0 372 0.8061 -0.0283 0.8061 0.8978
No log 12.0645 374 0.8070 -0.0316 0.8070 0.8984
No log 12.1290 376 0.7623 0.0700 0.7623 0.8731
No log 12.1935 378 0.7723 0.0123 0.7723 0.8788
No log 12.2581 380 0.7348 0.0639 0.7348 0.8572
No log 12.3226 382 0.6951 -0.0086 0.6951 0.8338
No log 12.3871 384 0.6814 -0.0086 0.6814 0.8255
No log 12.4516 386 0.6818 0.0318 0.6818 0.8257
No log 12.5161 388 0.7297 0.1899 0.7297 0.8542
No log 12.5806 390 0.7027 0.0680 0.7027 0.8383
No log 12.6452 392 0.6723 -0.0152 0.6723 0.8199
No log 12.7097 394 0.6737 -0.0096 0.6737 0.8208
No log 12.7742 396 0.6768 -0.0096 0.6768 0.8227
No log 12.8387 398 0.6808 -0.0096 0.6808 0.8251
No log 12.9032 400 0.6700 -0.0152 0.6700 0.8185
No log 12.9677 402 0.7342 0.2077 0.7342 0.8568
No log 13.0323 404 0.7417 0.1605 0.7417 0.8612
No log 13.0968 406 0.7112 0.0303 0.7112 0.8433
No log 13.1613 408 0.7386 0.1729 0.7386 0.8594
No log 13.2258 410 0.7577 0.1729 0.7577 0.8705
No log 13.2903 412 0.7430 0.1187 0.7430 0.8620
No log 13.3548 414 0.7516 0.0214 0.7516 0.8670
No log 13.4194 416 0.7298 0.0798 0.7298 0.8543
No log 13.4839 418 0.7407 0.0798 0.7407 0.8606
No log 13.5484 420 0.7881 0.1193 0.7881 0.8877
No log 13.6129 422 0.9049 0.0152 0.9049 0.9513
No log 13.6774 424 0.8126 0.1193 0.8126 0.9014
No log 13.7419 426 0.7562 0.1232 0.7562 0.8696
No log 13.8065 428 0.7861 0.0501 0.7861 0.8866
No log 13.8710 430 0.7648 0.1333 0.7648 0.8745
No log 13.9355 432 0.7377 0.0308 0.7377 0.8589
No log 14.0 434 0.7369 0.0308 0.7369 0.8584
No log 14.0645 436 0.7367 -0.0082 0.7367 0.8583
No log 14.1290 438 0.7857 0.2095 0.7857 0.8864
No log 14.1935 440 0.7783 0.1793 0.7783 0.8822
No log 14.2581 442 0.7227 0.1379 0.7227 0.8501
No log 14.3226 444 0.7128 0.0828 0.7128 0.8443
No log 14.3871 446 0.7206 0.0828 0.7206 0.8489
No log 14.4516 448 0.7179 0.1287 0.7179 0.8473
No log 14.5161 450 0.7372 0.0481 0.7372 0.8586
No log 14.5806 452 0.7758 0.1352 0.7758 0.8808
No log 14.6452 454 0.7626 0.0460 0.7626 0.8733
No log 14.7097 456 0.8285 0.0362 0.8285 0.9102
No log 14.7742 458 0.9582 0.0443 0.9582 0.9789
No log 14.8387 460 0.9145 0.0182 0.9145 0.9563
No log 14.9032 462 0.7854 0.1049 0.7854 0.8862
No log 14.9677 464 0.7566 0.0 0.7566 0.8698
No log 15.0323 466 0.7846 0.0469 0.7846 0.8858
No log 15.0968 468 0.7824 0.1138 0.7824 0.8845
No log 15.1613 470 0.8215 0.0016 0.8215 0.9064
No log 15.2258 472 0.8086 0.0913 0.8086 0.8992
No log 15.2903 474 0.7808 -0.0262 0.7808 0.8836
No log 15.3548 476 0.7626 0.1136 0.7626 0.8733
No log 15.4194 478 0.7609 0.0051 0.7609 0.8723
No log 15.4839 480 0.7550 0.0428 0.7550 0.8689
No log 15.5484 482 0.7501 0.1292 0.7501 0.8661
No log 15.6129 484 0.7687 0.0628 0.7687 0.8768
No log 15.6774 486 0.8228 0.1291 0.8228 0.9071
No log 15.7419 488 0.8050 0.0826 0.8050 0.8972
No log 15.8065 490 0.7588 0.1336 0.7588 0.8711
No log 15.8710 492 0.7250 0.0764 0.7250 0.8514
No log 15.9355 494 0.7182 0.1379 0.7182 0.8475
No log 16.0 496 0.7046 0.0889 0.7046 0.8394
No log 16.0645 498 0.7008 0.0934 0.7008 0.8372
0.2877 16.1290 500 0.6967 0.0338 0.6967 0.8347
0.2877 16.1935 502 0.7223 0.0639 0.7223 0.8499
0.2877 16.2581 504 0.7248 0.1148 0.7248 0.8514
0.2877 16.3226 506 0.6979 0.0807 0.6979 0.8354
0.2877 16.3871 508 0.7041 0.0759 0.7041 0.8391
0.2877 16.4516 510 0.7294 -0.0295 0.7294 0.8541
0.2877 16.5161 512 0.8239 0.0986 0.8239 0.9077
0.2877 16.5806 514 0.8743 0.0810 0.8743 0.9350
0.2877 16.6452 516 0.8000 0.0618 0.8000 0.8944
0.2877 16.7097 518 0.6940 0.0776 0.6940 0.8331
0.2877 16.7742 520 0.7027 0.1029 0.7027 0.8383
0.2877 16.8387 522 0.7171 0.0622 0.7171 0.8468
0.2877 16.9032 524 0.6878 0.0513 0.6878 0.8294
0.2877 16.9677 526 0.6960 0.0776 0.6960 0.8343
0.2877 17.0323 528 0.6878 0.0776 0.6878 0.8293
0.2877 17.0968 530 0.6845 0.0874 0.6845 0.8273
0.2877 17.1613 532 0.6873 0.0918 0.6873 0.8290
0.2877 17.2258 534 0.6863 0.0863 0.6863 0.8284
0.2877 17.2903 536 0.7191 0.0549 0.7191 0.8480
0.2877 17.3548 538 0.8014 0.0946 0.8014 0.8952
0.2877 17.4194 540 0.7934 0.1360 0.7934 0.8907
0.2877 17.4839 542 0.7226 0.0146 0.7226 0.8500
0.2877 17.5484 544 0.7120 0.1333 0.7120 0.8438
0.2877 17.6129 546 0.7400 0.1310 0.7400 0.8602
0.2877 17.6774 548 0.7383 0.1310 0.7383 0.8592
0.2877 17.7419 550 0.7081 0.0840 0.7081 0.8415
0.2877 17.8065 552 0.6982 0.0723 0.6982 0.8356
0.2877 17.8710 554 0.6861 0.1362 0.6861 0.8283
0.2877 17.9355 556 0.6797 0.1362 0.6797 0.8244
0.2877 18.0 558 0.6873 0.2283 0.6873 0.8290
0.2877 18.0645 560 0.7011 0.0828 0.7011 0.8373
0.2877 18.1290 562 0.7189 -0.0218 0.7189 0.8479
0.2877 18.1935 564 0.7303 -0.0209 0.7303 0.8546
0.2877 18.2581 566 0.7476 0.0831 0.7476 0.8646
0.2877 18.3226 568 0.7432 0.0831 0.7432 0.8621
0.2877 18.3871 570 0.7432 0.0831 0.7432 0.8621
0.2877 18.4516 572 0.7282 0.0175 0.7282 0.8534
0.2877 18.5161 574 0.7509 0.0798 0.7509 0.8666
0.2877 18.5806 576 0.7797 0.0361 0.7797 0.8830
0.2877 18.6452 578 0.8126 0.0393 0.8126 0.9014
0.2877 18.7097 580 0.8088 0.0376 0.8088 0.8993
0.2877 18.7742 582 0.7968 0.0884 0.7968 0.8926
0.2877 18.8387 584 0.7802 0.0495 0.7802 0.8833
0.2877 18.9032 586 0.8007 0.0558 0.8007 0.8948
0.2877 18.9677 588 0.8312 0.1079 0.8312 0.9117
0.2877 19.0323 590 0.8075 0.0240 0.8075 0.8986
0.2877 19.0968 592 0.8083 0.0291 0.8083 0.8991
0.2877 19.1613 594 0.7474 0.1133 0.7474 0.8645
0.2877 19.2258 596 0.7146 0.0585 0.7146 0.8453
0.2877 19.2903 598 0.7202 0.0428 0.7202 0.8487
0.2877 19.3548 600 0.7435 -0.0614 0.7435 0.8623
0.2877 19.4194 602 0.7440 -0.0132 0.7440 0.8626
0.2877 19.4839 604 0.7535 0.1796 0.7535 0.8680
0.2877 19.5484 606 0.8006 0.1352 0.8006 0.8948
0.2877 19.6129 608 0.8648 0.1442 0.8648 0.9299
0.2877 19.6774 610 0.7974 0.1032 0.7974 0.8930
0.2877 19.7419 612 0.7307 0.0393 0.7307 0.8548
0.2877 19.8065 614 0.7363 0.0282 0.7363 0.8581
0.2877 19.8710 616 0.7333 -0.0228 0.7333 0.8563
0.2877 19.9355 618 0.7287 -0.0170 0.7287 0.8536
0.2877 20.0 620 0.7231 -0.0145 0.7231 0.8503
0.2877 20.0645 622 0.7201 -0.0532 0.7201 0.8486

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Downloads last month: 9
Model size: 135M params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task3_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 3,994 fine-tunes of that base model).