ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7915
  • Qwk: 0.0318
  • Mse: 0.7915
  • Rmse: 0.8897
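The metrics above (quadratic weighted kappa, MSE, RMSE) can be computed with scikit-learn; a minimal sketch, using illustrative labels rather than values from this run:

```python
# Sketch of the evaluation metrics used above; y_true/y_pred are
# illustrative placeholders, not outputs of this model.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 2, 1, 2, 1, 1]

# Qwk: quadratic weighted kappa, appropriate for ordinal scores
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
# Mse/Rmse: squared error and its root
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5
```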

Model description

More information needed

Intended uses & limitations

More information needed
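Since the intended use is not documented, the sketch below only shows how the checkpoint could be loaded; it assumes a sequence-classification head (the exact task head is not stated in this card):

```python
# Minimal loading sketch; assumes a sequence-classification head,
# which this card does not confirm.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run2_AugV5_k18_task3_organization"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # "essay text here"
outputs = model(**inputs)
scores = outputs.logits
```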

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0215 2 3.4211 -0.0047 3.4211 1.8496
No log 0.0430 4 1.8284 0.0172 1.8284 1.3522
No log 0.0645 6 1.0244 -0.0101 1.0244 1.0121
No log 0.0860 8 0.7656 0.0260 0.7656 0.8750
No log 0.1075 10 0.7627 -0.0264 0.7627 0.8734
No log 0.1290 12 0.7332 -0.0690 0.7332 0.8563
No log 0.1505 14 0.7279 0.0541 0.7279 0.8531
No log 0.1720 16 0.7215 0.0723 0.7215 0.8494
No log 0.1935 18 1.0622 0.1211 1.0622 1.0306
No log 0.2151 20 0.8976 0.1152 0.8976 0.9474
No log 0.2366 22 0.7644 -0.0711 0.7644 0.8743
No log 0.2581 24 0.7143 -0.0069 0.7143 0.8452
No log 0.2796 26 0.7774 0.0304 0.7774 0.8817
No log 0.3011 28 1.1562 0.1087 1.1562 1.0752
No log 0.3226 30 0.7096 0.2105 0.7096 0.8424
No log 0.3441 32 0.7412 0.0555 0.7412 0.8609
No log 0.3656 34 0.7387 0.0555 0.7387 0.8595
No log 0.3871 36 0.8077 0.0490 0.8077 0.8987
No log 0.4086 38 1.0793 0.1150 1.0793 1.0389
No log 0.4301 40 1.0100 0.1228 1.0100 1.0050
No log 0.4516 42 0.7402 0.1440 0.7402 0.8604
No log 0.4731 44 0.7015 0.0759 0.7015 0.8376
No log 0.4946 46 0.8385 0.1445 0.8385 0.9157
No log 0.5161 48 1.0091 0.1044 1.0091 1.0046
No log 0.5376 50 0.8600 0.2717 0.8600 0.9274
No log 0.5591 52 1.0138 0.0219 1.0138 1.0069
No log 0.5806 54 1.0805 -0.0060 1.0805 1.0395
No log 0.6022 56 0.8991 0.0271 0.8991 0.9482
No log 0.6237 58 0.8549 0.2476 0.8549 0.9246
No log 0.6452 60 0.8321 0.0842 0.8321 0.9122
No log 0.6667 62 0.7418 -0.0029 0.7418 0.8613
No log 0.6882 64 0.7376 0.1644 0.7376 0.8589
No log 0.7097 66 0.7821 0.0509 0.7821 0.8844
No log 0.7312 68 0.8084 0.1136 0.8084 0.8991
No log 0.7527 70 0.8688 0.1290 0.8688 0.9321
No log 0.7742 72 0.9591 0.1650 0.9591 0.9793
No log 0.7957 74 0.9440 0.1198 0.9440 0.9716
No log 0.8172 76 0.9921 0.1260 0.9921 0.9960
No log 0.8387 78 1.1471 0.0169 1.1471 1.0710
No log 0.8602 80 1.0966 0.0950 1.0966 1.0472
No log 0.8817 82 0.9739 0.1725 0.9739 0.9869
No log 0.9032 84 1.0414 0.0458 1.0414 1.0205
No log 0.9247 86 0.8695 0.0181 0.8695 0.9325
No log 0.9462 88 0.8569 0.2057 0.8569 0.9257
No log 0.9677 90 1.1070 0.0188 1.1070 1.0521
No log 0.9892 92 0.9487 0.1893 0.9487 0.9740
No log 1.0108 94 1.2214 0.1293 1.2214 1.1052
No log 1.0323 96 1.6749 0.0839 1.6749 1.2942
No log 1.0538 98 1.0853 0.1013 1.0853 1.0418
No log 1.0753 100 0.9627 0.0355 0.9627 0.9812
No log 1.0968 102 1.2114 0.0842 1.2114 1.1006
No log 1.1183 104 1.0214 0.0154 1.0214 1.0107
No log 1.1398 106 0.7929 0.0146 0.7929 0.8905
No log 1.1613 108 0.9766 -0.0832 0.9766 0.9882
No log 1.1828 110 0.8145 0.0611 0.8145 0.9025
No log 1.2043 112 0.9749 0.1374 0.9749 0.9874
No log 1.2258 114 1.1689 -0.0386 1.1689 1.0811
No log 1.2473 116 0.9405 0.0883 0.9405 0.9698
No log 1.2688 118 1.2033 0.0344 1.2033 1.0969
No log 1.2903 120 1.2775 0.0291 1.2775 1.1303
No log 1.3118 122 0.9243 0.0161 0.9243 0.9614
No log 1.3333 124 1.1567 0.0078 1.1567 1.0755
No log 1.3548 126 1.1950 -0.0180 1.1950 1.0932
No log 1.3763 128 1.0363 0.0872 1.0363 1.0180
No log 1.3978 130 1.2453 0.0379 1.2453 1.1159
No log 1.4194 132 1.0230 0.0075 1.0230 1.0114
No log 1.4409 134 1.1459 0.0955 1.1459 1.0705
No log 1.4624 136 1.5734 0.1103 1.5734 1.2544
No log 1.4839 138 1.3805 0.0529 1.3805 1.1749
No log 1.5054 140 1.0738 0.0490 1.0738 1.0363
No log 1.5269 142 1.2002 0.0957 1.2002 1.0955
No log 1.5484 144 0.9783 0.0760 0.9783 0.9891
No log 1.5699 146 0.9415 0.0966 0.9415 0.9703
No log 1.5914 148 0.9626 0.0729 0.9626 0.9811
No log 1.6129 150 0.7809 0.0944 0.7809 0.8837
No log 1.6344 152 0.7432 0.0759 0.7432 0.8621
No log 1.6559 154 0.7205 0.0857 0.7205 0.8488
No log 1.6774 156 0.7271 0.0869 0.7271 0.8527
No log 1.6989 158 0.7746 0.2593 0.7746 0.8801
No log 1.7204 160 1.0203 0.0487 1.0203 1.0101
No log 1.7419 162 0.9246 0.1284 0.9246 0.9616
No log 1.7634 164 0.7822 0.1249 0.7822 0.8844
No log 1.7849 166 0.8655 -0.0008 0.8655 0.9303
No log 1.8065 168 0.8571 -0.0008 0.8571 0.9258
No log 1.8280 170 0.7163 0.0874 0.7163 0.8463
No log 1.8495 172 0.7472 0.2275 0.7472 0.8644
No log 1.8710 174 0.7258 0.0879 0.7258 0.8519
No log 1.8925 176 0.7692 0.0622 0.7692 0.8771
No log 1.9140 178 0.7270 0.1304 0.7270 0.8527
No log 1.9355 180 0.7294 0.1304 0.7294 0.8540
No log 1.9570 182 0.7944 0.0757 0.7944 0.8913
No log 1.9785 184 0.8498 0.2486 0.8498 0.9218
No log 2.0 186 0.9443 0.0975 0.9443 0.9718
No log 2.0215 188 1.1138 0.0496 1.1138 1.0553
No log 2.0430 190 0.9288 0.1006 0.9288 0.9637
No log 2.0645 192 0.8643 0.0175 0.8643 0.9297
No log 2.0860 194 0.9473 -0.0128 0.9473 0.9733
No log 2.1075 196 0.7702 0.0723 0.7702 0.8776
No log 2.1290 198 0.7441 0.1906 0.7441 0.8626
No log 2.1505 200 0.7427 0.1333 0.7427 0.8618
No log 2.1720 202 0.7457 0.2046 0.7457 0.8635
No log 2.1935 204 0.8842 0.0333 0.8842 0.9403
No log 2.2151 206 0.8459 0.0015 0.8459 0.9198
No log 2.2366 208 0.8242 0.2933 0.8242 0.9078
No log 2.2581 210 0.8384 0.2673 0.8384 0.9156
No log 2.2796 212 0.8054 0.2218 0.8054 0.8974
No log 2.3011 214 0.8091 0.1863 0.8091 0.8995
No log 2.3226 216 0.7966 0.1687 0.7966 0.8925
No log 2.3441 218 0.7719 0.1292 0.7719 0.8786
No log 2.3656 220 0.7775 0.0828 0.7775 0.8818
No log 2.3871 222 0.8093 0.1333 0.8093 0.8996
No log 2.4086 224 0.8565 0.0514 0.8565 0.9255
No log 2.4301 226 0.9628 0.0138 0.9628 0.9812
No log 2.4516 228 0.9386 0.0121 0.9386 0.9688
No log 2.4731 230 0.8630 -0.0307 0.8630 0.9290
No log 2.4946 232 0.8294 0.0874 0.8294 0.9107
No log 2.5161 234 0.8826 0.1742 0.8826 0.9395
No log 2.5376 236 0.8478 0.1859 0.8478 0.9208
No log 2.5591 238 0.7682 0.0432 0.7682 0.8764
No log 2.5806 240 0.7778 0.1443 0.7778 0.8820
No log 2.6022 242 0.7619 0.1298 0.7619 0.8729
No log 2.6237 244 0.7595 0.1298 0.7595 0.8715
No log 2.6452 246 0.7595 0.0983 0.7595 0.8715
No log 2.6667 248 0.7586 0.1379 0.7586 0.8710
No log 2.6882 250 0.7874 0.2827 0.7874 0.8873
No log 2.7097 252 0.8238 0.1268 0.8238 0.9076
No log 2.7312 254 0.8358 0.2109 0.8358 0.9142
No log 2.7527 256 0.9833 -0.0237 0.9833 0.9916
No log 2.7742 258 0.9318 0.0025 0.9318 0.9653
No log 2.7957 260 0.8226 0.2476 0.8226 0.9070
No log 2.8172 262 1.0904 -0.0204 1.0904 1.0442
No log 2.8387 264 1.0551 -0.0187 1.0551 1.0272
No log 2.8602 266 0.7896 0.1298 0.7896 0.8886
No log 2.8817 268 0.9040 -0.0237 0.9040 0.9508
No log 2.9032 270 0.9224 0.0169 0.9224 0.9604
No log 2.9247 272 0.8134 0.0 0.8134 0.9019
No log 2.9462 274 0.8316 0.1841 0.8316 0.9119
No log 2.9677 276 0.8516 0.1372 0.8516 0.9228
No log 2.9892 278 0.7962 0.0821 0.7962 0.8923
No log 3.0108 280 0.7805 0.0432 0.7805 0.8835
No log 3.0323 282 0.7579 0.1249 0.7579 0.8706
No log 3.0538 284 0.7532 0.0375 0.7532 0.8678
No log 3.0753 286 0.7758 0.0874 0.7758 0.8808
No log 3.0968 288 0.8142 0.1846 0.8142 0.9023
No log 3.1183 290 0.8245 0.2709 0.8245 0.9080
No log 3.1398 292 0.7875 0.1189 0.7875 0.8874
No log 3.1613 294 0.7794 0.0432 0.7794 0.8828
No log 3.1828 296 0.7665 0.1705 0.7665 0.8755
No log 3.2043 298 0.8118 0.2361 0.8118 0.9010
No log 3.2258 300 0.8627 0.1615 0.8627 0.9288
No log 3.2473 302 0.8104 0.1649 0.8104 0.9002
No log 3.2688 304 0.7318 0.2588 0.7318 0.8555
No log 3.2903 306 0.7363 0.1495 0.7363 0.8581
No log 3.3118 308 0.7202 0.1249 0.7202 0.8486
No log 3.3333 310 0.7422 0.0481 0.7422 0.8615
No log 3.3548 312 0.7705 0.1770 0.7705 0.8778
No log 3.3763 314 0.7620 0.1277 0.7620 0.8729
No log 3.3978 316 0.7734 0.1767 0.7734 0.8794
No log 3.4194 318 0.7740 0.0551 0.7740 0.8798
No log 3.4409 320 0.7640 0.2486 0.7640 0.8741
No log 3.4624 322 0.8791 0.1043 0.8791 0.9376
No log 3.4839 324 0.8320 0.2381 0.8320 0.9121
No log 3.5054 326 0.7964 0.2556 0.7964 0.8924
No log 3.5269 328 0.8072 0.2591 0.8072 0.8985
No log 3.5484 330 0.8235 0.2103 0.8235 0.9075
No log 3.5699 332 0.8811 0.2254 0.8811 0.9387
No log 3.5914 334 0.8585 0.2740 0.8585 0.9266
No log 3.6129 336 0.8768 0.2580 0.8768 0.9364
No log 3.6344 338 0.8769 0.1637 0.8769 0.9364
No log 3.6559 340 0.8055 0.0893 0.8055 0.8975
No log 3.6774 342 0.7471 0.0375 0.7471 0.8643
No log 3.6989 344 0.7258 -0.0032 0.7258 0.8519
No log 3.7204 346 0.7371 0.0375 0.7371 0.8586
No log 3.7419 348 0.7932 -0.0108 0.7932 0.8906
No log 3.7634 350 0.8750 0.2439 0.8750 0.9354
No log 3.7849 352 1.0000 0.1417 1.0000 1.0000
No log 3.8065 354 1.0831 0.0730 1.0831 1.0407
No log 3.8280 356 0.8635 0.2609 0.8635 0.9292
No log 3.8495 358 0.8624 0.0708 0.8624 0.9287
No log 3.8710 360 0.7849 0.1049 0.7849 0.8860
No log 3.8925 362 0.7785 0.2181 0.7785 0.8823
No log 3.9140 364 0.8530 0.1367 0.8530 0.9236
No log 3.9355 366 0.8680 0.1367 0.8680 0.9317
No log 3.9570 368 0.8544 0.1752 0.8544 0.9244
No log 3.9785 370 0.8648 0.1268 0.8648 0.9299
No log 4.0 372 0.8426 0.1500 0.8426 0.9179
No log 4.0215 374 1.0113 0.1014 1.0113 1.0056
No log 4.0430 376 1.1773 0.0741 1.1773 1.0850
No log 4.0645 378 1.0333 0.1271 1.0333 1.0165
No log 4.0860 380 0.8307 0.1440 0.8307 0.9115
No log 4.1075 382 0.7712 0.1196 0.7712 0.8782
No log 4.1290 384 0.7290 0.0918 0.7290 0.8538
No log 4.1505 386 0.7694 0.0528 0.7694 0.8771
No log 4.1720 388 0.8572 0.0706 0.8572 0.9259
No log 4.1935 390 0.8387 0.0249 0.8387 0.9158
No log 4.2151 392 0.7916 0.0940 0.7916 0.8897
No log 4.2366 394 0.8973 0.0847 0.8973 0.9473
No log 4.2581 396 0.9344 0.1587 0.9344 0.9667
No log 4.2796 398 0.9051 0.0770 0.9051 0.9514
No log 4.3011 400 0.8864 0.0600 0.8864 0.9415
No log 4.3226 402 0.8192 0.1345 0.8192 0.9051
No log 4.3441 404 0.7624 0.0937 0.7624 0.8732
No log 4.3656 406 0.7279 0.0454 0.7279 0.8532
No log 4.3871 408 0.7174 0.0918 0.7174 0.8470
No log 4.4086 410 0.7342 0.0732 0.7342 0.8569
No log 4.4301 412 0.7589 0.1095 0.7589 0.8711
No log 4.4516 414 0.7574 0.0791 0.7574 0.8703
No log 4.4731 416 0.8158 0.1739 0.8158 0.9032
No log 4.4946 418 0.8058 0.1393 0.8058 0.8977
No log 4.5161 420 0.7643 0.1387 0.7643 0.8742
No log 4.5376 422 0.7682 0.2024 0.7682 0.8765
No log 4.5591 424 0.7898 0.1761 0.7898 0.8887
No log 4.5806 426 0.7874 0.2481 0.7874 0.8873
No log 4.6022 428 0.7971 0.1095 0.7971 0.8928
No log 4.6237 430 0.8286 0.1277 0.8286 0.9103
No log 4.6452 432 0.8058 0.1633 0.8058 0.8977
No log 4.6667 434 0.8495 0.0995 0.8495 0.9217
No log 4.6882 436 0.8010 0.0940 0.8010 0.8950
No log 4.7097 438 0.8178 0.1823 0.8178 0.9043
No log 4.7312 440 0.8754 0.1142 0.8754 0.9356
No log 4.7527 442 0.8320 0.1863 0.8320 0.9121
No log 4.7742 444 0.9024 0.2379 0.9024 0.9500
No log 4.7957 446 0.9200 0.2182 0.9200 0.9592
No log 4.8172 448 0.8586 0.2545 0.8586 0.9266
No log 4.8387 450 0.8174 0.2519 0.8174 0.9041
No log 4.8602 452 0.8652 0.1349 0.8652 0.9301
No log 4.8817 454 0.8513 0.0673 0.8513 0.9227
No log 4.9032 456 0.7353 0.0 0.7353 0.8575
No log 4.9247 458 0.7330 0.1199 0.7330 0.8561
No log 4.9462 460 0.7349 0.1199 0.7349 0.8573
No log 4.9677 462 0.7313 -0.0032 0.7313 0.8552
No log 4.9892 464 0.7799 0.1974 0.7799 0.8831
No log 5.0108 466 0.9273 -0.0159 0.9273 0.9630
No log 5.0323 468 0.8927 0.0404 0.8927 0.9448
No log 5.0538 470 0.7255 0.0375 0.7255 0.8518
No log 5.0753 472 0.7809 0.0909 0.7809 0.8837
No log 5.0968 474 0.7935 0.1243 0.7935 0.8908
No log 5.1183 476 0.7078 0.0768 0.7078 0.8413
No log 5.1398 478 0.7427 0.2747 0.7427 0.8618
No log 5.1613 480 0.8381 0.1039 0.8381 0.9155
No log 5.1828 482 0.8256 0.1419 0.8256 0.9086
No log 5.2043 484 0.7367 0.0454 0.7367 0.8583
No log 5.2258 486 0.7227 0.1691 0.7227 0.8501
No log 5.2473 488 0.8213 0.0826 0.8213 0.9062
No log 5.2688 490 0.8019 0.0826 0.8019 0.8955
No log 5.2903 492 0.7219 0.0338 0.7219 0.8496
No log 5.3118 494 0.8707 0.1398 0.8707 0.9331
No log 5.3333 496 1.0029 0.0458 1.0029 1.0015
No log 5.3548 498 0.8913 0.1269 0.8913 0.9441
0.3181 5.3763 500 0.8004 0.0804 0.8004 0.8947
0.3181 5.3978 502 0.7759 0.0700 0.7759 0.8808
0.3181 5.4194 504 0.7670 0.0884 0.7670 0.8758
0.3181 5.4409 506 0.7673 0.0449 0.7673 0.8760
0.3181 5.4624 508 0.7751 0.0879 0.7751 0.8804
0.3181 5.4839 510 0.8330 0.1294 0.8330 0.9127
0.3181 5.5054 512 0.8517 0.1601 0.8517 0.9229
0.3181 5.5269 514 0.8682 0.1591 0.8682 0.9318
0.3181 5.5484 516 0.9040 0.0329 0.9040 0.9508
0.3181 5.5699 518 0.8252 0.0957 0.8252 0.9084
0.3181 5.5914 520 0.7763 0.2096 0.7763 0.8811
0.3181 5.6129 522 0.7951 0.1859 0.7951 0.8917
0.3181 5.6344 524 0.7699 0.1249 0.7699 0.8774
0.3181 5.6559 526 0.7784 0.0432 0.7784 0.8823
0.3181 5.6774 528 0.7889 0.0449 0.7889 0.8882
0.3181 5.6989 530 0.7620 0.0821 0.7620 0.8729
0.3181 5.7204 532 0.7542 -0.0091 0.7542 0.8685
0.3181 5.7419 534 0.7733 0.0449 0.7733 0.8794
0.3181 5.7634 536 0.7861 0.0465 0.7861 0.8866
0.3181 5.7849 538 0.7842 0.0449 0.7842 0.8855
0.3181 5.8065 540 0.7708 0.1644 0.7708 0.8780
0.3181 5.8280 542 0.7509 0.0432 0.7509 0.8665
0.3181 5.8495 544 0.8178 0.1443 0.8178 0.9043
0.3181 5.8710 546 0.8279 0.1443 0.8279 0.9099
0.3181 5.8925 548 0.7527 0.0973 0.7527 0.8676
0.3181 5.9140 550 0.7626 0.2105 0.7626 0.8733
0.3181 5.9355 552 0.8911 0.1593 0.8911 0.9440
0.3181 5.9570 554 0.8935 0.1593 0.8935 0.9453
0.3181 5.9785 556 0.8143 0.0791 0.8143 0.9024
0.3181 6.0 558 0.9160 -0.0055 0.9160 0.9571
0.3181 6.0215 560 0.9142 -0.0055 0.9142 0.9561
0.3181 6.0430 562 0.8236 0.0026 0.8236 0.9075
0.3181 6.0645 564 0.7915 0.0318 0.7915 0.8897

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 135M params (Safetensors, F32 tensors)