ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8114
  • QWK: 0.0713
  • MSE: 0.8114
  • RMSE: 0.9008
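QWK here is Cohen's quadratic weighted kappa, which measures agreement between predicted and reference ordinal scores (a value of 0.07 is close to chance-level agreement), and RMSE is simply the square root of the MSE (√0.8114 ≈ 0.9008). A minimal pure-Python sketch of the metric, not the exact evaluation code behind this card:

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    row = [sum(obs[i]) for i in range(n_classes)]  # true-label marginals
    col = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic penalty
            num += w * obs[i][j]                       # observed disagreement
            den += w * row[i] * col[j] / n             # expected under chance
    return 1.0 - num / den

# Perfect agreement scores 1.0; RMSE is the square root of MSE.
print(quadratic_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3))  # → 1.0
print(round(0.8114 ** 0.5, 4))                                  # → 0.9008
```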

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
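With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably ramps straight down from 2e-05 at the first step to zero at the last; a minimal sketch of that schedule (the step counts below are illustrative, not taken from this run):

```python
BASE_LR = 2e-05  # learning_rate from the hyperparameters above

def linear_lr(step, total_steps, base_lr=BASE_LR):
    # Linear decay from base_lr at step 0 down to zero at total_steps,
    # matching a linear scheduler with zero warmup.
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0, 1000))     # → 2e-05 (start of training)
print(linear_lr(500, 1000))   # → 1e-05 (halfway)
print(linear_lr(1000, 1000))  # → 0.0  (end of training)
```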

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0244 2 4.7164 0.0083 4.7164 2.1717
No log 0.0488 4 2.4229 -0.0316 2.4229 1.5566
No log 0.0732 6 1.0658 -0.1278 1.0658 1.0324
No log 0.0976 8 0.8153 -0.0331 0.8153 0.9029
No log 0.1220 10 1.2203 -0.0065 1.2203 1.1047
No log 0.1463 12 2.1113 0.0104 2.1113 1.4530
No log 0.1707 14 1.2538 -0.0234 1.2538 1.1197
No log 0.1951 16 0.8889 0.0111 0.8889 0.9428
No log 0.2195 18 0.6748 0.0909 0.6748 0.8215
No log 0.2439 20 0.6564 0.2144 0.6564 0.8102
No log 0.2683 22 0.7071 0.0260 0.7071 0.8409
No log 0.2927 24 1.0003 -0.0245 1.0003 1.0001
No log 0.3171 26 1.1808 0.1087 1.1808 1.0866
No log 0.3415 28 0.8787 0.1025 0.8787 0.9374
No log 0.3659 30 0.7531 0.0783 0.7531 0.8678
No log 0.3902 32 0.8698 0.1414 0.8698 0.9326
No log 0.4146 34 0.8751 0.1672 0.8751 0.9355
No log 0.4390 36 0.9326 0.2027 0.9326 0.9657
No log 0.4634 38 1.6468 0.0557 1.6468 1.2833
No log 0.4878 40 1.2917 0.1224 1.2917 1.1365
No log 0.5122 42 0.8968 0.1472 0.8968 0.9470
No log 0.5366 44 0.9122 0.1631 0.9122 0.9551
No log 0.5610 46 0.8044 0.0097 0.8044 0.8969
No log 0.5854 48 0.7967 0.1094 0.7967 0.8926
No log 0.6098 50 0.8148 0.0455 0.8148 0.9027
No log 0.6341 52 0.8754 0.0 0.8754 0.9356
No log 0.6585 54 0.9647 0.0563 0.9647 0.9822
No log 0.6829 56 1.1975 0.1014 1.1975 1.0943
No log 0.7073 58 1.8323 -0.0202 1.8323 1.3536
No log 0.7317 60 1.9094 -0.0202 1.9094 1.3818
No log 0.7561 62 1.1895 0.0741 1.1895 1.0907
No log 0.7805 64 1.0151 0.1781 1.0151 1.0075
No log 0.8049 66 0.9693 0.1573 0.9693 0.9845
No log 0.8293 68 0.8439 0.0823 0.8439 0.9187
No log 0.8537 70 1.0260 0.1222 1.0260 1.0129
No log 0.8780 72 0.9732 0.1551 0.9732 0.9865
No log 0.9024 74 0.7996 0.1762 0.7996 0.8942
No log 0.9268 76 0.8509 0.1714 0.8509 0.9225
No log 0.9512 78 1.0358 0.1771 1.0358 1.0177
No log 0.9756 80 1.4824 0.1445 1.4824 1.2175
No log 1.0 82 1.2928 0.1002 1.2928 1.1370
No log 1.0244 84 0.8694 0.2023 0.8694 0.9324
No log 1.0488 86 0.9892 0.1514 0.9892 0.9946
No log 1.0732 88 0.9462 0.1312 0.9462 0.9727
No log 1.0976 90 0.7943 0.1093 0.7943 0.8912
No log 1.1220 92 1.1602 0.0529 1.1602 1.0771
No log 1.1463 94 1.2502 0.0462 1.2502 1.1181
No log 1.1707 96 0.9603 0.0257 0.9603 0.9799
No log 1.1951 98 0.8509 0.1809 0.8509 0.9225
No log 1.2195 100 0.9810 0.1764 0.9810 0.9904
No log 1.2439 102 1.0101 0.2234 1.0101 1.0050
No log 1.2683 104 0.9815 0.1403 0.9815 0.9907
No log 1.2927 106 1.0092 0.1369 1.0092 1.0046
No log 1.3171 108 0.9691 0.1406 0.9691 0.9844
No log 1.3415 110 0.9112 0.1453 0.9112 0.9546
No log 1.3659 112 0.9101 0.0931 0.9101 0.9540
No log 1.3902 114 0.8774 0.0537 0.8774 0.9367
No log 1.4146 116 0.8875 0.1775 0.8875 0.9421
No log 1.4390 118 0.9244 0.1391 0.9244 0.9615
No log 1.4634 120 0.9469 0.1519 0.9469 0.9731
No log 1.4878 122 0.9634 0.0856 0.9634 0.9815
No log 1.5122 124 0.9066 0.1053 0.9066 0.9522
No log 1.5366 126 1.0372 0.0794 1.0372 1.0184
No log 1.5610 128 1.0822 0.0891 1.0822 1.0403
No log 1.5854 130 0.9080 0.0988 0.9080 0.9529
No log 1.6098 132 0.7904 0.0549 0.7904 0.8890
No log 1.6341 134 0.7731 0.0768 0.7731 0.8793
No log 1.6585 136 0.8040 0.1362 0.8040 0.8967
No log 1.6829 138 0.9264 0.0995 0.9264 0.9625
No log 1.7073 140 1.0127 -0.0220 1.0127 1.0063
No log 1.7317 142 1.0675 0.1488 1.0675 1.0332
No log 1.7561 144 1.2073 0.1379 1.2073 1.0988
No log 1.7805 146 1.0827 0.1758 1.0827 1.0405
No log 1.8049 148 0.9443 0.0353 0.9443 0.9718
No log 1.8293 150 0.9059 0.0134 0.9059 0.9518
No log 1.8537 152 0.9223 0.0831 0.9223 0.9604
No log 1.8780 154 0.7603 0.1259 0.7603 0.8720
No log 1.9024 156 0.7490 0.1259 0.7490 0.8654
No log 1.9268 158 0.8423 0.0512 0.8423 0.9178
No log 1.9512 160 0.8392 0.1336 0.8392 0.9161
No log 1.9756 162 0.7925 0.0834 0.7925 0.8903
No log 2.0 164 0.8562 0.0697 0.8562 0.9253
No log 2.0244 166 0.9037 0.0306 0.9037 0.9506
No log 2.0488 168 1.0071 0.1560 1.0071 1.0035
No log 2.0732 170 1.0171 0.0734 1.0171 1.0085
No log 2.0976 172 0.8866 0.1010 0.8866 0.9416
No log 2.1220 174 0.8786 -0.0208 0.8786 0.9373
No log 2.1463 176 0.8408 -0.0208 0.8408 0.9170
No log 2.1707 178 0.8087 -0.0370 0.8087 0.8993
No log 2.1951 180 0.8131 0.0679 0.8131 0.9017
No log 2.2195 182 0.8765 0.0159 0.8765 0.9362
No log 2.2439 184 1.0387 0.1866 1.0387 1.0192
No log 2.2683 186 0.9760 0.1863 0.9760 0.9879
No log 2.2927 188 0.9510 0.1575 0.9510 0.9752
No log 2.3171 190 1.2939 0.0679 1.2939 1.1375
No log 2.3415 192 1.0307 0.1468 1.0307 1.0152
No log 2.3659 194 0.7665 0.0834 0.7665 0.8755
No log 2.3902 196 0.8013 0.0172 0.8013 0.8951
No log 2.4146 198 0.7486 0.0488 0.7486 0.8652
No log 2.4390 200 0.8040 0.1965 0.8040 0.8967
No log 2.4634 202 0.9454 0.0498 0.9454 0.9723
No log 2.4878 204 0.8399 0.1196 0.8399 0.9165
No log 2.5122 206 0.7820 0.0488 0.7820 0.8843
No log 2.5366 208 0.8452 -0.0238 0.8452 0.9194
No log 2.5610 210 0.8177 0.1049 0.8177 0.9043
No log 2.5854 212 0.9364 0.0956 0.9364 0.9677
No log 2.6098 214 0.8269 0.1431 0.8269 0.9093
No log 2.6341 216 0.8341 -0.0247 0.8341 0.9133
No log 2.6585 218 0.8246 0.0119 0.8246 0.9081
No log 2.6829 220 0.7980 0.0834 0.7980 0.8933
No log 2.7073 222 0.8642 0.1416 0.8642 0.9296
No log 2.7317 224 1.1773 0.0402 1.1773 1.0851
No log 2.7561 226 1.1278 0.0426 1.1278 1.0620
No log 2.7805 228 0.8613 0.1228 0.8613 0.9280
No log 2.8049 230 0.8482 0.0985 0.8482 0.9210
No log 2.8293 232 0.8277 0.0985 0.8277 0.9098
No log 2.8537 234 0.7822 0.2122 0.7822 0.8844
No log 2.8780 236 0.9239 0.0502 0.9239 0.9612
No log 2.9024 238 0.9522 0.0502 0.9522 0.9758
No log 2.9268 240 0.8343 0.1379 0.8343 0.9134
No log 2.9512 242 0.8284 0.1365 0.8284 0.9102
No log 2.9756 244 0.8843 0.0913 0.8843 0.9404
No log 3.0 246 0.8743 0.2308 0.8743 0.9350
No log 3.0244 248 1.0826 0.0196 1.0826 1.0405
No log 3.0488 250 1.1371 0.0487 1.1371 1.0664
No log 3.0732 252 0.9427 0.0594 0.9427 0.9709
No log 3.0976 254 0.8124 0.2138 0.8124 0.9013
No log 3.1220 256 0.8018 0.0978 0.8018 0.8954
No log 3.1463 258 0.7666 0.0973 0.7666 0.8756
No log 3.1707 260 0.8009 0.2078 0.8009 0.8949
No log 3.1951 262 0.9607 0.0888 0.9607 0.9802
No log 3.2195 264 0.9196 0.1379 0.9196 0.9590
No log 3.2439 266 0.7894 0.2053 0.7894 0.8885
No log 3.2683 268 0.7814 0.0874 0.7814 0.8840
No log 3.2927 270 0.7738 0.0783 0.7738 0.8796
No log 3.3171 272 0.7895 0.2605 0.7895 0.8885
No log 3.3415 274 0.9330 0.0888 0.9330 0.9659
No log 3.3659 276 0.9552 0.0888 0.9552 0.9773
No log 3.3902 278 0.8131 0.2251 0.8131 0.9017
No log 3.4146 280 0.7779 0.1823 0.7779 0.8820
No log 3.4390 282 0.8081 0.0757 0.8081 0.8989
No log 3.4634 284 0.8574 0.0818 0.8574 0.9260
No log 3.4878 286 0.8502 0.0748 0.8502 0.9221
No log 3.5122 288 0.8409 0.0488 0.8409 0.9170
No log 3.5366 290 0.8197 0.0893 0.8197 0.9054
No log 3.5610 292 0.7934 0.0495 0.7934 0.8907
No log 3.5854 294 0.7487 0.0375 0.7487 0.8653
No log 3.6098 296 0.7951 0.2105 0.7951 0.8917
No log 3.6341 298 0.7884 0.2180 0.7884 0.8879
No log 3.6585 300 0.7781 0.0338 0.7781 0.8821
No log 3.6829 302 0.7909 0.0978 0.7909 0.8893
No log 3.7073 304 0.7812 0.0978 0.7812 0.8838
No log 3.7317 306 0.7966 0.2232 0.7966 0.8925
No log 3.7561 308 0.8468 0.1504 0.8468 0.9202
No log 3.7805 310 0.8167 0.1648 0.8167 0.9037
No log 3.8049 312 0.7958 0.1882 0.7958 0.8921
No log 3.8293 314 0.9576 0.1189 0.9576 0.9786
No log 3.8537 316 1.1575 0.0909 1.1575 1.0759
No log 3.8780 318 1.0488 0.1587 1.0488 1.0241
No log 3.9024 320 0.9446 0.1800 0.9446 0.9719
No log 3.9268 322 1.1233 0.1422 1.1233 1.0599
No log 3.9512 324 1.1114 0.1422 1.1114 1.0542
No log 3.9756 326 0.9207 0.1800 0.9207 0.9595
No log 4.0 328 0.8539 0.1353 0.8539 0.9241
No log 4.0244 330 0.8431 0.1093 0.8431 0.9182
No log 4.0488 332 0.8040 0.0749 0.8040 0.8967
No log 4.0732 334 0.8017 0.1752 0.8017 0.8954
No log 4.0976 336 0.9804 0.0888 0.9804 0.9901
No log 4.1220 338 1.0956 0.1573 1.0956 1.0467
No log 4.1463 340 1.0482 0.1077 1.0482 1.0238
No log 4.1707 342 0.8789 0.1513 0.8789 0.9375
No log 4.1951 344 0.8069 0.1823 0.8069 0.8983
No log 4.2195 346 0.8084 0.1823 0.8084 0.8991
No log 4.2439 348 0.8087 0.1529 0.8087 0.8993
No log 4.2683 350 0.8304 0.1372 0.8304 0.9113
No log 4.2927 352 1.0216 0.1111 1.0216 1.0108
No log 4.3171 354 1.1754 -0.0306 1.1754 1.0842
No log 4.3415 356 1.0878 0.0679 1.0878 1.0430
No log 4.3659 358 0.8925 0.0551 0.8925 0.9447
No log 4.3902 360 0.9310 0.1166 0.9310 0.9649
No log 4.4146 362 0.9676 0.0818 0.9676 0.9837
No log 4.4390 364 0.9696 0.0378 0.9696 0.9847
No log 4.4634 366 1.0651 0.0631 1.0651 1.0320
No log 4.4878 368 1.2284 -0.0027 1.2284 1.1083
No log 4.5122 370 1.1029 0.1077 1.1029 1.0502
No log 4.5366 372 0.9356 0.0991 0.9356 0.9673
No log 4.5610 374 0.8046 0.1400 0.8046 0.8970
No log 4.5854 376 0.8020 0.1277 0.8020 0.8956
No log 4.6098 378 0.7838 0.1236 0.7838 0.8853
No log 4.6341 380 0.7823 0.2169 0.7823 0.8845
No log 4.6585 382 0.8386 0.2251 0.8386 0.9158
No log 4.6829 384 0.8119 0.2372 0.8119 0.9011
No log 4.7073 386 0.8039 0.1823 0.8039 0.8966
No log 4.7317 388 0.8380 0.1400 0.8380 0.9154
No log 4.7561 390 0.8415 0.2027 0.8415 0.9173
No log 4.7805 392 0.8341 0.1550 0.8341 0.9133
No log 4.8049 394 0.7928 0.1407 0.7928 0.8904
No log 4.8293 396 0.7955 0.2169 0.7955 0.8919
No log 4.8537 398 0.8219 0.1571 0.8219 0.9066
No log 4.8780 400 0.8925 0.0991 0.8925 0.9447
No log 4.9024 402 0.8255 0.1522 0.8255 0.9086
No log 4.9268 404 0.7527 0.1807 0.7527 0.8676
No log 4.9512 406 0.7406 0.0914 0.7406 0.8606
No log 4.9756 408 0.7410 0.1807 0.7410 0.8608
No log 5.0 410 0.8337 0.1522 0.8337 0.9131
No log 5.0244 412 0.9733 0.1517 0.9733 0.9866
No log 5.0488 414 0.9339 0.1306 0.9339 0.9664
No log 5.0732 416 0.8245 0.0512 0.8245 0.9080
No log 5.0976 418 0.8038 0.0922 0.8038 0.8965
No log 5.1220 420 0.8526 0.0944 0.8526 0.9234
No log 5.1463 422 0.8737 0.1268 0.8737 0.9347
No log 5.1707 424 0.9773 0.1254 0.9773 0.9886
No log 5.1951 426 1.0756 0.1077 1.0756 1.0371
No log 5.2195 428 1.0085 0.1509 1.0085 1.0042
No log 5.2439 430 0.9733 0.1551 0.9733 0.9866
No log 5.2683 432 0.8912 0.0991 0.8912 0.9440
No log 5.2927 434 0.8389 0.1991 0.8389 0.9159
No log 5.3171 436 0.8005 0.1465 0.8005 0.8947
No log 5.3415 438 0.8011 0.1823 0.8011 0.8950
No log 5.3659 440 0.8591 0.1935 0.8591 0.9269
No log 5.3902 442 0.9777 0.1584 0.9777 0.9888
No log 5.4146 444 1.1115 0.1692 1.1115 1.0543
No log 5.4390 446 1.1191 0.1651 1.1191 1.0579
No log 5.4634 448 0.9227 0.1297 0.9227 0.9606
No log 5.4878 450 0.7506 0.1986 0.7506 0.8664
No log 5.5122 452 0.7241 0.1371 0.7241 0.8509
No log 5.5366 454 0.7128 0.1371 0.7128 0.8443
No log 5.5610 456 0.7410 0.2180 0.7410 0.8608
No log 5.5854 458 0.8304 0.1794 0.8304 0.9113
No log 5.6098 460 0.8572 0.1794 0.8572 0.9259
No log 5.6341 462 0.8726 0.1846 0.8726 0.9341
No log 5.6585 464 0.8351 0.1609 0.8351 0.9138
No log 5.6829 466 0.8282 0.1609 0.8282 0.9101
No log 5.7073 468 0.8467 0.1522 0.8467 0.9201
No log 5.7317 470 0.9875 0.1044 0.9875 0.9937
No log 5.7561 472 0.9652 0.1044 0.9652 0.9825
No log 5.7805 474 0.8522 0.1846 0.8522 0.9231
No log 5.8049 476 0.8036 0.1767 0.8036 0.8964
No log 5.8293 478 0.8315 0.1347 0.8315 0.9119
No log 5.8537 480 0.8463 0.1347 0.8463 0.9199
No log 5.8780 482 0.8440 0.2107 0.8440 0.9187
No log 5.9024 484 0.9068 0.1337 0.9068 0.9523
No log 5.9268 486 0.9708 0.1835 0.9708 0.9853
No log 5.9512 488 0.9873 0.1835 0.9873 0.9936
No log 5.9756 490 0.8778 0.1337 0.8778 0.9369
No log 6.0 492 0.7950 0.2534 0.7950 0.8916
No log 6.0244 494 0.8159 0.2534 0.8159 0.9033
No log 6.0488 496 0.8680 0.1379 0.8680 0.9317
No log 6.0732 498 0.9544 0.1882 0.9544 0.9769
0.3124 6.0976 500 0.9866 0.1468 0.9866 0.9933
0.3124 6.1220 502 0.9504 0.1560 0.9504 0.9749
0.3124 6.1463 504 0.9085 0.1604 0.9085 0.9531
0.3124 6.1707 506 0.9155 0.1604 0.9155 0.9568
0.3124 6.1951 508 0.9343 0.1604 0.9343 0.9666
0.3124 6.2195 510 0.8351 0.0805 0.8351 0.9138
0.3124 6.2439 512 0.7878 0.0700 0.7878 0.8876
0.3124 6.2683 514 0.7787 0.1094 0.7787 0.8825
0.3124 6.2927 516 0.7947 0.0705 0.7947 0.8914
0.3124 6.3171 518 0.8114 0.0713 0.8114 0.9008

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
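To reproduce this environment, the listed versions can presumably be pinned directly (the PyTorch build suffix depends on your local CUDA setup, so it is omitted here):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1" "torch==2.4.0"
```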

Model size

  • 135M parameters (Safetensors, F32)
Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task3_organization

This model is one of 3,994 fine-tunes of aubmindlab/bert-base-arabertv02.