ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8387
  • Qwk (quadratic weighted kappa): 0.0460
  • Mse (mean squared error): 0.8387
  • Rmse (root mean squared error): 0.9158
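
Since only regression-style metrics are reported (Loss/Mse/Rmse alongside Qwk), the checkpoint presumably exposes a single-output regression head. Below is a minimal inference sketch under that assumption, using the standard transformers API; the repo id is this card's own, and the essay string is a placeholder:

```python
# Minimal inference sketch. Assumption: the checkpoint was trained as a
# single-logit regression head (consistent with the MSE/RMSE metrics above).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay whose organization quality is to be scored
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    # A single regression logit is assumed; squeeze it to a scalar score.
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.3f}")
```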

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
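
As a convenience, the settings above map onto transformers.TrainingArguments roughly as follows. This is a sketch only: output_dir is a placeholder, and the original training script is not published. The Adam betas/epsilon listed above match the Trainer's default AdamW settings:

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task3-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Matches "Adam with betas=(0.9,0.999) and epsilon=1e-08" above,
    # which is the Trainer default (AdamW).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```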

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|:----:|
| No log | 0.0260 | 2 | 3.7827 | 0.0017 | 3.7827 | 1.9449 |
| No log | 0.0519 | 4 | 1.8284 | 0.0136 | 1.8284 | 1.3522 |
| No log | 0.0779 | 6 | 0.9332 | 0.0134 | 0.9332 | 0.9660 |
| No log | 0.1039 | 8 | 0.8623 | -0.0842 | 0.8623 | 0.9286 |
| No log | 0.1299 | 10 | 1.3640 | -0.0164 | 1.3640 | 1.1679 |
| No log | 0.1558 | 12 | 0.7873 | -0.0778 | 0.7873 | 0.8873 |
| No log | 0.1818 | 14 | 0.7033 | 0.0964 | 0.7033 | 0.8387 |
| No log | 0.2078 | 16 | 0.7434 | 0.0588 | 0.7434 | 0.8622 |
| No log | 0.2338 | 18 | 0.7891 | 0.0588 | 0.7891 | 0.8883 |
| No log | 0.2597 | 20 | 0.8384 | -0.0280 | 0.8384 | 0.9156 |
| No log | 0.2857 | 22 | 0.8515 | -0.0187 | 0.8515 | 0.9228 |
| No log | 0.3117 | 24 | 0.8668 | 0.1998 | 0.8668 | 0.9310 |
| No log | 0.3377 | 26 | 1.3116 | -0.0272 | 1.3116 | 1.1452 |
| No log | 0.3636 | 28 | 1.3169 | -0.0238 | 1.3169 | 1.1476 |
| No log | 0.3896 | 30 | 0.9412 | 0.0277 | 0.9412 | 0.9702 |
| No log | 0.4156 | 32 | 0.9388 | 0.0416 | 0.9388 | 0.9689 |
| No log | 0.4416 | 34 | 0.9095 | 0.0670 | 0.9095 | 0.9537 |
| No log | 0.4675 | 36 | 1.0151 | -0.0076 | 1.0151 | 1.0075 |
| No log | 0.4935 | 38 | 0.8864 | 0.1673 | 0.8864 | 0.9415 |
| No log | 0.5195 | 40 | 0.8033 | 0.1095 | 0.8033 | 0.8963 |
| No log | 0.5455 | 42 | 0.7796 | 0.2586 | 0.7796 | 0.8830 |
| No log | 0.5714 | 44 | 0.7941 | 0.0831 | 0.7941 | 0.8911 |
| No log | 0.5974 | 46 | 0.7523 | 0.1415 | 0.7523 | 0.8674 |
| No log | 0.6234 | 48 | 0.8062 | 0.1993 | 0.8062 | 0.8979 |
| No log | 0.6494 | 50 | 0.9525 | 0.1284 | 0.9525 | 0.9760 |
| No log | 0.6753 | 52 | 0.8532 | 0.2341 | 0.8532 | 0.9237 |
| No log | 0.7013 | 54 | 0.8357 | 0.2576 | 0.8357 | 0.9142 |
| No log | 0.7273 | 56 | 0.8785 | 0.2513 | 0.8785 | 0.9373 |
| No log | 0.7532 | 58 | 0.8859 | 0.2216 | 0.8859 | 0.9412 |
| No log | 0.7792 | 60 | 0.9198 | 0.1683 | 0.9198 | 0.9591 |
| No log | 0.8052 | 62 | 0.9467 | 0.1452 | 0.9467 | 0.9730 |
| No log | 0.8312 | 64 | 0.8959 | 0.1091 | 0.8959 | 0.9465 |
| No log | 0.8571 | 66 | 1.1017 | 0.0679 | 1.1017 | 1.0496 |
| No log | 0.8831 | 68 | 1.0610 | 0.1300 | 1.0610 | 1.0301 |
| No log | 0.9091 | 70 | 0.9891 | 0.2109 | 0.9891 | 0.9945 |
| No log | 0.9351 | 72 | 1.3542 | 0.0529 | 1.3542 | 1.1637 |
| No log | 0.9610 | 74 | 1.2204 | 0.0721 | 1.2204 | 1.1047 |
| No log | 0.9870 | 76 | 0.9187 | 0.2402 | 0.9187 | 0.9585 |
| No log | 1.0130 | 78 | 0.8594 | 0.2430 | 0.8594 | 0.9270 |
| No log | 1.0390 | 80 | 0.9112 | 0.1672 | 0.9112 | 0.9546 |
| No log | 1.0649 | 82 | 0.7820 | 0.1660 | 0.7820 | 0.8843 |
| No log | 1.0909 | 84 | 0.7905 | 0.1218 | 0.7905 | 0.8891 |
| No log | 1.1169 | 86 | 0.9036 | 0.1005 | 0.9036 | 0.9506 |
| No log | 1.1429 | 88 | 0.9019 | 0.1673 | 0.9019 | 0.9497 |
| No log | 1.1688 | 90 | 0.8481 | 0.1260 | 0.8481 | 0.9209 |
| No log | 1.1948 | 92 | 0.8712 | 0.1264 | 0.8712 | 0.9334 |
| No log | 1.2208 | 94 | 0.8809 | 0.0978 | 0.8809 | 0.9386 |
| No log | 1.2468 | 96 | 0.9137 | 0.1809 | 0.9137 | 0.9559 |
| No log | 1.2727 | 98 | 0.9551 | 0.1871 | 0.9551 | 0.9773 |
| No log | 1.2987 | 100 | 0.9088 | 0.1616 | 0.9088 | 0.9533 |
| No log | 1.3247 | 102 | 0.8746 | 0.1447 | 0.8746 | 0.9352 |
| No log | 1.3506 | 104 | 0.9544 | 0.0991 | 0.9544 | 0.9769 |
| No log | 1.3766 | 106 | 1.1961 | 0.0585 | 1.1961 | 1.0937 |
| No log | 1.4026 | 108 | 0.9617 | 0.0217 | 0.9617 | 0.9807 |
| No log | 1.4286 | 110 | 0.8321 | 0.1321 | 0.8321 | 0.9122 |
| No log | 1.4545 | 112 | 0.9039 | -0.0409 | 0.9039 | 0.9507 |
| No log | 1.4805 | 114 | 0.7704 | 0.1287 | 0.7704 | 0.8777 |
| No log | 1.5065 | 116 | 0.8998 | 0.0207 | 0.8998 | 0.9486 |
| No log | 1.5325 | 118 | 0.9084 | -0.0143 | 0.9084 | 0.9531 |
| No log | 1.5584 | 120 | 0.7506 | 0.1254 | 0.7506 | 0.8664 |
| No log | 1.5844 | 122 | 1.0116 | -0.0142 | 1.0116 | 1.0058 |
| No log | 1.6104 | 124 | 0.9836 | 0.0104 | 0.9836 | 0.9918 |
| No log | 1.6364 | 126 | 0.8551 | 0.0091 | 0.8551 | 0.9247 |
| No log | 1.6623 | 128 | 1.5396 | 0.0822 | 1.5396 | 1.2408 |
| No log | 1.6883 | 130 | 1.5051 | 0.0343 | 1.5051 | 1.2268 |
| No log | 1.7143 | 132 | 0.8941 | 0.0781 | 0.8941 | 0.9456 |
| No log | 1.7403 | 134 | 1.3760 | 0.1515 | 1.3760 | 1.1730 |
| No log | 1.7662 | 136 | 1.5703 | 0.1076 | 1.5703 | 1.2531 |
| No log | 1.7922 | 138 | 1.1920 | 0.1548 | 1.1920 | 1.0918 |
| No log | 1.8182 | 140 | 0.8285 | 0.1723 | 0.8285 | 0.9102 |
| No log | 1.8442 | 142 | 1.0455 | 0.1111 | 1.0455 | 1.0225 |
| No log | 1.8701 | 144 | 1.0487 | 0.0764 | 1.0487 | 1.0241 |
| No log | 1.8961 | 146 | 0.8153 | 0.1495 | 0.8153 | 0.9029 |
| No log | 1.9221 | 148 | 0.7524 | 0.1244 | 0.7524 | 0.8674 |
| No log | 1.9481 | 150 | 0.7661 | 0.0934 | 0.7661 | 0.8752 |
| No log | 1.9740 | 152 | 0.7330 | 0.1244 | 0.7330 | 0.8561 |
| No log | 2.0 | 154 | 0.7291 | 0.2070 | 0.7291 | 0.8539 |
| No log | 2.0260 | 156 | 0.8333 | 0.1687 | 0.8333 | 0.9128 |
| No log | 2.0519 | 158 | 0.9358 | 0.0547 | 0.9358 | 0.9674 |
| No log | 2.0779 | 160 | 0.8121 | 0.1379 | 0.8121 | 0.9012 |
| No log | 2.1039 | 162 | 0.9156 | 0.1605 | 0.9156 | 0.9569 |
| No log | 2.1299 | 164 | 1.0465 | 0.2047 | 1.0465 | 1.0230 |
| No log | 2.1558 | 166 | 1.1609 | 0.1849 | 1.1609 | 1.0774 |
| No log | 2.1818 | 168 | 1.0361 | 0.2287 | 1.0361 | 1.0179 |
| No log | 2.2078 | 170 | 0.9533 | 0.2306 | 0.9533 | 0.9764 |
| No log | 2.2338 | 172 | 0.8739 | 0.2339 | 0.8739 | 0.9348 |
| No log | 2.2597 | 174 | 0.8422 | 0.1624 | 0.8422 | 0.9177 |
| No log | 2.2857 | 176 | 0.7910 | 0.1007 | 0.7910 | 0.8894 |
| No log | 2.3117 | 178 | 0.7947 | 0.1373 | 0.7947 | 0.8914 |
| No log | 2.3377 | 180 | 0.7954 | 0.1037 | 0.7954 | 0.8919 |
| No log | 2.3636 | 182 | 0.7238 | 0.1362 | 0.7238 | 0.8507 |
| No log | 2.3896 | 184 | 0.8371 | 0.0799 | 0.8371 | 0.9149 |
| No log | 2.4156 | 186 | 0.8420 | 0.0799 | 0.8420 | 0.9176 |
| No log | 2.4416 | 188 | 0.7272 | 0.2239 | 0.7272 | 0.8528 |
| No log | 2.4675 | 190 | 0.9144 | 0.1042 | 0.9144 | 0.9562 |
| No log | 2.4935 | 192 | 0.9127 | 0.0681 | 0.9127 | 0.9553 |
| No log | 2.5195 | 194 | 0.7435 | 0.1835 | 0.7435 | 0.8623 |
| No log | 2.5455 | 196 | 0.8867 | 0.0769 | 0.8867 | 0.9416 |
| No log | 2.5714 | 198 | 0.8814 | 0.1105 | 0.8814 | 0.9389 |
| No log | 2.5974 | 200 | 0.7282 | 0.1199 | 0.7282 | 0.8533 |
| No log | 2.6234 | 202 | 0.7984 | 0.0991 | 0.7984 | 0.8935 |
| No log | 2.6494 | 204 | 0.9467 | 0.1116 | 0.9467 | 0.9730 |
| No log | 2.6753 | 206 | 0.7891 | 0.0989 | 0.7891 | 0.8883 |
| No log | 2.7013 | 208 | 0.7442 | 0.0814 | 0.7442 | 0.8627 |
| No log | 2.7273 | 210 | 0.9113 | 0.0799 | 0.9113 | 0.9546 |
| No log | 2.7532 | 212 | 0.8955 | 0.0799 | 0.8955 | 0.9463 |
| No log | 2.7792 | 214 | 0.7506 | 0.1196 | 0.7506 | 0.8664 |
| No log | 2.8052 | 216 | 0.7897 | 0.1358 | 0.7897 | 0.8886 |
| No log | 2.8312 | 218 | 0.7322 | 0.0495 | 0.7322 | 0.8557 |
| No log | 2.8571 | 220 | 0.7059 | 0.1828 | 0.7059 | 0.8402 |
| No log | 2.8831 | 222 | 0.8952 | 0.1493 | 0.8952 | 0.9461 |
| No log | 2.9091 | 224 | 0.8852 | 0.1493 | 0.8852 | 0.9408 |
| No log | 2.9351 | 226 | 0.7224 | 0.2390 | 0.7224 | 0.8499 |
| No log | 2.9610 | 228 | 0.7730 | 0.0989 | 0.7730 | 0.8792 |
| No log | 2.9870 | 230 | 1.0735 | 0.1077 | 1.0735 | 1.0361 |
| No log | 3.0130 | 232 | 1.0207 | 0.1396 | 1.0207 | 1.0103 |
| No log | 3.0390 | 234 | 0.7820 | 0.0861 | 0.7820 | 0.8843 |
| No log | 3.0649 | 236 | 0.9876 | 0.0953 | 0.9876 | 0.9938 |
| No log | 3.0909 | 238 | 1.0354 | 0.1182 | 1.0354 | 1.0176 |
| No log | 3.1169 | 240 | 0.7845 | 0.2092 | 0.7845 | 0.8857 |
| No log | 3.1429 | 242 | 0.6882 | 0.1371 | 0.6882 | 0.8296 |
| No log | 3.1688 | 244 | 0.6977 | 0.0414 | 0.6977 | 0.8353 |
| No log | 3.1948 | 246 | 0.7322 | 0.0874 | 0.7322 | 0.8557 |
| No log | 3.2208 | 248 | 0.7679 | 0.0776 | 0.7679 | 0.8763 |
| No log | 3.2468 | 250 | 0.9265 | 0.1297 | 0.9265 | 0.9625 |
| No log | 3.2727 | 252 | 1.0354 | 0.1110 | 1.0354 | 1.0176 |
| No log | 3.2987 | 254 | 0.9784 | 0.1180 | 0.9784 | 0.9891 |
| No log | 3.3247 | 256 | 0.8459 | 0.1103 | 0.8459 | 0.9197 |
| No log | 3.3506 | 258 | 0.7853 | 0.1359 | 0.7853 | 0.8862 |
| No log | 3.3766 | 260 | 0.7790 | 0.1646 | 0.7790 | 0.8826 |
| No log | 3.4026 | 262 | 0.7631 | 0.1236 | 0.7631 | 0.8736 |
| No log | 3.4286 | 264 | 0.7651 | 0.1143 | 0.7651 | 0.8747 |
| No log | 3.4545 | 266 | 0.9085 | 0.0921 | 0.9085 | 0.9531 |
| No log | 3.4805 | 268 | 1.0924 | 0.1353 | 1.0924 | 1.0452 |
| No log | 3.5065 | 270 | 1.0353 | 0.1111 | 1.0353 | 1.0175 |
| No log | 3.5325 | 272 | 0.8158 | 0.2155 | 0.8158 | 0.9032 |
| No log | 3.5584 | 274 | 0.7422 | 0.0926 | 0.7422 | 0.8615 |
| No log | 3.5844 | 276 | 0.7692 | 0.0937 | 0.7692 | 0.8770 |
| No log | 3.6104 | 278 | 0.7504 | 0.0926 | 0.7504 | 0.8663 |
| No log | 3.6364 | 280 | 0.8242 | 0.1954 | 0.8242 | 0.9078 |
| No log | 3.6623 | 282 | 0.9951 | 0.0856 | 0.9951 | 0.9976 |
| No log | 3.6883 | 284 | 1.0172 | 0.0451 | 1.0172 | 1.0086 |
| No log | 3.7143 | 286 | 0.8420 | 0.1105 | 0.8420 | 0.9176 |
| No log | 3.7403 | 288 | 0.7164 | 0.1758 | 0.7164 | 0.8464 |
| No log | 3.7662 | 290 | 0.6943 | 0.1828 | 0.6943 | 0.8333 |
| No log | 3.7922 | 292 | 0.7167 | 0.1758 | 0.7167 | 0.8466 |
| No log | 3.8182 | 294 | 0.8173 | 0.1522 | 0.8173 | 0.9040 |
| No log | 3.8442 | 296 | 0.9090 | 0.1180 | 0.9090 | 0.9534 |
| No log | 3.8701 | 298 | 0.8568 | 0.2204 | 0.8568 | 0.9256 |
| No log | 3.8961 | 300 | 0.8241 | 0.1760 | 0.8241 | 0.9078 |
| No log | 3.9221 | 302 | 0.7909 | 0.1790 | 0.7909 | 0.8893 |
| No log | 3.9481 | 304 | 0.7410 | 0.2605 | 0.7410 | 0.8608 |
| No log | 3.9740 | 306 | 0.7256 | 0.2544 | 0.7256 | 0.8518 |
| No log | 4.0 | 308 | 0.7021 | 0.2150 | 0.7021 | 0.8379 |
| No log | 4.0260 | 310 | 0.7278 | 0.1644 | 0.7278 | 0.8531 |
| No log | 4.0519 | 312 | 0.7727 | 0.2641 | 0.7727 | 0.8790 |
| No log | 4.0779 | 314 | 0.7644 | 0.2024 | 0.7644 | 0.8743 |
| No log | 4.1039 | 316 | 0.7635 | 0.1282 | 0.7635 | 0.8738 |
| No log | 4.1299 | 318 | 0.7725 | 0.1456 | 0.7725 | 0.8789 |
| No log | 4.1558 | 320 | 0.7649 | 0.1456 | 0.7649 | 0.8746 |
| No log | 4.1818 | 322 | 0.7600 | 0.0889 | 0.7600 | 0.8718 |
| No log | 4.2078 | 324 | 0.7745 | 0.2142 | 0.7745 | 0.8801 |
| No log | 4.2338 | 326 | 0.7533 | 0.1240 | 0.7533 | 0.8679 |
| No log | 4.2597 | 328 | 0.8175 | 0.0348 | 0.8175 | 0.9041 |
| No log | 4.2857 | 330 | 0.7764 | 0.1286 | 0.7764 | 0.8812 |
| No log | 4.3117 | 332 | 0.7584 | 0.1286 | 0.7584 | 0.8709 |
| No log | 4.3377 | 334 | 0.7222 | 0.1362 | 0.7222 | 0.8498 |
| No log | 4.3636 | 336 | 0.7648 | 0.0053 | 0.7648 | 0.8745 |
| No log | 4.3896 | 338 | 0.7367 | 0.0449 | 0.7367 | 0.8583 |
| No log | 4.4156 | 340 | 0.7523 | 0.0953 | 0.7523 | 0.8673 |
| No log | 4.4416 | 342 | 0.8638 | 0.0719 | 0.8638 | 0.9294 |
| No log | 4.4675 | 344 | 0.8479 | 0.0719 | 0.8479 | 0.9208 |
| No log | 4.4935 | 346 | 0.7852 | 0.0953 | 0.7852 | 0.8861 |
| No log | 4.5195 | 348 | 0.7553 | 0.0828 | 0.7553 | 0.8691 |
| No log | 4.5455 | 350 | 0.7795 | 0.1240 | 0.7795 | 0.8829 |
| No log | 4.5714 | 352 | 0.7852 | 0.1141 | 0.7852 | 0.8861 |
| No log | 4.5974 | 354 | 0.7638 | 0.0783 | 0.7638 | 0.8739 |
| No log | 4.6234 | 356 | 0.7352 | 0.1740 | 0.7352 | 0.8575 |
| No log | 4.6494 | 358 | 0.7252 | 0.2180 | 0.7252 | 0.8516 |
| No log | 4.6753 | 360 | 0.7469 | 0.1565 | 0.7469 | 0.8643 |
| No log | 4.7013 | 362 | 0.7280 | 0.2180 | 0.7280 | 0.8532 |
| No log | 4.7273 | 364 | 0.7549 | 0.2053 | 0.7549 | 0.8689 |
| No log | 4.7532 | 366 | 0.8056 | 0.1228 | 0.8056 | 0.8976 |
| No log | 4.7792 | 368 | 0.8273 | 0.1224 | 0.8273 | 0.9095 |
| No log | 4.8052 | 370 | 0.7851 | 0.1585 | 0.7851 | 0.8861 |
| No log | 4.8312 | 372 | 0.7528 | 0.2078 | 0.7528 | 0.8676 |
| No log | 4.8571 | 374 | 0.7064 | 0.0821 | 0.7064 | 0.8405 |
| No log | 4.8831 | 376 | 0.7196 | 0.0918 | 0.7196 | 0.8483 |
| No log | 4.9091 | 378 | 0.6905 | 0.0869 | 0.6905 | 0.8309 |
| No log | 4.9351 | 380 | 0.7093 | 0.1758 | 0.7093 | 0.8422 |
| No log | 4.9610 | 382 | 0.7654 | 0.1047 | 0.7654 | 0.8748 |
| No log | 4.9870 | 384 | 0.7964 | 0.1144 | 0.7964 | 0.8924 |
| No log | 5.0130 | 386 | 0.8680 | 0.1624 | 0.8680 | 0.9317 |
| No log | 5.0390 | 388 | 0.9276 | 0.0906 | 0.9276 | 0.9631 |
| No log | 5.0649 | 390 | 0.9091 | 0.1661 | 0.9091 | 0.9535 |
| No log | 5.0909 | 392 | 0.8770 | 0.1264 | 0.8770 | 0.9365 |
| No log | 5.1169 | 394 | 0.8827 | 0.1609 | 0.8827 | 0.9395 |
| No log | 5.1429 | 396 | 0.8163 | 0.1660 | 0.8163 | 0.9035 |
| No log | 5.1688 | 398 | 0.7888 | 0.2107 | 0.7888 | 0.8882 |
| No log | 5.1948 | 400 | 0.7333 | 0.0863 | 0.7333 | 0.8563 |
| No log | 5.2208 | 402 | 0.7787 | 0.0028 | 0.7787 | 0.8824 |
| No log | 5.2468 | 404 | 0.8527 | 0.0192 | 0.8527 | 0.9234 |
| No log | 5.2727 | 406 | 0.8274 | 0.0172 | 0.8274 | 0.9096 |
| No log | 5.2987 | 408 | 0.7942 | 0.0791 | 0.7942 | 0.8912 |
| No log | 5.3247 | 410 | 0.8594 | 0.2466 | 0.8594 | 0.9270 |
| No log | 5.3506 | 412 | 0.8424 | 0.2466 | 0.8424 | 0.9178 |
| No log | 5.3766 | 414 | 0.7696 | 0.1644 | 0.7696 | 0.8773 |
| No log | 5.4026 | 416 | 0.7638 | 0.0394 | 0.7638 | 0.8740 |
| No log | 5.4286 | 418 | 0.7671 | 0.1199 | 0.7671 | 0.8759 |
| No log | 5.4545 | 420 | 0.8344 | 0.1758 | 0.8344 | 0.9135 |
| No log | 5.4805 | 422 | 0.8640 | 0.0438 | 0.8640 | 0.9295 |
| No log | 5.5065 | 424 | 0.8555 | 0.1329 | 0.8555 | 0.9250 |
| No log | 5.5325 | 426 | 0.8232 | 0.1139 | 0.8232 | 0.9073 |
| No log | 5.5584 | 428 | 0.8269 | 0.0460 | 0.8269 | 0.9094 |
| No log | 5.5844 | 430 | 0.8181 | 0.1094 | 0.8181 | 0.9045 |
| No log | 5.6104 | 432 | 0.9209 | 0.0676 | 0.9209 | 0.9596 |
| No log | 5.6364 | 434 | 0.9284 | 0.1027 | 0.9284 | 0.9635 |
| No log | 5.6623 | 436 | 0.8331 | 0.2092 | 0.8331 | 0.9127 |
| No log | 5.6883 | 438 | 0.7815 | 0.1244 | 0.7815 | 0.8840 |
| No log | 5.7143 | 440 | 0.8235 | 0.0123 | 0.8235 | 0.9075 |
| No log | 5.7403 | 442 | 0.8225 | 0.0123 | 0.8225 | 0.9069 |
| No log | 5.7662 | 444 | 0.7892 | 0.0840 | 0.7892 | 0.8883 |
| No log | 5.7922 | 446 | 0.7982 | 0.2466 | 0.7982 | 0.8934 |
| No log | 5.8182 | 448 | 0.8093 | 0.2248 | 0.8093 | 0.8996 |
| No log | 5.8442 | 450 | 0.8194 | 0.2318 | 0.8194 | 0.9052 |
| No log | 5.8701 | 452 | 0.7653 | 0.1249 | 0.7653 | 0.8748 |
| No log | 5.8961 | 454 | 0.7787 | 0.0834 | 0.7787 | 0.8824 |
| No log | 5.9221 | 456 | 0.8083 | 0.0834 | 0.8083 | 0.8991 |
| No log | 5.9481 | 458 | 0.8322 | 0.1922 | 0.8322 | 0.9123 |
| No log | 5.9740 | 460 | 0.8192 | 0.1863 | 0.8192 | 0.9051 |
| No log | 6.0 | 462 | 0.8166 | 0.1365 | 0.8166 | 0.9036 |
| No log | 6.0260 | 464 | 0.8212 | 0.1415 | 0.8212 | 0.9062 |
| No log | 6.0519 | 466 | 0.8582 | 0.0923 | 0.8582 | 0.9264 |
| No log | 6.0779 | 468 | 0.9130 | 0.0700 | 0.9130 | 0.9555 |
| No log | 6.1039 | 470 | 0.9920 | 0.0918 | 0.9920 | 0.9960 |
| No log | 6.1299 | 472 | 0.9495 | 0.0988 | 0.9495 | 0.9744 |
| No log | 6.1558 | 474 | 0.8351 | 0.1324 | 0.8351 | 0.9139 |
| No log | 6.1818 | 476 | 0.7869 | 0.1304 | 0.7869 | 0.8871 |
| No log | 6.2078 | 478 | 0.7671 | 0.1304 | 0.7671 | 0.8758 |
| No log | 6.2338 | 480 | 0.7622 | 0.2053 | 0.7622 | 0.8730 |
| No log | 6.2597 | 482 | 0.7674 | 0.2431 | 0.7674 | 0.8760 |
| No log | 6.2857 | 484 | 0.8195 | 0.0799 | 0.8195 | 0.9053 |
| No log | 6.3117 | 486 | 0.7855 | 0.2155 | 0.7855 | 0.8863 |
| No log | 6.3377 | 488 | 0.7207 | 0.1740 | 0.7207 | 0.8489 |
| No log | 6.3636 | 490 | 0.7230 | 0.1740 | 0.7230 | 0.8503 |
| No log | 6.3896 | 492 | 0.7481 | 0.2053 | 0.7481 | 0.8649 |
| No log | 6.4156 | 494 | 0.7975 | 0.1465 | 0.7975 | 0.8930 |
| No log | 6.4416 | 496 | 0.8784 | 0.1448 | 0.8784 | 0.9372 |
| No log | 6.4675 | 498 | 0.8989 | 0.1386 | 0.8989 | 0.9481 |
| 0.2854 | 6.4935 | 500 | 0.8877 | 0.1268 | 0.8877 | 0.9422 |
| 0.2854 | 6.5195 | 502 | 0.8102 | 0.0798 | 0.8102 | 0.9001 |
| 0.2854 | 6.5455 | 504 | 0.7941 | 0.0412 | 0.7941 | 0.8911 |
| 0.2854 | 6.5714 | 506 | 0.8259 | 0.0893 | 0.8259 | 0.9088 |
| 0.2854 | 6.5974 | 508 | 0.8524 | 0.1498 | 0.8524 | 0.9232 |
| 0.2854 | 6.6234 | 510 | 1.0122 | 0.0585 | 1.0122 | 1.0061 |
| 0.2854 | 6.6494 | 512 | 1.0943 | 0.0585 | 1.0943 | 1.0461 |
| 0.2854 | 6.6753 | 514 | 0.9967 | 0.0603 | 0.9967 | 0.9984 |
| 0.2854 | 6.7013 | 516 | 0.8835 | 0.1228 | 0.8835 | 0.9399 |
| 0.2854 | 6.7273 | 518 | 0.8387 | 0.0460 | 0.8387 | 0.9158 |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1