ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7491
  • Qwk: -0.0086
  • Mse: 0.7491
  • Rmse: 0.8655
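For readers unfamiliar with these metrics, the following is a minimal, self-contained sketch of how they are typically computed for an ordinal essay-scoring task: quadratic weighted kappa (Qwk) measures chance-corrected agreement between predicted and gold scores, and Rmse is simply the square root of Mse (note 0.8655² ≈ 0.7491 above). The rating range and sample data are invented for illustration; the card does not state the actual label set.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic disagreement weights over integer ratings."""
    n = max_rating - min_rating + 1
    # Confusion matrix between the two "raters" (gold vs. predicted)
    conf = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        conf[t - min_rating][p - min_rating] += 1
    num_items = len(y_true)
    # Marginal histograms of each rater's ratings
    hist_t = [sum(row) for row in conf]
    hist_p = [sum(conf[i][j] for i in range(n)) for j in range(n)]
    numer = denom = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2)  # quadratic weight: 0 on the diagonal
            expected = hist_t[i] * hist_p[j] / num_items  # chance agreement
            numer += w * conf[i][j]
            denom += w * expected
    return 1.0 - numer / denom

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Made-up scores on a hypothetical 1-3 scale:
y_true = [1, 2, 3, 1, 2]
y_pred = [1, 2, 2, 1, 3]
print(quadratic_weighted_kappa(y_true, y_pred, 1, 3))
print(mse(y_true, y_pred), math.sqrt(mse(y_true, y_pred)))  # Rmse = sqrt(Mse)
```

A Qwk near 0 (as reported above, -0.0086) indicates agreement no better than chance, so the model's ordinal predictions carry little signal despite the moderate Mse.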

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
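The `linear` scheduler decays the learning rate from its peak to zero over the total number of training steps. A minimal sketch, assuming no warmup (the card does not state warmup steps) and a total of 5700 steps (inferred from the results table, where epoch 2.0 falls at step 114, i.e. roughly 57 steps per epoch × 100 epochs):

```python
PEAK_LR = 2e-05       # learning_rate from the hyperparameters above
TOTAL_STEPS = 5700    # assumption: ~57 steps/epoch x 100 epochs, inferred from the table

def linear_lr(step, peak_lr=PEAK_LR, total_steps=TOTAL_STEPS):
    """Learning rate after `step` optimizer updates under linear decay to 0."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return peak_lr * remaining

print(linear_lr(0))     # peak rate at the start of training
print(linear_lr(2850))  # half the peak at the midpoint
print(linear_lr(5700))  # decayed to 0.0 at the final step
```

Since the logged results stop at step 670 (epoch ~11.75), training appears to have ended well before the schedule completed, so the effective learning rate never dropped much below ~88% of its peak.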

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 3.3537 0.0062 3.3537 1.8313
No log 0.0702 4 1.7732 0.0213 1.7732 1.3316
No log 0.1053 6 2.0548 0.0136 2.0548 1.4335
No log 0.1404 8 1.9176 0.0405 1.9176 1.3848
No log 0.1754 10 2.1459 0.0381 2.1459 1.4649
No log 0.2105 12 1.0550 -0.0628 1.0550 1.0271
No log 0.2456 14 0.6888 0.0460 0.6888 0.8299
No log 0.2807 16 0.7091 0.1082 0.7091 0.8421
No log 0.3158 18 0.8480 0.0748 0.8480 0.9208
No log 0.3509 20 0.8233 0.1395 0.8233 0.9074
No log 0.3860 22 0.7416 -0.0131 0.7416 0.8611
No log 0.4211 24 0.6909 -0.0644 0.6909 0.8312
No log 0.4561 26 0.7753 0.0670 0.7753 0.8805
No log 0.4912 28 0.8320 0.0476 0.8320 0.9122
No log 0.5263 30 0.9430 -0.0504 0.9430 0.9711
No log 0.5614 32 0.7374 0.0857 0.7374 0.8587
No log 0.5965 34 0.7189 0.0334 0.7189 0.8479
No log 0.6316 36 0.8687 -0.0056 0.8687 0.9320
No log 0.6667 38 0.8618 -0.0079 0.8618 0.9283
No log 0.7018 40 0.7462 0.1148 0.7462 0.8638
No log 0.7368 42 0.7798 0.1146 0.7798 0.8830
No log 0.7719 44 0.8367 -0.0774 0.8367 0.9147
No log 0.8070 46 1.0187 -0.0927 1.0187 1.0093
No log 0.8421 48 0.9851 0.0602 0.9851 0.9925
No log 0.8772 50 0.9029 0.1509 0.9029 0.9502
No log 0.9123 52 1.0639 0.0356 1.0639 1.0315
No log 0.9474 54 0.9485 0.0589 0.9485 0.9739
No log 0.9825 56 0.9216 0.0931 0.9216 0.9600
No log 1.0175 58 1.1742 0.0746 1.1742 1.0836
No log 1.0526 60 0.8584 0.0875 0.8584 0.9265
No log 1.0877 62 1.5362 0.1174 1.5362 1.2394
No log 1.1228 64 1.9536 0.0150 1.9536 1.3977
No log 1.1579 66 1.1083 0.0291 1.1083 1.0528
No log 1.1930 68 1.0028 0.0428 1.0028 1.0014
No log 1.2281 70 1.2994 0.0103 1.2994 1.1399
No log 1.2632 72 0.9971 -0.0120 0.9971 0.9985
No log 1.2982 74 0.7942 0.0660 0.7942 0.8912
No log 1.3333 76 1.0420 0.0596 1.0420 1.0208
No log 1.3684 78 0.9313 -0.0118 0.9313 0.9650
No log 1.4035 80 0.7699 0.1232 0.7699 0.8774
No log 1.4386 82 0.8960 -0.0036 0.8960 0.9466
No log 1.4737 84 0.8877 0.0189 0.8877 0.9422
No log 1.5088 86 0.9212 -0.0179 0.9212 0.9598
No log 1.5439 88 1.0590 0.0098 1.0590 1.0291
No log 1.5789 90 1.0738 0.0424 1.0738 1.0363
No log 1.6140 92 0.9016 0.0833 0.9016 0.9495
No log 1.6491 94 0.8687 0.1221 0.8687 0.9320
No log 1.6842 96 0.9119 0.0239 0.9119 0.9549
No log 1.7193 98 0.8715 0.0905 0.8715 0.9335
No log 1.7544 100 0.8486 0.0359 0.8486 0.9212
No log 1.7895 102 0.9142 0.0993 0.9142 0.9562
No log 1.8246 104 0.8816 0.0909 0.8816 0.9389
No log 1.8596 106 0.8736 -0.0443 0.8736 0.9347
No log 1.8947 108 0.9185 0.0437 0.9185 0.9584
No log 1.9298 110 0.9634 0.1174 0.9634 0.9815
No log 1.9649 112 0.9630 0.1172 0.9630 0.9813
No log 2.0 114 0.9281 0.1172 0.9281 0.9634
No log 2.0351 116 0.8901 0.0611 0.8901 0.9435
No log 2.0702 118 0.9518 0.0618 0.9518 0.9756
No log 2.1053 120 0.8771 0.0291 0.8771 0.9365
No log 2.1404 122 0.7534 0.1585 0.7534 0.8680
No log 2.1754 124 0.7847 0.0574 0.7847 0.8858
No log 2.2105 126 0.8000 0.1518 0.8000 0.8944
No log 2.2456 128 0.8377 0.1962 0.8377 0.9153
No log 2.2807 130 0.8534 0.1942 0.8534 0.9238
No log 2.3158 132 0.8880 0.0110 0.8880 0.9423
No log 2.3509 134 0.9581 0.0271 0.9581 0.9788
No log 2.3860 136 0.8505 0.1187 0.8505 0.9222
No log 2.4211 138 0.8461 0.0810 0.8461 0.9199
No log 2.4561 140 0.9106 0.0323 0.9106 0.9543
No log 2.4912 142 0.8425 0.0172 0.8425 0.9179
No log 2.5263 144 0.9381 0.0351 0.9381 0.9686
No log 2.5614 146 0.8531 0.0545 0.8531 0.9237
No log 2.5965 148 0.8523 0.0129 0.8523 0.9232
No log 2.6316 150 0.9254 0.0319 0.9254 0.9620
No log 2.6667 152 0.8452 0.0628 0.8452 0.9193
No log 2.7018 154 0.8037 0.0449 0.8037 0.8965
No log 2.7368 156 0.8302 0.0152 0.8302 0.9112
No log 2.7719 158 0.7973 0.1030 0.7973 0.8929
No log 2.8070 160 0.9494 0.0428 0.9494 0.9744
No log 2.8421 162 0.8372 0.1451 0.8372 0.9150
No log 2.8772 164 0.8434 0.0562 0.8434 0.9184
No log 2.9123 166 0.7982 -0.0704 0.7982 0.8934
No log 2.9474 168 0.8704 0.0728 0.8704 0.9330
No log 2.9825 170 0.8211 0.1494 0.8211 0.9062
No log 3.0175 172 0.7280 -0.0179 0.7280 0.8532
No log 3.0526 174 0.8601 0.1196 0.8601 0.9274
No log 3.0877 176 0.7838 -0.0179 0.7838 0.8853
No log 3.1228 178 0.9388 0.0428 0.9388 0.9689
No log 3.1579 180 0.9690 0.0758 0.9690 0.9844
No log 3.1930 182 0.8463 0.0338 0.8463 0.9200
No log 3.2281 184 0.9230 0.0421 0.9230 0.9607
No log 3.2632 186 0.8410 -0.0179 0.8410 0.9171
No log 3.2982 188 0.8318 0.0851 0.8318 0.9120
No log 3.3333 190 0.8179 0.0474 0.8179 0.9044
No log 3.3684 192 0.7655 -0.0179 0.7655 0.8749
No log 3.4035 194 0.8621 0.1291 0.8621 0.9285
No log 3.4386 196 0.8978 0.1064 0.8978 0.9475
No log 3.4737 198 0.7494 -0.0179 0.7494 0.8657
No log 3.5088 200 0.9168 0.0728 0.9168 0.9575
No log 3.5439 202 0.9406 0.1078 0.9406 0.9698
No log 3.5789 204 0.7751 0.0884 0.7751 0.8804
No log 3.6140 206 0.8175 -0.0704 0.8175 0.9042
No log 3.6491 208 0.8418 -0.0274 0.8418 0.9175
No log 3.6842 210 0.7772 0.0821 0.7772 0.8816
No log 3.7193 212 0.8507 -0.0111 0.8507 0.9224
No log 3.7544 214 0.7874 0.0454 0.7874 0.8874
No log 3.7895 216 0.7739 0.0260 0.7739 0.8797
No log 3.8246 218 0.8763 0.0017 0.8763 0.9361
No log 3.8596 220 0.8029 0.0225 0.8029 0.8961
No log 3.8947 222 0.8002 -0.0488 0.8002 0.8945
No log 3.9298 224 0.8019 0.0821 0.8019 0.8955
No log 3.9649 226 0.8248 -0.0704 0.8248 0.9082
No log 4.0 228 0.7740 0.0318 0.7740 0.8798
No log 4.0351 230 0.7797 0.0085 0.7797 0.8830
No log 4.0702 232 0.8128 0.1079 0.8128 0.9015
No log 4.1053 234 0.7814 0.0303 0.7814 0.8840
No log 4.1404 236 0.9223 0.1064 0.9223 0.9604
No log 4.1754 238 0.8348 0.0490 0.8348 0.9137
No log 4.2105 240 0.7851 0.0444 0.7851 0.8861
No log 4.2456 242 0.7926 -0.0300 0.7926 0.8903
No log 4.2807 244 0.7621 -0.0054 0.7621 0.8730
No log 4.3158 246 0.8270 0.0867 0.8270 0.9094
No log 4.3509 248 0.8990 0.1773 0.8990 0.9482
No log 4.3860 250 0.7298 0.0759 0.7298 0.8543
No log 4.4211 252 0.7452 -0.0033 0.7452 0.8632
No log 4.4561 254 0.8368 -0.0440 0.8368 0.9148
No log 4.4912 256 0.8527 -0.0536 0.8527 0.9234
No log 4.5263 258 0.8475 -0.0408 0.8475 0.9206
No log 4.5614 260 0.8583 -0.0357 0.8583 0.9265
No log 4.5965 262 0.9894 0.0364 0.9894 0.9947
No log 4.6316 264 1.0139 0.0048 1.0139 1.0069
No log 4.6667 266 0.8449 -0.0444 0.8449 0.9192
No log 4.7018 268 0.9426 0.0871 0.9426 0.9709
No log 4.7368 270 1.0981 0.0454 1.0981 1.0479
No log 4.7719 272 1.0853 0.1225 1.0853 1.0418
No log 4.8070 274 0.8180 -0.0274 0.8180 0.9044
No log 4.8421 276 0.9802 0.0758 0.9802 0.9901
No log 4.8772 278 1.2019 0.0028 1.2019 1.0963
No log 4.9123 280 0.9666 0.0465 0.9666 0.9832
No log 4.9474 282 0.7331 0.0914 0.7331 0.8562
No log 4.9825 284 0.8459 0.0043 0.8459 0.9197
No log 5.0175 286 0.9293 0.0986 0.9293 0.9640
No log 5.0526 288 0.8039 0.0099 0.8039 0.8966
No log 5.0877 290 0.7493 0.0282 0.7493 0.8656
No log 5.1228 292 0.7438 0.0355 0.7438 0.8624
No log 5.1579 294 0.7425 0.0355 0.7425 0.8617
No log 5.1930 296 0.7419 0.0355 0.7419 0.8613
No log 5.2281 298 0.7378 0.0355 0.7378 0.8589
No log 5.2632 300 0.7956 0.0214 0.7956 0.8920
No log 5.2982 302 0.8166 0.0214 0.8166 0.9037
No log 5.3333 304 0.7989 0.0375 0.7989 0.8938
No log 5.3684 306 0.7708 0.0375 0.7708 0.8780
No log 5.4035 308 0.7430 0.0355 0.7430 0.8620
No log 5.4386 310 0.7690 0.0375 0.7690 0.8769
No log 5.4737 312 0.7915 0.0375 0.7915 0.8897
No log 5.5088 314 0.7986 -0.0145 0.7986 0.8936
No log 5.5439 316 0.8525 -0.0200 0.8525 0.9233
No log 5.5789 318 0.8768 -0.0200 0.8768 0.9364
No log 5.6140 320 0.9189 0.0517 0.9189 0.9586
No log 5.6491 322 0.8910 -0.0117 0.8910 0.9439
No log 5.6842 324 0.8653 0.0361 0.8653 0.9302
No log 5.7193 326 0.8248 0.0205 0.8248 0.9082
No log 5.7544 328 0.8720 0.0517 0.8720 0.9338
No log 5.7895 330 0.8127 0.0175 0.8127 0.9015
No log 5.8246 332 0.8071 0.0454 0.8071 0.8984
No log 5.8596 334 0.8672 0.0123 0.8672 0.9313
No log 5.8947 336 0.9287 -0.0221 0.9287 0.9637
No log 5.9298 338 0.9705 0.0140 0.9705 0.9851
No log 5.9649 340 0.8957 -0.0240 0.8957 0.9464
No log 6.0 342 0.8112 0.0318 0.8112 0.9007
No log 6.0351 344 0.8022 -0.0204 0.8022 0.8957
No log 6.0702 346 0.8847 0.0159 0.8847 0.9406
No log 6.1053 348 0.8630 -0.0274 0.8630 0.9290
No log 6.1404 350 0.8381 -0.0228 0.8381 0.9155
No log 6.1754 352 0.8823 -0.0274 0.8823 0.9393
No log 6.2105 354 0.9997 -0.0777 0.9997 0.9998
No log 6.2456 356 0.9700 -0.0440 0.9700 0.9849
No log 6.2807 358 0.8249 0.0123 0.8249 0.9082
No log 6.3158 360 0.7093 0.0436 0.7093 0.8422
No log 6.3509 362 0.7780 0.1408 0.7780 0.8821
No log 6.3860 364 0.7943 0.0991 0.7943 0.8913
No log 6.4211 366 0.7587 0.0454 0.7587 0.8710
No log 6.4561 368 0.8235 0.0551 0.8235 0.9075
No log 6.4912 370 0.9125 0.1027 0.9125 0.9553
No log 6.5263 372 0.8456 0.0517 0.8456 0.9196
No log 6.5614 374 0.7887 0.0690 0.7887 0.8881
No log 6.5965 376 0.7730 0.0690 0.7730 0.8792
No log 6.6316 378 0.7546 0.0690 0.7546 0.8687
No log 6.6667 380 0.7310 -0.0179 0.7310 0.8550
No log 6.7018 382 0.7141 0.0355 0.7141 0.8450
No log 6.7368 384 0.7119 0.0355 0.7119 0.8438
No log 6.7719 386 0.7474 0.0454 0.7474 0.8645
No log 6.8070 388 0.7543 0.0776 0.7543 0.8685
No log 6.8421 390 0.7439 0.0776 0.7439 0.8625
No log 6.8772 392 0.7458 0.1196 0.7458 0.8636
No log 6.9123 394 0.7636 0.1541 0.7636 0.8739
No log 6.9474 396 0.7743 0.1048 0.7743 0.8800
No log 6.9825 398 0.7980 0.1048 0.7980 0.8933
No log 7.0175 400 0.8643 0.0309 0.8643 0.9297
No log 7.0526 402 0.8297 0.0574 0.8297 0.9109
No log 7.0877 404 0.8008 0.0236 0.8008 0.8949
No log 7.1228 406 0.7968 0.0152 0.7968 0.8926
No log 7.1579 408 0.7694 0.0680 0.7694 0.8772
No log 7.1930 410 0.7567 -0.0152 0.7567 0.8699
No log 7.2281 412 0.8606 0.0641 0.8606 0.9277
No log 7.2632 414 0.9027 0.0649 0.9027 0.9501
No log 7.2982 416 0.8105 -0.0599 0.8105 0.9003
No log 7.3333 418 0.8839 0.0504 0.8839 0.9401
No log 7.3684 420 0.8478 0.0152 0.8478 0.9208
No log 7.4035 422 0.8081 -0.0567 0.8081 0.8989
No log 7.4386 424 0.7833 0.0355 0.7833 0.8851
No log 7.4737 426 0.7635 0.0355 0.7635 0.8738
No log 7.5088 428 0.7622 0.1199 0.7622 0.8731
No log 7.5439 430 0.7682 -0.0125 0.7682 0.8765
No log 7.5789 432 0.8178 0.0545 0.8178 0.9043
No log 7.6140 434 0.7975 -0.0595 0.7975 0.8930
No log 7.6491 436 0.8413 0.0538 0.8413 0.9172
No log 7.6842 438 0.8224 0.0574 0.8224 0.9068
No log 7.7193 440 0.8076 0.0680 0.8076 0.8987
No log 7.7544 442 0.7660 -0.0595 0.7660 0.8752
No log 7.7895 444 0.8157 0.1400 0.8157 0.9032
No log 7.8246 446 0.7498 0.0973 0.7498 0.8659
No log 7.8596 448 0.7194 0.0355 0.7194 0.8482
No log 7.8947 450 0.7410 0.1148 0.7410 0.8608
No log 7.9298 452 0.7503 0.1148 0.7503 0.8662
No log 7.9649 454 0.7545 0.0247 0.7545 0.8686
No log 8.0 456 0.7446 0.0375 0.7446 0.8629
No log 8.0351 458 0.7485 0.0432 0.7485 0.8651
No log 8.0702 460 0.7358 0.0282 0.7358 0.8578
No log 8.1053 462 0.8186 0.0043 0.8186 0.9048
No log 8.1404 464 0.7970 0.0588 0.7970 0.8927
No log 8.1754 466 0.7662 0.0432 0.7662 0.8753
No log 8.2105 468 0.8086 0.1885 0.8086 0.8992
No log 8.2456 470 0.7712 0.1453 0.7712 0.8782
No log 8.2807 472 0.7376 -0.0179 0.7376 0.8588
No log 8.3158 474 0.8283 0.0456 0.8283 0.9101
No log 8.3509 476 0.8230 0.0456 0.8230 0.9072
No log 8.3860 478 0.7710 0.0690 0.7710 0.8781
No log 8.4211 480 0.7990 0.0432 0.7990 0.8939
No log 8.4561 482 0.8784 0.1977 0.8784 0.9373
No log 8.4912 484 0.8697 0.0778 0.8697 0.9326
No log 8.5263 486 0.8380 0.0749 0.8380 0.9154
No log 8.5614 488 0.7995 0.0690 0.7995 0.8942
No log 8.5965 490 0.8178 0.1047 0.8178 0.9043
No log 8.6316 492 0.7823 0.0690 0.7823 0.8845
No log 8.6667 494 0.7454 0.1196 0.7454 0.8634
No log 8.7018 496 0.7312 0.0814 0.7312 0.8551
No log 8.7368 498 0.7648 0.1495 0.7648 0.8745
0.2862 8.7719 500 0.7379 0.0247 0.7379 0.8590
0.2862 8.8070 502 0.7331 0.0914 0.7331 0.8562
0.2862 8.8421 504 0.7444 0.0454 0.7444 0.8628
0.2862 8.8772 506 0.7297 0.0914 0.7297 0.8542
0.2862 8.9123 508 0.7627 0.0723 0.7627 0.8733
0.2862 8.9474 510 0.8197 0.1687 0.8197 0.9054
0.2862 8.9825 512 0.8100 0.1633 0.8100 0.9000
0.2862 9.0175 514 0.7639 0.1048 0.7639 0.8740
0.2862 9.0526 516 0.7573 0.1003 0.7573 0.8703
0.2862 9.0877 518 0.7055 0.0814 0.7055 0.8400
0.2862 9.1228 520 0.7067 0.0814 0.7067 0.8407
0.2862 9.1579 522 0.7482 0.1048 0.7482 0.8650
0.2862 9.1930 524 0.8569 0.1105 0.8569 0.9257
0.2862 9.2281 526 0.9546 0.0891 0.9546 0.9771
0.2862 9.2632 528 0.9383 0.1373 0.9383 0.9686
0.2862 9.2982 530 0.8849 0.2118 0.8849 0.9407
0.2862 9.3333 532 0.8703 0.2580 0.8703 0.9329
0.2862 9.3684 534 0.8083 0.1550 0.8083 0.8990
0.2862 9.4035 536 0.7986 0.1190 0.7986 0.8936
0.2862 9.4386 538 0.8318 0.0711 0.8318 0.9120
0.2862 9.4737 540 0.7504 0.0680 0.7504 0.8662
0.2862 9.5088 542 0.7220 0.0814 0.7220 0.8497
0.2862 9.5439 544 0.7411 -0.0125 0.7411 0.8609
0.2862 9.5789 546 0.7213 0.0814 0.7213 0.8493
0.2862 9.6140 548 0.7504 0.1148 0.7504 0.8663
0.2862 9.6491 550 0.7631 0.1148 0.7631 0.8736
0.2862 9.6842 552 0.7667 0.1148 0.7667 0.8756
0.2862 9.7193 554 0.7659 0.0247 0.7659 0.8752
0.2862 9.7544 556 0.7549 0.0247 0.7549 0.8688
0.2862 9.7895 558 0.7594 0.0680 0.7594 0.8714
0.2862 9.8246 560 0.7546 0.0247 0.7546 0.8687
0.2862 9.8596 562 0.7402 -0.0204 0.7402 0.8604
0.2862 9.8947 564 0.7332 0.0680 0.7332 0.8562
0.2862 9.9298 566 0.7491 0.0247 0.7491 0.8655
0.2862 9.9649 568 0.7680 -0.0179 0.7680 0.8763
0.2862 10.0 570 0.7582 0.0282 0.7582 0.8708
0.2862 10.0351 572 0.7368 -0.0179 0.7368 0.8584
0.2862 10.0702 574 0.7434 0.0471 0.7434 0.8622
0.2862 10.1053 576 0.7460 0.0471 0.7460 0.8637
0.2862 10.1404 578 0.7431 -0.0179 0.7431 0.8620
0.2862 10.1754 580 0.7740 0.0680 0.7740 0.8798
0.2862 10.2105 582 0.7809 0.0680 0.7809 0.8837
0.2862 10.2456 584 0.7769 0.0236 0.7769 0.8814
0.2862 10.2807 586 0.7593 0.0236 0.7593 0.8714
0.2862 10.3158 588 0.7571 0.1148 0.7571 0.8701
0.2862 10.3509 590 0.7729 0.0680 0.7729 0.8791
0.2862 10.3860 592 0.7709 0.0236 0.7709 0.8780
0.2862 10.4211 594 0.7625 -0.0145 0.7625 0.8732
0.2862 10.4561 596 0.7425 0.0318 0.7425 0.8617
0.2862 10.4912 598 0.7428 0.0355 0.7428 0.8619
0.2862 10.5263 600 0.7535 0.0449 0.7535 0.8681
0.2862 10.5614 602 0.7502 0.0 0.7502 0.8662
0.2862 10.5965 604 0.7437 0.0 0.7437 0.8624
0.2862 10.6316 606 0.7264 0.0395 0.7264 0.8523
0.2862 10.6667 608 0.7928 0.1097 0.7928 0.8904
0.2862 10.7018 610 0.8106 0.0639 0.8106 0.9003
0.2862 10.7368 612 0.7659 0.1199 0.7659 0.8752
0.2862 10.7719 614 0.7634 -0.0113 0.7634 0.8737
0.2862 10.8070 616 0.7905 0.0639 0.7905 0.8891
0.2862 10.8421 618 0.9516 0.1311 0.9516 0.9755
0.2862 10.8772 620 1.0016 0.1186 1.0016 1.0008
0.2862 10.9123 622 0.8703 -0.0008 0.8703 0.9329
0.2862 10.9474 624 0.7393 0.0318 0.7393 0.8598
0.2862 10.9825 626 0.7629 -0.0030 0.7629 0.8734
0.2862 11.0175 628 0.7999 0.0558 0.7999 0.8944
0.2862 11.0526 630 0.7655 -0.0030 0.7655 0.8749
0.2862 11.0877 632 0.7725 0.0152 0.7725 0.8789
0.2862 11.1228 634 0.9032 0.0651 0.9032 0.9504
0.2862 11.1579 636 0.9931 0.1311 0.9931 0.9966
0.2862 11.1930 638 0.9230 -0.0409 0.9230 0.9607
0.2862 11.2281 640 0.8660 0.0246 0.8660 0.9306
0.2862 11.2632 642 0.9151 0.1605 0.9151 0.9566
0.2862 11.2982 644 0.8818 0.0196 0.8818 0.9390
0.2862 11.3333 646 0.8331 0.0257 0.8331 0.9127
0.2862 11.3684 648 0.8795 -0.0355 0.8795 0.9378
0.2862 11.4035 650 0.8689 -0.0723 0.8689 0.9321
0.2862 11.4386 652 0.8632 0.0361 0.8632 0.9291
0.2862 11.4737 654 0.8476 0.0376 0.8476 0.9207
0.2862 11.5088 656 0.8144 0.0257 0.8144 0.9025
0.2862 11.5439 658 0.8314 0.0512 0.8314 0.9118
0.2862 11.5789 660 0.8468 0.0476 0.8468 0.9202
0.2862 11.6140 662 0.8262 0.0146 0.8262 0.9090
0.2862 11.6491 664 0.8027 -0.0103 0.8027 0.8959
0.2862 11.6842 666 0.8356 0.0119 0.8356 0.9141
0.2862 11.7193 668 0.8207 -0.0354 0.8207 0.9059
0.2862 11.7544 670 0.7491 -0.0086 0.7491 0.8655

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1