ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the training data is not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.8048
  • Qwk: -0.0462
  • Mse: 0.8048
  • Rmse: 0.8971
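
For reference, all three evaluation metrics can be computed from raw predictions with scikit-learn. The snippet below is a minimal sketch; `y_true` and `y_pred` are illustrative placeholders with made-up values, not names from the actual training script.

```python
# Minimal sketch: computing Qwk, Mse, and Rmse as reported above.
# y_true / y_pred are illustrative placeholders, not from the training code.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 2, 0])  # gold organization scores (example values)
y_pred = np.array([2, 2, 1, 3, 1])  # model predictions rounded to integer labels

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```

Note that a Qwk near zero (or negative, as here) indicates agreement with the gold scores no better than chance.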

Model description

More information needed

Intended uses & limitations

More information needed
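
In the absence of fuller documentation, the sketch below shows one plausible way to run inference. It assumes the checkpoint loads as a standard sequence-classification head with a single regression output (suggested, but not confirmed, by the MSE/RMSE metrics); verify against the checkpoint's `config.json` before relying on it.

```python
# Hypothetical inference sketch -- assumes a single-output regression head,
# which this card does not confirm; check the checkpoint's config.json.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
            "FineTuningAraBERT_run2_AugV5_k13_task3_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```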

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
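
The optimizer betas and epsilon listed above are the `transformers` AdamW defaults, so a run with these settings reduces to a standard `Trainer` setup. The sketch below is a hedged reproduction: dataset plumbing and `compute_metrics` are omitted because the training data is not documented, and the single-label regression head is an assumption.

```python
# Reproduction sketch for the hyperparameters listed above.
# Dataset loading and compute_metrics are omitted (training data undocumented);
# num_labels=1 (regression) is an assumption, not confirmed by this card.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)

args = TrainingArguments(
    output_dir="arabert_task3_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",   # Adam betas/epsilon left at their defaults
    num_train_epochs=100,
    eval_strategy="steps",        # the results table shows an eval every 2 steps
    eval_steps=2,
)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=..., eval_dataset=...,
#                   tokenizer=tokenizer)
# trainer.train()
```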

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 0.0299 | 2 | 3.5680 | -0.0154 | 3.5680 | 1.8889 |
| No log | 0.0597 | 4 | 1.6444 | 0.0213 | 1.6444 | 1.2823 |
| No log | 0.0896 | 6 | 1.1836 | 0.0016 | 1.1836 | 1.0879 |
| No log | 0.1194 | 8 | 1.6746 | 0.0406 | 1.6746 | 1.2941 |
| No log | 0.1493 | 10 | 1.1763 | 0.0065 | 1.1763 | 1.0846 |
| No log | 0.1791 | 12 | 0.8671 | 0.1798 | 0.8671 | 0.9312 |
| No log | 0.2090 | 14 | 0.8702 | 0.0748 | 0.8702 | 0.9328 |
| No log | 0.2388 | 16 | 1.1775 | 0.0137 | 1.1775 | 1.0851 |
| No log | 0.2687 | 18 | 0.9431 | -0.0143 | 0.9431 | 0.9711 |
| No log | 0.2985 | 20 | 0.6933 | -0.0662 | 0.6933 | 0.8327 |
| No log | 0.3284 | 22 | 0.6936 | -0.0035 | 0.6936 | 0.8328 |
| No log | 0.3582 | 24 | 0.7286 | 0.0 | 0.7286 | 0.8536 |
| No log | 0.3881 | 26 | 0.7439 | -0.0215 | 0.7439 | 0.8625 |
| No log | 0.4179 | 28 | 1.3609 | 0.0767 | 1.3609 | 1.1666 |
| No log | 0.4478 | 30 | 0.9674 | -0.0532 | 0.9674 | 0.9836 |
| No log | 0.4776 | 32 | 0.8155 | 0.0673 | 0.8155 | 0.9031 |
| No log | 0.5075 | 34 | 0.8718 | -0.1834 | 0.8718 | 0.9337 |
| No log | 0.5373 | 36 | 0.7830 | 0.1304 | 0.7830 | 0.8848 |
| No log | 0.5672 | 38 | 0.7606 | 0.0296 | 0.7606 | 0.8721 |
| No log | 0.5970 | 40 | 0.7764 | 0.0089 | 0.7764 | 0.8812 |
| No log | 0.6269 | 42 | 0.7586 | 0.0031 | 0.7586 | 0.8710 |
| No log | 0.6567 | 44 | 0.7937 | 0.1196 | 0.7937 | 0.8909 |
| No log | 0.6866 | 46 | 0.9949 | 0.0458 | 0.9949 | 0.9974 |
| No log | 0.7164 | 48 | 0.7380 | 0.0099 | 0.7380 | 0.8591 |
| No log | 0.7463 | 50 | 0.7823 | 0.0633 | 0.7823 | 0.8845 |
| No log | 0.7761 | 52 | 0.8634 | 0.0734 | 0.8634 | 0.9292 |
| No log | 0.8060 | 54 | 0.8149 | 0.1538 | 0.8149 | 0.9027 |
| No log | 0.8358 | 56 | 0.8853 | 0.2063 | 0.8853 | 0.9409 |
| No log | 0.8657 | 58 | 0.9396 | 0.1514 | 0.9396 | 0.9693 |
| No log | 0.8955 | 60 | 0.9318 | 0.1558 | 0.9318 | 0.9653 |
| No log | 0.9254 | 62 | 0.9672 | 0.0401 | 0.9672 | 0.9835 |
| No log | 0.9552 | 64 | 1.0011 | -0.0373 | 1.0011 | 1.0005 |
| No log | 0.9851 | 66 | 0.8579 | 0.1581 | 0.8579 | 0.9262 |
| No log | 1.0149 | 68 | 0.8406 | 0.0502 | 0.8406 | 0.9168 |
| No log | 1.0448 | 70 | 0.8351 | 0.0513 | 0.8351 | 0.9138 |
| No log | 1.0746 | 72 | 0.8924 | 0.2214 | 0.8924 | 0.9447 |
| No log | 1.1045 | 74 | 1.0788 | 0.0713 | 1.0788 | 1.0387 |
| No log | 1.1343 | 76 | 0.9546 | 0.1626 | 0.9546 | 0.9770 |
| No log | 1.1642 | 78 | 1.0358 | 0.0856 | 1.0358 | 1.0178 |
| No log | 1.1940 | 80 | 1.1825 | 0.1399 | 1.1825 | 1.0874 |
| No log | 1.2239 | 82 | 0.9512 | 0.1455 | 0.9512 | 0.9753 |
| No log | 1.2537 | 84 | 1.2545 | 0.1259 | 1.2545 | 1.1201 |
| No log | 1.2836 | 86 | 1.0652 | 0.0312 | 1.0652 | 1.0321 |
| No log | 1.3134 | 88 | 1.0244 | 0.0552 | 1.0244 | 1.0121 |
| No log | 1.3433 | 90 | 1.1163 | 0.0673 | 1.1163 | 1.0566 |
| No log | 1.3731 | 92 | 1.1524 | 0.0468 | 1.1524 | 1.0735 |
| No log | 1.4030 | 94 | 0.7764 | 0.1139 | 0.7764 | 0.8811 |
| No log | 1.4328 | 96 | 1.0146 | 0.0873 | 1.0146 | 1.0073 |
| No log | 1.4627 | 98 | 0.8529 | 0.0917 | 0.8529 | 0.9236 |
| No log | 1.4925 | 100 | 1.0532 | 0.0169 | 1.0532 | 1.0262 |
| No log | 1.5224 | 102 | 1.2446 | 0.0824 | 1.2446 | 1.1156 |
| No log | 1.5522 | 104 | 0.8978 | 0.0469 | 0.8978 | 0.9475 |
| No log | 1.5821 | 106 | 1.2015 | -0.0218 | 1.2015 | 1.0961 |
| No log | 1.6119 | 108 | 1.0247 | 0.0233 | 1.0247 | 1.0123 |
| No log | 1.6418 | 110 | 0.9145 | 0.0962 | 0.9145 | 0.9563 |
| No log | 1.6716 | 112 | 1.5204 | 0.0389 | 1.5204 | 1.2331 |
| No log | 1.7015 | 114 | 1.2619 | 0.0044 | 1.2619 | 1.1234 |
| No log | 1.7313 | 116 | 0.8481 | -0.0442 | 0.8481 | 0.9209 |
| No log | 1.7612 | 118 | 0.7891 | -0.0425 | 0.7891 | 0.8883 |
| No log | 1.7910 | 120 | 1.0247 | 0.0260 | 1.0247 | 1.0123 |
| No log | 1.8209 | 122 | 0.8444 | 0.0609 | 0.8444 | 0.9189 |
| No log | 1.8507 | 124 | 0.6949 | 0.0454 | 0.6949 | 0.8336 |
| No log | 1.8806 | 126 | 0.9701 | 0.0205 | 0.9701 | 0.9850 |
| No log | 1.9104 | 128 | 1.1290 | 0.0247 | 1.1290 | 1.0626 |
| No log | 1.9403 | 130 | 0.9984 | 0.0885 | 0.9984 | 0.9992 |
| No log | 1.9701 | 132 | 1.0445 | 0.1602 | 1.0445 | 1.0220 |
| No log | 2.0 | 134 | 1.0743 | 0.1750 | 1.0743 | 1.0365 |
| No log | 2.0299 | 136 | 1.2683 | 0.0599 | 1.2683 | 1.1262 |
| No log | 2.0597 | 138 | 1.4399 | 0.1278 | 1.4399 | 1.1999 |
| No log | 2.0896 | 140 | 1.2112 | 0.0369 | 1.2112 | 1.1006 |
| No log | 2.1194 | 142 | 1.0215 | 0.0581 | 1.0215 | 1.0107 |
| No log | 2.1493 | 144 | 0.9641 | 0.1205 | 0.9641 | 0.9819 |
| No log | 2.1791 | 146 | 0.9218 | 0.1166 | 0.9218 | 0.9601 |
| No log | 2.2090 | 148 | 0.9888 | 0.0952 | 0.9888 | 0.9944 |
| No log | 2.2388 | 150 | 1.0082 | 0.0686 | 1.0082 | 1.0041 |
| No log | 2.2687 | 152 | 0.8678 | 0.1942 | 0.8678 | 0.9316 |
| No log | 2.2985 | 154 | 0.8768 | 0.0608 | 0.8768 | 0.9364 |
| No log | 2.3284 | 156 | 0.8512 | 0.2353 | 0.8512 | 0.9226 |
| No log | 2.3582 | 158 | 0.9543 | 0.0451 | 0.9543 | 0.9769 |
| No log | 2.3881 | 160 | 0.9323 | 0.0728 | 0.9323 | 0.9655 |
| No log | 2.4179 | 162 | 0.8620 | 0.1139 | 0.8620 | 0.9284 |
| No log | 2.4478 | 164 | 1.0708 | 0.0257 | 1.0708 | 1.0348 |
| No log | 2.4776 | 166 | 0.9858 | 0.0015 | 0.9858 | 0.9929 |
| No log | 2.5075 | 168 | 0.8949 | -0.0208 | 0.8949 | 0.9460 |
| No log | 2.5373 | 170 | 1.2252 | 0.0596 | 1.2252 | 1.1069 |
| No log | 2.5672 | 172 | 1.1456 | 0.0280 | 1.1456 | 1.0703 |
| No log | 2.5970 | 174 | 0.7914 | 0.0129 | 0.7914 | 0.8896 |
| No log | 2.6269 | 176 | 0.8917 | -0.0056 | 0.8917 | 0.9443 |
| No log | 2.6567 | 178 | 1.0285 | 0.0741 | 1.0285 | 1.0142 |
| No log | 2.6866 | 180 | 0.8255 | 0.0377 | 0.8255 | 0.9085 |
| No log | 2.7164 | 182 | 0.6914 | 0.0909 | 0.6914 | 0.8315 |
| No log | 2.7463 | 184 | 0.7408 | -0.0345 | 0.7408 | 0.8607 |
| No log | 2.7761 | 186 | 0.7369 | 0.0428 | 0.7369 | 0.8584 |
| No log | 2.8060 | 188 | 0.7818 | 0.0282 | 0.7818 | 0.8842 |
| No log | 2.8358 | 190 | 0.8498 | -0.0326 | 0.8498 | 0.9218 |
| No log | 2.8657 | 192 | 1.2512 | 0.0534 | 1.2512 | 1.1186 |
| No log | 2.8955 | 194 | 1.2237 | 0.0816 | 1.2237 | 1.1062 |
| No log | 2.9254 | 196 | 0.9238 | 0.0518 | 0.9238 | 0.9611 |
| No log | 2.9552 | 198 | 0.9313 | 0.0831 | 0.9313 | 0.9650 |
| No log | 2.9851 | 200 | 0.8786 | 0.0953 | 0.8786 | 0.9374 |
| No log | 3.0149 | 202 | 0.8186 | 0.1097 | 0.8186 | 0.9048 |
| No log | 3.0448 | 204 | 0.7695 | -0.0859 | 0.7695 | 0.8772 |
| No log | 3.0746 | 206 | 0.8565 | 0.0794 | 0.8565 | 0.9255 |
| No log | 3.1045 | 208 | 0.8327 | 0.0377 | 0.8327 | 0.9125 |
| No log | 3.1343 | 210 | 0.7467 | -0.0912 | 0.7467 | 0.8641 |
| No log | 3.1642 | 212 | 0.7985 | 0.0512 | 0.7985 | 0.8936 |
| No log | 3.1940 | 214 | 0.8127 | 0.0175 | 0.8127 | 0.9015 |
| No log | 3.2239 | 216 | 0.8582 | 0.0957 | 0.8582 | 0.9264 |
| No log | 3.2537 | 218 | 1.1063 | 0.0767 | 1.1063 | 1.0518 |
| No log | 3.2836 | 220 | 1.1935 | 0.0527 | 1.1935 | 1.0925 |
| No log | 3.3134 | 222 | 0.9986 | 0.0478 | 0.9986 | 0.9993 |
| No log | 3.3433 | 224 | 0.8547 | 0.1240 | 0.8547 | 0.9245 |
| No log | 3.3731 | 226 | 0.8250 | 0.1259 | 0.8250 | 0.9083 |
| No log | 3.4030 | 228 | 0.8184 | 0.0866 | 0.8184 | 0.9047 |
| No log | 3.4328 | 230 | 0.7949 | 0.0757 | 0.7949 | 0.8916 |
| No log | 3.4627 | 232 | 0.7716 | 0.0257 | 0.7716 | 0.8784 |
| No log | 3.4925 | 234 | 0.7345 | 0.0922 | 0.7345 | 0.8570 |
| No log | 3.5224 | 236 | 0.7565 | 0.0148 | 0.7565 | 0.8698 |
| No log | 3.5522 | 238 | 0.7195 | -0.0571 | 0.7195 | 0.8482 |
| No log | 3.5821 | 240 | 0.7053 | 0.1902 | 0.7053 | 0.8398 |
| No log | 3.6119 | 242 | 0.7550 | 0.1097 | 0.7550 | 0.8689 |
| No log | 3.6418 | 244 | 0.8190 | 0.0953 | 0.8190 | 0.9050 |
| No log | 3.6716 | 246 | 0.7755 | 0.0303 | 0.7755 | 0.8806 |
| No log | 3.7015 | 248 | 0.7841 | -0.0550 | 0.7841 | 0.8855 |
| No log | 3.7313 | 250 | 0.7891 | -0.0029 | 0.7891 | 0.8883 |
| No log | 3.7612 | 252 | 0.8234 | 0.0200 | 0.8234 | 0.9074 |
| No log | 3.7910 | 254 | 0.8213 | 0.0606 | 0.8213 | 0.9062 |
| No log | 3.8209 | 256 | 0.8377 | 0.0952 | 0.8377 | 0.9152 |
| No log | 3.8507 | 258 | 0.8893 | 0.0365 | 0.8893 | 0.9430 |
| No log | 3.8806 | 260 | 0.9969 | -0.0410 | 0.9969 | 0.9984 |
| No log | 3.9104 | 262 | 1.0098 | -0.0449 | 1.0098 | 1.0049 |
| No log | 3.9403 | 264 | 0.8702 | 0.0657 | 0.8702 | 0.9328 |
| No log | 3.9701 | 266 | 0.9392 | 0.0348 | 0.9392 | 0.9691 |
| No log | 4.0 | 268 | 0.9810 | 0.0320 | 0.9810 | 0.9904 |
| No log | 4.0299 | 270 | 0.8458 | 0.1529 | 0.8458 | 0.9197 |
| No log | 4.0597 | 272 | 0.9461 | -0.0545 | 0.9461 | 0.9727 |
| No log | 4.0896 | 274 | 1.0053 | -0.0802 | 1.0053 | 1.0026 |
| No log | 4.1194 | 276 | 0.8753 | -0.0424 | 0.8753 | 0.9356 |
| No log | 4.1493 | 278 | 0.8133 | 0.0535 | 0.8133 | 0.9019 |
| No log | 4.1791 | 280 | 0.8083 | 0.1561 | 0.8083 | 0.8991 |
| No log | 4.2090 | 282 | 0.8577 | 0.0514 | 0.8577 | 0.9261 |
| No log | 4.2388 | 284 | 0.9033 | 0.0301 | 0.9033 | 0.9504 |
| No log | 4.2687 | 286 | 0.8663 | 0.0185 | 0.8663 | 0.9308 |
| No log | 4.2985 | 288 | 0.8154 | 0.1138 | 0.8154 | 0.9030 |
| No log | 4.3284 | 290 | 0.7972 | 0.0893 | 0.7972 | 0.8929 |
| No log | 4.3582 | 292 | 0.7783 | 0.1674 | 0.7783 | 0.8822 |
| No log | 4.3881 | 294 | 0.8570 | 0.1379 | 0.8570 | 0.9257 |
| No log | 4.4179 | 296 | 1.0915 | 0.0762 | 1.0915 | 1.0448 |
| No log | 4.4478 | 298 | 1.0041 | 0.0725 | 1.0041 | 1.0020 |
| No log | 4.4776 | 300 | 0.8450 | 0.1132 | 0.8450 | 0.9192 |
| No log | 4.5075 | 302 | 0.9243 | -0.0056 | 0.9243 | 0.9614 |
| No log | 4.5373 | 304 | 0.9198 | -0.0056 | 0.9198 | 0.9590 |
| No log | 4.5672 | 306 | 0.7944 | 0.1903 | 0.7944 | 0.8913 |
| No log | 4.5970 | 308 | 1.0276 | 0.0767 | 1.0276 | 1.0137 |
| No log | 4.6269 | 310 | 1.2167 | 0.1393 | 1.2167 | 1.1030 |
| No log | 4.6567 | 312 | 1.0396 | 0.0848 | 1.0396 | 1.0196 |
| No log | 4.6866 | 314 | 0.7510 | 0.1803 | 0.7510 | 0.8666 |
| No log | 4.7164 | 316 | 0.8147 | 0.0999 | 0.8147 | 0.9026 |
| No log | 4.7463 | 318 | 0.9215 | 0.0333 | 0.9215 | 0.9600 |
| No log | 4.7761 | 320 | 0.8512 | 0.0999 | 0.8512 | 0.9226 |
| No log | 4.8060 | 322 | 0.7798 | 0.1277 | 0.7798 | 0.8831 |
| No log | 4.8358 | 324 | 0.9021 | 0.0366 | 0.9021 | 0.9498 |
| No log | 4.8657 | 326 | 0.8910 | 0.0665 | 0.8910 | 0.9439 |
| No log | 4.8955 | 328 | 0.7732 | 0.1702 | 0.7732 | 0.8793 |
| No log | 4.9254 | 330 | 0.7585 | 0.1722 | 0.7585 | 0.8709 |
| No log | 4.9552 | 332 | 0.7458 | 0.2181 | 0.7458 | 0.8636 |
| No log | 4.9851 | 334 | 0.7379 | 0.1387 | 0.7379 | 0.8590 |
| No log | 5.0149 | 336 | 0.7407 | 0.1379 | 0.7407 | 0.8606 |
| No log | 5.0448 | 338 | 0.7654 | 0.1786 | 0.7654 | 0.8749 |
| No log | 5.0746 | 340 | 0.8203 | 0.1309 | 0.8203 | 0.9057 |
| No log | 5.1045 | 342 | 0.8434 | 0.1684 | 0.8434 | 0.9184 |
| No log | 5.1343 | 344 | 0.7879 | 0.0590 | 0.7879 | 0.8877 |
| No log | 5.1642 | 346 | 0.7612 | 0.1148 | 0.7612 | 0.8725 |
| No log | 5.1940 | 348 | 0.8101 | 0.0909 | 0.8101 | 0.9001 |
| No log | 5.2239 | 350 | 0.7111 | 0.1691 | 0.7111 | 0.8432 |
| No log | 5.2537 | 352 | 0.7555 | 0.0135 | 0.7555 | 0.8692 |
| No log | 5.2836 | 354 | 0.7627 | -0.0705 | 0.7627 | 0.8733 |
| No log | 5.3134 | 356 | 0.6918 | 0.0863 | 0.6918 | 0.8318 |
| No log | 5.3433 | 358 | 0.7236 | 0.1965 | 0.7236 | 0.8506 |
| No log | 5.3731 | 360 | 0.7637 | 0.1899 | 0.7637 | 0.8739 |
| No log | 5.4030 | 362 | 0.7278 | 0.2180 | 0.7278 | 0.8531 |
| No log | 5.4328 | 364 | 0.8299 | 0.0265 | 0.8299 | 0.9110 |
| No log | 5.4627 | 366 | 0.8922 | 0.0404 | 0.8922 | 0.9446 |
| No log | 5.4925 | 368 | 0.7777 | 0.0535 | 0.7777 | 0.8819 |
| No log | 5.5224 | 370 | 0.7105 | 0.1758 | 0.7105 | 0.8429 |
| No log | 5.5522 | 372 | 0.6954 | 0.0814 | 0.6954 | 0.8339 |
| No log | 5.5821 | 374 | 0.6868 | 0.0814 | 0.6868 | 0.8287 |
| No log | 5.6119 | 376 | 0.7016 | 0.1318 | 0.7016 | 0.8376 |
| No log | 5.6418 | 378 | 0.7482 | 0.1675 | 0.7482 | 0.8650 |
| No log | 5.6716 | 380 | 0.7766 | 0.1689 | 0.7766 | 0.8812 |
| No log | 5.7015 | 382 | 0.8751 | 0.0900 | 0.8751 | 0.9355 |
| No log | 5.7313 | 384 | 0.8996 | 0.0606 | 0.8996 | 0.9485 |
| No log | 5.7612 | 386 | 0.8329 | 0.0569 | 0.8329 | 0.9126 |
| No log | 5.7910 | 388 | 0.7786 | 0.0441 | 0.7786 | 0.8824 |
| No log | 5.8209 | 390 | 0.7693 | 0.1758 | 0.7693 | 0.8771 |
| No log | 5.8507 | 392 | 0.7980 | 0.1449 | 0.7980 | 0.8933 |
| No log | 5.8806 | 394 | 0.7453 | 0.2180 | 0.7453 | 0.8633 |
| No log | 5.9104 | 396 | 0.7250 | 0.0863 | 0.7250 | 0.8515 |
| No log | 5.9403 | 398 | 0.7704 | 0.0 | 0.7704 | 0.8777 |
| No log | 5.9701 | 400 | 0.7703 | -0.0027 | 0.7703 | 0.8777 |
| No log | 6.0 | 402 | 0.7771 | 0.1758 | 0.7771 | 0.8815 |
| No log | 6.0299 | 404 | 0.8415 | 0.0909 | 0.8415 | 0.9173 |
| No log | 6.0597 | 406 | 0.8889 | 0.0748 | 0.8889 | 0.9428 |
| No log | 6.0896 | 408 | 0.9832 | 0.0465 | 0.9832 | 0.9916 |
| No log | 6.1194 | 410 | 0.8827 | 0.1239 | 0.8827 | 0.9395 |
| No log | 6.1493 | 412 | 0.7942 | 0.0861 | 0.7942 | 0.8912 |
| No log | 6.1791 | 414 | 0.9237 | 0.0403 | 0.9237 | 0.9611 |
| No log | 6.2090 | 416 | 0.9098 | 0.0121 | 0.9098 | 0.9539 |
| No log | 6.2388 | 418 | 0.7911 | 0.0628 | 0.7911 | 0.8894 |
| No log | 6.2687 | 420 | 0.7236 | 0.0338 | 0.7236 | 0.8506 |
| No log | 6.2985 | 422 | 0.7199 | 0.1254 | 0.7199 | 0.8485 |
| No log | 6.3284 | 424 | 0.7428 | 0.1244 | 0.7428 | 0.8619 |
| No log | 6.3582 | 426 | 0.8215 | 0.0879 | 0.8215 | 0.9064 |
| No log | 6.3881 | 428 | 0.8630 | -0.0161 | 0.8630 | 0.9290 |
| No log | 6.4179 | 430 | 0.8418 | 0.1215 | 0.8418 | 0.9175 |
| No log | 6.4478 | 432 | 0.8468 | 0.2572 | 0.8468 | 0.9202 |
| No log | 6.4776 | 434 | 0.8745 | 0.0888 | 0.8745 | 0.9352 |
| No log | 6.5075 | 436 | 0.8021 | 0.1675 | 0.8021 | 0.8956 |
| No log | 6.5373 | 438 | 0.7691 | 0.1254 | 0.7691 | 0.8770 |
| No log | 6.5672 | 440 | 0.7576 | 0.1292 | 0.7576 | 0.8704 |
| No log | 6.5970 | 442 | 0.7963 | 0.0991 | 0.7963 | 0.8924 |
| No log | 6.6269 | 444 | 0.8156 | 0.0291 | 0.8156 | 0.9031 |
| No log | 6.6567 | 446 | 0.7628 | -0.0387 | 0.7628 | 0.8734 |
| No log | 6.6866 | 448 | 0.7003 | 0.0909 | 0.7003 | 0.8368 |
| No log | 6.7164 | 450 | 0.6962 | 0.0857 | 0.6962 | 0.8344 |
| No log | 6.7463 | 452 | 0.7435 | 0.0922 | 0.7435 | 0.8623 |
| No log | 6.7761 | 454 | 0.8124 | 0.0580 | 0.8124 | 0.9014 |
| No log | 6.8060 | 456 | 0.7877 | 0.0455 | 0.7877 | 0.8875 |
| No log | 6.8358 | 458 | 0.7704 | 0.0723 | 0.7704 | 0.8777 |
| No log | 6.8657 | 460 | 0.8106 | 0.1342 | 0.8106 | 0.9003 |
| No log | 6.8955 | 462 | 0.7737 | 0.1047 | 0.7737 | 0.8796 |
| No log | 6.9254 | 464 | 0.7489 | 0.0814 | 0.7489 | 0.8654 |
| No log | 6.9552 | 466 | 0.7863 | -0.0389 | 0.7863 | 0.8867 |
| No log | 6.9851 | 468 | 0.7859 | -0.0389 | 0.7859 | 0.8865 |
| No log | 7.0149 | 470 | 0.7716 | 0.0869 | 0.7716 | 0.8784 |
| No log | 7.0448 | 472 | 0.8161 | 0.1449 | 0.8161 | 0.9034 |
| No log | 7.0746 | 474 | 0.8140 | 0.1096 | 0.8140 | 0.9022 |
| No log | 7.1045 | 476 | 0.8587 | 0.0203 | 0.8587 | 0.9267 |
| No log | 7.1343 | 478 | 0.9440 | 0.0025 | 0.9440 | 0.9716 |
| No log | 7.1642 | 480 | 0.9617 | 0.0078 | 0.9617 | 0.9807 |
| No log | 7.1940 | 482 | 0.8392 | 0.0196 | 0.8392 | 0.9161 |
| No log | 7.2239 | 484 | 0.7746 | 0.1254 | 0.7746 | 0.8801 |
| No log | 7.2537 | 486 | 0.7945 | 0.0588 | 0.7945 | 0.8914 |
| No log | 7.2836 | 488 | 0.8030 | 0.0588 | 0.8030 | 0.8961 |
| No log | 7.3134 | 490 | 0.8027 | -0.0462 | 0.8027 | 0.8959 |
| No log | 7.3433 | 492 | 0.8253 | 0.1294 | 0.8253 | 0.9085 |
| No log | 7.3731 | 494 | 0.8031 | 0.0074 | 0.8031 | 0.8962 |
| No log | 7.4030 | 496 | 0.7567 | 0.0863 | 0.7567 | 0.8699 |
| No log | 7.4328 | 498 | 0.7661 | 0.0670 | 0.7661 | 0.8753 |
| 0.3253 | 7.4627 | 500 | 0.8146 | 0.0953 | 0.8146 | 0.9026 |
| 0.3253 | 7.4925 | 502 | 0.8101 | 0.0588 | 0.8101 | 0.9000 |
| 0.3253 | 7.5224 | 504 | 0.8173 | 0.0 | 0.8173 | 0.9041 |
| 0.3253 | 7.5522 | 506 | 0.8406 | 0.0526 | 0.8406 | 0.9169 |
| 0.3253 | 7.5821 | 508 | 0.8108 | -0.0391 | 0.8108 | 0.9004 |
| 0.3253 | 7.6119 | 510 | 0.8048 | -0.0462 | 0.8048 | 0.8971 |

"No log" in the training-loss column means the Trainer had not yet logged a training loss at that evaluation step; the first logged value (0.3253) appears at step 500.

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1