ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (see the metric sketch after the list):

  • Loss: 1.0681
  • Qwk: 0.0735
  • Mse: 1.0681
  • Rmse: 1.0335
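The card does not document its metric code, so the following is only a hedged sketch of how Qwk (quadratic weighted kappa), Mse, and Rmse are typically computed for essay-scoring fine-tunes like this one; rounding the model outputs to integer scores before the kappa computation is an assumption.

```python
# Hedged metric sketch: Qwk as Cohen's kappa with quadratic weights,
# Mse/Rmse as standard regression errors. Rounding predictions to
# integer scores is an assumption, not documented in the card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        labels.round().astype(int),
        preds.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```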

Model description

More information needed

Intended uses & limitations

More information needed
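Since this section is empty, the following is only a hypothetical usage sketch: it loads the checkpoint as a single-output scorer, which is inferred from Loss equaling Mse in the results above rather than stated anywhere in the card.

```python
# Hypothetical inference sketch; the regression-style head is inferred
# from the card's metrics (Loss == Mse), not a documented fact.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = (
    "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_"
    "FineTuningAraBERT_run3_AugV5_k3_task3_organization"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```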

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the Trainer sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
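These values map directly onto the Transformers Trainer API. The sketch below reproduces them; only the listed values come from the card, while the output directory, the regression-style head (num_labels=1), and the omitted datasets are placeholder assumptions.

```python
# Hedged Trainer sketch reproducing the listed hyperparameters.
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # regression-style head, assumed from the Mse-based loss
)

args = TrainingArguments(
    output_dir="arabert-task3-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

trainer = Trainer(model=model, args=args)  # train/eval datasets omitted here
# trainer.train()
```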

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.25 2 3.5901 0.0035 3.5901 1.8948
No log 0.5 4 1.9497 0.0772 1.9497 1.3963
No log 0.75 6 1.5506 0.0235 1.5506 1.2452
No log 1.0 8 1.1802 -0.1288 1.1802 1.0864
No log 1.25 10 1.0615 -0.0200 1.0615 1.0303
No log 1.5 12 1.1837 -0.0628 1.1837 1.0880
No log 1.75 14 1.2978 0.0543 1.2978 1.1392
No log 2.0 16 0.9014 -0.1695 0.9014 0.9494
No log 2.25 18 0.8278 -0.0725 0.8278 0.9098
No log 2.5 20 1.0100 -0.1261 1.0100 1.0050
No log 2.75 22 1.9378 -0.0666 1.9378 1.3921
No log 3.0 24 2.2738 -0.0306 2.2738 1.5079
No log 3.25 26 1.4154 0.0543 1.4154 1.1897
No log 3.5 28 0.9329 -0.1554 0.9329 0.9659
No log 3.75 30 0.9125 -0.1580 0.9125 0.9553
No log 4.0 32 0.9712 -0.1155 0.9712 0.9855
No log 4.25 34 1.1781 -0.0862 1.1781 1.0854
No log 4.5 36 1.4154 -0.0930 1.4154 1.1897
No log 4.75 38 1.1184 0.0260 1.1184 1.0575
No log 5.0 40 1.2935 0.0117 1.2935 1.1373
No log 5.25 42 1.8389 0.0367 1.8389 1.3560
No log 5.5 44 1.6146 0.0389 1.6146 1.2707
No log 5.75 46 1.4517 -0.0011 1.4517 1.2048
No log 6.0 48 0.9368 -0.1200 0.9368 0.9679
No log 6.25 50 0.9243 -0.0704 0.9243 0.9614
No log 6.5 52 1.3898 -0.0348 1.3898 1.1789
No log 6.75 54 1.7082 0.0045 1.7082 1.3070
No log 7.0 56 1.5425 -0.0155 1.5425 1.2420
No log 7.25 58 0.9937 -0.0818 0.9937 0.9968
No log 7.5 60 0.8985 -0.2021 0.8985 0.9479
No log 7.75 62 0.9400 -0.1952 0.9400 0.9695
No log 8.0 64 0.9922 -0.0749 0.9922 0.9961
No log 8.25 66 1.1522 0.0186 1.1522 1.0734
No log 8.5 68 1.1204 -0.0068 1.1204 1.0585
No log 8.75 70 1.2579 0.1215 1.2579 1.1216
No log 9.0 72 1.5705 0.0104 1.5705 1.2532
No log 9.25 74 1.0220 0.0267 1.0220 1.0109
No log 9.5 76 0.8297 -0.1047 0.8297 0.9109
No log 9.75 78 0.8580 -0.1144 0.8580 0.9263
No log 10.0 80 1.5398 0.0101 1.5398 1.2409
No log 10.25 82 1.6668 0.0261 1.6668 1.2910
No log 10.5 84 0.9977 0.0250 0.9977 0.9988
No log 10.75 86 0.8591 -0.0341 0.8591 0.9269
No log 11.0 88 1.0705 0.0526 1.0705 1.0346
No log 11.25 90 1.1354 0.1150 1.1354 1.0656
No log 11.5 92 0.9832 -0.0052 0.9832 0.9915
No log 11.75 94 1.2083 0.1043 1.2083 1.0992
No log 12.0 96 1.3310 0.0895 1.3310 1.1537
No log 12.25 98 0.9475 -0.0440 0.9475 0.9734
No log 12.5 100 0.7932 -0.0573 0.7932 0.8906
No log 12.75 102 0.7950 -0.0218 0.7950 0.8916
No log 13.0 104 1.1539 0.0852 1.1539 1.0742
No log 13.25 106 1.7218 -0.0000 1.7218 1.3122
No log 13.5 108 1.7346 -0.0000 1.7346 1.3171
No log 13.75 110 1.3217 0.0456 1.3217 1.1497
No log 14.0 112 0.8658 0.0016 0.8658 0.9305
No log 14.25 114 0.7702 -0.0643 0.7702 0.8776
No log 14.5 116 0.8185 0.0871 0.8185 0.9047
No log 14.75 118 1.2427 0.0767 1.2427 1.1148
No log 15.0 120 1.5458 0.1094 1.5458 1.2433
No log 15.25 122 1.2557 0.0503 1.2557 1.1206
No log 15.5 124 0.8678 0.0913 0.8678 0.9316
No log 15.75 126 0.7913 0.0303 0.7913 0.8895
No log 16.0 128 0.8211 0.0175 0.8211 0.9061
No log 16.25 130 1.1064 0.0915 1.1064 1.0519
No log 16.5 132 1.4600 -0.0456 1.4600 1.2083
No log 16.75 134 1.2802 0.0446 1.2802 1.1314
No log 17.0 136 0.9658 0.1007 0.9658 0.9827
No log 17.25 138 0.9738 0.1042 0.9738 0.9868
No log 17.5 140 1.1914 0.1087 1.1914 1.0915
No log 17.75 142 1.3703 -0.0174 1.3703 1.1706
No log 18.0 144 1.1818 0.1153 1.1818 1.0871
No log 18.25 146 0.9545 0.1499 0.9545 0.9770
No log 18.5 148 0.7512 0.0205 0.7512 0.8667
No log 18.75 150 0.7450 0.0205 0.7450 0.8631
No log 19.0 152 0.9465 0.1077 0.9465 0.9729
No log 19.25 154 1.4598 0.0571 1.4598 1.2082
No log 19.5 156 1.4441 0.0571 1.4441 1.2017
No log 19.75 158 1.1594 0.1379 1.1594 1.0767
No log 20.0 160 0.9043 0.1191 0.9043 0.9509
No log 20.25 162 0.8282 0.1239 0.8282 0.9100
No log 20.5 164 0.9744 0.1042 0.9744 0.9871
No log 20.75 166 1.3254 0.0411 1.3254 1.1512
No log 21.0 168 1.3513 0.0411 1.3513 1.1625
No log 21.25 170 1.0670 0.0941 1.0670 1.0329
No log 21.5 172 0.8169 0.0913 0.8169 0.9038
No log 21.75 174 0.8525 0.1399 0.8525 0.9233
No log 22.0 176 1.0693 0.0604 1.0693 1.0341
No log 22.25 178 1.2422 0.0746 1.2422 1.1145
No log 22.5 180 1.2644 -0.0129 1.2644 1.1244
No log 22.75 182 1.1915 -0.0082 1.1915 1.0916
No log 23.0 184 0.9760 0.0977 0.9760 0.9879
No log 23.25 186 0.9315 0.1077 0.9315 0.9651
No log 23.5 188 1.0003 0.0945 1.0003 1.0002
No log 23.75 190 1.3273 0.0204 1.3273 1.1521
No log 24.0 192 1.7338 0.0235 1.7338 1.3168
No log 24.25 194 1.6513 0.0543 1.6513 1.2850
No log 24.5 196 1.2006 0.0852 1.2006 1.0957
No log 24.75 198 0.8922 0.0909 0.8922 0.9446
No log 25.0 200 0.7790 0.0068 0.7790 0.8826
No log 25.25 202 0.7821 0.0476 0.7821 0.8844
No log 25.5 204 0.8719 0.0946 0.8719 0.9337
No log 25.75 206 1.1719 0.0596 1.1719 1.0826
No log 26.0 208 1.3564 0.0411 1.3564 1.1647
No log 26.25 210 1.1884 0.0552 1.1884 1.0901
No log 26.5 212 0.9362 -0.0114 0.9362 0.9676
No log 26.75 214 0.8749 -0.0008 0.8749 0.9354
No log 27.0 216 0.8786 -0.0118 0.8786 0.9373
No log 27.25 218 0.9667 -0.0331 0.9667 0.9832
No log 27.5 220 0.9353 -0.0301 0.9353 0.9671
No log 27.75 222 0.9696 0.0026 0.9696 0.9847
No log 28.0 224 0.9867 0.0026 0.9867 0.9934
No log 28.25 226 0.9294 0.0515 0.9294 0.9640
No log 28.5 228 0.8951 0.0200 0.8951 0.9461
No log 28.75 230 0.8676 -0.0138 0.8676 0.9315
No log 29.0 232 0.8722 -0.0138 0.8722 0.9339
No log 29.25 234 0.9372 -0.0211 0.9372 0.9681
No log 29.5 236 1.0068 -0.0013 1.0068 1.0034
No log 29.75 238 1.0002 -0.0306 1.0002 1.0001
No log 30.0 240 0.9193 -0.0175 0.9193 0.9588
No log 30.25 242 0.9152 -0.0175 0.9152 0.9567
No log 30.5 244 0.9443 -0.0285 0.9443 0.9717
No log 30.75 246 0.9565 -0.0285 0.9565 0.9780
No log 31.0 248 1.0339 -0.0316 1.0339 1.0168
No log 31.25 250 1.0696 -0.0316 1.0696 1.0342
No log 31.5 252 1.0182 0.0111 1.0182 1.0091
No log 31.75 254 0.9124 0.0684 0.9124 0.9552
No log 32.0 256 0.8639 0.0871 0.8639 0.9295
No log 32.25 258 0.8791 0.0831 0.8791 0.9376
No log 32.5 260 0.9569 -0.0194 0.9569 0.9782
No log 32.75 262 0.9433 0.0224 0.9433 0.9712
No log 33.0 264 0.9026 0.0224 0.9026 0.9500
No log 33.25 266 0.8971 0.0224 0.8971 0.9471
No log 33.5 268 0.9160 0.0200 0.9160 0.9571
No log 33.75 270 0.9923 -0.0291 0.9923 0.9961
No log 34.0 272 1.1437 0.0252 1.1437 1.0694
No log 34.25 274 1.2630 0.0411 1.2630 1.1238
No log 34.5 276 1.2344 0.0411 1.2344 1.1111
No log 34.75 278 1.0891 0.0231 1.0891 1.0436
No log 35.0 280 0.9840 0.0331 0.9840 0.9920
No log 35.25 282 0.9197 -0.0151 0.9197 0.9590
No log 35.5 284 0.9056 -0.0151 0.9056 0.9517
No log 35.75 286 0.9817 0.0309 0.9817 0.9908
No log 36.0 288 1.1246 0.0433 1.1246 1.0605
No log 36.25 290 1.1182 0.0735 1.1182 1.0574
No log 36.5 292 0.9874 0.0238 0.9874 0.9937
No log 36.75 294 0.8750 -0.0228 0.8750 0.9354
No log 37.0 296 0.8213 -0.0408 0.8213 0.9062
No log 37.25 298 0.8140 -0.0408 0.8140 0.9022
No log 37.5 300 0.8494 0.0909 0.8494 0.9216
No log 37.75 302 0.9380 -0.0301 0.9380 0.9685
No log 38.0 304 1.0128 0.0878 1.0128 1.0064
No log 38.25 306 1.1133 0.0762 1.1133 1.0551
No log 38.5 308 1.1453 0.0762 1.1453 1.0702
No log 38.75 310 1.1949 0.0762 1.1949 1.0931
No log 39.0 312 1.0550 0.0543 1.0550 1.0271
No log 39.25 314 0.9095 -0.0157 0.9095 0.9537
No log 39.5 316 0.8808 -0.0076 0.8808 0.9385
No log 39.75 318 0.9141 0.0157 0.9141 0.9561
No log 40.0 320 0.9984 0.0596 0.9984 0.9992
No log 40.25 322 1.0475 0.0543 1.0475 1.0235
No log 40.5 324 1.0456 0.0819 1.0456 1.0226
No log 40.75 326 1.0394 0.1187 1.0394 1.0195
No log 41.0 328 0.9723 0.0107 0.9723 0.9861
No log 41.25 330 0.9513 0.0587 0.9513 0.9753
No log 41.5 332 0.9885 0.0065 0.9885 0.9942
No log 41.75 334 1.1348 0.0762 1.1348 1.0653
No log 42.0 336 1.3484 0.0658 1.3484 1.1612
No log 42.25 338 1.3251 0.0401 1.3251 1.1511
No log 42.5 340 1.0836 0.0790 1.0836 1.0409
No log 42.75 342 0.9408 0.0111 0.9408 0.9700
No log 43.0 344 0.9105 0.0556 0.9105 0.9542
No log 43.25 346 0.9364 -0.0269 0.9364 0.9677
No log 43.5 348 1.0326 0.0848 1.0326 1.0162
No log 43.75 350 1.0740 0.0819 1.0740 1.0363
No log 44.0 352 1.0307 0.0819 1.0307 1.0152
No log 44.25 354 1.0365 0.0819 1.0365 1.0181
No log 44.5 356 1.0854 0.0819 1.0854 1.0418
No log 44.75 358 1.0588 0.0527 1.0588 1.0290
No log 45.0 360 1.0567 0.0274 1.0567 1.0279
No log 45.25 362 1.0886 0.0536 1.0886 1.0433
No log 45.5 364 1.0839 0.0536 1.0839 1.0411
No log 45.75 366 1.0169 -0.0533 1.0169 1.0084
No log 46.0 368 0.9549 -0.0479 0.9549 0.9772
No log 46.25 370 0.9635 0.0753 0.9635 0.9816
No log 46.5 372 1.0417 0.0527 1.0417 1.0206
No log 46.75 374 1.0935 0.0479 1.0935 1.0457
No log 47.0 376 1.0850 0.0479 1.0850 1.0416
No log 47.25 378 1.0394 0.0479 1.0394 1.0195
No log 47.5 380 0.9509 0.0342 0.9509 0.9751
No log 47.75 382 0.9024 0.1150 0.9024 0.9499
No log 48.0 384 0.8681 0.0304 0.8681 0.9317
No log 48.25 386 0.8669 0.0719 0.8669 0.9311
No log 48.5 388 0.9195 0.0152 0.9195 0.9589
No log 48.75 390 1.0543 0.0342 1.0543 1.0268
No log 49.0 392 1.2260 0.0171 1.2260 1.1072
No log 49.25 394 1.2323 0.0171 1.2323 1.1101
No log 49.5 396 1.1753 0.0171 1.1753 1.0841
No log 49.75 398 1.0781 0.0309 1.0781 1.0383
No log 50.0 400 0.9617 0.0481 0.9617 0.9807
No log 50.25 402 0.8789 -0.0393 0.8789 0.9375
No log 50.5 404 0.8459 -0.0359 0.8459 0.9198
No log 50.75 406 0.8762 -0.0073 0.8762 0.9361
No log 51.0 408 0.9685 0.0366 0.9685 0.9841
No log 51.25 410 1.0521 0.0735 1.0521 1.0257
No log 51.5 412 1.0891 0.0735 1.0891 1.0436
No log 51.75 414 1.0827 0.0735 1.0827 1.0405
No log 52.0 416 1.0536 0.0819 1.0536 1.0265
No log 52.25 418 1.0102 0.0319 1.0102 1.0051
No log 52.5 420 0.9447 0.0843 0.9447 0.9720
No log 52.75 422 0.8893 0.0684 0.8893 0.9430
No log 53.0 424 0.9084 0.0224 0.9084 0.9531
No log 53.25 426 0.9566 0.0526 0.9566 0.9780
No log 53.5 428 1.0361 0.0578 1.0361 1.0179
No log 53.75 430 1.1046 0.0762 1.1046 1.0510
No log 54.0 432 1.1359 0.0709 1.1359 1.0658
No log 54.25 434 1.1271 0.0709 1.1271 1.0617
No log 54.5 436 1.0517 0.0762 1.0517 1.0255
No log 54.75 438 0.9777 0.0569 0.9777 0.9888
No log 55.0 440 0.8805 0.1414 0.8805 0.9383
No log 55.25 442 0.7977 0.1605 0.7977 0.8932
No log 55.5 444 0.7654 0.0953 0.7654 0.8748
No log 55.75 446 0.7737 0.0953 0.7737 0.8796
No log 56.0 448 0.7762 0.0068 0.7762 0.8810
No log 56.25 450 0.7942 -0.0373 0.7942 0.8912
No log 56.5 452 0.8252 -0.0391 0.8252 0.9084
No log 56.75 454 0.8976 0.0498 0.8976 0.9474
No log 57.0 456 0.9791 0.0366 0.9791 0.9895
No log 57.25 458 0.9982 0.0342 0.9982 0.9991
No log 57.5 460 0.9849 0.0366 0.9849 0.9924
No log 57.75 462 0.9712 0.0342 0.9712 0.9855
No log 58.0 464 0.9812 0.0342 0.9812 0.9905
No log 58.25 466 1.0239 0.0252 1.0239 1.0119
No log 58.5 468 1.0226 0.0790 1.0226 1.0113
No log 58.75 470 0.9759 0.0543 0.9759 0.9879
No log 59.0 472 0.9574 0.0282 0.9574 0.9785
No log 59.25 474 0.9381 0.0282 0.9381 0.9685
No log 59.5 476 0.9390 -0.0012 0.9390 0.9690
No log 59.75 478 0.9719 0.0231 0.9719 0.9858
No log 60.0 480 1.0443 0.0735 1.0443 1.0219
No log 60.25 482 1.1342 0.0423 1.1342 1.0650
No log 60.5 484 1.1482 0.0423 1.1482 1.0715
No log 60.75 486 1.0870 0.0735 1.0870 1.0426
No log 61.0 488 0.9690 -0.0245 0.9690 0.9844
No log 61.25 490 0.8756 -0.0408 0.8756 0.9357
No log 61.5 492 0.8456 -0.1197 0.8456 0.9196
No log 61.75 494 0.8703 -0.0828 0.8703 0.9329
No log 62.0 496 0.9221 -0.0245 0.9221 0.9603
No log 62.25 498 0.9897 0.0260 0.9897 0.9949
0.1821 62.5 500 1.0846 0.0735 1.0846 1.0415
0.1821 62.75 502 1.1701 0.0423 1.1701 1.0817
0.1821 63.0 504 1.2095 0.0358 1.2095 1.0998
0.1821 63.25 506 1.1790 0.0423 1.1790 1.0858
0.1821 63.5 508 1.1049 0.0423 1.1049 1.0511
0.1821 63.75 510 1.0681 0.0735 1.0681 1.0335

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1