ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7322
  • Qwk: 0.0628
  • Mse: 0.7322
  • Rmse: 0.8557
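Note that the reported loss equals the MSE (suggesting an MSE regression objective) and the RMSE is its square root (0.8557 ≈ √0.7322). A minimal sketch of how these metrics can be computed with scikit-learn — the score values below are hypothetical toy data, not the actual evaluation set:

```python
# Hypothetical example: computing QWK, MSE, and RMSE for integer scores.
# These labels are invented for illustration; they are not from the card.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 2, 1, 0]   # hypothetical gold organization scores
y_pred = [0, 2, 2, 1, 1, 0]   # hypothetical model predictions

# Quadratic Weighted Kappa: agreement corrected for chance, with
# squared penalties for larger disagreements.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5
```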

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
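The hyperparameters above map directly onto `transformers.TrainingArguments` fields. A sketch of that configuration as a dict (field names follow the Transformers API; this is an assumption about the setup, not the authors' actual training script):

```python
# Hyperparameters from the card, keyed by transformers.TrainingArguments
# field names. Optimizer defaults (Adam, betas=(0.9, 0.999), eps=1e-8)
# match the Trainer's defaults and are omitted here.
hparams = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```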

Training results

The table below reports validation metrics at each evaluation step; the training loss column shows "No log" until the first logging step at 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0426 2 3.6281 0.0035 3.6281 1.9048
No log 0.0851 4 1.7507 0.0425 1.7507 1.3231
No log 0.1277 6 1.4924 -0.0466 1.4924 1.2216
No log 0.1702 8 1.0577 -0.0345 1.0577 1.0284
No log 0.2128 10 0.7084 0.0334 0.7084 0.8417
No log 0.2553 12 0.6942 -0.0101 0.6942 0.8332
No log 0.2979 14 0.6561 0.0 0.6561 0.8100
No log 0.3404 16 0.6936 0.0 0.6936 0.8328
No log 0.3830 18 0.6816 0.0 0.6816 0.8256
No log 0.4255 20 0.7014 0.0857 0.7014 0.8375
No log 0.4681 22 1.3469 0.0627 1.3469 1.1606
No log 0.5106 24 1.0185 0.0104 1.0185 1.0092
No log 0.5532 26 0.7644 0.0999 0.7644 0.8743
No log 0.5957 28 0.6963 0.0857 0.6963 0.8345
No log 0.6383 30 0.7218 0.1627 0.7218 0.8496
No log 0.6809 32 0.8130 0.0956 0.8130 0.9017
No log 0.7234 34 1.2530 0.0866 1.2530 1.1194
No log 0.7660 36 1.1573 0.0415 1.1573 1.0758
No log 0.8085 38 0.8753 0.0876 0.8753 0.9356
No log 0.8511 40 0.8011 0.0884 0.8011 0.8950
No log 0.8936 42 0.9270 0.1329 0.9270 0.9628
No log 0.9362 44 1.4796 0.0564 1.4796 1.2164
No log 0.9787 46 1.0751 0.0046 1.0751 1.0369
No log 1.0213 48 0.7446 0.0662 0.7446 0.8629
No log 1.0638 50 0.8333 0.0705 0.8333 0.9129
No log 1.1064 52 0.8002 -0.0678 0.8002 0.8946
No log 1.1489 54 0.7126 0.0759 0.7126 0.8442
No log 1.1915 56 0.8259 0.0233 0.8259 0.9088
No log 1.2340 58 0.7036 0.1644 0.7036 0.8388
No log 1.2766 60 0.7580 0.1408 0.7580 0.8707
No log 1.3191 62 0.8637 0.0753 0.8637 0.9293
No log 1.3617 64 0.7649 0.2118 0.7649 0.8746
No log 1.4043 66 1.0906 0.0426 1.0906 1.0443
No log 1.4468 68 1.0531 0.0741 1.0531 1.0262
No log 1.4894 70 0.8039 0.0074 0.8039 0.8966
No log 1.5319 72 0.7778 0.0846 0.7778 0.8819
No log 1.5745 74 0.8039 -0.0303 0.8039 0.8966
No log 1.6170 76 0.7763 0.1240 0.7763 0.8811
No log 1.6596 78 0.8076 0.1604 0.8076 0.8987
No log 1.7021 80 1.0833 0.0428 1.0833 1.0408
No log 1.7447 82 0.9847 0.1222 0.9847 0.9923
No log 1.7872 84 0.8887 0.0860 0.8887 0.9427
No log 1.8298 86 0.8854 0.0459 0.8854 0.9410
No log 1.8723 88 1.0765 0.1112 1.0765 1.0376
No log 1.9149 90 1.1371 0.0694 1.1371 1.0663
No log 1.9574 92 0.8058 0.1561 0.8058 0.8977
No log 2.0 94 0.7917 0.0408 0.7917 0.8898
No log 2.0426 96 0.9054 -0.0685 0.9054 0.9515
No log 2.0851 98 0.8573 0.0654 0.8573 0.9259
No log 2.1277 100 0.7936 -0.0675 0.7936 0.8908
No log 2.1702 102 0.9662 -0.0504 0.9662 0.9830
No log 2.2128 104 0.9994 -0.0532 0.9994 0.9997
No log 2.2553 106 0.7947 -0.0145 0.7947 0.8915
No log 2.2979 108 0.7866 -0.0506 0.7866 0.8869
No log 2.3404 110 0.9032 -0.1011 0.9032 0.9504
No log 2.3830 112 0.9124 -0.0696 0.9124 0.9552
No log 2.4255 114 0.8187 0.0690 0.8187 0.9048
No log 2.4681 116 1.0431 0.0451 1.0431 1.0213
No log 2.5106 118 0.8236 0.1800 0.8236 0.9076
No log 2.5532 120 0.8590 0.0319 0.8590 0.9268
No log 2.5957 122 0.9396 0.0481 0.9396 0.9693
No log 2.6383 124 0.7635 0.0989 0.7635 0.8738
No log 2.6809 126 1.0380 0.0778 1.0380 1.0188
No log 2.7234 128 1.4091 0.0819 1.4091 1.1871
No log 2.7660 130 1.0884 0.0104 1.0884 1.0433
No log 2.8085 132 0.7981 0.0909 0.7981 0.8934
No log 2.8511 134 0.9300 -0.0014 0.9300 0.9644
No log 2.8936 136 0.8780 -0.0123 0.8780 0.9370
No log 2.9362 138 0.8481 0.2827 0.8481 0.9209
No log 2.9787 140 0.9426 0.1222 0.9426 0.9709
No log 3.0213 142 0.8133 0.1001 0.8133 0.9018
No log 3.0638 144 0.7079 0.0814 0.7079 0.8413
No log 3.1064 146 0.7180 0.2326 0.7180 0.8474
No log 3.1489 148 0.7820 0.0955 0.7820 0.8843
No log 3.1915 150 0.8285 0.1304 0.8285 0.9102
No log 3.2340 152 0.9702 0.2291 0.9702 0.9850
No log 3.2766 154 0.9660 0.1166 0.9660 0.9828
No log 3.3191 156 0.9883 0.0673 0.9883 0.9942
No log 3.3617 158 0.8586 0.1313 0.8586 0.9266
No log 3.4043 160 0.7793 0.1143 0.7793 0.8828
No log 3.4468 162 0.8108 -0.0767 0.8108 0.9005
No log 3.4894 164 0.7552 0.1856 0.7552 0.8690
No log 3.5319 166 0.7600 0.1423 0.7600 0.8718
No log 3.5745 168 0.8387 0.0183 0.8387 0.9158
No log 3.6170 170 1.0478 0.0169 1.0478 1.0236
No log 3.6596 172 0.8785 0.0650 0.8785 0.9373
No log 3.7021 174 0.9178 0.0641 0.9178 0.9580
No log 3.7447 176 0.8870 0.0265 0.8870 0.9418
No log 3.7872 178 0.8030 0.1333 0.8030 0.8961
No log 3.8298 180 0.7673 0.1404 0.7673 0.8760
No log 3.8723 182 0.7995 0.1408 0.7995 0.8942
No log 3.9149 184 0.7767 0.1425 0.7767 0.8813
No log 3.9574 186 0.7751 0.0723 0.7751 0.8804
No log 4.0 188 0.8436 0.0538 0.8436 0.9185
No log 4.0426 190 0.8090 0.0574 0.8090 0.8995
No log 4.0851 192 0.8035 0.0993 0.8035 0.8964
No log 4.1277 194 0.9658 -0.0269 0.9658 0.9828
No log 4.1702 196 0.8505 0.0623 0.8505 0.9222
No log 4.2128 198 0.7511 0.1254 0.7511 0.8667
No log 4.2553 200 0.8433 0.1193 0.8433 0.9183
No log 4.2979 202 0.8628 0.0793 0.8628 0.9289
No log 4.3404 204 0.7131 0.1758 0.7131 0.8444
No log 4.3830 206 0.7441 0.0545 0.7441 0.8626
No log 4.4255 208 0.7350 0.1761 0.7350 0.8573
No log 4.4681 210 0.8088 0.1506 0.8088 0.8993
No log 4.5106 212 0.9173 0.0676 0.9173 0.9577
No log 4.5532 214 0.7673 0.0303 0.7673 0.8760
No log 4.5957 216 0.7921 0.1272 0.7921 0.8900
No log 4.6383 218 0.8998 0.0028 0.8998 0.9486
No log 4.6809 220 0.8268 0.1673 0.8268 0.9093
No log 4.7234 222 0.8686 0.0586 0.8686 0.9320
No log 4.7660 224 0.9423 0.0363 0.9423 0.9707
No log 4.8085 226 0.8102 0.0289 0.8102 0.9001
No log 4.8511 228 0.7781 0.1340 0.7781 0.8821
No log 4.8936 230 0.8056 0.0249 0.8056 0.8975
No log 4.9362 232 0.7293 0.0089 0.7293 0.8540
No log 4.9787 234 0.7506 0.1565 0.7506 0.8664
No log 5.0213 236 1.0676 0.0067 1.0676 1.0332
No log 5.0638 238 0.9807 -0.0200 0.9807 0.9903
No log 5.1064 240 0.7022 0.1828 0.7022 0.8380
No log 5.1489 242 0.8034 0.0654 0.8034 0.8963
No log 5.1915 244 0.8594 0.0968 0.8594 0.9270
No log 5.2340 246 0.8099 0.1580 0.8099 0.9000
No log 5.2766 248 1.0171 0.1461 1.0171 1.0085
No log 5.3191 250 1.0895 0.0679 1.0895 1.0438
No log 5.3617 252 0.8345 0.0452 0.8345 0.9135
No log 5.4043 254 0.7376 0.1379 0.7376 0.8588
No log 5.4468 256 0.7446 0.0989 0.7446 0.8629
No log 5.4894 258 0.6880 0.0436 0.6880 0.8295
No log 5.5319 260 0.7060 0.1259 0.7060 0.8402
No log 5.5745 262 0.7743 0.1965 0.7743 0.8799
No log 5.6170 264 0.7775 0.2096 0.7775 0.8818
No log 5.6596 266 0.8615 0.0203 0.8615 0.9282
No log 5.7021 268 0.9133 -0.0055 0.9133 0.9557
No log 5.7447 270 0.8648 0.1290 0.8648 0.9300
No log 5.7872 272 0.8235 0.1673 0.8235 0.9075
No log 5.8298 274 0.8055 0.1135 0.8055 0.8975
No log 5.8723 276 0.7714 0.2181 0.7714 0.8783
No log 5.9149 278 0.7529 0.1769 0.7529 0.8677
No log 5.9574 280 0.7416 0.1882 0.7416 0.8612
No log 6.0 282 0.7478 0.1371 0.7478 0.8647
No log 6.0426 284 0.7440 0.1371 0.7440 0.8626
No log 6.0851 286 0.7402 0.1371 0.7402 0.8603
No log 6.1277 288 0.7338 0.1371 0.7338 0.8566
No log 6.1702 290 0.7611 0.0650 0.7611 0.8724
No log 6.2128 292 0.7302 0.0914 0.7302 0.8545
No log 6.2553 294 0.7409 0.2326 0.7409 0.8607
No log 6.2979 296 0.7423 0.1423 0.7423 0.8615
No log 6.3404 298 0.7968 0.0871 0.7968 0.8926
No log 6.3830 300 0.8260 0.1646 0.8260 0.9088
No log 6.4255 302 0.7472 0.1815 0.7472 0.8644
No log 6.4681 304 0.8566 0.0220 0.8566 0.9255
No log 6.5106 306 0.8442 0.0952 0.8442 0.9188
No log 6.5532 308 0.8179 0.0947 0.8179 0.9044
No log 6.5957 310 0.8207 0.0113 0.8207 0.9059
No log 6.6383 312 0.8863 -0.0054 0.8863 0.9414
No log 6.6809 314 0.7597 0.1196 0.7597 0.8716
No log 6.7234 316 0.7400 0.0926 0.7400 0.8602
No log 6.7660 318 0.7383 0.0926 0.7383 0.8592
No log 6.8085 320 0.7302 0.0926 0.7302 0.8545
No log 6.8511 322 0.7957 0.1001 0.7957 0.8920
No log 6.8936 324 0.8228 0.0456 0.8228 0.9071
No log 6.9362 326 0.7698 0.0776 0.7698 0.8774
No log 6.9787 328 0.7801 0.0985 0.7801 0.8832
No log 7.0213 330 0.8645 0.0592 0.8645 0.9298
No log 7.0638 332 0.8085 0.0940 0.8085 0.8992
No log 7.1064 334 0.7780 0.2070 0.7780 0.8820
No log 7.1489 336 0.8801 0.0915 0.8801 0.9382
No log 7.1915 338 0.8174 0.1150 0.8174 0.9041
No log 7.2340 340 0.7243 0.0776 0.7243 0.8511
No log 7.2766 342 0.7276 0.1404 0.7276 0.8530
No log 7.3191 344 0.7220 0.1254 0.7220 0.8497
No log 7.3617 346 0.7676 0.1096 0.7676 0.8761
No log 7.4043 348 0.7881 0.1096 0.7881 0.8877
No log 7.4468 350 0.7844 0.1612 0.7844 0.8857
No log 7.4894 352 0.7806 0.1139 0.7806 0.8835
No log 7.5319 354 0.8142 0.1580 0.8142 0.9023
No log 7.5745 356 0.8309 0.1538 0.8309 0.9116
No log 7.6170 358 0.8051 0.2121 0.8051 0.8973
No log 7.6596 360 0.8789 0.1935 0.8789 0.9375
No log 7.7021 362 1.0159 0.1258 1.0159 1.0079
No log 7.7447 364 1.0758 0.0924 1.0758 1.0372
No log 7.7872 366 1.0332 0.1285 1.0332 1.0164
No log 7.8298 368 1.0400 0.1285 1.0400 1.0198
No log 7.8723 370 1.0612 0.1289 1.0612 1.0302
No log 7.9149 372 0.9911 0.0958 0.9911 0.9955
No log 7.9574 374 0.8855 0.1850 0.8855 0.9410
No log 8.0 376 0.9062 0.1718 0.9062 0.9520
No log 8.0426 378 0.9893 0.1182 0.9893 0.9947
No log 8.0851 380 0.9607 0.0824 0.9607 0.9802
No log 8.1277 382 0.8715 0.2083 0.8715 0.9335
No log 8.1702 384 0.8339 0.1899 0.8339 0.9132
No log 8.2128 386 0.9284 0.1509 0.9284 0.9636
No log 8.2553 388 1.0130 0.0707 1.0130 1.0065
No log 8.2979 390 0.8934 0.1758 0.8934 0.9452
No log 8.3404 392 0.7275 0.1787 0.7275 0.8529
No log 8.3830 394 0.8174 0.0600 0.8174 0.9041
No log 8.4255 396 0.8415 0.0681 0.8415 0.9173
No log 8.4681 398 0.7333 0.1395 0.7333 0.8563
No log 8.5106 400 0.7617 0.1691 0.7617 0.8728
No log 8.5532 402 0.9766 0.0810 0.9766 0.9883
No log 8.5957 404 1.0406 0.0391 1.0406 1.0201
No log 8.6383 406 0.8876 0.0304 0.8876 0.9421
No log 8.6809 408 0.6975 0.1828 0.6975 0.8352
No log 8.7234 410 0.7144 0.0973 0.7144 0.8452
No log 8.7660 412 0.7306 0.1860 0.7306 0.8548
No log 8.8085 414 0.7051 0.1807 0.7051 0.8397
No log 8.8511 416 0.7076 0.1807 0.7076 0.8412
No log 8.8936 418 0.7225 0.1828 0.7225 0.8500
No log 8.9362 420 0.7185 0.1758 0.7185 0.8476
No log 8.9787 422 0.7016 0.1444 0.7016 0.8376
No log 9.0213 424 0.7144 0.1423 0.7144 0.8452
No log 9.0638 426 0.7364 0.1354 0.7364 0.8582
No log 9.1064 428 0.7761 0.1633 0.7761 0.8810
No log 9.1489 430 0.7771 0.1761 0.7771 0.8816
No log 9.1915 432 0.7579 0.1751 0.7579 0.8706
No log 9.2340 434 0.7596 0.0741 0.7596 0.8716
No log 9.2766 436 0.7645 0.1347 0.7645 0.8743
No log 9.3191 438 0.8149 0.1697 0.8149 0.9027
No log 9.3617 440 0.8045 0.1714 0.8045 0.8969
No log 9.4043 442 0.8155 0.0181 0.8155 0.9031
No log 9.4468 444 0.8972 0.0623 0.8972 0.9472
No log 9.4894 446 0.8161 0.0639 0.8161 0.9034
No log 9.5319 448 0.7711 0.0236 0.7711 0.8781
No log 9.5745 450 0.7666 0.0236 0.7666 0.8755
No log 9.6170 452 0.7909 0.1047 0.7909 0.8894
No log 9.6596 454 0.7542 0.0670 0.7542 0.8685
No log 9.7021 456 0.7604 0.0600 0.7604 0.8720
No log 9.7447 458 0.7622 0.1675 0.7622 0.8731
No log 9.7872 460 0.7552 0.2070 0.7552 0.8690
No log 9.8298 462 0.7556 0.2096 0.7556 0.8693
No log 9.8723 464 0.8166 0.1281 0.8166 0.9037
No log 9.9149 466 1.0230 0.1316 1.0230 1.0114
No log 9.9574 468 0.9983 0.0707 0.9983 0.9992
No log 10.0 470 0.8461 0.0424 0.8461 0.9198
No log 10.0426 472 0.7952 0.0999 0.7952 0.8917
No log 10.0851 474 0.7524 0.1541 0.7524 0.8674
No log 10.1277 476 0.7471 0.1751 0.7471 0.8644
No log 10.1702 478 0.7354 0.1404 0.7354 0.8576
No log 10.2128 480 0.7366 0.1148 0.7366 0.8583
No log 10.2553 482 0.8705 0.0786 0.8705 0.9330
No log 10.2979 484 0.9084 -0.0490 0.9084 0.9531
No log 10.3404 486 0.7894 0.0512 0.7894 0.8885
No log 10.3830 488 0.6655 0.1444 0.6655 0.8158
No log 10.4255 490 0.6610 0.0964 0.6610 0.8130
No log 10.4681 492 0.6690 0.0964 0.6690 0.8179
No log 10.5106 494 0.7141 0.1828 0.7141 0.8450
No log 10.5532 496 0.7767 0.1379 0.7767 0.8813
No log 10.5957 498 0.8140 0.0917 0.8140 0.9022
0.2965 10.6383 500 0.8184 0.0917 0.8184 0.9047
0.2965 10.6809 502 0.8287 0.0799 0.8287 0.9103
0.2965 10.7234 504 0.8104 0.0512 0.8104 0.9002
0.2965 10.7660 506 0.8056 0.0512 0.8056 0.8976
0.2965 10.8085 508 0.7695 0.0512 0.7695 0.8772
0.2965 10.8511 510 0.7322 0.0628 0.7322 0.8557

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1