ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (listed as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.6977
  • Qwk: 0.0334
  • Mse: 0.6977
  • Rmse: 0.8353
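
The checkpoint can be loaded with the 🤗 Transformers Auto classes. The snippet below is a minimal sketch, assuming the model carries a single-output regression head for scoring essay organization (the QWK/MSE/RMSE metric set suggests this, but the head configuration is not documented here); the essay text is a placeholder.

```python
# Minimal usage sketch (assumption: single-output regression head for organization scoring).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task3_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # placeholder: an Arabic essay to score
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```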

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
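
As a reproducibility aid, these settings map onto transformers.TrainingArguments roughly as shown below. This is a hedged sketch: the output directory and any options not listed above are placeholders, not values taken from the training run.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# output_dir and anything not listed in the card are placeholders/assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder, not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```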

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0227 2 3.9732 0.0017 3.9732 1.9933
No log 0.0455 4 2.2200 -0.0688 2.2200 1.4900
No log 0.0682 6 1.2114 0.0048 1.2114 1.1006
No log 0.0909 8 0.8437 0.0071 0.8437 0.9186
No log 0.1136 10 0.7082 0.0909 0.7082 0.8416
No log 0.1364 12 0.6808 -0.0035 0.6808 0.8251
No log 0.1591 14 0.7974 0.1453 0.7974 0.8930
No log 0.1818 16 1.2962 -0.0234 1.2962 1.1385
No log 0.2045 18 1.0906 0.0379 1.0906 1.0443
No log 0.2273 20 0.7794 0.1506 0.7794 0.8828
No log 0.25 22 0.7104 -0.0101 0.7104 0.8429
No log 0.2727 24 0.6367 0.0 0.6367 0.7980
No log 0.2955 26 0.6714 0.2424 0.6714 0.8194
No log 0.3182 28 1.0691 0.1007 1.0691 1.0340
No log 0.3409 30 1.2657 0.0912 1.2657 1.1250
No log 0.3636 32 0.8308 0.1685 0.8308 0.9115
No log 0.3864 34 0.6681 0.1259 0.6681 0.8173
No log 0.4091 36 0.6912 0.1202 0.6912 0.8314
No log 0.4318 38 1.1285 0.0469 1.1285 1.0623
No log 0.4545 40 1.2913 0.0735 1.2913 1.1363
No log 0.4773 42 0.7918 -0.0031 0.7918 0.8898
No log 0.5 44 0.7448 0.1597 0.7448 0.8630
No log 0.5227 46 0.8940 0.0800 0.8940 0.9455
No log 0.5455 48 0.7626 0.1034 0.7626 0.8733
No log 0.5682 50 0.9208 0.0915 0.9208 0.9596
No log 0.5909 52 1.2016 0.0761 1.2016 1.0962
No log 0.6136 54 0.9371 0.1254 0.9371 0.9680
No log 0.6364 56 0.8781 0.1591 0.8781 0.9371
No log 0.6591 58 1.0755 -0.0661 1.0755 1.0371
No log 0.6818 60 0.9705 0.0402 0.9705 0.9851
No log 0.7045 62 0.8731 0.1558 0.8731 0.9344
No log 0.7273 64 0.8635 0.1240 0.8635 0.9293
No log 0.75 66 0.9537 0.0794 0.9537 0.9766
No log 0.7727 68 0.9946 0.1108 0.9946 0.9973
No log 0.7955 70 0.8269 0.1942 0.8269 0.9093
No log 0.8182 72 1.1764 0.0025 1.1764 1.0846
No log 0.8409 74 0.9798 -0.0151 0.9798 0.9899
No log 0.8636 76 0.8867 -0.0059 0.8867 0.9416
No log 0.8864 78 0.9475 0.1077 0.9475 0.9734
No log 0.9091 80 0.7751 0.1922 0.7751 0.8804
No log 0.9318 82 1.1503 -0.0111 1.1503 1.0725
No log 0.9545 84 1.0119 0.0366 1.0119 1.0059
No log 0.9773 86 0.7671 0.0861 0.7671 0.8758
No log 1.0 88 1.2736 0.1393 1.2736 1.1285
No log 1.0227 90 1.3013 0.0871 1.3013 1.1408
No log 1.0455 92 0.8953 0.0618 0.8953 0.9462
No log 1.0682 94 1.1186 0.0701 1.1186 1.0576
No log 1.0909 96 1.2302 -0.0011 1.2302 1.1092
No log 1.1136 98 0.9395 0.0363 0.9395 0.9693
No log 1.1364 100 0.8815 0.0159 0.8815 0.9389
No log 1.1591 102 0.9100 -0.0123 0.9100 0.9539
No log 1.1818 104 0.8154 0.1272 0.8154 0.9030
No log 1.2045 106 0.8436 -0.0283 0.8436 0.9185
No log 1.2273 108 0.7980 0.1143 0.7980 0.8933
No log 1.25 110 0.8120 0.0196 0.8120 0.9011
No log 1.2727 112 0.9250 0.0065 0.9250 0.9618
No log 1.2955 114 0.8415 0.1181 0.8415 0.9173
No log 1.3182 116 0.9023 0.1352 0.9023 0.9499
No log 1.3409 118 0.9878 -0.0303 0.9878 0.9939
No log 1.3636 120 1.0542 -0.0341 1.0542 1.0268
No log 1.3864 122 1.0090 0.0246 1.0090 1.0045
No log 1.4091 124 1.1395 -0.0137 1.1395 1.0675
No log 1.4318 126 0.9195 -0.0354 0.9195 0.9589
No log 1.4545 128 0.8475 -0.0113 0.8475 0.9206
No log 1.4773 130 0.8616 -0.0849 0.8616 0.9282
No log 1.5 132 0.8559 0.0028 0.8559 0.9251
No log 1.5227 134 0.8660 0.0236 0.8660 0.9306
No log 1.5455 136 0.8824 -0.0171 0.8824 0.9394
No log 1.5682 138 0.9239 -0.0208 0.9239 0.9612
No log 1.5909 140 0.9503 0.0319 0.9503 0.9748
No log 1.6136 142 0.8682 0.1448 0.8682 0.9318
No log 1.6364 144 0.8700 0.0465 0.8700 0.9327
No log 1.6591 146 0.8267 0.0027 0.8267 0.9092
No log 1.6818 148 0.9494 0.0454 0.9494 0.9744
No log 1.7045 150 0.8154 0.0583 0.8154 0.9030
No log 1.7273 152 0.7462 0.1354 0.7462 0.8638
No log 1.75 154 0.7589 0.1047 0.7589 0.8711
No log 1.7727 156 0.9009 0.0304 0.9009 0.9492
No log 1.7955 158 0.8819 0.0755 0.8819 0.9391
No log 1.8182 160 0.7506 0.0355 0.7506 0.8663
No log 1.8409 162 0.7676 0.0394 0.7676 0.8761
No log 1.8636 164 0.8532 0.1095 0.8532 0.9237
No log 1.8864 166 1.1218 0.0774 1.1218 1.0591
No log 1.9091 168 1.0298 0.0603 1.0298 1.0148
No log 1.9318 170 0.7915 0.0944 0.7915 0.8897
No log 1.9545 172 0.7745 0.0172 0.7745 0.8801
No log 1.9773 174 0.7595 0.0732 0.7595 0.8715
No log 2.0 176 1.0098 0.0498 1.0098 1.0049
No log 2.0227 178 1.0143 0.0182 1.0143 1.0071
No log 2.0455 180 0.7726 0.0175 0.7726 0.8790
No log 2.0682 182 0.7493 0.0528 0.7493 0.8656
No log 2.0909 184 0.7329 0.0496 0.7329 0.8561
No log 2.1136 186 0.7848 0.1001 0.7848 0.8859
No log 2.1364 188 0.8218 0.0456 0.8218 0.9065
No log 2.1591 190 0.7332 0.1362 0.7332 0.8563
No log 2.1818 192 0.7641 0.0503 0.7641 0.8741
No log 2.2045 194 0.7601 0.0236 0.7601 0.8719
No log 2.2273 196 0.9195 -0.0441 0.9195 0.9589
No log 2.25 198 1.2209 0.1252 1.2209 1.1049
No log 2.2727 200 0.9520 -0.0441 0.9520 0.9757
No log 2.2955 202 0.8203 0.1425 0.8203 0.9057
No log 2.3182 204 0.9352 0.0405 0.9352 0.9671
No log 2.3409 206 0.8111 0.1660 0.8111 0.9006
No log 2.3636 208 1.0057 0.0309 1.0057 1.0028
No log 2.3864 210 0.8912 -0.0008 0.8912 0.9440
No log 2.4091 212 0.7057 0.1612 0.7057 0.8401
No log 2.4318 214 0.6896 0.0524 0.6896 0.8304
No log 2.4545 216 0.6986 0.0524 0.6986 0.8358
No log 2.4773 218 0.7530 -0.0449 0.7530 0.8678
No log 2.5 220 0.7266 0.0454 0.7266 0.8524
No log 2.5227 222 0.7756 0.1541 0.7756 0.8807
No log 2.5455 224 0.7622 0.1541 0.7622 0.8730
No log 2.5682 226 0.7430 0.1199 0.7430 0.8620
No log 2.5909 228 0.7572 0.0089 0.7572 0.8702
No log 2.6136 230 0.9257 -0.0616 0.9257 0.9621
No log 2.6364 232 0.8091 -0.1538 0.8091 0.8995
No log 2.6591 234 0.7131 0.0723 0.7131 0.8444
No log 2.6818 236 0.8167 0.1291 0.8167 0.9037
No log 2.7045 238 0.7089 0.0723 0.7089 0.8420
No log 2.7273 240 0.7000 0.0524 0.7000 0.8367
No log 2.75 242 0.7125 0.0863 0.7125 0.8441
No log 2.7727 244 0.8559 0.0316 0.8559 0.9251
No log 2.7955 246 0.8710 0.0316 0.8710 0.9333
No log 2.8182 248 0.7442 0.0247 0.7442 0.8626
No log 2.8409 250 0.7232 0.0479 0.7232 0.8504
No log 2.8636 252 0.8008 -0.0144 0.8008 0.8949
No log 2.8864 254 0.7345 0.0524 0.7345 0.8570
No log 2.9091 256 0.7710 0.1431 0.7710 0.8781
No log 2.9318 258 0.8472 0.1149 0.8472 0.9204
No log 2.9545 260 0.7227 0.2078 0.7227 0.8501
No log 2.9773 262 0.6905 0.0436 0.6905 0.8310
No log 3.0 264 0.7288 0.0524 0.7288 0.8537
No log 3.0227 266 0.7311 0.0436 0.7311 0.8550
No log 3.0455 268 0.7918 0.1485 0.7918 0.8898
No log 3.0682 270 0.8581 0.0793 0.8581 0.9263
No log 3.0909 272 0.8138 0.0600 0.8138 0.9021
No log 3.1136 274 0.7455 0.1675 0.7455 0.8634
No log 3.1364 276 0.7392 0.0414 0.7392 0.8598
No log 3.1591 278 0.7273 0.1254 0.7273 0.8528
No log 3.1818 280 0.7497 0.1096 0.7497 0.8659
No log 3.2045 282 0.7459 0.1440 0.7459 0.8637
No log 3.2273 284 0.7056 0.0863 0.7056 0.8400
No log 3.25 286 0.7354 0.1525 0.7354 0.8576
No log 3.2727 288 0.7537 0.1404 0.7537 0.8681
No log 3.2955 290 0.8559 0.0913 0.8559 0.9252
No log 3.3182 292 0.9603 0.0293 0.9603 0.9799
No log 3.3409 294 0.8028 0.1495 0.8028 0.8960
No log 3.3636 296 0.7574 0.0247 0.7574 0.8703
No log 3.3864 298 0.7907 0.1001 0.7907 0.8892
No log 3.4091 300 0.8831 -0.0101 0.8831 0.9397
No log 3.4318 302 0.8518 0.0748 0.8518 0.9229
No log 3.4545 304 0.7278 0.0296 0.7278 0.8531
No log 3.4773 306 0.7075 0.0964 0.7075 0.8411
No log 3.5 308 0.7316 0.0759 0.7316 0.8553
No log 3.5227 310 0.7202 0.0759 0.7202 0.8487
No log 3.5455 312 0.6888 0.0506 0.6888 0.8299
No log 3.5682 314 0.7463 0.0615 0.7463 0.8639
No log 3.5909 316 0.7167 0.0496 0.7167 0.8466
No log 3.6136 318 0.7627 0.1506 0.7627 0.8733
No log 3.6364 320 0.8293 0.0490 0.8293 0.9107
No log 3.6591 322 0.8448 0.0913 0.8448 0.9192
No log 3.6818 324 0.8183 0.0165 0.8183 0.9046
No log 3.7045 326 0.9231 0.0030 0.9231 0.9608
No log 3.7273 328 0.9275 0.0072 0.9275 0.9631
No log 3.75 330 0.7486 -0.1010 0.7486 0.8652
No log 3.7727 332 0.7724 0.1148 0.7724 0.8789
No log 3.7955 334 0.9037 -0.0101 0.9037 0.9506
No log 3.8182 336 0.7952 0.1899 0.7952 0.8917
No log 3.8409 338 0.7002 0.0355 0.7002 0.8368
No log 3.8636 340 0.7035 -0.0065 0.7035 0.8387
No log 3.8864 342 0.7087 0.0436 0.7087 0.8419
No log 3.9091 344 0.7705 0.0680 0.7705 0.8778
No log 3.9318 346 0.9220 -0.0518 0.9220 0.9602
No log 3.9545 348 0.8668 -0.0079 0.8668 0.9310
No log 3.9773 350 0.7229 0.0395 0.7229 0.8502
No log 4.0 352 0.7843 0.0165 0.7843 0.8856
No log 4.0227 354 0.8084 0.0644 0.8084 0.8991
No log 4.0455 356 0.7760 0.1298 0.7760 0.8809
No log 4.0682 358 0.8516 0.1440 0.8516 0.9228
No log 4.0909 360 0.7971 0.0660 0.7971 0.8928
No log 4.1136 362 0.7645 0.0723 0.7645 0.8744
No log 4.1364 364 0.7396 0.0395 0.7396 0.8600
No log 4.1591 366 0.7481 0.1202 0.7481 0.8649
No log 4.1818 368 0.7392 0.0869 0.7392 0.8598
No log 4.2045 370 0.7621 0.0783 0.7621 0.8730
No log 4.2273 372 0.7809 0.0741 0.7809 0.8837
No log 4.25 374 0.8397 0.0786 0.8397 0.9163
No log 4.2727 376 0.7599 0.1202 0.7599 0.8717
No log 4.2955 378 0.6905 0.0414 0.6905 0.8310
No log 4.3182 380 0.7061 0.0033 0.7061 0.8403
No log 4.3409 382 0.6710 0.0395 0.6710 0.8191
No log 4.3636 384 0.7584 0.1506 0.7584 0.8709
No log 4.3864 386 0.8299 0.0287 0.8299 0.9110
No log 4.4091 388 0.7242 0.1097 0.7242 0.8510
No log 4.4318 390 0.7111 0.0414 0.7111 0.8433
No log 4.4545 392 0.7108 0.0414 0.7108 0.8431
No log 4.4773 394 0.7304 0.1148 0.7304 0.8546
No log 4.5 396 0.7874 0.0909 0.7874 0.8874
No log 4.5227 398 0.7317 0.1148 0.7317 0.8554
No log 4.5455 400 0.7007 0.0807 0.7007 0.8371
No log 4.5682 402 0.7071 0.0318 0.7071 0.8409
No log 4.5909 404 0.7463 0.0723 0.7463 0.8639
No log 4.6136 406 0.8102 0.1291 0.8102 0.9001
No log 4.6364 408 0.7268 0.0282 0.7268 0.8525
No log 4.6591 410 0.7371 0.0033 0.7371 0.8586
No log 4.6818 412 0.7529 0.0094 0.7529 0.8677
No log 4.7045 414 0.6940 -0.0096 0.6940 0.8331
No log 4.7273 416 0.8838 0.0711 0.8838 0.9401
No log 4.75 418 1.2090 0.0503 1.2090 1.0995
No log 4.7727 420 1.1823 0.0503 1.1823 1.0874
No log 4.7955 422 0.8928 0.1024 0.8928 0.9449
No log 4.8182 424 0.6901 0.0814 0.6901 0.8307
No log 4.8409 426 0.7110 0.0 0.7110 0.8432
No log 4.8636 428 0.7398 0.0094 0.7398 0.8601
No log 4.8864 430 0.6776 -0.0096 0.6776 0.8232
No log 4.9091 432 0.7107 0.1836 0.7107 0.8430
No log 4.9318 434 0.8161 0.1107 0.8161 0.9034
No log 4.9545 436 0.8531 0.0986 0.8531 0.9236
No log 4.9773 438 0.7313 0.1836 0.7313 0.8551
No log 5.0 440 0.7160 -0.0030 0.7160 0.8462
No log 5.0227 442 0.7870 -0.0307 0.7870 0.8871
No log 5.0455 444 0.7485 0.0061 0.7485 0.8651
No log 5.0682 446 0.6979 0.0318 0.6979 0.8354
No log 5.0909 448 0.7047 0.1148 0.7047 0.8394
No log 5.1136 450 0.6931 0.1202 0.6931 0.8325
No log 5.1364 452 0.6851 0.0857 0.6851 0.8277
No log 5.1591 454 0.6925 0.0857 0.6925 0.8322
No log 5.1818 456 0.6996 0.0807 0.6996 0.8364
No log 5.2045 458 0.7159 0.0807 0.7159 0.8461
No log 5.2273 460 0.8155 0.0867 0.8155 0.9030
No log 5.25 462 1.0586 0.0111 1.0586 1.0289
No log 5.2727 464 1.1155 0.0282 1.1155 1.0562
No log 5.2955 466 0.9406 -0.0143 0.9406 0.9699
No log 5.3182 468 0.7743 0.1096 0.7743 0.8799
No log 5.3409 470 0.7440 -0.0125 0.7440 0.8625
No log 5.3636 472 0.7209 0.0374 0.7209 0.8490
No log 5.3864 474 0.7300 0.0260 0.7300 0.8544
No log 5.4091 476 0.7984 0.0442 0.7984 0.8936
No log 5.4318 478 0.8105 0.0409 0.8105 0.9003
No log 5.4545 480 0.7613 0.0999 0.7613 0.8725
No log 5.4773 482 0.7504 0.1097 0.7504 0.8663
No log 5.5 484 0.7519 -0.0195 0.7519 0.8671
No log 5.5227 486 0.7731 0.0622 0.7731 0.8792
No log 5.5455 488 0.7445 -0.0118 0.7445 0.8628
No log 5.5682 490 0.7342 0.0454 0.7342 0.8569
No log 5.5909 492 0.7251 -0.0152 0.7251 0.8515
No log 5.6136 494 0.7780 0.1836 0.7780 0.8821
No log 5.6364 496 0.7668 0.1899 0.7668 0.8757
No log 5.6591 498 0.7006 0.0296 0.7006 0.8370
0.3281 5.6818 500 0.6920 0.0436 0.6920 0.8319
0.3281 5.7045 502 0.7207 0.0436 0.7207 0.8489
0.3281 5.7273 504 0.7672 0.0611 0.7672 0.8759
0.3281 5.75 506 0.9328 0.1025 0.9328 0.9658
0.3281 5.7727 508 1.0715 0.0747 1.0715 1.0351
0.3281 5.7955 510 0.9167 0.1025 0.9167 0.9575
0.3281 5.8182 512 0.7372 -0.0204 0.7372 0.8586
0.3281 5.8409 514 0.7438 0.0061 0.7438 0.8624
0.3281 5.8636 516 0.7062 -0.0033 0.7062 0.8404
0.3281 5.8864 518 0.6814 0.0506 0.6814 0.8255
0.3281 5.9091 520 0.7212 0.0670 0.7212 0.8493
0.3281 5.9318 522 0.8354 0.1493 0.8354 0.9140
0.3281 5.9545 524 0.8060 0.1149 0.8060 0.8978
0.3281 5.9773 526 0.7716 0.1758 0.7716 0.8784
0.3281 6.0 528 0.7157 0.0857 0.7157 0.8460
0.3281 6.0227 530 0.7123 0.0395 0.7123 0.8440
0.3281 6.0455 532 0.7143 0.0296 0.7143 0.8452
0.3281 6.0682 534 0.7369 0.0714 0.7369 0.8585
0.3281 6.0909 536 0.7242 0.0714 0.7242 0.8510
0.3281 6.1136 538 0.6872 0.0857 0.6872 0.8290
0.3281 6.1364 540 0.7017 0.0857 0.7017 0.8377
0.3281 6.1591 542 0.7053 0.0807 0.7053 0.8398
0.3281 6.1818 544 0.7172 0.1259 0.7172 0.8469
0.3281 6.2045 546 0.7021 0.0395 0.7021 0.8379
0.3281 6.2273 548 0.6968 -0.0096 0.6968 0.8347
0.3281 6.25 550 0.7196 0.1691 0.7196 0.8483
0.3281 6.2727 552 0.7985 0.0549 0.7985 0.8936
0.3281 6.2955 554 0.8122 0.0071 0.8122 0.9012
0.3281 6.3182 556 0.7939 0.0549 0.7939 0.8910
0.3281 6.3409 558 0.7087 0.1259 0.7087 0.8419
0.3281 6.3636 560 0.7068 -0.0033 0.7068 0.8407
0.3281 6.3864 562 0.8056 0.1130 0.8056 0.8976
0.3281 6.4091 564 0.8232 0.0733 0.8232 0.9073
0.3281 6.4318 566 0.7432 0.0 0.7432 0.8621
0.3281 6.4545 568 0.6718 0.0909 0.6718 0.8196
0.3281 6.4773 570 0.6916 0.1758 0.6916 0.8316
0.3281 6.5 572 0.7079 0.1148 0.7079 0.8414
0.3281 6.5227 574 0.7043 0.1148 0.7043 0.8392
0.3281 6.5455 576 0.7287 0.1047 0.7287 0.8536
0.3281 6.5682 578 0.7170 0.1148 0.7170 0.8467
0.3281 6.5909 580 0.7096 0.1318 0.7096 0.8424
0.3281 6.6136 582 0.7126 0.0395 0.7126 0.8441
0.3281 6.6364 584 0.7202 0.0355 0.7202 0.8486
0.3281 6.6591 586 0.7248 0.0814 0.7248 0.8514
0.3281 6.6818 588 0.7134 0.0355 0.7134 0.8447
0.3281 6.7045 590 0.7138 0.0395 0.7138 0.8449
0.3281 6.7273 592 0.7090 0.0395 0.7090 0.8420
0.3281 6.75 594 0.7386 0.0549 0.7386 0.8594
0.3281 6.7727 596 0.8751 0.1107 0.8751 0.9355
0.3281 6.7955 598 0.8214 0.0867 0.8214 0.9063
0.3281 6.8182 600 0.7299 0.1506 0.7299 0.8543
0.3281 6.8409 602 0.6997 0.0857 0.6997 0.8365
0.3281 6.8636 604 0.7081 0.0909 0.7081 0.8415
0.3281 6.8864 606 0.7417 -0.0096 0.7417 0.8612
0.3281 6.9091 608 0.7157 -0.0096 0.7157 0.8460
0.3281 6.9318 610 0.6808 0.0857 0.6808 0.8251
0.3281 6.9545 612 0.6798 0.1318 0.6798 0.8245
0.3281 6.9773 614 0.7009 0.1148 0.7009 0.8372
0.3281 7.0 616 0.6935 0.1148 0.6935 0.8328
0.3281 7.0227 618 0.6787 0.0857 0.6787 0.8239
0.3281 7.0455 620 0.6775 0.0374 0.6775 0.8231
0.3281 7.0682 622 0.6870 0.1259 0.6870 0.8288
0.3281 7.0909 624 0.6900 0.1202 0.6900 0.8307
0.3281 7.1136 626 0.6944 0.1202 0.6944 0.8333
0.3281 7.1364 628 0.6977 0.1259 0.6977 0.8353
0.3281 7.1591 630 0.7223 -0.0125 0.7223 0.8499
0.3281 7.1818 632 0.7584 -0.0062 0.7584 0.8709
0.3281 7.2045 634 0.7573 -0.0032 0.7573 0.8702
0.3281 7.2273 636 0.7209 -0.0125 0.7209 0.8491
0.3281 7.25 638 0.6977 0.0334 0.6977 0.8353
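
The validation metrics reported above (QWK, MSE, RMSE) can be reproduced from model predictions along the following lines. This is a sketch assuming integer gold scores and that QWK is computed on rounded predictions, which is a common choice for regression-style essay scoring but is not documented in this card.

```python
# Sketch: computing the reported metrics from predictions and gold labels.
# Assumes integer gold scores; rounding before QWK is an assumption, not documented in the card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    qwk = cohen_kappa_score(
        labels.astype(int), np.rint(preds).astype(int), weights="quadratic"
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

print(compute_metrics([1.2, 2.8, 0.9], [1, 3, 1]))  # toy example
```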

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1