# GS_bert2
This model is a fine-tuned version of skt/kobert-base-v1 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1128
- F1: 0.4492
- Precision: 0.4722
- Recall: 0.4319
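The card does not state the downstream task, so treat the following as a minimal inference sketch under assumptions: the checkpoint is loaded with a sequence-classification head (swap in the appropriate `AutoModelFor*` class if the head differs), the model lives locally under a hypothetical `./GS_bert2` directory, and KoBERT's custom tokenizer loads via `AutoTokenizer` (it may require an extra tokenizer package).

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical local path to this fine-tuned checkpoint.
model_dir = "./GS_bert2"

# KoBERT ships a custom tokenizer; AutoTokenizer may need extra packages installed.
tokenizer = AutoTokenizer.from_pretrained(model_dir)
# Assumption: a sequence-classification head; the card does not state the task.
model = AutoModelForSequenceClassification.from_pretrained(model_dir)
model.eval()

text = "예시 문장입니다."  # placeholder Korean input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index
```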
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 100
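The listed values map onto Hugging Face `TrainingArguments` roughly as sketched below; the output directory and the evaluation/logging strategies are assumptions, since the card records only the hyperparameters themselves.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="GS_bert2",            # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",              # betas=(0.9, 0.999) and epsilon=1e-08 are its defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=100,
    eval_strategy="epoch",            # assumed: the table reports metrics once per epoch
    logging_strategy="epoch",
)
```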
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 | Precision | Recall |
|---|---|---|---|---|---|---|
0.6576 | 1.0 | 45 | 0.6338 | 0.0386 | 0.0407 | 0.0370 |
0.5782 | 2.0 | 90 | 0.5574 | 0.0317 | 0.0333 | 0.0306 |
0.4912 | 3.0 | 135 | 0.4505 | 0.0317 | 0.0333 | 0.0306 |
0.3491 | 4.0 | 180 | 0.3189 | 0.0476 | 0.0519 | 0.0444 |
0.2505 | 5.0 | 225 | 0.2204 | 0.0476 | 0.0519 | 0.0444 |
0.1905 | 6.0 | 270 | 0.1784 | 0.0442 | 0.0463 | 0.0426 |
0.1746 | 7.0 | 315 | 0.1656 | 0.0442 | 0.0463 | 0.0426 |
0.1683 | 8.0 | 360 | 0.1622 | 0.0442 | 0.0463 | 0.0426 |
0.1666 | 9.0 | 405 | 0.1614 | 0.0548 | 0.0574 | 0.0528 |
0.1677 | 10.0 | 450 | 0.1615 | 0.0479 | 0.05 | 0.0463 |
0.1673 | 11.0 | 495 | 0.1614 | 0.0394 | 0.0426 | 0.0370 |
0.1664 | 12.0 | 540 | 0.1610 | 0.0347 | 0.0370 | 0.0329 |
0.1658 | 13.0 | 585 | 0.1606 | 0.0812 | 0.0852 | 0.0782 |
0.1649 | 14.0 | 630 | 0.1597 | 0.1037 | 0.1093 | 0.0995 |
0.1618 | 15.0 | 675 | 0.1581 | 0.1429 | 0.15 | 0.1375 |
0.1615 | 16.0 | 720 | 0.1561 | 0.2042 | 0.2130 | 0.1977 |
0.1577 | 17.0 | 765 | 0.1537 | 0.2275 | 0.2389 | 0.2190 |
0.1552 | 18.0 | 810 | 0.1518 | 0.2352 | 0.2463 | 0.2269 |
0.1526 | 19.0 | 855 | 0.1499 | 0.2698 | 0.2833 | 0.2597 |
0.1502 | 20.0 | 900 | 0.1481 | 0.2862 | 0.3 | 0.2759 |
0.1509 | 21.0 | 945 | 0.1462 | 0.2952 | 0.3093 | 0.2847 |
0.1487 | 22.0 | 990 | 0.1445 | 0.3103 | 0.3259 | 0.2986 |
0.1455 | 23.0 | 1035 | 0.1429 | 0.3122 | 0.3278 | 0.3005 |
0.1451 | 24.0 | 1080 | 0.1412 | 0.3175 | 0.3333 | 0.3056 |
0.1403 | 25.0 | 1125 | 0.1396 | 0.3175 | 0.3333 | 0.3056 |
0.139 | 26.0 | 1170 | 0.1382 | 0.3222 | 0.3389 | 0.3097 |
0.1363 | 27.0 | 1215 | 0.1366 | 0.3254 | 0.3426 | 0.3125 |
0.1329 | 28.0 | 1260 | 0.1353 | 0.3442 | 0.3630 | 0.3301 |
0.1348 | 29.0 | 1305 | 0.1343 | 0.3466 | 0.3648 | 0.3329 |
0.1309 | 30.0 | 1350 | 0.1332 | 0.3463 | 0.3648 | 0.3324 |
0.1291 | 31.0 | 1395 | 0.1318 | 0.3516 | 0.3704 | 0.3375 |
0.1283 | 32.0 | 1440 | 0.1307 | 0.3593 | 0.3778 | 0.3454 |
0.1276 | 33.0 | 1485 | 0.1295 | 0.3632 | 0.3833 | 0.3481 |
0.127 | 34.0 | 1530 | 0.1284 | 0.3672 | 0.3870 | 0.3523 |
0.123 | 35.0 | 1575 | 0.1272 | 0.3669 | 0.3870 | 0.3519 |
0.1215 | 36.0 | 1620 | 0.1264 | 0.3656 | 0.3852 | 0.3509 |
0.1218 | 37.0 | 1665 | 0.1258 | 0.3738 | 0.3944 | 0.3583 |
0.1224 | 38.0 | 1710 | 0.1253 | 0.3780 | 0.3981 | 0.3630 |
0.1168 | 39.0 | 1755 | 0.1242 | 0.3881 | 0.4093 | 0.3722 |
0.1163 | 40.0 | 1800 | 0.1232 | 0.3868 | 0.4074 | 0.3713 |
0.1163 | 41.0 | 1845 | 0.1229 | 0.3865 | 0.4074 | 0.3708 |
0.1143 | 42.0 | 1890 | 0.1219 | 0.3847 | 0.4056 | 0.3690 |
0.1128 | 43.0 | 1935 | 0.1211 | 0.3950 | 0.4167 | 0.3787 |
0.1118 | 44.0 | 1980 | 0.1207 | 0.3950 | 0.4167 | 0.3787 |
0.1104 | 45.0 | 2025 | 0.1201 | 0.3960 | 0.4167 | 0.3806 |
0.1097 | 46.0 | 2070 | 0.1197 | 0.3955 | 0.4167 | 0.3796 |
0.1072 | 47.0 | 2115 | 0.1193 | 0.3905 | 0.4111 | 0.375 |
0.1049 | 48.0 | 2160 | 0.1187 | 0.4021 | 0.4241 | 0.3856 |
0.1059 | 49.0 | 2205 | 0.1185 | 0.3968 | 0.4185 | 0.3806 |
0.1045 | 50.0 | 2250 | 0.1178 | 0.4074 | 0.4296 | 0.3907 |
0.1051 | 51.0 | 2295 | 0.1176 | 0.4050 | 0.4259 | 0.3894 |
0.1014 | 52.0 | 2340 | 0.1169 | 0.4079 | 0.4296 | 0.3917 |
0.1015 | 53.0 | 2385 | 0.1168 | 0.4026 | 0.4241 | 0.3866 |
0.0979 | 54.0 | 2430 | 0.1164 | 0.4135 | 0.4352 | 0.3972 |
0.0971 | 55.0 | 2475 | 0.1161 | 0.4116 | 0.4333 | 0.3954 |
0.0964 | 56.0 | 2520 | 0.1161 | 0.4114 | 0.4333 | 0.3949 |
0.095 | 57.0 | 2565 | 0.1157 | 0.4159 | 0.4370 | 0.4 |
0.0937 | 58.0 | 2610 | 0.1158 | 0.4220 | 0.4444 | 0.4051 |
0.0942 | 59.0 | 2655 | 0.1154 | 0.4172 | 0.4389 | 0.4009 |
0.0917 | 60.0 | 2700 | 0.1150 | 0.4222 | 0.4444 | 0.4056 |
0.0896 | 61.0 | 2745 | 0.1152 | 0.4212 | 0.4426 | 0.4051 |
0.0868 | 62.0 | 2790 | 0.1148 | 0.4262 | 0.4481 | 0.4097 |
0.0868 | 63.0 | 2835 | 0.1143 | 0.4265 | 0.4481 | 0.4102 |
0.0862 | 64.0 | 2880 | 0.1142 | 0.4296 | 0.4519 | 0.4130 |
0.0846 | 65.0 | 2925 | 0.1144 | 0.4153 | 0.4370 | 0.3991 |
0.0821 | 66.0 | 2970 | 0.1135 | 0.4407 | 0.4630 | 0.4241 |
0.0834 | 67.0 | 3015 | 0.1140 | 0.4212 | 0.4426 | 0.4051 |
0.0816 | 68.0 | 3060 | 0.1136 | 0.4320 | 0.4537 | 0.4157 |
0.0812 | 69.0 | 3105 | 0.1131 | 0.4392 | 0.4611 | 0.4227 |
0.0784 | 70.0 | 3150 | 0.1139 | 0.4368 | 0.4593 | 0.4199 |
0.0777 | 71.0 | 3195 | 0.1135 | 0.4265 | 0.4481 | 0.4102 |
0.0767 | 72.0 | 3240 | 0.1134 | 0.4259 | 0.4481 | 0.4093 |
0.0764 | 73.0 | 3285 | 0.1135 | 0.4347 | 0.4574 | 0.4176 |
0.0761 | 74.0 | 3330 | 0.1127 | 0.4315 | 0.4537 | 0.4148 |
0.0759 | 75.0 | 3375 | 0.1127 | 0.4336 | 0.4556 | 0.4171 |
0.0732 | 76.0 | 3420 | 0.1133 | 0.4368 | 0.4593 | 0.4199 |
0.073 | 77.0 | 3465 | 0.1128 | 0.4389 | 0.4611 | 0.4222 |
0.072 | 78.0 | 3510 | 0.1132 | 0.4370 | 0.4593 | 0.4204 |
0.0719 | 79.0 | 3555 | 0.1128 | 0.4410 | 0.4630 | 0.4245 |
0.0714 | 80.0 | 3600 | 0.1127 | 0.4423 | 0.4648 | 0.4255 |
0.0694 | 81.0 | 3645 | 0.1129 | 0.4389 | 0.4611 | 0.4222 |
0.07 | 82.0 | 3690 | 0.1128 | 0.4405 | 0.4630 | 0.4236 |
0.067 | 83.0 | 3735 | 0.1126 | 0.4423 | 0.4648 | 0.4255 |
0.069 | 84.0 | 3780 | 0.1131 | 0.4423 | 0.4648 | 0.4255 |
0.0672 | 85.0 | 3825 | 0.1130 | 0.4460 | 0.4685 | 0.4292 |
0.0674 | 86.0 | 3870 | 0.1129 | 0.4442 | 0.4667 | 0.4273 |
0.0653 | 87.0 | 3915 | 0.1131 | 0.4407 | 0.4630 | 0.4241 |
0.065 | 88.0 | 3960 | 0.1131 | 0.4423 | 0.4648 | 0.4255 |
0.0642 | 89.0 | 4005 | 0.1128 | 0.4458 | 0.4685 | 0.4287 |
0.0641 | 90.0 | 4050 | 0.1127 | 0.4410 | 0.4630 | 0.4245 |
0.0636 | 91.0 | 4095 | 0.1128 | 0.4407 | 0.4630 | 0.4241 |
0.0657 | 92.0 | 4140 | 0.1129 | 0.4439 | 0.4667 | 0.4269 |
0.0643 | 93.0 | 4185 | 0.1128 | 0.4476 | 0.4704 | 0.4306 |
0.063 | 94.0 | 4230 | 0.1129 | 0.4495 | 0.4722 | 0.4324 |
0.0634 | 95.0 | 4275 | 0.1129 | 0.4455 | 0.4685 | 0.4282 |
0.0643 | 96.0 | 4320 | 0.1128 | 0.4474 | 0.4704 | 0.4301 |
0.0625 | 97.0 | 4365 | 0.1127 | 0.4474 | 0.4704 | 0.4301 |
0.0636 | 98.0 | 4410 | 0.1128 | 0.4492 | 0.4722 | 0.4319 |
0.0627 | 99.0 | 4455 | 0.1128 | 0.4492 | 0.4722 | 0.4319 |
0.063 | 100.0 | 4500 | 0.1128 | 0.4492 | 0.4722 | 0.4319 |
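The card does not say how F1, precision, and recall were computed. A common `compute_metrics` hook for a classification fine-tune, here using scikit-learn with macro averaging as an assumption, would look like:

```python
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is (logits, labels) as passed by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Assumption: macro averaging; the card does not state the averaging mode.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {"f1": f1, "precision": precision, "recall": recall}
```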
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.5.1+cu121
- Datasets 3.4.1
- Tokenizers 0.21.0