# bert-finetuned-wines-test
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):
- Loss: 5.6685
- Accuracy: 0.0835
- F1: 0.0519
- Precision: 0.5781
- Recall: 0.2021
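
The card does not state how these metrics were aggregated. As a point of reference, below is a minimal sketch of the kind of `compute_metrics` function typically passed to a Hugging Face `Trainer`; the `"macro"` averaging mode is an assumption, not something this card confirms.

```python
# Hedged sketch: a typical Trainer compute_metrics hook producing
# accuracy / F1 / precision / recall. The averaging mode is assumed.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support


def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```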
## Model description
More information needed
## Intended uses & limitations
More information needed
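
Although the intended-use details are missing, a minimal, hedged loading example is sketched below. The repo id `dimitarpg13/bert-finetuned-wines-test` is taken from the model page, the sequence-classification head is an assumption based on the reported accuracy/F1/precision/recall metrics, and the input string is illustrative only.

```python
# Hedged sketch: loading this checkpoint for inference.
# Assumes a sequence-classification head; verify against the repo config.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "dimitarpg13/bert-finetuned-wines-test"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("A dry Riesling with citrus and mineral notes.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```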
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 5
- num_epochs: 150
- mixed_precision_training: Native AMP
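
For reference, the list above can be approximated with the `TrainingArguments` sketch below (Transformers 4.51.3 API). The `output_dir` value is an assumption, and all unspecified settings keep the library defaults.

```python
# Hedged sketch of TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-finetuned-wines-test",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",          # AdamW (torch), betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=5,
    num_train_epochs=150,
    fp16=True,                    # Native AMP mixed-precision training
)
```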
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|:---------:|:------:|
7.0899 | 1.0 | 405 | 7.4053 | 0.0127 | 0.0010 | 0.9412 | 0.0166 |
7.0579 | 2.0 | 810 | 7.3116 | 0.0148 | 0.0016 | 0.9450 | 0.0194 |
6.9134 | 3.0 | 1215 | 7.2316 | 0.0167 | 0.0014 | 0.9427 | 0.0222 |
6.7818 | 4.0 | 1620 | 7.1426 | 0.0161 | 0.0019 | 0.9389 | 0.0204 |
6.6568 | 5.0 | 2025 | 7.0671 | 0.0179 | 0.0022 | 0.9416 | 0.0217 |
6.596 | 6.0 | 2430 | 6.9895 | 0.0201 | 0.0028 | 0.9387 | 0.0232 |
6.4948 | 7.0 | 2835 | 6.9191 | 0.0192 | 0.0021 | 0.9299 | 0.0269 |
6.3896 | 8.0 | 3240 | 6.8525 | 0.0244 | 0.0039 | 0.9266 | 0.0315 |
6.2833 | 9.0 | 3645 | 6.7975 | 0.0238 | 0.0041 | 0.9252 | 0.0342 |
6.1903 | 10.0 | 4050 | 6.7439 | 0.0278 | 0.0054 | 0.9241 | 0.0336 |
6.0943 | 11.0 | 4455 | 6.6941 | 0.0260 | 0.0048 | 0.9246 | 0.0314 |
6.0046 | 12.0 | 4860 | 6.6313 | 0.0285 | 0.0057 | 0.9096 | 0.0386 |
5.9142 | 13.0 | 5265 | 6.5805 | 0.0322 | 0.0074 | 0.9042 | 0.0415 |
5.8266 | 14.0 | 5670 | 6.5452 | 0.0325 | 0.0069 | 0.9041 | 0.0425 |
5.7441 | 15.0 | 6075 | 6.5005 | 0.0343 | 0.0090 | 0.8949 | 0.0468 |
5.6654 | 16.0 | 6480 | 6.4590 | 0.0346 | 0.0087 | 0.8872 | 0.0500 |
5.5887 | 17.0 | 6885 | 6.4252 | 0.0390 | 0.0092 | 0.8893 | 0.0513 |
5.5124 | 18.0 | 7290 | 6.3871 | 0.0393 | 0.0110 | 0.8758 | 0.0569 |
5.4385 | 19.0 | 7695 | 6.3533 | 0.0393 | 0.0123 | 0.8766 | 0.0551 |
5.3655 | 20.0 | 8100 | 6.3077 | 0.0411 | 0.0125 | 0.8617 | 0.0603 |
5.2988 | 21.0 | 8505 | 6.2834 | 0.0414 | 0.0120 | 0.8623 | 0.0627 |
5.2264 | 22.0 | 8910 | 6.2486 | 0.0442 | 0.0131 | 0.8556 | 0.0670 |
5.1614 | 23.0 | 9315 | 6.2221 | 0.0433 | 0.0138 | 0.8514 | 0.0667 |
5.0919 | 24.0 | 9720 | 6.1899 | 0.0430 | 0.0137 | 0.8473 | 0.0677 |
5.0295 | 25.0 | 10125 | 6.1625 | 0.0461 | 0.0153 | 0.8441 | 0.0716 |
4.9639 | 26.0 | 10530 | 6.1408 | 0.0455 | 0.0146 | 0.8348 | 0.0728 |
4.9026 | 27.0 | 10935 | 6.1182 | 0.0470 | 0.0160 | 0.8283 | 0.0778 |
4.8405 | 28.0 | 11340 | 6.1004 | 0.0492 | 0.0166 | 0.8266 | 0.0790 |
4.7813 | 29.0 | 11745 | 6.0731 | 0.0510 | 0.0188 | 0.8227 | 0.0832 |
4.7224 | 30.0 | 12150 | 6.0579 | 0.0507 | 0.0189 | 0.8174 | 0.0859 |
4.6615 | 31.0 | 12555 | 6.0392 | 0.0498 | 0.0181 | 0.8090 | 0.0876 |
4.6056 | 32.0 | 12960 | 6.0157 | 0.0529 | 0.0192 | 0.8034 | 0.0903 |
4.543 | 33.0 | 13365 | 5.9927 | 0.0532 | 0.0204 | 0.8023 | 0.0915 |
4.4967 | 34.0 | 13770 | 5.9756 | 0.0529 | 0.0210 | 0.7978 | 0.0958 |
4.4381 | 35.0 | 14175 | 5.9558 | 0.0532 | 0.0212 | 0.7924 | 0.0955 |
4.3834 | 36.0 | 14580 | 5.9545 | 0.0532 | 0.0214 | 0.7878 | 0.0977 |
4.3324 | 37.0 | 14985 | 5.9353 | 0.0551 | 0.0231 | 0.7781 | 0.1011 |
4.281 | 38.0 | 15390 | 5.9170 | 0.0569 | 0.0254 | 0.7774 | 0.1039 |
4.2281 | 39.0 | 15795 | 5.9025 | 0.0597 | 0.0277 | 0.7684 | 0.1133 |
4.1793 | 40.0 | 16200 | 5.8818 | 0.0588 | 0.0280 | 0.7680 | 0.1135 |
4.1252 | 41.0 | 16605 | 5.8647 | 0.0619 | 0.0299 | 0.7573 | 0.1146 |
4.0782 | 42.0 | 17010 | 5.8536 | 0.0588 | 0.0277 | 0.7556 | 0.1110 |
4.0296 | 43.0 | 17415 | 5.8441 | 0.0619 | 0.0298 | 0.7559 | 0.1200 |
3.9836 | 44.0 | 17820 | 5.8306 | 0.0634 | 0.0304 | 0.7405 | 0.1255 |
3.9309 | 45.0 | 18225 | 5.8258 | 0.0616 | 0.0296 | 0.7439 | 0.1242 |
3.8869 | 46.0 | 18630 | 5.8155 | 0.0646 | 0.0316 | 0.7389 | 0.1253 |
3.8416 | 47.0 | 19035 | 5.8037 | 0.0637 | 0.0316 | 0.7305 | 0.1282 |
3.7983 | 48.0 | 19440 | 5.7939 | 0.0640 | 0.0321 | 0.7308 | 0.1310 |
3.7486 | 49.0 | 19845 | 5.7790 | 0.0637 | 0.0322 | 0.7304 | 0.1321 |
3.707 | 50.0 | 20250 | 5.7733 | 0.0665 | 0.0339 | 0.7154 | 0.1363 |
3.6612 | 51.0 | 20655 | 5.7686 | 0.0656 | 0.0344 | 0.7137 | 0.1377 |
3.612 | 52.0 | 21060 | 5.7574 | 0.0677 | 0.0360 | 0.7131 | 0.1411 |
3.5744 | 53.0 | 21465 | 5.7534 | 0.0668 | 0.0364 | 0.7093 | 0.1423 |
3.5346 | 54.0 | 21870 | 5.7536 | 0.0684 | 0.0373 | 0.7055 | 0.1418 |
3.4931 | 55.0 | 22275 | 5.7447 | 0.0708 | 0.0381 | 0.6973 | 0.1463 |
3.4485 | 56.0 | 22680 | 5.7371 | 0.0693 | 0.0390 | 0.6964 | 0.1482 |
3.4095 | 57.0 | 23085 | 5.7284 | 0.0702 | 0.0393 | 0.6795 | 0.1522 |
3.3665 | 58.0 | 23490 | 5.7167 | 0.0705 | 0.0389 | 0.6859 | 0.1540 |
3.3293 | 59.0 | 23895 | 5.7250 | 0.0705 | 0.0382 | 0.6725 | 0.1590 |
3.293 | 60.0 | 24300 | 5.7100 | 0.0721 | 0.0405 | 0.6713 | 0.1587 |
3.2525 | 61.0 | 24705 | 5.7086 | 0.0745 | 0.0415 | 0.6691 | 0.1633 |
3.216 | 62.0 | 25110 | 5.7037 | 0.0742 | 0.0418 | 0.6610 | 0.1643 |
3.1776 | 63.0 | 25515 | 5.7055 | 0.0724 | 0.0401 | 0.6590 | 0.1647 |
3.1382 | 64.0 | 25920 | 5.6999 | 0.0742 | 0.0412 | 0.6527 | 0.1700 |
3.109 | 65.0 | 26325 | 5.7015 | 0.0718 | 0.0394 | 0.6424 | 0.1700 |
3.0677 | 66.0 | 26730 | 5.6967 | 0.0736 | 0.0417 | 0.6525 | 0.1643 |
3.0353 | 67.0 | 27135 | 5.6904 | 0.0745 | 0.0427 | 0.6422 | 0.1692 |
3.0014 | 68.0 | 27540 | 5.6926 | 0.0755 | 0.0440 | 0.6392 | 0.1723 |
2.9632 | 69.0 | 27945 | 5.6775 | 0.0770 | 0.0468 | 0.6328 | 0.1783 |
2.9307 | 70.0 | 28350 | 5.6893 | 0.0733 | 0.0445 | 0.6367 | 0.1721 |
2.9036 | 71.0 | 28755 | 5.6844 | 0.0755 | 0.0443 | 0.6288 | 0.1757 |
2.8705 | 72.0 | 29160 | 5.6800 | 0.0767 | 0.0461 | 0.6308 | 0.1788 |
2.838 | 73.0 | 29565 | 5.6779 | 0.0783 | 0.0476 | 0.6227 | 0.1857 |
2.8044 | 74.0 | 29970 | 5.6806 | 0.0792 | 0.0485 | 0.6194 | 0.1882 |
2.7717 | 75.0 | 30375 | 5.6774 | 0.0792 | 0.0477 | 0.6157 | 0.1851 |
2.7419 | 76.0 | 30780 | 5.6751 | 0.0776 | 0.0473 | 0.6098 | 0.1838 |
2.7131 | 77.0 | 31185 | 5.6763 | 0.0764 | 0.0460 | 0.6013 | 0.1870 |
2.6837 | 78.0 | 31590 | 5.6737 | 0.0810 | 0.0492 | 0.5997 | 0.1900 |
2.6545 | 79.0 | 31995 | 5.6724 | 0.0779 | 0.0501 | 0.6040 | 0.1954 |
2.6284 | 80.0 | 32400 | 5.6769 | 0.0804 | 0.0499 | 0.5999 | 0.1895 |
2.5979 | 81.0 | 32805 | 5.6731 | 0.0779 | 0.0487 | 0.6001 | 0.1897 |
2.5709 | 82.0 | 33210 | 5.6764 | 0.0795 | 0.0492 | 0.5946 | 0.1917 |
2.5454 | 83.0 | 33615 | 5.6707 | 0.0779 | 0.0478 | 0.5842 | 0.1939 |
2.5233 | 84.0 | 34020 | 5.6707 | 0.0820 | 0.0514 | 0.5847 | 0.1974 |
2.4933 | 85.0 | 34425 | 5.6718 | 0.0807 | 0.0515 | 0.5819 | 0.1932 |
2.4688 | 86.0 | 34830 | 5.6685 | 0.0835 | 0.0519 | 0.5781 | 0.2021 |
2.4451 | 87.0 | 35235 | 5.6717 | 0.0829 | 0.0533 | 0.5842 | 0.1973 |
2.4199 | 88.0 | 35640 | 5.6718 | 0.0829 | 0.0537 | 0.5791 | 0.2019 |
2.3913 | 89.0 | 36045 | 5.6707 | 0.0832 | 0.0533 | 0.5769 | 0.2032 |
2.3681 | 90.0 | 36450 | 5.6740 | 0.0820 | 0.0527 | 0.5766 | 0.1980 |
2.3453 | 91.0 | 36855 | 5.6719 | 0.0826 | 0.0533 | 0.5686 | 0.2008 |
2.3215 | 92.0 | 37260 | 5.6752 | 0.0841 | 0.0547 | 0.5758 | 0.2025 |
2.3008 | 93.0 | 37665 | 5.6851 | 0.0813 | 0.0517 | 0.5685 | 0.2009 |
2.284 | 94.0 | 38070 | 5.6749 | 0.0851 | 0.0556 | 0.5680 | 0.2069 |
2.2555 | 95.0 | 38475 | 5.6794 | 0.0817 | 0.0526 | 0.5642 | 0.2024 |
2.2349 | 96.0 | 38880 | 5.6793 | 0.0841 | 0.0541 | 0.5619 | 0.2047 |
2.2171 | 97.0 | 39285 | 5.6854 | 0.0844 | 0.0550 | 0.5608 | 0.2044 |
2.1979 | 98.0 | 39690 | 5.6859 | 0.0848 | 0.0555 | 0.5600 | 0.2031 |
2.178 | 99.0 | 40095 | 5.6810 | 0.0844 | 0.0544 | 0.5453 | 0.2075 |
2.1562 | 100.0 | 40500 | 5.6863 | 0.0841 | 0.0547 | 0.5489 | 0.2066 |
2.1394 | 101.0 | 40905 | 5.6865 | 0.0841 | 0.0554 | 0.5522 | 0.2043 |
2.1205 | 102.0 | 41310 | 5.6817 | 0.0838 | 0.0558 | 0.5417 | 0.2100 |
2.1022 | 103.0 | 41715 | 5.6893 | 0.0835 | 0.0551 | 0.5461 | 0.2096 |
2.0866 | 104.0 | 42120 | 5.6893 | 0.0860 | 0.0561 | 0.5493 | 0.2084 |
2.0732 | 105.0 | 42525 | 5.6843 | 0.0869 | 0.0568 | 0.5404 | 0.2110 |
2.0514 | 106.0 | 42930 | 5.6885 | 0.0854 | 0.0560 | 0.5403 | 0.2120 |
2.0384 | 107.0 | 43335 | 5.7016 | 0.0848 | 0.0551 | 0.5465 | 0.2083 |
2.0207 | 108.0 | 43740 | 5.6923 | 0.0860 | 0.0561 | 0.5375 | 0.2092 |
2.006 | 109.0 | 44145 | 5.6976 | 0.0857 | 0.0571 | 0.5358 | 0.2112 |
1.9922 | 110.0 | 44550 | 5.6940 | 0.0841 | 0.0554 | 0.5300 | 0.2112 |
1.971 | 111.0 | 44955 | 5.6990 | 0.0854 | 0.0566 | 0.5328 | 0.2106 |
1.9569 | 112.0 | 45360 | 5.6993 | 0.0878 | 0.0582 | 0.5305 | 0.2130 |
1.9454 | 113.0 | 45765 | 5.6989 | 0.0863 | 0.0574 | 0.5296 | 0.2159 |
1.9346 | 114.0 | 46170 | 5.6989 | 0.0872 | 0.0581 | 0.5273 | 0.2154 |
1.9206 | 115.0 | 46575 | 5.7015 | 0.0882 | 0.0582 | 0.5261 | 0.2180 |
1.9041 | 116.0 | 46980 | 5.7054 | 0.0872 | 0.0580 | 0.5237 | 0.2154 |
1.8955 | 117.0 | 47385 | 5.7080 | 0.0878 | 0.0583 | 0.5270 | 0.2159 |
1.8784 | 118.0 | 47790 | 5.7098 | 0.0872 | 0.0581 | 0.5261 | 0.2148 |
1.8666 | 119.0 | 48195 | 5.7112 | 0.0866 | 0.0581 | 0.5249 | 0.2151 |
1.861 | 120.0 | 48600 | 5.7105 | 0.0900 | 0.0599 | 0.5227 | 0.2202 |
1.8475 | 121.0 | 49005 | 5.7124 | 0.0875 | 0.0583 | 0.5264 | 0.2185 |
1.8359 | 122.0 | 49410 | 5.7127 | 0.0872 | 0.0585 | 0.5267 | 0.2155 |
1.8251 | 123.0 | 49815 | 5.7148 | 0.0872 | 0.0599 | 0.5218 | 0.2171 |
1.8143 | 124.0 | 50220 | 5.7138 | 0.0882 | 0.0593 | 0.5208 | 0.2205 |
1.8107 | 125.0 | 50625 | 5.7141 | 0.0878 | 0.0597 | 0.5199 | 0.2169 |
1.7978 | 126.0 | 51030 | 5.7192 | 0.0882 | 0.0594 | 0.5173 | 0.2201 |
1.7883 | 127.0 | 51435 | 5.7165 | 0.0903 | 0.0613 | 0.5188 | 0.2181 |
1.7818 | 128.0 | 51840 | 5.7194 | 0.0885 | 0.0603 | 0.5157 | 0.2208 |
1.7756 | 129.0 | 52245 | 5.7185 | 0.0875 | 0.0596 | 0.5145 | 0.2210 |
1.7653 | 130.0 | 52650 | 5.7165 | 0.0875 | 0.0583 | 0.5136 | 0.2231 |
1.7526 | 131.0 | 53055 | 5.7207 | 0.0891 | 0.0603 | 0.5170 | 0.2219 |
1.752 | 132.0 | 53460 | 5.7171 | 0.0891 | 0.0607 | 0.5144 | 0.2202 |
1.7485 | 133.0 | 53865 | 5.7230 | 0.0888 | 0.0604 | 0.5174 | 0.2217 |
1.7337 | 134.0 | 54270 | 5.7249 | 0.0888 | 0.0609 | 0.5175 | 0.2216 |
1.7337 | 135.0 | 54675 | 5.7237 | 0.0897 | 0.0615 | 0.5158 | 0.2203 |
1.7256 | 136.0 | 55080 | 5.7219 | 0.0900 | 0.0620 | 0.5164 | 0.2205 |
1.718 | 137.0 | 55485 | 5.7240 | 0.0894 | 0.0611 | 0.5125 | 0.2206 |
1.7183 | 138.0 | 55890 | 5.7249 | 0.0906 | 0.0616 | 0.5165 | 0.2204 |
1.7154 | 139.0 | 56295 | 5.7256 | 0.0891 | 0.0610 | 0.5138 | 0.2214 |
1.7056 | 140.0 | 56700 | 5.7254 | 0.0903 | 0.0616 | 0.5138 | 0.2238 |
1.7052 | 141.0 | 57105 | 5.7227 | 0.0894 | 0.0611 | 0.5146 | 0.2217 |
1.7005 | 142.0 | 57510 | 5.7250 | 0.0912 | 0.0625 | 0.5180 | 0.2223 |
1.6963 | 143.0 | 57915 | 5.7285 | 0.0912 | 0.0622 | 0.5169 | 0.2211 |
1.6961 | 144.0 | 58320 | 5.7269 | 0.0912 | 0.0624 | 0.5178 | 0.2226 |
1.6884 | 145.0 | 58725 | 5.7275 | 0.0909 | 0.0620 | 0.5165 | 0.2238 |
1.6907 | 146.0 | 59130 | 5.7284 | 0.0906 | 0.0617 | 0.5169 | 0.2233 |
1.6871 | 147.0 | 59535 | 5.7278 | 0.0906 | 0.0621 | 0.5175 | 0.2240 |
1.6827 | 148.0 | 59940 | 5.7273 | 0.0906 | 0.0621 | 0.5162 | 0.2236 |
1.6837 | 149.0 | 60345 | 5.7269 | 0.0909 | 0.0624 | 0.5155 | 0.2249 |
1.6823 | 150.0 | 60750 | 5.7273 | 0.0909 | 0.0623 | 0.5164 | 0.2249 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1