segformer-b0-finetuned-net-4Sep

This model is a fine-tuned version of PushkarA07/segformer-b0-finetuned-net-4Sep on the PushkarA07/batch2-tiles_W5 dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are computed follows the list):

  • Loss: 0.0043
  • Mean Iou: 0.8489
  • Mean Accuracy: 0.8926
  • Overall Accuracy: 0.9984
  • Accuracy Abnormality: 0.7857
  • Iou Abnormality: 0.6994
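
These metric names match the output of the `evaluate` library's `mean_iou` metric for a two-label task, which is the usual choice for SegFormer fine-tuning. Below is a minimal sketch of how such numbers are computed; the label layout (0 = background, 1 = abnormality) and the ignore value are assumptions, not documented by this card:

```python
# Sketch: computing mean-IoU-style metrics with the evaluate library.
# Assumes label 0 = background and label 1 = abnormality (undocumented here).
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 4x4 masks standing in for real predictions and references.
pred = np.zeros((4, 4), dtype=np.int64)
pred[1:3, 1:3] = 1
ref = np.zeros((4, 4), dtype=np.int64)
ref[1:4, 1:4] = 1

results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=2,
    ignore_index=255,  # conventional "ignore" value in segmentation masks
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"][1])       # IoU for the abnormality class
print(results["per_category_accuracy"][1])  # accuracy for the abnormality class
```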

Model description

More information needed

Intended uses & limitations

More information needed
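
Pending fuller documentation, here is a minimal inference sketch using the standard transformers SegFormer classes. The checkpoint id comes from this card; the input file name and the upsampling step are generic placeholders:

```python
# Sketch: per-pixel segmentation with this checkpoint.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

ckpt = "PushkarA07/segformer-b0-finetuned-net-4Sep"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt).eval()

image = Image.open("tile.png").convert("RGB")  # hypothetical input tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer outputs logits at 1/4 resolution; upsample before the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # per-pixel label ids
```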

Training and evaluation data

More information needed
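
The dataset id in the summary is on the Hub; a minimal loading sketch follows (split and column names are not documented here, so inspect the dataset before assuming them):

```python
# Sketch: loading the fine-tuning dataset referenced by this card.
from datasets import load_dataset

ds = load_dataset("PushkarA07/batch2-tiles_W5")
print(ds)  # check the actual splits and column names before use
```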

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map to TrainingArguments):

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
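
For reference, these settings translate to transformers TrainingArguments roughly as shown below (a sketch; the output path is a placeholder and the Trainer/dataset wiring is omitted):

```python
# Sketch: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-net-4Sep",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",   # fused AdamW: betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```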

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:---------------:|
| 0.0105 | 0.5556 | 10 | 0.0057 | 0.8215 | 0.8853 | 0.9979 | 0.7715 | 0.6451 |
| 0.0053 | 1.1111 | 20 | 0.0052 | 0.8205 | 0.8583 | 0.9981 | 0.7172 | 0.6430 |
| 0.0123 | 1.6667 | 30 | 0.0052 | 0.8345 | 0.8931 | 0.9981 | 0.7870 | 0.6710 |
| 0.0048 | 2.2222 | 40 | 0.0048 | 0.8348 | 0.8720 | 0.9982 | 0.7445 | 0.6713 |
| 0.0102 | 2.7778 | 50 | 0.0049 | 0.8308 | 0.8649 | 0.9982 | 0.7303 | 0.6635 |
| 0.0037 | 3.3333 | 60 | 0.0048 | 0.8355 | 0.8789 | 0.9982 | 0.7585 | 0.6728 |
| 0.0059 | 3.8889 | 70 | 0.0048 | 0.8404 | 0.8860 | 0.9983 | 0.7726 | 0.6825 |
| 0.0071 | 4.4444 | 80 | 0.0046 | 0.8400 | 0.8780 | 0.9983 | 0.7565 | 0.6817 |
| 0.0029 | 5.0 | 90 | 0.0046 | 0.8422 | 0.8825 | 0.9983 | 0.7655 | 0.6861 |
| 0.008 | 5.5556 | 100 | 0.0045 | 0.8429 | 0.8834 | 0.9983 | 0.7673 | 0.6876 |
| 0.0054 | 6.1111 | 110 | 0.0047 | 0.8426 | 0.8909 | 0.9983 | 0.7825 | 0.6870 |
| 0.0059 | 6.6667 | 120 | 0.0047 | 0.8322 | 0.8613 | 0.9982 | 0.7230 | 0.6661 |
| 0.0097 | 7.2222 | 130 | 0.0049 | 0.8431 | 0.9026 | 0.9982 | 0.8060 | 0.6879 |
| 0.0063 | 7.7778 | 140 | 0.0045 | 0.8377 | 0.8707 | 0.9983 | 0.7419 | 0.6771 |
| 0.0102 | 8.3333 | 150 | 0.0047 | 0.8346 | 0.8706 | 0.9982 | 0.7418 | 0.6711 |
| 0.0065 | 8.8889 | 160 | 0.0046 | 0.8438 | 0.8927 | 0.9983 | 0.7861 | 0.6893 |
| 0.004 | 9.4444 | 170 | 0.0046 | 0.8427 | 0.8805 | 0.9983 | 0.7615 | 0.6870 |
| 0.0057 | 10.0 | 180 | 0.0046 | 0.8469 | 0.9000 | 0.9983 | 0.8007 | 0.6956 |
| 0.0097 | 10.5556 | 190 | 0.0044 | 0.8430 | 0.8789 | 0.9983 | 0.7582 | 0.6877 |
| 0.0059 | 11.1111 | 200 | 0.0045 | 0.8467 | 0.8955 | 0.9983 | 0.7918 | 0.6951 |
| 0.0124 | 11.6667 | 210 | 0.0044 | 0.8438 | 0.8820 | 0.9983 | 0.7645 | 0.6893 |
| 0.0105 | 12.2222 | 220 | 0.0044 | 0.8470 | 0.8905 | 0.9983 | 0.7817 | 0.6956 |
| 0.0043 | 12.7778 | 230 | 0.0044 | 0.8472 | 0.8877 | 0.9984 | 0.7760 | 0.6961 |
| 0.0041 | 13.3333 | 240 | 0.0044 | 0.8438 | 0.8772 | 0.9983 | 0.7549 | 0.6892 |
| 0.0083 | 13.8889 | 250 | 0.0044 | 0.8427 | 0.8752 | 0.9983 | 0.7509 | 0.6870 |
| 0.0089 | 14.4444 | 260 | 0.0044 | 0.8424 | 0.8774 | 0.9983 | 0.7554 | 0.6864 |
| 0.0034 | 15.0 | 270 | 0.0044 | 0.8455 | 0.8885 | 0.9983 | 0.7776 | 0.6927 |
| 0.0044 | 15.5556 | 280 | 0.0044 | 0.8456 | 0.8879 | 0.9983 | 0.7764 | 0.6928 |
| 0.0052 | 16.1111 | 290 | 0.0044 | 0.8450 | 0.8859 | 0.9983 | 0.7724 | 0.6916 |
| 0.0055 | 16.6667 | 300 | 0.0045 | 0.8470 | 0.8964 | 0.9983 | 0.7935 | 0.6957 |
| 0.0045 | 17.2222 | 310 | 0.0044 | 0.8462 | 0.8896 | 0.9983 | 0.7797 | 0.6941 |
| 0.0048 | 17.7778 | 320 | 0.0045 | 0.8449 | 0.8902 | 0.9983 | 0.7811 | 0.6914 |
| 0.0048 | 18.3333 | 330 | 0.0044 | 0.8449 | 0.8833 | 0.9983 | 0.7671 | 0.6914 |
| 0.0044 | 18.8889 | 340 | 0.0044 | 0.8466 | 0.8888 | 0.9983 | 0.7783 | 0.6948 |
| 0.0061 | 19.4444 | 350 | 0.0045 | 0.8483 | 0.9034 | 0.9983 | 0.8075 | 0.6983 |
| 0.0026 | 20.0 | 360 | 0.0044 | 0.8461 | 0.8838 | 0.9984 | 0.7682 | 0.6939 |
| 0.008 | 20.5556 | 370 | 0.0044 | 0.8488 | 0.8936 | 0.9984 | 0.7879 | 0.6993 |
| 0.0047 | 21.1111 | 380 | 0.0044 | 0.8482 | 0.8955 | 0.9983 | 0.7916 | 0.6980 |
| 0.0109 | 21.6667 | 390 | 0.0045 | 0.8465 | 0.8964 | 0.9983 | 0.7935 | 0.6947 |
| 0.007 | 22.2222 | 400 | 0.0045 | 0.8440 | 0.8788 | 0.9983 | 0.7581 | 0.6897 |
| 0.0101 | 22.7778 | 410 | 0.0044 | 0.8464 | 0.8914 | 0.9983 | 0.7835 | 0.6945 |
| 0.0038 | 23.3333 | 420 | 0.0044 | 0.8425 | 0.8802 | 0.9983 | 0.7608 | 0.6867 |
| 0.0049 | 23.8889 | 430 | 0.0045 | 0.8463 | 0.8916 | 0.9983 | 0.7838 | 0.6943 |
| 0.0074 | 24.4444 | 440 | 0.0044 | 0.8455 | 0.8861 | 0.9983 | 0.7727 | 0.6927 |
| 0.0065 | 25.0 | 450 | 0.0044 | 0.8466 | 0.8861 | 0.9984 | 0.7728 | 0.6948 |
| 0.0056 | 25.5556 | 460 | 0.0046 | 0.8490 | 0.9117 | 0.9983 | 0.8243 | 0.6997 |
| 0.0047 | 26.1111 | 470 | 0.0043 | 0.8469 | 0.8839 | 0.9984 | 0.7682 | 0.6955 |
| 0.0043 | 26.6667 | 480 | 0.0045 | 0.8475 | 0.8983 | 0.9983 | 0.7973 | 0.6967 |
| 0.0051 | 27.2222 | 490 | 0.0044 | 0.8430 | 0.8810 | 0.9983 | 0.7625 | 0.6877 |
| 0.0131 | 27.7778 | 500 | 0.0044 | 0.8477 | 0.8903 | 0.9984 | 0.7812 | 0.6971 |
| 0.0042 | 28.3333 | 510 | 0.0044 | 0.8489 | 0.9029 | 0.9983 | 0.8066 | 0.6996 |
| 0.0074 | 28.8889 | 520 | 0.0043 | 0.8454 | 0.8827 | 0.9983 | 0.7660 | 0.6924 |
| 0.002 | 29.4444 | 530 | 0.0044 | 0.8461 | 0.8892 | 0.9983 | 0.7789 | 0.6939 |
| 0.0125 | 30.0 | 540 | 0.0043 | 0.8470 | 0.8896 | 0.9983 | 0.7797 | 0.6957 |
| 0.0054 | 30.5556 | 550 | 0.0044 | 0.8471 | 0.8928 | 0.9983 | 0.7862 | 0.6959 |
| 0.009 | 31.1111 | 560 | 0.0044 | 0.8399 | 0.8690 | 0.9983 | 0.7383 | 0.6814 |
| 0.0074 | 31.6667 | 570 | 0.0044 | 0.8493 | 0.8937 | 0.9984 | 0.7880 | 0.7002 |
| 0.0048 | 32.2222 | 580 | 0.0045 | 0.8463 | 0.8897 | 0.9983 | 0.7801 | 0.6943 |
| 0.004 | 32.7778 | 590 | 0.0044 | 0.8479 | 0.8955 | 0.9983 | 0.7917 | 0.6975 |
| 0.0047 | 33.3333 | 600 | 0.0045 | 0.8488 | 0.9040 | 0.9983 | 0.8088 | 0.6993 |
| 0.0039 | 33.8889 | 610 | 0.0044 | 0.8476 | 0.8909 | 0.9983 | 0.7825 | 0.6968 |
| 0.0047 | 34.4444 | 620 | 0.0044 | 0.8483 | 0.8937 | 0.9983 | 0.7880 | 0.6983 |
| 0.0038 | 35.0 | 630 | 0.0044 | 0.8496 | 0.8980 | 0.9984 | 0.7966 | 0.7009 |
| 0.0028 | 35.5556 | 640 | 0.0043 | 0.8486 | 0.8928 | 0.9984 | 0.7861 | 0.6988 |
| 0.0129 | 36.1111 | 650 | 0.0043 | 0.8491 | 0.8935 | 0.9984 | 0.7877 | 0.6998 |
| 0.0055 | 36.6667 | 660 | 0.0044 | 0.8499 | 0.9056 | 0.9983 | 0.8119 | 0.7014 |
| 0.0089 | 37.2222 | 670 | 0.0044 | 0.8481 | 0.8899 | 0.9984 | 0.7804 | 0.6979 |
| 0.0042 | 37.7778 | 680 | 0.0043 | 0.8501 | 0.8935 | 0.9984 | 0.7877 | 0.7018 |
| 0.0068 | 38.3333 | 690 | 0.0044 | 0.8451 | 0.8826 | 0.9983 | 0.7657 | 0.6919 |
| 0.0036 | 38.8889 | 700 | 0.0044 | 0.8506 | 0.8978 | 0.9984 | 0.7963 | 0.7028 |
| 0.0129 | 39.4444 | 710 | 0.0043 | 0.8488 | 0.8904 | 0.9984 | 0.7814 | 0.6993 |
| 0.0043 | 40.0 | 720 | 0.0043 | 0.8491 | 0.8934 | 0.9984 | 0.7874 | 0.6999 |
| 0.0036 | 40.5556 | 730 | 0.0044 | 0.8410 | 0.8734 | 0.9983 | 0.7473 | 0.6837 |
| 0.0103 | 41.1111 | 740 | 0.0045 | 0.8497 | 0.9069 | 0.9983 | 0.8145 | 0.7011 |
| 0.0071 | 41.6667 | 750 | 0.0043 | 0.8448 | 0.8783 | 0.9984 | 0.7571 | 0.6913 |
| 0.0072 | 42.2222 | 760 | 0.0044 | 0.8501 | 0.8945 | 0.9984 | 0.7897 | 0.7018 |
| 0.0038 | 42.7778 | 770 | 0.0043 | 0.8495 | 0.8916 | 0.9984 | 0.7839 | 0.7007 |
| 0.0049 | 43.3333 | 780 | 0.0043 | 0.8449 | 0.8777 | 0.9984 | 0.7558 | 0.6915 |
| 0.0062 | 43.8889 | 790 | 0.0044 | 0.8495 | 0.8958 | 0.9984 | 0.7922 | 0.7006 |
| 0.0044 | 44.4444 | 800 | 0.0043 | 0.8492 | 0.8913 | 0.9984 | 0.7832 | 0.7001 |
| 0.0034 | 45.0 | 810 | 0.0044 | 0.8480 | 0.8854 | 0.9984 | 0.7713 | 0.6976 |
| 0.0033 | 45.5556 | 820 | 0.0043 | 0.8486 | 0.8903 | 0.9984 | 0.7813 | 0.6989 |
| 0.0053 | 46.1111 | 830 | 0.0044 | 0.8481 | 0.8982 | 0.9983 | 0.7970 | 0.6980 |
| 0.0025 | 46.6667 | 840 | 0.0044 | 0.8460 | 0.8855 | 0.9983 | 0.7716 | 0.6937 |
| 0.0069 | 47.2222 | 850 | 0.0044 | 0.8463 | 0.8827 | 0.9984 | 0.7660 | 0.6942 |
| 0.0031 | 47.7778 | 860 | 0.0043 | 0.8498 | 0.8917 | 0.9984 | 0.7840 | 0.7013 |
| 0.006 | 48.3333 | 870 | 0.0043 | 0.8462 | 0.8820 | 0.9984 | 0.7645 | 0.6941 |
| 0.0025 | 48.8889 | 880 | 0.0044 | 0.8498 | 0.8956 | 0.9984 | 0.7918 | 0.7011 |
| 0.0075 | 49.4444 | 890 | 0.0043 | 0.8498 | 0.8915 | 0.9984 | 0.7835 | 0.7013 |
| 0.0064 | 50.0 | 900 | 0.0043 | 0.8500 | 0.8931 | 0.9984 | 0.7867 | 0.7016 |
| 0.0067 | 50.5556 | 910 | 0.0043 | 0.8518 | 0.9003 | 0.9984 | 0.8013 | 0.7052 |
| 0.0046 | 51.1111 | 920 | 0.0043 | 0.8496 | 0.8913 | 0.9984 | 0.7832 | 0.7008 |
| 0.0075 | 51.6667 | 930 | 0.0044 | 0.8483 | 0.8884 | 0.9984 | 0.7775 | 0.6983 |
| 0.0102 | 52.2222 | 940 | 0.0044 | 0.8506 | 0.8991 | 0.9984 | 0.7988 | 0.7028 |
| 0.0056 | 52.7778 | 950 | 0.0043 | 0.8488 | 0.8909 | 0.9984 | 0.7824 | 0.6991 |
| 0.0066 | 53.3333 | 960 | 0.0043 | 0.8451 | 0.8800 | 0.9984 | 0.7605 | 0.6918 |
| 0.0073 | 53.8889 | 970 | 0.0044 | 0.8519 | 0.9031 | 0.9984 | 0.8070 | 0.7055 |
| 0.0054 | 54.4444 | 980 | 0.0043 | 0.8445 | 0.8772 | 0.9984 | 0.7549 | 0.6906 |
| 0.0066 | 55.0 | 990 | 0.0043 | 0.8506 | 0.8955 | 0.9984 | 0.7916 | 0.7028 |
| 0.0047 | 55.5556 | 1000 | 0.0043 | 0.8523 | 0.9037 | 0.9984 | 0.8082 | 0.7062 |
| 0.0083 | 56.1111 | 1010 | 0.0043 | 0.8492 | 0.8892 | 0.9984 | 0.7789 | 0.7001 |
| 0.0022 | 56.6667 | 1020 | 0.0043 | 0.8483 | 0.8866 | 0.9984 | 0.7737 | 0.6983 |
| 0.011 | 57.2222 | 1030 | 0.0043 | 0.8439 | 0.8757 | 0.9984 | 0.7519 | 0.6894 |
| 0.0033 | 57.7778 | 1040 | 0.0043 | 0.8496 | 0.8967 | 0.9984 | 0.7940 | 0.7008 |
| 0.0051 | 58.3333 | 1050 | 0.0044 | 0.8504 | 0.8978 | 0.9984 | 0.7963 | 0.7025 |
| 0.0029 | 58.8889 | 1060 | 0.0043 | 0.8491 | 0.8895 | 0.9984 | 0.7796 | 0.6999 |
| 0.0052 | 59.4444 | 1070 | 0.0043 | 0.8465 | 0.8812 | 0.9984 | 0.7628 | 0.6946 |
| 0.0036 | 60.0 | 1080 | 0.0043 | 0.8511 | 0.8999 | 0.9984 | 0.8005 | 0.7038 |
| 0.0026 | 60.5556 | 1090 | 0.0043 | 0.8498 | 0.8899 | 0.9984 | 0.7804 | 0.7012 |
| 0.0023 | 61.1111 | 1100 | 0.0043 | 0.8498 | 0.8906 | 0.9984 | 0.7818 | 0.7013 |
| 0.0087 | 61.6667 | 1110 | 0.0043 | 0.8464 | 0.8817 | 0.9984 | 0.7638 | 0.6945 |
| 0.0057 | 62.2222 | 1120 | 0.0043 | 0.8495 | 0.8914 | 0.9984 | 0.7833 | 0.7006 |
| 0.004 | 62.7778 | 1130 | 0.0043 | 0.8482 | 0.8865 | 0.9984 | 0.7735 | 0.6980 |
| 0.0087 | 63.3333 | 1140 | 0.0043 | 0.8506 | 0.8979 | 0.9984 | 0.7964 | 0.7029 |
| 0.0087 | 63.8889 | 1150 | 0.0043 | 0.8488 | 0.8900 | 0.9984 | 0.7805 | 0.6993 |
| 0.004 | 64.4444 | 1160 | 0.0044 | 0.8502 | 0.8983 | 0.9984 | 0.7972 | 0.7020 |
| 0.0037 | 65.0 | 1170 | 0.0044 | 0.8501 | 0.8994 | 0.9984 | 0.7996 | 0.7018 |
| 0.0084 | 65.5556 | 1180 | 0.0043 | 0.8485 | 0.8884 | 0.9984 | 0.7773 | 0.6986 |
| 0.0034 | 66.1111 | 1190 | 0.0044 | 0.8496 | 0.8953 | 0.9984 | 0.7912 | 0.7008 |
| 0.0063 | 66.6667 | 1200 | 0.0044 | 0.8498 | 0.9003 | 0.9983 | 0.8013 | 0.7012 |
| 0.0091 | 67.2222 | 1210 | 0.0043 | 0.8482 | 0.8885 | 0.9984 | 0.7775 | 0.6980 |
| 0.009 | 67.7778 | 1220 | 0.0043 | 0.8475 | 0.8857 | 0.9984 | 0.7719 | 0.6967 |
| 0.0054 | 68.3333 | 1230 | 0.0043 | 0.8493 | 0.8939 | 0.9984 | 0.7885 | 0.7003 |
| 0.0043 | 68.8889 | 1240 | 0.0043 | 0.8477 | 0.8904 | 0.9984 | 0.7814 | 0.6972 |
| 0.0046 | 69.4444 | 1250 | 0.0043 | 0.8476 | 0.8884 | 0.9984 | 0.7773 | 0.6968 |
| 0.0053 | 70.0 | 1260 | 0.0043 | 0.8445 | 0.8777 | 0.9984 | 0.7559 | 0.6907 |
| 0.0066 | 70.5556 | 1270 | 0.0043 | 0.8491 | 0.8903 | 0.9984 | 0.7811 | 0.6999 |
| 0.0044 | 71.1111 | 1280 | 0.0043 | 0.8480 | 0.8874 | 0.9984 | 0.7752 | 0.6976 |
| 0.0073 | 71.6667 | 1290 | 0.0043 | 0.8490 | 0.8921 | 0.9984 | 0.7848 | 0.6995 |
| 0.0035 | 72.2222 | 1300 | 0.0043 | 0.8486 | 0.8910 | 0.9984 | 0.7825 | 0.6989 |
| 0.0074 | 72.7778 | 1310 | 0.0044 | 0.8481 | 0.8896 | 0.9984 | 0.7798 | 0.6979 |
| 0.006 | 73.3333 | 1320 | 0.0044 | 0.8487 | 0.8949 | 0.9983 | 0.7904 | 0.6990 |
| 0.0062 | 73.8889 | 1330 | 0.0044 | 0.8492 | 0.8984 | 0.9983 | 0.7975 | 0.7001 |
| 0.0113 | 74.4444 | 1340 | 0.0043 | 0.8480 | 0.8908 | 0.9984 | 0.7821 | 0.6976 |
| 0.004 | 75.0 | 1350 | 0.0043 | 0.8452 | 0.8807 | 0.9984 | 0.7619 | 0.6920 |
| 0.0066 | 75.5556 | 1360 | 0.0043 | 0.8464 | 0.8844 | 0.9984 | 0.7693 | 0.6945 |
| 0.0059 | 76.1111 | 1370 | 0.0043 | 0.8486 | 0.8941 | 0.9984 | 0.7889 | 0.6989 |
| 0.0021 | 76.6667 | 1380 | 0.0044 | 0.8489 | 0.8997 | 0.9983 | 0.8000 | 0.6995 |
| 0.0034 | 77.2222 | 1390 | 0.0044 | 0.8476 | 0.8947 | 0.9983 | 0.7901 | 0.6969 |
| 0.0115 | 77.7778 | 1400 | 0.0044 | 0.8471 | 0.8894 | 0.9983 | 0.7794 | 0.6959 |
| 0.0064 | 78.3333 | 1410 | 0.0044 | 0.8480 | 0.8959 | 0.9983 | 0.7924 | 0.6976 |
| 0.0041 | 78.8889 | 1420 | 0.0044 | 0.8474 | 0.8926 | 0.9983 | 0.7858 | 0.6965 |
| 0.0062 | 79.4444 | 1430 | 0.0044 | 0.8487 | 0.8921 | 0.9984 | 0.7848 | 0.6991 |
| 0.0064 | 80.0 | 1440 | 0.0044 | 0.8486 | 0.8910 | 0.9984 | 0.7825 | 0.6989 |
| 0.0066 | 80.5556 | 1450 | 0.0044 | 0.8481 | 0.8896 | 0.9984 | 0.7799 | 0.6978 |
| 0.0039 | 81.1111 | 1460 | 0.0044 | 0.8490 | 0.8964 | 0.9983 | 0.7934 | 0.6998 |
| 0.0061 | 81.6667 | 1470 | 0.0044 | 0.8497 | 0.9002 | 0.9983 | 0.8011 | 0.7011 |
| 0.006 | 82.2222 | 1480 | 0.0044 | 0.8494 | 0.8962 | 0.9984 | 0.7930 | 0.7005 |
| 0.0036 | 82.7778 | 1490 | 0.0044 | 0.8492 | 0.8937 | 0.9984 | 0.7880 | 0.7001 |
| 0.0154 | 83.3333 | 1500 | 0.0044 | 0.8472 | 0.8868 | 0.9984 | 0.7741 | 0.6960 |
| 0.0063 | 83.8889 | 1510 | 0.0044 | 0.8465 | 0.8842 | 0.9984 | 0.7690 | 0.6946 |
| 0.0085 | 84.4444 | 1520 | 0.0044 | 0.8468 | 0.8878 | 0.9984 | 0.7761 | 0.6954 |
| 0.005 | 85.0 | 1530 | 0.0044 | 0.8491 | 0.8948 | 0.9984 | 0.7902 | 0.6999 |
| 0.0061 | 85.5556 | 1540 | 0.0044 | 0.8502 | 0.8995 | 0.9984 | 0.7996 | 0.7021 |
| 0.0144 | 86.1111 | 1550 | 0.0044 | 0.8488 | 0.8938 | 0.9984 | 0.7883 | 0.6992 |
| 0.0075 | 86.6667 | 1560 | 0.0044 | 0.8484 | 0.8923 | 0.9984 | 0.7851 | 0.6984 |
| 0.0119 | 87.2222 | 1570 | 0.0044 | 0.8482 | 0.8926 | 0.9984 | 0.7859 | 0.6981 |
| 0.0038 | 87.7778 | 1580 | 0.0044 | 0.8486 | 0.8952 | 0.9983 | 0.7910 | 0.6990 |
| 0.0047 | 88.3333 | 1590 | 0.0044 | 0.8486 | 0.8971 | 0.9983 | 0.7949 | 0.6989 |
| 0.0059 | 88.8889 | 1600 | 0.0044 | 0.8487 | 0.8956 | 0.9983 | 0.7918 | 0.6990 |
| 0.0035 | 89.4444 | 1610 | 0.0044 | 0.8479 | 0.8910 | 0.9984 | 0.7826 | 0.6974 |
| 0.0058 | 90.0 | 1620 | 0.0044 | 0.8477 | 0.8908 | 0.9984 | 0.7822 | 0.6971 |
| 0.0067 | 90.5556 | 1630 | 0.0044 | 0.8483 | 0.8943 | 0.9983 | 0.7891 | 0.6982 |
| 0.009 | 91.1111 | 1640 | 0.0044 | 0.8479 | 0.8917 | 0.9984 | 0.7840 | 0.6975 |
| 0.0042 | 91.6667 | 1650 | 0.0044 | 0.8476 | 0.8912 | 0.9983 | 0.7829 | 0.6969 |
| 0.0024 | 92.2222 | 1660 | 0.0044 | 0.8480 | 0.8923 | 0.9984 | 0.7851 | 0.6977 |
| 0.0055 | 92.7778 | 1670 | 0.0044 | 0.8474 | 0.8905 | 0.9983 | 0.7816 | 0.6966 |
| 0.0041 | 93.3333 | 1680 | 0.0044 | 0.8484 | 0.8932 | 0.9984 | 0.7870 | 0.6985 |
| 0.0041 | 93.8889 | 1690 | 0.0044 | 0.8491 | 0.8957 | 0.9984 | 0.7920 | 0.6998 |
| 0.0086 | 94.4444 | 1700 | 0.0044 | 0.8496 | 0.8977 | 0.9984 | 0.7962 | 0.7008 |
| 0.0086 | 95.0 | 1710 | 0.0044 | 0.8499 | 0.8994 | 0.9983 | 0.7995 | 0.7014 |
| 0.0081 | 95.5556 | 1720 | 0.0044 | 0.8497 | 0.8970 | 0.9984 | 0.7946 | 0.7010 |
| 0.0066 | 96.1111 | 1730 | 0.0044 | 0.8494 | 0.8962 | 0.9984 | 0.7931 | 0.7005 |
| 0.0056 | 96.6667 | 1740 | 0.0044 | 0.8494 | 0.8949 | 0.9984 | 0.7903 | 0.7005 |
| 0.0071 | 97.2222 | 1750 | 0.0043 | 0.8494 | 0.8944 | 0.9984 | 0.7894 | 0.7005 |
| 0.013 | 97.7778 | 1760 | 0.0043 | 0.8489 | 0.8923 | 0.9984 | 0.7852 | 0.6995 |
| 0.0044 | 98.3333 | 1770 | 0.0043 | 0.8496 | 0.8950 | 0.9984 | 0.7906 | 0.7008 |
| 0.0049 | 98.8889 | 1780 | 0.0043 | 0.8490 | 0.8929 | 0.9984 | 0.7865 | 0.6996 |
| 0.0079 | 99.4444 | 1790 | 0.0043 | 0.8487 | 0.8924 | 0.9984 | 0.7854 | 0.6991 |
| 0.0065 | 100.0 | 1800 | 0.0043 | 0.8489 | 0.8926 | 0.9984 | 0.7857 | 0.6994 |

Framework versions

  • Transformers 4.56.0
  • Pytorch 2.8.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.0