segformer-b0-finetuned-batch3-19May

This model is a fine-tuned version of PushkarA07/segformer-b0-finetuned-batch2w5-15Dec on the PushkarA07/batch3-tiles_first dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0057
  • Mean Iou: 0.7819
  • Mean Accuracy: 0.8573
  • Overall Accuracy: 0.9984
  • Accuracy Abnormality: 0.7153
  • Iou Abnormality: 0.5654
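
The checkpoint can be loaded for inference with the transformers library. Below is a minimal sketch; the tile path is a placeholder, and it assumes the repository ships its image processor configuration alongside the weights.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "PushkarA07/segformer-b0-finetuned-batch3-19May"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

# "tile.png" is a placeholder input image.
image = Image.open("tile.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample the logits back to the
# input size, then take the per-pixel argmax for the predicted label map.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]
```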

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
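
For reference, these settings map onto transformers TrainingArguments roughly as sketched below; the output_dir is a placeholder, and the dataset and Trainer wiring are omitted.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# betas=(0.9, 0.999) and epsilon=1e-08 are the adamw_torch defaults.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-batch3-19May",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```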

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:---------------:|
| 0.005 | 0.9091 | 10 | 0.0059 | 0.7411 | 0.8270 | 0.9980 | 0.6550 | 0.4843 |
| 0.0038 | 1.8182 | 20 | 0.0061 | 0.7456 | 0.8639 | 0.9978 | 0.7293 | 0.4935 |
| 0.004 | 2.7273 | 30 | 0.0053 | 0.7557 | 0.8153 | 0.9983 | 0.6312 | 0.5132 |
| 0.0041 | 3.6364 | 40 | 0.0053 | 0.7559 | 0.8268 | 0.9982 | 0.6544 | 0.5137 |
| 0.0051 | 4.5455 | 50 | 0.0051 | 0.7659 | 0.8514 | 0.9982 | 0.7037 | 0.5335 |
| 0.0022 | 5.4545 | 60 | 0.0049 | 0.7669 | 0.8381 | 0.9983 | 0.6770 | 0.5356 |
| 0.0023 | 6.3636 | 70 | 0.0052 | 0.7667 | 0.8571 | 0.9982 | 0.7153 | 0.5353 |
| 0.0016 | 7.2727 | 80 | 0.0052 | 0.7679 | 0.8263 | 0.9984 | 0.6533 | 0.5375 |
| 0.0025 | 8.1818 | 90 | 0.0054 | 0.7687 | 0.8682 | 0.9982 | 0.7374 | 0.5392 |
| 0.0045 | 9.0909 | 100 | 0.0060 | 0.7622 | 0.8849 | 0.9980 | 0.7712 | 0.5264 |
| 0.0048 | 10.0 | 110 | 0.0054 | 0.7609 | 0.8045 | 0.9984 | 0.6094 | 0.5235 |
| 0.0035 | 10.9091 | 120 | 0.0054 | 0.7626 | 0.8587 | 0.9981 | 0.7185 | 0.5271 |
| 0.0033 | 11.8182 | 130 | 0.0051 | 0.7700 | 0.8375 | 0.9983 | 0.6758 | 0.5417 |
| 0.0022 | 12.7273 | 140 | 0.0052 | 0.7691 | 0.8476 | 0.9983 | 0.6961 | 0.5400 |
| 0.0039 | 13.6364 | 150 | 0.0055 | 0.7639 | 0.8710 | 0.9981 | 0.7432 | 0.5296 |
| 0.0034 | 14.5455 | 160 | 0.0051 | 0.7740 | 0.8457 | 0.9984 | 0.6922 | 0.5497 |
| 0.0039 | 15.4545 | 170 | 0.0053 | 0.7718 | 0.8653 | 0.9982 | 0.7316 | 0.5453 |
| 0.0024 | 16.3636 | 180 | 0.0053 | 0.7733 | 0.8620 | 0.9983 | 0.7248 | 0.5484 |
| 0.0023 | 17.2727 | 190 | 0.0051 | 0.7756 | 0.8424 | 0.9984 | 0.6855 | 0.5527 |
| 0.0022 | 18.1818 | 200 | 0.0053 | 0.7745 | 0.8748 | 0.9982 | 0.7507 | 0.5507 |
| 0.0041 | 19.0909 | 210 | 0.0051 | 0.7755 | 0.8470 | 0.9984 | 0.6947 | 0.5527 |
| 0.0025 | 20.0 | 220 | 0.0052 | 0.7791 | 0.8638 | 0.9983 | 0.7284 | 0.5599 |
| 0.0029 | 20.9091 | 230 | 0.0052 | 0.7760 | 0.8554 | 0.9983 | 0.7117 | 0.5537 |
| 0.0007 | 21.8182 | 240 | 0.0052 | 0.7760 | 0.8506 | 0.9984 | 0.7020 | 0.5537 |
| 0.0026 | 22.7273 | 250 | 0.0052 | 0.7782 | 0.8627 | 0.9983 | 0.7263 | 0.5581 |
| 0.0026 | 23.6364 | 260 | 0.0051 | 0.7791 | 0.8471 | 0.9984 | 0.6950 | 0.5597 |
| 0.0015 | 24.5455 | 270 | 0.0053 | 0.7754 | 0.8665 | 0.9983 | 0.7339 | 0.5525 |
| 0.0044 | 25.4545 | 280 | 0.0053 | 0.7773 | 0.8562 | 0.9983 | 0.7133 | 0.5563 |
| 0.0023 | 26.3636 | 290 | 0.0053 | 0.7790 | 0.8606 | 0.9984 | 0.7220 | 0.5597 |
| 0.0021 | 27.2727 | 300 | 0.0055 | 0.7769 | 0.8735 | 0.9983 | 0.7480 | 0.5556 |
| 0.0024 | 28.1818 | 310 | 0.0051 | 0.7809 | 0.8514 | 0.9984 | 0.7035 | 0.5633 |
| 0.0017 | 29.0909 | 320 | 0.0055 | 0.7786 | 0.8643 | 0.9983 | 0.7294 | 0.5589 |
| 0.003 | 30.0 | 330 | 0.0053 | 0.7777 | 0.8528 | 0.9984 | 0.7063 | 0.5570 |
| 0.0012 | 30.9091 | 340 | 0.0052 | 0.7760 | 0.8360 | 0.9984 | 0.6727 | 0.5535 |
| 0.003 | 31.8182 | 350 | 0.0054 | 0.7813 | 0.8648 | 0.9984 | 0.7305 | 0.5642 |
| 0.0043 | 32.7273 | 360 | 0.0052 | 0.7813 | 0.8465 | 0.9984 | 0.6937 | 0.5642 |
| 0.0032 | 33.6364 | 370 | 0.0052 | 0.7835 | 0.8587 | 0.9984 | 0.7181 | 0.5685 |
| 0.0025 | 34.5455 | 380 | 0.0053 | 0.7837 | 0.8637 | 0.9984 | 0.7281 | 0.5691 |
| 0.0017 | 35.4545 | 390 | 0.0051 | 0.7825 | 0.8380 | 0.9985 | 0.6766 | 0.5666 |
| 0.0024 | 36.3636 | 400 | 0.0055 | 0.7788 | 0.8704 | 0.9983 | 0.7418 | 0.5594 |
| 0.0026 | 37.2727 | 410 | 0.0053 | 0.7813 | 0.8533 | 0.9984 | 0.7073 | 0.5641 |
| 0.0023 | 38.1818 | 420 | 0.0051 | 0.7845 | 0.8646 | 0.9984 | 0.7300 | 0.5706 |
| 0.0023 | 39.0909 | 430 | 0.0053 | 0.7805 | 0.8648 | 0.9984 | 0.7304 | 0.5627 |
| 0.0021 | 40.0 | 440 | 0.0053 | 0.7823 | 0.8541 | 0.9984 | 0.7089 | 0.5662 |
| 0.0027 | 40.9091 | 450 | 0.0053 | 0.7834 | 0.8583 | 0.9984 | 0.7174 | 0.5683 |
| 0.004 | 41.8182 | 460 | 0.0052 | 0.7854 | 0.8545 | 0.9985 | 0.7096 | 0.5724 |
| 0.003 | 42.7273 | 470 | 0.0053 | 0.7826 | 0.8466 | 0.9985 | 0.6939 | 0.5668 |
| 0.0035 | 43.6364 | 480 | 0.0054 | 0.7815 | 0.8637 | 0.9984 | 0.7282 | 0.5646 |
| 0.0033 | 44.5455 | 490 | 0.0053 | 0.7802 | 0.8560 | 0.9984 | 0.7127 | 0.5620 |
| 0.0027 | 45.4545 | 500 | 0.0051 | 0.7828 | 0.8489 | 0.9985 | 0.6985 | 0.5672 |
| 0.0032 | 46.3636 | 510 | 0.0053 | 0.7836 | 0.8605 | 0.9984 | 0.7218 | 0.5687 |
| 0.0034 | 47.2727 | 520 | 0.0054 | 0.7830 | 0.8521 | 0.9984 | 0.7049 | 0.5675 |
| 0.0017 | 48.1818 | 530 | 0.0054 | 0.7833 | 0.8595 | 0.9984 | 0.7198 | 0.5681 |
| 0.003 | 49.0909 | 540 | 0.0054 | 0.7809 | 0.8509 | 0.9984 | 0.7024 | 0.5633 |
| 0.0013 | 50.0 | 550 | 0.0053 | 0.7841 | 0.8533 | 0.9985 | 0.7073 | 0.5697 |
| 0.0026 | 50.9091 | 560 | 0.0054 | 0.7828 | 0.8589 | 0.9984 | 0.7186 | 0.5671 |
| 0.0013 | 51.8182 | 570 | 0.0054 | 0.7831 | 0.8552 | 0.9984 | 0.7111 | 0.5677 |
| 0.0019 | 52.7273 | 580 | 0.0055 | 0.7808 | 0.8645 | 0.9984 | 0.7298 | 0.5632 |
| 0.0024 | 53.6364 | 590 | 0.0054 | 0.7828 | 0.8550 | 0.9984 | 0.7107 | 0.5671 |
| 0.0024 | 54.5455 | 600 | 0.0054 | 0.7837 | 0.8593 | 0.9984 | 0.7193 | 0.5690 |
| 0.0025 | 55.4545 | 610 | 0.0055 | 0.7818 | 0.8566 | 0.9984 | 0.7140 | 0.5653 |
| 0.0018 | 56.3636 | 620 | 0.0054 | 0.7846 | 0.8509 | 0.9985 | 0.7025 | 0.5707 |
| 0.0027 | 57.2727 | 630 | 0.0054 | 0.7830 | 0.8571 | 0.9984 | 0.7149 | 0.5675 |
| 0.0017 | 58.1818 | 640 | 0.0054 | 0.7833 | 0.8575 | 0.9984 | 0.7158 | 0.5682 |
| 0.0038 | 59.0909 | 650 | 0.0054 | 0.7855 | 0.8585 | 0.9984 | 0.7178 | 0.5725 |
| 0.0018 | 60.0 | 660 | 0.0058 | 0.7780 | 0.8628 | 0.9983 | 0.7266 | 0.5576 |
| 0.0023 | 60.9091 | 670 | 0.0056 | 0.7809 | 0.8534 | 0.9984 | 0.7075 | 0.5634 |
| 0.002 | 61.8182 | 680 | 0.0055 | 0.7841 | 0.8549 | 0.9984 | 0.7105 | 0.5698 |
| 0.0038 | 62.7273 | 690 | 0.0055 | 0.7822 | 0.8562 | 0.9984 | 0.7132 | 0.5661 |
| 0.0024 | 63.6364 | 700 | 0.0055 | 0.7813 | 0.8556 | 0.9984 | 0.7120 | 0.5642 |
| 0.0027 | 64.5455 | 710 | 0.0055 | 0.7828 | 0.8597 | 0.9984 | 0.7202 | 0.5672 |
| 0.0019 | 65.4545 | 720 | 0.0056 | 0.7813 | 0.8562 | 0.9984 | 0.7131 | 0.5642 |
| 0.0018 | 66.3636 | 730 | 0.0056 | 0.7818 | 0.8595 | 0.9984 | 0.7199 | 0.5653 |
| 0.0034 | 67.2727 | 740 | 0.0055 | 0.7825 | 0.8560 | 0.9984 | 0.7127 | 0.5666 |
| 0.0021 | 68.1818 | 750 | 0.0056 | 0.7818 | 0.8540 | 0.9984 | 0.7087 | 0.5652 |
| 0.0014 | 69.0909 | 760 | 0.0056 | 0.7818 | 0.8542 | 0.9984 | 0.7091 | 0.5652 |
| 0.002 | 70.0 | 770 | 0.0056 | 0.7819 | 0.8600 | 0.9984 | 0.7207 | 0.5654 |
| 0.0009 | 70.9091 | 780 | 0.0057 | 0.7814 | 0.8586 | 0.9984 | 0.7181 | 0.5645 |
| 0.0014 | 71.8182 | 790 | 0.0054 | 0.7836 | 0.8604 | 0.9984 | 0.7216 | 0.5688 |
| 0.0014 | 72.7273 | 800 | 0.0057 | 0.7799 | 0.8571 | 0.9984 | 0.7150 | 0.5615 |
| 0.0036 | 73.6364 | 810 | 0.0058 | 0.7796 | 0.8603 | 0.9984 | 0.7214 | 0.5607 |
| 0.0024 | 74.5455 | 820 | 0.0058 | 0.7787 | 0.8648 | 0.9983 | 0.7304 | 0.5590 |
| 0.0036 | 75.4545 | 830 | 0.0058 | 0.7786 | 0.8580 | 0.9984 | 0.7168 | 0.5589 |
| 0.0016 | 76.3636 | 840 | 0.0057 | 0.7811 | 0.8626 | 0.9984 | 0.7260 | 0.5639 |
| 0.0023 | 77.2727 | 850 | 0.0057 | 0.7810 | 0.8586 | 0.9984 | 0.7181 | 0.5637 |
| 0.0018 | 78.1818 | 860 | 0.0057 | 0.7808 | 0.8576 | 0.9984 | 0.7161 | 0.5633 |
| 0.0019 | 79.0909 | 870 | 0.0058 | 0.7797 | 0.8637 | 0.9983 | 0.7283 | 0.5610 |
| 0.0022 | 80.0 | 880 | 0.0057 | 0.7817 | 0.8526 | 0.9984 | 0.7058 | 0.5650 |
| 0.0023 | 80.9091 | 890 | 0.0059 | 0.7796 | 0.8617 | 0.9984 | 0.7243 | 0.5608 |
| 0.0019 | 81.8182 | 900 | 0.0058 | 0.7803 | 0.8572 | 0.9984 | 0.7152 | 0.5622 |
| 0.003 | 82.7273 | 910 | 0.0058 | 0.7802 | 0.8557 | 0.9984 | 0.7121 | 0.5619 |
| 0.0024 | 83.6364 | 920 | 0.0058 | 0.7809 | 0.8611 | 0.9984 | 0.7230 | 0.5634 |
| 0.0014 | 84.5455 | 930 | 0.0058 | 0.7817 | 0.8581 | 0.9984 | 0.7169 | 0.5650 |
| 0.0017 | 85.4545 | 940 | 0.0058 | 0.7815 | 0.8587 | 0.9984 | 0.7181 | 0.5645 |
| 0.0031 | 86.3636 | 950 | 0.0058 | 0.7804 | 0.8589 | 0.9984 | 0.7187 | 0.5624 |
| 0.003 | 87.2727 | 960 | 0.0058 | 0.7809 | 0.8568 | 0.9984 | 0.7143 | 0.5634 |
| 0.0007 | 88.1818 | 970 | 0.0058 | 0.7806 | 0.8594 | 0.9984 | 0.7195 | 0.5629 |
| 0.0024 | 89.0909 | 980 | 0.0057 | 0.7821 | 0.8577 | 0.9984 | 0.7162 | 0.5657 |
| 0.0039 | 90.0 | 990 | 0.0057 | 0.7826 | 0.8596 | 0.9984 | 0.7199 | 0.5669 |
| 0.0016 | 90.9091 | 1000 | 0.0057 | 0.7826 | 0.8578 | 0.9984 | 0.7163 | 0.5669 |
| 0.0024 | 91.8182 | 1010 | 0.0057 | 0.7822 | 0.8574 | 0.9984 | 0.7156 | 0.5659 |
| 0.0027 | 92.7273 | 1020 | 0.0057 | 0.7817 | 0.8593 | 0.9984 | 0.7195 | 0.5651 |
| 0.0018 | 93.6364 | 1030 | 0.0057 | 0.7821 | 0.8591 | 0.9984 | 0.7190 | 0.5658 |
| 0.0012 | 94.5455 | 1040 | 0.0057 | 0.7821 | 0.8581 | 0.9984 | 0.7169 | 0.5659 |
| 0.0017 | 95.4545 | 1050 | 0.0057 | 0.7816 | 0.8587 | 0.9984 | 0.7181 | 0.5648 |
| 0.0022 | 96.3636 | 1060 | 0.0057 | 0.7815 | 0.8582 | 0.9984 | 0.7172 | 0.5647 |
| 0.0019 | 97.2727 | 1070 | 0.0058 | 0.7818 | 0.8580 | 0.9984 | 0.7168 | 0.5652 |
| 0.0041 | 98.1818 | 1080 | 0.0057 | 0.7816 | 0.8575 | 0.9984 | 0.7157 | 0.5648 |
| 0.0034 | 99.0909 | 1090 | 0.0057 | 0.7815 | 0.8566 | 0.9984 | 0.7139 | 0.5646 |
| 0.0016 | 100.0 | 1100 | 0.0057 | 0.7819 | 0.8573 | 0.9984 | 0.7153 | 0.5654 |
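
For clarity on the reported columns: with two classes (background and abnormality), Mean Iou and Mean Accuracy average the per-class scores, while the Abnormality columns report the scores for the abnormality class alone. Below is a minimal sketch of how such per-class scores can be computed from integer label maps, assuming label 1 marks abnormality.

```python
import numpy as np

def per_class_iou_and_accuracy(pred: np.ndarray, target: np.ndarray, cls: int = 1):
    """IoU and pixel accuracy (per-class recall) for one class of a label map.

    Assumes `pred` and `target` are integer arrays of the same shape;
    cls=1 is assumed here to be the abnormality label.
    """
    pred_c = pred == cls
    target_c = target == cls
    intersection = np.logical_and(pred_c, target_c).sum()
    union = np.logical_or(pred_c, target_c).sum()
    iou = intersection / union if union else float("nan")
    accuracy = intersection / target_c.sum() if target_c.sum() else float("nan")
    return iou, accuracy
```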

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1