segformer-b0-finetuned-batch3-26May

This model is a fine-tuned version of PushkarA07/segformer-b0-finetuned-batch2w5-15Dec on the PushkarA07/batch3-tiles_second dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0014
  • Mean Iou: 0.8838
  • Mean Accuracy: 0.9271
  • Overall Accuracy: 0.9994
  • Accuracy Abnormality: 0.8545
  • Iou Abnormality: 0.7682
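
The checkpoint can be loaded with the standard Transformers semantic-segmentation classes. Below is a minimal inference sketch, not an official usage example: the input file name is hypothetical, and the binary background/abnormality label map is an assumption inferred from the metric names above.

```python
# Minimal inference sketch. Assumes the checkpoint ships an image-processor
# config and uses a binary background/abnormality label map (inferred from
# the metric names above; not confirmed by the model authors).
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "PushkarA07/segformer-b0-finetuned-batch3-26May"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tile.png").convert("RGB")  # hypothetical input tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer outputs logits at 1/4 resolution; upsample to the input size
# before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # 1 where the abnormality class is predicted
```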

Model description

segformer-b0 is the smallest SegFormer variant (a MiT-b0 encoder with a lightweight all-MLP decode head). This checkpoint continues fine-tuning from PushkarA07/segformer-b0-finetuned-batch2w5-15Dec and, judging by the per-class metrics above, segments a single abnormality class against background. No further details have been published.

Intended uses & limitations

More information needed

Training and evaluation data

The model was fine-tuned and evaluated on the PushkarA07/batch3-tiles_second dataset. Details of the tiles and of the train/validation split have not been published.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
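
For reference, the hyperparameters above map onto a Trainer configuration roughly as follows. This is a hedged reconstruction, not the authors' script: the split names, the label count, and the 10-step evaluation cadence (read off the results table below) are assumptions, and the image/mask preprocessing transforms are omitted for brevity.

```python
# Hedged reconstruction of the training setup from the hyperparameters above.
# Dataset split names and num_labels=2 are assumptions, not confirmed.
from datasets import load_dataset
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("PushkarA07/batch3-tiles_second")
model = SegformerForSemanticSegmentation.from_pretrained(
    "PushkarA07/segformer-b0-finetuned-batch2w5-15Dec",
    num_labels=2,  # assumption: background + abnormality
)

args = TrainingArguments(
    output_dir="segformer-b0-finetuned-batch3-26May",
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    optim="adamw_torch",  # AdamW with betas=(0.9, 0.999), eps=1e-8
    eval_strategy="steps",
    eval_steps=10,        # matches the 10-step cadence in the results table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],  # assumption: split names
    # (image/mask preprocessing transforms omitted for brevity)
)
trainer.train()
```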

Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality
0.0035 0.7143 10 0.0024 0.8344 0.8791 0.9992 0.7585 0.6697
0.0021 1.4286 20 0.0022 0.8422 0.8942 0.9992 0.7888 0.6852
0.0033 2.1429 30 0.0020 0.8474 0.8942 0.9992 0.7887 0.6956
0.007 2.8571 40 0.0020 0.8510 0.8943 0.9992 0.7889 0.7028
0.0036 3.5714 50 0.0019 0.8553 0.8983 0.9993 0.7968 0.7113
0.0032 4.2857 60 0.0018 0.8583 0.8969 0.9993 0.7940 0.7173
0.0026 5.0 70 0.0018 0.8594 0.9003 0.9993 0.8009 0.7195
0.0033 5.7143 80 0.0018 0.8600 0.8999 0.9993 0.8000 0.7207
0.0048 6.4286 90 0.0018 0.8616 0.8997 0.9993 0.7997 0.7239
0.003 7.1429 100 0.0017 0.8653 0.9105 0.9993 0.8214 0.7313
0.0026 7.8571 110 0.0017 0.8664 0.9122 0.9993 0.8248 0.7335
0.0022 8.5714 120 0.0017 0.8599 0.8902 0.9993 0.7806 0.7206
0.0023 9.2857 130 0.0017 0.8668 0.9069 0.9993 0.8140 0.7342
0.0028 10.0 140 0.0017 0.8681 0.9177 0.9993 0.8357 0.7368
0.0017 10.7143 150 0.0017 0.8672 0.9107 0.9993 0.8217 0.7350
0.002 11.4286 160 0.0017 0.8665 0.9020 0.9993 0.8041 0.7336
0.0018 12.1429 170 0.0017 0.8676 0.9047 0.9993 0.8097 0.7358
0.0021 12.8571 180 0.0017 0.8695 0.9240 0.9993 0.8484 0.7397
0.001 13.5714 190 0.0016 0.8700 0.9145 0.9993 0.8293 0.7407
0.0014 14.2857 200 0.0016 0.8721 0.9123 0.9994 0.8248 0.7449
0.0016 15.0 210 0.0016 0.8704 0.9082 0.9994 0.8166 0.7414
0.0023 15.7143 220 0.0017 0.8709 0.9175 0.9993 0.8352 0.7425
0.0023 16.4286 230 0.0016 0.8732 0.9188 0.9994 0.8379 0.7470
0.0019 17.1429 240 0.0016 0.8731 0.9153 0.9994 0.8308 0.7468
0.0018 17.8571 250 0.0016 0.8726 0.9094 0.9994 0.8191 0.7459
0.0027 18.5714 260 0.0016 0.8697 0.9016 0.9994 0.8033 0.7400
0.0013 19.2857 270 0.0016 0.8758 0.9214 0.9994 0.8431 0.7523
0.0036 20.0 280 0.0016 0.8750 0.9226 0.9994 0.8456 0.7506
0.0025 20.7143 290 0.0016 0.8751 0.9293 0.9994 0.8589 0.7509
0.0016 21.4286 300 0.0016 0.8725 0.9095 0.9994 0.8192 0.7457
0.0032 22.1429 310 0.0016 0.8737 0.9102 0.9994 0.8206 0.7481
0.002 22.8571 320 0.0016 0.8772 0.9304 0.9994 0.8610 0.7550
0.0012 23.5714 330 0.0016 0.8760 0.9151 0.9994 0.8304 0.7526
0.0012 24.2857 340 0.0016 0.8767 0.9226 0.9994 0.8454 0.7541
0.0028 25.0 350 0.0015 0.8771 0.9205 0.9994 0.8413 0.7548
0.0021 25.7143 360 0.0015 0.8769 0.9176 0.9994 0.8354 0.7543
0.0016 26.4286 370 0.0015 0.8763 0.9156 0.9994 0.8315 0.7533
0.0025 27.1429 380 0.0016 0.8742 0.9171 0.9994 0.8345 0.7490
0.0029 27.8571 390 0.0016 0.8763 0.9322 0.9994 0.8647 0.7532
0.0025 28.5714 400 0.0015 0.8767 0.9194 0.9994 0.8391 0.7539
0.0022 29.2857 410 0.0015 0.8783 0.9205 0.9994 0.8413 0.7572
0.0014 30.0 420 0.0016 0.8792 0.9318 0.9994 0.8639 0.7590
0.0027 30.7143 430 0.0015 0.8786 0.9269 0.9994 0.8541 0.7578
0.0038 31.4286 440 0.0015 0.8787 0.9270 0.9994 0.8544 0.7581
0.0014 32.1429 450 0.0015 0.8781 0.9243 0.9994 0.8490 0.7569
0.0015 32.8571 460 0.0015 0.8781 0.9214 0.9994 0.8431 0.7568
0.0034 33.5714 470 0.0015 0.8769 0.9136 0.9994 0.8275 0.7544
0.0048 34.2857 480 0.0015 0.8783 0.9310 0.9994 0.8623 0.7573
0.0025 35.0 490 0.0015 0.8783 0.9210 0.9994 0.8422 0.7572
0.0029 35.7143 500 0.0015 0.8788 0.9234 0.9994 0.8470 0.7582
0.0024 36.4286 510 0.0015 0.8797 0.9286 0.9994 0.8576 0.7600
0.0013 37.1429 520 0.0015 0.8792 0.9197 0.9994 0.8396 0.7589
0.0023 37.8571 530 0.0015 0.8797 0.9240 0.9994 0.8484 0.7601
0.0017 38.5714 540 0.0015 0.8802 0.9269 0.9994 0.8541 0.7610
0.0023 39.2857 550 0.0015 0.8801 0.9248 0.9994 0.8498 0.7609
0.0027 40.0 560 0.0015 0.8806 0.9297 0.9994 0.8596 0.7618
0.0029 40.7143 570 0.0015 0.8798 0.9209 0.9994 0.8420 0.7602
0.0035 41.4286 580 0.0015 0.8776 0.9129 0.9994 0.8261 0.7557
0.0025 42.1429 590 0.0015 0.8792 0.9197 0.9994 0.8397 0.7590
0.0011 42.8571 600 0.0015 0.8813 0.9312 0.9994 0.8628 0.7632
0.0022 43.5714 610 0.0015 0.8803 0.9254 0.9994 0.8511 0.7612
0.0029 44.2857 620 0.0015 0.8796 0.9199 0.9994 0.8400 0.7597
0.0017 45.0 630 0.0015 0.8808 0.9254 0.9994 0.8511 0.7621
0.0013 45.7143 640 0.0015 0.8815 0.9276 0.9994 0.8554 0.7635
0.0026 46.4286 650 0.0015 0.8798 0.9258 0.9994 0.8518 0.7602
0.0018 47.1429 660 0.0015 0.8803 0.9307 0.9994 0.8616 0.7613
0.0016 47.8571 670 0.0015 0.8811 0.9272 0.9994 0.8546 0.7627
0.001 48.5714 680 0.0015 0.8796 0.9160 0.9994 0.8321 0.7598
0.002 49.2857 690 0.0015 0.8807 0.9314 0.9994 0.8632 0.7621
0.0021 50.0 700 0.0015 0.8797 0.9235 0.9994 0.8473 0.7600
0.0019 50.7143 710 0.0015 0.8800 0.9229 0.9994 0.8461 0.7606
0.0013 51.4286 720 0.0015 0.8794 0.9212 0.9994 0.8427 0.7593
0.0032 52.1429 730 0.0015 0.8806 0.9229 0.9994 0.8461 0.7618
0.0015 52.8571 740 0.0015 0.8813 0.9268 0.9994 0.8540 0.7632
0.0027 53.5714 750 0.0015 0.8807 0.9235 0.9994 0.8472 0.7620
0.0018 54.2857 760 0.0015 0.8806 0.9209 0.9994 0.8420 0.7618
0.0028 55.0 770 0.0015 0.8817 0.9245 0.9994 0.8493 0.7639
0.0019 55.7143 780 0.0015 0.8797 0.9154 0.9994 0.8311 0.7601
0.0017 56.4286 790 0.0015 0.8815 0.9238 0.9994 0.8479 0.7636
0.001 57.1429 800 0.0015 0.8811 0.9227 0.9994 0.8456 0.7628
0.0022 57.8571 810 0.0015 0.8827 0.9303 0.9994 0.8610 0.7660
0.0012 58.5714 820 0.0015 0.8825 0.9237 0.9994 0.8475 0.7656
0.0018 59.2857 830 0.0015 0.8831 0.9268 0.9994 0.8538 0.7667
0.0022 60.0 840 0.0015 0.8821 0.9263 0.9994 0.8530 0.7649
0.0021 60.7143 850 0.0015 0.8821 0.9246 0.9994 0.8494 0.7647
0.0033 61.4286 860 0.0015 0.8818 0.9277 0.9994 0.8556 0.7642
0.0034 62.1429 870 0.0015 0.8817 0.9231 0.9994 0.8465 0.7640
0.0037 62.8571 880 0.0015 0.8821 0.9261 0.9994 0.8524 0.7647
0.0018 63.5714 890 0.0015 0.8831 0.9334 0.9994 0.8671 0.7668
0.0018 64.2857 900 0.0015 0.8830 0.9316 0.9994 0.8635 0.7667
0.0035 65.0 910 0.0015 0.8806 0.9153 0.9994 0.8309 0.7618
0.0018 65.7143 920 0.0015 0.8814 0.9312 0.9994 0.8627 0.7635
0.0015 66.4286 930 0.0015 0.8826 0.9264 0.9994 0.8531 0.7657
0.0016 67.1429 940 0.0015 0.8836 0.9358 0.9994 0.8719 0.7677
0.0023 67.8571 950 0.0015 0.8820 0.9215 0.9994 0.8433 0.7645
0.0015 68.5714 960 0.0015 0.8816 0.9283 0.9994 0.8569 0.7639
0.0023 69.2857 970 0.0015 0.8833 0.9302 0.9994 0.8607 0.7673
0.0036 70.0 980 0.0015 0.8824 0.9256 0.9994 0.8515 0.7654
0.0011 70.7143 990 0.0015 0.8816 0.9268 0.9994 0.8538 0.7637
0.0034 71.4286 1000 0.0015 0.8818 0.9267 0.9994 0.8536 0.7643
0.0014 72.1429 1010 0.0015 0.8833 0.9303 0.9994 0.8609 0.7672
0.0011 72.8571 1020 0.0015 0.8827 0.9287 0.9994 0.8577 0.7659
0.0014 73.5714 1030 0.0015 0.8819 0.9257 0.9994 0.8516 0.7643
0.001 74.2857 1040 0.0015 0.8829 0.9294 0.9994 0.8591 0.7665
0.0026 75.0 1050 0.0015 0.8812 0.9210 0.9994 0.8423 0.7629
0.0014 75.7143 1060 0.0015 0.8823 0.9288 0.9994 0.8579 0.7653
0.0029 76.4286 1070 0.0015 0.8825 0.9244 0.9994 0.8490 0.7656
0.0007 77.1429 1080 0.0015 0.8828 0.9268 0.9994 0.8538 0.7662
0.0021 77.8571 1090 0.0015 0.8829 0.9257 0.9994 0.8517 0.7664
0.002 78.5714 1100 0.0015 0.8835 0.9270 0.9994 0.8542 0.7675
0.0025 79.2857 1110 0.0015 0.8833 0.9276 0.9994 0.8554 0.7673
0.0026 80.0 1120 0.0015 0.8831 0.9255 0.9994 0.8513 0.7669
0.0035 80.7143 1130 0.0015 0.8841 0.9306 0.9994 0.8615 0.7689
0.0016 81.4286 1140 0.0015 0.8833 0.9256 0.9994 0.8515 0.7672
0.0018 82.1429 1150 0.0015 0.8828 0.9266 0.9994 0.8535 0.7661
0.0024 82.8571 1160 0.0015 0.8831 0.9280 0.9994 0.8563 0.7668
0.0022 83.5714 1170 0.0015 0.8836 0.9309 0.9994 0.8620 0.7677
0.0018 84.2857 1180 0.0015 0.8835 0.9303 0.9994 0.8608 0.7676
0.0014 85.0 1190 0.0015 0.8832 0.9266 0.9994 0.8535 0.7669
0.0013 85.7143 1200 0.0015 0.8838 0.9273 0.9994 0.8548 0.7682
0.0033 86.4286 1210 0.0014 0.8836 0.9316 0.9994 0.8635 0.7678
0.0023 87.1429 1220 0.0015 0.8831 0.9231 0.9994 0.8465 0.7667
0.0027 87.8571 1230 0.0014 0.8834 0.9284 0.9994 0.8571 0.7675
0.0014 88.5714 1240 0.0015 0.8833 0.9285 0.9994 0.8572 0.7672
0.0025 89.2857 1250 0.0014 0.8836 0.9276 0.9994 0.8555 0.7678
0.003 90.0 1260 0.0014 0.8842 0.9299 0.9994 0.8600 0.7690
0.0022 90.7143 1270 0.0014 0.8842 0.9271 0.9994 0.8545 0.7690
0.0024 91.4286 1280 0.0014 0.8839 0.9285 0.9994 0.8572 0.7684
0.0017 92.1429 1290 0.0014 0.8835 0.9262 0.9994 0.8526 0.7676
0.0014 92.8571 1300 0.0014 0.8830 0.9243 0.9994 0.8488 0.7666
0.0018 93.5714 1310 0.0014 0.8836 0.9293 0.9994 0.8589 0.7678
0.0019 94.2857 1320 0.0014 0.8833 0.9265 0.9994 0.8532 0.7673
0.0027 95.0 1330 0.0015 0.8831 0.9250 0.9994 0.8503 0.7667
0.0008 95.7143 1340 0.0015 0.8835 0.9275 0.9994 0.8553 0.7677
0.0033 96.4286 1350 0.0015 0.8835 0.9280 0.9994 0.8563 0.7675
0.0021 97.1429 1360 0.0015 0.8834 0.9268 0.9994 0.8540 0.7675
0.0016 97.8571 1370 0.0015 0.8836 0.9271 0.9994 0.8545 0.7677
0.0039 98.5714 1380 0.0014 0.8835 0.9255 0.9994 0.8513 0.7677
0.0017 99.2857 1390 0.0014 0.8836 0.9257 0.9994 0.8517 0.7677
0.0046 100.0 1400 0.0014 0.8838 0.9271 0.9994 0.8545 0.7682
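
The Mean Iou, Mean Accuracy, Overall Accuracy, and per-class abnormality columns above can be computed with the evaluate library's "mean_iou" metric. A minimal sketch, assuming integer masks with two classes (the toy masks below are illustrative, not data from this model):

```python
# Sketch of the metric computation behind the table above, using the
# `evaluate` library's "mean_iou" metric with two classes assumed.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Hypothetical predicted and reference masks, shape (H, W), values in {0, 1}.
pred_mask = np.zeros((64, 64), dtype=np.int64)
ref_mask = np.zeros((64, 64), dtype=np.int64)
pred_mask[10:20, 10:20] = 1
ref_mask[12:22, 12:22] = 1

results = metric.compute(
    predictions=[pred_mask],
    references=[ref_mask],
    num_labels=2,
    ignore_index=255,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # index 1 ~ "Iou Abnormality"
print(results["per_category_accuracy"])  # index 1 ~ "Accuracy Abnormality"
```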

Framework versions

  • Transformers 4.52.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1