# segformer-b1-finetuned-UBC
This model is a fine-tuned version of [nvidia/segformer-b4-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b4-finetuned-ade-512-512) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.6250
- Mean Iou: 0.3060
- Mean Accuracy: 0.5177
- Overall Accuracy: 0.6022
- Accuracy Road-trees-ocean: nan
- Accuracy Residential: 0.7533
- Accuracy Commercial: 0.4895
- Accuracy Industrial: 0.6393
- Accuracy Public: 0.4912
- Accuracy Other: 0.2152
- Iou Road-trees-ocean: 0.0
- Iou Residential: 0.6788
- Iou Commercial: 0.3656
- Iou Industrial: 0.2652
- Iou Public: 0.3602
- Iou Other: 0.1662
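The `nan` accuracy together with a 0.0 IoU for Road-trees-ocean typically means the class never appears in the evaluation ground truth (per-class accuracy divides by zero ground-truth pixels). As a rough sketch of how these metrics are derived from a confusion matrix (not the exact evaluation code used here), assuming rows are ground truth and columns are predictions:

```python
import numpy as np

def segmentation_metrics(cm):
    """Per-class accuracy and IoU from a confusion matrix.

    Rows index the ground-truth class, columns the predicted class.
    """
    cm = cm.astype(float)
    tp = np.diag(cm)              # correctly classified pixels per class
    gt = cm.sum(axis=1)           # ground-truth pixels per class
    pred = cm.sum(axis=0)         # predicted pixels per class
    with np.errstate(divide="ignore", invalid="ignore"):
        acc = tp / gt             # nan when a class is absent from the ground truth
        iou = tp / (gt + pred - tp)
    mean_acc = np.nanmean(acc)    # nan classes are skipped, as in the results above
    mean_iou = np.nanmean(iou)
    return acc, iou, mean_acc, mean_iou

# Toy 3-class example: class 0 never occurs in the ground truth,
# so its accuracy is nan while its IoU is 0 (it was falsely predicted).
cm = np.array([[0, 0, 0],
               [2, 8, 0],
               [1, 1, 8]])
acc, iou, mean_acc, mean_iou = segmentation_metrics(cm)
```

In this toy example `acc` is `[nan, 0.8, 0.8]`, `iou` is `[0.0, 8/11, 0.8]`, and the mean accuracy (0.8) ignores the absent class, matching how the `nan` entry above is excluded from Mean Accuracy.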
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 5
- eval_batch_size: 5
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
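With 50 epochs at 112 steps per epoch (see the training results below), training runs for 5600 optimizer steps, over which the linear scheduler decays the learning rate from 6e-05 to zero. A minimal sketch of that schedule, assuming zero warmup steps since none are listed:

```python
def linear_lr(step, base_lr=6e-5, total_steps=5600, warmup_steps=0):
    """Linear LR schedule: ramp up over warmup_steps, then decay to 0 at total_steps.

    The defaults mirror the hyperparameters above; warmup_steps=0 is an assumption.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

linear_lr(0)     # 6e-05 at the start of training
linear_lr(2800)  # 3e-05 halfway through (end of epoch 25)
linear_lr(5600)  # 0.0 at the end of training
```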
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Road-trees-ocean | Accuracy Residential | Accuracy Commercial | Accuracy Industrial | Accuracy Public | Accuracy Other | Iou Road-trees-ocean | Iou Residential | Iou Commercial | Iou Industrial | Iou Public | Iou Other |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.6615 | 1.0 | 112 | 0.7051 | 0.1920 | 0.3218 | 0.5703 | nan | 0.7995 | 0.6999 | 0.0002 | 0.1094 | 0.0 | 0.0 | 0.6690 | 0.3774 | 0.0002 | 0.1052 | 0.0 |
0.4892 | 2.0 | 224 | 0.4453 | 0.2012 | 0.3142 | 0.5312 | nan | 0.7007 | 0.6561 | 0.0 | 0.2144 | 0.0 | 0.0 | 0.6245 | 0.3895 | 0.0 | 0.1934 | 0.0 |
0.3549 | 3.0 | 336 | 0.4100 | 0.2273 | 0.3564 | 0.5065 | nan | 0.7146 | 0.2978 | 0.3118 | 0.4577 | 0.0 | 0.0 | 0.6335 | 0.2514 | 0.1735 | 0.3055 | 0.0 |
0.3597 | 4.0 | 448 | 0.4001 | 0.2534 | 0.4062 | 0.5525 | nan | 0.7658 | 0.3541 | 0.4291 | 0.4818 | 0.0 | 0.0 | 0.6590 | 0.2975 | 0.2454 | 0.3184 | 0.0 |
0.276 | 5.0 | 560 | 0.3985 | 0.2382 | 0.3652 | 0.5385 | nan | 0.6930 | 0.5632 | 0.2008 | 0.3588 | 0.0103 | 0.0 | 0.6399 | 0.3665 | 0.1302 | 0.2827 | 0.0102 |
0.2568 | 6.0 | 672 | 0.3932 | 0.2715 | 0.4139 | 0.5838 | nan | 0.7988 | 0.4944 | 0.3519 | 0.3776 | 0.0467 | 0.0 | 0.6785 | 0.3799 | 0.2135 | 0.3129 | 0.0440 |
0.2238 | 7.0 | 784 | 0.4032 | 0.2991 | 0.4681 | 0.5846 | nan | 0.7494 | 0.5426 | 0.5155 | 0.3801 | 0.1527 | 0.0 | 0.6745 | 0.3865 | 0.3042 | 0.3037 | 0.1257 |
0.1856 | 8.0 | 896 | 0.4345 | 0.2946 | 0.4942 | 0.5812 | nan | 0.7566 | 0.3599 | 0.6136 | 0.5516 | 0.1892 | 0.0 | 0.6851 | 0.2924 | 0.2887 | 0.3528 | 0.1489 |
0.1733 | 9.0 | 1008 | 0.4259 | 0.2993 | 0.4755 | 0.5946 | nan | 0.7494 | 0.5293 | 0.5092 | 0.4660 | 0.1238 | 0.0 | 0.6848 | 0.3785 | 0.2872 | 0.3398 | 0.1054 |
0.177 | 10.0 | 1120 | 0.4989 | 0.2675 | 0.4962 | 0.5674 | nan | 0.7871 | 0.2486 | 0.7673 | 0.5283 | 0.1495 | 0.0 | 0.6960 | 0.2170 | 0.2421 | 0.3289 | 0.1211 |
0.128 | 11.0 | 1232 | 0.4857 | 0.2926 | 0.5225 | 0.5677 | nan | 0.6727 | 0.4868 | 0.7228 | 0.4968 | 0.2335 | 0.0 | 0.6400 | 0.3525 | 0.2497 | 0.3547 | 0.1586 |
0.1277 | 12.0 | 1344 | 0.4589 | 0.3046 | 0.5037 | 0.6014 | nan | 0.7587 | 0.4364 | 0.5857 | 0.5713 | 0.1663 | 0.0 | 0.6857 | 0.3491 | 0.2877 | 0.3670 | 0.1382 |
0.1465 | 13.0 | 1456 | 0.4877 | 0.2978 | 0.5149 | 0.5791 | nan | 0.7151 | 0.4300 | 0.6773 | 0.5372 | 0.2151 | 0.0 | 0.6608 | 0.3373 | 0.2732 | 0.3567 | 0.1589 |
0.1692 | 14.0 | 1568 | 0.4828 | 0.3006 | 0.4966 | 0.5947 | nan | 0.7469 | 0.4576 | 0.5661 | 0.5380 | 0.1745 | 0.0 | 0.6808 | 0.3505 | 0.2721 | 0.3595 | 0.1407 |
0.1402 | 15.0 | 1680 | 0.5083 | 0.3008 | 0.5034 | 0.5905 | nan | 0.7414 | 0.4931 | 0.6033 | 0.4570 | 0.2224 | 0.0 | 0.6659 | 0.3629 | 0.2614 | 0.3504 | 0.1642 |
0.1815 | 16.0 | 1792 | 0.5092 | 0.2995 | 0.5242 | 0.5847 | nan | 0.7432 | 0.4522 | 0.6954 | 0.4245 | 0.3056 | 0.0 | 0.6767 | 0.3478 | 0.2400 | 0.3362 | 0.1965 |
0.1327 | 17.0 | 1904 | 0.5164 | 0.3064 | 0.5051 | 0.5940 | nan | 0.7289 | 0.5073 | 0.5805 | 0.5011 | 0.2077 | 0.0 | 0.6639 | 0.3757 | 0.2818 | 0.3643 | 0.1528 |
0.1045 | 18.0 | 2016 | 0.5242 | 0.3021 | 0.5083 | 0.6035 | nan | 0.7820 | 0.4367 | 0.6433 | 0.5077 | 0.1717 | 0.0 | 0.6898 | 0.3522 | 0.2736 | 0.3597 | 0.1375 |
0.0935 | 19.0 | 2128 | 0.5375 | 0.2995 | 0.5132 | 0.5826 | nan | 0.7114 | 0.4796 | 0.6702 | 0.5107 | 0.1942 | 0.0 | 0.6584 | 0.3477 | 0.2836 | 0.3558 | 0.1516 |
0.1029 | 20.0 | 2240 | 0.5384 | 0.2934 | 0.5002 | 0.5858 | nan | 0.7347 | 0.3958 | 0.6215 | 0.6049 | 0.1442 | 0.0 | 0.6786 | 0.3180 | 0.2778 | 0.3658 | 0.1203 |
0.1076 | 21.0 | 2352 | 0.5209 | 0.3030 | 0.4923 | 0.5998 | nan | 0.7554 | 0.4506 | 0.4999 | 0.5542 | 0.2013 | 0.0 | 0.6809 | 0.3515 | 0.2704 | 0.3588 | 0.1565 |
0.1118 | 22.0 | 2464 | 0.5442 | 0.2876 | 0.4962 | 0.5899 | nan | 0.7744 | 0.4817 | 0.6387 | 0.3802 | 0.2060 | 0.0 | 0.6798 | 0.3616 | 0.2201 | 0.3017 | 0.1627 |
0.0962 | 23.0 | 2576 | 0.5447 | 0.3073 | 0.5146 | 0.6054 | nan | 0.7627 | 0.4891 | 0.6101 | 0.4858 | 0.2255 | 0.0 | 0.6828 | 0.3689 | 0.2692 | 0.3511 | 0.1717 |
0.098 | 24.0 | 2688 | 0.5635 | 0.2999 | 0.5139 | 0.5964 | nan | 0.7423 | 0.4949 | 0.6486 | 0.4872 | 0.1964 | 0.0 | 0.6779 | 0.3582 | 0.2564 | 0.3482 | 0.1588 |
0.1051 | 25.0 | 2800 | 0.5509 | 0.3002 | 0.5153 | 0.5976 | nan | 0.7759 | 0.4163 | 0.6716 | 0.4958 | 0.2169 | 0.0 | 0.6932 | 0.3270 | 0.2636 | 0.3516 | 0.1657 |
0.1073 | 26.0 | 2912 | 0.5476 | 0.3103 | 0.4987 | 0.6083 | nan | 0.7506 | 0.5518 | 0.4992 | 0.4737 | 0.2181 | 0.0 | 0.6793 | 0.3950 | 0.2688 | 0.3563 | 0.1627 |
0.1398 | 27.0 | 3024 | 0.5794 | 0.3043 | 0.5146 | 0.5989 | nan | 0.7554 | 0.4470 | 0.6395 | 0.5288 | 0.2025 | 0.0 | 0.6811 | 0.3501 | 0.2817 | 0.3538 | 0.1591 |
0.1013 | 28.0 | 3136 | 0.5661 | 0.3042 | 0.5002 | 0.6027 | nan | 0.7400 | 0.5420 | 0.5505 | 0.4905 | 0.1781 | 0.0 | 0.6683 | 0.3882 | 0.2677 | 0.3551 | 0.1456 |
0.0693 | 29.0 | 3248 | 0.5628 | 0.3096 | 0.5212 | 0.6097 | nan | 0.7531 | 0.5093 | 0.6121 | 0.5100 | 0.2214 | 0.0 | 0.6836 | 0.3813 | 0.2634 | 0.3599 | 0.1693 |
0.0954 | 30.0 | 3360 | 0.5839 | 0.2934 | 0.5060 | 0.5921 | nan | 0.7682 | 0.4295 | 0.6672 | 0.4823 | 0.1827 | 0.0 | 0.6774 | 0.3386 | 0.2436 | 0.3560 | 0.1446 |
0.0792 | 31.0 | 3472 | 0.5779 | 0.3021 | 0.5069 | 0.6012 | nan | 0.7463 | 0.5260 | 0.5756 | 0.4653 | 0.2212 | 0.0 | 0.6793 | 0.3729 | 0.2372 | 0.3542 | 0.1692 |
0.0842 | 32.0 | 3584 | 0.5680 | 0.3155 | 0.5169 | 0.6178 | nan | 0.7789 | 0.5031 | 0.5572 | 0.4906 | 0.2548 | 0.0 | 0.6903 | 0.3851 | 0.2612 | 0.3726 | 0.1839 |
0.0778 | 33.0 | 3696 | 0.5928 | 0.3053 | 0.5157 | 0.6025 | nan | 0.7550 | 0.4850 | 0.6293 | 0.4979 | 0.2115 | 0.0 | 0.6814 | 0.3648 | 0.2582 | 0.3644 | 0.1630 |
0.0681 | 34.0 | 3808 | 0.5967 | 0.3054 | 0.5124 | 0.6045 | nan | 0.7600 | 0.4790 | 0.6243 | 0.5157 | 0.1831 | 0.0 | 0.6832 | 0.3673 | 0.2687 | 0.3655 | 0.1480 |
0.1204 | 35.0 | 3920 | 0.6043 | 0.3051 | 0.5284 | 0.6009 | nan | 0.7611 | 0.4233 | 0.7087 | 0.5413 | 0.2079 | 0.0 | 0.6865 | 0.3399 | 0.2723 | 0.3688 | 0.1631 |
0.063 | 36.0 | 4032 | 0.5979 | 0.3065 | 0.5126 | 0.6024 | nan | 0.7561 | 0.4720 | 0.6008 | 0.5127 | 0.2210 | 0.0 | 0.6801 | 0.3614 | 0.2639 | 0.3652 | 0.1682 |
0.0861 | 37.0 | 4144 | 0.6206 | 0.3025 | 0.5210 | 0.5957 | nan | 0.7405 | 0.4709 | 0.6698 | 0.5083 | 0.2151 | 0.0 | 0.6742 | 0.3580 | 0.2621 | 0.3568 | 0.1638 |
0.0716 | 38.0 | 4256 | 0.6022 | 0.3003 | 0.5082 | 0.5999 | nan | 0.7575 | 0.4981 | 0.6141 | 0.4636 | 0.2075 | 0.0 | 0.6791 | 0.3709 | 0.2434 | 0.3476 | 0.1610 |
0.0698 | 39.0 | 4368 | 0.6149 | 0.3055 | 0.5139 | 0.6070 | nan | 0.7747 | 0.4750 | 0.6323 | 0.4872 | 0.2002 | 0.0 | 0.6845 | 0.3681 | 0.2625 | 0.3598 | 0.1578 |
0.0966 | 40.0 | 4480 | 0.6190 | 0.3050 | 0.5135 | 0.6049 | nan | 0.7670 | 0.4782 | 0.6261 | 0.4913 | 0.2051 | 0.0 | 0.6819 | 0.3670 | 0.2601 | 0.3618 | 0.1594 |
0.0792 | 41.0 | 4592 | 0.6239 | 0.3038 | 0.5170 | 0.5996 | nan | 0.7436 | 0.4801 | 0.6419 | 0.5223 | 0.1970 | 0.0 | 0.6773 | 0.3643 | 0.2615 | 0.3643 | 0.1555 |
0.0847 | 42.0 | 4704 | 0.6188 | 0.3061 | 0.5182 | 0.6043 | nan | 0.7546 | 0.4909 | 0.6345 | 0.5010 | 0.2099 | 0.0 | 0.6802 | 0.3715 | 0.2567 | 0.3647 | 0.1633 |
0.0699 | 43.0 | 4816 | 0.6188 | 0.3070 | 0.5141 | 0.6078 | nan | 0.7694 | 0.4687 | 0.6024 | 0.5124 | 0.2177 | 0.0 | 0.6868 | 0.3643 | 0.2598 | 0.3639 | 0.1671 |
0.073 | 44.0 | 4928 | 0.6249 | 0.3042 | 0.5186 | 0.5993 | nan | 0.7432 | 0.4952 | 0.6594 | 0.4987 | 0.1966 | 0.0 | 0.6770 | 0.3678 | 0.2628 | 0.3620 | 0.1558 |
0.0707 | 45.0 | 5040 | 0.6273 | 0.3066 | 0.5247 | 0.5990 | nan | 0.7430 | 0.4767 | 0.6635 | 0.5071 | 0.2332 | 0.0 | 0.6734 | 0.3602 | 0.2637 | 0.3665 | 0.1759 |
0.0716 | 46.0 | 5152 | 0.6314 | 0.3074 | 0.5220 | 0.6055 | nan | 0.7554 | 0.4911 | 0.6483 | 0.5007 | 0.2147 | 0.0 | 0.6818 | 0.3675 | 0.2633 | 0.3644 | 0.1672 |
0.1249 | 47.0 | 5264 | 0.6242 | 0.3081 | 0.5194 | 0.6054 | nan | 0.7627 | 0.4854 | 0.6419 | 0.4876 | 0.2194 | 0.0 | 0.6810 | 0.3687 | 0.2679 | 0.3632 | 0.1674 |
0.0772 | 48.0 | 5376 | 0.6427 | 0.3065 | 0.5249 | 0.6022 | nan | 0.7516 | 0.4817 | 0.6720 | 0.4979 | 0.2215 | 0.0 | 0.6789 | 0.3633 | 0.2653 | 0.3622 | 0.1696 |
0.0914 | 49.0 | 5488 | 0.6323 | 0.3067 | 0.5234 | 0.6040 | nan | 0.7571 | 0.4809 | 0.6677 | 0.4985 | 0.2127 | 0.0 | 0.6816 | 0.3634 | 0.2673 | 0.3631 | 0.1649 |
0.0679 | 50.0 | 5600 | 0.6250 | 0.3060 | 0.5177 | 0.6022 | nan | 0.7533 | 0.4895 | 0.6393 | 0.4912 | 0.2152 | 0.0 | 0.6788 | 0.3656 | 0.2652 | 0.3602 | 0.1662 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1