# segformer-b4-finetuned-UBC
This model is a fine-tuned version of nvidia/segformer-b4-finetuned-ade-512-512 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.9831
- Mean Iou: 0.3535
- Mean Accuracy: 0.5149
- Overall Accuracy: 0.6482
- Accuracy Background: nan
- Accuracy Residential: 0.8665
- Accuracy Commercial: 0.5016
- Accuracy Industrial: 0.5567
- Accuracy Public: 0.4487
- Accuracy Other: 0.2007
- Iou Background: nan
- Iou Residential: 0.7569
- Iou Commercial: 0.3413
- Iou Industrial: 0.2314
- Iou Public: 0.2827
- Iou Other: 0.1552
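The IoU and accuracy figures above are standard per-class segmentation metrics: for each class, IoU = TP / (TP + FP + FN) and accuracy = TP / (TP + FN), averaged over classes that actually occur in the ground truth (which is why Background reports `nan`). A minimal, dependency-free sketch of how such numbers are derived from a pixel confusion matrix — the function names and the toy matrix are illustrative, not part of this model's evaluation code:

```python
import math

def per_class_metrics(confusion):
    """Per-class IoU and accuracy from a confusion matrix.

    confusion[i][j] = number of pixels with ground-truth class i
    that were predicted as class j. Classes absent from the ground
    truth (empty row) get NaN, mirroring the 'nan' Background rows.
    """
    n = len(confusion)
    ious, accs = [], []
    for c in range(n):
        tp = confusion[c][c]
        fn = sum(confusion[c]) - tp                       # class-c pixels missed
        fp = sum(confusion[r][c] for r in range(n)) - tp  # pixels wrongly labelled c
        if tp + fn == 0:                                  # class absent in ground truth
            ious.append(math.nan)
            accs.append(math.nan)
        else:
            ious.append(tp / (tp + fp + fn))
            accs.append(tp / (tp + fn))
    return ious, accs

def nanmean(xs):
    """Mean over the non-NaN entries, like numpy.nanmean."""
    vals = [x for x in xs if not math.isnan(x)]
    return sum(vals) / len(vals)

# Toy 3-class example: class 0 never appears in the ground truth.
cm = [
    [0, 0, 0],
    [0, 8, 2],
    [0, 1, 9],
]
ious, accs = per_class_metrics(cm)
print(ious)           # [nan, 0.7272..., 0.75]
print(nanmean(accs))  # 0.85 — mean accuracy over the two present classes
```

Mean IoU and Mean Accuracy in the table below are the `nanmean` of the per-class columns, and Overall Accuracy is the fraction of all pixels classified correctly.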
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 5
- eval_batch_size: 5
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
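With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 6e-05 to 0 over the full run — 112 steps per epoch × 30 epochs = 3360 steps, matching the final step in the results table. A sketch of that schedule (the function name and the no-warmup assumption are mine, not from the training code):

```python
def linear_lr(step, total_steps=3360, base_lr=6e-5):
    """Linear decay from base_lr to 0 over total_steps.

    Assumes no warmup phase; total_steps = 112 steps/epoch * 30 epochs,
    as reflected in the Step column of the training results.
    """
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # 6e-05 at the start of training
print(linear_lr(1680))  # 3e-05 halfway through
print(linear_lr(3360))  # 0.0 at the final step
```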
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Residential | Accuracy Commercial | Accuracy Industrial | Accuracy Public | Accuracy Other | Iou Background | Iou Residential | Iou Commercial | Iou Industrial | Iou Public | Iou Other |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.6613 | 1.0 | 112 | 1.0186 | 0.2282 | 0.3923 | 0.6080 | nan | 0.7908 | 0.6591 | 0.1395 | 0.3716 | 0.0005 | 0.0 | 0.7023 | 0.3610 | 0.0698 | 0.2355 | 0.0005 |
0.6229 | 2.0 | 224 | 0.9143 | 0.3063 | 0.4309 | 0.6547 | nan | 0.8927 | 0.4666 | 0.1872 | 0.6030 | 0.0052 | nan | 0.7356 | 0.3453 | 0.0835 | 0.3617 | 0.0051 |
0.474 | 3.0 | 336 | 0.9810 | 0.3225 | 0.4892 | 0.6415 | nan | 0.8693 | 0.3502 | 0.5204 | 0.6787 | 0.0273 | nan | 0.7447 | 0.2791 | 0.2138 | 0.3482 | 0.0267 |
0.416 | 4.0 | 448 | 1.0021 | 0.3657 | 0.5521 | 0.6573 | nan | 0.8784 | 0.4672 | 0.7352 | 0.4826 | 0.1971 | nan | 0.7372 | 0.3384 | 0.2748 | 0.3262 | 0.1518 |
0.3421 | 5.0 | 560 | 1.0451 | 0.3722 | 0.5476 | 0.6509 | nan | 0.8677 | 0.4221 | 0.6045 | 0.5012 | 0.3425 | nan | 0.7387 | 0.3117 | 0.2801 | 0.3251 | 0.2053 |
0.2723 | 6.0 | 672 | 1.2387 | 0.3482 | 0.4987 | 0.6411 | nan | 0.8439 | 0.5259 | 0.4330 | 0.4398 | 0.2511 | nan | 0.7547 | 0.3376 | 0.1830 | 0.2817 | 0.1840 |
0.2719 | 7.0 | 784 | 1.1848 | 0.3489 | 0.4784 | 0.6576 | nan | 0.8764 | 0.5764 | 0.3244 | 0.4185 | 0.1962 | nan | 0.7348 | 0.3760 | 0.1931 | 0.2871 | 0.1537 |
0.1785 | 8.0 | 896 | 1.2896 | 0.3686 | 0.5563 | 0.6438 | nan | 0.8694 | 0.4545 | 0.7179 | 0.3925 | 0.3472 | nan | 0.7536 | 0.3200 | 0.3063 | 0.2641 | 0.1988 |
0.1309 | 9.0 | 1008 | 1.3525 | 0.3515 | 0.4856 | 0.6480 | nan | 0.8511 | 0.4882 | 0.3594 | 0.5495 | 0.1796 | nan | 0.7557 | 0.3365 | 0.2199 | 0.3031 | 0.1426 |
0.1106 | 10.0 | 1120 | 1.4989 | 0.3494 | 0.5203 | 0.6379 | nan | 0.8556 | 0.3745 | 0.5852 | 0.5822 | 0.2041 | nan | 0.7496 | 0.2844 | 0.2445 | 0.3171 | 0.1513 |
0.1281 | 11.0 | 1232 | 1.5308 | 0.3652 | 0.5315 | 0.6550 | nan | 0.8469 | 0.5610 | 0.5537 | 0.4390 | 0.2569 | nan | 0.7555 | 0.3650 | 0.2274 | 0.3013 | 0.1770 |
0.0942 | 12.0 | 1344 | 1.5054 | 0.3547 | 0.5125 | 0.6486 | nan | 0.8600 | 0.4925 | 0.5328 | 0.4894 | 0.1876 | nan | 0.7544 | 0.3393 | 0.2390 | 0.2960 | 0.1450 |
0.1254 | 13.0 | 1456 | 1.5499 | 0.3497 | 0.5026 | 0.6400 | nan | 0.8533 | 0.5087 | 0.4680 | 0.4197 | 0.2634 | nan | 0.7487 | 0.3391 | 0.2159 | 0.2656 | 0.1794 |
0.1043 | 14.0 | 1568 | 1.5838 | 0.3543 | 0.5093 | 0.6513 | nan | 0.8712 | 0.4877 | 0.4996 | 0.4779 | 0.2101 | nan | 0.7567 | 0.3410 | 0.2234 | 0.2991 | 0.1514 |
0.0841 | 15.0 | 1680 | 1.6761 | 0.3677 | 0.5503 | 0.6507 | nan | 0.8639 | 0.4700 | 0.7035 | 0.4673 | 0.2470 | nan | 0.7547 | 0.3283 | 0.2825 | 0.3011 | 0.1716 |
0.0765 | 16.0 | 1792 | 1.7000 | 0.3558 | 0.5328 | 0.6526 | nan | 0.9054 | 0.3970 | 0.6789 | 0.4872 | 0.1955 | nan | 0.7523 | 0.3063 | 0.2699 | 0.3055 | 0.1451 |
0.0712 | 17.0 | 1904 | 1.8459 | 0.3587 | 0.5353 | 0.6467 | nan | 0.8638 | 0.4774 | 0.6618 | 0.4534 | 0.2200 | nan | 0.7542 | 0.3283 | 0.2581 | 0.2895 | 0.1635 |
0.0819 | 18.0 | 2016 | 1.7872 | 0.3544 | 0.5048 | 0.6563 | nan | 0.8925 | 0.5231 | 0.4922 | 0.4012 | 0.2150 | nan | 0.7600 | 0.3556 | 0.2259 | 0.2710 | 0.1596 |
0.0622 | 19.0 | 2128 | 1.8449 | 0.3649 | 0.5505 | 0.6451 | nan | 0.8498 | 0.4761 | 0.6958 | 0.4564 | 0.2742 | nan | 0.7556 | 0.3268 | 0.2658 | 0.2930 | 0.1830 |
0.0353 | 20.0 | 2240 | 1.8546 | 0.3511 | 0.5185 | 0.6412 | nan | 0.8546 | 0.4634 | 0.5779 | 0.4835 | 0.2131 | nan | 0.7545 | 0.3182 | 0.2255 | 0.2945 | 0.1626 |
0.0302 | 21.0 | 2352 | 1.8425 | 0.3545 | 0.5222 | 0.6468 | nan | 0.8731 | 0.4495 | 0.5990 | 0.4813 | 0.2082 | nan | 0.7563 | 0.3200 | 0.2444 | 0.2939 | 0.1580 |
0.0523 | 22.0 | 2464 | 1.9309 | 0.3565 | 0.5288 | 0.6450 | nan | 0.8509 | 0.4838 | 0.6172 | 0.4812 | 0.2110 | nan | 0.7561 | 0.3311 | 0.2390 | 0.2947 | 0.1618 |
0.0374 | 23.0 | 2576 | 1.9155 | 0.3529 | 0.5157 | 0.6481 | nan | 0.8688 | 0.4734 | 0.5491 | 0.4763 | 0.2110 | nan | 0.7528 | 0.3317 | 0.2218 | 0.2983 | 0.1597 |
0.0466 | 24.0 | 2688 | 1.9691 | 0.3515 | 0.5178 | 0.6437 | nan | 0.8645 | 0.4620 | 0.5759 | 0.4724 | 0.2140 | nan | 0.7538 | 0.3246 | 0.2296 | 0.2893 | 0.1599 |
0.0243 | 25.0 | 2800 | 1.9683 | 0.3565 | 0.5246 | 0.6461 | nan | 0.8617 | 0.4950 | 0.5975 | 0.4406 | 0.2284 | nan | 0.7571 | 0.3355 | 0.2414 | 0.2802 | 0.1684 |
0.0287 | 26.0 | 2912 | 1.9427 | 0.3552 | 0.5152 | 0.6491 | nan | 0.8651 | 0.5151 | 0.5380 | 0.4333 | 0.2246 | nan | 0.7555 | 0.3468 | 0.2270 | 0.2802 | 0.1668 |
0.0575 | 27.0 | 3024 | 2.0501 | 0.3538 | 0.5194 | 0.6471 | nan | 0.8583 | 0.5049 | 0.5777 | 0.4570 | 0.1992 | nan | 0.7564 | 0.3415 | 0.2293 | 0.2877 | 0.1538 |
0.0523 | 28.0 | 3136 | 1.9851 | 0.3553 | 0.5166 | 0.6489 | nan | 0.8639 | 0.5234 | 0.5550 | 0.4250 | 0.2156 | nan | 0.7565 | 0.3483 | 0.2329 | 0.2759 | 0.1629 |
0.0243 | 29.0 | 3248 | 1.9891 | 0.3533 | 0.5148 | 0.6480 | nan | 0.8676 | 0.5087 | 0.5591 | 0.4325 | 0.2064 | nan | 0.7561 | 0.3427 | 0.2312 | 0.2779 | 0.1584 |
0.0514 | 30.0 | 3360 | 1.9831 | 0.3535 | 0.5149 | 0.6482 | nan | 0.8665 | 0.5016 | 0.5567 | 0.4487 | 0.2007 | nan | 0.7569 | 0.3413 | 0.2314 | 0.2827 | 0.1552 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2