# segformer-b5-finetuned-ade20k-morphpadver1-hgo-coord_40epochs_distortion_global_norm
This model is a fine-tuned version of nvidia/segformer-b5-finetuned-ade-640-640 on the NICOPOI-9/Morphpad_HGO_1600_coord_global_norm dataset. It achieves the following results on the evaluation set:
- Loss: 0.1027
- Mean Iou: 0.9780
- Mean Accuracy: 0.9889
- Overall Accuracy: 0.9886
- Accuracy 0-0: 0.9903
- Accuracy 0-90: 0.9850
- Accuracy 90-0: 0.9874
- Accuracy 90-90: 0.9929
- Iou 0-0: 0.9819
- Iou 0-90: 0.9740
- Iou 90-0: 0.9729
- Iou 90-90: 0.9831
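The reported Mean IoU is the unweighted average of the four per-class IoU values above. A minimal sketch of how per-class IoU is typically derived from a pixel-level confusion matrix (the helper function is illustrative, not the evaluation code used for this card; the class labels are the four orientation classes reported above):

```python
def iou_per_class(confusion):
    """Per-class IoU from a square confusion matrix.

    confusion[i][j] = number of pixels with true class i predicted as class j.
    IoU_c = TP_c / (TP_c + FP_c + FN_c).
    """
    n = len(confusion)
    ious = []
    for c in range(n):
        tp = confusion[c][c]
        fn = sum(confusion[c]) - tp                       # missed pixels of class c
        fp = sum(confusion[r][c] for r in range(n)) - tp  # pixels wrongly assigned to c
        denom = tp + fp + fn
        ious.append(tp / denom if denom else 0.0)
    return ious

# Averaging the four reported per-class IoUs reproduces the reported Mean Iou.
reported = {"0-0": 0.9819, "0-90": 0.9740, "90-0": 0.9729, "90-90": 0.9831}
mean_iou = sum(reported.values()) / len(reported)
print(f"mean IoU = {mean_iou:.4f}")  # mean IoU = 0.9780
```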
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
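The training log below records every 4000 steps; since step 4000 falls at epoch 1.3680 and the batch size is 1, the run has roughly 4000 / 1.3680 ≈ 2924 optimizer steps per epoch (about 2924 training samples). A back-of-the-envelope sketch of the resulting linear learning-rate schedule, under the assumption of zero warmup steps (warmup settings are not listed in this card):

```python
LEARNING_RATE = 6e-5
NUM_EPOCHS = 40

# Inferred from the training log: step 4000 corresponds to epoch 1.3680.
steps_per_epoch = round(4000 / 1.3680)         # ~2924 steps/epoch at batch size 1
total_steps = NUM_EPOCHS * steps_per_epoch     # ~116,960 steps over 40 epochs

def linear_lr(step, base_lr=LEARNING_RATE, total=total_steps):
    """Linearly decay the LR from base_lr to 0 (assumes no warmup phase)."""
    return base_lr * max(0.0, 1.0 - step / total)

print(steps_per_epoch)  # 2924
print(linear_lr(0))     # 6e-05
```

This is consistent with the log's final entry: step 116,000 at epoch 39.6717 implies the same ~2924 steps per epoch.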
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.2959 | 1.3680 | 4000 | 1.2209 | 0.2639 | 0.4166 | 0.4247 | 0.2827 | 0.4642 | 0.5298 | 0.3899 | 0.2232 | 0.2838 | 0.2934 | 0.2550 |
0.8526 | 2.7360 | 8000 | 0.9446 | 0.4145 | 0.5789 | 0.5841 | 0.5010 | 0.6754 | 0.5810 | 0.5582 | 0.4193 | 0.4020 | 0.4076 | 0.4290 |
0.6552 | 4.1040 | 12000 | 0.7331 | 0.5293 | 0.6854 | 0.6880 | 0.6425 | 0.6769 | 0.7503 | 0.6717 | 0.5626 | 0.4909 | 0.5151 | 0.5485 |
0.4479 | 5.4720 | 16000 | 0.5873 | 0.6216 | 0.7621 | 0.7634 | 0.7338 | 0.8038 | 0.7429 | 0.7677 | 0.6463 | 0.5867 | 0.6042 | 0.6494 |
0.3862 | 6.8399 | 20000 | 0.4229 | 0.7262 | 0.8411 | 0.8396 | 0.8751 | 0.8489 | 0.8048 | 0.8354 | 0.7527 | 0.6991 | 0.7079 | 0.7449 |
1.3206 | 8.2079 | 24000 | 0.2964 | 0.8276 | 0.9031 | 0.9040 | 0.8928 | 0.9069 | 0.9194 | 0.8932 | 0.8544 | 0.8091 | 0.8044 | 0.8427 |
1.2794 | 9.5759 | 28000 | 0.2393 | 0.8588 | 0.9236 | 0.9229 | 0.9416 | 0.8933 | 0.9435 | 0.9160 | 0.8776 | 0.8406 | 0.8410 | 0.8760 |
0.1104 | 10.9439 | 32000 | 0.1972 | 0.8876 | 0.9409 | 0.9396 | 0.9546 | 0.9336 | 0.9196 | 0.9559 | 0.9028 | 0.8736 | 0.8700 | 0.9040 |
0.5852 | 12.3119 | 36000 | 0.1674 | 0.9102 | 0.9536 | 0.9523 | 0.9572 | 0.9438 | 0.9339 | 0.9794 | 0.9167 | 0.9024 | 0.8896 | 0.9320 |
0.0711 | 13.6799 | 40000 | 0.1398 | 0.9289 | 0.9629 | 0.9625 | 0.9648 | 0.9542 | 0.9641 | 0.9684 | 0.9374 | 0.9178 | 0.9169 | 0.9435 |
0.5558 | 15.0479 | 44000 | 0.1391 | 0.9337 | 0.9656 | 0.9651 | 0.9677 | 0.9580 | 0.9638 | 0.9728 | 0.9424 | 0.9235 | 0.9227 | 0.9461 |
1.2811 | 16.4159 | 48000 | 0.1252 | 0.9426 | 0.9706 | 0.9700 | 0.9776 | 0.9681 | 0.9591 | 0.9777 | 0.9521 | 0.9341 | 0.9326 | 0.9516 |
1.1932 | 17.7839 | 52000 | 0.1162 | 0.9486 | 0.9740 | 0.9733 | 0.9811 | 0.9641 | 0.9690 | 0.9818 | 0.9553 | 0.9404 | 0.9428 | 0.9558 |
1.2719 | 19.1518 | 56000 | 0.1141 | 0.9533 | 0.9759 | 0.9757 | 0.9744 | 0.9717 | 0.9761 | 0.9814 | 0.9581 | 0.9487 | 0.9428 | 0.9638 |
0.0857 | 20.5198 | 60000 | 0.1049 | 0.9590 | 0.9790 | 0.9787 | 0.9783 | 0.9746 | 0.9761 | 0.9870 | 0.9655 | 0.9531 | 0.9508 | 0.9665 |
1.1869 | 21.8878 | 64000 | 0.1069 | 0.9558 | 0.9774 | 0.9770 | 0.9781 | 0.9625 | 0.9826 | 0.9865 | 0.9641 | 0.9494 | 0.9448 | 0.9649 |
0.0294 | 23.2558 | 68000 | 0.1028 | 0.9641 | 0.9819 | 0.9814 | 0.9840 | 0.9772 | 0.9760 | 0.9903 | 0.9680 | 0.9586 | 0.9564 | 0.9733 |
0.5373 | 24.6238 | 72000 | 0.1089 | 0.9639 | 0.9815 | 0.9813 | 0.9813 | 0.9771 | 0.9807 | 0.9869 | 0.9696 | 0.9584 | 0.9564 | 0.9711 |
0.7069 | 25.9918 | 76000 | 0.1026 | 0.9683 | 0.9840 | 0.9836 | 0.9859 | 0.9764 | 0.9842 | 0.9893 | 0.9734 | 0.9630 | 0.9620 | 0.9750 |
0.0382 | 27.3598 | 80000 | 0.1075 | 0.9647 | 0.9818 | 0.9819 | 0.9748 | 0.9835 | 0.9800 | 0.9890 | 0.9627 | 0.9630 | 0.9591 | 0.9739 |
0.0143 | 28.7278 | 84000 | 0.1082 | 0.9719 | 0.9858 | 0.9855 | 0.9896 | 0.9804 | 0.9864 | 0.9866 | 0.9778 | 0.9679 | 0.9654 | 0.9766 |
1.2277 | 30.0958 | 88000 | 0.0961 | 0.9730 | 0.9864 | 0.9861 | 0.9889 | 0.9818 | 0.9829 | 0.9922 | 0.9773 | 0.9685 | 0.9671 | 0.9791 |
1.1838 | 31.4637 | 92000 | 0.0947 | 0.9739 | 0.9868 | 0.9865 | 0.9899 | 0.9834 | 0.9846 | 0.9893 | 0.9785 | 0.9693 | 0.9679 | 0.9801 |
1.2587 | 32.8317 | 96000 | 0.0925 | 0.9745 | 0.9871 | 0.9869 | 0.9902 | 0.9827 | 0.9864 | 0.9893 | 0.9792 | 0.9700 | 0.9689 | 0.9801 |
0.0103 | 34.1997 | 100000 | 0.0988 | 0.9748 | 0.9871 | 0.9870 | 0.9865 | 0.9844 | 0.9869 | 0.9907 | 0.9782 | 0.9713 | 0.9693 | 0.9803 |
1.2995 | 35.5677 | 104000 | 0.0980 | 0.9763 | 0.9880 | 0.9878 | 0.9887 | 0.9838 | 0.9874 | 0.9921 | 0.9805 | 0.9727 | 0.9708 | 0.9813 |
0.0678 | 36.9357 | 108000 | 0.1043 | 0.9778 | 0.9888 | 0.9886 | 0.9899 | 0.9857 | 0.9873 | 0.9922 | 0.9812 | 0.9743 | 0.9722 | 0.9835 |
0.0094 | 38.3037 | 112000 | 0.1031 | 0.9775 | 0.9886 | 0.9884 | 0.9896 | 0.9848 | 0.9884 | 0.9916 | 0.9816 | 0.9742 | 0.9717 | 0.9826 |
0.1925 | 39.6717 | 116000 | 0.1027 | 0.9780 | 0.9889 | 0.9886 | 0.9903 | 0.9850 | 0.9874 | 0.9929 | 0.9819 | 0.9740 | 0.9729 | 0.9831 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0