# segformer-b5-finetuned-morphpadver1-hgo-coord-v6
This model is a fine-tuned version of nvidia/mit-b5 on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:
- Loss: 0.0083
- Mean Iou: 0.9982
- Mean Accuracy: 0.9991
- Overall Accuracy: 0.9991
- Accuracy 0-0: 0.9993
- Accuracy 0-90: 0.9991
- Accuracy 90-0: 0.9996
- Accuracy 90-90: 0.9984
- Iou 0-0: 0.9988
- Iou 0-90: 0.9981
- Iou 90-0: 0.9979
- Iou 90-90: 0.9981
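For reference, the per-class IoU and accuracy figures above follow the standard semantic-segmentation definitions (intersection over union, and per-class pixel recall). A minimal NumPy sketch of those definitions, using a tiny illustrative label array rather than anything from the actual evaluation set:

```python
import numpy as np

def per_class_metrics(pred, label, num_classes):
    """Per-class IoU and per-class accuracy from flat class-index arrays."""
    ious, accs = [], []
    for c in range(num_classes):
        pred_c = pred == c
        label_c = label == c
        inter = np.logical_and(pred_c, label_c).sum()
        union = np.logical_or(pred_c, label_c).sum()
        # IoU: intersection / union; accuracy: correct pixels / labelled pixels
        ious.append(inter / union if union else float("nan"))
        accs.append(inter / label_c.sum() if label_c.sum() else float("nan"))
    return np.array(ious), np.array(accs)

# Toy example with the card's 4 classes (0-0, 0-90, 90-0, 90-90 as indices 0..3)
pred = np.array([0, 0, 1, 1, 2, 3])
label = np.array([0, 1, 1, 1, 2, 3])
ious, accs = per_class_metrics(pred, label, num_classes=4)
mean_iou = float(np.nanmean(ious))  # "Mean Iou" is the unweighted class average
```

The "Overall Accuracy" row is the pixel-wise accuracy over all classes pooled together, which is why it can differ slightly from the per-class mean.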
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
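With a linear scheduler and no warmup steps listed, the learning rate decays linearly from 6e-05 at step 0 to 0 at the final step. A minimal sketch of that schedule (the 75,400-step total is an estimate inferred from the step/epoch ratio in the results table, not a logged value):

```python
def linear_lr(step, total_steps, base_lr=6e-5):
    """Linear decay with no warmup: base_lr at step 0, 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Rough illustration: ~1,508 steps/epoch x 50 epochs ~= 75,400 total steps
total = 75_400
lr_start = linear_lr(0, total)       # 6e-05
lr_mid = linear_lr(total // 2, total)
lr_end = linear_lr(total, total)     # 0.0
```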
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
0.9111 | 2.6525 | 4000 | 0.8614 | 0.4113 | 0.5804 | 0.5812 | 0.6876 | 0.3429 | 0.6048 | 0.6862 | 0.5164 | 0.2855 | 0.3396 | 0.5035 |
0.4003 | 5.3050 | 8000 | 0.4114 | 0.6920 | 0.8177 | 0.8178 | 0.8421 | 0.7507 | 0.7478 | 0.9300 | 0.7679 | 0.6480 | 0.6597 | 0.6924 |
0.2246 | 7.9576 | 12000 | 0.2023 | 0.8443 | 0.9155 | 0.9155 | 0.9119 | 0.8979 | 0.8940 | 0.9583 | 0.8755 | 0.8355 | 0.8336 | 0.8327 |
0.1235 | 10.6101 | 16000 | 0.1268 | 0.9097 | 0.9527 | 0.9528 | 0.9612 | 0.9442 | 0.9450 | 0.9605 | 0.9223 | 0.9063 | 0.8991 | 0.9112 |
0.1012 | 13.2626 | 20000 | 0.0789 | 0.9445 | 0.9715 | 0.9715 | 0.9731 | 0.9707 | 0.9771 | 0.9650 | 0.9520 | 0.9426 | 0.9391 | 0.9445 |
0.0473 | 15.9151 | 24000 | 0.0582 | 0.9606 | 0.9799 | 0.9799 | 0.9769 | 0.9807 | 0.9832 | 0.9789 | 0.9615 | 0.9573 | 0.9588 | 0.9648 |
0.0258 | 18.5676 | 28000 | 0.0353 | 0.9830 | 0.9914 | 0.9914 | 0.9908 | 0.9915 | 0.9927 | 0.9906 | 0.9837 | 0.9824 | 0.9807 | 0.9850 |
0.046 | 21.2202 | 32000 | 0.0361 | 0.9839 | 0.9919 | 0.9919 | 0.9904 | 0.9916 | 0.9934 | 0.9922 | 0.9834 | 0.9832 | 0.9824 | 0.9866 |
0.0169 | 23.8727 | 36000 | 0.0262 | 0.9874 | 0.9937 | 0.9937 | 0.9937 | 0.9932 | 0.9935 | 0.9943 | 0.9883 | 0.9865 | 0.9871 | 0.9878 |
0.0127 | 26.5252 | 40000 | 0.0166 | 0.9926 | 0.9963 | 0.9963 | 0.9961 | 0.9957 | 0.9965 | 0.9968 | 0.9933 | 0.9918 | 0.9919 | 0.9934 |
0.0249 | 29.1777 | 44000 | 0.0222 | 0.9924 | 0.9962 | 0.9962 | 0.9931 | 0.9984 | 0.9962 | 0.9972 | 0.9913 | 0.9921 | 0.9919 | 0.9945 |
0.007 | 31.8302 | 48000 | 0.0114 | 0.9960 | 0.9980 | 0.9980 | 0.9979 | 0.9978 | 0.9981 | 0.9982 | 0.9960 | 0.9960 | 0.9956 | 0.9963 |
0.0061 | 34.4828 | 52000 | 0.0123 | 0.9966 | 0.9983 | 0.9983 | 0.9983 | 0.9981 | 0.9990 | 0.9978 | 0.9974 | 0.9963 | 0.9960 | 0.9965 |
0.0073 | 37.1353 | 56000 | 0.0125 | 0.9965 | 0.9983 | 0.9982 | 0.9977 | 0.9986 | 0.9985 | 0.9982 | 0.9968 | 0.9966 | 0.9961 | 0.9965 |
0.0053 | 39.7878 | 60000 | 0.0111 | 0.9974 | 0.9987 | 0.9987 | 0.9989 | 0.9985 | 0.9982 | 0.9993 | 0.9979 | 0.9974 | 0.9969 | 0.9975 |
0.0041 | 42.4403 | 64000 | 0.0125 | 0.9979 | 0.9989 | 0.9989 | 0.9988 | 0.9992 | 0.9991 | 0.9987 | 0.9979 | 0.9981 | 0.9976 | 0.9980 |
0.0037 | 45.0928 | 68000 | 0.0088 | 0.9980 | 0.9990 | 0.9990 | 0.9992 | 0.9991 | 0.9992 | 0.9985 | 0.9986 | 0.9980 | 0.9976 | 0.9979 |
0.0082 | 47.7454 | 72000 | 0.0083 | 0.9982 | 0.9991 | 0.9991 | 0.9993 | 0.9991 | 0.9996 | 0.9984 | 0.9988 | 0.9981 | 0.9979 | 0.9981 |
### Framework versions
- Transformers 4.48.3
- PyTorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
Base model: nvidia/mit-b5