---
library_name: transformers
license: other
base_model: nvidia/mit-b3
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b3-finetuned-morphpadver1-hgo-coord-v4
  results: []
---
# segformer-b3-finetuned-morphpadver1-hgo-coord-v4
This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:
- Loss: 0.0105
- Mean Iou: 0.9978
- Mean Accuracy: 0.9989
- Overall Accuracy: 0.9989
- Accuracy 0-0: 0.9991
- Accuracy 0-90: 0.9988
- Accuracy 90-0: 0.9988
- Accuracy 90-90: 0.9989
- Iou 0-0: 0.9983
- Iou 0-90: 0.9977
- Iou 90-0: 0.9978
- Iou 90-90: 0.9974
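
The per-class metrics above cover the four rotation classes (0-0, 0-90, 90-0, 90-90). Below is a minimal inference sketch; the Hub repository id and the input image path are assumptions for illustration, not details confirmed by this card.

```python
# Hedged sketch: load the fine-tuned SegFormer checkpoint and segment one image.
# The repo id is an assumption based on the dataset owner (NICOPOI-9); replace it
# with the actual checkpoint location.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "NICOPOI-9/segformer-b3-finetuned-morphpadver1-hgo-coord-v4"  # assumed
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
print(pred_mask.shape, pred_mask.unique())
```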
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
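
For reference, these settings map onto the standard 🤗 `TrainingArguments` roughly as sketched below. This is an assumption about how the run was configured (the output directory is illustrative), not the exact training script.

```python
# Hedged sketch of TrainingArguments matching the hyperparameters listed above.
# The output_dir value is an illustrative assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b3-finetuned-morphpadver1-hgo-coord-v4",
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",   # AdamW (torch) with betas=(0.9, 0.999), eps=1e-8
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```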
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.0526 | 2.6525 | 4000 | 1.0174 | 0.3366 | 0.5028 | 0.5035 | 0.5781 | 0.3249 | 0.6323 | 0.4758 | 0.3780 | 0.2714 | 0.3136 | 0.3836 |
0.7543 | 5.3050 | 8000 | 0.5721 | 0.5952 | 0.7459 | 0.7458 | 0.7487 | 0.7334 | 0.6962 | 0.8051 | 0.6389 | 0.5638 | 0.5555 | 0.6227 |
0.1714 | 7.9576 | 12000 | 0.1845 | 0.8808 | 0.9366 | 0.9366 | 0.9416 | 0.9454 | 0.9140 | 0.9455 | 0.8984 | 0.8681 | 0.8613 | 0.8956 |
0.2092 | 10.6101 | 16000 | 0.0953 | 0.9378 | 0.9679 | 0.9679 | 0.9719 | 0.9663 | 0.9730 | 0.9602 | 0.9466 | 0.9322 | 0.9312 | 0.9412 |
0.0773 | 13.2626 | 20000 | 0.1115 | 0.9348 | 0.9663 | 0.9663 | 0.9665 | 0.9728 | 0.9588 | 0.9672 | 0.9450 | 0.9220 | 0.9319 | 0.9405 |
0.1 | 15.9151 | 24000 | 0.0842 | 0.9597 | 0.9794 | 0.9794 | 0.9792 | 0.9743 | 0.9802 | 0.9840 | 0.9642 | 0.9559 | 0.9539 | 0.9649 |
0.0297 | 18.5676 | 28000 | 0.0524 | 0.9706 | 0.9851 | 0.9851 | 0.9855 | 0.9868 | 0.9886 | 0.9794 | 0.9753 | 0.9676 | 0.9666 | 0.9728 |
0.0188 | 21.2202 | 32000 | 0.0463 | 0.9813 | 0.9906 | 0.9906 | 0.9911 | 0.9884 | 0.9899 | 0.9930 | 0.9828 | 0.9804 | 0.9787 | 0.9834 |
0.0987 | 23.8727 | 36000 | 0.0377 | 0.9849 | 0.9924 | 0.9924 | 0.9913 | 0.9967 | 0.9885 | 0.9932 | 0.9866 | 0.9816 | 0.9844 | 0.9871 |
0.1753 | 26.5252 | 40000 | 0.0367 | 0.9878 | 0.9939 | 0.9938 | 0.9922 | 0.9929 | 0.9947 | 0.9957 | 0.9876 | 0.9879 | 0.9853 | 0.9903 |
0.0536 | 29.1777 | 44000 | 0.0392 | 0.9880 | 0.9939 | 0.9939 | 0.9946 | 0.9945 | 0.9933 | 0.9934 | 0.9891 | 0.9875 | 0.9859 | 0.9893 |
0.0273 | 31.8302 | 48000 | 0.0450 | 0.9879 | 0.9939 | 0.9939 | 0.9946 | 0.9922 | 0.9937 | 0.9952 | 0.9897 | 0.9850 | 0.9872 | 0.9898 |
0.006 | 34.4828 | 52000 | 0.0272 | 0.9936 | 0.9968 | 0.9968 | 0.9972 | 0.9968 | 0.9966 | 0.9965 | 0.9940 | 0.9938 | 0.9927 | 0.9937 |
0.005 | 37.1353 | 56000 | 0.0240 | 0.9948 | 0.9974 | 0.9974 | 0.9981 | 0.9961 | 0.9975 | 0.9980 | 0.9965 | 0.9936 | 0.9951 | 0.9941 |
0.0039 | 39.7878 | 60000 | 0.0244 | 0.9954 | 0.9977 | 0.9977 | 0.9967 | 0.9981 | 0.9980 | 0.9981 | 0.9953 | 0.9949 | 0.9951 | 0.9965 |
0.0042 | 42.4403 | 64000 | 0.0203 | 0.9961 | 0.9980 | 0.9980 | 0.9982 | 0.9979 | 0.9982 | 0.9979 | 0.9972 | 0.9956 | 0.9954 | 0.9962 |
0.0034 | 45.0928 | 68000 | 0.0165 | 0.9970 | 0.9985 | 0.9985 | 0.9984 | 0.9984 | 0.9984 | 0.9987 | 0.9976 | 0.9966 | 0.9962 | 0.9975 |
0.0033 | 47.7454 | 72000 | 0.0105 | 0.9978 | 0.9989 | 0.9989 | 0.9991 | 0.9988 | 0.9988 | 0.9989 | 0.9983 | 0.9977 | 0.9978 | 0.9974 |
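
The Mean IoU and accuracy columns above are the standard semantic-segmentation metrics. A minimal sketch of computing them with the `evaluate` library is given below, assuming 4 classes and integer label masks; the arrays are placeholders, not data from this run.

```python
# Hedged sketch: compute mean IoU / accuracies with the `evaluate` library,
# analogous to the metrics reported in the table above. Inputs are placeholders.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 4x4 label maps with 4 classes (0..3); real inputs would be full-size masks.
predictions = [np.random.randint(0, 4, size=(4, 4))]
references = [np.random.randint(0, 4, size=(4, 4))]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=4,
    ignore_index=255,     # assumed "ignore" value; adjust to the dataset
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```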
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0