segformer-b4-finetuned-morphpadver1-hgo-coord-v5

This model is a fine-tuned version of nvidia/mit-b4 on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0305
  • Mean Iou: 0.9957
  • Mean Accuracy: 0.9978
  • Overall Accuracy: 0.9978
  • Accuracy 0-0: 0.9985
  • Accuracy 0-90: 0.9967
  • Accuracy 90-0: 0.9974
  • Accuracy 90-90: 0.9988
  • Iou 0-0: 0.9963
  • Iou 0-90: 0.9949
  • Iou 90-0: 0.9946
  • Iou 90-90: 0.9970
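
A minimal inference sketch, assuming the checkpoint is loaded straight from the Hub with the standard `transformers` SegFormer classes; the image path and the class-index-to-label order are placeholders, not confirmed details of this model:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "NICOPOI-9/segformer-b4-finetuned-morphpadver1-hgo-coord-v5"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # per-pixel class indices (0-0, 0-90, 90-0, 90-90; order assumed)
```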

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):

  • learning_rate: 6e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 50
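
A hedged sketch of how these settings map onto `transformers.TrainingArguments`; the output directory and any logging/saving cadence are assumptions, not taken from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b4-finetuned-morphpadver1-hgo-coord-v5",  # assumed name
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",      # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```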

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.1205 | 2.6525 | 4000 | 1.0224 | 0.3348 | 0.5016 | 0.5008 | 0.3938 | 0.4555 | 0.5489 | 0.6084 | 0.3529 | 0.3168 | 0.3227 | 0.3469 |
| 0.7254 | 5.3050 | 8000 | 0.6015 | 0.5633 | 0.7192 | 0.7188 | 0.6706 | 0.7324 | 0.7551 | 0.7186 | 0.6037 | 0.5466 | 0.5314 | 0.5714 |
| 0.3698 | 7.9576 | 12000 | 0.3707 | 0.7204 | 0.8368 | 0.8369 | 0.8560 | 0.8606 | 0.8261 | 0.8044 | 0.7447 | 0.6982 | 0.6938 | 0.7450 |
| 0.2039 | 10.6101 | 16000 | 0.2198 | 0.8391 | 0.9123 | 0.9123 | 0.9102 | 0.9321 | 0.9138 | 0.8931 | 0.8560 | 0.8255 | 0.8231 | 0.8516 |
| 0.0791 | 13.2626 | 20000 | 0.1565 | 0.9056 | 0.9504 | 0.9504 | 0.9499 | 0.9421 | 0.9554 | 0.9542 | 0.9165 | 0.8932 | 0.8953 | 0.9172 |
| 0.0587 | 15.9151 | 24000 | 0.1037 | 0.9456 | 0.9721 | 0.9721 | 0.9732 | 0.9693 | 0.9708 | 0.9749 | 0.9512 | 0.9402 | 0.9402 | 0.9509 |
| 0.0739 | 18.5676 | 28000 | 0.0545 | 0.9677 | 0.9836 | 0.9836 | 0.9836 | 0.9762 | 0.9869 | 0.9876 | 0.9728 | 0.9635 | 0.9640 | 0.9704 |
| 0.0443 | 21.2202 | 32000 | 0.0565 | 0.9716 | 0.9856 | 0.9856 | 0.9887 | 0.9885 | 0.9817 | 0.9835 | 0.9767 | 0.9675 | 0.9687 | 0.9735 |
| 0.1436 | 23.8727 | 36000 | 0.0484 | 0.9763 | 0.9880 | 0.9880 | 0.9898 | 0.9858 | 0.9876 | 0.9889 | 0.9808 | 0.9734 | 0.9730 | 0.9782 |
| 0.0681 | 26.5252 | 40000 | 0.0467 | 0.9831 | 0.9915 | 0.9915 | 0.9915 | 0.9904 | 0.9912 | 0.9929 | 0.9861 | 0.9821 | 0.9790 | 0.9852 |
| 0.0138 | 29.1777 | 44000 | 0.0357 | 0.9868 | 0.9934 | 0.9934 | 0.9930 | 0.9927 | 0.9932 | 0.9946 | 0.9879 | 0.9856 | 0.9842 | 0.9897 |
| 0.0292 | 31.8302 | 48000 | 0.0295 | 0.9898 | 0.9949 | 0.9949 | 0.9957 | 0.9942 | 0.9949 | 0.9947 | 0.9899 | 0.9894 | 0.9879 | 0.9923 |
| 0.0081 | 34.4828 | 52000 | 0.0262 | 0.9915 | 0.9957 | 0.9957 | 0.9958 | 0.9951 | 0.9958 | 0.9962 | 0.9912 | 0.9910 | 0.9901 | 0.9937 |
| 0.0061 | 37.1353 | 56000 | 0.0388 | 0.9905 | 0.9952 | 0.9952 | 0.9949 | 0.9939 | 0.9952 | 0.9969 | 0.9909 | 0.9900 | 0.9874 | 0.9936 |
| 0.006 | 39.7878 | 60000 | 0.0415 | 0.9929 | 0.9964 | 0.9964 | 0.9949 | 0.9963 | 0.9964 | 0.9982 | 0.9926 | 0.9927 | 0.9907 | 0.9955 |
| 0.0056 | 42.4403 | 64000 | 0.0301 | 0.9943 | 0.9972 | 0.9972 | 0.9975 | 0.9971 | 0.9966 | 0.9974 | 0.9961 | 0.9927 | 0.9931 | 0.9954 |
| 0.005 | 45.0928 | 68000 | 0.0213 | 0.9957 | 0.9978 | 0.9978 | 0.9982 | 0.9979 | 0.9976 | 0.9978 | 0.9968 | 0.9948 | 0.9946 | 0.9967 |
| 0.0041 | 47.7454 | 72000 | 0.0305 | 0.9957 | 0.9978 | 0.9978 | 0.9985 | 0.9967 | 0.9974 | 0.9988 | 0.9963 | 0.9949 | 0.9946 | 0.9970 |
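
The per-class Accuracy and IoU columns follow the usual semantic-segmentation definitions. A hedged sketch of how the same metric family can be reproduced with the `evaluate` library's `mean_iou` metric; the arrays, ignore index, and class-index order are placeholders, not details confirmed by this card:

```python
import numpy as np
import evaluate

# Sketch only: `preds` and `labels` are placeholder per-pixel class-index maps;
# in practice they come from the model's upsampled argmax and the dataset masks.
metric = evaluate.load("mean_iou")

preds = [np.random.randint(0, 4, size=(512, 512)) for _ in range(2)]
labels = [np.random.randint(0, 4, size=(512, 512)) for _ in range(2)]

results = metric.compute(
    predictions=preds,
    references=labels,
    num_labels=4,        # classes 0-0, 0-90, 90-0, 90-90 (index order assumed)
    ignore_index=255,    # common convention; the value used originally is not stated
    reduce_labels=False,
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # corresponds to the Iou 0-0 ... Iou 90-90 columns
print(results["per_category_accuracy"])  # corresponds to the Accuracy 0-0 ... 90-90 columns
```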

Framework versions

  • Transformers 4.48.3
  • Pytorch 2.1.0
  • Datasets 3.2.0
  • Tokenizers 0.21.0