# sagittal-b4-finetuned-segments
This model is a fine-tuned version of [nvidia/mit-b4](https://huggingface.co/nvidia/mit-b4) on the jenniferlumeng/Sagittal dataset. It achieves the following results on the evaluation set:
- Loss: 0.5610
- Mean Iou: 0.6387
- Mean Accuracy: 0.7597
- Overall Accuracy: 0.7684
- Accuracy Background: nan
- Accuracy Olfactory bulb: 0.7170
- Accuracy Anterior olfactory nucleus: 0.6456
- Accuracy Basal ganglia: 0.7788
- Accuracy Cortex: 0.7965
- Accuracy Hypothalamus: 0.6187
- Accuracy Thalamus: 0.7553
- Accuracy Hippocampus: 0.8524
- Accuracy Midbrain: 0.8602
- Accuracy Cerebellum: 0.7899
- Accuracy Pons and medulla: 0.7831
- Iou Background: 0.0
- Iou Olfactory bulb: 0.6979
- Iou Anterior olfactory nucleus: 0.5897
- Iou Basal ganglia: 0.7036
- Iou Cortex: 0.7569
- Iou Hypothalamus: 0.5348
- Iou Thalamus: 0.7058
- Iou Hippocampus: 0.8192
- Iou Midbrain: 0.7187
- Iou Cerebellum: 0.7689
- Iou Pons and medulla: 0.7295
## Model description
More information needed
## Intended uses & limitations
More information needed
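Pending fuller documentation, here is a minimal, hedged inference sketch using the standard `transformers` SegFormer classes; the input filename is hypothetical, and the repo id is taken from this card:

```python
# Minimal inference sketch (not part of the original card).
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "jenniferlumeng/sagittal-b4-finetuned-segments"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("sagittal_slice.png").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer emits logits at 1/4 of the input resolution, so upsample
# before taking the per-pixel argmax. PIL's image.size is (W, H).
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```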
## Training and evaluation data
More information needed
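The data pipeline is likewise undocumented; as a hedged sketch, the dataset named in the intro can presumably be loaded as follows (split names and column layout are assumptions):

```python
# Hedged sketch: loading the fine-tuning dataset named in the card intro.
from datasets import load_dataset

ds = load_dataset("jenniferlumeng/Sagittal")
print(ds)  # inspect available splits and columns
```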
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
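A hedged reconstruction of these settings as `transformers` `TrainingArguments`; `output_dir` and any setting not listed above are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sagittal-b4-finetuned-segments",  # assumed, not stated in the card
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```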
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Olfactory bulb | Accuracy Anterior olfactory nucleus | Accuracy Basal ganglia | Accuracy Cortex | Accuracy Hypothalamus | Accuracy Thalamus | Accuracy Hippocampus | Accuracy Midbrain | Accuracy Cerebellum | Accuracy Pons and medulla | Iou Background | Iou Olfactory bulb | Iou Anterior olfactory nucleus | Iou Basal ganglia | Iou Cortex | Iou Hypothalamus | Iou Thalamus | Iou Hippocampus | Iou Midbrain | Iou Cerebellum | Iou Pons and medulla |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.1622 | 3.3333 | 20 | 1.5275 | 0.2290 | 0.2936 | 0.3032 | nan | 0.3546 | 0.0661 | 0.3629 | 0.4619 | 0.3534 | 0.0245 | 0.4997 | 0.3121 | 0.3472 | 0.1532 | 0.0 | 0.3179 | 0.0654 | 0.2793 | 0.3896 | 0.3249 | 0.0232 | 0.4386 | 0.1852 | 0.3458 | 0.1488 |
0.8211 | 6.6667 | 40 | 1.0160 | 0.3284 | 0.4622 | 0.4886 | nan | 0.4025 | 0.2457 | 0.4573 | 0.7211 | 0.3213 | 0.7259 | 0.5201 | 0.4266 | 0.3944 | 0.4069 | 0.0 | 0.3734 | 0.2422 | 0.3826 | 0.4803 | 0.2946 | 0.2820 | 0.4623 | 0.3025 | 0.3941 | 0.3986 |
0.2823 | 10.0 | 60 | 0.9503 | 0.4263 | 0.5468 | 0.5681 | nan | 0.4108 | 0.3905 | 0.5659 | 0.6721 | 0.4120 | 0.7734 | 0.5256 | 0.6114 | 0.6022 | 0.5039 | 0.0 | 0.4030 | 0.3792 | 0.4867 | 0.6002 | 0.3712 | 0.5518 | 0.4860 | 0.4593 | 0.4789 | 0.4733 |
0.4346 | 13.3333 | 80 | 0.6683 | 0.5384 | 0.6798 | 0.7221 | nan | 0.5424 | 0.5376 | 0.7562 | 0.8557 | 0.5587 | 0.8000 | 0.5245 | 0.7792 | 0.7022 | 0.7419 | 0.0 | 0.5198 | 0.5055 | 0.6668 | 0.7488 | 0.4901 | 0.6675 | 0.4835 | 0.5702 | 0.6572 | 0.6131 |
0.1348 | 16.6667 | 100 | 0.5909 | 0.5275 | 0.6836 | 0.7131 | nan | 0.5024 | 0.5379 | 0.6884 | 0.7820 | 0.6158 | 0.8733 | 0.5253 | 0.7972 | 0.8618 | 0.6525 | 0.0 | 0.4253 | 0.5029 | 0.5920 | 0.7553 | 0.5203 | 0.5693 | 0.4756 | 0.5773 | 0.7332 | 0.6511 |
0.1317 | 20.0 | 120 | 0.5279 | 0.6000 | 0.7499 | 0.7699 | nan | 0.6679 | 0.6691 | 0.6804 | 0.9212 | 0.6790 | 0.7882 | 0.7477 | 0.8093 | 0.8195 | 0.7169 | 0.0 | 0.6282 | 0.6283 | 0.6090 | 0.7796 | 0.6010 | 0.5967 | 0.6350 | 0.6556 | 0.7736 | 0.6933 |
0.2667 | 23.3333 | 140 | 0.6451 | 0.5482 | 0.6840 | 0.6961 | nan | 0.6738 | 0.5915 | 0.6175 | 0.7717 | 0.6215 | 0.7162 | 0.7077 | 0.7127 | 0.7174 | 0.7097 | 0.0 | 0.6220 | 0.5489 | 0.5643 | 0.7119 | 0.5204 | 0.5866 | 0.6468 | 0.5720 | 0.6582 | 0.5986 |
0.3673 | 26.6667 | 160 | 0.5395 | 0.5843 | 0.7265 | 0.7280 | nan | 0.7682 | 0.6859 | 0.6984 | 0.8040 | 0.6214 | 0.7752 | 0.8302 | 0.7929 | 0.5215 | 0.7669 | 0.0 | 0.7397 | 0.6224 | 0.6607 | 0.6138 | 0.5355 | 0.6739 | 0.6855 | 0.6733 | 0.5017 | 0.7208 |
0.345 | 30.0 | 180 | 0.4865 | 0.6101 | 0.7534 | 0.7675 | nan | 0.7244 | 0.7111 | 0.7634 | 0.9073 | 0.7027 | 0.7449 | 0.7589 | 0.8557 | 0.6596 | 0.7061 | 0.0 | 0.7089 | 0.6334 | 0.6898 | 0.6957 | 0.5837 | 0.6786 | 0.6832 | 0.7091 | 0.6404 | 0.6886 |
0.1892 | 33.3333 | 200 | 0.5088 | 0.6134 | 0.7589 | 0.7739 | nan | 0.6971 | 0.6785 | 0.7077 | 0.8255 | 0.6950 | 0.7285 | 0.8019 | 0.7823 | 0.8302 | 0.8419 | 0.0 | 0.6760 | 0.6139 | 0.6244 | 0.7471 | 0.5948 | 0.6243 | 0.7012 | 0.6364 | 0.7359 | 0.7934 |
0.283 | 36.6667 | 220 | 0.5012 | 0.6032 | 0.7387 | 0.7525 | nan | 0.6736 | 0.6548 | 0.6843 | 0.8329 | 0.6138 | 0.7489 | 0.8097 | 0.7708 | 0.8219 | 0.7763 | 0.0 | 0.6511 | 0.5898 | 0.5952 | 0.7460 | 0.5490 | 0.6433 | 0.7184 | 0.6573 | 0.7478 | 0.7373 |
0.3255 | 40.0 | 240 | 0.4538 | 0.6439 | 0.7751 | 0.7926 | nan | 0.6323 | 0.6450 | 0.7895 | 0.8253 | 0.6834 | 0.8150 | 0.8167 | 0.8587 | 0.8580 | 0.8274 | 0.0 | 0.6155 | 0.5910 | 0.6879 | 0.7771 | 0.5999 | 0.7198 | 0.7293 | 0.7568 | 0.8085 | 0.7969 |
0.148 | 43.3333 | 260 | 0.5867 | 0.5934 | 0.7219 | 0.7242 | nan | 0.5819 | 0.6130 | 0.7968 | 0.7211 | 0.6326 | 0.7201 | 0.8921 | 0.7792 | 0.7586 | 0.7235 | 0.0 | 0.5698 | 0.5527 | 0.7039 | 0.6853 | 0.5568 | 0.6735 | 0.7675 | 0.6665 | 0.6775 | 0.6738 |
0.2442 | 46.6667 | 280 | 0.5438 | 0.6123 | 0.7363 | 0.7502 | nan | 0.6327 | 0.6296 | 0.7893 | 0.7839 | 0.5792 | 0.7350 | 0.8244 | 0.8132 | 0.7980 | 0.7780 | 0.0 | 0.6221 | 0.5730 | 0.6917 | 0.7543 | 0.5004 | 0.6962 | 0.7619 | 0.6561 | 0.7618 | 0.7179 |
0.1645 | 50.0 | 300 | 0.5079 | 0.6323 | 0.7651 | 0.7711 | nan | 0.7346 | 0.6775 | 0.7749 | 0.7836 | 0.6132 | 0.7336 | 0.8661 | 0.8496 | 0.8220 | 0.7960 | 0.0 | 0.6891 | 0.6033 | 0.7091 | 0.7671 | 0.5359 | 0.6642 | 0.7340 | 0.7250 | 0.7986 | 0.7295 |
0.2699 | 53.3333 | 320 | 0.5663 | 0.6069 | 0.7401 | 0.7475 | nan | 0.7376 | 0.6604 | 0.7358 | 0.8071 | 0.6238 | 0.7225 | 0.8290 | 0.8199 | 0.7012 | 0.7635 | 0.0 | 0.7229 | 0.6041 | 0.6395 | 0.7020 | 0.5321 | 0.6502 | 0.7546 | 0.6855 | 0.6720 | 0.7131 |
0.2053 | 56.6667 | 340 | 0.5013 | 0.6341 | 0.7684 | 0.7750 | nan | 0.7147 | 0.6551 | 0.7489 | 0.8326 | 0.6458 | 0.8202 | 0.8792 | 0.8516 | 0.7662 | 0.7696 | 0.0 | 0.6918 | 0.5916 | 0.6512 | 0.7922 | 0.5612 | 0.7269 | 0.7414 | 0.7425 | 0.7519 | 0.7245 |
0.2427 | 60.0 | 360 | 0.4900 | 0.6275 | 0.7673 | 0.7721 | nan | 0.7584 | 0.7267 | 0.7405 | 0.8320 | 0.6785 | 0.7632 | 0.8677 | 0.8152 | 0.6697 | 0.8215 | 0.0 | 0.7289 | 0.6565 | 0.6647 | 0.7254 | 0.5795 | 0.6798 | 0.7799 | 0.6798 | 0.6329 | 0.7752 |
0.0668 | 63.3333 | 380 | 0.4845 | 0.6435 | 0.7722 | 0.7766 | nan | 0.7479 | 0.7064 | 0.7754 | 0.7830 | 0.6316 | 0.7340 | 0.8832 | 0.8429 | 0.7855 | 0.8320 | 0.0 | 0.7189 | 0.6336 | 0.6988 | 0.7412 | 0.5582 | 0.6887 | 0.8092 | 0.7069 | 0.7433 | 0.7797 |
0.1278 | 66.6667 | 400 | 0.5318 | 0.6220 | 0.7447 | 0.7560 | nan | 0.7063 | 0.6682 | 0.7959 | 0.7900 | 0.6057 | 0.7272 | 0.8067 | 0.8161 | 0.7418 | 0.7891 | 0.0 | 0.6939 | 0.6056 | 0.7106 | 0.7178 | 0.5353 | 0.7047 | 0.7581 | 0.6757 | 0.7074 | 0.7330 |
0.1184 | 70.0 | 420 | 0.5153 | 0.6434 | 0.7695 | 0.7778 | nan | 0.7200 | 0.6898 | 0.7627 | 0.8246 | 0.6589 | 0.7738 | 0.8395 | 0.8716 | 0.7698 | 0.7847 | 0.0 | 0.6858 | 0.6246 | 0.7008 | 0.7713 | 0.5730 | 0.6913 | 0.8064 | 0.7338 | 0.7483 | 0.7425 |
0.1317 | 73.3333 | 440 | 0.5403 | 0.6346 | 0.7586 | 0.7668 | nan | 0.7143 | 0.6677 | 0.7672 | 0.7990 | 0.5974 | 0.7354 | 0.8611 | 0.8529 | 0.8030 | 0.7876 | 0.0 | 0.6901 | 0.6051 | 0.6957 | 0.7751 | 0.5214 | 0.6903 | 0.8054 | 0.7020 | 0.7681 | 0.7279 |
0.0959 | 76.6667 | 460 | 0.5506 | 0.6325 | 0.7529 | 0.7596 | nan | 0.7081 | 0.6401 | 0.7706 | 0.7878 | 0.6339 | 0.7571 | 0.8553 | 0.8437 | 0.7636 | 0.7686 | 0.0 | 0.6859 | 0.5831 | 0.6937 | 0.7470 | 0.5459 | 0.7144 | 0.8041 | 0.7182 | 0.7359 | 0.7289 |
0.1181 | 80.0 | 480 | 0.5810 | 0.6227 | 0.7489 | 0.7528 | nan | 0.7194 | 0.6986 | 0.7478 | 0.7786 | 0.6016 | 0.7453 | 0.8501 | 0.8435 | 0.7140 | 0.7897 | 0.0 | 0.7035 | 0.6368 | 0.6793 | 0.7059 | 0.5201 | 0.6781 | 0.7990 | 0.7002 | 0.6957 | 0.7306 |
0.1272 | 83.3333 | 500 | 0.5927 | 0.6213 | 0.7406 | 0.7501 | nan | 0.7056 | 0.6515 | 0.7716 | 0.7891 | 0.6042 | 0.7289 | 0.8221 | 0.8266 | 0.7345 | 0.7725 | 0.0 | 0.6898 | 0.5965 | 0.7000 | 0.7362 | 0.5184 | 0.7017 | 0.7840 | 0.6860 | 0.7035 | 0.7186 |
0.1653 | 86.6667 | 520 | 0.5653 | 0.6368 | 0.7586 | 0.7645 | nan | 0.7195 | 0.6718 | 0.7697 | 0.7843 | 0.6201 | 0.7360 | 0.8479 | 0.8640 | 0.8044 | 0.7683 | 0.0 | 0.7011 | 0.6056 | 0.7009 | 0.7658 | 0.5286 | 0.7009 | 0.8013 | 0.7081 | 0.7787 | 0.7136 |
0.1633 | 90.0 | 540 | 0.5539 | 0.6421 | 0.7641 | 0.7693 | nan | 0.7257 | 0.6989 | 0.7533 | 0.8062 | 0.6249 | 0.7595 | 0.8532 | 0.8622 | 0.7844 | 0.7728 | 0.0 | 0.7107 | 0.6324 | 0.6835 | 0.7668 | 0.5405 | 0.6966 | 0.8200 | 0.7257 | 0.7640 | 0.7232 |
0.0863 | 93.3333 | 560 | 0.5737 | 0.6348 | 0.7544 | 0.7607 | nan | 0.7237 | 0.6606 | 0.7705 | 0.7775 | 0.6186 | 0.7401 | 0.8493 | 0.8438 | 0.7727 | 0.7868 | 0.0 | 0.7032 | 0.5988 | 0.7055 | 0.7366 | 0.5337 | 0.7141 | 0.8136 | 0.7060 | 0.7485 | 0.7230 |
0.072 | 96.6667 | 580 | 0.5544 | 0.6400 | 0.7616 | 0.7691 | nan | 0.7153 | 0.6546 | 0.7755 | 0.7964 | 0.6272 | 0.7607 | 0.8605 | 0.8571 | 0.7832 | 0.7854 | 0.0 | 0.6956 | 0.5972 | 0.7029 | 0.7550 | 0.5402 | 0.7088 | 0.8237 | 0.7215 | 0.7640 | 0.7309 |
0.1014 | 100.0 | 600 | 0.5610 | 0.6387 | 0.7597 | 0.7684 | nan | 0.7170 | 0.6456 | 0.7788 | 0.7965 | 0.6187 | 0.7553 | 0.8524 | 0.8602 | 0.7899 | 0.7831 | 0.0 | 0.6979 | 0.5897 | 0.7036 | 0.7569 | 0.5348 | 0.7058 | 0.8192 | 0.7187 | 0.7689 | 0.7295 |
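The metrics in this table are the ones produced by the `evaluate` library's `mean_iou`; a hedged sketch of the computation follows. `num_labels=11` (background plus 10 brain regions) matches the card, but `ignore_index=255` is the common SegFormer convention and an assumption about this run (the `nan` background accuracy suggests background is excluded from scoring).

```python
# Hedged sketch of the per-epoch metric computation; dummy masks stand in
# for real predicted and ground-truth (H, W) class-index maps.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

pred_mask = np.random.randint(0, 11, size=(64, 64))
gt_mask = np.random.randint(0, 11, size=(64, 64))

results = metric.compute(
    predictions=[pred_mask],
    references=[gt_mask],
    num_labels=11,
    ignore_index=255,    # assumption: background handling not documented here
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```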
### Framework versions
- Transformers 4.52.2
- Pytorch 2.6.0+cu124
- Datasets 2.16.1
- Tokenizers 0.21.1