# detr_finetuned_cppe5_Carla-COCO
This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on the carla-coco-object-detection-dataset. It achieves the following results on the evaluation set:
- Loss: 0.9886
- Map: 0.2337
- Map 50: 0.3446
- Map 75: 0.233
- Map Small: 0.1595
- Map Medium: 0.5443
- Map Large: 0.9195
- Mar 1: 0.2413
- Mar 10: 0.4546
- Mar 100: 0.4658
- Mar Small: 0.3762
- Mar Medium: 0.7772
- Mar Large: 0.9417
- Map Coverall: 0.62
- Mar 100 Coverall: 0.6933
- Map Face Shield: 0.2334
- Mar 100 Face Shield: 0.4938
- Map Gloves: 0.0406
- Mar 100 Gloves: 0.4364
- Map Goggles: 0.2714
- Mar 100 Goggles: 0.3902
- Map Mask: 0.0032
- Mar 100 Mask: 0.3154
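The mAP/mAR figures above follow the COCO convention: a prediction counts as a true positive when its IoU (intersection over union) with a ground-truth box of the same class exceeds a threshold — 0.50 for `Map 50`, 0.75 for `Map 75`, and an average over thresholds 0.50:0.95 for `Map`. As a minimal, illustrative sketch of that matching test (the numbers above come from the standard COCO evaluator, not this code):

```python
def iou(box_a, box_b):
    """IoU of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive(pred_box, gt_box, threshold):
    """A detection is a hit at a given IoU threshold (0.50, 0.75, ...)."""
    return iou(pred_box, gt_box) >= threshold

# A prediction covering exactly half of a ground-truth box:
print(iou((0, 0, 10, 10), (0, 0, 10, 5)))  # 0.5 -> counts for Map 50, not Map 75
```

This explains why `Map 50` (0.3446) is higher than `Map 75` (0.233): the looser threshold accepts coarser localizations.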
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- num_epochs: 30
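With `lr_scheduler_type: cosine`, the learning rate decays from 5e-05 toward zero over training along a half-cosine curve. A rough sketch of that shape (assuming no warmup, which the card does not specify; total steps are taken from the results table, 30 epochs × 83 steps):

```python
import math

BASE_LR = 5e-5
TOTAL_STEPS = 2490  # 30 epochs x 83 steps/epoch, per the training-results table

def cosine_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Half-cosine decay from base_lr to 0 (no warmup phase)."""
    progress = step / total_steps
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))            # starts at 5e-05
print(cosine_lr(TOTAL_STEPS))  # decays to ~0 by the final step
```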
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 83 | 2.0575 | 0.0079 | 0.024 | 0.0032 | 0.0041 | 0.0388 | 0.4185 | 0.0149 | 0.0882 | 0.1529 | 0.1184 | 0.3278 | 0.7542 | 0.0358 | 0.4455 | 0.0004 | 0.0812 | 0.0009 | 0.1364 | 0.0022 | 0.1012 | 0.0 | 0.0 |
No log | 2.0 | 166 | 1.8170 | 0.0125 | 0.0405 | 0.0024 | 0.0059 | 0.0564 | 0.3833 | 0.0208 | 0.0802 | 0.1121 | 0.0795 | 0.2078 | 0.8083 | 0.0607 | 0.373 | 0.0 | 0.0 | 0.0001 | 0.0364 | 0.0006 | 0.0665 | 0.001 | 0.0846 |
No log | 3.0 | 249 | 1.7051 | 0.0333 | 0.0808 | 0.0214 | 0.017 | 0.0716 | 0.55 | 0.0522 | 0.1304 | 0.1636 | 0.1294 | 0.359 | 0.7833 | 0.161 | 0.5 | 0.001 | 0.1125 | 0.0008 | 0.0727 | 0.0037 | 0.1329 | 0.0 | 0.0 |
No log | 4.0 | 332 | 1.5606 | 0.0621 | 0.1076 | 0.0725 | 0.0384 | 0.1191 | 0.6246 | 0.0636 | 0.1624 | 0.1908 | 0.1367 | 0.4395 | 0.875 | 0.3068 | 0.5573 | 0.0004 | 0.0562 | 0.0012 | 0.2 | 0.0023 | 0.125 | 0.0 | 0.0154 |
No log | 5.0 | 415 | 1.4048 | 0.0597 | 0.1133 | 0.057 | 0.0354 | 0.2192 | 0.5628 | 0.0778 | 0.192 | 0.2163 | 0.1898 | 0.4136 | 0.875 | 0.2873 | 0.6028 | 0.0009 | 0.1063 | 0.0031 | 0.1545 | 0.0072 | 0.1793 | 0.0001 | 0.0385 |
No log | 6.0 | 498 | 1.3473 | 0.0719 | 0.1314 | 0.0698 | 0.0455 | 0.2928 | 0.6042 | 0.0906 | 0.2615 | 0.2828 | 0.2319 | 0.5625 | 0.8958 | 0.3406 | 0.6197 | 0.0041 | 0.2625 | 0.0039 | 0.2727 | 0.0109 | 0.1976 | 0.0002 | 0.0615 |
2.6459 | 7.0 | 581 | 1.3085 | 0.0859 | 0.1431 | 0.0941 | 0.0461 | 0.2514 | 0.6921 | 0.1076 | 0.274 | 0.3119 | 0.2417 | 0.6042 | 0.9125 | 0.4091 | 0.6197 | 0.0058 | 0.3187 | 0.0053 | 0.4091 | 0.0092 | 0.1811 | 0.0001 | 0.0308 |
2.6459 | 8.0 | 664 | 1.3991 | 0.096 | 0.1896 | 0.0834 | 0.0514 | 0.3385 | 0.6803 | 0.1623 | 0.3106 | 0.3233 | 0.2407 | 0.6275 | 0.9042 | 0.3711 | 0.5567 | 0.0769 | 0.3625 | 0.0103 | 0.3909 | 0.0214 | 0.1677 | 0.0006 | 0.1385 |
2.6459 | 9.0 | 747 | 1.2441 | 0.1132 | 0.2009 | 0.108 | 0.0669 | 0.4426 | 0.8451 | 0.1713 | 0.3419 | 0.3688 | 0.2852 | 0.6747 | 0.9208 | 0.4431 | 0.6 | 0.0762 | 0.4437 | 0.0078 | 0.4 | 0.038 | 0.2232 | 0.0011 | 0.1769 |
2.6459 | 10.0 | 830 | 1.2353 | 0.1355 | 0.2241 | 0.1414 | 0.081 | 0.3865 | 0.8697 | 0.2051 | 0.3612 | 0.3879 | 0.291 | 0.7353 | 0.9208 | 0.5136 | 0.6275 | 0.0695 | 0.475 | 0.0114 | 0.4727 | 0.0822 | 0.2335 | 0.0008 | 0.1308 |
2.6459 | 11.0 | 913 | 1.2549 | 0.1024 | 0.2165 | 0.0832 | 0.0606 | 0.3461 | 0.8273 | 0.1455 | 0.3096 | 0.326 | 0.2378 | 0.6262 | 0.875 | 0.4035 | 0.5152 | 0.0322 | 0.4062 | 0.0087 | 0.3364 | 0.0659 | 0.203 | 0.0015 | 0.1692 |
2.6459 | 12.0 | 996 | 1.1793 | 0.1868 | 0.2971 | 0.199 | 0.1109 | 0.4837 | 0.8912 | 0.2007 | 0.3837 | 0.3907 | 0.2894 | 0.755 | 0.925 | 0.5445 | 0.6427 | 0.2159 | 0.4187 | 0.0164 | 0.5 | 0.1561 | 0.2537 | 0.0009 | 0.1385 |
1.197 | 13.0 | 1079 | 1.1864 | 0.1813 | 0.2901 | 0.1777 | 0.1047 | 0.4798 | 0.895 | 0.2218 | 0.3909 | 0.4018 | 0.3163 | 0.7097 | 0.9208 | 0.5205 | 0.6129 | 0.2186 | 0.4938 | 0.0227 | 0.4545 | 0.1435 | 0.2555 | 0.0013 | 0.1923 |
1.197 | 14.0 | 1162 | 1.2058 | 0.1626 | 0.2839 | 0.1641 | 0.1064 | 0.456 | 0.877 | 0.219 | 0.3737 | 0.3857 | 0.3036 | 0.6867 | 0.9125 | 0.5327 | 0.6247 | 0.1378 | 0.4125 | 0.0343 | 0.4455 | 0.1068 | 0.2152 | 0.0015 | 0.2308 |
1.197 | 15.0 | 1245 | 1.1169 | 0.1872 | 0.299 | 0.1926 | 0.1114 | 0.4861 | 0.8964 | 0.2142 | 0.3935 | 0.4056 | 0.3183 | 0.7157 | 0.9292 | 0.5492 | 0.6416 | 0.1994 | 0.4 | 0.023 | 0.5273 | 0.1631 | 0.2512 | 0.0015 | 0.2077 |
1.197 | 16.0 | 1328 | 1.1178 | 0.1917 | 0.3024 | 0.1915 | 0.1094 | 0.5089 | 0.8875 | 0.2112 | 0.3881 | 0.3943 | 0.3039 | 0.7433 | 0.9208 | 0.5703 | 0.6449 | 0.2019 | 0.425 | 0.0171 | 0.4909 | 0.1677 | 0.2415 | 0.0014 | 0.1692 |
1.197 | 17.0 | 1411 | 1.0844 | 0.1877 | 0.3046 | 0.1873 | 0.1144 | 0.5125 | 0.9234 | 0.2118 | 0.4051 | 0.4155 | 0.3145 | 0.7749 | 0.9417 | 0.569 | 0.6551 | 0.1696 | 0.4187 | 0.0266 | 0.4909 | 0.1722 | 0.2896 | 0.0013 | 0.2231 |
1.197 | 18.0 | 1494 | 1.0568 | 0.1993 | 0.3077 | 0.2146 | 0.1316 | 0.5029 | 0.9176 | 0.2257 | 0.405 | 0.4184 | 0.3395 | 0.7202 | 0.9458 | 0.5854 | 0.6702 | 0.18 | 0.4313 | 0.0283 | 0.4909 | 0.2013 | 0.2994 | 0.0017 | 0.2 |
0.9969 | 19.0 | 1577 | 1.0722 | 0.1971 | 0.3183 | 0.201 | 0.1323 | 0.4862 | 0.9307 | 0.2157 | 0.3928 | 0.406 | 0.3218 | 0.7345 | 0.95 | 0.5986 | 0.673 | 0.1767 | 0.475 | 0.0288 | 0.4364 | 0.1799 | 0.2841 | 0.0015 | 0.1615 |
0.9969 | 20.0 | 1660 | 1.0702 | 0.2098 | 0.3401 | 0.2093 | 0.135 | 0.5195 | 0.9179 | 0.2289 | 0.4078 | 0.4228 | 0.3344 | 0.7622 | 0.9417 | 0.5846 | 0.6629 | 0.2246 | 0.5 | 0.0318 | 0.4364 | 0.2059 | 0.2915 | 0.002 | 0.2231 |
0.9969 | 21.0 | 1743 | 1.0587 | 0.2084 | 0.3354 | 0.2009 | 0.1408 | 0.5043 | 0.9059 | 0.2252 | 0.4244 | 0.4351 | 0.3412 | 0.7655 | 0.9292 | 0.5983 | 0.677 | 0.1978 | 0.4875 | 0.0402 | 0.4455 | 0.2035 | 0.2963 | 0.0025 | 0.2692 |
0.9969 | 22.0 | 1826 | 1.0388 | 0.2072 | 0.324 | 0.2009 | 0.1457 | 0.5005 | 0.9025 | 0.2307 | 0.4229 | 0.4346 | 0.3444 | 0.7712 | 0.9292 | 0.5972 | 0.673 | 0.1531 | 0.4563 | 0.044 | 0.4636 | 0.239 | 0.3262 | 0.0027 | 0.2538 |
0.9969 | 23.0 | 1909 | 1.0332 | 0.209 | 0.3291 | 0.2113 | 0.1468 | 0.4986 | 0.9023 | 0.2379 | 0.4323 | 0.4391 | 0.3474 | 0.7586 | 0.9292 | 0.5936 | 0.6697 | 0.1648 | 0.4688 | 0.0452 | 0.4364 | 0.2382 | 0.336 | 0.0035 | 0.2846 |
0.9969 | 24.0 | 1992 | 1.0036 | 0.229 | 0.346 | 0.2347 | 0.1555 | 0.5326 | 0.9242 | 0.2418 | 0.4406 | 0.46 | 0.3714 | 0.7665 | 0.9458 | 0.6228 | 0.6983 | 0.2266 | 0.475 | 0.0378 | 0.4455 | 0.2546 | 0.3579 | 0.0033 | 0.3231 |
0.8821 | 25.0 | 2075 | 0.9946 | 0.2308 | 0.347 | 0.2345 | 0.16 | 0.5486 | 0.9167 | 0.2406 | 0.4457 | 0.4572 | 0.37 | 0.7728 | 0.9417 | 0.6219 | 0.6966 | 0.2397 | 0.4875 | 0.0363 | 0.4545 | 0.2532 | 0.3628 | 0.0032 | 0.2846 |
0.8821 | 26.0 | 2158 | 1.0003 | 0.2298 | 0.3428 | 0.2379 | 0.1521 | 0.5418 | 0.925 | 0.2481 | 0.4578 | 0.468 | 0.3785 | 0.7839 | 0.9458 | 0.6061 | 0.6843 | 0.2329 | 0.5063 | 0.0418 | 0.4727 | 0.2652 | 0.3768 | 0.0032 | 0.3 |
0.8821 | 27.0 | 2241 | 0.9896 | 0.2331 | 0.3446 | 0.234 | 0.1524 | 0.5515 | 0.9193 | 0.241 | 0.4543 | 0.4666 | 0.3777 | 0.7772 | 0.9417 | 0.6195 | 0.6938 | 0.2405 | 0.4938 | 0.0354 | 0.4455 | 0.2668 | 0.3848 | 0.0034 | 0.3154 |
0.8821 | 28.0 | 2324 | 0.9876 | 0.2329 | 0.3425 | 0.236 | 0.1597 | 0.5443 | 0.9217 | 0.2404 | 0.4529 | 0.4646 | 0.3746 | 0.776 | 0.9458 | 0.6215 | 0.6949 | 0.2299 | 0.4875 | 0.0407 | 0.4455 | 0.2693 | 0.3872 | 0.0033 | 0.3077 |
0.8821 | 29.0 | 2407 | 0.9894 | 0.2346 | 0.3462 | 0.2338 | 0.1591 | 0.5479 | 0.9194 | 0.24 | 0.4556 | 0.4662 | 0.3767 | 0.7772 | 0.9417 | 0.6212 | 0.6955 | 0.2374 | 0.4875 | 0.041 | 0.4455 | 0.27 | 0.3872 | 0.0033 | 0.3154 |
0.8821 | 30.0 | 2490 | 0.9886 | 0.2337 | 0.3446 | 0.233 | 0.1595 | 0.5443 | 0.9195 | 0.2413 | 0.4546 | 0.4658 | 0.3762 | 0.7772 | 0.9417 | 0.62 | 0.6933 | 0.2334 | 0.4938 | 0.0406 | 0.4364 | 0.2714 | 0.3902 | 0.0032 | 0.3154 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.4.1+cu118
- Datasets 3.5.1
- Tokenizers 0.21.1