# detr_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the CPPE-5 dataset. It achieves the following results on the evaluation set:
- Loss: 1.1505
- mAP: 0.2397
- mAP@50: 0.4842
- mAP@75: 0.2145
- mAP (small): 0.0771
- mAP (medium): 0.1892
- mAP (large): 0.3666
- mAR@1: 0.2729
- mAR@10: 0.4204
- mAR@100: 0.4418
- mAR (small): 0.1732
- mAR (medium): 0.3953
- mAR (large): 0.6037
- mAP (coverall): 0.5417
- mAR@100 (coverall): 0.6581
- mAP (face shield): 0.1556
- mAR@100 (face shield): 0.4253
- mAP (gloves): 0.1615
- mAR@100 (gloves): 0.3464
- mAP (goggles): 0.0883
- mAR@100 (goggles): 0.3831
- mAP (mask): 0.2513
- mAR@100 (mask): 0.396
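The model can be used for PPE detection out of the box. A minimal sketch, assuming the checkpoint is available on the Hub as `zhengyu998/detr_finetuned_cppe5` and that `transformers` and `torch` are installed (the blank placeholder image should be replaced with a real photo):

```python
from PIL import Image
from transformers import pipeline

# Load the fine-tuned checkpoint as an object-detection pipeline.
detector = pipeline("object-detection", model="zhengyu998/detr_finetuned_cppe5")

# Placeholder input; substitute a real image of personal protective equipment.
image = Image.new("RGB", (640, 480))

# Each detection is a dict with "score", "label", and "box" keys.
results = detector(image, threshold=0.5)
for det in results:
    print(det["label"], round(det["score"], 3), det["box"])
```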
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
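These settings map directly onto `transformers.TrainingArguments`. A hedged sketch of the corresponding keyword arguments (the `output_dir` value is illustrative, not taken from the original run):

```python
# Keyword arguments mirroring the hyperparameters listed above;
# they can be unpacked as TrainingArguments(**training_kwargs).
training_kwargs = {
    "output_dir": "detr_finetuned_cppe5",   # illustrative path
    "learning_rate": 5e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "optim": "adamw_torch",                 # AdamW, betas=(0.9, 0.999), eps=1e-8
    "lr_scheduler_type": "cosine",
    "num_train_epochs": 30,
}
```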
### Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP Coverall | mAR@100 Coverall | mAP Face Shield | mAR@100 Face Shield | mAP Gloves | mAR@100 Gloves | mAP Goggles | mAR@100 Goggles | mAP Mask | mAR@100 Mask |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 107 | 1.9254 | 0.0089 | 0.0292 | 0.0041 | 0.0076 | 0.0056 | 0.0152 | 0.0283 | 0.1434 | 0.1834 | 0.1216 | 0.1419 | 0.2288 | 0.021 | 0.3333 | 0.0041 | 0.138 | 0.0023 | 0.1237 | 0.0008 | 0.0646 | 0.0163 | 0.2573 |
No log | 2.0 | 214 | 1.7315 | 0.0316 | 0.0824 | 0.0217 | 0.0064 | 0.0117 | 0.0431 | 0.0674 | 0.1563 | 0.2029 | 0.0736 | 0.137 | 0.2704 | 0.1228 | 0.4757 | 0.0142 | 0.0797 | 0.0036 | 0.1656 | 0.0012 | 0.0308 | 0.0164 | 0.2627 |
No log | 3.0 | 321 | 1.6433 | 0.0295 | 0.0741 | 0.0226 | 0.0058 | 0.0264 | 0.0369 | 0.0783 | 0.1926 | 0.2497 | 0.0829 | 0.1988 | 0.3205 | 0.0975 | 0.5032 | 0.0078 | 0.1772 | 0.0044 | 0.192 | 0.0019 | 0.0923 | 0.0358 | 0.284 |
No log | 4.0 | 428 | 1.5178 | 0.0511 | 0.1239 | 0.0353 | 0.0198 | 0.0536 | 0.067 | 0.1029 | 0.223 | 0.2837 | 0.1222 | 0.2297 | 0.3573 | 0.1569 | 0.6212 | 0.0298 | 0.1848 | 0.0053 | 0.217 | 0.0067 | 0.0508 | 0.0567 | 0.3449 |
2.3097 | 5.0 | 535 | 1.4526 | 0.0713 | 0.1539 | 0.0603 | 0.0169 | 0.0549 | 0.089 | 0.1343 | 0.277 | 0.3258 | 0.1162 | 0.268 | 0.4345 | 0.2458 | 0.6248 | 0.0208 | 0.2405 | 0.0144 | 0.2906 | 0.0139 | 0.1292 | 0.0617 | 0.344 |
2.3097 | 6.0 | 642 | 1.5010 | 0.0801 | 0.1644 | 0.0688 | 0.0056 | 0.0541 | 0.0964 | 0.1051 | 0.249 | 0.295 | 0.1058 | 0.2399 | 0.3782 | 0.3258 | 0.6401 | 0.0106 | 0.1873 | 0.0087 | 0.2219 | 0.0063 | 0.1169 | 0.0494 | 0.3089 |
2.3097 | 7.0 | 749 | 1.4414 | 0.1159 | 0.248 | 0.1043 | 0.0213 | 0.0832 | 0.1497 | 0.1409 | 0.3233 | 0.3544 | 0.1386 | 0.285 | 0.5048 | 0.401 | 0.6419 | 0.0515 | 0.3228 | 0.024 | 0.2634 | 0.0076 | 0.1954 | 0.0954 | 0.3484 |
2.3097 | 8.0 | 856 | 1.3548 | 0.1377 | 0.2836 | 0.1153 | 0.0262 | 0.1127 | 0.1769 | 0.1715 | 0.3524 | 0.3806 | 0.1773 | 0.3246 | 0.5433 | 0.4279 | 0.6063 | 0.0503 | 0.3291 | 0.0598 | 0.3241 | 0.0244 | 0.2769 | 0.126 | 0.3667 |
2.3097 | 9.0 | 963 | 1.3714 | 0.1387 | 0.3026 | 0.1118 | 0.0471 | 0.1076 | 0.1801 | 0.1768 | 0.3338 | 0.3622 | 0.1575 | 0.3076 | 0.4957 | 0.4347 | 0.6207 | 0.0763 | 0.3405 | 0.0557 | 0.2812 | 0.0108 | 0.2338 | 0.1161 | 0.3347 |
1.266 | 10.0 | 1070 | 1.3475 | 0.147 | 0.3108 | 0.1229 | 0.054 | 0.1173 | 0.2096 | 0.1726 | 0.3343 | 0.3685 | 0.1741 | 0.3064 | 0.5156 | 0.4417 | 0.6054 | 0.0569 | 0.319 | 0.0784 | 0.3071 | 0.0236 | 0.2646 | 0.1343 | 0.3462 |
1.266 | 11.0 | 1177 | 1.3020 | 0.1686 | 0.3368 | 0.1441 | 0.0417 | 0.1373 | 0.2309 | 0.196 | 0.3778 | 0.4038 | 0.1531 | 0.3567 | 0.5621 | 0.4751 | 0.6387 | 0.0649 | 0.3861 | 0.0861 | 0.3192 | 0.0425 | 0.2938 | 0.1745 | 0.3813 |
1.266 | 12.0 | 1284 | 1.2834 | 0.1783 | 0.3679 | 0.1553 | 0.0602 | 0.1293 | 0.2556 | 0.2056 | 0.3728 | 0.4095 | 0.1542 | 0.3621 | 0.5692 | 0.4902 | 0.6356 | 0.0793 | 0.4152 | 0.1067 | 0.3027 | 0.0312 | 0.3231 | 0.1842 | 0.3711 |
1.266 | 13.0 | 1391 | 1.2809 | 0.1884 | 0.3905 | 0.1642 | 0.0763 | 0.1413 | 0.2712 | 0.2209 | 0.3812 | 0.4131 | 0.149 | 0.3702 | 0.5711 | 0.5076 | 0.6514 | 0.1031 | 0.4241 | 0.1121 | 0.3152 | 0.0267 | 0.3138 | 0.1927 | 0.3609 |
1.266 | 14.0 | 1498 | 1.2472 | 0.2063 | 0.4264 | 0.1738 | 0.0719 | 0.165 | 0.2975 | 0.2314 | 0.392 | 0.4239 | 0.1678 | 0.3819 | 0.5731 | 0.5065 | 0.6468 | 0.1438 | 0.443 | 0.1169 | 0.3321 | 0.0456 | 0.3185 | 0.2188 | 0.3791 |
1.1184 | 15.0 | 1605 | 1.2362 | 0.1995 | 0.4184 | 0.1744 | 0.0717 | 0.1504 | 0.3102 | 0.2327 | 0.3969 | 0.4225 | 0.1598 | 0.3799 | 0.5834 | 0.5193 | 0.6414 | 0.1235 | 0.4139 | 0.1171 | 0.3237 | 0.0365 | 0.3631 | 0.201 | 0.3702 |
1.1184 | 16.0 | 1712 | 1.2272 | 0.2058 | 0.4247 | 0.1817 | 0.0802 | 0.1523 | 0.3163 | 0.2416 | 0.4039 | 0.4325 | 0.1692 | 0.381 | 0.6015 | 0.5089 | 0.6514 | 0.1292 | 0.4456 | 0.1208 | 0.3366 | 0.0421 | 0.3431 | 0.2278 | 0.3858 |
1.1184 | 17.0 | 1819 | 1.2129 | 0.2126 | 0.4398 | 0.1768 | 0.0687 | 0.1595 | 0.3418 | 0.2568 | 0.4052 | 0.4281 | 0.1488 | 0.3755 | 0.5999 | 0.5196 | 0.6505 | 0.1524 | 0.4405 | 0.1173 | 0.3179 | 0.0507 | 0.3492 | 0.2228 | 0.3822 |
1.1184 | 18.0 | 1926 | 1.1863 | 0.2217 | 0.4585 | 0.1855 | 0.0758 | 0.1655 | 0.3558 | 0.2671 | 0.418 | 0.4418 | 0.1641 | 0.3919 | 0.6083 | 0.5137 | 0.6491 | 0.1634 | 0.4608 | 0.1441 | 0.3326 | 0.0524 | 0.3631 | 0.2348 | 0.4036 |
0.9987 | 19.0 | 2033 | 1.1810 | 0.2248 | 0.4596 | 0.1896 | 0.085 | 0.1722 | 0.3442 | 0.2613 | 0.4204 | 0.441 | 0.1612 | 0.3939 | 0.6136 | 0.5193 | 0.6541 | 0.1567 | 0.4405 | 0.1443 | 0.3335 | 0.0503 | 0.3708 | 0.2533 | 0.4062 |
0.9987 | 20.0 | 2140 | 1.1736 | 0.2239 | 0.4592 | 0.1928 | 0.0785 | 0.1673 | 0.351 | 0.265 | 0.4142 | 0.4379 | 0.1884 | 0.3805 | 0.6085 | 0.5237 | 0.65 | 0.1457 | 0.4342 | 0.1585 | 0.3415 | 0.0583 | 0.3785 | 0.2332 | 0.3853 |
0.9987 | 21.0 | 2247 | 1.1634 | 0.2311 | 0.4658 | 0.2071 | 0.0757 | 0.1792 | 0.3625 | 0.2713 | 0.4179 | 0.4377 | 0.1625 | 0.3869 | 0.6066 | 0.5357 | 0.6572 | 0.1398 | 0.4241 | 0.1576 | 0.342 | 0.0803 | 0.3631 | 0.2423 | 0.4022 |
0.9987 | 22.0 | 2354 | 1.1715 | 0.2264 | 0.4584 | 0.2126 | 0.0775 | 0.179 | 0.3555 | 0.2674 | 0.4136 | 0.4337 | 0.1694 | 0.3896 | 0.5918 | 0.5298 | 0.65 | 0.1425 | 0.4165 | 0.1609 | 0.3442 | 0.0645 | 0.3662 | 0.2341 | 0.3916 |
0.9987 | 23.0 | 2461 | 1.1680 | 0.2304 | 0.4713 | 0.2057 | 0.0824 | 0.1768 | 0.3583 | 0.2659 | 0.4208 | 0.4387 | 0.1748 | 0.3881 | 0.6052 | 0.538 | 0.6545 | 0.1487 | 0.4329 | 0.1599 | 0.3353 | 0.0588 | 0.3831 | 0.2468 | 0.3876 |
0.9095 | 24.0 | 2568 | 1.1550 | 0.2405 | 0.4887 | 0.2174 | 0.0799 | 0.1828 | 0.3681 | 0.2698 | 0.4198 | 0.4399 | 0.1723 | 0.39 | 0.602 | 0.5444 | 0.659 | 0.1684 | 0.4241 | 0.1645 | 0.3411 | 0.0824 | 0.3877 | 0.2428 | 0.3876 |
0.9095 | 25.0 | 2675 | 1.1538 | 0.2397 | 0.488 | 0.2138 | 0.0792 | 0.1892 | 0.3626 | 0.273 | 0.423 | 0.4439 | 0.1731 | 0.3978 | 0.6037 | 0.5364 | 0.6581 | 0.1658 | 0.4392 | 0.1641 | 0.3433 | 0.0838 | 0.3846 | 0.2486 | 0.3942 |
0.9095 | 26.0 | 2782 | 1.1572 | 0.2427 | 0.4879 | 0.2162 | 0.0775 | 0.1905 | 0.3668 | 0.2728 | 0.4217 | 0.4414 | 0.1622 | 0.3959 | 0.6015 | 0.5404 | 0.6577 | 0.1703 | 0.4329 | 0.1614 | 0.342 | 0.091 | 0.38 | 0.2502 | 0.3942 |
0.9095 | 27.0 | 2889 | 1.1502 | 0.239 | 0.4833 | 0.2093 | 0.075 | 0.1866 | 0.3686 | 0.2698 | 0.4184 | 0.4403 | 0.1693 | 0.3919 | 0.6056 | 0.5419 | 0.6581 | 0.1542 | 0.4165 | 0.1592 | 0.3455 | 0.0893 | 0.3831 | 0.2506 | 0.3982 |
0.9095 | 28.0 | 2996 | 1.1521 | 0.2399 | 0.4842 | 0.2118 | 0.0775 | 0.1882 | 0.3694 | 0.2705 | 0.4206 | 0.4436 | 0.1705 | 0.3977 | 0.6073 | 0.5412 | 0.6577 | 0.1545 | 0.4241 | 0.1615 | 0.3464 | 0.0909 | 0.3923 | 0.2516 | 0.3973 |
0.858 | 29.0 | 3103 | 1.1511 | 0.2399 | 0.4839 | 0.2146 | 0.0767 | 0.1895 | 0.3676 | 0.2732 | 0.4208 | 0.4422 | 0.1726 | 0.3959 | 0.6043 | 0.5422 | 0.6586 | 0.1557 | 0.4253 | 0.1618 | 0.3469 | 0.0886 | 0.3846 | 0.2512 | 0.3956 |
0.858 | 30.0 | 3210 | 1.1505 | 0.2397 | 0.4842 | 0.2145 | 0.0771 | 0.1892 | 0.3666 | 0.2729 | 0.4204 | 0.4418 | 0.1732 | 0.3953 | 0.6037 | 0.5417 | 0.6581 | 0.1556 | 0.4253 | 0.1615 | 0.3464 | 0.0883 | 0.3831 | 0.2513 | 0.396 |
### Framework versions
- Transformers 4.51.3
- PyTorch 2.4.1+cu118
- Datasets 3.5.0
- Tokenizers 0.21.1