cppe_finetuned_microsoft_detr
This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on the CPPE-5 dataset. It achieves the following results on the evaluation set:
- Loss: 1.0626
- mAP: 0.3024
- mAP@50: 0.5805
- mAP@75: 0.267
- mAP Small: 0.1176
- mAP Medium: 0.2792
- mAP Large: 0.4209
- mAR@1: 0.3265
- mAR@10: 0.4753
- mAR@100: 0.493
- mAR Small: 0.2999
- mAR Medium: 0.4328
- mAR Large: 0.5974
- mAP Coverall: 0.6047
- mAR@100 Coverall: 0.7317
- mAP Face Shield: 0.1945
- mAR@100 Face Shield: 0.4371
- mAP Gloves: 0.2079
- mAR@100 Gloves: 0.3809
- mAP Goggles: 0.1927
- mAR@100 Goggles: 0.4708
- mAP Mask: 0.3121
- mAR@100 Mask: 0.4444
Model description
The model is a Conditional DETR object detector with a ResNet-50 backbone, fine-tuned to localize five classes of medical personal protective equipment (PPE): coverall, face shield, gloves, goggles, and mask.
Intended uses & limitations
The model is intended for detecting medical PPE in images. The per-class results above show much stronger performance on coveralls (mAP 0.6047) than on face shields, gloves, and goggles (mAP roughly 0.19-0.21), so detections in the weaker categories should be treated with caution.
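Since the card does not include usage instructions, here is a minimal inference sketch. It assumes the checkpoint is available on the Hub under `optimization-hashira/cppe_finetuned_microsoft_detr` (the repo id shown in the model tree below) and that a local `example.jpg` exists.

```python
# Minimal inference sketch (assumptions: the repo id below and a local example.jpg).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "optimization-hashira/cppe_finetuned_microsoft_detr"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits/boxes into thresholded detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```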
Training and evaluation data
The model was fine-tuned and evaluated on the CPPE-5 medical PPE detection dataset, which covers the five categories listed above. The exact train/validation splits and preprocessing are not documented.
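Assuming the training data is the public CPPE-5 dataset on the Hub, it can be loaded with 🤗 Datasets as in the sketch below; the split and preprocessing actually used for this model are assumptions.

```python
# Sketch: load the public CPPE-5 dataset from the Hub (assumed to be the training data).
from datasets import load_dataset

cppe5 = load_dataset("cppe-5")
sample = cppe5["train"][0]
print(sample["objects"]["category"])  # per-box category ids (coverall, face shield, ...)
print(sample["objects"]["bbox"])      # per-box [x, y, width, height] in COCO format
```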
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (betas=(0.9, 0.999), epsilon=1e-08); no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
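As a sketch of how these values map onto 🤗 Transformers `TrainingArguments`: the `output_dir` and any settings not listed above are assumptions.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above;
# output_dir is hypothetical, all other values come from the list.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cppe_finetuned_microsoft_detr",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```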
Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP Small | mAP Medium | mAP Large | mAR@1 | mAR@10 | mAR@100 | mAR Small | mAR Medium | mAR Large | mAP Coverall | mAR@100 Coverall | mAP Face Shield | mAR@100 Face Shield | mAP Gloves | mAR@100 Gloves | mAP Goggles | mAR@100 Goggles | mAP Mask | mAR@100 Mask |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 107 | 1.2931 | 0.1555 | 0.3243 | 0.1314 | 0.0558 | 0.1632 | 0.1669 | 0.1937 | 0.3673 | 0.4018 | 0.195 | 0.365 | 0.4633 | 0.4548 | 0.6756 | 0.0363 | 0.2903 | 0.0701 | 0.3393 | 0.0442 | 0.3063 | 0.172 | 0.3974 |
No log | 2.0 | 214 | 1.2610 | 0.1908 | 0.3986 | 0.1517 | 0.0879 | 0.1743 | 0.2192 | 0.2276 | 0.4023 | 0.4378 | 0.2142 | 0.3801 | 0.5356 | 0.5089 | 0.6861 | 0.0549 | 0.3403 | 0.0915 | 0.3764 | 0.0949 | 0.35 | 0.204 | 0.436 |
No log | 3.0 | 321 | 1.2437 | 0.1857 | 0.385 | 0.1501 | 0.0685 | 0.1703 | 0.2282 | 0.2215 | 0.3818 | 0.4142 | 0.2441 | 0.3476 | 0.4908 | 0.5133 | 0.6906 | 0.0584 | 0.3145 | 0.0869 | 0.3348 | 0.0627 | 0.3104 | 0.2073 | 0.4206 |
No log | 4.0 | 428 | 1.2333 | 0.1856 | 0.3823 | 0.1584 | 0.0645 | 0.1891 | 0.2458 | 0.2334 | 0.4016 | 0.435 | 0.2289 | 0.3904 | 0.5318 | 0.48 | 0.6717 | 0.0516 | 0.3597 | 0.0926 | 0.3449 | 0.0925 | 0.4021 | 0.2111 | 0.3968 |
1.2628 | 5.0 | 535 | 1.1888 | 0.2164 | 0.4411 | 0.1896 | 0.0945 | 0.2176 | 0.2803 | 0.2553 | 0.4269 | 0.458 | 0.2953 | 0.4162 | 0.5399 | 0.5172 | 0.6944 | 0.0615 | 0.4032 | 0.1291 | 0.3904 | 0.1075 | 0.3667 | 0.2666 | 0.4354 |
1.2628 | 6.0 | 642 | 1.2081 | 0.2176 | 0.4395 | 0.1929 | 0.1024 | 0.2072 | 0.2821 | 0.2459 | 0.4112 | 0.4422 | 0.2462 | 0.3919 | 0.5244 | 0.5461 | 0.6694 | 0.0926 | 0.3855 | 0.1002 | 0.3253 | 0.1061 | 0.4125 | 0.2431 | 0.4185 |
1.2628 | 7.0 | 749 | 1.1817 | 0.2211 | 0.4487 | 0.1808 | 0.1157 | 0.22 | 0.2853 | 0.2672 | 0.4291 | 0.4571 | 0.2933 | 0.4044 | 0.5384 | 0.5562 | 0.6983 | 0.114 | 0.3887 | 0.1202 | 0.3635 | 0.0768 | 0.3875 | 0.2384 | 0.4476 |
1.2628 | 8.0 | 856 | 1.1479 | 0.2366 | 0.4707 | 0.2 | 0.095 | 0.2234 | 0.3338 | 0.2812 | 0.4407 | 0.4614 | 0.2716 | 0.4098 | 0.5436 | 0.5626 | 0.7022 | 0.0875 | 0.4048 | 0.1459 | 0.3657 | 0.1235 | 0.4104 | 0.2632 | 0.4238 |
1.2628 | 9.0 | 963 | 1.1549 | 0.2368 | 0.489 | 0.1934 | 0.1169 | 0.2275 | 0.3217 | 0.2798 | 0.4324 | 0.4575 | 0.2512 | 0.4139 | 0.5355 | 0.5537 | 0.6856 | 0.1044 | 0.4 | 0.1528 | 0.3506 | 0.1268 | 0.4187 | 0.2465 | 0.4328 |
1.1288 | 10.0 | 1070 | 1.1305 | 0.2359 | 0.4874 | 0.1974 | 0.0855 | 0.2248 | 0.3196 | 0.2768 | 0.4554 | 0.4793 | 0.2871 | 0.4268 | 0.5719 | 0.563 | 0.7083 | 0.0903 | 0.4306 | 0.1509 | 0.368 | 0.0978 | 0.4542 | 0.2777 | 0.4354 |
1.1288 | 11.0 | 1177 | 1.1459 | 0.2407 | 0.4803 | 0.2174 | 0.115 | 0.2211 | 0.3336 | 0.2786 | 0.4438 | 0.471 | 0.2459 | 0.417 | 0.5717 | 0.5698 | 0.7156 | 0.1062 | 0.4258 | 0.1374 | 0.3461 | 0.139 | 0.4417 | 0.2511 | 0.4259 |
1.1288 | 12.0 | 1284 | 1.1289 | 0.2497 | 0.5027 | 0.2098 | 0.1368 | 0.2233 | 0.3492 | 0.2913 | 0.4496 | 0.4676 | 0.2719 | 0.3944 | 0.5732 | 0.5902 | 0.7117 | 0.1187 | 0.4145 | 0.1528 | 0.3478 | 0.1139 | 0.425 | 0.2729 | 0.4392 |
1.1288 | 13.0 | 1391 | 1.1266 | 0.2564 | 0.5143 | 0.2167 | 0.1368 | 0.2323 | 0.3596 | 0.296 | 0.4573 | 0.4824 | 0.2703 | 0.4183 | 0.6026 | 0.5731 | 0.7056 | 0.1374 | 0.4435 | 0.1614 | 0.3517 | 0.1354 | 0.4854 | 0.2745 | 0.4259 |
1.1288 | 14.0 | 1498 | 1.1110 | 0.2641 | 0.5054 | 0.2241 | 0.1242 | 0.2381 | 0.3797 | 0.3017 | 0.4593 | 0.4811 | 0.2946 | 0.422 | 0.5822 | 0.5887 | 0.7206 | 0.1489 | 0.4323 | 0.1551 | 0.35 | 0.1371 | 0.4563 | 0.2906 | 0.4466 |
1.0237 | 15.0 | 1605 | 1.0852 | 0.2788 | 0.5402 | 0.2488 | 0.1479 | 0.2432 | 0.3835 | 0.2996 | 0.4663 | 0.492 | 0.3137 | 0.4293 | 0.5889 | 0.5757 | 0.7133 | 0.1546 | 0.4419 | 0.1981 | 0.3972 | 0.1701 | 0.4688 | 0.2953 | 0.4386 |
1.0237 | 16.0 | 1712 | 1.0886 | 0.2768 | 0.5406 | 0.2381 | 0.1037 | 0.2503 | 0.3823 | 0.3031 | 0.4631 | 0.4878 | 0.292 | 0.431 | 0.5886 | 0.5799 | 0.715 | 0.1653 | 0.4435 | 0.2027 | 0.3916 | 0.1374 | 0.4458 | 0.2987 | 0.4429 |
1.0237 | 17.0 | 1819 | 1.0825 | 0.2755 | 0.5308 | 0.2419 | 0.1058 | 0.2549 | 0.3796 | 0.3094 | 0.4629 | 0.4839 | 0.261 | 0.433 | 0.5792 | 0.5822 | 0.7083 | 0.1686 | 0.421 | 0.1964 | 0.3837 | 0.1297 | 0.4583 | 0.3005 | 0.4481 |
1.0237 | 18.0 | 1926 | 1.0832 | 0.2878 | 0.5609 | 0.2468 | 0.1383 | 0.2599 | 0.4 | 0.3072 | 0.4664 | 0.4858 | 0.279 | 0.4257 | 0.6015 | 0.591 | 0.7139 | 0.1712 | 0.4177 | 0.2039 | 0.3848 | 0.1659 | 0.4771 | 0.307 | 0.4354 |
0.9398 | 19.0 | 2033 | 1.0871 | 0.283 | 0.5585 | 0.2483 | 0.1222 | 0.2519 | 0.4097 | 0.3143 | 0.4597 | 0.4822 | 0.2877 | 0.4131 | 0.5994 | 0.5798 | 0.7117 | 0.1882 | 0.4226 | 0.2148 | 0.386 | 0.1401 | 0.4479 | 0.2921 | 0.4429 |
0.9398 | 20.0 | 2140 | 1.0830 | 0.2926 | 0.5668 | 0.2684 | 0.1397 | 0.2659 | 0.4047 | 0.311 | 0.466 | 0.4844 | 0.2939 | 0.4176 | 0.5936 | 0.5834 | 0.7172 | 0.1957 | 0.4306 | 0.2104 | 0.3798 | 0.1691 | 0.45 | 0.3042 | 0.4444 |
0.9398 | 21.0 | 2247 | 1.0669 | 0.2973 | 0.5757 | 0.2692 | 0.134 | 0.2775 | 0.4069 | 0.3216 | 0.4725 | 0.494 | 0.2879 | 0.4342 | 0.5977 | 0.5851 | 0.7106 | 0.1966 | 0.4355 | 0.2185 | 0.3865 | 0.1791 | 0.4833 | 0.3072 | 0.454 |
0.9398 | 22.0 | 2354 | 1.0805 | 0.2894 | 0.5703 | 0.2566 | 0.1277 | 0.2651 | 0.421 | 0.3232 | 0.4695 | 0.4863 | 0.3147 | 0.4199 | 0.5961 | 0.5887 | 0.7194 | 0.1879 | 0.4403 | 0.2018 | 0.3646 | 0.1696 | 0.4667 | 0.2989 | 0.4402 |
0.9398 | 23.0 | 2461 | 1.0686 | 0.3071 | 0.5837 | 0.2746 | 0.1368 | 0.2735 | 0.4291 | 0.3296 | 0.4745 | 0.4905 | 0.303 | 0.4352 | 0.5913 | 0.6071 | 0.7344 | 0.1998 | 0.4468 | 0.2135 | 0.3736 | 0.2062 | 0.4563 | 0.3088 | 0.4413 |
0.8602 | 24.0 | 2568 | 1.0703 | 0.2993 | 0.5735 | 0.2705 | 0.1225 | 0.2744 | 0.4134 | 0.3201 | 0.4754 | 0.4916 | 0.3 | 0.4343 | 0.5966 | 0.6049 | 0.7378 | 0.1961 | 0.4387 | 0.2067 | 0.3787 | 0.1804 | 0.4667 | 0.3084 | 0.436 |
0.8602 | 25.0 | 2675 | 1.0672 | 0.3009 | 0.5742 | 0.2655 | 0.1273 | 0.2775 | 0.419 | 0.3263 | 0.4804 | 0.4989 | 0.3047 | 0.4422 | 0.5972 | 0.6047 | 0.7333 | 0.1898 | 0.4435 | 0.2015 | 0.3826 | 0.1979 | 0.4854 | 0.3104 | 0.4497 |
0.8602 | 26.0 | 2782 | 1.0661 | 0.3021 | 0.5755 | 0.268 | 0.1187 | 0.2731 | 0.4256 | 0.3241 | 0.4786 | 0.4935 | 0.3007 | 0.4297 | 0.6028 | 0.6021 | 0.725 | 0.1903 | 0.4387 | 0.2072 | 0.377 | 0.1982 | 0.475 | 0.3126 | 0.4519 |
0.8602 | 27.0 | 2889 | 1.0628 | 0.3009 | 0.5737 | 0.2665 | 0.1136 | 0.2778 | 0.42 | 0.3231 | 0.4755 | 0.493 | 0.2926 | 0.4344 | 0.5994 | 0.603 | 0.7289 | 0.1875 | 0.4403 | 0.2061 | 0.3826 | 0.1989 | 0.4708 | 0.3091 | 0.4423 |
0.8602 | 28.0 | 2996 | 1.0646 | 0.3031 | 0.579 | 0.2694 | 0.1169 | 0.2819 | 0.4205 | 0.3269 | 0.4762 | 0.4935 | 0.2948 | 0.4355 | 0.5975 | 0.6047 | 0.7317 | 0.1945 | 0.4435 | 0.21 | 0.3798 | 0.1918 | 0.4667 | 0.3144 | 0.446 |
0.8206 | 29.0 | 3103 | 1.0626 | 0.3031 | 0.5804 | 0.2679 | 0.1177 | 0.2792 | 0.4217 | 0.3266 | 0.4754 | 0.4929 | 0.2936 | 0.433 | 0.5982 | 0.605 | 0.7311 | 0.1945 | 0.4355 | 0.2076 | 0.382 | 0.1956 | 0.4708 | 0.3126 | 0.445 |
0.8206 | 30.0 | 3210 | 1.0626 | 0.3024 | 0.5805 | 0.267 | 0.1176 | 0.2792 | 0.4209 | 0.3265 | 0.4753 | 0.493 | 0.2999 | 0.4328 | 0.5974 | 0.6047 | 0.7317 | 0.1945 | 0.4371 | 0.2079 | 0.3809 | 0.1927 | 0.4708 | 0.3121 | 0.4444 |
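The mAP/mAR columns follow COCO evaluation conventions (mAP averaged over IoU thresholds 0.50:0.95, mAR at 1/10/100 detections, plus small/medium/large breakdowns). Below is a minimal sketch of computing such metrics with torchmetrics; this is an illustration only, as the card does not state which evaluation tool produced the table.

```python
# Sketch: COCO-style mAP/mAR with torchmetrics (toy boxes, not real predictions).
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 60.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),  # hypothetical class id, e.g. coverall
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 62.0]]),
    "labels": torch.tensor([0]),
}]
metric.update(preds, targets)
print(metric.compute())  # keys include map, map_50, map_75, mar_1, mar_10, mar_100, ...
```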
Framework versions
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
Model tree for optimization-hashira/cppe_finetuned_microsoft_detr
- Base model: microsoft/conditional-detr-resnet-50