# detr_finetuned_cppe5
This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8041
- mAP: 0.4041
- mAP@0.50: 0.8246
- mAP@0.75: 0.3402
- mAP (small): 0.3079
- mAP (medium): 0.3562
- mAP (large): 0.6364
- mAR@1: 0.1839
- mAR@10: 0.4794
- mAR@100: 0.5657
- mAR (small): 0.4329
- mAR (medium): 0.5174
- mAR (large): 0.7856
- mAP (Hardhat): 0.4075
- mAR@100 (Hardhat): 0.5473
- mAP (No-hardhat): 0.4007
- mAR@100 (No-hardhat): 0.5842
## Model description

More information needed
Intended uses & limitations
More information needed
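Since the card provides no usage section, here is a minimal inference sketch. It assumes the checkpoint loads through the standard Transformers object-detection auto classes and that `hxwk507/detr_finetuned_cppe5` is the published repo id; `image.jpg` is a placeholder path.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "hxwk507/detr_finetuned_cppe5"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("image.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```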
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
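The values above map one-to-one onto `transformers.TrainingArguments` fields. A hedged sketch of that mapping, where `output_dir` is a placeholder and the dataset/`Trainer` wiring is omitted because the card does not describe it:

```python
from transformers import TrainingArguments

# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",   # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```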
### Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@0.50 | mAP@0.75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (Hardhat) | mAR@100 (Hardhat) | mAP (No-hardhat) | mAR@100 (No-hardhat)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
No log | 1.0 | 125 | 1.1847 | 0.0822 | 0.1954 | 0.0553 | 0.0796 | 0.0937 | 0.2324 | 0.1312 | 0.3509 | 0.4354 | 0.2086 | 0.3775 | 0.6788 | 0.1426 | 0.5655 | 0.0218 | 0.3053 |
No log | 2.0 | 250 | 1.0931 | 0.1205 | 0.2648 | 0.0965 | 0.1277 | 0.1601 | 0.1647 | 0.1587 | 0.3817 | 0.4625 | 0.2886 | 0.435 | 0.6636 | 0.1868 | 0.5618 | 0.0542 | 0.3632 |
No log | 3.0 | 375 | 1.1882 | 0.1449 | 0.3805 | 0.0834 | 0.0775 | 0.1623 | 0.3615 | 0.1042 | 0.3262 | 0.4281 | 0.3086 | 0.347 | 0.5962 | 0.2321 | 0.5036 | 0.0577 | 0.3526 |
1.472 | 4.0 | 500 | 1.0418 | 0.2419 | 0.6022 | 0.1416 | 0.1374 | 0.3265 | 0.3723 | 0.1444 | 0.4138 | 0.4827 | 0.2986 | 0.5035 | 0.6114 | 0.2844 | 0.5127 | 0.1994 | 0.4526 |
1.472 | 5.0 | 625 | 1.0232 | 0.2349 | 0.5815 | 0.1564 | 0.1736 | 0.3006 | 0.3826 | 0.1395 | 0.3992 | 0.4974 | 0.2686 | 0.5149 | 0.6689 | 0.2891 | 0.5527 | 0.1807 | 0.4421 |
1.472 | 6.0 | 750 | 0.9985 | 0.293 | 0.6561 | 0.2108 | 0.2129 | 0.3184 | 0.426 | 0.1604 | 0.4373 | 0.5125 | 0.3129 | 0.5225 | 0.6879 | 0.3452 | 0.5618 | 0.2407 | 0.4632 |
1.472 | 7.0 | 875 | 0.9616 | 0.3145 | 0.7258 | 0.2615 | 0.2491 | 0.3435 | 0.4999 | 0.1381 | 0.4634 | 0.5455 | 0.3171 | 0.5568 | 0.7386 | 0.3164 | 0.5436 | 0.3126 | 0.5474 |
0.9939 | 8.0 | 1000 | 0.9688 | 0.3194 | 0.7461 | 0.1786 | 0.1922 | 0.3171 | 0.576 | 0.16 | 0.4325 | 0.5252 | 0.2943 | 0.5194 | 0.7462 | 0.2982 | 0.4873 | 0.3405 | 0.5632 |
0.9939 | 9.0 | 1125 | 0.9211 | 0.3572 | 0.7888 | 0.2944 | 0.2046 | 0.3196 | 0.5454 | 0.1482 | 0.4632 | 0.5346 | 0.2571 | 0.5625 | 0.7303 | 0.3545 | 0.5218 | 0.3599 | 0.5474 |
0.9939 | 10.0 | 1250 | 0.9664 | 0.3463 | 0.7569 | 0.2541 | 0.2503 | 0.3473 | 0.4924 | 0.1611 | 0.4344 | 0.5116 | 0.31 | 0.499 | 0.7258 | 0.3423 | 0.5127 | 0.3504 | 0.5105 |
0.9939 | 11.0 | 1375 | 0.9463 | 0.3261 | 0.8263 | 0.2012 | 0.2582 | 0.2676 | 0.5931 | 0.1643 | 0.4153 | 0.5134 | 0.32 | 0.4437 | 0.7629 | 0.3386 | 0.5164 | 0.3136 | 0.5105 |
0.8775 | 12.0 | 1500 | 0.9153 | 0.3571 | 0.7972 | 0.2882 | 0.2397 | 0.3273 | 0.5803 | 0.1587 | 0.4377 | 0.5556 | 0.3571 | 0.566 | 0.7189 | 0.3347 | 0.5164 | 0.3795 | 0.5947 |
0.8775 | 13.0 | 1625 | 0.9063 | 0.3512 | 0.8299 | 0.2471 | 0.2342 | 0.3119 | 0.6117 | 0.1695 | 0.4222 | 0.5016 | 0.2786 | 0.4672 | 0.7439 | 0.3422 | 0.4927 | 0.3602 | 0.5105 |
0.8775 | 14.0 | 1750 | 0.9384 | 0.3351 | 0.7633 | 0.2393 | 0.1938 | 0.3064 | 0.5551 | 0.1723 | 0.4257 | 0.5105 | 0.3371 | 0.4718 | 0.7076 | 0.3496 | 0.5 | 0.3206 | 0.5211 |
0.8775 | 15.0 | 1875 | 0.8734 | 0.3836 | 0.8279 | 0.3055 | 0.2541 | 0.348 | 0.614 | 0.1748 | 0.4373 | 0.531 | 0.3729 | 0.4941 | 0.7386 | 0.3671 | 0.5145 | 0.4002 | 0.5474 |
0.7888 | 16.0 | 2000 | 0.8470 | 0.3763 | 0.8437 | 0.2603 | 0.2854 | 0.3289 | 0.5821 | 0.1822 | 0.4556 | 0.5455 | 0.4314 | 0.4861 | 0.7568 | 0.3894 | 0.5436 | 0.3633 | 0.5474 |
0.7888 | 17.0 | 2125 | 0.8579 | 0.3708 | 0.8189 | 0.2792 | 0.2701 | 0.2976 | 0.6115 | 0.185 | 0.443 | 0.5206 | 0.3957 | 0.4633 | 0.7326 | 0.3986 | 0.5255 | 0.343 | 0.5158 |
0.7888 | 18.0 | 2250 | 0.8404 | 0.3714 | 0.7962 | 0.2522 | 0.2531 | 0.3327 | 0.6139 | 0.1778 | 0.4587 | 0.5433 | 0.3729 | 0.5084 | 0.7455 | 0.3709 | 0.5182 | 0.3719 | 0.5684 |
0.7888 | 19.0 | 2375 | 0.8268 | 0.3997 | 0.8285 | 0.3108 | 0.2915 | 0.3526 | 0.5942 | 0.1829 | 0.4882 | 0.5481 | 0.4157 | 0.4986 | 0.7644 | 0.4014 | 0.5436 | 0.3979 | 0.5526 |
0.7048 | 20.0 | 2500 | 0.8091 | 0.4209 | 0.8122 | 0.4316 | 0.2668 | 0.377 | 0.6569 | 0.1964 | 0.4669 | 0.5568 | 0.4 | 0.5206 | 0.7462 | 0.4154 | 0.54 | 0.4265 | 0.5737 |
0.7048 | 21.0 | 2625 | 0.8206 | 0.416 | 0.8227 | 0.303 | 0.3221 | 0.3747 | 0.6208 | 0.1839 | 0.4811 | 0.5401 | 0.3729 | 0.4992 | 0.747 | 0.4198 | 0.5382 | 0.4121 | 0.5421 |
0.7048 | 22.0 | 2750 | 0.8108 | 0.4266 | 0.8502 | 0.4021 | 0.3038 | 0.3847 | 0.6317 | 0.1965 | 0.4688 | 0.5534 | 0.4257 | 0.5125 | 0.7515 | 0.4264 | 0.5436 | 0.4269 | 0.5632 |
0.7048 | 23.0 | 2875 | 0.8239 | 0.4103 | 0.8158 | 0.3492 | 0.2874 | 0.3626 | 0.6316 | 0.1919 | 0.4572 | 0.5533 | 0.4114 | 0.5152 | 0.7462 | 0.417 | 0.5382 | 0.4036 | 0.5684 |
0.6439 | 24.0 | 3000 | 0.8092 | 0.4077 | 0.825 | 0.3504 | 0.3205 | 0.3525 | 0.6074 | 0.1893 | 0.4883 | 0.5641 | 0.4357 | 0.5228 | 0.7652 | 0.4129 | 0.5545 | 0.4026 | 0.5737 |
0.6439 | 25.0 | 3125 | 0.8076 | 0.4104 | 0.8432 | 0.3547 | 0.316 | 0.3559 | 0.6302 | 0.1893 | 0.4689 | 0.5535 | 0.4429 | 0.5027 | 0.7515 | 0.4187 | 0.5491 | 0.4021 | 0.5579 |
0.6439 | 26.0 | 3250 | 0.7988 | 0.4133 | 0.837 | 0.3469 | 0.3285 | 0.3631 | 0.6222 | 0.2035 | 0.4849 | 0.573 | 0.4643 | 0.5166 | 0.7902 | 0.4219 | 0.5618 | 0.4048 | 0.5842 |
0.6439 | 27.0 | 3375 | 0.8015 | 0.4082 | 0.832 | 0.3262 | 0.3104 | 0.3619 | 0.62 | 0.1699 | 0.4785 | 0.5693 | 0.4429 | 0.5174 | 0.7902 | 0.4165 | 0.5491 | 0.3998 | 0.5895 |
0.6079 | 28.0 | 3500 | 0.8043 | 0.4064 | 0.824 | 0.3412 | 0.3052 | 0.3588 | 0.6301 | 0.1857 | 0.4785 | 0.5631 | 0.4329 | 0.5111 | 0.7856 | 0.4145 | 0.5473 | 0.3982 | 0.5789 |
0.6079 | 29.0 | 3625 | 0.8043 | 0.403 | 0.8246 | 0.3378 | 0.3082 | 0.3533 | 0.6358 | 0.1839 | 0.4794 | 0.5667 | 0.44 | 0.5174 | 0.7856 | 0.4076 | 0.5491 | 0.3984 | 0.5842 |
0.6079 | 30.0 | 3750 | 0.8041 | 0.4041 | 0.8246 | 0.3402 | 0.3079 | 0.3562 | 0.6364 | 0.1839 | 0.4794 | 0.5657 | 0.4329 | 0.5174 | 0.7856 | 0.4075 | 0.5473 | 0.4007 | 0.5842 |
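The mAP/mAR columns follow COCO conventions. The card does not state how they were computed; a minimal sketch, assuming the `torchmetrics` `MeanAveragePrecision` metric used in the Transformers object-detection examples, with illustrative placeholder predictions and targets:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# COCO-style mAP/mAR; class_metrics=True yields the per-class
# (Hardhat, No-hardhat) breakdown reported above.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{  # placeholder model output for one image
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),  # hypothetical class id, e.g. Hardhat
}]
targets = [{  # placeholder ground truth for the same image
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 52.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
print(metric.compute())  # keys: map, map_50, map_75, mar_1, mar_10, mar_100, ...
```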
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1