---
library_name: transformers
license: apache-2.0
base_model: PekingU/rtdetr_r50vd_coco_o365
tags:
- generated_from_trainer
model-index:
- name: rt-detr-resnet-50-dc5-rdd2022-finetuned
  results: []
---
# rt-detr-resnet-50-dc5-rdd2022-finetuned

This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 19.3719
- Map: 0.1069
- Map 50: 0.1963
- Map 75: 0.0976
- Map Small: 0.0428
- Map Medium: 0.1013
- Map Large: 0.1629
- Mar 1: 0.1475
- Mar 10: 0.1998
- Mar 100: 0.2204
- Mar Small: 0.1209
- Mar Medium: 0.1972
- Mar Large: 0.2704
- Map D20: 0.0147
- Mar 100 D20: 0.083
- Map D40: 0.0141
- Mar 100 D40: 0.1521
- Map D50: 0.2902
- Mar 100 D50: 0.3499
- Map D10: 0.0005
- Mar 100 D10: 0.1
- Map D44: 0.1447
- Mar 100 D44: 0.2717
- Map D00: 0.0095
- Mar 100 D00: 0.1005
- Map D43: 0.2743
- Mar 100 D43: 0.486
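The `Map`/`Mar` values above are COCO-style mean average precision and mean average recall; the `D…`-suffixed entries are per-class breakdowns using the dataset's class codes. Whether a predicted box counts as a match for a ground-truth box is decided by intersection-over-union: `Map 50` requires IoU ≥ 0.5, while the stricter `Map 75` requires IoU ≥ 0.75. A minimal sketch of that matching criterion (the `box_iou` helper is illustrative, not part of this repository):

```python
def box_iou(a, b):
    """IoU of two boxes in (x1, y1, x2, y2) corner format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# A prediction is a true positive at IoU >= 0.5 for "Map 50",
# and needs IoU >= 0.75 to count toward "Map 75".
pred, gt = (10, 10, 50, 50), (20, 20, 60, 60)
iou = box_iou(pred, gt)  # about 0.391: a hit at 0.5, a miss at 0.75
```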
## Model description
More information needed
## Intended uses & limitations
More information needed
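For object detection, the checkpoint can be loaded through the standard `transformers` object-detection API. A minimal inference sketch; the checkpoint id below is a placeholder for wherever this model is hosted, and `road.jpg` stands in for any road-surface photo:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id: substitute the actual Hub path of this checkpoint.
CHECKPOINT = "rt-detr-resnet-50-dc5-rdd2022-finetuned"

def detect_road_damage(image_path: str, threshold: float = 0.5):
    """Run the fine-tuned RT-DETR detector on one image and return
    (label, score, [x1, y1, x2, y2]) tuples in pixel coordinates."""
    processor = AutoImageProcessor.from_pretrained(CHECKPOINT)
    model = AutoModelForObjectDetection.from_pretrained(CHECKPOINT)
    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # target_sizes expects (height, width); PIL's .size is (width, height).
    results = processor.post_process_object_detection(
        outputs, target_sizes=[image.size[::-1]], threshold=threshold
    )[0]
    return [
        (model.config.id2label[label.item()], score.item(), box.tolist())
        for score, label, box in zip(
            results["scores"], results["labels"], results["boxes"]
        )
    ]

detections = detect_road_damage("road.jpg") if __name__ == "demo" else None
```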
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 20
- mixed_precision_training: Native AMP
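The `linear` scheduler with 300 warmup steps ramps the learning rate from 0 up to 5e-05 over the first 300 optimizer steps, then decays it linearly back to 0 over the remaining steps (the results table shows 11820 total steps: 20 epochs × 591 steps). A minimal pure-Python sketch of that schedule; the helper name is illustrative:

```python
def linear_schedule_lr(step, base_lr=5e-5, warmup_steps=300, total_steps=11820):
    """Learning rate at a given optimizer step under linear warmup
    followed by linear decay, mirroring the `linear` scheduler used here."""
    if step < warmup_steps:
        # Warmup: ramp from 0 to base_lr over the first warmup_steps steps.
        return base_lr * step / warmup_steps
    # Decay: drop linearly from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Peak of 5e-05 is reached at step 300, then decays to 0 by step 11820.
peak = linear_schedule_lr(300)
```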
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map D20 | Mar 100 D20 | Map D40 | Mar 100 D40 | Map D50 | Mar 100 D50 | Map D10 | Mar 100 D10 | Map D44 | Mar 100 D44 | Map D00 | Mar 100 D00 | Map D43 | Mar 100 D43 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
70.5412 | 1.0 | 591 | 15.8723 | 0.195 | 0.3702 | 0.1913 | 0.0835 | 0.1413 | 0.2872 | 0.2633 | 0.4192 | 0.4639 | 0.2852 | 0.3983 | 0.5786 | 0.2117 | 0.5141 | 0.0347 | 0.2521 | 0.3687 | 0.5223 | 0.0601 | 0.3365 | 0.26 | 0.5964 | 0.051 | 0.366 | 0.3788 | 0.6596 |
18.658 | 2.0 | 1182 | 14.9769 | 0.215 | 0.41 | 0.2031 | 0.0901 | 0.185 | 0.3103 | 0.2907 | 0.4185 | 0.4475 | 0.2827 | 0.3842 | 0.5612 | 0.1749 | 0.4639 | 0.0771 | 0.2747 | 0.4111 | 0.5318 | 0.0522 | 0.2324 | 0.2839 | 0.6062 | 0.1151 | 0.3318 | 0.3905 | 0.6912 |
16.5068 | 3.0 | 1773 | 15.7224 | 0.225 | 0.4212 | 0.2151 | 0.0878 | 0.1847 | 0.3261 | 0.2885 | 0.3949 | 0.4256 | 0.224 | 0.3765 | 0.5596 | 0.1723 | 0.4003 | 0.0783 | 0.2695 | 0.4111 | 0.4905 | 0.0631 | 0.2403 | 0.2807 | 0.5652 | 0.099 | 0.3028 | 0.4704 | 0.7105 |
15.5337 | 4.0 | 2364 | 16.4444 | 0.1833 | 0.3516 | 0.1729 | 0.0658 | 0.1528 | 0.2629 | 0.2478 | 0.337 | 0.3669 | 0.1783 | 0.3145 | 0.4655 | 0.1052 | 0.2952 | 0.0472 | 0.2184 | 0.363 | 0.4341 | 0.022 | 0.1828 | 0.2872 | 0.5433 | 0.0723 | 0.242 | 0.3864 | 0.6526 |
14.9507 | 5.0 | 2955 | 16.4993 | 0.2025 | 0.3692 | 0.1978 | 0.0647 | 0.1726 | 0.292 | 0.2574 | 0.3549 | 0.3909 | 0.1786 | 0.3521 | 0.5171 | 0.1393 | 0.3789 | 0.026 | 0.2221 | 0.3892 | 0.4648 | 0.0359 | 0.2098 | 0.3253 | 0.5462 | 0.0591 | 0.25 | 0.4425 | 0.6649 |
14.1545 | 6.0 | 3546 | 17.7267 | 0.1623 | 0.2977 | 0.1535 | 0.0455 | 0.1462 | 0.2393 | 0.2146 | 0.2838 | 0.3159 | 0.1342 | 0.2742 | 0.4229 | 0.0548 | 0.2349 | 0.0278 | 0.1779 | 0.3243 | 0.3862 | 0.0168 | 0.1902 | 0.2644 | 0.4574 | 0.0253 | 0.1542 | 0.4225 | 0.6105 |
13.8334 | 7.0 | 4137 | 17.8210 | 0.1634 | 0.3069 | 0.165 | 0.0649 | 0.1381 | 0.2378 | 0.2169 | 0.2836 | 0.32 | 0.1723 | 0.263 | 0.4299 | 0.0587 | 0.2259 | 0.0363 | 0.2047 | 0.3482 | 0.4264 | 0.0024 | 0.1523 | 0.2671 | 0.4602 | 0.0341 | 0.2111 | 0.3967 | 0.5596 |
13.6299 | 8.0 | 4728 | 17.8241 | 0.1493 | 0.2867 | 0.1425 | 0.0527 | 0.133 | 0.2169 | 0.1898 | 0.2625 | 0.307 | 0.1583 | 0.2537 | 0.4132 | 0.0733 | 0.2392 | 0.0303 | 0.2053 | 0.3389 | 0.4112 | 0.0019 | 0.1523 | 0.2564 | 0.4321 | 0.0294 | 0.204 | 0.3151 | 0.5053 |
13.3132 | 9.0 | 5319 | 18.3872 | 0.1404 | 0.2689 | 0.1421 | 0.0756 | 0.1164 | 0.2121 | 0.1814 | 0.249 | 0.2812 | 0.1588 | 0.2283 | 0.366 | 0.0425 | 0.2041 | 0.0398 | 0.1826 | 0.3299 | 0.3989 | 0.002 | 0.112 | 0.2364 | 0.4021 | 0.0135 | 0.1462 | 0.3185 | 0.5228 |
13.2014 | 10.0 | 5910 | 18.5692 | 0.1337 | 0.2527 | 0.1286 | 0.0544 | 0.1129 | 0.2106 | 0.1816 | 0.2403 | 0.2678 | 0.1421 | 0.2112 | 0.3659 | 0.0425 | 0.1425 | 0.0381 | 0.1805 | 0.3206 | 0.3808 | 0.0071 | 0.1687 | 0.1744 | 0.3395 | 0.0248 | 0.1663 | 0.328 | 0.4965 |
12.8205 | 11.0 | 6501 | 18.1347 | 0.1674 | 0.3078 | 0.1555 | 0.0762 | 0.1299 | 0.2459 | 0.223 | 0.2937 | 0.3265 | 0.1629 | 0.2647 | 0.4288 | 0.0783 | 0.2133 | 0.0391 | 0.2211 | 0.3569 | 0.416 | 0.0069 | 0.1659 | 0.2878 | 0.4786 | 0.0353 | 0.1887 | 0.3677 | 0.6018 |
12.6339 | 12.0 | 7092 | 19.0128 | 0.1247 | 0.246 | 0.1093 | 0.0496 | 0.1097 | 0.1955 | 0.1574 | 0.2035 | 0.2217 | 0.0965 | 0.1869 | 0.29 | 0.0183 | 0.0985 | 0.0274 | 0.1574 | 0.3019 | 0.3504 | 0.0005 | 0.0869 | 0.1753 | 0.2969 | 0.0117 | 0.0948 | 0.3375 | 0.4667 |
12.4423 | 13.0 | 7683 | 18.8381 | 0.1277 | 0.2505 | 0.1144 | 0.0849 | 0.1266 | 0.1923 | 0.1778 | 0.235 | 0.2588 | 0.1694 | 0.2306 | 0.3268 | 0.0195 | 0.1193 | 0.0504 | 0.1979 | 0.3533 | 0.4209 | 0.0012 | 0.1278 | 0.1403 | 0.2776 | 0.0187 | 0.1432 | 0.3106 | 0.5246 |
12.2991 | 14.0 | 8274 | 19.0234 | 0.1228 | 0.2338 | 0.1101 | 0.0614 | 0.116 | 0.1907 | 0.1684 | 0.2215 | 0.2415 | 0.1367 | 0.2066 | 0.3109 | 0.0313 | 0.1213 | 0.041 | 0.1879 | 0.3032 | 0.3605 | 0.0006 | 0.112 | 0.1593 | 0.2819 | 0.0138 | 0.0934 | 0.3103 | 0.5333 |
12.2121 | 15.0 | 8865 | 19.4475 | 0.1089 | 0.2059 | 0.0998 | 0.0468 | 0.0918 | 0.1818 | 0.1399 | 0.1803 | 0.1988 | 0.1112 | 0.1531 | 0.2661 | 0.01 | 0.0793 | 0.0147 | 0.1242 | 0.2872 | 0.3381 | 0.0004 | 0.0913 | 0.1155 | 0.2286 | 0.009 | 0.0776 | 0.3256 | 0.4526 |
11.9798 | 16.0 | 9456 | 19.3502 | 0.1048 | 0.1988 | 0.1031 | 0.0471 | 0.091 | 0.1549 | 0.1461 | 0.1956 | 0.2164 | 0.1288 | 0.1905 | 0.2721 | 0.0129 | 0.079 | 0.0287 | 0.1832 | 0.2843 | 0.3481 | 0.0006 | 0.097 | 0.1235 | 0.2407 | 0.0071 | 0.0844 | 0.2765 | 0.4825 |
11.7902 | 17.0 | 10047 | 19.3674 | 0.1078 | 0.1969 | 0.1042 | 0.0455 | 0.0934 | 0.172 | 0.1509 | 0.2007 | 0.2234 | 0.1192 | 0.1898 | 0.2875 | 0.0119 | 0.083 | 0.0216 | 0.1621 | 0.2806 | 0.3464 | 0.0008 | 0.1166 | 0.1467 | 0.2645 | 0.0097 | 0.1 | 0.2831 | 0.4912 |
11.691 | 18.0 | 10638 | 19.4521 | 0.1104 | 0.201 | 0.1075 | 0.0409 | 0.0941 | 0.1724 | 0.1571 | 0.2099 | 0.2338 | 0.1174 | 0.1886 | 0.3027 | 0.0188 | 0.103 | 0.0164 | 0.1742 | 0.2806 | 0.3513 | 0.0007 | 0.1131 | 0.1344 | 0.2707 | 0.0074 | 0.1083 | 0.3145 | 0.5158 |
11.5899 | 19.0 | 11229 | 19.4104 | 0.1066 | 0.2 | 0.0945 | 0.0434 | 0.0983 | 0.1539 | 0.1488 | 0.1981 | 0.217 | 0.1238 | 0.1839 | 0.2732 | 0.0117 | 0.0798 | 0.0198 | 0.1379 | 0.2865 | 0.3493 | 0.0005 | 0.0951 | 0.1328 | 0.2726 | 0.0073 | 0.0962 | 0.2879 | 0.4877 |
11.4701 | 20.0 | 11820 | 19.3719 | 0.1069 | 0.1963 | 0.0976 | 0.0428 | 0.1013 | 0.1629 | 0.1475 | 0.1998 | 0.2204 | 0.1209 | 0.1972 | 0.2704 | 0.0147 | 0.083 | 0.0141 | 0.1521 | 0.2902 | 0.3499 | 0.0005 | 0.1 | 0.1447 | 0.2717 | 0.0095 | 0.1005 | 0.2743 | 0.486 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1