# rtdetr-r50-fruits3.1-finetune

This model is a fine-tuned version of PekingU/rtdetr_v2_r50vd on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 10.2825
- Map: 0.5141
- Map 50: 0.6438
- Map 75: 0.5328
- Map Small: 0.1834
- Map Medium: 0.4552
- Map Large: 0.6833
- Mar 1: 0.2766
- Mar 10: 0.6043
- Mar 100: 0.6896
- Mar Small: 0.3069
- Mar Medium: 0.653
- Mar Large: 0.8601
- Map Apple: 0.484
- Mar 100 Apple: 0.6951
- Map Banana: 0.5239
- Mar 100 Banana: 0.7047
- Map Grapes: 0.4436
- Mar 100 Grapes: 0.5755
- Map Orange: 0.4851
- Mar 100 Orange: 0.6356
- Map Pineapple: 0.6032
- Mar 100 Pineapple: 0.7605
- Map Watermelon: 0.5445
- Mar 100 Watermelon: 0.7659
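The overall Map reported above is the macro-average of the six per-class AP values, which can be checked directly:

```python
# Per-class AP@[.50:.95] values transcribed from the evaluation results above.
per_class_map = {
    "apple": 0.4840,
    "banana": 0.5239,
    "grapes": 0.4436,
    "orange": 0.4851,
    "pineapple": 0.6032,
    "watermelon": 0.5445,
}

# Overall mAP is the unweighted mean over classes.
overall = sum(per_class_map.values()) / len(per_class_map)
print(overall)  # matches the reported Map of 0.5141 up to rounding
```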
## Model description
More information needed
## Intended uses & limitations
More information needed
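The card ships without a usage snippet. A minimal inference sketch, assuming the checkpoint is published as `hungnguyen2k4/rtdetr-r50-fruits3.1-finetune` (the repo id of this card) and follows the standard Transformers object-detection API:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Assumed repo id; adjust if the checkpoint lives elsewhere.
CHECKPOINT = "hungnguyen2k4/rtdetr-r50-fruits3.1-finetune"


def detect_fruits(image_path, threshold=0.5):
    """Run the fine-tuned RT-DETRv2 detector on a single image.

    Returns a list of (label, score, [xmin, ymin, xmax, ymax]) tuples.
    """
    processor = AutoImageProcessor.from_pretrained(CHECKPOINT)
    model = AutoModelForObjectDetection.from_pretrained(CHECKPOINT)

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Rescale boxes back to the original (height, width) and drop low scores.
    results = processor.post_process_object_detection(
        outputs, threshold=threshold, target_sizes=[image.size[::-1]]
    )[0]
    return [
        (model.config.id2label[label.item()], score.item(), box.tolist())
        for label, score, box in zip(
            results["labels"], results["scores"], results["boxes"]
        )
    ]
```

This is a sketch, not a tested recipe: it assumes the checkpoint's `id2label` maps to the six fruit classes listed in the metrics above.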
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 300
- num_epochs: 100
- mixed_precision_training: Native AMP
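The learning-rate schedule combines 300 linear warmup steps with cosine decay. A minimal sketch of that curve, assuming the usual `get_cosine_schedule_with_warmup` semantics from Transformers and a hypothetical horizon of 100 epochs × 750 steps/epoch = 75,000 steps (the table below shows 750 optimizer steps per epoch):

```python
import math


def cosine_warmup_lr(step, base_lr=5e-5, warmup_steps=300, total_steps=75_000):
    """LR at a given step: linear warmup to base_lr, then cosine decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

At step 300 the rate peaks at 5e-5 and then falls monotonically, reaching zero at the final step.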
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Apple | Mar 100 Apple | Map Banana | Mar 100 Banana | Map Grapes | Mar 100 Grapes | Map Orange | Mar 100 Orange | Map Pineapple | Mar 100 Pineapple | Map Watermelon | Mar 100 Watermelon |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
61.0927 | 1.0 | 750 | 12.3536 | 0.4105 | 0.5309 | 0.4348 | 0.1007 | 0.3523 | 0.5936 | 0.2516 | 0.5606 | 0.6943 | 0.3265 | 0.6804 | 0.8648 | 0.3696 | 0.6673 | 0.4721 | 0.7284 | 0.3272 | 0.5616 | 0.3893 | 0.626 | 0.4278 | 0.7525 | 0.4771 | 0.8298 |
16.2131 | 2.0 | 1500 | 11.0204 | 0.4591 | 0.5898 | 0.4801 | 0.1266 | 0.3931 | 0.6455 | 0.2608 | 0.5789 | 0.7038 | 0.343 | 0.6833 | 0.8654 | 0.4085 | 0.6891 | 0.4473 | 0.7299 | 0.3675 | 0.5923 | 0.4612 | 0.6384 | 0.5151 | 0.7489 | 0.5548 | 0.8241 |
15.0214 | 3.0 | 2250 | 10.6170 | 0.4864 | 0.6169 | 0.5035 | 0.1523 | 0.4532 | 0.651 | 0.2665 | 0.5951 | 0.7111 | 0.3693 | 0.6834 | 0.8752 | 0.4727 | 0.701 | 0.4647 | 0.7427 | 0.3902 | 0.5812 | 0.4707 | 0.6376 | 0.5272 | 0.7851 | 0.5933 | 0.8188 |
13.8951 | 4.0 | 3000 | 10.3428 | 0.5106 | 0.6355 | 0.5364 | 0.1468 | 0.4302 | 0.7136 | 0.2779 | 0.5933 | 0.6975 | 0.3156 | 0.6572 | 0.8758 | 0.4983 | 0.6914 | 0.4684 | 0.7223 | 0.4134 | 0.5737 | 0.4944 | 0.6373 | 0.5702 | 0.7678 | 0.6189 | 0.7926 |
13.1909 | 5.0 | 3750 | 10.0188 | 0.5284 | 0.6543 | 0.5485 | 0.1648 | 0.4766 | 0.7106 | 0.2813 | 0.6024 | 0.723 | 0.3977 | 0.6936 | 0.8861 | 0.5002 | 0.6963 | 0.5068 | 0.7582 | 0.4229 | 0.6168 | 0.5036 | 0.6445 | 0.6046 | 0.7812 | 0.6321 | 0.8406 |
12.6709 | 6.0 | 4500 | 9.9985 | 0.5344 | 0.6675 | 0.5558 | 0.1833 | 0.4766 | 0.7188 | 0.2757 | 0.6071 | 0.7241 | 0.3929 | 0.6945 | 0.8888 | 0.5139 | 0.7059 | 0.4884 | 0.7343 | 0.4506 | 0.6095 | 0.521 | 0.6597 | 0.5997 | 0.7891 | 0.6326 | 0.8457 |
12.2203 | 7.0 | 5250 | 10.2048 | 0.5242 | 0.6494 | 0.5433 | 0.178 | 0.4519 | 0.7161 | 0.2805 | 0.6061 | 0.7126 | 0.351 | 0.675 | 0.8877 | 0.4993 | 0.6946 | 0.5214 | 0.7222 | 0.4247 | 0.5999 | 0.4953 | 0.6465 | 0.5884 | 0.7826 | 0.6158 | 0.8295 |
12.0783 | 8.0 | 6000 | 10.1179 | 0.5268 | 0.6511 | 0.5493 | 0.1827 | 0.4863 | 0.7003 | 0.2771 | 0.6061 | 0.7276 | 0.3949 | 0.7093 | 0.8875 | 0.5119 | 0.7065 | 0.4949 | 0.7407 | 0.4276 | 0.6063 | 0.4987 | 0.6544 | 0.5789 | 0.7971 | 0.649 | 0.8608 |
11.7674 | 9.0 | 6750 | 10.7223 | 0.4952 | 0.6141 | 0.5192 | 0.1485 | 0.4494 | 0.6737 | 0.2707 | 0.6003 | 0.7203 | 0.397 | 0.69 | 0.8796 | 0.4633 | 0.6996 | 0.4973 | 0.7448 | 0.4313 | 0.5888 | 0.4631 | 0.6606 | 0.4962 | 0.7714 | 0.6198 | 0.8565 |
11.5219 | 10.0 | 7500 | 9.8389 | 0.5221 | 0.6564 | 0.5438 | 0.2016 | 0.461 | 0.7026 | 0.2753 | 0.6044 | 0.7138 | 0.3662 | 0.6795 | 0.8808 | 0.4921 | 0.6998 | 0.4981 | 0.731 | 0.4604 | 0.6223 | 0.5007 | 0.6329 | 0.5846 | 0.7768 | 0.5966 | 0.8202 |
11.1748 | 11.0 | 8250 | 10.1057 | 0.5227 | 0.6501 | 0.5488 | 0.2067 | 0.4508 | 0.7094 | 0.276 | 0.6117 | 0.7201 | 0.3771 | 0.6965 | 0.881 | 0.5098 | 0.7075 | 0.4953 | 0.7257 | 0.4535 | 0.6093 | 0.4955 | 0.6459 | 0.5596 | 0.7967 | 0.6225 | 0.8355 |
11.0897 | 12.0 | 9000 | 10.4848 | 0.5181 | 0.6473 | 0.5365 | 0.1853 | 0.4529 | 0.6962 | 0.2812 | 0.5989 | 0.6958 | 0.3296 | 0.6539 | 0.8736 | 0.4742 | 0.676 | 0.5002 | 0.7085 | 0.455 | 0.5944 | 0.4842 | 0.6337 | 0.5777 | 0.7373 | 0.6171 | 0.8247 |
10.8801 | 13.0 | 9750 | 10.0386 | 0.5219 | 0.6487 | 0.541 | 0.1772 | 0.4602 | 0.7022 | 0.2822 | 0.6146 | 0.7172 | 0.3145 | 0.6881 | 0.8917 | 0.4978 | 0.6926 | 0.5425 | 0.7467 | 0.4405 | 0.6068 | 0.4968 | 0.6402 | 0.5561 | 0.7967 | 0.5974 | 0.8205 |
10.723 | 14.0 | 10500 | 9.5538 | 0.5517 | 0.6795 | 0.5827 | 0.205 | 0.4932 | 0.7338 | 0.2815 | 0.6206 | 0.7203 | 0.3765 | 0.6777 | 0.8922 | 0.526 | 0.6924 | 0.5612 | 0.7605 | 0.4625 | 0.6355 | 0.5177 | 0.6492 | 0.6413 | 0.7783 | 0.6012 | 0.806 |
10.5762 | 15.0 | 11250 | 9.9150 | 0.5278 | 0.6531 | 0.5492 | 0.1654 | 0.4583 | 0.7103 | 0.2817 | 0.608 | 0.7124 | 0.328 | 0.6863 | 0.8814 | 0.5079 | 0.6937 | 0.5213 | 0.74 | 0.4296 | 0.5788 | 0.5046 | 0.6458 | 0.5681 | 0.788 | 0.6354 | 0.8278 |
10.5457 | 16.0 | 12000 | 9.7727 | 0.5418 | 0.6709 | 0.5595 | 0.2069 | 0.4866 | 0.7144 | 0.2766 | 0.6027 | 0.698 | 0.3201 | 0.6541 | 0.8743 | 0.4888 | 0.671 | 0.5528 | 0.7216 | 0.4597 | 0.5941 | 0.4876 | 0.6375 | 0.5958 | 0.7475 | 0.666 | 0.8165 |
10.3053 | 17.0 | 12750 | 9.9212 | 0.5392 | 0.6723 | 0.5627 | 0.1865 | 0.5009 | 0.7033 | 0.2806 | 0.6087 | 0.7101 | 0.3284 | 0.68 | 0.8813 | 0.4932 | 0.6922 | 0.5369 | 0.7298 | 0.458 | 0.598 | 0.5006 | 0.6409 | 0.5976 | 0.7681 | 0.6489 | 0.8315 |
10.2234 | 18.0 | 13500 | 9.7854 | 0.5276 | 0.6601 | 0.5511 | 0.1782 | 0.4737 | 0.6981 | 0.2792 | 0.61 | 0.7155 | 0.3208 | 0.6896 | 0.8851 | 0.5198 | 0.7052 | 0.5038 | 0.724 | 0.4565 | 0.6081 | 0.5157 | 0.6548 | 0.5671 | 0.7746 | 0.6024 | 0.8264 |
10.0754 | 19.0 | 14250 | 10.5111 | 0.5086 | 0.6329 | 0.531 | 0.1631 | 0.4495 | 0.6713 | 0.2769 | 0.6035 | 0.7092 | 0.3326 | 0.6738 | 0.8797 | 0.4743 | 0.6977 | 0.5051 | 0.7214 | 0.443 | 0.601 | 0.4704 | 0.6357 | 0.5976 | 0.7663 | 0.5609 | 0.8332 |
10.1197 | 20.0 | 15000 | 10.2968 | 0.5204 | 0.6463 | 0.5452 | 0.1705 | 0.4586 | 0.7002 | 0.2755 | 0.6036 | 0.7082 | 0.3294 | 0.6713 | 0.885 | 0.4837 | 0.6886 | 0.4481 | 0.7191 | 0.4491 | 0.5994 | 0.5103 | 0.6457 | 0.5882 | 0.7678 | 0.6432 | 0.8284 |
9.8649 | 21.0 | 15750 | 10.0439 | 0.5264 | 0.6576 | 0.5482 | 0.1852 | 0.4709 | 0.7035 | 0.2796 | 0.5943 | 0.697 | 0.3317 | 0.6524 | 0.8778 | 0.4649 | 0.6727 | 0.4915 | 0.7099 | 0.4386 | 0.5786 | 0.498 | 0.6446 | 0.6342 | 0.7696 | 0.631 | 0.8065 |
9.8367 | 22.0 | 16500 | 9.5550 | 0.5584 | 0.6869 | 0.5826 | 0.2292 | 0.5105 | 0.7348 | 0.2825 | 0.618 | 0.7228 | 0.3802 | 0.6957 | 0.8876 | 0.5113 | 0.6976 | 0.5429 | 0.7419 | 0.4542 | 0.6117 | 0.5285 | 0.6559 | 0.634 | 0.7841 | 0.6797 | 0.8455 |
9.5876 | 23.0 | 17250 | 9.8796 | 0.5391 | 0.6607 | 0.563 | 0.1863 | 0.4824 | 0.7163 | 0.2796 | 0.6184 | 0.7192 | 0.3546 | 0.6988 | 0.884 | 0.4933 | 0.6959 | 0.5088 | 0.7413 | 0.4553 | 0.6001 | 0.513 | 0.6559 | 0.6116 | 0.7779 | 0.6525 | 0.8443 |
9.5507 | 24.0 | 18000 | 10.0149 | 0.5351 | 0.6661 | 0.5565 | 0.2108 | 0.4703 | 0.7099 | 0.2782 | 0.6077 | 0.706 | 0.3393 | 0.6614 | 0.8794 | 0.5201 | 0.699 | 0.5335 | 0.7254 | 0.4731 | 0.6213 | 0.5071 | 0.6384 | 0.6214 | 0.7779 | 0.5554 | 0.7741 |
9.4775 | 25.0 | 18750 | 9.9049 | 0.538 | 0.6679 | 0.5583 | 0.1898 | 0.4768 | 0.7084 | 0.2808 | 0.6184 | 0.7181 | 0.3543 | 0.6864 | 0.8851 | 0.5175 | 0.6982 | 0.5342 | 0.7324 | 0.4569 | 0.6153 | 0.5212 | 0.6541 | 0.5982 | 0.7884 | 0.6002 | 0.8205 |
9.4565 | 26.0 | 19500 | 10.0631 | 0.5303 | 0.6553 | 0.5532 | 0.1805 | 0.4694 | 0.6984 | 0.281 | 0.6136 | 0.7016 | 0.325 | 0.6713 | 0.8758 | 0.5116 | 0.6836 | 0.5145 | 0.7229 | 0.4634 | 0.5911 | 0.4719 | 0.6312 | 0.5988 | 0.7678 | 0.6215 | 0.8128 |
9.3008 | 27.0 | 20250 | 9.9243 | 0.5324 | 0.6673 | 0.5509 | 0.1461 | 0.4834 | 0.6986 | 0.2792 | 0.6119 | 0.7063 | 0.3311 | 0.6805 | 0.8749 | 0.4854 | 0.7004 | 0.5318 | 0.7143 | 0.449 | 0.5907 | 0.4865 | 0.6382 | 0.5817 | 0.7641 | 0.6602 | 0.8304 |
9.1329 | 28.0 | 21000 | 10.0756 | 0.5404 | 0.675 | 0.5647 | 0.1808 | 0.4816 | 0.7115 | 0.2807 | 0.6161 | 0.7134 | 0.3472 | 0.6905 | 0.8782 | 0.5153 | 0.6949 | 0.5222 | 0.7309 | 0.4616 | 0.6116 | 0.5094 | 0.6427 | 0.6026 | 0.7808 | 0.631 | 0.8196 |
9.1302 | 29.0 | 21750 | 9.9070 | 0.548 | 0.6764 | 0.5735 | 0.1968 | 0.5044 | 0.7151 | 0.2826 | 0.6176 | 0.7138 | 0.3358 | 0.6884 | 0.8799 | 0.5129 | 0.7091 | 0.5441 | 0.7292 | 0.4818 | 0.6084 | 0.4875 | 0.6393 | 0.6194 | 0.7667 | 0.6423 | 0.8301 |
9.0189 | 30.0 | 22500 | 10.0898 | 0.5376 | 0.6634 | 0.5571 | 0.1898 | 0.489 | 0.705 | 0.2798 | 0.6117 | 0.7035 | 0.3247 | 0.6689 | 0.8763 | 0.5088 | 0.6922 | 0.5051 | 0.702 | 0.4494 | 0.588 | 0.4821 | 0.6381 | 0.6578 | 0.7928 | 0.6225 | 0.8082 |
8.9913 | 31.0 | 23250 | 10.9136 | 0.4988 | 0.6284 | 0.5167 | 0.128 | 0.452 | 0.667 | 0.2759 | 0.5892 | 0.6775 | 0.2474 | 0.643 | 0.8679 | 0.4517 | 0.6456 | 0.4671 | 0.6929 | 0.4257 | 0.5472 | 0.4507 | 0.611 | 0.5799 | 0.7757 | 0.6177 | 0.7923 |
8.7717 | 32.0 | 24000 | 10.2825 | 0.5141 | 0.6438 | 0.5328 | 0.1834 | 0.4552 | 0.6833 | 0.2766 | 0.6043 | 0.6896 | 0.3069 | 0.653 | 0.8601 | 0.484 | 0.6951 | 0.5239 | 0.7047 | 0.4436 | 0.5755 | 0.4851 | 0.6356 | 0.6032 | 0.7605 | 0.5445 | 0.7659 |
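Note that the final checkpoint (epoch 32) is not the strongest one in the table: validation loss bottomed out at epoch 14 (9.5538) and Map peaked at epoch 22 (0.5584). A small sketch of selecting the best epoch, using triples transcribed from a few rows above:

```python
# (epoch, validation loss, mAP) triples transcribed from the table above.
history = [
    (14, 9.5538, 0.5517),
    (22, 9.5550, 0.5584),
    (29, 9.9070, 0.5480),
    (32, 10.2825, 0.5141),
]

best_by_map = max(history, key=lambda row: row[2])   # highest mAP
best_by_loss = min(history, key=lambda row: row[1])  # lowest validation loss
print(best_by_map[0], best_by_loss[0])  # 22 14
```

If the training run kept per-epoch checkpoints, re-evaluating and publishing the epoch-22 weights would likely improve the headline metrics.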
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1