rtdetr-cppe5-detection-v2

This model is a fine-tuned version of PekingU/rtdetr_r50vd on the CPPE-5 dataset. It achieves the following results on the evaluation set:

  • Loss: 9.9867

Model description

RT-DETR (Real-Time DEtection TRansformer) with a ResNet-50-vd backbone, fine-tuned for object detection. The base model, PekingU/rtdetr_r50vd, is an end-to-end transformer detector that does not require NMS post-processing.

Intended uses & limitations

The model detects the five CPPE-5 personal protective equipment categories (coverall, face shield, gloves, goggles, mask) in images. It is a small fine-tune trained for 20 epochs on roughly 1,000 images, so limited robustness should be expected outside the medical PPE domain.
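A minimal inference sketch using the standard Transformers object-detection API; the image path and the 0.5 score threshold are placeholders to adapt to your use case:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "Godsonntungi2/rtdetr-cppe5-detection-v2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder: replace with your own image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw predictions to (score, label, box) in pixel coordinates
# of the original image. image.size is (width, height), so reverse it.
results = processor.post_process_object_detection(
    outputs,
    target_sizes=torch.tensor([image.size[::-1]]),
    threshold=0.5,
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

`post_process_object_detection` rescales the model's normalized box predictions back to the original image size and filters detections below the score threshold.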

Training and evaluation data

CPPE-5 (Medical Personal Protective Equipment) contains images annotated with COCO-format bounding boxes for five categories: Coverall, Face_Shield, Gloves, Goggles, Mask.
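A sketch of loading the dataset with 🤗 Datasets; the Hub id `cppe-5` is an assumption based on the model name:

```python
from datasets import load_dataset

# Hub id assumed from the model name; adjust if the training data lives elsewhere.
dataset = load_dataset("cppe-5")

sample = dataset["train"][0]
print(sample["image"].size)  # PIL image
print(sample["objects"])     # COCO-style dict: bboxes, categories, areas, ids
```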

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 20
  • mixed_precision_training: Native AMP
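A minimal sketch of `TrainingArguments` reproducing the values above; `output_dir`, `fp16=True` (standing in for "Native AMP"), and the per-epoch evaluation schedule are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr-cppe5-detection-v2",  # assumption: named after the model
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,  # effective batch size: 4 * 4 = 16
    num_train_epochs=20,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    optim="adamw_torch_fused",
    seed=42,
    fp16=True,                      # mixed-precision training (native AMP)
    eval_strategy="epoch",          # assumption: matches the per-epoch results table
)
```

With `per_device_train_batch_size=4` and `gradient_accumulation_steps=4` on a single device, the effective batch size is 16, matching `total_train_batch_size` above.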

Training results

| Training Loss | Epoch | Step | Validation Loss |
|---------------|-------|------|-----------------|
| 130.8497      | 1.0   | 63   | 71.7316         |
| 60.0138       | 2.0   | 126  | 28.9468         |
| 37.7655       | 3.0   | 189  | 18.4218         |
| 29.4038       | 4.0   | 252  | 14.7458         |
| 24.3595       | 5.0   | 315  | 12.2400         |
| 23.2655       | 6.0   | 378  | 11.4468         |
| 21.1393       | 7.0   | 441  | 10.5996         |
| 20.4947       | 8.0   | 504  | 10.2594         |
| 19.731        | 9.0   | 567  | 10.1579         |
| 19.052        | 10.0  | 630  | 9.9744          |
| 17.977        | 11.0  | 693  | 9.9976          |
| 18.3429       | 12.0  | 756  | 9.9192          |
| 17.1892       | 13.0  | 819  | 9.8707          |
| 17.3446       | 14.0  | 882  | 9.9686          |
| 16.9275       | 15.0  | 945  | 9.9490          |
| 17.2444       | 16.0  | 1008 | 9.8504          |
| 16.6155       | 17.0  | 1071 | 9.8684          |
| 15.4995       | 18.0  | 1134 | 9.8703          |
| 16.2258       | 19.0  | 1197 | 9.8435          |
| 15.7694       | 20.0  | 1260 | 9.9867          |

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1

Model size

  • 42.9M params (F32, safetensors)