# rtdetr-v2-r50-cppe5-finetune-2
This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 6.6493
- mAP: 0.6094
- mAR@100: 0.7099
- mAP@50: 0.9567
- mAP@75: 0.7190
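The gap between mAP@50 (0.9567) and mAP@75 (0.7190) comes from the IoU threshold used to match predictions to ground truth: a moderately misaligned box counts as a true positive at IoU ≥ 0.5 but not at IoU ≥ 0.75. A minimal sketch of that matching criterion (the boxes here are illustrative, not from this model):

```python
def box_iou(a, b):
    """IoU of two boxes in (x_min, y_min, x_max, y_max) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A slightly shifted prediction: a true positive at the 0.5 threshold,
# a false positive at the 0.75 threshold.
pred, gt = (10, 10, 50, 50), (15, 15, 55, 55)
iou = box_iou(pred, gt)  # ~0.62
```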
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
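A hedged usage sketch, assuming the checkpoint is loaded from this repo (`ranm26/rtdetr-v2-r50-cppe5-finetune-2`) via the Transformers object-detection pipeline; `filter_detections` and the image filename are hypothetical, not part of any library:

```python
def filter_detections(detections, threshold=0.5):
    """Keep only detections whose score clears the threshold."""
    return [d for d in detections if d["score"] >= threshold]

if __name__ == "__main__":
    # Requires transformers and torch; downloads the checkpoint on first use.
    from transformers import pipeline

    detector = pipeline(
        "object-detection",
        model="ranm26/rtdetr-v2-r50-cppe5-finetune-2",
    )
    # The pipeline returns a list of dicts: {"score", "label", "box"}.
    detections = detector("example_image.jpg")  # hypothetical input image
    for d in filter_detections(detections, threshold=0.5):
        print(d["label"], round(d["score"], 3), d["box"])
```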
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: `adamw_torch_fused` with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 40
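The linear schedule with 300 warmup steps ramps the learning rate from zero up to the peak 5e-05, then decays it linearly to zero. A minimal sketch, assuming the total step count of 5680 from the results table (40 epochs × 142 steps/epoch):

```python
PEAK_LR = 5e-05       # learning_rate above
WARMUP_STEPS = 300    # lr_scheduler_warmup_steps above
TOTAL_STEPS = 5680    # final Step value in the results table

def lr_at(step):
    """Learning rate after `step` optimizer steps under a linear schedule."""
    if step < WARMUP_STEPS:
        # Linear warmup from 0 to the peak rate.
        return PEAK_LR * step / WARMUP_STEPS
    # Linear decay from the peak down to zero at TOTAL_STEPS.
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```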
### Training results
| Training Loss | Epoch | Step | Validation Loss | mAP | mAR@100 | mAP@50 | mAP@75 |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 142 | 34.3182 | 0.0607 | 0.6404 | 0.094 | 0.0739 |
| No log | 2.0 | 284 | 8.2558 | 0.4949 | 0.7404 | 0.8746 | 0.5321 |
| No log | 3.0 | 426 | 7.6811 | 0.596 | 0.7113 | 0.9488 | 0.7037 |
| 175.7115 | 4.0 | 568 | 6.7879 | 0.5943 | 0.7057 | 0.9781 | 0.7188 |
| 175.7115 | 5.0 | 710 | 6.6212 | 0.6087 | 0.7248 | 0.9704 | 0.7148 |
| 175.7115 | 6.0 | 852 | 6.6538 | 0.5936 | 0.7255 | 0.972 | 0.7068 |
| 175.7115 | 7.0 | 994 | 6.5767 | 0.6078 | 0.7298 | 0.9675 | 0.7592 |
| 12.1686 | 8.0 | 1136 | 6.6338 | 0.6246 | 0.7511 | 0.9646 | 0.7952 |
| 12.1686 | 9.0 | 1278 | 6.5246 | 0.598 | 0.7262 | 0.9483 | 0.7194 |
| 12.1686 | 10.0 | 1420 | 6.3852 | 0.6277 | 0.7369 | 0.959 | 0.8146 |
| 11.266 | 11.0 | 1562 | 6.6311 | 0.6092 | 0.7312 | 0.9659 | 0.7514 |
| 11.266 | 12.0 | 1704 | 6.4214 | 0.609 | 0.7333 | 0.9642 | 0.7471 |
| 11.266 | 13.0 | 1846 | 6.4222 | 0.6219 | 0.7319 | 0.9696 | 0.7763 |
| 11.266 | 14.0 | 1988 | 6.5066 | 0.617 | 0.7426 | 0.9636 | 0.7628 |
| 10.6436 | 15.0 | 2130 | 6.5157 | 0.6119 | 0.7248 | 0.9767 | 0.79 |
| 10.6436 | 16.0 | 2272 | 6.4737 | 0.6119 | 0.7348 | 0.9571 | 0.7829 |
| 10.6436 | 17.0 | 2414 | 6.5789 | 0.6326 | 0.7277 | 0.9674 | 0.7798 |
| 10.0889 | 18.0 | 2556 | 6.5001 | 0.6168 | 0.722 | 0.9755 | 0.7703 |
| 10.0889 | 19.0 | 2698 | 6.4702 | 0.614 | 0.7326 | 0.963 | 0.7387 |
| 10.0889 | 20.0 | 2840 | 6.5242 | 0.6151 | 0.7248 | 0.9673 | 0.7568 |
| 10.0889 | 21.0 | 2982 | 6.5797 | 0.6293 | 0.7298 | 0.9738 | 0.7369 |
| 9.4174 | 22.0 | 3124 | 6.5303 | 0.616 | 0.7355 | 0.9669 | 0.7879 |
| 9.4174 | 23.0 | 3266 | 6.7147 | 0.6079 | 0.7262 | 0.9668 | 0.756 |
| 9.4174 | 24.0 | 3408 | 6.6836 | 0.6146 | 0.7319 | 0.9725 | 0.7951 |
| 8.9177 | 25.0 | 3550 | 6.6182 | 0.6048 | 0.7298 | 0.9734 | 0.7162 |
| 8.9177 | 26.0 | 3692 | 6.5351 | 0.6182 | 0.7156 | 0.9713 | 0.7856 |
| 8.9177 | 27.0 | 3834 | 6.3831 | 0.6235 | 0.7262 | 0.9782 | 0.8026 |
| 8.9177 | 28.0 | 3976 | 6.4694 | 0.6194 | 0.7326 | 0.9756 | 0.7649 |
| 8.4449 | 29.0 | 4118 | 6.6127 | 0.6119 | 0.717 | 0.9672 | 0.7576 |
| 8.4449 | 30.0 | 4260 | 6.6870 | 0.6078 | 0.7312 | 0.9562 | 0.7143 |
| 8.4449 | 31.0 | 4402 | 6.7105 | 0.6084 | 0.7163 | 0.9659 | 0.7379 |
| 7.9999 | 32.0 | 4544 | 6.6299 | 0.6091 | 0.717 | 0.9607 | 0.7731 |
| 7.9999 | 33.0 | 4686 | 6.7453 | 0.6062 | 0.7262 | 0.9625 | 0.7409 |
| 7.9999 | 34.0 | 4828 | 6.6493 | 0.6121 | 0.7191 | 0.9643 | 0.7963 |
| 7.9999 | 35.0 | 4970 | 6.8267 | 0.6044 | 0.7213 | 0.9668 | 0.7582 |
| 7.5518 | 36.0 | 5112 | 6.8738 | 0.604 | 0.7156 | 0.9669 | 0.735 |
| 7.5518 | 37.0 | 5254 | 6.8444 | 0.6068 | 0.7113 | 0.967 | 0.7484 |
| 7.5518 | 38.0 | 5396 | 6.8525 | 0.605 | 0.7142 | 0.9667 | 0.7401 |
| 7.1858 | 39.0 | 5538 | 6.8547 | 0.609 | 0.7149 | 0.9683 | 0.7406 |
| 7.1858 | 40.0 | 5680 | 6.8616 | 0.6103 | 0.7156 | 0.9697 | 0.7328 |
### Framework versions

- Transformers 4.57.1
- PyTorch 2.9.0+cu128
- Datasets 4.4.1
- Tokenizers 0.22.1