lw-detr-medium-tray-detection-hub-init

This model is a fine-tuned version of AnnaZhang/lwdetr_medium_60e_coco on the nielsr/tray-cart-detection dataset. It achieves the following results on the evaluation set:

  • Loss: 9.3240
  • Map: 0.4578
  • Map 50: 0.7573
  • Map 75: 0.4780
  • Map Small: 0.6219
  • Map Medium: 0.4323
  • Map Large: 0.6013
  • Mar 1: 0.0666
  • Mar 10: 0.3331
  • Mar 100: 0.5372
  • Mar Small: 0.6238
  • Mar Medium: 0.5074
  • Mar Large: 0.7371
  • Map Per Class: -1.0
  • Mar 100 Per Class: -1.0
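
The card does not include a usage example. As a minimal sketch, and assuming the checkpoint loads through the generic Transformers object-detection auto classes (the file name example.jpg is only a placeholder), inference would look roughly like this:

```python
# Minimal inference sketch (assumptions: this repo id is correct and the checkpoint
# is loadable through the generic object-detection auto classes).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "nielsr/lw-detr-medium-tray-detection-hub-init"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to (score, label, box) in pixel coordinates of the input image.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), [round(c, 1) for c in box.tolist()])
```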

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0001
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 4
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 300.0
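
Expressed as Transformers TrainingArguments, these settings correspond roughly to the sketch below; this is an assumption about the training setup (standard Trainer, argument names as in recent Transformers releases), not the exact script that was used.

```python
# Hyperparameter sketch mirroring the list above (assumption: training used the
# standard Hugging Face Trainer; the actual training script is not part of this card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lw-detr-medium-tray-detection-hub-init",
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,  # effective train batch size: 2 * 2 = 4
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=300.0,
    eval_strategy="epoch",  # the results table below reports one evaluation per epoch
)
```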

Training results

Training Loss Epoch Step Validation Loss Map Map 50 Map 75 Map Small Map Medium Map Large Mar 1 Mar 10 Mar 100 Mar Small Mar Medium Mar Large Map Per Class Mar 100 Per Class
8.1848 1.0 25 7.9812 0.1435 0.3363 0.1109 0.4139 0.1333 0.2534 0.0251 0.1157 0.2631 0.5857 0.2184 0.4740 -1.0 -1.0
5.8618 2.0 50 8.2678 0.2526 0.4962 0.2112 0.4874 0.2491 0.3527 0.0423 0.1901 0.3557 0.5119 0.3278 0.5104 -1.0 -1.0
5.1454 3.0 75 8.5504 0.2988 0.5869 0.2618 0.5210 0.2907 0.3796 0.0424 0.2381 0.4039 0.6048 0.3726 0.5627 -1.0 -1.0
4.7482 4.0 100 8.6991 0.3458 0.6480 0.3359 0.5575 0.3369 0.4219 0.0475 0.2574 0.4483 0.6024 0.4198 0.6106 -1.0 -1.0
4.5330 5.0 125 8.3971 0.3721 0.6832 0.3540 0.5741 0.3682 0.4560 0.0540 0.2843 0.4978 0.6333 0.4800 0.5878 -1.0 -1.0
4.1680 6.0 150 8.6646 0.3952 0.7328 0.3653 0.5804 0.3761 0.5275 0.0585 0.2848 0.4699 0.6119 0.4481 0.5852 -1.0 -1.0
4.2098 7.0 175 9.0848 0.3944 0.7143 0.3776 0.5974 0.3830 0.4835 0.0524 0.2734 0.4757 0.6119 0.4600 0.5490 -1.0 -1.0
3.9427 8.0 200 8.4698 0.3977 0.7213 0.3679 0.5200 0.3686 0.5834 0.0572 0.2828 0.4775 0.5548 0.4461 0.6902 -1.0 -1.0
3.8337 9.0 225 9.0207 0.3865 0.6856 0.4031 0.6180 0.3788 0.4956 0.0589 0.2715 0.4838 0.6333 0.4621 0.6002 -1.0 -1.0
3.7943 10.0 250 8.9388 0.4012 0.7208 0.3869 0.5656 0.3900 0.4790 0.0517 0.3104 0.4934 0.5833 0.4758 0.6009 -1.0 -1.0
3.7726 11.0 275 9.2610 0.3941 0.7179 0.3722 0.5896 0.3823 0.4907 0.0481 0.2932 0.4751 0.5905 0.4543 0.5923 -1.0 -1.0
3.5588 12.0 300 9.1557 0.3897 0.7229 0.3494 0.6089 0.3730 0.5256 0.0480 0.2957 0.4950 0.6238 0.4745 0.6066 -1.0 -1.0
3.4384 13.0 325 8.9558 0.4018 0.7064 0.3752 0.5532 0.3983 0.4603 0.0504 0.3014 0.4933 0.5571 0.4783 0.5924 -1.0 -1.0
3.4022 14.0 350 9.3375 0.4079 0.7364 0.4034 0.5751 0.3916 0.5041 0.0518 0.2976 0.4990 0.5857 0.4700 0.6902 -1.0 -1.0
3.3132 15.0 375 8.7625 0.4124 0.7299 0.3900 0.6249 0.3923 0.5530 0.0563 0.3020 0.4983 0.6310 0.4747 0.6368 -1.0 -1.0
3.2947 16.0 400 8.9514 0.4222 0.7418 0.4335 0.5266 0.4128 0.5114 0.0537 0.3031 0.5168 0.5690 0.5060 0.5974 -1.0 -1.0
3.3214 17.0 425 9.2974 0.3746 0.7435 0.3315 0.5135 0.3558 0.5423 0.0471 0.2816 0.4784 0.5190 0.4529 0.6639 -1.0 -1.0
3.2517 18.0 450 8.5708 0.4240 0.7304 0.4626 0.6015 0.4174 0.5148 0.0624 0.3235 0.5256 0.6024 0.4988 0.7055 -1.0 -1.0
3.0901 19.0 475 8.8775 0.4104 0.7342 0.3616 0.5639 0.3993 0.5030 0.0528 0.2901 0.5014 0.5810 0.4819 0.6279 -1.0 -1.0
3.0710 20.0 500 8.7070 0.3996 0.7125 0.3801 0.6014 0.3778 0.5304 0.0527 0.2933 0.5012 0.6190 0.4777 0.6424 -1.0 -1.0
3.0758 21.0 525 8.7404 0.4118 0.7191 0.4078 0.5816 0.4069 0.4996 0.0521 0.2981 0.5107 0.5833 0.4896 0.6473 -1.0 -1.0
2.9934 22.0 550 9.0060 0.4213 0.7281 0.4236 0.5762 0.4049 0.5164 0.0496 0.3129 0.5064 0.5857 0.4866 0.6318 -1.0 -1.0
2.8724 23.0 575 9.2333 0.4262 0.7492 0.3845 0.5668 0.4159 0.5020 0.0519 0.3046 0.5173 0.5667 0.5026 0.6175 -1.0 -1.0
2.8296 24.0 600 9.2622 0.4243 0.7391 0.4183 0.5527 0.4098 0.5511 0.0552 0.3105 0.5106 0.5548 0.4875 0.6740 -1.0 -1.0
2.8128 25.0 625 8.8469 0.4332 0.7532 0.4067 0.5649 0.4155 0.5792 0.0628 0.3154 0.5220 0.5881 0.4947 0.7082 -1.0 -1.0
2.8109 26.0 650 9.2135 0.4067 0.7041 0.4218 0.5677 0.4093 0.5233 0.0617 0.3198 0.5268 0.5833 0.5047 0.6823 -1.0 -1.0
2.8434 27.0 675 8.9676 0.4297 0.7278 0.4447 0.5748 0.4106 0.5597 0.0555 0.3107 0.5159 0.5833 0.4997 0.6161 -1.0 -1.0
2.7521 28.0 700 9.1216 0.4318 0.7418 0.4410 0.6021 0.4245 0.5261 0.0554 0.3018 0.5228 0.6071 0.5070 0.6202 -1.0 -1.0
2.7126 29.0 725 9.1676 0.4143 0.7209 0.4215 0.5990 0.3947 0.5432 0.0578 0.3165 0.5156 0.6000 0.4858 0.7115 -1.0 -1.0
2.7230 30.0 750 9.4190 0.3812 0.6908 0.3748 0.3792 0.3741 0.5315 0.0542 0.2857 0.4743 0.3810 0.4618 0.6188 -1.0 -1.0
2.8132 31.0 775 9.3240 0.4578 0.7573 0.4780 0.6219 0.4323 0.6013 0.0666 0.3331 0.5372 0.6238 0.5074 0.7371 -1.0 -1.0
2.6837 32.0 800 9.1286 0.4207 0.7357 0.4057 0.5849 0.3971 0.5545 0.0571 0.3127 0.5144 0.5952 0.4970 0.6216 -1.0 -1.0
2.5350 33.0 825 9.3508 0.4084 0.7225 0.4179 0.6298 0.3932 0.5131 0.0504 0.3142 0.5089 0.6405 0.4839 0.6532 -1.0 -1.0
2.5510 34.0 850 9.3640 0.4034 0.7085 0.3874 0.5713 0.3815 0.5396 0.0521 0.3233 0.5079 0.5714 0.4822 0.6784 -1.0 -1.0
2.6408 35.0 875 9.6026 0.4253 0.7299 0.4322 0.5708 0.4070 0.5435 0.0548 0.3173 0.5155 0.5857 0.4964 0.6386 -1.0 -1.0
2.4881 36.0 900 9.4144 0.4137 0.7247 0.4147 0.5757 0.3872 0.5581 0.0579 0.3107 0.5124 0.5762 0.4920 0.6498 -1.0 -1.0
2.5174 37.0 925 9.1971 0.4251 0.7162 0.4269 0.5874 0.4031 0.5559 0.0629 0.3126 0.5080 0.5929 0.4893 0.6263 -1.0 -1.0
2.4497 38.0 950 9.1950 0.4299 0.7362 0.4289 0.5774 0.4060 0.6253 0.0650 0.3231 0.5239 0.5786 0.4876 0.7819 -1.0 -1.0
2.3860 39.0 975 9.5015 0.3930 0.6898 0.3825 0.5594 0.3736 0.5632 0.0627 0.3136 0.5085 0.5595 0.4738 0.7579 -1.0 -1.0
2.3519 40.0 1000 9.4528 0.4169 0.7194 0.4121 0.5626 0.3970 0.5680 0.0596 0.3261 0.5174 0.5643 0.4877 0.7301 -1.0 -1.0
2.3449 41.0 1025 9.7886 0.3988 0.7021 0.3926 0.5783 0.3902 0.5016 0.0486 0.2949 0.4947 0.5881 0.4764 0.6070 -1.0 -1.0
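
The metric names above follow COCO-style detection evaluation: Map 50 and Map 75 are mAP at IoU thresholds 0.50 and 0.75, the Mar columns are maximum recall given 1, 10, or 100 detections per image, and the constant -1.0 in the per-class columns means per-class reporting was disabled. Purely as an illustration (it is an assumption that a torchmetrics-style evaluator produced these numbers), the same keys are what torchmetrics' MeanAveragePrecision returns:

```python
# Illustration of the metric keys used above (assumption: a COCO-style evaluator
# such as torchmetrics' MeanAveragePrecision; not necessarily the exact code
# used to produce this card).
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=False)

# Toy prediction/target pair; a real evaluation loops over the validation set.
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 100.0, 120.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
target = [{
    "boxes": torch.tensor([[12.0, 8.0, 98.0, 118.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, target)
results = metric.compute()

# Keys include map, map_50, map_75, map_small/medium/large, mar_1, mar_10,
# mar_100, mar_small/medium/large; with class_metrics=False the per-class
# entries come back as -1, matching the table above.
print({k: float(v) for k, v in results.items() if v.numel() == 1})
```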

Framework versions

  • Transformers 5.3.0.dev0
  • Pytorch 2.10.0+cu128
  • Datasets 4.8.2
  • Tokenizers 0.22.2