# detr_finetuned_cppe5
This model is a fine-tuned version of facebook/detr-resnet-101 on the CPPE-5 dataset. It achieves the following results on the evaluation set:
- Loss: 1.4203
- Map: 0.211
- Map 50: 0.4148
- Map 75: 0.1838
- Map Small: 0.0751
- Map Medium: 0.1771
- Map Large: 0.3113
- Mar 1: 0.2413
- Mar 10: 0.4141
- Mar 100: 0.4369
- Mar Small: 0.1463
- Mar Medium: 0.3822
- Mar Large: 0.589
- Map Coverall: 0.4919
- Mar 100 Coverall: 0.6775
- Map Face Shield: 0.116
- Mar 100 Face Shield: 0.4051
- Map Gloves: 0.126
- Mar 100 Gloves: 0.4067
- Map Goggles: 0.0542
- Mar 100 Goggles: 0.3062
- Map Mask: 0.2667
- Mar 100 Mask: 0.3889
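A minimal inference sketch, assuming the checkpoint is published as `sars973/detr_finetuned_cppe5` with its image processor saved alongside it; the image URL is only a placeholder:

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "sars973/detr_finetuned_cppe5"  # repo id assumed from this card
image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Placeholder image; substitute any RGB image
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/pipeline-cat-chonk.jpeg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw DETR outputs to (score, label, box) predictions above a confidence threshold
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {[round(c, 1) for c in box.tolist()]}")
```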
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- num_epochs: 30
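For reference, a sketch of how these hyperparameters map onto `TrainingArguments`; the `output_dir` and any setting not listed above are assumptions, not taken from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",  # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```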
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 107 | 2.4645 | 0.0227 | 0.0572 | 0.016 | 0.004 | 0.0095 | 0.0259 | 0.042 | 0.122 | 0.1707 | 0.0328 | 0.1149 | 0.2212 | 0.1014 | 0.5545 | 0.0 | 0.0 | 0.0031 | 0.1335 | 0.0 | 0.0 | 0.0091 | 0.1653 |
No log | 2.0 | 214 | 2.1804 | 0.0448 | 0.1044 | 0.0358 | 0.0148 | 0.0316 | 0.0467 | 0.0628 | 0.1692 | 0.2066 | 0.0811 | 0.1648 | 0.2201 | 0.177 | 0.5851 | 0.0 | 0.0 | 0.0159 | 0.2134 | 0.0 | 0.0 | 0.031 | 0.2347 |
No log | 3.0 | 321 | 2.5284 | 0.0155 | 0.0479 | 0.0083 | 0.0035 | 0.0205 | 0.0156 | 0.0308 | 0.1059 | 0.1257 | 0.0295 | 0.0967 | 0.1519 | 0.0652 | 0.3986 | 0.0 | 0.0 | 0.003 | 0.1237 | 0.0 | 0.0 | 0.0093 | 0.1062 |
No log | 4.0 | 428 | 2.5653 | 0.0495 | 0.1184 | 0.034 | 0.0134 | 0.0368 | 0.0532 | 0.0689 | 0.1406 | 0.1473 | 0.0536 | 0.1184 | 0.1448 | 0.1843 | 0.4559 | 0.0 | 0.0 | 0.0156 | 0.0982 | 0.0 | 0.0 | 0.0475 | 0.1827 |
2.1715 | 5.0 | 535 | 2.0014 | 0.0562 | 0.1325 | 0.0429 | 0.0107 | 0.0506 | 0.062 | 0.071 | 0.1755 | 0.208 | 0.0996 | 0.1643 | 0.2142 | 0.2171 | 0.5464 | 0.004 | 0.0051 | 0.0111 | 0.2196 | 0.0 | 0.0 | 0.0488 | 0.2689 |
2.1715 | 6.0 | 642 | 2.0494 | 0.0742 | 0.1766 | 0.0547 | 0.0253 | 0.0715 | 0.0841 | 0.0918 | 0.1821 | 0.1947 | 0.0776 | 0.147 | 0.2081 | 0.2231 | 0.5293 | 0.014 | 0.0101 | 0.037 | 0.1933 | 0.0 | 0.0 | 0.0969 | 0.2409 |
2.1715 | 7.0 | 749 | 1.9789 | 0.0762 | 0.1757 | 0.0609 | 0.0175 | 0.0563 | 0.0979 | 0.0947 | 0.1935 | 0.2099 | 0.0754 | 0.14 | 0.2697 | 0.2697 | 0.5482 | 0.0068 | 0.0405 | 0.0326 | 0.2263 | 0.0 | 0.0 | 0.072 | 0.2347 |
2.1715 | 8.0 | 856 | 1.7979 | 0.1115 | 0.2421 | 0.0963 | 0.0385 | 0.0838 | 0.1233 | 0.1205 | 0.2413 | 0.2577 | 0.1064 | 0.2004 | 0.2981 | 0.366 | 0.6 | 0.0318 | 0.1734 | 0.0442 | 0.2375 | 0.0 | 0.0 | 0.1154 | 0.2778 |
2.1715 | 9.0 | 963 | 1.7814 | 0.106 | 0.2485 | 0.0782 | 0.0298 | 0.0914 | 0.1249 | 0.1282 | 0.2654 | 0.291 | 0.1279 | 0.2449 | 0.3136 | 0.3169 | 0.6149 | 0.0396 | 0.2443 | 0.0369 | 0.2661 | 0.0053 | 0.0277 | 0.1316 | 0.3022 |
1.6797 | 10.0 | 1070 | 1.7592 | 0.12 | 0.2686 | 0.09 | 0.0538 | 0.0951 | 0.1547 | 0.1463 | 0.2745 | 0.2955 | 0.1219 | 0.2298 | 0.3636 | 0.348 | 0.5896 | 0.0404 | 0.2418 | 0.0448 | 0.2746 | 0.0066 | 0.0815 | 0.16 | 0.2902 |
1.6797 | 11.0 | 1177 | 1.6620 | 0.1444 | 0.3101 | 0.118 | 0.0517 | 0.107 | 0.1972 | 0.1605 | 0.3127 | 0.3286 | 0.109 | 0.2746 | 0.4274 | 0.4222 | 0.636 | 0.0764 | 0.3304 | 0.0504 | 0.2683 | 0.0125 | 0.0846 | 0.1602 | 0.3236 |
1.6797 | 12.0 | 1284 | 1.6521 | 0.1496 | 0.3058 | 0.1316 | 0.0668 | 0.1276 | 0.195 | 0.179 | 0.3412 | 0.3661 | 0.1314 | 0.3048 | 0.4696 | 0.3977 | 0.659 | 0.0678 | 0.3316 | 0.0651 | 0.3304 | 0.0155 | 0.1677 | 0.2017 | 0.3418 |
1.6797 | 13.0 | 1391 | 1.6103 | 0.1557 | 0.3242 | 0.1339 | 0.0619 | 0.1241 | 0.2161 | 0.1805 | 0.3549 | 0.3805 | 0.1434 | 0.3194 | 0.49 | 0.4414 | 0.6509 | 0.0615 | 0.3734 | 0.0656 | 0.3049 | 0.0273 | 0.2446 | 0.1827 | 0.3284 |
1.6797 | 14.0 | 1498 | 1.5562 | 0.1555 | 0.331 | 0.1264 | 0.0774 | 0.1244 | 0.2154 | 0.1817 | 0.3534 | 0.3801 | 0.1578 | 0.3145 | 0.4972 | 0.4226 | 0.6482 | 0.0506 | 0.3278 | 0.0645 | 0.3357 | 0.025 | 0.2231 | 0.215 | 0.3658 |
1.4442 | 15.0 | 1605 | 1.5950 | 0.1646 | 0.3338 | 0.1427 | 0.0654 | 0.1243 | 0.2369 | 0.2009 | 0.3721 | 0.3948 | 0.1344 | 0.3278 | 0.5394 | 0.4499 | 0.6252 | 0.0607 | 0.4101 | 0.0827 | 0.354 | 0.0168 | 0.2462 | 0.2129 | 0.3387 |
1.4442 | 16.0 | 1712 | 1.5378 | 0.1787 | 0.3643 | 0.1506 | 0.0709 | 0.1506 | 0.2531 | 0.2191 | 0.3931 | 0.4166 | 0.1542 | 0.3543 | 0.5625 | 0.4758 | 0.6631 | 0.0755 | 0.4241 | 0.0807 | 0.3272 | 0.0301 | 0.3092 | 0.2316 | 0.3596 |
1.4442 | 17.0 | 1819 | 1.5125 | 0.1918 | 0.3755 | 0.1708 | 0.0788 | 0.1531 | 0.2661 | 0.223 | 0.3882 | 0.413 | 0.1521 | 0.3452 | 0.5525 | 0.4683 | 0.6541 | 0.1277 | 0.3949 | 0.0788 | 0.3647 | 0.0524 | 0.3031 | 0.2319 | 0.348 |
1.4442 | 18.0 | 1926 | 1.5578 | 0.1828 | 0.3717 | 0.1554 | 0.0728 | 0.1433 | 0.2664 | 0.2173 | 0.3768 | 0.4014 | 0.1351 | 0.3382 | 0.5489 | 0.4573 | 0.645 | 0.1133 | 0.4013 | 0.0831 | 0.3719 | 0.0463 | 0.2662 | 0.214 | 0.3227 |
1.2711 | 19.0 | 2033 | 1.5281 | 0.183 | 0.3667 | 0.1594 | 0.0701 | 0.1424 | 0.269 | 0.2141 | 0.382 | 0.4056 | 0.1545 | 0.3369 | 0.5538 | 0.4556 | 0.6459 | 0.102 | 0.3759 | 0.0856 | 0.3804 | 0.0421 | 0.2723 | 0.2295 | 0.3533 |
1.2711 | 20.0 | 2140 | 1.4865 | 0.1904 | 0.3761 | 0.1706 | 0.0691 | 0.1571 | 0.2782 | 0.2229 | 0.3888 | 0.4176 | 0.147 | 0.3621 | 0.5556 | 0.4628 | 0.6491 | 0.1048 | 0.3962 | 0.1006 | 0.3929 | 0.0512 | 0.2923 | 0.2326 | 0.3578 |
1.2711 | 21.0 | 2247 | 1.4419 | 0.1998 | 0.3915 | 0.1805 | 0.0764 | 0.1666 | 0.2851 | 0.2175 | 0.402 | 0.426 | 0.1665 | 0.3647 | 0.5649 | 0.484 | 0.6622 | 0.097 | 0.3835 | 0.1045 | 0.4129 | 0.053 | 0.2815 | 0.2604 | 0.3898 |
1.2711 | 22.0 | 2354 | 1.4334 | 0.2005 | 0.3988 | 0.1731 | 0.0784 | 0.1593 | 0.2923 | 0.2251 | 0.4072 | 0.4286 | 0.1525 | 0.3701 | 0.5665 | 0.4877 | 0.6662 | 0.1069 | 0.4051 | 0.1085 | 0.392 | 0.0421 | 0.2908 | 0.2574 | 0.3889 |
1.2711 | 23.0 | 2461 | 1.4424 | 0.1944 | 0.3864 | 0.1668 | 0.0766 | 0.1589 | 0.2875 | 0.2263 | 0.3953 | 0.4195 | 0.1527 | 0.3595 | 0.5552 | 0.4836 | 0.6676 | 0.0897 | 0.3924 | 0.1088 | 0.3929 | 0.0426 | 0.2738 | 0.2471 | 0.3707 |
1.1557 | 24.0 | 2568 | 1.4330 | 0.1985 | 0.3946 | 0.1749 | 0.0721 | 0.1629 | 0.2972 | 0.2302 | 0.4083 | 0.4291 | 0.1512 | 0.3689 | 0.5678 | 0.4795 | 0.6653 | 0.0962 | 0.3899 | 0.1172 | 0.4027 | 0.0416 | 0.3015 | 0.2579 | 0.3862 |
1.1557 | 25.0 | 2675 | 1.4414 | 0.2055 | 0.4042 | 0.1823 | 0.0726 | 0.1736 | 0.2998 | 0.2395 | 0.4119 | 0.4336 | 0.1429 | 0.3867 | 0.5743 | 0.4858 | 0.6671 | 0.0996 | 0.4051 | 0.128 | 0.4031 | 0.0504 | 0.3031 | 0.2637 | 0.3898 |
1.1557 | 26.0 | 2782 | 1.4282 | 0.2074 | 0.4059 | 0.1823 | 0.0727 | 0.1728 | 0.3052 | 0.2407 | 0.4168 | 0.4361 | 0.1469 | 0.3789 | 0.5882 | 0.4905 | 0.6766 | 0.1144 | 0.4089 | 0.126 | 0.4071 | 0.0521 | 0.3046 | 0.2542 | 0.3831 |
1.1557 | 27.0 | 2889 | 1.4217 | 0.2071 | 0.404 | 0.1879 | 0.0721 | 0.1745 | 0.3031 | 0.2395 | 0.4148 | 0.4359 | 0.1493 | 0.3809 | 0.5853 | 0.4924 | 0.6806 | 0.115 | 0.4051 | 0.1235 | 0.4134 | 0.0481 | 0.2954 | 0.2564 | 0.3849 |
1.1557 | 28.0 | 2996 | 1.4227 | 0.21 | 0.4135 | 0.1839 | 0.0737 | 0.1748 | 0.309 | 0.2423 | 0.4133 | 0.4385 | 0.1474 | 0.3828 | 0.5933 | 0.4903 | 0.6757 | 0.1144 | 0.4051 | 0.1268 | 0.4089 | 0.0532 | 0.3108 | 0.2652 | 0.392 |
1.0809 | 29.0 | 3103 | 1.4192 | 0.2106 | 0.4139 | 0.1833 | 0.0746 | 0.1768 | 0.3106 | 0.2414 | 0.4134 | 0.4373 | 0.146 | 0.3822 | 0.5903 | 0.4915 | 0.6779 | 0.1161 | 0.4051 | 0.1261 | 0.4076 | 0.0537 | 0.3077 | 0.2654 | 0.388 |
1.0809 | 30.0 | 3210 | 1.4203 | 0.211 | 0.4148 | 0.1838 | 0.0751 | 0.1771 | 0.3113 | 0.2413 | 0.4141 | 0.4369 | 0.1463 | 0.3822 | 0.589 | 0.4919 | 0.6775 | 0.116 | 0.4051 | 0.126 | 0.4067 | 0.0542 | 0.3062 | 0.2667 | 0.3889 |
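The columns above are COCO-style detection metrics: mAP at different IoU thresholds and object sizes, recall at 1/10/100 detections, plus per-class values. A minimal sketch of how such metrics can be computed with `torchmetrics` — an assumption, since the card does not include the original evaluation code:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# Toy single-image example; a real evaluation would loop over the validation set.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),  # xyxy, absolute pixels
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision(class_metrics=True)  # enables the per-class map/mar columns
metric.update(preds, targets)
print(metric.compute())  # keys like map, map_50, map_75, mar_1, mar_100, map_per_class, ...
```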
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0