---
license: cc-by-4.0
datasets:
- imagenet-1k
metrics:
- accuracy
pipeline_tag: image-classification
language:
- en
tags:
- resnet
- convolutional neural network
- simpool
- computer vision
- deep learning
---

# Supervised ResNet-50 model

ResNet-50 model with SimPool (gamma=2.0) trained on ImageNet-1k for 100 epochs.

SimPool is a simple attention-based pooling method applied at the end of the network, released in this [repository](https://github.com/billpsomas/simpool/).
Disclaimer: This model card is written by the author of SimPool, i.e. [Bill Psomas](http://users.ntua.gr/psomasbill/).
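
To try the weights, a minimal loading sketch along these lines should work. The `repo_id` and `filename` below are hypothetical placeholders, not taken from this card; check the "Files and versions" tab of this repository for the actual names.

```python
# Minimal sketch: fetch the checkpoint and peek at its contents.
# NOTE: repo_id and filename are hypothetical placeholders.
import torch
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="billpsomas/resnet50-simpool",  # hypothetical repo id
    filename="checkpoint.pth",              # hypothetical filename
)
ckpt = torch.load(ckpt_path, map_location="cpu")

# Training scripts often wrap the weights, e.g. under a "state_dict" key.
state_dict = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt
print(sorted(state_dict)[:5])  # a few parameter names
```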

## Motivation

Convolutional networks and vision transformers have different forms of pairwise interactions, pooling across layers and pooling at the end of the network. Does the latter really need to be different?
As a by-product of pooling, vision transformers provide spatial attention for free, but this is most often of low quality unless self-supervised, which is not well studied. Is supervision really the problem?

## Method

SimPool is a simple attention-based pooling mechanism that replaces the default pooling in both convolutional and transformer encoders. For transformers, we completely discard the [CLS] token.
Interestingly, we find that, whether supervised or self-supervised, SimPool improves performance on pre-training and downstream tasks and provides attention maps delineating object boundaries in all cases.
One could thus call SimPool universal.
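
To make the mechanism concrete, below is a paraphrased PyTorch sketch of a SimPool-style head, not the official implementation (see the repository above for that): a GAP-initialized query attends over the feature map, and the gamma parameter acts as an element-wise power on the (non-negative) values.

```python
import torch
import torch.nn as nn

class SimPoolSketch(nn.Module):
    """Paraphrased sketch of SimPool-style attention pooling (unofficial)."""

    def __init__(self, dim: int, gamma: float = 2.0, eps: float = 1e-6):
        super().__init__()
        self.wq = nn.Linear(dim, dim, bias=False)  # query projection
        self.wk = nn.Linear(dim, dim, bias=False)  # key projection
        self.gamma = gamma
        self.eps = eps
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, N, D) tokens, e.g. a flattened conv feature map
        q = self.wq(x.mean(dim=1, keepdim=True))       # GAP-initialized query, (B, 1, D)
        k = self.wk(x)                                 # keys, (B, N, D)
        attn = (q @ k.transpose(-2, -1)) * self.scale  # attention logits, (B, 1, N)
        attn = attn.softmax(dim=-1)                    # spatial attention map
        v = x.clamp(min=self.eps) ** self.gamma        # gamma as a power on values
        z = (attn @ v).squeeze(1) ** (1.0 / self.gamma)
        return z                                       # pooled vector, (B, D)

# Example: pool a ResNet-50 final feature map of shape (B, 2048, 7, 7).
feats = torch.relu(torch.randn(2, 2048, 7, 7))         # post-ReLU, so non-negative
tokens = feats.flatten(2).transpose(1, 2)              # (B, 49, 2048)
print(SimPoolSketch(dim=2048)(tokens).shape)           # torch.Size([2, 2048])
```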

## BibTeX entry and citation info

```
@misc{psomas2023simpool,
      title={Keep It SimPool: Who Said Supervised Transformers Suffer from Attention Deficit?},
      author={Bill Psomas and Ioannis Kakogeorgiou and Konstantinos Karantzalos and Yannis Avrithis},
      year={2023},
      eprint={2309.06891},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```