
attention map #2

Open
yujialele opened this issue Feb 1, 2020 · 3 comments

Comments

@yujialele

Hello, I would like to ask you a few questions.
(1) In the resnet50 implementation, the code uses 32 attention maps, and each attention map is multiplied with the convolutional features from the last bottleneck. My question is: how are the attention maps generated?
(2) Could a method like improved bilinear pooling be used to improve the code?

@wvinzh
Owner

wvinzh commented Feb 24, 2020

Regarding resnet50: I only implemented the functionality and did not run experiments to verify its performance, so treat it as a reference only. (I was locked down for a long time and could not reply sooner.)
(1) Yes, each attention map is multiplied element-wise with the feature maps. The attention maps are generated by a 1x1 convolution; I directly take the output of an intermediate 1x1 convolution layer rather than adding a separate one.
(2) That would need to be tried; feel free to experiment with it.
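For reference, here is a minimal sketch of what that description amounts to: 32 attention maps produced by a 1x1 convolution, each multiplied element-wise with the bottleneck features and pooled (bilinear attention pooling). The layer names, shapes, and the standalone `attention_conv` layer are illustrative assumptions, not the repository's exact code (in the repo the maps are reportedly taken from an existing 1x1 conv inside the last bottleneck).

```python
# Sketch of bilinear attention pooling (BAP) with 32 attention maps.
# Assumes 2048-channel features from the last ResNet50 bottleneck;
# names and the extra 1x1 conv here are hypothetical, for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BAP(nn.Module):
    def __init__(self, in_channels=2048, num_attention=32):
        super().__init__()
        # 1x1 conv producing the attention maps. In the repo the maps are
        # taken from an intermediate 1x1 conv output instead of a new layer.
        self.attention_conv = nn.Conv2d(in_channels, num_attention, kernel_size=1)

    def forward(self, features):
        # features: (B, C, H, W) from the backbone
        attentions = torch.relu(self.attention_conv(features))  # (B, M, H, W)
        B, M, H, W = attentions.shape
        # Multiply every attention map with every feature channel, then
        # global-average-pool over H, W -> part features of shape (B, M, C).
        parts = torch.einsum('bmhw,bchw->bmc', attentions, features) / (H * W)
        # Signed square-root + L2 normalization, as is common for bilinear pooling.
        parts = torch.sign(parts) * torch.sqrt(torch.abs(parts) + 1e-12)
        return F.normalize(parts.view(B, -1), dim=1)  # (B, M * C)


# Usage (illustrative): feats = backbone(images)  -> (B, 2048, 7, 7)
# descriptor = BAP()(feats)                       -> (B, 32 * 2048)
```

The improved bilinear pooling idea from question (2) would mainly change the normalization of `parts` (e.g. matrix square-root normalization); the attention-weighted pooling above would stay the same.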

@yujialeimustudent

@wvinzh, thank you for your answer. I ran the experiments and the results are good. Best wishes.

@LawrenceXia2008

LawrenceXia2008 commented Apr 20, 2020

@wvinzh, thank you for your answer. I ran the experiments and the results are good. Best wishes.

Hello, do you mean that the ResNet50-BAP experiments gave good results?
