Hello, I'd like to ask a couple of questions. (1) In the resnet50 implementation, the number of attention maps is 32, and each attention map is multiplied with the convolutional features from the last bottleneck block. What I'd like to know is how the attention maps are generated. (2) Could the whole code be improved with a method like improved bilinear pooling?
Regarding resnet50: I only implemented the functionality and never ran experiments to verify its performance, so treat it as a reference only. (I was in lockdown for too long and couldn't reply in time.) (1) Yes, each attention map is multiplied with the features correspondingly. The attention maps are generated by a 1x1 convolution; I directly take the output of an intermediate 1x1 convolution layer rather than adding a separate one. (2) That would need to be tested experimentally; feel free to give it a try.
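For readers following the thread, here is a minimal sketch of how the pieces described above might fit together in PyTorch: 32 attention maps are produced by a 1x1 convolution, each is multiplied with the backbone features, and the results are pooled into a part-feature matrix (bilinear attention pooling). Note the maintainer says the repository reuses an existing intermediate 1x1 convolution inside the last bottleneck rather than adding a new layer, so the separate `attention_conv` below, as well as the names `BAP`, `ResNet50BAP`, and `num_attentions`, are illustrative assumptions, not the author's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


class BAP(nn.Module):
    """Bilinear attention pooling: pool backbone features with each attention map."""

    def forward(self, features, attentions):
        # features:   (B, C, H, W) backbone feature maps
        # attentions: (B, M, H, W) attention maps (M = 32 here)
        B, C, H, W = features.shape
        # Multiply every attention map with every feature channel and
        # average over the spatial dimensions -> (B, M, C).
        feature_matrix = torch.einsum('bmhw,bchw->bmc', attentions, features) / (H * W)
        # Flatten, then signed sqrt + L2 normalisation (the usual BAP post-processing).
        feature_matrix = feature_matrix.reshape(B, -1)
        feature_matrix = torch.sign(feature_matrix) * torch.sqrt(torch.abs(feature_matrix) + 1e-12)
        return F.normalize(feature_matrix, dim=-1)


class ResNet50BAP(nn.Module):
    def __init__(self, num_classes, num_attentions=32):
        super().__init__()
        backbone = resnet50(pretrained=True)
        # Keep everything up to the last bottleneck stage -> (B, 2048, H, W).
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        # Illustrative: a 1x1 conv producing the attention maps
        # (the repo instead reuses an intermediate 1x1 conv's output).
        self.attention_conv = nn.Conv2d(2048, num_attentions, kernel_size=1)
        self.bap = BAP()
        self.fc = nn.Linear(num_attentions * 2048, num_classes)

    def forward(self, x):
        feats = self.features(x)                       # (B, 2048, H, W)
        attn = torch.relu(self.attention_conv(feats))  # (B, 32, H, W)
        pooled = self.bap(feats, attn)                 # (B, 32 * 2048)
        return self.fc(pooled)
```

The signed-sqrt and L2 normalisation lines are also the point where an "improved bilinear pooling" style normalisation (e.g. matrix square-root normalisation) could be swapped in, if one wanted to experiment with question (2).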
@wvinzh, thank you for your answer. I ran the experiments and the results are good. Best wishes.
Hello, do you mean the ResNet50-BAP experiments gave good results?