CC BY
Source arXiv
Computer Vision
Image and Video Processing
NAM: Normalization-based Attention Module
Recognition of less salient features is key to model compression. However, it has not been investigated in the revolutionary attention mechanisms. In this work, we propose a novel normalization-based attention module (NAM), which suppresses less salient weights. It applies a weight sparsity penalty to the attention modules, making them more computationally efficient while retaining similar performance. A comparison with three other attention mechanisms on both ResNet and MobileNet indicates that our method achieves higher accuracy.
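To make the idea concrete, here is a minimal NumPy sketch of normalization-based channel attention in the spirit described by the abstract. It assumes (as an illustration, not the authors' exact formulation) that the per-channel scale factors `gamma` learned by batch normalization measure channel salience, so channels with small scale factors are suppressed via a normalized weighting followed by a sigmoid gate.

```python
import numpy as np

def nam_channel_attention(x, gamma):
    """Sketch of normalization-based channel attention.

    x:     feature map of shape (C, H, W)
    gamma: per-channel batch-norm scale factors, shape (C,)

    Channels with small |gamma| are treated as less salient
    and receive smaller attention weights.
    """
    # Normalize the scale factors so they sum to 1 across channels.
    weights = np.abs(gamma) / np.abs(gamma).sum()
    # Reweight each channel of the feature map.
    scaled = x * weights[:, None, None]
    # Sigmoid gating produces the attended output.
    return x * (1.0 / (1.0 + np.exp(-scaled)))

# Toy example: 3 channels, 2x2 spatial resolution.
x = np.ones((3, 2, 2))
gamma = np.array([0.1, 1.0, 2.0])  # hypothetical BN scale factors
out = nam_channel_attention(x, gamma)
```

In this toy run the channel with the largest scale factor (`gamma = 2.0`) keeps the strongest response, while the channel with `gamma = 0.1` is attenuated toward the sigmoid midpoint, illustrating how less salient weights are suppressed.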
Authors
Yichao Liu, Zongru Shao, Yueyang Teng, Nico Hoffmann
Related Topics
Attention mechanisms
Attention modules
NAM
Normalization-based attention module