Image and Video Processing
NAM: Normalization-based Attention Module
Recognizing less salient features is key to model compression.
However, this has not been investigated in the revolutionary attention mechanisms.
In this work, we propose a novel normalization-based attention module (NAM), which suppresses less salient weights.
It applies a weight sparsity penalty to the attention modules, making them more computationally efficient while retaining similar performance.
A comparison with three other attention mechanisms on both ResNet and MobileNet indicates that our method achieves higher accuracy.
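The core idea can be illustrated with a minimal sketch: normalization scale factors measure each channel's variance contribution, so channels with larger factors are treated as more salient and gated more strongly. The function names, the 3-channel example, and the specific gating form below are illustrative assumptions, not the paper's exact implementation.

```python
import math

def nam_channel_weights(gamma):
    # Normalize the batch-norm scale factors into per-channel attention
    # weights: channels with larger |gamma| are considered more salient.
    total = sum(abs(g) for g in gamma)
    return [abs(g) / total for g in gamma]

def apply_channel_attention(features, gamma):
    # Gate each channel's features with a sigmoid of the weighted activation,
    # so channels with small normalization factors are suppressed.
    weights = nam_channel_weights(gamma)
    out = []
    for channel, w in zip(features, weights):
        out.append([f * (1.0 / (1.0 + math.exp(-w * f))) for f in channel])
    return out

# Hypothetical batch-norm scale factors for a 3-channel feature map.
gamma = [0.9, 0.05, 0.5]
features = [[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]]
attended = apply_channel_attention(features, gamma)
```

Because channel 1 has the smallest scale factor, its activations are attenuated the most; a sparsity penalty on `gamma` during training would push such weights further toward zero.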
Yichao Liu, Zongru Shao, Yueyang Teng, Nico Hoffmann