
Detailed record

Generalized Gradient Flow Based Saliency for Pruning Deep Convolutional Neural Networks

Article written by: Chen, Zhen; Li, Baopu; Yuan, Yixuan; Liu, Xinyu

Abstract: Model filter pruning has proven effective in compressing deep convolutional neural networks by removing unimportant filters without sacrificing performance. However, most existing criteria are empirical and overlook the relationship between channel saliencies and the non-linear activation functions within the networks. To address these problems, we propose a novel channel pruning method coined gradient flow based saliency (GFBS). Instead of relying on the magnitudes of the entire feature maps, GFBS evaluates channel saliencies from the gradient flow perspective and only requires the information in the normalization and activation layers. Concretely, we first integrate the effects of normalization and ReLU activation layers into convolutional layers based on Taylor expansion. Then, through backpropagation, the channel saliency of each layer is indicated by the first-order Taylor polynomial of the scaling parameter and the signed shifting parameter in the normalization layers. To validate the efficiency and generalization ability of GFBS, we conduct extensive experiments on various tasks, including image classification (CIFAR, ImageNet), image denoising, object detection, and 3D object classification. GFBS can be readily applied to the baseline networks and compresses them with only a negligible performance drop. Moreover, we extend our method to pruning networks from scratch, and GFBS is capable of identifying subnetworks whose performance is comparable to the baseline model at an early training stage.
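
The abstract suggests that channels can be scored after a single backward pass by combining the normalization layer's scale (gamma) and shift (beta) parameters with their gradients as first-order Taylor terms. Below is a minimal PyTorch sketch of that idea; the exact combination rule and the helper name gfbs_channel_saliency are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

def gfbs_channel_saliency(model: nn.Module) -> dict:
    """Score each BatchNorm channel with first-order Taylor terms of
    its scale (gamma) and shift (beta) parameters. Assumes a backward
    pass has already populated .grad on the BN parameters."""
    saliencies = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.BatchNorm2d):
            gamma, beta = module.weight, module.bias
            # Assumed combination rule: |gamma * dL/dgamma + beta * dL/dbeta|,
            # i.e. the magnitude of the first-order Taylor terms per channel.
            score = (gamma * gamma.grad + beta * beta.grad).abs()
            saliencies[name] = score.detach()
    return saliencies

# Usage: one mini-batch forward/backward, then rank channels.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.BatchNorm2d(16),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
scores = gfbs_channel_saliency(model)
# Channels with the smallest scores are candidates for pruning.
```

Note that this sketch only reads the normalization layers' parameters and gradients, consistent with the abstract's claim that GFBS needs no feature-map magnitudes.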


Language: English
Subject: Computer Science

Keywords:
Image classification
Normalization
Network architecture
Gradient flow
Model pruning


Contents