Beyond Network Pruning: a Joint Search-and-Training Approach
Xiaotong Lu1, Han Huang1, Weisheng Dong1*, Xin Li2, Guangming Shi1
1School of Artificial Intelligence, Xidian University  2West Virginia University
Figure 1. Overview of our joint search-and-training approach. The 'sampler' searches for compact sub-networks from the target network, while the 'updater' maps the trained sub-networks back to the target network. The best-performing sub-network is further fine-tuned as the final output.
Abstract
Network pruning has been proposed as a remedy for alleviating the over-parameterization problem of deep neural networks. However, its value has recently been challenged, especially from the perspective of neural architecture search (NAS). We challenge the conventional wisdom of pruning after training by proposing a joint search-and-training approach that directly learns a compact network from scratch. By treating pruning as a search strategy, we present two new insights in this paper: 1) it is possible to expand the search space of network pruning by associating each filter with a learnable weight; 2) joint search-and-training can be conducted iteratively to maximize the learning efficiency. More specifically, we propose a coarse-to-fine tuning strategy to iteratively sample and update compact sub-networks to approximate the target network. The weights associated with network filters are accordingly updated by joint search-and-training to reflect learned knowledge in the NAS space. Moreover, we introduce strategies of random perturbation (inspired by Monte Carlo methods) and flexible thresholding (inspired by reinforcement learning) to adjust the weight and size of each layer. Extensive experiments on ResNet and VGGNet demonstrate the superior performance of our proposed method on popular datasets including CIFAR10, CIFAR100, and ImageNet.
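To make the sample-and-update loop concrete, here is a minimal sketch of the idea described above: each filter carries a learnable importance weight, a sub-network is sampled by thresholding randomly perturbed weights, and the weights of kept filters are then updated from the sub-network's feedback. All names, shapes, and update rules here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-filter importance weights for a 3-layer target network
# (layer sizes are made up for illustration).
filter_weights = [rng.random(16), rng.random(32), rng.random(64)]

def sample_subnetwork(weights, threshold=0.5, noise_scale=0.1):
    """Sampler: build a binary keep-mask per layer by thresholding the
    learnable filter weights after a small random perturbation (a
    Monte-Carlo-style exploration step, as the abstract describes)."""
    masks = []
    for w in weights:
        perturbed = w + rng.normal(0.0, noise_scale, size=w.shape)
        mask = perturbed > threshold
        if not mask.any():                  # keep at least one filter per layer
            mask[np.argmax(perturbed)] = True
        masks.append(mask)
    return masks

def update_weights(weights, masks, reward, lr=0.05):
    """Updater: map feedback from the trained sub-network back to the
    target network by nudging kept filters' weights by the reward
    (e.g. validation accuracy of the sampled sub-network)."""
    for w, m in zip(weights, masks):
        w[m] += lr * reward
        np.clip(w, 0.0, 1.0, out=w)

# One search-and-training iteration (the reward would normally come from
# briefly training and evaluating the sampled sub-network).
masks = sample_subnetwork(filter_weights)
update_weights(filter_weights, masks, reward=0.8)
print([int(m.sum()) for m in masks])        # filters kept per layer
```

In an actual run, sampling and updating would alternate for many iterations, with the threshold adjusted per layer (the paper's flexible-thresholding strategy), before the best sub-network is fine-tuned as the final output.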
Citation
Lu X, Huang H, Dong W, et al. Beyond network pruning: a joint search-and-training approach[C]//International Joint Conference on Artificial Intelligence. 2020.
Bibtex
@inproceedings{lu2020beyond,
  title={Beyond network pruning: a joint search-and-training approach},
  author={Lu, Xiaotong and Huang, Han and Dong, Weisheng and Li, Xin and Shi, Guangming},
  booktitle={International Joint Conference on Artificial Intelligence},
  year={2020}
}
Contact
Xiaotong Lu,
Email: xiaotonglu47@gmail.com
Han Huang, Email: hanhuang8264@gmail.com
Weisheng Dong,
Email: wsdong@mail.xidian.edu.cn
Xin Li, Email: xin.li@ieee.org
Guangming Shi,
Email: gmshi@xidian.edu.cn