We present a new method for structured pruning of neural networks, based on the recently proposed neuron-merging trick, in which, following a pruning operation, the weights of the next layer are suitably modified to compensate for the removed neurons. Through a rigorous mathematical analysis of the neuron-merging technique, we prove an upper bound on the reconstruction error. This bound defines a new objective function for pruning-and-merging.
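The compensation step can be illustrated with a minimal NumPy sketch for one hidden layer. The function name and the redundancy assumption (that the pruned neuron's incoming weights are approximately a positive scalar multiple of a kept neuron's) are illustrative choices, not taken from the paper itself:

```python
import numpy as np

def prune_and_merge(W1, W2, prune_idx, merge_idx, scale):
    """Prune hidden neuron `prune_idx` and compensate by folding its
    outgoing weights into those of neuron `merge_idx` in the next layer.

    Assumes W1[prune_idx] ~= scale * W1[merge_idx] with scale > 0
    (so the assumption survives a ReLU activation), under which the
    network output is approximately preserved.
    """
    W2 = W2.copy()
    # Fold the pruned neuron's outgoing weights into the kept neuron's.
    W2[:, merge_idx] += scale * W2[:, prune_idx]
    # Drop the pruned neuron: its row in W1 and its column in W2.
    W1 = np.delete(W1, prune_idx, axis=0)
    W2 = np.delete(W2, prune_idx, axis=1)
    return W1, W2
```

When the redundancy assumption holds exactly, the merged network reproduces the original output; the paper's bound concerns how the error grows when it holds only approximately.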