We present a new method for structured pruning of neural networks, based on the recently proposed neuron merging technique, in which, following a pruning operation, the weights of the next layer are suitably modified. Through a rigorous mathematical analysis of neuron merging, we prove an upper bound on the reconstruction error. This bound defines a new objective function for pruning-and-merging.
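The core merging step can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; it only shows the general idea under simplifying assumptions: a ReLU hidden layer, and a pruned neuron whose incoming weights are (approximately) a positive multiple of a retained neuron's. The function name `merge_pruned_neuron` and its arguments are hypothetical names chosen for illustration.

```python
import numpy as np

def merge_pruned_neuron(W1, W2, prune_idx, keep_idx, scale):
    """Prune hidden neuron `prune_idx` and fold its contribution into
    retained neuron `keep_idx` by updating the next layer's weights.

    Illustrative sketch only. Assumes a ReLU hidden layer and that
    W1[prune_idx] ~= scale * W1[keep_idx] with scale > 0, so that
    relu(W1[prune_idx] @ x) ~= scale * relu(W1[keep_idx] @ x).
    """
    # Absorb the pruned neuron's outgoing weights into the retained neuron's column.
    W2_merged = W2.copy()
    W2_merged[:, keep_idx] += scale * W2_merged[:, prune_idx]

    # Remove the pruned neuron: its incoming row in W1 and outgoing column in W2.
    W1_pruned = np.delete(W1, prune_idx, axis=0)
    W2_merged = np.delete(W2_merged, prune_idx, axis=1)
    return W1_pruned, W2_merged
```

When the scaling assumption holds exactly, the merged network reproduces the original output, which is the reconstruction error the bound controls in the approximate case:

```python
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))      # hidden layer: 4 neurons, 8 inputs
W1[3] = 0.5 * W1[1]                   # neuron 3 is a scaled copy of neuron 1
W2 = rng.standard_normal((3, 4))      # next layer: 3 outputs, 4 hidden inputs

x = rng.standard_normal(8)
full = W2 @ np.maximum(W1 @ x, 0.0)

W1p, W2m = merge_pruned_neuron(W1, W2, prune_idx=3, keep_idx=1, scale=0.5)
merged = W2m @ np.maximum(W1p @ x, 0.0)
print(np.allclose(full, merged))      # True: output is reproduced exactly
```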