Exploring Vision Neural Network Pruning via Screening Methodology
arXiv:2502.07189v2 Announce Type: replace-cross Abstract: The remarkable performance of modern deep neural networks (DNNs) is largely driven by their massive scale, often comprising tens or hundreds of millions, or even billions, of parameters. However, such a scale incurs substantial storage and computational costs, hindering deployment on platforms such as edge devices that require energy-efficient and real-time processing. In this paper, we propose a network pruning framework that reduces both storage and computation requirements by an order of magnitude while preserving model accuracy. Our approach eliminates non-essential parameters through a statistical analysis of component significance across classification categories. Specifically, we employ an F-statistic-based screening technique combined with a weighted evaluation scheme to quantify the contributions of connections and channels, enabling both unstructured and structured pruning within a unified framework. Extensive experiments on real-world vision datasets, covering both fully connected neural networks (FNNs) and convolutional neural networks (CNNs), demonstrate that the proposed framework produces compact and efficient models that are highly competitive with state-of-the-art approaches.
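The core idea of F-statistic screening can be illustrated with a minimal sketch: score each channel by a one-way ANOVA F-statistic on its activations across classification categories, then keep only the highest-scoring channels. This is an assumption-laden toy version for illustration only; the function names (`f_statistic_scores`, `prune_mask`) and the simple top-k selection are hypothetical and do not reproduce the paper's weighted evaluation scheme.

```python
import numpy as np

def f_statistic_scores(activations, labels):
    """One-way ANOVA F-statistic per channel: how strongly each channel's
    mean activation differs across classes (illustrative sketch, not the
    paper's exact criterion).
    activations: (N, C) per-sample channel activations.
    labels:      (N,) integer class labels.
    Returns a (C,) array of F scores; higher = more class-discriminative."""
    classes = np.unique(labels)
    k, n = len(classes), activations.shape[0]
    grand_mean = activations.mean(axis=0)
    ss_between = np.zeros(activations.shape[1])
    ss_within = np.zeros(activations.shape[1])
    for c in classes:
        grp = activations[labels == c]
        grp_mean = grp.mean(axis=0)
        ss_between += len(grp) * (grp_mean - grand_mean) ** 2
        ss_within += ((grp - grp_mean) ** 2).sum(axis=0)
    # F = (between-class variance / (k-1)) / (within-class variance / (N-k))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def prune_mask(scores, keep_ratio=0.5):
    """Boolean mask keeping the top `keep_ratio` fraction of channels."""
    keep = np.argsort(scores)[-max(1, int(len(scores) * keep_ratio)):]
    mask = np.zeros(len(scores), dtype=bool)
    mask[keep] = True
    return mask

# Synthetic demo: channel 0 carries class information, the rest are noise.
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, 300)
acts = rng.normal(size=(300, 8))
acts[:, 0] += labels * 2.0
scores = f_statistic_scores(acts, labels)
mask = prune_mask(scores, keep_ratio=0.25)  # retain 2 of 8 channels
```

In a real structured-pruning pipeline the retained mask would be applied to the corresponding convolutional filters or fully connected units before fine-tuning.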
