Abstract
Today, convolutional neural network (CNN) pruning techniques often rely on manually crafted importance criteria and pruning structures. Due to their heuristic nature, these methods may lack generality, and their performance is not guaranteed. In this paper, we propose a theoretical framework to address this challenge by leveraging the concept of γ-weak submodularity, based on a new efficient importance function. By deriving an upper bound on the absolute error in the layer subsequent to the pruned layer, we formulate the importance function as a γ-weakly submodular function. This formulation enables the development of an easy-to-implement, low-complexity, and data-free oblivious algorithm for selecting filters to be removed from a convolutional layer. Extensive experiments show that our method outperforms state-of-the-art baselines on benchmark networks across various datasets, with a computational cost comparable to the simplest pruning techniques, such as l2-norm pruning. Notably, the proposed method achieves an accuracy of 76.52%, compared to 75.15% for the overall best baseline, with a 25.5% reduction in network parameters. According to our proposed resource-efficiency metric for pruning methods, the ACLI approach demonstrates orders-of-magnitude higher efficiency than the other baselines, while maintaining competitive accuracy.
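For intuition, the sketch below illustrates what a data-free filter-selection step of this kind looks like. It uses a simple per-filter l2-norm score as a stand-in for the paper's ACLI importance function (whose exact form is not reproduced in this record), and the function names, greedy selection routine, and pruning ratio are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def filter_importance(weights: np.ndarray) -> np.ndarray:
    """Per-filter l2-norm importance for a conv layer with weights of shape
    (out_channels, in_channels, kH, kW). Stand-in for ACLI's importance function."""
    return np.linalg.norm(weights.reshape(weights.shape[0], -1), axis=1)

def select_filters_to_prune(weights: np.ndarray, prune_ratio: float) -> np.ndarray:
    """Pick the filters with the lowest importance scores.
    Data-free: only the layer's weights are inspected, never activations or data."""
    scores = filter_importance(weights)
    n_prune = int(round(prune_ratio * weights.shape[0]))
    return np.argsort(scores)[:n_prune]  # indices of filters to remove

# Example: remove 25% of the filters of a random 64-filter convolutional layer.
rng = np.random.default_rng(0)
conv_weights = rng.standard_normal((64, 32, 3, 3))
pruned = select_filters_to_prune(conv_weights, prune_ratio=0.25)
print(f"removing {pruned.size} of {conv_weights.shape[0]} filters: {np.sort(pruned)}")
```

The cost of such a weight-only criterion is a single pass over each layer's filters, which is what makes its computational footprint comparable to plain l2-norm pruning as stated in the abstract.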
| Original language | English |
|---|---|
| Pages (from-to) | 932-945 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 48 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2026 |
| Externally published | Yes |
Keywords
- Deep learning
- convolutional neural networks
- data-free
- machine learning
- model comparison
- pruning
Fingerprint
Dive into the research topics of 'ACLI: A CNN Pruning Framework Leveraging Adjacent Convolutional Layer Interdependence and γ-Weakly Submodularity'. These topics are generated from the title and abstract of the publication. Together, they form a unique fingerprint.