Title: COLT: Cyclic Overlapping Lottery Tickets for Faster Pruning of Convolutional Neural Networks


Authors: Md. Ismail Hossain, Mohammed Rakib, M. M. Lutfe Elahi, Nabeel Mohammed, Shafin Rahman


Journal Title: IEEE Transactions on Artificial Intelligence
Volume Number: 6
Issue Number: 6
Publication Year: 2025
Index: Scopus
Ranking: A
Publisher Name: IEEE
Pages: 1664-1678
ISSN (Online): 2691-4581
Funding Information:
Funding Source: Other
Other Information:
Direct Sustainable Development Goals :
SDG9 Industry, Innovation & Infrastructure
Indirect Sustainable Development Goals :
SDG17 Partnership for the Goals
Sustainable Development Sub Goals :
Enhance scientific research & technological capacity
Impact statement: Large-scale deep learning models outperform traditional systems in solving many real-life problems, sometimes even exceeding human-level performance on several tasks. However, the main bottleneck remains their rising computational cost. To reduce it, researchers have begun looking for small-scale alternatives to large-scale models. One important direction is model pruning, which aims to prune an existing deep model without compromising performance. Existing approaches in this line of investigation iteratively prune the same model over several pruning rounds. In this article, we propose a novel idea that minimizes the number of pruning rounds while retaining the unpruned model's accuracy. In other words, using our COLT algorithm, models can be pruned more quickly, reducing the carbon footprint and making our algorithm more environment-friendly. Moreover, in line with the literature, we demonstrate that a pruned subnetwork computed on one dataset can be reused on different datasets without any dataset-specific pruning, which helps build a pruned subnetwork for a new domain quickly. This study opens up a new research prospect in finding pruning strategies for convolutional neural networks.
Collaboration: Other
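To make the round-count trade-off concrete, the sketch below shows generic iterative magnitude pruning (the lottery-ticket-style baseline the statement refers to), where the number of prune/retrain rounds is a parameter. This is an illustrative sketch only, not the COLT algorithm itself: it omits retraining and COLT's cyclic overlapping-ticket step, and the function name and per-round schedule are assumptions for the example.

```python
import numpy as np

def iterative_magnitude_prune(weights, final_sparsity, rounds):
    """Illustrative iterative magnitude pruning (not COLT itself).

    Each round discards the smallest-magnitude surviving weights so
    that after `rounds` rounds the overall sparsity is `final_sparsity`.
    Fewer rounds mean fewer prune/retrain cycles, which is the cost
    COLT aims to cut; retraining between rounds is omitted here.
    """
    mask = np.ones_like(weights, dtype=bool)
    # Per-round keep ratio chosen so keep**rounds == 1 - final_sparsity.
    keep_per_round = (1.0 - final_sparsity) ** (1.0 / rounds)
    for _ in range(rounds):
        surviving = np.abs(weights[mask])
        n_keep = int(round(keep_per_round * surviving.size))
        if n_keep == 0:
            return np.zeros_like(mask)
        # Magnitude threshold: keep only the n_keep largest survivors.
        threshold = np.sort(surviving)[::-1][n_keep - 1]
        mask &= np.abs(weights) >= threshold
        # (In a real pipeline, the surviving weights would be retrained here.)
    return mask
```

For example, pruning 100 weights to 75% sparsity in 2 rounds keeps 50 weights after the first round and 25 after the second; setting `rounds=1` reaches the same sparsity in a single cycle, trading per-round refinement for speed.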