
Filter Pruning for Deep Learning Models Based on the Non-Dominated Sorting Genetic Algorithm

NSGAP: Filter Pruning for Deep Learning Models using NSGA-II

Advisor: 王勝德

Abstract


This study proposes a new method named NSGAP, an automated pruning approach based on the Non-Dominated Sorting Genetic Algorithm. Compared with existing methods, NSGAP is competitive in compression rate, accuracy, and search time. A notable feature of NSGAP is that it produces a Pareto front as its search result, avoiding the need for multiple searches to obtain architectures at different compression rates. To strengthen the search process, this study also introduces an Asymmetric Gaussian Distribution method, applied to the genetic algorithm's initial population, which effectively improves the search results. Overall, NSGAP is competitive and efficient in the field of model pruning and compression. Its contributions lie in its multi-objective optimization capability, the generation of a Pareto front, and the introduction of the Asymmetric Gaussian Distribution, achieving strong performance in compression rate, accuracy, and search time.
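The Pareto front mentioned above is the set of candidate pruned models that no other candidate beats on every objective at once. A minimal sketch of this idea (assuming, for illustration, that each candidate is scored by two minimized objectives such as error rate and compression cost — the function names and sample values here are hypothetical, not taken from the thesis):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (both objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset: the first front of the
    non-dominated sort used by NSGA-II."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidates as (error rate, compression cost) pairs.
candidates = [(0.08, 0.9), (0.10, 0.5), (0.15, 0.3), (0.12, 0.6), (0.20, 0.2)]
front = pareto_front(candidates)
# (0.12, 0.6) is dominated by (0.10, 0.5); the other four form the front.
```

Because the whole front is returned in one search, an architecture at any desired compression rate can be picked from it afterwards, which is why NSGAP avoids repeated per-rate searches.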

Parallel Abstract


This study proposes NSGAP, an automated pruning and compression method based on the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). Compared with conventional methods, NSGAP is competitive in compression rate, accuracy, and search duration. A notable feature of NSGAP is its ability to generate a Pareto front as the search result, eliminating the need for multiple searches to obtain architectures with different compression rates. To strengthen the search procedure, this study introduces an Asymmetric Gaussian Distribution (AGD) strategy: applying the AGD to the genetic algorithm's initial population improves the search outcomes. In conclusion, NSGAP demonstrates strong competitiveness and efficiency in the field of model pruning and compression. Its principal contributions are its multi-objective optimization capability, the generation of the Pareto front, and the integration of AGD, which together yield superior performance in compression rate, accuracy, and search duration.
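One way to read the AGD initialization is that each individual in the initial population encodes per-layer pruning ratios drawn from a Gaussian whose left and right halves have different spreads, biasing the start of the search without fixing it. The sketch below is only an illustrative interpretation under that assumption; the function names and the mode/spread values (`0.5`, `0.15`, `0.3`) are hypothetical, not the thesis's actual parameters:

```python
import random

def sample_asymmetric_gaussian(mode, sigma_left, sigma_right):
    """Draw one sample from a two-piece (asymmetric) Gaussian:
    the halves left and right of the mode use different std devs."""
    # Pick a side with probability proportional to its width,
    # then add a half-normal deviate on that side.
    if random.random() < sigma_left / (sigma_left + sigma_right):
        return mode - abs(random.gauss(0.0, sigma_left))
    return mode + abs(random.gauss(0.0, sigma_right))

def init_population(num_layers, pop_size, mode=0.5, s_left=0.15, s_right=0.3):
    """Initialize a GA population: each individual is a list of
    per-layer pruning ratios, clamped to [0, 1]."""
    return [
        [
            min(1.0, max(0.0, sample_asymmetric_gaussian(mode, s_left, s_right)))
            for _ in range(num_layers)
        ]
        for _ in range(pop_size)
    ]

population = init_population(num_layers=5, pop_size=8)
```

Skewing the spread toward higher ratios (a wider right tail) would seed the search with more aggressive pruning candidates while still covering conservative ones.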
