
Author: Lu, Cheng-Ju (呂政儒)
Title: An Improved Particle Swarm Optimization Algorithm for Feature Selection in Classification (以改良式粒子群演算法進行分類問題之特徵選擇)
Advisor: Chiang, Tsung-Che (蔣宗哲)
Oral Defense Committee: Wen, Yu-Wei (温育瑋); Tsou, Ching-Shih (鄒慶士); Chiang, Tsung-Che (蔣宗哲)
Oral Defense Date: 2023/01/31
Degree: Master
Department: Department of Computer Science and Information Engineering (資訊工程學系)
Year of Publication: 2023
Graduation Academic Year: 111
Language: Chinese
Number of Pages: 41
Keywords: evolutionary algorithms, particle swarm optimization, feature selection
Research Method: experimental design
DOI URL: http://doi.org/10.6345/NTNU202300219
Thesis Type: academic thesis
Abstract: Feature selection is an important preprocessing technique for reducing problem dimensionality in classification. It eliminates redundant and irrelevant features and keeps the useful ones, thereby improving classification accuracy. As the dimensionality of a dataset grows, the search space expands drastically, which poses a challenge to optimization algorithms of all kinds. This study aims to propose an algorithm that finds a small feature subset with good classification performance using a modest amount of computational resources. The proposed method builds on the mainstream particle swarm optimization (PSO) algorithm and adds a search space adjustment mechanism, an adjustment strategy in which worse particles learn from relatively better particles, and a directional strategy for generating new swarms; the latter maintains an external archive that records the feature subsets found during evolution and uses them as reference points for generating new swarms. Experimental comparisons with algorithms from the literature show that the proposed mechanisms and designs achieve good classification performance with fewer selected features.
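To make the kind of wrapper-based PSO feature selection described in the abstract concrete, the following is a minimal sketch, not the thesis's actual algorithm. It assumes a continuous position vector thresholded at 0.5 to decode a feature subset, a k-NN classifier with 5-fold cross-validation as the wrapper fitness (the abstract does not name a classifier), and a simplified, hypothetical version of the "worse particles learn from better particles" idea; the external archive and search space adjustment mechanisms are omitted, and names such as pso_feature_selection, decode, and fitness are illustrative only.

```python
# A minimal, illustrative PSO wrapper for feature selection.
# Assumptions not taken from the thesis: k-NN as the wrapper classifier,
# a 0.5 decoding threshold, and a simplified worse-particle adjustment.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score


def decode(position, threshold=0.5):
    """Decode a continuous position into a boolean feature mask."""
    mask = position > threshold
    if not mask.any():                        # keep at least one feature
        mask[np.argmax(position)] = True
    return mask


def fitness(position, X, y, alpha=0.9):
    """Weighted sum of cross-validated accuracy and feature reduction."""
    mask = decode(position)
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / X.shape[1])


def pso_feature_selection(X, y, n_particles=30, iterations=50,
                          w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    pos = rng.random((n_particles, dim))
    vel = np.zeros((n_particles, dim))
    fit = np.array([fitness(p, X, y) for p in pos])
    pbest, pbest_fit = pos.copy(), fit.copy()
    gbest = pbest[np.argmax(pbest_fit)].copy()

    for _ in range(iterations):
        # Standard PSO velocity and position update, clipped to [0, 1].
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        fit = np.array([fitness(p, X, y) for p in pos])

        # Hypothetical worse-particle adjustment: pull the worst quarter of
        # the swarm toward randomly chosen members of the best quarter.
        order = np.argsort(fit)
        k = max(1, n_particles // 4)
        worst, best = order[:k], order[-k:]
        guides = rng.choice(best, size=worst.size)
        pos[worst] += rng.random((k, dim)) * (pos[guides] - pos[worst])
        fit[worst] = [fitness(p, X, y) for p in pos[worst]]

        # Update personal and global bests.
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[np.argmax(pbest_fit)].copy()

    return decode(gbest), pbest_fit.max()
```

On data loaded as NumPy arrays X and y (for example a UCI dataset), calling pso_feature_selection(X, y) returns a boolean mask of selected features and the best fitness found; the weight alpha trades cross-validated accuracy against subset size.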

Table of Contents:
    Chapter 1 Introduction 1
        1.1 Background and Motivation 1
        1.2 Problem Definition 1
        1.3 Particle Swarm Optimization 2
        1.4 Research Objectives and Methods 3
        1.5 Thesis Organization 4
    Chapter 2 Literature Review 5
        2.1 Particle Representation 5
        2.2 Search Space Adjustment 7
        2.3 Swarm Diversity Control 9
        2.4 Swarm Convergence Control 11
        2.5 Accelerating Particle Evaluation 12
        2.6 Multimodal Search 13
    Chapter 3 Improved Particle Swarm Optimization Algorithm 14
        3.1 Algorithm Framework 14
        3.2 Particle Encoding and Decoding 14
        3.3 Particle Evaluation 16
        3.4 Swarm Initialization 17
            3.4.1 Feature Ranking 17
            3.4.2 Subspace Generation 17
            3.4.3 Particle Initialization 18
        3.5 External Archive and Guiding Vectors 19
        3.6 Swarm Stagnation Detection 19
        3.7 Swarm Selection 20
        3.8 Search Space Adjustment 20
        3.9 New Swarm Generation Strategy 22
            3.9.1 Updating the External Archive 22
            3.9.2 Finding Reference Points 22
            3.9.3 Initializing the New Swarm 23
        3.10 Particle Velocity and Position Update 24
            3.10.1 Particle Update 24
            3.10.2 Particle Perturbation 24
        3.11 Worse-Particle Adjustment Strategy 25
            3.11.1 Selecting Better Particles (Adjustment References) 25
            3.11.2 Selecting Worse Particles (Adjustment Targets) 26
            3.11.3 Adjustment Method 26
        3.12 Updating the Swarm's Best Position 27
        3.13 Termination Criteria 27
    Chapter 4 Experiments and Discussion 28
        4.1 Test Datasets 28
        4.2 Algorithm Parameter Settings 28
        4.3 Experimental Environment 29
        4.4 Performance Metrics 29
        4.5 Experimental Results and Discussion 29
            4.5.1 Effect of the Particle Evaluation Mechanism 29
            4.5.2 Effect of Feature Ranking 32
            4.5.3 Effect of the New Swarm Mechanism 33
            4.5.4 Analysis of the Worse-Particle Adjustment Mechanism 34
            4.5.5 Performance Comparison with Methods from the Literature 36
    Chapter 5 Conclusions and Future Research Directions 38
    References 40

    [1] J. Kennedy and R. Eberhart, “Particle swarm optimization,” Proceedings of ICNN’95 - International Conference on Neural Networks, vol. 4, pp. 1942–1948, 1995.
    [2] A. D. Li, B. Xue, and M. Zhang, “A forward search inspired particle swarm optimization algorithm for feature selection in classification,” Proceedings of IEEE Congress on Evolutionary Computation, pp. 786–793, 2021.
    [3] Y. Tian, R. Liu, X. Zhang, H. Ma, K. C. Tan, and Y. Jin, “A multipopulation evolutionary algorithm for solving large-scale multi-modal multi-objective optimization problems,” IEEE Trans. Evol. Comput. vol. 25, no. 3, pp. 405–418, 2020.
    [4] J. Kennedy and R. C. Eberhart, “A discrete binary version of the particle swarm algorithm,” Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics, pp. 4104–4108, 1997.
    [5] S. W. Lin, K. C. Ying, S. C. Chen, and Z. J. Lee, “Particle swarm optimization for parameter determination and feature selection of support vector machines,” Expert Syst. Appl. vol. 35, no. 4, pp. 1817–1824, 2008.
    [6] B. Tran, B. Xue, and M. Zhang, “A new representation in PSO for discretization-based feature selection,” IEEE Trans. Cybern. vol. 48, no. 6, pp. 1733–1746, 2018.
    [7] B. Xue, L. Cervante, L. Shang, W. N. Browne, and M. Zhang, “A multi-objective particle swarm optimization for filter-based feature selection in classification problems,” Connect. Sci. vol. 24, no. 2–3, pp. 91–116, 2012.
    [8] H. Wang, R. Ke, J. Li, Y. An, K. Wang, and L. Yu, “A correlation-based binary particle swarm optimization method for feature selection in human activity recognition,” Int. J. Distributed Sens. Netw. vol. 14, no. 4, 2018.
    [9] B. Tran, B. Xue, and M. Zhang, “Variable-length particle swarm optimization for feature selection on high-dimensional classification,” IEEE Trans. Evol. Comput. vol. 24, no. 5, pp. 882–895, 2018.
    [10] B. P. Flannery, W. H. Press, S. A. Teukolsky, and W. Vetterling, Numerical Recipes in C, Press Syndicate of the University of Cambridge, New York, p. 78, 1992.
    [11] F. Hafiz, A. Swain, N. Patel, and C. Naik, “A two-dimensional (2-D) learning framework for Particle Swarm based feature selection,” Pattern Recognition, vol. 76, pp. 416–433, 2018.
    [12] J. Kennedy, “Bare bones particle swarms,” Proceedings of the 2003 IEEE Swarm Intelligence Symposium, pp. 80–87, 2003.
    [13] Y. Zhang, D. W. Gong, Y. Hu, and W. Zhang, “Feature selection algorithm based on bare bones particle swarm optimization,” Neurocomputing, vol. 148, pp. 150–157, 2015.
    [14] F. Wang and J. Liang, “An efficient feature selection algorithm for hybrid data,” Neurocomputing, vol. 193, pp. 33–41, 2016.
    [15] H. B. Nguyen, B. Xue, I. Liu, and M. Zhang, “Filter based backward elimination in wrapper based PSO for feature selection in classification,” Proceedings of IEEE Congress on Evolutionary Computation, pp. 3111–3118, 2014.
    [16] A. D. Li, B. Xue, and M. Zhang, “Improved binary particle swarm optimization for feature selection with new initialization and search space reduction strategies,” Appl. Soft Comput. vol. 106, Article 107302, 2021.
    [17] H. B. Nguyen, B. Xue, P. Andreae, and M. Zhang, “Particle swarm optimization with genetic operators for feature selection,” Proceedings of IEEE Congress on Evolutionary Computation, pp. 286–293, 2017.
    [18] L. Y. Chuang, H. W. Chang, C. J. Tu, and C. H. Yang, “Improved binary PSO for feature selection using gene expression data,” Comput. Biol. Chem. vol. 32, no. 1, pp. 29–38, 2008.
    [19] Y. Zhang, D. W. Gong, and J. Cheng, “Multi-objective particle swarm optimization approach for cost-based feature selection in classification,” IEEE/ACM Trans. Comput. Biol. Bioinf. vol. 14, no. 1, pp. 64–75, 2017.
    [20] K. Mistry, L. Zhang, S. C. Neoh, C. P. Lim, and B. Fielding, “A micro-GA embedded PSO feature selection approach to intelligent facial emotion recognition,” IEEE Trans. Cybern. vol. 47, no. 6, pp. 1496–1509, 2017.
    [21] S. Gu, R. Cheng, and Y. Jin, “Feature selection for high-dimensional classification using a competitive swarm optimizer,” Soft Comput. vol. 22, pp. 811–822, 2018.
    [22] R. Cheng and Y. Jin, “A competitive swarm optimizer for large scale optimization,” IEEE Trans. Cybern. vol. 45, no. 2, pp. 191–204, 2015.
    [23] K. Chen, B. Xue, M. Zhang, and F. Zhou, “Correlation-guided updating strategy for feature selection in classification with surrogate-assisted particle swarm optimization,” IEEE Trans. Evol. Comput. vol. 26, no. 5, pp. 1015–1029, 2021.
    [24] B. Tran, B. Xue, and M. Zhang, “Improved PSO for feature selection on high-dimensional datasets,” Proceedings of Simulated Evolution and Learning, Springer, pp. 503–515, 2014.
    [25] B. Tran, M. Zhang, and B. Xue, “A PSO based hybrid feature selection algorithm for high-dimensional classification,” Proceedings of IEEE Congress on Evolutionary Computation, pp. 3801–3808, 2016.
    [26] H. B. Nguyen, B. Xue, and P. Andreae, “Surrogate-model based particle swarm optimization with local search for feature selection in classification,” Proceedings of Applications of Evolutionary Computation, Lecture Notes in Computer Science, Springer International Publishing, pp. 487–505, 2017.
    [27] H. B. Nguyen, B. Xue, and P. Andreae, “PSO with surrogate models for feature selection: static and dynamic clustering-based methods,” Memet. Comput. vol. 10, pp. 291–300, 2018.
    [28] T. Butler-Yeoman, B. Xue, and M. Zhang, “Particle swarm optimisation for feature selection: a hybrid filter-wrapper approach,” Proceedings of IEEE Congress on Evolutionary Computation, pp. 2428–2435, 2015.
    [29] X. M. Hu, S. R. Zhang, M. Li, and J. D. Deng, “Multimodal particle swarm optimization for feature selection,” Appl. Soft Comput. vol. 113, Article 107887, 2021.
    [30] UCI Machine Learning Repository, http://archive.ics.uci.edu/ml
    [31] Y. Xue, B. Xue, and M. Zhang, “Self-adaptive particle swarm optimization for large-scale feature selection in classification,” ACM Trans. Knowl. Discov. Data, vol. 13, no. 5, pp. 1–27, 2019.

    Full-text availability: electronic full text release delayed until 2025/02/20.