
Sequential Minimal Optimization Algorithm for Robust Support Vector Regression

Advisor: 楊棧雲

Abstract


Based on statistical learning theory, the support vector machine offers minimal model complexity and high generalization performance, and shows great potential in both classification and regression applications. However, its model is a quadratic programming problem with computational complexity O(n²), which suffers a curse of dimensionality as the number of training samples grows. The sequential minimal optimization (SMO) algorithm is designed to reduce this computational complexity: it decomposes the full quadratic program into many tractable, minimal-size subproblems, optimizing only two samples at a time and iterating with an analytical solution, so that the system's optimum is ultimately reached at minimal computational cost. Building on the existing SMO algorithm for SVM classification, this study adopts the Maximal Gain working-set selection method and extends SMO to support vector regression, to serve the needs of the many users of regression analysis.
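The quadratic program that SMO decomposes is, in the standard ε-insensitive SVR formulation (a textbook form, not a formula quoted from this thesis), the dual problem over the multiplier pairs (α_i, α_i*):

```latex
\max_{\alpha,\alpha^*}\;
-\frac{1}{2}\sum_{i,j=1}^{n}(\alpha_i-\alpha_i^*)(\alpha_j-\alpha_j^*)K(x_i,x_j)
-\varepsilon\sum_{i=1}^{n}(\alpha_i+\alpha_i^*)
+\sum_{i=1}^{n}y_i(\alpha_i-\alpha_i^*)
```
```latex
\text{subject to}\quad
\sum_{i=1}^{n}(\alpha_i-\alpha_i^*)=0,\qquad
0\le\alpha_i,\;\alpha_i^*\le C,\quad i=1,\dots,n.
```

The n-by-n kernel matrix K(x_i, x_j) is what makes the full problem O(n²); SMO avoids materializing it all at once by touching only the two rows needed for each two-variable subproblem.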

Parallel Abstract


Based on the statistical learning theory, the support vector machine is excellent with its features in low model complexity and high generalization ability, and is highly potential for the applications in both pattern recognition and function approximation. The quadratic expression in its original model intrinsically corresponds to a high computational complexity in O(n2), and leads it to a curse of dimensionality with the increasing training instances. By employing the sequential minimal optimization (SMO) algorithm which subdivides the big integrated optimization into a series of small two-instance optimization, the computation of the quadratic programming can be effectively reduced, and reach rapidly the optimal solution. With some improved findings, the study extends the SMO for SVM classifications to that for SVM regression. The development would be advantageous to the applications of function approximation.
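The two-instance analytic step at the heart of SMO can be sketched in a minimal form. The sketch below implements the simplified SMO variant for a linear soft-margin SVM classifier (random choice of the second index rather than the Maximal Gain working-set selection this thesis adopts, and classification rather than the regression extension it develops); all function and variable names are illustrative, not taken from the thesis.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-4, max_passes=20, seed=0):
    """Simplified SMO for a linear soft-margin SVM (labels in {-1, +1}).

    Each step optimizes exactly two multipliers analytically, as in
    Platt's SMO; the second index is picked at random here instead of
    by a working-set selection heuristic.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    alpha = np.zeros(n)
    b = 0.0
    K = X @ X.T  # linear kernel matrix

    def f(i):
        # decision value for sample i under current (alpha, b)
        return (alpha * y) @ K[:, i] + b

    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = f(i) - y[i]
            # pick i only if it violates the KKT conditions
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = int(rng.integers(n - 1))
                if j >= i:
                    j += 1  # ensure j != i
                Ej = f(j) - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # box constraints confine the pair to a line segment [L, H]
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # second derivative along the segment
                if eta >= 0:
                    continue
                # analytic unconstrained optimum, clipped to the segment
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-7:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # threshold update keeping KKT conditions on the pair
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X  # recover the primal weight vector (linear kernel only)
    return w, b
```

The regression variant follows the same pattern but optimizes pairs drawn from the multipliers (α_i, α_i*) of the ε-insensitive dual, and a gain-based working-set selection replaces the random choice of j.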

