

Least Trimmed Square Support Vector Machine Regression and Its Applications

Advisor: 鄭錦聰

Abstract

Since the concept of artificial intelligence emerged, many machine learning algorithms have been developed; among them, the support vector machine (SVM) has become widely used in recent years, and papers on support vector machine regression (SVMR) and least squares support vector machine regression (LS-SVMR) appear regularly in well-known journals. In this thesis we address the robustness problem of LS-SVMR by proposing least trimmed squares support vector machine regression (LTS-SVMR), a hybrid of the least trimmed squares (LTS) method and LS-SVMR. Because LTS can exclude outliers present in the training sample, we exploit this property to strengthen the robustness of LS-SVMR. LTS has one major drawback, however: choosing a suitable initial estimate incurs a heavy computational cost. To overcome this, we propose three initialization schemes for LTS-SVMR. The first performs a single LS-SVMR estimation before trimming; the second selects the best training subsample among randomly generated ones before trimming; the third uses the simulated annealing (SA) algorithm to select the best training subsample before trimming. In addition, to improve the computational efficiency of LTS-SVMR, we propose locally linear embedding least trimmed squares support vector machine regression (LLE-LTS-SVMR), which combines the first initialization scheme of LTS-SVMR with the locally linear embedding (LLE) dimensionality-reduction algorithm. Experimental results show that all three LTS-SVMR variants indeed improve the robustness of LS-SVMR, that the SA-based LTS-SVMR is more reliable than the other two, and that LLE-LTS-SVMR retains good modeling ability at a much lower computational cost.

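The third initialization scheme uses simulated annealing to search for a good starting subsample before trimming. A hypothetical, generic sketch (again, not the thesis's code): the SA state is a set of h training indices, a neighbor move swaps one index in for one out, and the energy is the LTS objective (sum of the h smallest squared residuals over the whole sample). The `fit`/`predict` callables, cooling schedule, and step counts are illustrative assumptions.

```python
import numpy as np

def sa_pick_subsample(X, y, fit, predict, h, steps=200, T0=1.0, seed=0):
    """Simulated-annealing search for an initial training subsample of size h.
    `fit(X, y)` returns a model; `predict(model, X)` returns predictions."""
    rng = np.random.default_rng(seed)
    n = len(y)

    def energy(sub):
        # LTS objective: sum of the h smallest squared residuals over all n points
        model = fit(X[sub], y[sub])
        r2 = np.sort((predict(model, X) - y) ** 2)
        return r2[:h].sum()

    cur = rng.choice(n, size=h, replace=False)
    cur_e = energy(cur)
    best, best_e = cur.copy(), cur_e
    for k in range(steps):
        T = T0 * 0.95 ** k                         # geometric cooling schedule
        cand = cur.copy()
        pool = np.setdiff1d(np.arange(n), cand)    # indices not in the subsample
        cand[rng.integers(h)] = rng.choice(pool)   # neighbor move: swap one index
        e = energy(cand)
        # accept downhill moves always, uphill moves with Metropolis probability
        if e < cur_e or rng.random() < np.exp((cur_e - e) / T):
            cur, cur_e = cand, e
            if e < best_e:
                best, best_e = cand.copy(), e
    return best
```

Any regression fit/predict pair can be plugged in; in the thesis's setting these would be the LS-SVMR training and prediction steps, after which the selected subsample seeds the usual trimming iterations.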

