
Sample-Efficient Piecewise Quadratic Regression Tree

Advisor: 陳正剛

Abstract


The classification and regression tree (CART) is a popular method in data mining. It classifies responses by sequentially splitting the sample into two branches. As the sample is split, however, the sample size depletes quickly and the reliability of prediction diminishes. The sample-efficient regression tree (SERT) was therefore proposed; it uses an interaction effect test to avoid unnecessary splits. However, neither CART nor SERT can detect quadratic effects. For this reason, we propose the sample-efficient piecewise quadratic regression tree to solve this problem. First, we develop a piecewise quadratic regression model to detect quadratic effects in the sample. Then, we apply the Gram-Schmidt process to resolve the possible multicollinearity between the linear and quadratic effects of an attribute; this avoids wrong attribute selection caused by the statistical insignificance that such collinearity induces. Finally, we use seventeen simulated cases of different types and a real case to verify the proposed method.
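The Gram-Schmidt step described in the abstract can be sketched as follows. This is an illustrative orthogonalization of a quadratic column against its linear column, not the thesis's actual implementation; the function name and data are hypothetical. Projecting x² onto the span of the intercept and x, and keeping only the residual, removes the collinearity between the two columns so that each term's significance can be assessed separately:

```python
import numpy as np

def orthogonalize_quadratic(x):
    """Project x**2 onto span{1, x} and return the residual,
    i.e. a quadratic column orthogonal to the linear column."""
    n = len(x)
    X = np.column_stack([np.ones(n), x])          # basis: intercept and linear term
    q = x ** 2
    beta, *_ = np.linalg.lstsq(X, q, rcond=None)  # projection coefficients
    q_orth = q - X @ beta                         # Gram-Schmidt residual
    return q_orth

rng = np.random.default_rng(0)
x = rng.uniform(1.0, 2.0, size=200)   # on a positive range, x and x**2 are highly collinear
q_orth = orthogonalize_quadratic(x)

print(abs(np.corrcoef(x, x ** 2)[0, 1]))   # near 1: raw columns are collinear
print(abs(np.corrcoef(x, q_orth)[0, 1]))   # near 0: orthogonalized column is not
```

With the raw columns, the near-perfect correlation inflates the variance of both coefficient estimates, which can make a genuinely present quadratic (or linear) effect appear insignificant; after orthogonalization the quadratic residual carries only the information not already explained by the linear term.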

