Exchange-traded funds (ETFs) have flourished over the past decade and have become one of the most actively traded investment vehicles in the domestic market. In recent years, artificial intelligence and artificial neural networks have become a major research trend, and neural network concepts are widely applied to the study of financial instruments. This study forecasts the prices of five ETFs with an artificial neural network model. A back-propagation neural network (BPN) is used as the framework of the forecasting model, and the five ETFs with the largest trading volume from 2013 to 2015 are taken as the research subjects. The study examines the predictive performance of the BPN model for ETF prices and further discusses feature selection and feature scaling for the model's input variables. Stepwise regression is used to select the most discriminative variables from 57 technical indicators, and the selected variables are then processed with four different feature scaling methods to examine their effects on the predictive performance of the BPN model. Finally, K-fold cross-validation is adopted as the data-partitioning method to improve the accuracy of model training. The empirical results show that the variables selected by stepwise regression differ considerably across the five ETFs, and that feature selection improves the forecasting performance of the BPN model. For feature scaling, the effects of vector normalization, Manhattan normalization, maximum linear normalization, and nonlinear normalization on forecasting performance are compared. The results show that feature scaling improves predictive performance, that the differences among the four scaling methods are small, and that Manhattan normalization yields the best performance.
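The abstract does not state the exact stepwise procedure (entry and removal thresholds, software used), so the sketch below is a simplified forward-selection variant driven by OLS p-values, using statsmodels; the function name `forward_stepwise` and the `alpha` threshold are illustrative assumptions, not the thesis's actual settings.

```python
import statsmodels.api as sm

def forward_stepwise(X, y, names, alpha=0.05):
    """Forward selection by OLS p-values: repeatedly add the indicator with the
    smallest p-value while it stays below alpha (a simplified stand-in for the
    stepwise regression described in the abstract)."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        # p-value of each remaining candidate when added to the current subset
        scores = []
        for j in remaining:
            design = sm.add_constant(X[:, selected + [j]])
            scores.append((sm.OLS(y, design).fit().pvalues[-1], j))
        best_p, best_j = min(scores)
        if best_p >= alpha:
            break
        selected.append(best_j)
        remaining.remove(best_j)
    return [names[j] for j in selected]
```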
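The four feature scaling methods are compared but not defined in the abstract. The following sketch assumes the common column-wise definitions: L2 (vector) normalization, L1 (Manhattan) normalization, a min-max style maximum linear normalization, and a sigmoid-based nonlinear normalization; the thesis's exact formulas may differ.

```python
import numpy as np

def vector_normalize(X):
    """Vector (L2) normalization: divide each column by its Euclidean norm."""
    norm = np.linalg.norm(X, axis=0)
    return X / np.where(norm == 0, 1, norm)

def manhattan_normalize(X):
    """Manhattan (L1) normalization: divide each column by its sum of absolute values."""
    norm = np.abs(X).sum(axis=0)
    return X / np.where(norm == 0, 1, norm)

def max_linear_normalize(X):
    """Maximum linear normalization: linearly rescale each column to [0, 1]."""
    x_min, x_max = X.min(axis=0), X.max(axis=0)
    rng = np.where(x_max - x_min == 0, 1, x_max - x_min)
    return (X - x_min) / rng

def nonlinear_normalize(X):
    """Nonlinear normalization: squash standardized columns through a sigmoid."""
    std = np.where(X.std(axis=0) == 0, 1, X.std(axis=0))
    z = (X - X.mean(axis=0)) / std
    return 1.0 / (1.0 + np.exp(-z))
```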
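The overall training pipeline (scaling inside each fold, BPN fitting, out-of-fold error) might be wired together roughly as follows. This is a minimal sketch that substitutes scikit-learn's MLPRegressor for the BPN; the number of folds, hidden-layer size, and RMSE error measure are placeholders rather than the thesis's actual configuration.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def manhattan_scale(train, test):
    """Manhattan (L1) normalization fitted on the training fold only."""
    norm = np.abs(train).sum(axis=0)
    norm = np.where(norm == 0, 1, norm)
    return train / norm, test / norm

def kfold_bpn_rmse(X, y, k=5, seed=42):
    """Average out-of-fold RMSE of a back-propagation network under
    K-fold cross-validation; all hyperparameters are placeholders."""
    rmses = []
    for train_idx, test_idx in KFold(n_splits=k).split(X):
        X_tr, X_te = manhattan_scale(X[train_idx], X[test_idx])
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                             random_state=seed)
        model.fit(X_tr, y[train_idx])
        pred = model.predict(X_te)
        rmses.append(mean_squared_error(y[test_idx], pred) ** 0.5)
    return float(np.mean(rmses))
```

Fitting the scaling denominators on the training fold only keeps test-fold information out of training, which matters when comparing the four scaling methods on equal footing.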