
Application of Ensemble Learning with Principal Component Analysis in Stock Indices

Advisor: 黃宜侯

Abstract


Many previous studies have examined the predictability of asset prices; some of these authors used technical indicators as independent variables in traditional regressions and found significant predictability. In recent years, advances in machine learning have made higher-dimensional, non-linear regression increasingly feasible, so this thesis adopts three non-linear regression models for prediction: support vector machines, random forests, and recurrent neural networks. To further improve the forecasts, the thesis uses ensemble learning to combine these three models. Compared with earlier studies, the features used here include not only technical indicators but also indicators derived from index futures and index options. In addition, the scope of the study is extended to eight stock indices, and the sample period spans 1996 to 2016. Beyond prediction, the thesis also constructs a trading strategy; the empirical results show that both the recurrent neural network and the ensemble model beat the market, with the ensemble performing best, which confirms the strong predictive performance of ensemble learning.
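To make the modelling pipeline concrete, the sketch below shows one way the described approach could be assembled in Python with scikit-learn: the feature set (technical, futures, and options indicators) is standardized, reduced with principal component analysis, fed to support vector regression and a random forest, and the forecasts are averaged. The equal-weight averaging rule and the omission of the recurrent neural network are simplifications for brevity; they are assumptions, not the thesis's exact specification.

```python
# Minimal sketch of the prediction pipeline, assuming an equal-weight averaging
# ensemble; the recurrent neural network base model is omitted for brevity and
# the variance threshold for PCA is an assumed value.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def fit_ensemble(X_train, y_train):
    """Fit PCA-reduced base models on the technical/futures/options features."""
    models = [
        make_pipeline(StandardScaler(), PCA(n_components=0.95), SVR(kernel="rbf")),
        make_pipeline(StandardScaler(), PCA(n_components=0.95),
                      RandomForestRegressor(n_estimators=500, random_state=0)),
    ]
    for model in models:
        model.fit(X_train, y_train)
    return models

def ensemble_predict(models, X):
    """Average the base-model forecasts (equal-weight ensemble)."""
    return np.mean([model.predict(X) for model in models], axis=0)
```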

Keywords

Machine learning, ensemble learning, stock indices, prediction

Abstract (English)


Previous studies have been devoted to the predictability of asset prices. Some researchers used technical indicators as independent variables in ordinary least squares regressions and found significant predictability. Recently, the development of machine learning has made it possible to conduct regression in higher dimensions. This paper employs three non-linear models: support vector regression, random forests, and recurrent neural networks. To enhance the predictability of these models, we apply ensemble learning to combine their results. Compared with other studies, we use not only technical indicators but also features derived from index futures and options. In addition, we expand our targets to 9 indices and lengthen the investigation period to 20 years. Besides prediction, we also build a trading strategy, and we find that both the recurrent neural network and the ensemble model can beat the market, with the ensemble method delivering the highest return; this demonstrates the predictive power of ensemble learning.
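The abstract does not spell out the trading rule, so the following sketch assumes a simple long/flat strategy: hold the index when the model's predicted next-period return is positive, stay in cash otherwise, and compare the result with buy-and-hold. The function name and the return-series inputs are hypothetical, not taken from the thesis.

```python
# Sketch of an assumed long/flat trading rule built on the model forecasts;
# the actual rule used in the thesis may differ. `predicted` and `realized`
# are aligned next-period return series for one index.
import numpy as np

def backtest_long_flat(predicted, realized):
    """Return (strategy cumulative return, buy-and-hold cumulative return)."""
    predicted = np.asarray(predicted, dtype=float)
    realized = np.asarray(realized, dtype=float)
    position = (predicted > 0).astype(float)          # 1 = long the index, 0 = cash
    strategy_curve = np.cumprod(1.0 + position * realized)
    buy_and_hold_curve = np.cumprod(1.0 + realized)
    return strategy_curve[-1] - 1.0, buy_and_hold_curve[-1] - 1.0
```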

