
Large-scale Linear Support Vector Regression

Advisor: Chih-Jen Lin

Abstract


In machine learning, support vector regression (SVR) and support vector classification (SVC) are widely used methods, but their training is often time-consuming when kernels are employed. Recent studies have shown that linear SVC without kernels achieves competitive accuracy in certain domains, together with fast training and prediction. However, few studies have focused on linear SVR. In this thesis, we extend fast training algorithms for linear SVC to linear SVR. Some of these methods can be applied directly, while others require modification. Experimental results show that the proposed linear-SVR training methods can efficiently produce models as good as those of nonlinear SVR.

English Abstract


Support vector regression (SVR) and support vector classification (SVC) are popular learning techniques, but their use with kernels is often time-consuming. Recently, linear SVC without kernels has been shown to give competitive accuracy for some applications, while enjoying much faster training and testing. However, few studies have focused on linear SVR. In this thesis, we extend state-of-the-art training methods for linear SVC to linear SVR. We show that the extension is straightforward for some methods, but non-trivial for others. Our experiments demonstrate that, for some problems, the proposed linear-SVR training methods can very efficiently produce models that are as good as those of kernel SVR.
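To make the setting concrete, the objective behind linear SVR is a minimal sketch away: minimize 0.5·wᵀw plus C times the epsilon-insensitive loss Σᵢ max(0, |wᵀxᵢ − yᵢ| − ε), with no kernel mapping. The sketch below illustrates this objective on a tiny 1-D problem using plain subgradient descent; the actual solvers studied in the thesis (e.g. the trust-region Newton and dual coordinate descent methods in LIBLINEAR) are far more sophisticated, and the step size, data, and stopping rule here are illustrative assumptions only.

```python
def train_linear_svr(xs, ys, C=1.0, eps=0.1, lr=0.001, iters=5000):
    """Minimize 0.5*w**2 + C * sum_i max(0, |w*x_i - y_i| - eps)
    by fixed-step subgradient descent (illustrative sketch, not
    the thesis's algorithm)."""
    w = 0.0
    for _ in range(iters):
        grad = w  # gradient of the regularizer 0.5 * w**2
        for x, y in zip(xs, ys):
            r = w * x - y
            if r > eps:        # residual above the epsilon-tube
                grad += C * x
            elif r < -eps:     # residual below the epsilon-tube
                grad -= C * x
            # inside the tube: zero loss, zero subgradient
        w -= lr * grad
    return w

# Data drawn from y = 2x; the learned slope should end up close to 2,
# slightly perturbed by the regularizer and the epsilon-tube.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x for x in xs]
w = train_linear_svr(xs, ys, C=10.0)
```

Because no kernel expansion is involved, each pass costs time linear in the number of nonzero features, which is the source of the speed advantage over kernel SVR discussed above.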

