
Dual Coordinate Descent Methods for Large-scale Linear Support Vector Machines

Advisor: Chih-Jen Lin (林智仁)

Abstract


In many classification problems the training data are very large: not only are there many instances, but also many features. The linear Support Vector Machine (SVM) is a popular model for such large-scale classification problems. In this thesis we propose a new dual coordinate descent method for solving linear SVM with L1- and L2-loss functions. The method is simple, and it reaches an ε-accurate solution in O(log(1/ε)) iterations. Experiments show that, compared with state-of-the-art solvers such as Pegasos, TRON, and SVMperf, our method obtains the solution in less time. In addition, we extend the dual coordinate descent method to large-scale multi-class classification problems, and we describe our implementation in the software LIBLINEAR.
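
For context, the dual problem that such a coordinate descent method minimizes can be written as follows; this is the standard dual formulation in our own notation, not text taken from the thesis:

\[
\min_{\alpha}\; f(\alpha) = \tfrac{1}{2}\,\alpha^{\top}\bar{Q}\,\alpha - e^{\top}\alpha
\quad\text{subject to}\quad 0 \le \alpha_i \le U,\; i = 1,\dots,l,
\]

where \(\bar{Q} = Q + D\) with \(Q_{ij} = y_i y_j x_i^{\top} x_j\), \(D\) diagonal, and \(e\) the vector of ones. For the L1-loss SVM, \(U = C\) and \(D_{ii} = 0\); for the L2-loss SVM, \(U = \infty\) and \(D_{ii} = 1/(2C)\). Updating one variable at a time gives the closed-form step

\[
\alpha_i \leftarrow \min\!\Big(\max\Big(\alpha_i - \frac{\nabla_i f(\alpha)}{\bar{Q}_{ii}},\, 0\Big),\, U\Big),
\qquad
\nabla_i f(\alpha) = y_i\, w^{\top} x_i - 1 + D_{ii}\,\alpha_i,
\]

where \(w = \sum_j y_j \alpha_j x_j\) is maintained incrementally after each update, so one step costs only O(number of nonzeros in \(x_i\)).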

Parallel Abstract


In many applications, data appear with a huge number of instances as well as features. Linear Support Vector Machines (SVM) is one of the most popular tools to deal with such large-scale sparse data. In this thesis, we present a novel dual coordinate descent method for linear SVM with L1- and L2-loss functions. The proposed method is simple and reaches an e-accurate solution in O(log (1/e)) iterations. Experiments indicate that our method is much faster than state of the art solvers such as Pegasos, TRON, SVMperf, and a recent primal coordinate descent implementation. In addition, we extended the proposed method to solve multi-class problems. We also describe our implementation for the software LIBLINEAR.
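
To illustrate the update rule sketched above, the following is a minimal NumPy implementation of dual coordinate descent for the L1-loss linear SVM. It is a sketch under the notation introduced earlier, not the actual LIBLINEAR code; the function name and parameters are our own.

import numpy as np

def dual_cd_l1_svm(X, y, C=1.0, max_iter=200, tol=1e-6):
    # X: (l, n) data matrix; y: labels in {-1, +1}.
    # Solves min_a 0.5*a'Qa - e'a s.t. 0 <= a_i <= C, with Q_ij = y_i y_j x_i'x_j.
    l, n = X.shape
    alpha = np.zeros(l)
    w = np.zeros(n)                       # w = sum_j y_j alpha_j x_j
    Qii = np.einsum('ij,ij->i', X, X)     # diagonal of Q: ||x_i||^2
    for _ in range(max_iter):
        max_pg = 0.0
        for i in np.random.permutation(l):
            G = y[i] * w.dot(X[i]) - 1.0  # gradient along alpha_i
            # projected gradient for the box constraint [0, C]
            if alpha[i] == 0.0:
                PG = min(G, 0.0)
            elif alpha[i] == C:
                PG = max(G, 0.0)
            else:
                PG = G
            max_pg = max(max_pg, abs(PG))
            if PG != 0.0 and Qii[i] > 0.0:
                old = alpha[i]
                alpha[i] = min(max(old - G / Qii[i], 0.0), C)
                w += (alpha[i] - old) * y[i] * X[i]   # keep w in sync
        if max_pg < tol:                  # stop when no coordinate violates optimality
            break
    return w, alpha

For a new instance x, the trained model predicts sign(w·x). In recent LIBLINEAR releases the corresponding dual solvers are selected with "train -s 3" (L1-loss) and "train -s 1" (L2-loss), although solver numbering has changed across versions.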

References


B. E. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In COLT, 1992.
L. Bottou. Stochastic gradient descent examples, 2007. http://leon.bottou.
C.-C. Chang and C.-J. Lin. LIBSVM: a library for support vector machines, 2001.
K.-W. Chang, C.-J. Hsieh, and C.-J. Lin. Coordinate descent method for large-scale L2-loss linear support vector machines. Journal of Machine Learning Research, 9:1369–1398, 2008.
