

Comparison of L2-Regularized Multi-Class Linear Classifiers

Advisor: Chih-Jen Lin (林智仁)

Abstract


The classification problem appears in many applications such as document classification and web page search. The support vector machine (SVM) is one of the most popular tools for classification. One component of an SVM is the kernel trick: kernels are used to map data into a higher-dimensional space, and this technique is applied in non-linear SVMs. For large-scale sparse data, the linear kernel is used instead; such an SVM is called a linear SVM. Different SVMs apply different loss functions: those using the L1 loss and the L2 loss are called the L1-SVM and the L2-SVM, respectively. SVMs can also handle multi-class classification through one-against-one or one-against-all approaches. In this thesis, several models, including logistic regression, L1-SVM, L2-SVM, the multi-class formulation of Crammer and Singer, and maximum entropy, are compared on the multi-class classification task.
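The comparison described above can be sketched with scikit-learn, whose linear classifiers are backed by the LIBLINEAR library; this is an illustrative setup, not the thesis's actual experimental protocol, and the dataset and hyperparameters (digits, default C, one-vs-rest) are assumptions for the sketch.

```python
# A minimal sketch, assuming scikit-learn is installed, of comparing
# L2-regularized linear classifiers on a multi-class task.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    # Logistic regression with L2 regularization (closely related to
    # maximum entropy in the multi-class case).
    "logistic regression": LogisticRegression(max_iter=5000),
    # L1-SVM: hinge (L1) loss, one-against-all by default.
    "L1-SVM": LinearSVC(loss="hinge", max_iter=10000),
    # L2-SVM: squared hinge (L2) loss, one-against-all by default.
    "L2-SVM": LinearSVC(loss="squared_hinge", max_iter=10000),
    # Crammer and Singer's single multi-class formulation.
    "Crammer and Singer": LinearSVC(multi_class="crammer_singer",
                                    max_iter=10000),
}

for name, clf in models.items():
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy {clf.score(X_test, y_test):.3f}")
```

Accuracies on this small dense dataset are typically close for all four models; the differences the thesis studies show up mainly in training cost and behavior on large sparse data.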

