
A Study on the Performance of LightGBM and CatBoost on Categorical Datasets

Advisor: 蔣明晃

Abstract


Gradient boosting decision tree (GBDT) algorithms are widely used in industry, academia, and machine-learning competitions on today's small and medium-sized datasets. This thesis compares the two most commonly used GBDT packages, LightGBM and CatBoost, and investigates the causes of the performance differences between them. To make the comparison fair and consistent, we designed an experiment around the characteristics of typical real-world datasets and selected datasets that satisfy its constraints. The results show that CatBoost indeed predicts better on datasets with more categorical columns, whereas LightGBM tends to rely on numerical columns for prediction. In training time, LightGBM is consistently faster than CatBoost.
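As background to the GBDT methods the abstract compares, gradient boosting fits each new tree to the residuals of the current ensemble. A minimal pure-Python sketch using one-dimensional regression stumps (the function names are illustrative, not any library's API; real LightGBM and CatBoost differ in tree construction, histogram binning, and regularization):

```python
def best_stump(xs, residuals):
    """Pick the threshold whose two-sided mean fit minimizes squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not right:  # degenerate split: every point on one side
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm


def gbdt_fit(xs, ys, n_trees=200, lr=0.1):
    """Boost stumps on residuals; returns the ensemble as a function."""
    base = sum(ys) / len(ys)          # start from the target mean
    pred = [base] * len(xs)
    stumps = []
    for _ in range(n_trees):
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = best_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)
```

With 200 boosting rounds at learning rate 0.1, the ensemble fits a small training set almost exactly, illustrating the residual-fitting loop that both packages build on.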

Abstract (English)


On small and medium-sized datasets, gradient boosting decision tree (GBDT) methods have proven effective in both academic and competitive settings. This paper compares the performance of the two most widely used GBDT methods, LightGBM and CatBoost, and investigates the reasons behind their differences. To make the comparison fair, we designed an experiment based on dataset characteristics and selected several suitable real-world datasets accordingly. The results indicate that CatBoost tends to perform better when a dataset has more categorical columns, while LightGBM tends to rely on numerical columns for prediction. In training speed, LightGBM is faster than CatBoost under all circumstances.
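CatBoost's edge on categorical columns comes largely from ordered target statistics: each categorical value is replaced by a smoothed mean of the target over only the rows that appear earlier in a random permutation, so a row never sees its own label. A minimal sketch, assuming numeric targets (the function and parameter names are illustrative, not CatBoost's API):

```python
import random


def ordered_target_stats(categories, targets, prior_weight=1.0, seed=0):
    """Encode each categorical value with the smoothed mean target of
    the rows that precede it in a random permutation."""
    n = len(categories)
    prior = sum(targets) / n            # global target mean as the prior
    order = list(range(n))
    random.Random(seed).shuffle(order)  # random "time" order over rows
    sums, counts = {}, {}
    encoded = [0.0] * n
    for i in order:
        c = categories[i]
        s, k = sums.get(c, 0.0), counts.get(c, 0)
        # only rows seen earlier in the permutation contribute
        encoded[i] = (s + prior_weight * prior) / (k + prior_weight)
        sums[c] = s + targets[i]
        counts[c] = k + 1
    return encoded
```

Because the encoding of every row excludes that row's own target, the statistic avoids the target leakage that plain mean-encoding suffers from; LightGBM instead partitions raw category values by gradient statistics at each split.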

Keywords

Gradient Boosting, LightGBM, CatBoost, Big Data, Data Mining

