
Towards Faster Model Selection in Gibbs-Sampling-Based Bayesian Classification: Discrete Gradient Approach versus Component-Wise Optimization

Abstract


In Gibbs-sampling-based Bayesian classification, for each 'model' (i.e., each combination of the number of clusters, the degree of the corresponding polynomials, etc.), we find the posterior distribution corresponding to this model, and then select the model with the largest marginal likelihood. When there are only a few parameters, each with a few possible values, we can select the model by exhaustive search. For a larger number of parameters, however, exhaustive search requires too long a computation time, so faster model selection is needed. In this paper, we describe two possible methods for model selection: component-wise optimization and gradient ascent. We show that gradient ascent leads to faster model selection.
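The two strategies compared in the abstract can be sketched on a toy problem. This is a minimal illustration, not the paper's implementation: the `score` function below is a hypothetical stand-in for the log marginal likelihood over integer-valued model parameters (e.g., number of clusters and polynomial degree), and both optimizers are generic discrete-search routines.

```python
# Toy sketch (assumed, not from the paper): component-wise optimization
# versus discrete gradient ascent over integer model parameters.

def score(params):
    # Hypothetical stand-in for the log marginal likelihood:
    # a concave toy objective with a unique maximum at (4, 3).
    k, d = params
    return -((k - 4) ** 2) - 2 * ((d - 3) ** 2)

def component_wise(params, ranges, score):
    """Optimize one parameter at a time, scanning its full range,
    and repeat passes until no single-parameter change improves the score."""
    params = list(params)
    improved = True
    while improved:
        improved = False
        for i, r in enumerate(ranges):
            best_v, best_s = params[i], score(params)
            for v in r:
                params[i] = v
                s = score(params)
                if s > best_s:
                    best_v, best_s = v, s
                    improved = True
            params[i] = best_v
    return tuple(params)

def discrete_gradient_ascent(params, ranges, score):
    """Estimate a discrete gradient by probing +/-1 moves in each
    coordinate; take the single best improving move per step and stop
    at a local maximum. Fewer evaluations per step than a full scan."""
    params = list(params)
    while True:
        best_move, best_s = None, score(params)
        for i, r in enumerate(ranges):
            for delta in (-1, 1):
                v = params[i] + delta
                if min(r) <= v <= max(r):
                    trial = params[:]
                    trial[i] = v
                    s = score(trial)
                    if s > best_s:
                        best_move, best_s = (i, v), s
        if best_move is None:
            return tuple(params)
        i, v = best_move
        params[i] = v

# Assumed toy search space: clusters 1..10, polynomial degree 1..7.
ranges = [range(1, 11), range(1, 8)]
print(component_wise([1, 1], ranges, score))            # (4, 3)
print(discrete_gradient_ascent([1, 1], ranges, score))  # (4, 3)
```

On this concave toy objective both methods reach the same optimum; the practical difference the paper studies is cost, since each score evaluation requires a full Gibbs-sampling run, and the gradient-style search probes only neighboring parameter values rather than scanning each parameter's entire range.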
