Bayesian Model Selection in Finite Mixtures by Marginal Density Decompositions

Abstract
We consider the problem of estimating the number of components d and the unknown mixing distribution in a finite mixture model, in which d is bounded by some fixed finite number N. Our approach relies on the use of a prior over the space of mixing distributions with at most N components. By decomposing the resulting marginal density under this prior, we discover a weighted Bayes factor method for consistently estimating d that can be implemented by an iid generalized weighted Chinese restaurant (GWCR) Monte Carlo algorithm. We also discuss a Gibbs sampling method (the blocked Gibbs sampler) for estimating d and also the mixing distribution. We show that our resulting posterior is consistent and achieves the frequentist optimal O_p(n^{-1/4}) rate of estimation. We compare the performance of the new GWCR model selection procedure with that of the Akaike information criterion and the Bayes information criterion implemented through an EM algorithm. Applications of our methods to five real datasets and simulations are considered.
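
To fix ideas for the setup summarized above, the following is a minimal sketch of a generic finite mixture and its marginal density under a prior on mixing distributions; the kernel k, mixing distribution Q, and prior pi are generic placeholders and are not the paper's own notation, and only the ordinary (unweighted) Bayes factor is shown, not the paper's weighted variant obtained from the marginal density decomposition.

\[
  f(x \mid Q) \;=\; \int k(x \mid \theta)\, dQ(\theta),
  \qquad
  Q \;=\; \sum_{j=1}^{d} w_j\, \delta_{\theta_j},
  \qquad 1 \le d \le N,
\]
where $k(\cdot \mid \theta)$ is a component density, $w_1,\dots,w_d$ are
nonnegative weights summing to one, and $\delta_{\theta}$ denotes a point mass
at $\theta$. Under a prior $\pi$ on mixing distributions with at most $N$
support points, the marginal density of the data $x_1,\dots,x_n$ is
\[
  m(x_1,\dots,x_n) \;=\; \int \prod_{i=1}^{n} f(x_i \mid Q)\, d\pi(Q),
\]
and a Bayes factor comparing $d$ against $d'$ components is the ratio of the
marginal densities obtained by restricting $\pi$ to mixing distributions with
exactly $d$ and exactly $d'$ support points, respectively.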