Bayesian Mixture Models with Weight-Dependent Component Priors

  • Elaheh Oftadeh
  • Jian Zhang


In conventional Bayesian mixture models, independent priors are often assigned to the weights and the component parameters. When there is large variation across component weights, these independent priors can dominate some components and bias the estimation of missing group memberships. To tackle this issue, we propose weight-dependent priors for the component parameters. To implement the proposal, we develop a simple coordinate-wise updating algorithm for finding an empirical Bayes estimator of the allocation (labelling) vector of the observations. A simulation study shows that the new method can outperform existing approaches in terms of the adjusted Rand index. The proposed method is further demonstrated by a real data analysis.
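The coordinate-wise updating idea can be illustrated with a generic iterative-conditional-modes style sketch for a unit-variance Gaussian mixture: each allocation is moved in turn to the component with the highest current log posterior, and plug-in weight and mean estimates are refreshed from the new allocation. The function name, the rank-based initialisation, and the unit-variance assumption are all illustrative choices of ours, not the paper's weight-dependent-prior algorithm.

```python
import numpy as np

def coordinate_wise_labels(x, K, n_iter=50):
    """Generic coordinate-wise (ICM-style) label update for a
    unit-variance Gaussian mixture; an illustration only."""
    # Rank-based initial allocation: K equal-sized bins along the data order.
    z = np.argsort(np.argsort(x)) * K // len(x)
    for _ in range(n_iter):
        counts = np.bincount(z, minlength=K)
        pi = (counts + 1) / (len(x) + K)      # smoothed weight estimates
        mu = np.array([x[z == k].mean() if counts[k] else x.mean()
                       for k in range(K)])
        # Coordinate-wise update: each z_i moves to the component with
        # the highest current log posterior under unit-variance Gaussians.
        log_post = np.log(pi) - 0.5 * (x[:, None] - mu[None, :]) ** 2
        z_new = log_post.argmax(axis=1)
        if np.array_equal(z_new, z):          # converged: no label changed
            break
        z = z_new
    return z
```

On well-separated data this converges in a few sweeps; the smoothing of the weight estimates is one simple way to keep empty components from producing degenerate updates.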



The research of Elaheh Oftadeh is supported by the 50th anniversary PhD scholarship from the University of Kent.
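The adjusted Rand index mentioned in the abstract is Hubert and Arabie's chance-corrected agreement measure between two partitions; it equals 1 for identical partitions (up to label permutation) and has expected value 0 under random labelling. A minimal self-contained implementation (the function name is ours):

```python
from collections import Counter
from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Adjusted Rand index between two labellings of the same items."""
    n = len(labels_a)
    # Contingency counts n_ij, row sums a_i, column sums b_j.
    pair_counts = Counter(zip(labels_a, labels_b))
    sum_ij = sum(comb(c, 2) for c in pair_counts.values())
    sum_a = sum(comb(c, 2) for c in Counter(labels_a).values())
    sum_b = sum(comb(c, 2) for c in Counter(labels_b).values())
    expected = sum_a * sum_b / comb(n, 2)     # chance-agreement correction
    max_index = (sum_a + sum_b) / 2
    return (sum_ij - expected) / (max_index - expected)
```

Because the index is invariant to permutations of the cluster labels, it is a natural criterion for comparing an estimated allocation vector with the true group memberships.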



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. School of Mathematics, Statistics and Actuarial Science, University of Kent, Canterbury, UK
