Variational Bayes Learning Of Graphical Models With Hidden Variables
Hang Yu, Justin Dauwels

Abstract:
Hidden variable graphical models are powerful tools for describing high-dimensional data; they capture the dependencies between observed variables by introducing a suitable number of hidden variables. Existing methods for learning the dependence structure of hidden variable graphical models are derived from the idea of maximizing a penalized likelihood, and are therefore burdened with the troublesome problem of selecting the regularization parameters. In this paper, we show that this problem can be circumvented by treating the penalty parameters as random variables and casting hidden variable graphical models in a Bayesian formulation. An efficient variational Bayes algorithm is then developed to adaptively learn the graphical model as well as the distribution of the penalty parameters. Numerical results on both synthetic and real data show that the proposed variational Bayes method yields performance comparable to or better than the stability-selection-based maximum penalized likelihood method, while requiring several orders of magnitude less computational time.
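To make the contrast concrete, the following is a minimal sketch of the two formulations, assuming the common sparse-plus-low-rank parameterization of latent-variable Gaussian graphical models; the symbols $S$ (sparse precision term among the observed variables), $L$ (low-rank term induced by the hidden variables), $\hat{\Sigma}$ (sample covariance), and the penalty parameters $\lambda, \gamma$ are illustrative assumptions rather than the paper's own notation. The penalized maximum likelihood estimator solves

\[
(\hat{S}, \hat{L}) \;=\; \arg\min_{S - L \succ 0,\; L \succeq 0} \; -\log\det(S - L) \;+\; \operatorname{tr}\!\big(\hat{\Sigma}\,(S - L)\big) \;+\; \lambda \lVert S \rVert_{1} \;+\; \gamma \operatorname{tr}(L),
\]

where $\lambda$ and $\gamma$ must be tuned externally, e.g., by cross-validation or stability selection. A Bayesian reformulation of this kind instead reads the penalties as priors, $p(S \mid \lambda) \propto \exp(-\lambda \lVert S \rVert_{1})$ and $p(L \mid \gamma) \propto \exp(-\gamma \operatorname{tr}(L))$, places hyperpriors on $\lambda$ and $\gamma$, and approximates the joint posterior by a factorized variational distribution,

\[
p(S, L, \lambda, \gamma \mid \mathbf{X}) \;\approx\; q(S)\, q(L)\, q(\lambda)\, q(\gamma),
\]

whose factors are updated in turn, so that the penalty parameters are inferred jointly with the graphical model rather than selected through a separate, computationally costly search.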