A method for Bayesian factor analysis (FA) of large matrices is proposed. It is assumed that a small number of matrix elements are initially observed, and the statistical FA model is employed to actively and sequentially select which new matrix entries would be most informative for estimating the remaining missing entries, i.e., for completing the matrix. The model inference and active learning are performed within an online variational Bayes (VB) framework. A fast and provably near-optimal greedy algorithm is used to sequentially maximize the mutual information contributed by new observations, taking advantage of submodularity properties. Additionally, a simple alternative procedure is proposed, in which the posterior parameters learned by the Bayesian approach are used directly. This alternative procedure is shown to yield slightly higher prediction error, but requires far fewer computational resources. The methods are demonstrated on a very large matrix factorization problem, namely the Yahoo! Music ratings dataset. © 2012 IEEE.
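The greedy selection step described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes a simplified Gaussian model in which the column factors V are fixed point estimates, each row's factor vector carries a Gaussian posterior, and the information gain of observing entry (i, j) reduces to 0.5·log(1 + vⱼᵀΣᵢvⱼ/σ²). All names and values (n_rows, n_cols, k, sigma2) are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_rows, n_cols, k = 6, 8, 3   # hypothetical matrix size and latent rank
sigma2 = 0.25                  # assumed known observation-noise variance

# Hypothetical point estimates of the column factors V; in the paper's
# setting these would come from the online VB posterior.
V = rng.normal(size=(n_cols, k))

# Independent Gaussian posteriors over each row's factor vector,
# initialized to the prior N(0, I).
Sigmas = [np.eye(k) for _ in range(n_rows)]

def info_gain(i, j):
    """Mutual information of observing entry (i, j) under this
    simplified Gaussian model: 0.5 * log(1 + v_j' Sigma_i v_j / sigma2)."""
    v = V[j]
    return 0.5 * np.log1p(v @ Sigmas[i] @ v / sigma2)

def observe(i, j):
    """Rank-1 covariance update after observing entry (i, j); shrinking
    Sigma_i gives the diminishing returns that make the objective submodular."""
    v = V[j]
    Sv = Sigmas[i] @ v
    Sigmas[i] = Sigmas[i] - np.outer(Sv, Sv) / (sigma2 + v @ Sv)

# Greedy active selection: repeatedly query the entry with the largest
# marginal information gain, then update the corresponding row posterior.
candidates = {(i, j) for i in range(n_rows) for j in range(n_cols)}
selected = []
for _ in range(10):  # query budget
    best = max(candidates, key=lambda e: info_gain(*e))
    selected.append(best)
    candidates.discard(best)
    observe(*best)

print(selected)
```

Because each observation only shrinks the posterior covariance, the marginal gain of any remaining entry never increases, which is the submodularity property the near-optimality guarantee of greedy selection rests on.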
|Original language||English (US)|
|Title of host publication||2012 IEEE Statistical Signal Processing Workshop, SSP 2012|
|Number of pages||4|
|State||Published - Nov 6 2012|