We have a new paper appearing at the forthcoming AISTATS 2012 -- you can get the paper and code here:
M. E. Khan, S. Mohamed, B. M. Marlin and K. P. Murphy. A stick-breaking likelihood for categorical data analysis with latent Gaussian models, AISTATS, April 2012.
In this paper we look at building models for the analysis of categorical (multi-class) data. We try to be as general as possible, covering both multi-class Gaussian process classification and categorical factor analysis. Emtiyaz will soon be on the post-doc trail, so you might hear about this live in a lab near you.

Existing models use probit and logit link functions; here we look at a third, new likelihood function, which we call the stick-breaking likelihood (related to the stick-breaking you know from Bayesian non-parametrics). We combine this likelihood with variational inference and show convincing results in favour of our new likelihood. One of the key messages is that this likelihood, in combination with the variational EM algorithm we propose, gives better correspondence between the marginal likelihood and the prediction error: choosing hyperparameters by optimising the marginal likelihood will also give good prediction accuracy, which is not the case with other approaches.

The paper has all the details, and all the Matlab code is online as well, so feel free to play around with it and let us know what you think.
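If you just want a quick feel for what a stick-breaking construction does before digging into the paper, here is a rough Python sketch of the basic idea: each latent value is squashed through a logistic function and breaks off a piece of the remaining probability "stick" for its class, with the last class taking whatever is left. This is only a toy illustration of the construction (the function name and the logistic link here are my shorthand), not the paper's model or its Matlab code.

```python
import numpy as np

def stick_breaking_probs(eta):
    """Map K-1 latent values to K class probabilities via stick-breaking.

    Toy sketch: logistic-squash each latent value, then peel off that
    fraction of the remaining probability mass for the corresponding class.
    """
    sigma = 1.0 / (1.0 + np.exp(-np.asarray(eta, dtype=float)))
    probs = []
    stick = 1.0                   # remaining probability mass (the "stick")
    for v in sigma:
        probs.append(stick * v)   # break off a piece for this class
        stick *= (1.0 - v)        # shrink the remaining stick
    probs.append(stick)           # the last class gets whatever is left
    return np.array(probs)

# Example: three latent values define a 4-class distribution (sums to 1)
print(stick_breaking_probs([0.5, -1.0, 2.0]))
```

The nice property of this kind of construction is that each class probability only depends on the latent values up to that class, rather than requiring a normalisation over all classes as in the multinomial logit; the paper explains how this is exploited for variational inference.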