Dirichlet process (DP) mixture models are the cornerstone of non-parametric Bayesian statistics, and the development of Markov chain Monte Carlo (MCMC) sampling methods for DP mixtures has enabled the application of non-parametric Bayesian methods to a variety of practical data analysis problems. However, MCMC sampling can be prohibitively slow, and it is important to explore alternatives. One class of alternatives is provided by variational methods, a class of deterministic algorithms that convert inference problems into optimization problems (Opper and Saad 2001; Wainwright and Jordan 2003). Thus far, variational methods have mainly been explored in the parametric setting, in particular within the formalism of the exponential family (Attias 2000; Ghahramani and Beal 2001; Blei et al. 2003). In this paper, we present a variational inference algorithm for DP mixtures. We present experiments that compare the algorithm to Gibbs sampling algorithms for DP mixtures of Gaussians, and we present an application to a large-scale image analysis problem.
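As background for the model class discussed here, the following is a minimal illustrative sketch (not taken from the paper) of simulating data from a DP mixture of Gaussians via a truncated stick-breaking construction; the concentration parameter `alpha`, truncation level `T`, sample size `N`, and the Gaussian base measure are all assumed for illustration.

```python
# Sketch: data from a truncated stick-breaking DP mixture of Gaussians.
# All numerical settings below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

alpha = 1.0   # DP concentration parameter (assumed)
T = 20        # truncation level for the stick-breaking weights (assumed)
N = 500       # number of observations (assumed)

# Stick-breaking weights: v_t ~ Beta(1, alpha),
# pi_t = v_t * prod_{s < t} (1 - v_s)
v = rng.beta(1.0, alpha, size=T)
pi = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
pi /= pi.sum()  # renormalize after truncation

# Component means drawn from an assumed Gaussian base measure G0
means = rng.normal(0.0, 5.0, size=T)

# Cluster assignments and observations
z = rng.choice(T, size=N, p=pi)
x = rng.normal(means[z], 1.0)
```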