The problem of estimating prior probabilities in a mixture of M classes with known class-conditional distributions is studied. The observation is a sequence of n independent, identically distributed mixture random variables. Estimation is based on the first moments of appropriately formulated functions of the observations. These functions range from linear functions of the observations (in some cases) to more complex functions of the class-conditional density functions, depending on the desired balance between computational simplicity and theoretical properties. A closed-form, recursive, unbiased, and convergent estimator using the density functions is presented; the result is valid for any problem in which the prior probabilities are identifiable. Discrete and mixed densities require a minor modification, which is also worked out. Three application examples are described. The class-conditional expectations of the density functions, required to initialize the estimator algorithm, are evaluated analytically for Gaussian and exponential densities. Simulation results on Gaussian mixtures are included, and the performance of the existing and proposed estimators is briefly compared. One of the proposed estimators is shown to be mathematically equivalent to an existing method while being computationally efficient to implement. © 1990 IEEE
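To make the moment-based idea concrete, the following is a minimal illustrative sketch, not the paper's published algorithm: for a two-component Gaussian mixture with known class-conditional densities, the priors are recovered by matching sample first moments of chosen functions of the observations. Both extremes mentioned in the abstract are shown: a linear function of the observations and the class-conditional densities themselves (whose class-conditional expectations are available in closed form for Gaussians). All parameter values and variable names are assumptions chosen for the example.

```python
# Illustrative sketch: moment-based prior estimation for a two-class Gaussian
# mixture with known class-conditional densities (assumed example parameters).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Known class-conditional densities N(mu_j, sigma_j^2) and true (unknown) priors.
mu = np.array([0.0, 3.0])
sigma = np.array([1.0, 1.5])
p_true = np.array([0.3, 0.7])

# Draw n i.i.d. mixture observations.
n = 5000
labels = rng.choice(2, size=n, p=p_true)
x = rng.normal(mu[labels], sigma[labels])

# (a) Linear-moment estimator: E[X] = p1*mu1 + (1 - p1)*mu2.
p1_linear = (x.mean() - mu[1]) / (mu[0] - mu[1])

# (b) Density-function estimator: use h_i = f_i, the class-conditional densities.
# E[h_i(X)] = sum_j p_j * A_ij with A_ij = integral of f_i * f_j, which for
# Gaussians has the closed form N(mu_i; mu_j, sigma_i^2 + sigma_j^2).
A = np.array([[norm.pdf(mu[i], mu[j], np.sqrt(sigma[i]**2 + sigma[j]**2))
               for j in range(2)] for i in range(2)])

# Recursive (running-mean) update of the sample moments b_n = (1/n) sum h(x_k),
# followed by solving A p = b once all observations are in.
b = np.zeros(2)
for k, xk in enumerate(x, start=1):
    h = norm.pdf(xk, mu, sigma)      # h_i(x_k) = f_i(x_k)
    b += (h - b) / k                 # unbiased recursive moment estimate
p_density = np.linalg.solve(A, b)
p_density /= p_density.sum()         # project back onto the probability simplex

print("linear-moment  p1 estimate :", p1_linear)
print("density-moment p  estimate :", p_density)
print("true priors                :", p_true)
```

In this sketch the matrix A plays the role of the class-conditional expectations of the density functions that initialize the recursion; the moment vector b can be updated observation by observation, so the prior estimate can be refreshed at any time by solving the small linear system.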