A long-standing goal of quantitative research in cognitive psychology has been to provide a lawful description of the retention of information over time. While a number of theoretical alternatives for a retention function have been developed, their empirical evaluation has relied almost exclusively on their ability to fit experimental data. This has meant that the issue of model complexity, which involves both the number of parameters in a model and the functional form of their interaction, has generally not been addressed in a rigorous way. This paper develops a Bayesian method for comparing retention models that naturally balances the competing demands of goodness-of-fit and complexity. We first implement the Bayesian method using numerical techniques, highlighting its basic properties and showing, in particular, how assumptions about the precision of the data affect the inferences that are drawn. We then develop an analytic Bayesian method, based on the Laplacian approximation, that offers theoretical insight into the inherent complexities of different retention functions and has the practical advantage of being computationally efficient. We demonstrate both methods by evaluating linear, hyperbolic, exponential, logarithmic, and power retention functions against the collection of data sets considered by Rubin and Wenzel (1996).
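The sketch below is not the authors' implementation; it is a minimal illustration, under assumed Gaussian noise with a known standard deviation and a flat unit-volume prior, of how the five candidate retention functions named in the abstract could be compared via a Laplace approximation to the log marginal likelihood. The parameter names a and b, the noise level, and the toy retention data are all hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Candidate retention functions: retention p as a function of delay t
# (hypothetical two-parameter forms for illustration).
RETENTION_FUNCTIONS = {
    "linear":      lambda t, a, b: a - b * t,
    "hyperbolic":  lambda t, a, b: 1.0 / (a + b * t),
    "exponential": lambda t, a, b: a * np.exp(-b * t),
    "logarithmic": lambda t, a, b: a - b * np.log(t),
    "power":       lambda t, a, b: a * t ** (-b),
}

def neg_log_likelihood(params, t, y, f, sigma=0.05):
    """Gaussian likelihood with an assumed known noise sd (an illustrative choice)."""
    a, b = params
    resid = y - f(t, a, b)
    return 0.5 * np.sum(resid ** 2) / sigma ** 2 + len(y) * np.log(sigma * np.sqrt(2 * np.pi))

def laplace_log_evidence(t, y, f, sigma=0.05):
    """Laplace approximation: log p(D|M) ~ log L(theta*) + (k/2) log 2*pi - 0.5 log|H|,
    assuming a flat prior over a unit-volume parameter region (a simplification)."""
    k = 2
    fit = minimize(neg_log_likelihood, x0=np.array([0.9, 0.1]),
                   args=(t, y, f, sigma), method="Nelder-Mead")
    # Numerical (forward-difference) Hessian of the negative log-likelihood at the mode.
    eps = 1e-4
    H = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            di, dj = np.eye(k)[i] * eps, np.eye(k)[j] * eps
            H[i, j] = (neg_log_likelihood(fit.x + di + dj, t, y, f, sigma)
                       - neg_log_likelihood(fit.x + di, t, y, f, sigma)
                       - neg_log_likelihood(fit.x + dj, t, y, f, sigma)
                       + neg_log_likelihood(fit.x, t, y, f, sigma)) / eps ** 2
    _, logdet = np.linalg.slogdet(H)
    return -fit.fun + 0.5 * k * np.log(2 * np.pi) - 0.5 * logdet

# Hypothetical retention data: proportion retained at increasing delays.
t = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
y = np.array([0.80, 0.68, 0.58, 0.50, 0.43, 0.37])

for name, f in RETENTION_FUNCTIONS.items():
    print(f"{name:12s} approx. log evidence = {laplace_log_evidence(t, y, f):.2f}")
```

Because the Hessian's log-determinant grows with the sharpness of the likelihood peak, this comparison penalizes functional forms that achieve their fit only in a narrow region of parameter space, which is the sense in which the Bayesian approach trades off goodness-of-fit against complexity.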