Considers the problem of calculating the bias of the maximum likelihood information estimate H, based on independent choices among k events. The expectation EH is calculated exactly as a function of the probabilities p1, p2, …, pk. The bias H - EH is approximated by using a convergent expansion for the logarithm and the first two terms of a finite expansion for the jth moment of a random variable. The resulting approximation is more generally valid, although less concise and simple, than the classical Miller-Madow approximation. © 1969 American Psychological Association.
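As context for the estimators discussed, a minimal sketch of the plug-in (maximum likelihood) entropy estimate and the classical Miller-Madow bias correction, which adds (k − 1)/(2n) to the plug-in value. This is standard background, not the paper's own improved approximation; function names are illustrative.

```python
import math

def ml_entropy(counts):
    """Plug-in (maximum likelihood) entropy estimate, in nats,
    from a list of observed category counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def miller_madow_entropy(counts):
    """Miller-Madow bias-corrected estimate: plug-in entropy
    plus (k_observed - 1) / (2 n), where k_observed is the number
    of categories actually seen in the sample."""
    n = sum(counts)
    k = sum(1 for c in counts if c > 0)
    return ml_entropy(counts) + (k - 1) / (2 * n)

# Example: a uniform sample over two events, 5 observations each.
counts = [5, 5]
h_ml = ml_entropy(counts)            # log 2, about 0.693 nats
h_mm = miller_madow_entropy(counts)  # shifted up by 1/20
```

The plug-in estimate underestimates the true entropy on average; the Miller-Madow term partially cancels that downward bias, which is why the abstract compares the paper's expansion-based approximation against it.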