The likelihood principle is one of the most important concepts in statistics. Among other things, it is used to obtain point estimators for the parameters of probability distributions of random variables by maximizing the likelihood function. The resulting maximum likelihood estimators usually have desirable properties such as consistency and efficiency. However, these estimators are often not robust, because there is usually a trade-off between robustness and efficiency: the more robust an estimator is, the less efficient it may be when the data come from a Gaussian distribution. In this paper we investigate how the estimators change when the likelihood function is replaced by a trimmed version of it. The idea is to trim the likelihood function rather than trim the data directly. Because the likelihood is scalar-valued, univariate as well as multivariate observations can always be ordered and trimmed according to their contributions to the likelihood function. The degree of trimming depends on parameters specified by the analyst. We show how this trimmed likelihood principle produces many existing estimators (e.g., maximum likelihood, least squares, least trimmed squares, least median of squares, and minimum volume ellipsoid estimators) as special cases. Because the resulting estimators can be very robust, they can be used, for example, for outlier detection. In some cases the estimators can be obtained in closed form; in others, numerical solutions are required, and we provide several algorithms for computing the estimates. The method and the algorithms are illustrated by several examples involving both discrete and continuous distributions.
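
To make the trimming idea concrete, the sketch below applies it to a univariate Gaussian sample: observations are ordered by their log-likelihood contributions under the current fit, the smallest contributions are trimmed, and the model is refit on the retained subset until that subset stabilizes. This is a minimal illustrative heuristic under assumed choices, not one of the paper's algorithms; the function name trimmed_mle and the subset size h are hypothetical.

    import numpy as np

    def trimmed_mle(x, h, max_iter=100):
        """Maximize the trimmed Gaussian likelihood over subsets of size h."""
        x = np.asarray(x, dtype=float)
        # Robust starting subset: the h observations closest to the median.
        subset = np.argsort(np.abs(x - np.median(x)))[:h]
        for _ in range(max_iter):
            mu, sigma = x[subset].mean(), x[subset].std()
            # Log-likelihood contribution of every observation under the current fit.
            loglik = -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)
            # Keep the h largest contributions; trim the rest.
            new_subset = np.argsort(loglik)[-h:]
            if np.array_equal(np.sort(new_subset), np.sort(subset)):
                break  # retained subset is stable
            subset = new_subset
        mu, sigma = x[subset].mean(), x[subset].std()  # final fit on retained subset
        return mu, sigma, np.sort(subset)

    # Example: 95 points from N(0, 1) contaminated with 5 gross outliers.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(20.0, 1.0, 5)])
    mu, sigma, kept = trimmed_mle(x, h=90)
    print(f"trimmed estimates: mu = {mu:.3f}, sigma = {sigma:.3f}")

Note that for the Gaussian, the smallest log-likelihood contributions correspond exactly to the largest squared residuals, which illustrates why least trimmed squares arises as a special case of the trimmed likelihood principle.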