One of the central problems in classifier design is the estimation of classification error. Estimating the error probability of a classifier is difficult because the underlying sample distribution is unknown and only a small number of training samples is available. In this correspondence, we present a new approach to this problem. In our model, there are two types of classification error: empirical error and generalization error. The first is the error observed over the training samples; the second is the discrepancy between the error probability and the empirical error. In general, the error probability of a classifier can be bounded from above by the sum of the empirical and generalization errors. Since both terms depend on classifier complexity, a proper measure of classifier complexity is needed; in this research, we adopt the Vapnik-Chervonenkis dimension (VCdim) as that measure. Based on this complexity measure, we develop an estimate of the generalization error and propose an optimal classifier design criterion, the generalized minimum empirical error (GMEE) criterion. The GMEE criterion consists of two terms: the empirical error and the estimate of the generalization error. It is useful for optimal classifier design because the classifier that minimizes the criterion is the one with the smallest error probability. Furthermore, we prove that the GMEE criterion is Γ-optimal; that is, the criterion selects the best classifier from Γ, a collection of classifiers with finite VCdim. As an application, the criterion is used to design an optimal neural network classifier, and a corollary on the Γ-optimality of neural network-based classifiers is proven. Thus, our approach provides a theoretical foundation for the connectionist approach to optimal classifier design. Experimental results are given to validate the approach, followed by discussion and suggestions for future research.
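The structure of such a criterion can be illustrated with a standard VC-style sketch. The constants below follow Vapnik's textbook form of the generalization bound, not necessarily the exact estimate developed in the correspondence; the function names, the confidence parameter eta, and the example numbers are all illustrative assumptions:

```python
import math

def vc_generalization_term(n, vc_dim, eta=0.05):
    """Textbook VC-style bound on the generalization gap: with probability
    at least 1 - eta, the error probability exceeds the empirical error by
    at most this amount. The exact constants in the correspondence's
    estimate may differ."""
    h = vc_dim
    return math.sqrt((h * (math.log(2 * n / h) + 1) - math.log(eta / 4)) / n)

def gmee_style_criterion(empirical_error, n, vc_dim, eta=0.05):
    """GMEE-style objective: empirical error plus the estimated
    generalization term. Among candidate classifiers, the one
    minimizing this sum is preferred."""
    return empirical_error + vc_generalization_term(n, vc_dim, eta)

# With the same training-set size, a more complex classifier (larger
# VCdim) pays a larger complexity penalty, so a slightly higher
# empirical error can still win under the combined criterion.
simple_score = gmee_style_criterion(empirical_error=0.10, n=1000, vc_dim=10)
complex_score = gmee_style_criterion(empirical_error=0.08, n=1000, vc_dim=200)
```

In this sketch the simpler classifier attains the lower criterion value despite its higher empirical error, which is exactly the trade-off the criterion is designed to resolve.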