Start with a multiple regression in which each predictor enters linearly. How can we tell whether there is a curve, so that the linear model is not valid? Perhaps one of the predictors needs an additional term, such as a square or a square root. We focus on the case in which an additional term is needed, rather than the monotonic case in which a power transformation or a logarithm might be sufficient. Among the plots that have been used for diagnostic purposes, nine methods are applied here. All nine work well when the predictors are unrelated to one another, but two of them are designed to work even when the predictors are arbitrary noisy functions of each other. These two are recent methods: Cook's CERES plot and the plot for an additive model with nonparametric smoothing applied to one predictor. Even these plots, however, can miss a curve in some cases and show a false curve in others. To give a measure of curve detection, the curve can be fitted nonparametrically, and this fit can be used in place of the predictor in the multiple regression. When a curve is detected, it can be approximated by a parametric curve, such as a polynomial in an arbitrary power of the predictor.
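The closing idea, fitting the curve nonparametrically and substituting that fit for the predictor, can be sketched in a few lines. The sketch below is only one way to carry it out, under the assumption that a lowess smooth of the partial (component-plus-residual) values is an acceptable nonparametric fit; the function name curve_check and the comparison of R-squared values are illustrative and are not part of the nine methods compared here.

```python
# A minimal sketch (not the authors' code): smooth the partial residuals
# for one predictor, refit the regression with the smoothed values in
# place of that predictor, and compare the two fits.
import numpy as np
import statsmodels.api as sm
from statsmodels.nonparametric.smoothers_lowess import lowess

def curve_check(y, X, j, frac=0.5):
    """Compare a linear fit with one in which predictor j is replaced
    by a nonparametric (lowess) estimate of its effect on y."""
    Xc = sm.add_constant(X)
    linear_fit = sm.OLS(y, Xc).fit()

    # Component-plus-residual values for predictor j.
    b_j = linear_fit.params[j + 1]          # +1 skips the constant
    partial = linear_fit.resid + b_j * X[:, j]

    # Nonparametric fit of the partial residuals against predictor j.
    smooth = lowess(partial, X[:, j], frac=frac, return_sorted=False)

    # Use the fitted curve in place of predictor j and refit.
    X_new = X.copy()
    X_new[:, j] = smooth
    curved_fit = sm.OLS(y, sm.add_constant(X_new)).fit()

    # A noticeably better fit suggests a curve in predictor j.
    return linear_fit.rsquared, curved_fit.rsquared

# Simulated example: the first predictor enters as a square, the second linearly.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X[:, 0] ** 2 + X[:, 1] + rng.normal(scale=0.5, size=200)
print(curve_check(y, X, j=0))   # R-squared: linear term vs. smoothed term
```

In the simulated call, the second fit should show a clearly higher R-squared because the first predictor actually enters as a square; the R-squared comparison here is informal, since the two models are not nested.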