While it is well known that "biased galaxy formation" can increase the strength of galaxy clustering, it is less clear whether straightforward biasing schemes can change the shape of the galaxy correlation function on large scales. Here we consider "local" biasing models, in which the galaxy density field $\delta_g$ at a point $\mathbf{x}$ is a function of the matter density field $\delta$ at that point: $\delta_g = f(\delta)$. We consider both deterministic biasing, where $f$ is simply a function, and stochastic biasing, in which the galaxy density $\delta_g$ is a random variable whose distribution depends on the matter density: $\delta_g = X(\delta)$. We show that even when this mapping is performed on a highly nonlinear density field with a hierarchical correlation structure, the correlation function $\xi$ is simply scaled up by a constant, as long as $\xi \ll 1$. In stochastic biasing models, the galaxy autocorrelation function behaves exactly as in deterministic models, with $\bar{X}(\delta)$ (the mean value of $X$ for a given value of $\delta$) taking the role of the deterministic bias function. We extend our results to the power spectrum $P(k)$, showing that for sufficiently small $k$ the effect of local biasing is equivalent to multiplying $P(k)$ by a constant and adding a constant term. If a cosmological model predicts a large-scale mass correlation function in conflict with the shape of the observed galaxy correlation function, then the model cannot be rescued by appealing to a complicated but local relation between galaxies and mass.
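As a concrete illustration of the $\xi \ll 1$ result, the sketch below applies a deterministic local bias to correlated Gaussian pairs and checks numerically that $\xi_g/\xi$ flattens to a constant $b^2$ as $\xi \to 0$. This is a minimal toy, not the paper's calculation: it assumes a Gaussian (rather than hierarchical non-Gaussian) matter field, and the threshold-like mapping `f`, the helper `xi_g`, and all parameter values are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

def f(delta):
    """Toy deterministic local bias: galaxies form only in overdense
    regions. An illustrative choice, not the paper's specific model."""
    return np.maximum(delta, 0.0)

def xi_g(xi_m, n_pairs=4_000_000):
    """Monte Carlo estimate of the galaxy correlation induced by applying
    f to a unit-variance Gaussian field whose matter correlation at some
    separation r is xi_m."""
    # Draw bivariate Gaussian pairs (delta1, delta2) with correlation xi_m.
    d1 = rng.standard_normal(n_pairs)
    d2 = xi_m * d1 + np.sqrt(1.0 - xi_m**2) * rng.standard_normal(n_pairs)
    g1, g2 = f(d1), f(d2)
    # Convert f(delta) into a mean-zero galaxy overdensity delta_g.
    mean = 0.5 * (g1.mean() + g2.mean())
    return np.mean((g1 / mean - 1.0) * (g2 / mean - 1.0))

# As xi_m -> 0 the ratio xi_g/xi_m flattens to a constant b^2; for this
# particular f one can show analytically that b = <delta f>/<f> =
# sqrt(pi/2), so b^2 = pi/2 ~ 1.571, even though f is strongly nonlinear.
for xi_m in [0.5, 0.2, 0.1, 0.05, 0.02]:
    print(f"xi_m = {xi_m:4.2f}  ->  xi_g/xi_m = {xi_g(xi_m) / xi_m:.3f}")
```

The same exercise carried out in Fourier space would illustrate the abstract's power-spectrum statement: at sufficiently small $k$, $P_g(k)$ approaches $P(k)$ multiplied by a constant, plus a constant additive term.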