We investigate synchrotron emission models as the source of gamma-ray burst (GRB) spectra. We show that including the possibility of synchrotron self-absorption, a "smooth cutoff" to the electron energy distribution, and an anisotropic distribution of electron pitch angles produces a wide range of low-energy spectral behavior. In addition, we show that the procedure of spectral fitting to GRB data over a finite bandwidth can introduce a spurious correlation between spectral parameters, in particular between the peak energy of the νF(ν) spectrum, E_p, and the low-energy photon spectral index α (the lower E_p is, the lower [softer] the fitted value of α will be). From this correlation and knowledge of the E_p distribution, we show how to derive the expected distribution of α. We show that optically thin synchrotron models with an isotropic electron pitch-angle distribution can explain the distribution of α below α = -2/3. This agreement is achieved if we relax the unrealistic assumption of a sharp low-energy cutoff in the spectrum of accelerated electrons and allow for a more gradual break. We show that this low-energy portion of the electron spectrum can be at most flat. We also show that optically thin synchrotron models with an anisotropic electron pitch-angle distribution can explain all bursts with -2/3 ≲ α ≲ 0. The very few bursts with low-energy spectral indices above α = 0 may be due to a synchrotron self-absorption frequency entering the lower end of the BATSE window. Our results also predict a particular relationship between α and E_p during the temporal evolution of a GRB. We give examples of spectral evolution in GRBs and discuss how its behavior is consistent with the above models.
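The finite-bandwidth fitting bias described above lends itself to a quick numerical illustration. The sketch below is only a rough illustration, not the forward-folding spectral fits applied to real BATSE data: it assumes a Band-type photon spectrum with an intrinsic low-energy index of -2/3, and simply measures the effective power-law slope over a fixed 25-300 keV window. The specific parameter values, the window, and the log-log slope estimate are illustrative choices, not quantities taken from the paper; the point is only that as E_p approaches the window, spectral curvature enters the band and the effective α softens, reproducing the sense of the spurious E_p-α correlation.

    import numpy as np

    def band_spectrum(E, alpha, beta, Ep, A=1.0):
        """Band et al. (1993) photon spectrum N(E), arbitrary normalization.

        Ep is the peak of the nu-F(nu) spectrum in keV; the break between the
        two power-law segments sits at E_b = (alpha - beta) * E0,
        with E0 = Ep / (2 + alpha).
        """
        E0 = Ep / (2.0 + alpha)
        Eb = (alpha - beta) * E0
        low = A * (E / 100.0) ** alpha * np.exp(-E / E0)
        high = (A * ((alpha - beta) * E0 / 100.0) ** (alpha - beta)
                * np.exp(beta - alpha) * (E / 100.0) ** beta)
        return np.where(E < Eb, low, high)

    # BATSE-like fitting window (keV); values chosen purely for illustration.
    E = np.logspace(np.log10(25.0), np.log10(300.0), 100)

    # Intrinsic low-energy index fixed at the optically thin synchrotron value -2/3.
    for Ep in (1000.0, 500.0, 250.0, 100.0, 50.0):
        N = band_spectrum(E, alpha=-2.0 / 3.0, beta=-2.5, Ep=Ep)
        # Effective power-law index over the window: slope of log N versus log E.
        fitted_alpha = np.polyfit(np.log(E), np.log(N), 1)[0]
        print(f"Ep = {Ep:6.1f} keV  ->  effective fitted alpha = {fitted_alpha:+.2f}")

Running this toy calculation, the effective index comes out close to -2/3 when E_p lies well above the window and becomes progressively softer as E_p drops toward and into it, which is the qualitative behavior the abstract attributes to fitting over a finite bandwidth.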