Seismicity rate variations are commonplace in teleseismic catalogs. Such changes have two possible causes: they may be real, or they may reflect changes in the networks used to detect the events. Differentiating between these two possibilities is clearly important. We demonstrate a technique for making this differentiation using uniform data generated with known rates and magnitude shifts, and we then apply the technique to data from large regions of the world. All of these regions show significant variations in seismicity rates for data sets with magnitude cutoffs of 5.0 and 5.5. We show that most of these changes have distributions in the magnitude domain that are consistent with systematic magnitude changes of between 0.1 and 0.2 units. These results suggest that most observed rate variations in teleseismic data sets with high magnitude cutoffs are related to systematic changes in magnitudes and are, therefore, not real. This result has important implications for earthquake prediction and for tectonic studies that interpret these variations.
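To illustrate the logic of the synthetic test described above, the sketch below simulates a catalog with a constant true rate and applies a single systematic magnitude shift, then counts events above fixed cutoffs. The specific choices here are assumptions for illustration, not values from the study: a Gutenberg-Richter b-value of 1.0, a shift of 0.15 units, and the helper name gr_magnitudes are all hypothetical.

```python
import numpy as np

# Minimal sketch (assumptions, not the study's parameters): a
# Gutenberg-Richter b-value of 1.0, a constant true rate, and a
# step-like shift of 0.15 magnitude units halfway through the catalog.

rng = np.random.default_rng(42)

B_VALUE = 1.0        # assumed G-R b-value
M_MIN = 4.0          # assumed completeness floor of the synthetic catalog
N_EVENTS = 200_000   # events per half of the catalog
SHIFT = 0.15         # systematic magnitude shift applied to the second half

def gr_magnitudes(n, b=B_VALUE, m_min=M_MIN):
    """Draw magnitudes from a Gutenberg-Richter (exponential) distribution."""
    beta = b * np.log(10.0)
    return m_min + rng.exponential(1.0 / beta, size=n)

# Two halves with identical true rates; every magnitude in the second
# half is reduced by SHIFT, mimicking a network or procedural change.
before = gr_magnitudes(N_EVENTS)
after = gr_magnitudes(N_EVENTS) - SHIFT

for cutoff in (5.0, 5.5):
    n_before = np.count_nonzero(before >= cutoff)
    n_after = np.count_nonzero(after >= cutoff)
    predicted = 10.0 ** (-B_VALUE * SHIFT)  # rate ratio implied by G-R
    print(f"cutoff {cutoff}: apparent rate ratio {n_after / n_before:.2f} "
          f"(G-R prediction {predicted:.2f})")
```

Under these assumed values, the Gutenberg-Richter relation predicts that a downward shift of 0.15 units changes the count above any fixed cutoff by a factor of 10^(-1.0 x 0.15), roughly 0.71: an apparent 30% rate decrease with no change in the true rate, which is the signature the technique is designed to detect in the magnitude domain.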