We investigate the possibility that background galaxy contamination is responsible for the observed variations in the peak absolute blue magnitude, M(B, peak), and in the light-curve decline rate, beta, of Type Ia supernovae. We examine the effect of adding a small fraction of the light of the parent galaxy to the photometric data and find that it brightens the observed peak and flattens the light curve, precisely the effect originally suggested by Pskovskii (1977a) and discussed by Branch (1981). The change in peak magnitude is small, as expected, because the supernova light dominates the background at maximum light. Miller & Branch (1990) showed that obscuration by gas and dust within the parent galaxy might produce a larger variation in peak absolute magnitude; we adopt their corrected data and assume that the correlations induced by background contamination between peak magnitude and distance, and between peak magnitude and decline rate, will be small. The background light becomes much more significant as the supernova fades, however, and can therefore have a large effect on the rate of decline from maximum. Both effects should, on average, be more important for more distant SNe Ia, because such observations are more likely to include a larger fraction of the light of the background galaxy. We compute synthetic light curves with varying degrees of contamination and hypothesize that if background contamination is generally present in SN Ia observations, then: (1) beta should decrease with increasing distance modulus mu_0; (2) M(B, peak) should become brighter with increasing mu_0; (3) M(B, peak) should become fainter with increasing beta. Using the data compiled by Miller & Branch, we find some evidence for these trends. We conclude that there may be no real variation in the decline rates of most SNe Ia, and that the question needs to be addressed with more systematic observing methods.
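
For concreteness, the contamination mechanism can be illustrated with a short numerical sketch. The script below is a hypothetical illustration, not the authors' code: the peak magnitude, the intrinsic decline rate beta_true, and the contamination fractions f_bg are assumed values chosen only to show the qualitative behavior. It adds a constant background flux to an intrinsically linear magnitude decline and reports how much the peak brightens and how much the measured decline rate flattens.

```python
import numpy as np

# A minimal sketch of background contamination: a constant galaxy flux
# is added to a supernova that declines linearly in magnitude.
# All parameter values (m_peak, beta_true, f_bg) are illustrative
# assumptions, not numbers taken from the paper.

def observed_magnitude(t, m_peak=-19.5, beta_true=10.0, f_bg=0.05):
    """Contaminated light curve in magnitudes.

    t         : days since maximum light (scalar or array)
    m_peak    : intrinsic peak magnitude of the supernova
    beta_true : intrinsic decline rate, magnitudes per 100 days
    f_bg      : background flux as a fraction of the SN's peak flux
    """
    m_sn = m_peak + beta_true * t / 100.0        # intrinsic decline
    flux_sn = 10.0 ** (-0.4 * m_sn)              # supernova flux
    flux_bg = f_bg * 10.0 ** (-0.4 * m_peak)     # constant galaxy light
    return -2.5 * np.log10(flux_sn + flux_bg)    # combined magnitude

m_peak = -19.5
t = np.array([0.0, 100.0])                       # peak and 100 days later
for f_bg in (0.0, 0.05, 0.20):
    m0, m100 = observed_magnitude(t, m_peak=m_peak, f_bg=f_bg)
    print(f"f_bg={f_bg:.2f}: peak brightened by {m_peak - m0:.3f} mag, "
          f"observed beta = {m100 - m0:.2f} mag / 100 d")
```

In this toy model, a 5 per cent contamination brightens the peak by only about 0.05 mag but lowers the measured decline over the first 100 days from 10 to roughly 3.3 mag per 100 days, consistent with the point made above: the effect on M(B, peak) is small while the effect on beta is large, and both grow with the contamination fraction.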