The effect of substrate temperature on surface-enhanced Raman scattering is studied via a simple model for molecules physisorbed on a metallic substrate. Surface roughness is represented by a spherical or spheroidal island, and the temperature dependence of the surface plasmon is accounted for via a slightly modified Ujihara model. It is found that the enhancement ratio generally decreases as substrate temperature increases. For noble metals such as silver, this temperature effect is particularly pronounced at scattering frequencies close to the surface-plasmon resonance; for frequencies well below the surface-plasmon frequency, the enhancement ratio is relatively insensitive to changes in substrate temperature. A tentative explanation is provided for our modeling results, and their implications are discussed.
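The qualitative trend described above can be illustrated with a minimal sketch, not the paper's actual model: a small metal sphere with a Drude dielectric function whose damping rate grows with temperature (a crude stand-in for the modified Ujihara treatment). All numerical parameters below (plasma frequency, background dielectric constant, reference damping) are assumed, silver-like values chosen for illustration only; the SERS enhancement is approximated by the usual fourth power of the local-field factor of a dipolar sphere.

```python
import numpy as np

# Assumed, silver-like illustrative parameters (not from the paper).
EPS_INF = 5.0   # background dielectric constant
OMEGA_P = 9.0   # plasma frequency, eV
GAMMA_0 = 0.02  # electron damping at the reference temperature, eV

def drude_eps(omega_ev, gamma_ev):
    """Drude dielectric function: eps(w) = eps_inf - wp^2 / (w^2 + i*w*gamma)."""
    return EPS_INF - OMEGA_P**2 / (omega_ev**2 + 1j * omega_ev * gamma_ev)

def gamma_of_T(T_kelvin, T_ref=300.0):
    """Toy temperature dependence: damping roughly linear in T
    (electron-phonon scattering in the high-temperature limit)."""
    return GAMMA_0 * (T_kelvin / T_ref)

def sers_enhancement(omega_ev, T_kelvin):
    """|g|^4 with g = (eps - 1)/(eps + 2), the field factor of a small
    sphere in vacuum; the Stokes shift is neglected for simplicity."""
    eps = drude_eps(omega_ev, gamma_of_T(T_kelvin))
    g = (eps - 1.0) / (eps + 2.0)
    return np.abs(g) ** 4

# Dipolar sphere resonance, where Re(eps) = -2:
w_res = OMEGA_P / np.sqrt(EPS_INF + 2.0)

# Near resonance the enhancement falls sharply as T (hence damping) rises:
cold = sers_enhancement(w_res, 100.0)
hot = sers_enhancement(w_res, 500.0)

# Well below resonance it is nearly temperature-independent:
low_cold = sers_enhancement(0.3 * w_res, 100.0)
low_hot = sers_enhancement(0.3 * w_res, 500.0)
```

With these parameters the on-resonance enhancement drops by orders of magnitude between 100 K and 500 K, while well below resonance the ratio barely changes, mirroring the two regimes noted in the abstract.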