This paper examines the implications of chirping-induced waveform distortion for the sensitivity of multigigabit-per-second receivers that use a traveling-wave semiconductor optical amplifier to linearly amplify the received signal prior to photodetection. A novel method of evaluating the probability of error is used that treats the signal-spontaneous and spontaneous-spontaneous beat-noise components in a rigorous manner. The dependence of the receiver sensitivity on the fiber dispersion coefficient × length product is found to differ from that of an avalanche-photodiode-based receiver: the traveling-wave semiconductor optical preamplifier receiver is more tolerant of chirping-induced waveform distortion. For example, for 9.6-Gb/s NRZ modulation, a traveling-wave semiconductor optical preamplifier receiver with a filtered amplified-spontaneous-emission noise power of 15 μW outperforms an avalanche photodiode receiver with an average avalanche gain of 12 by 1.3 dB for no dispersion and by 3.1 dB for 105 ps/nm of dispersion. However, in the presence of dispersion, the traveling-wave semiconductor optical preamplifier receiver is more susceptible to decision-time jitter than an avalanche-photodiode-based receiver. These differences are attributed to the gain × bandwidth product limitation of avalanche photodiodes and to the dependence of the signal-spontaneous beat noise on the chirping-induced waveform distortion.
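
For context, the two beat-noise terms referred to above are commonly written, in the standard Gaussian approximation for an optically preamplified receiver (this is a textbook sketch, not the rigorous treatment developed in the paper, and the numerical prefactors depend on polarization-filtering and bandwidth conventions), as

$$
\sigma^2_{\mathrm{s\text{-}sp}} \approx 4\,R^2 G P_{\mathrm{in}} S_{\mathrm{sp}} B_e,
\qquad
\sigma^2_{\mathrm{sp\text{-}sp}} \approx 2\,R^2 S_{\mathrm{sp}}^2 \left(2B_o - B_e\right) B_e,
$$

where $R$ is the photodiode responsivity, $G$ the amplifier gain, $P_{\mathrm{in}}$ the received signal power, $S_{\mathrm{sp}}$ the amplified-spontaneous-emission spectral density per polarization, $B_o$ the optical filter bandwidth, and $B_e$ the electrical bandwidth. Because $\sigma^2_{\mathrm{s\text{-}sp}}$ scales with the instantaneous signal power, chirping-induced waveform distortion directly modulates this noise term, which is the mechanism invoked in the final sentence of the abstract.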