Even if a source star is fainter than the detection limit imposed by crowding, an event can still be detected if the star lies close to the seeing disk of a bright star and is gravitationally amplified ("amplification bias"). Using a well-constrained luminosity function, I show that $\sim 40\%$ of events detected toward the Galactic bulge are affected by amplification bias, and that the optical depth might be overestimated by a factor of $\sim 1.7^{+0.7}_{-0.4}$, depending on the effective size of the seeing disk. In addition, I show that once amplification bias is taken into account, the observed timescale distribution agrees significantly better, especially at short timescales, with the distribution expected from a mass-spectrum model in which the lenses comprise the known stellar population plus an additional population of brown dwarfs.
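To make the mechanism concrete, the following is a minimal Monte Carlo sketch of how amplification bias inflates a naive optical-depth estimate. It is my illustration, not the paper's calculation: the power-law luminosity-function slope `alpha`, the detection limit `F_lim`, and the simple flux-excess detection criterion are all hypothetical stand-ins for the well-constrained luminosity function and seeing-disk model used in the paper, so the printed numbers should not be expected to reproduce the quoted 40% or 1.7.

```python
# Toy Monte Carlo sketch of amplification bias (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(42)

N = 1_000_000            # simulated lensing events
alpha = 2.0              # assumed luminosity-function slope (hypothetical)
F_min, F_lim = 0.01, 1.0 # faintest source flux; crowding detection limit (hypothetical)

# Draw source fluxes from dN/dF ~ F^(-alpha) on [F_min, inf) by inverse transform.
u = rng.random(N)
F = F_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

# Peak point-lens magnification for an impact parameter u0 uniform on (0, 1].
u0 = 1.0 - rng.random(N)
A_max = (u0**2 + 2.0) / (u0 * np.sqrt(u0**2 + 4.0))

resolved = F >= F_lim
# Amplification-bias channel: a sub-threshold source blended into a bright
# star's seeing disk counts as "detected" when its lensed flux excess
# exceeds the detection limit (a deliberately crude criterion).
biased = ~resolved & ((A_max - 1.0) * F >= F_lim)
detected = resolved | biased

print(f"fraction of detected events from sub-threshold sources: "
      f"{biased.sum() / detected.sum():.2f}")

# If the optical depth is computed as if all events came from resolved
# stars only, it is inflated roughly by the ratio of all detected events
# to those from resolved sources (ignoring timescale corrections).
print(f"naive optical-depth overestimation factor: "
      f"{detected.sum() / resolved.sum():.2f}")
```

The point of the sketch is the direction of the effect: faint, unmonitored sources contribute extra detectable events when lensed, so attributing all events to the counted (resolved) source stars systematically overestimates the optical depth.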