This study concerns two situations that could limit a listener's sensitivity to interaural temporal delay (ITD). The first is having to detect a ''target'' ITD in one spectral region when diotic energy is also present in remote and/or adjacent spectral regions. In such a situation, sensitivity to ITD is typically degraded relative to when no such diotic energy is present. Both the outcome and the paradigm are commonly termed ''spectral interference.'' In the present study, spectral interference was measured using a broadband (100 Hz to 9 kHz) noise that was diotic, save for a restricted spectral region that was interaurally delayed. The portion of the noise that contained the ITD had a center frequency (CF) of 300, 1200, 2400, or 4800 Hz and a bandwidth equal to 40% of the CF. The second situation of interest was ''spectral uncertainty.'' Here, the CF of the portion of the noise containing the ITD was chosen from the same four values on a trial-by-trial basis. Consequently, the listener was uncertain about which spectral region would contain the ITD. Data were also collected with narrow bands of noise presented in isolation, with CFs and bandwidths identical to those of the interaurally delayed spectral regions used in the spectral interference and spectral uncertainty conditions. Within a block of trials, the CF of the narrow band of noise was either held constant or chosen at random from the set of four CFs. In all cases, threshold ITDs were measured in a single-interval task. Consistent with previous studies, the largest amounts of spectral interference occurred when the portion of the noise containing the ITD was centered at or above 2400 Hz. The novel finding was that spectral uncertainty produced extremely poor performance, rendering some listeners unable to detect even a 1-ms ITD presented within high-frequency regions.
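
For readers who wish to generate stimuli of this general form, a minimal sketch in Python/NumPy follows. It is an illustrative reconstruction, not the authors' synthesis procedure: the function name, sampling rate, duration, and the frequency-domain method of imposing the delay are assumptions; only the 100 Hz to 9 kHz passband, the 40%-of-CF bandwidth, the four CFs, and the example 1-ms ITD come from the description above.

    import numpy as np

    def itd_band_noise(fs=48000, dur=0.3, f_lo=100.0, f_hi=9000.0,
                       cf=1200.0, itd=1e-3, rng=None):
        """Illustrative stimulus: broadband noise that is diotic except
        for a band around cf (bandwidth = 0.4 * cf), which is delayed
        by itd seconds in the right channel. Parameter defaults other
        than the passband and bandwidth rule are assumptions."""
        rng = np.random.default_rng() if rng is None else rng
        n = int(fs * dur)
        freqs = np.fft.rfftfreq(n, 1.0 / fs)
        # Flat-magnitude, random-phase spectrum restricted to the passband.
        phase = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, freqs.size))
        spec = np.where((freqs >= f_lo) & (freqs <= f_hi), phase, 0.0)
        left = np.fft.irfft(spec, n)
        # Impose a pure delay (linear phase shift) only within the target
        # band; all other components remain identical across the ears.
        bw = 0.4 * cf
        in_band = (freqs >= cf - bw / 2) & (freqs <= cf + bw / 2)
        spec_right = spec * np.where(in_band,
                                     np.exp(-2j * np.pi * freqs * itd), 1.0)
        right = np.fft.irfft(spec_right, n)
        return left, right

    # Example: ITD-bearing band centered at 4800 Hz carrying a 1-ms delay.
    left, right = itd_band_noise(cf=4800.0, itd=1e-3)

Under these assumptions, a spectral-uncertainty trial would simply draw cf at random from {300, 1200, 2400, 4800} Hz before calling the function, whereas a narrowband-alone control would set f_lo and f_hi to the edges of the ITD-bearing band itself.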