We used event-related potentials (ERPs) to evaluate the role of attention in the integration of visual and auditory features of multisensory objects. This was done by contrasting the ERPs to multisensory stimuli (AV) with the sum of the ERPs to the corresponding auditory-only (A) and visual-only (V) stimuli [i.e., AV vs. (A + V)]. A, V, and AV stimuli were presented in random order to the left and right hemispaces. Subjects attended to a designated side to detect infrequent target stimuli in either modality there. The focus of this report is on the ERPs to the standard (i.e., nontarget) stimuli. We used rapid, variable stimulus onset asynchronies (350-650 msec) to mitigate anticipatory activity and included "no-stim" trials to estimate and remove ERP overlap from residual anticipatory processes and from adjacent stimuli in the sequence. Spatial attention effects on the processing of the unisensory stimuli consisted of a modulation of the visual P1 and N1 components (at 90-130 msec and 160-200 msec, respectively) and of the auditory N1 and processing negativity (100-200 msec). Attended versus unattended multisensory ERPs elicited a combination of these effects. Multisensory integration effects consisted of an initial frontal positivity around 100 msec that was larger for attended stimuli. This was followed by three phases of centro-medially distributed effects of integration and/or attention, beginning at around 160 msec and peaking at 190 msec (scalp positivity), 250 msec (negativity), and 300-500 msec (positivity) after stimulus onset. These integration effects were larger in amplitude for attended than for unattended stimuli, providing neural evidence that attention can modulate multisensory integration processes at multiple stages.
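The additive-model contrast described above, AV vs. (A + V), can be illustrated in a few lines of analysis code. The sketch below is a minimal NumPy example, not the authors' pipeline: the array names, epoch window, channel count, synthetic data, and the simple no-stim overlap subtraction are all assumptions introduced here for illustration.

```python
import numpy as np

# Minimal sketch of the additive-model contrast AV vs. (A + V).
# All names, shapes, and data below are illustrative assumptions.

fs = 500                                  # assumed sampling rate in Hz
times = np.arange(-0.1, 0.6, 1 / fs)      # assumed epoch: -100 to +600 msec
n_channels = 64                           # assumed electrode count
rng = np.random.default_rng(0)

def fake_erp():
    """Return a synthetic per-condition ERP average (channels x samples)."""
    return rng.normal(0.0, 1e-6, size=(n_channels, times.size))

erp_av = fake_erp()      # average ERP to audiovisual (AV) standards
erp_a = fake_erp()       # average ERP to auditory-only (A) standards
erp_v = fake_erp()       # average ERP to visual-only (V) standards
erp_nostim = fake_erp()  # waveform on "no-stim" trials (overlap estimate)

# Subtract the no-stim estimate from each condition to remove overlap from
# anticipatory activity and adjacent stimuli before forming the contrast.
av_corr = erp_av - erp_nostim
a_corr = erp_a - erp_nostim
v_corr = erp_v - erp_nostim

# Additive-model difference wave: deviations from zero index candidate
# multisensory integration effects, AV - (A + V).
integration_effect = av_corr - (a_corr + v_corr)

print(integration_effect.shape)  # (64, n_samples)
```

In this framing, the same difference wave would be computed separately for attended and unattended stimuli, so that attention effects on integration appear as amplitude differences between the two difference waves.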