The purpose of this investigation was to evaluate the potential of monoenergetic x-ray sources to improve image quality and reduce patient dose compared with conventional tungsten anode x-ray spectra. This was purely a computer simulation study. X-ray spectra were generated using the Birch and Marshall spectral model, patient x-ray transmission was calculated using Monte Carlo techniques, and a numerical method was developed for determining antiscatter grid performance. A 120 mg/cm2 Gd2O2S intensifying screen was simulated for radiography, and a 144 mg/cm2 CsI image intensifier was simulated for fluoroscopy. The sources of subject contrast simulated included tissue, calcium, and iodine targets varying in mass thickness from 10 to 1000 mg/cm2. The figure of merit [contrast-to-noise ratio]²/[integral dose] was used as a relative measure of dose utilization. Depending on the object thickness, monoenergetic x-ray sources with a screen-film detector exhibited a 1.4 to 2.4 improvement over tungsten anode spectra for iodine contrast, a 1.5 to 2.0 improvement for calcium imaging, and about a 1.4 to 1.6 improvement for tissue contrast. Thicker patients (30 cm) benefited more than thinner (10 cm) ones. With the image intensifier as the detector, a 1.4 to 2.3 improvement factor was found for monoenergetic sources and an iodine signal object. For the practical range of radiographic imaging scenarios using present-day detector technologies, monoenergetic sources may provide an improvement in dose utilization comparable to the improvement that can be expected from scanning slit devices over conventional antiscatter grids. © 1994, American Association of Physicists in Medicine. All rights reserved.
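
For reference, the dose-utilization figure of merit stated above can be written in display form; the symbols FOM, CNR, and D_int are illustrative notation chosen here, not necessarily the authors' own:

\[ \mathrm{FOM} \;=\; \frac{\mathrm{CNR}^{2}}{D_{\mathrm{int}}} \]

where CNR is the contrast-to-noise ratio of the target against its background and D_int is the integral dose delivered to the patient model. Because CNR enters quadratically, a factor-of-2 gain in FOM at fixed integral dose corresponds to only a factor of about 1.4 gain in CNR, or equivalently a halving of dose at fixed CNR.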