This study examines the effect that the spatial configuration of flood retention ponds has on the reduction of flood peaks across different spatial scales in a catchment. A continuous simulation approach is used to investigate how different spatial organizations, storage capacities, and release capacities of retention ponds alter the downstream flood frequency curve. The simulation experiment involves a small (approximately 30 km²) hypothetical catchment, a 1,000-year-long, randomly generated rainfall time series, and a distributed rainfall-runoff model that simulates the transport of water along the drainage network. A hypothetical fourth-order Mandelbrot-Vicsek tree is used as a generic template of river network branching. Three important insights emerge from this theoretical exercise. First, flood retention ponds that are placed in the upstream part of the catchment and configured in parallel offer better flood control than either ponds configured in series along the main stem or a single, larger pond located near the catchment outlet. Second, for ponds configured in series with different release or storage capacities, the largest reduction of low-exceedance-probability peak discharges at the catchment outlet is achieved when the upstream pond empties first or has a larger storage capacity than the downstream pond. Third, the effect of retention ponds on flood frequency diminishes in the downstream direction as the proportion of unregulated subcatchments contributing to the peak discharge at the outlet increases. This implies that the flood-mitigation benefits of retention ponds are primarily local, underscoring the added value of distributed retention ponds in dispersing flood-control benefits across the catchment.
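
To make the storage- and release-capacity mechanism concrete, the sketch below shows a minimal level-pool routing of an inflow hydrograph through a single retention pond whose outflow is capped until the storage volume is exhausted, after which excess flow spills downstream unattenuated. The time step, hydrograph shape, and capacity values are illustrative assumptions only; this is not the distributed rainfall-runoff model used in the study, merely a generic example of how a pond with finite storage and limited release attenuates a flood peak.

```python
import numpy as np

def route_through_pond(inflow, storage_capacity, release_capacity, dt):
    """Level-pool routing through a single retention pond with a finite
    storage volume and a capped release rate (illustrative sketch).

    inflow           : inflow rates (m^3/s), one value per time step
    storage_capacity : maximum pond storage (m^3)
    release_capacity : maximum outflow rate (m^3/s)
    dt               : time step length (s)
    """
    storage = 0.0
    outflow = np.zeros_like(inflow, dtype=float)
    for i, q_in in enumerate(inflow):
        # Outflow is limited by the release capacity and by the water available.
        q_out = min(release_capacity, q_in + storage / dt)
        # Mass balance: dS/dt = I - O.
        storage += (q_in - q_out) * dt
        # Once the pond is full, the excess spills downstream unattenuated.
        if storage > storage_capacity:
            q_out += (storage - storage_capacity) / dt
            storage = storage_capacity
        outflow[i] = q_out
    return outflow

if __name__ == "__main__":
    dt = 600.0                                    # 10-minute time steps
    t = np.arange(0.0, 36000.0, dt)               # a 10-hour event
    # Triangular inflow hydrograph peaking at 12 m^3/s (hypothetical values).
    inflow = np.interp(t, [0.0, 9000.0, 36000.0], [0.0, 12.0, 0.0])
    outflow = route_through_pond(inflow, storage_capacity=5e4,
                                 release_capacity=4.0, dt=dt)
    print(f"peak inflow : {inflow.max():5.2f} m^3/s")
    print(f"peak outflow: {outflow.max():5.2f} m^3/s")
```

With these assumed values the pond fills before the event ends, so the outflow peak is reduced but not clipped to the release capacity, illustrating why both storage and release capacity (and, at the catchment scale, where the ponds sit on the network) control how strongly the flood frequency curve is altered.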