It is widely believed that the global baryon content and the mass-to-light ratio of groups and clusters of galaxies are fair representatives of the matter mix of the universe and therefore can be used to reliably determine the cosmic mass density parameter Ω_M. However, this fundamental assumption is challenged by growing evidence from optical and X-ray observations that the average gas mass fraction and mass-to-light ratio increase mildly with scale from poor groups to rich clusters. Although a number of time-consuming hydrodynamical simulations combined with semianalytic approaches have been carried out that permit a sophisticated treatment of some complicated processes in the formation and evolution of cosmic structures, the essential physics behind the phenomenon still remains a subject of intense debate. In this Letter, using a simple analytic model, we show that radiative cooling of the hot intragroup/intracluster gas may allow one to reproduce the observed scale dependence of the global stellar and gas mass fractions and mass-to-light ratio of groups and clusters, provided that about half of the cooled gas is converted into stars. Together with the recent success in the recovery of the entropy excess and the steepening of the X-ray luminosity-temperature relations detected in groups and clusters, radiative cooling provides a simple, unified scheme for the evolution of hot gas and the formation of stars in the largest virialized systems of the universe.
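A minimal sketch of the baryon bookkeeping implied above, assuming a universal baryon fraction f_b, a cooled-gas fraction f_cool(M) that declines from poor groups to rich clusters, and a conversion efficiency ε of cooled gas into stars (these symbols are introduced here for illustration and are not notation from the Letter itself):

\begin{align}
  f_{\rm gas}(M) &= f_b - f_{\rm cool}(M), && \text{hot gas remaining after cooling} \\
  f_{\star}(M)   &= \epsilon\, f_{\rm cool}(M), \qquad \epsilon \simeq 0.5, && \text{``about half'' of the cooled gas forms stars} \\
  M/L\,(M)       &\approx \Upsilon_{\star} / f_{\star}(M), && \Upsilon_{\star}\ \text{the stellar mass-to-light ratio}
\end{align}

Since radiative cooling is less efficient in the hotter, richer systems, f_cool decreases with mass, so under these assumptions f_gas and M/L rise mildly with scale while the stellar fraction falls, in the sense of the observed trend described above.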