We present an X-ray spectral study of a sample of 12 distant (z = 0.17-0.54) rich clusters of galaxies observed on-axis with the Einstein Observatory Imaging Proportional Counter (IPC). Statistically, the X-ray spectral data of the highest-redshift (z ≳ 0.3) clusters in the sample are inconsistent at the 3 sigma confidence level with an optically thin plasma emission model absorbed only in the Milky Way. Including excess absorption intrinsic to the clusters significantly improves the spectral fits to the data, indicating the possible presence of large amounts of X-ray-absorbing cool gas (column densities of ~10^21 cm^-2) in some of the distant clusters. Using X-ray luminosities determined from fluxes observed only in the 0.8-3.5 keV band (where the fluxes are least affected by absorption), together with the temperature-luminosity correlation for nearby clusters, the estimated temperatures of the hot intracluster medium (ICM) in these distant clusters constrain the absorbing columns in the clusters; for the cluster CL 0016+16, for example, the lower limit on the column density is found to be ~6 x 10^20 cm^-2 at the 99% confidence limit. The data also show a possible positive temperature evolution of the hot ICM with time which, if present, would further increase the absorption required in fitting the spectra. These tentative results need to be confirmed with the ongoing ROSAT observations. The absorption, together with the possible temperature evolution, may explain why there are more high-luminosity clusters now than there were in the past. We discuss a scenario in which the cool gas was originally contained in individual galaxies and was stripped into the ICM as the galaxies fell into the cluster core for the first time.
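
The role of the 0.8-3.5 keV band can be illustrated with a minimal sketch (not the paper's actual fitting procedure): photoelectric absorption suppresses observed flux by exp(-N_H sigma(E)), and the cross section falls steeply with energy, so a column of ~10^21 cm^-2 strongly absorbs soft X-rays while leaving the 0.8-3.5 keV band comparatively intact. The power-law cross section below is a rough approximation (normalized to ~2.4e-22 cm^2 per H atom at 1 keV); treat its exact form and normalization as assumptions for illustration only.

```python
import math

def sigma(E_keV):
    """Rough power-law approximation to the photoelectric absorption
    cross section per hydrogen atom (cm^2); the normalization and
    slope here are illustrative assumptions, not fitted values."""
    return 2.4e-22 * E_keV ** -2.6

def transmission(E_keV, N_H):
    """Fraction of flux at energy E_keV surviving an absorbing
    column density N_H (cm^-2): exp(-N_H * sigma(E))."""
    return math.exp(-N_H * sigma(E_keV))

# A column of ~1e21 cm^-2, as inferred for some of the distant
# clusters, absorbs heavily below ~0.5 keV but only mildly within
# the 0.8-3.5 keV band used for the luminosity estimates.
for E in (0.3, 0.8, 1.0, 3.5):
    print(f"E = {E:.1f} keV: transmission = {transmission(E, 1e21):.2f}")
```

With these assumed numbers the transmitted fraction rises from a few percent at 0.3 keV to near unity at 3.5 keV, which is why band-limited luminosities plus the temperature-luminosity relation can bound the absorbing column.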