The observed galaxy velocity dispersion in clusters of galaxies is compared with the X-ray temperature of the intracluster gas for a large sample of clusters. A strong correlation is found between the two observables, best fitted by the relation σ = (332 ± 52)(kT)^(0.6±0.1) km s⁻¹, where σ is the rms radial velocity dispersion in the cluster and kT is the gas temperature in keV. The relation is consistent with that expected from an isothermal model in which both the gas and the galaxies are in hydrostatic equilibrium with the binding cluster potential, i.e., σ ∝ T^0.5. The observed relation is used to determine the best-fit average β parameter for clusters, β = σ²/(kT/μm_p), which describes the ratio of energy per unit mass in the galaxies to that in the gas (where μm_p is the mean particle mass); we find β = 0.94 ± 0.08 (and β_median = 0.98). This implies that the galaxies and the gas trace the same cluster potential with approximately equal energy per unit mass (i.e., β ≈ 1). The best-fit observed β also suggests that no significant velocity bias exists in clusters of galaxies, i.e., the galaxy velocity dispersion in clusters is a fair tracer of the dark matter velocity dispersion. The best-fit velocity bias is b_v = σ_gal/σ_DM ≅ β^0.5 = 0.97 ± 0.04 (with b_v = 1 corresponding to no bias).
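As a numerical illustration of the definitions above, the sketch below evaluates β = σ²/(kT/μm_p) and the velocity bias b_v = β^0.5 for a single cluster. The mean molecular weight μ = 0.6 is an assumed value typical of fully ionized intracluster gas (not specified in this abstract), and the function name `beta_param` is hypothetical.

```python
import math

KEV_TO_J = 1.602176634e-16  # joules per keV
M_P = 1.67262192e-27        # proton mass, kg
MU = 0.6                    # assumed mean molecular weight of ionized ICM gas

def beta_param(sigma_km_s, kT_keV, mu=MU):
    """Energy-per-unit-mass ratio beta = sigma^2 / (kT / mu m_p)."""
    sigma = sigma_km_s * 1.0e3          # km/s -> m/s
    return sigma**2 * mu * M_P / (kT_keV * KEV_TO_J)

# Example: a kT = 6 keV cluster, with sigma taken from the best-fit
# relation sigma = 332 (kT)^0.6 km/s quoted in the abstract
kT = 6.0                                # keV
sigma = 332.0 * kT**0.6                 # ~973 km/s
beta = beta_param(sigma, kT)            # close to 1 by construction
b_v = math.sqrt(beta)                   # velocity bias b_v = beta^0.5
```

By construction, feeding the best-fit σ–T relation back into the β definition returns a value near unity, consistent with the quoted β = 0.94 ± 0.08.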