The predictions of the theory of Nabarro-Herring diffusion creep are compared with experimental data from the literature for foils and wires in which each grain extends through the foil thickness or wire diameter. These experiments have established conditions under which the strain rate is proportional to stress and the strain rate per unit stress is inversely proportional to the product of two grain dimensions, as predicted by theory for mass transport by lattice diffusion. However, diffusion coefficients calculated from these creep data via the theory are in some cases up to an order of magnitude higher than reliable radiotracer values. It is shown that all experiments exhibiting this apparent enhancement of creep rate relative to theory involve a relatively short test duration t, expressed in the dimensionless form P = Dr t/(al), where a is the specimen (and grain) thickness or diameter, l is the grain diameter or length, and Dr is the radiotracer value of the diffusion coefficient. Reasonable agreement between theory and experiment is invariably obtained when P ≳ 3. The discrepancy at low P is considered to be a transient effect of dislocations acting as sources and sinks for vacancies. The increasing magnitude of the effect in the order Au, Ag, Ni and Cu is explicable in terms of assumed differences in dislocation density and the effects of differences in width of splitting, particularly as this affects the absorption and release of vacancies.
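
To make the P ≳ 3 criterion concrete, the following minimal sketch evaluates the dimensionless parameter P = Dr t/(al). All numerical values are hypothetical illustrations chosen only to show the calculation; they are not data from the paper.

```python
# Sketch: evaluate the dimensionless test-duration parameter
# P = Dr * t / (a * l) and apply the P >= 3 criterion for agreement
# between Nabarro-Herring theory and experiment.
# All numerical values below are hypothetical illustrations.

def duration_parameter(Dr, t, a, l):
    """Return P = Dr*t/(a*l).

    Dr : radiotracer (lattice) diffusion coefficient, m^2/s
    t  : test duration, s
    a  : specimen (and grain) thickness or diameter, m
    l  : grain diameter or length, m
    """
    return Dr * t / (a * l)

if __name__ == "__main__":
    Dr = 1e-13   # m^2/s, hypothetical lattice diffusivity at the test temperature
    t = 5e5      # s, hypothetical test duration (roughly six days)
    a = 50e-6    # m, hypothetical foil thickness (one grain through the thickness)
    l = 200e-6   # m, hypothetical grain length in the foil plane

    P = duration_parameter(Dr, t, a, l)
    print(f"P = {P:.2f}")
    if P >= 3:
        print("P >= 3: reasonable agreement with theory expected")
    else:
        print("P < 3: apparent enhancement of creep rate expected (transient regime)")
```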