The performance of systems in which picomole quantities of sample are mixed with a carrier gas and passed through an isotope-ratio mass spectrometer system was examined experimentally and theoretically. Two different mass spectrometers were used, both having electron-impact ion sources and Faraday cup collector systems. One had an accelerating potential of 10 kV and accepted He at 0.2 mL/min, producing, under those conditions, a maximum efficiency of 1 CO2 molecular ion collected per 700 molecules introduced. Comparable figures for the second instrument were 3 kV, 0.5 mL of He/min, and 14 000 molecules/ion. Signal pathways were adjusted so that response times were <200 ms. Sample-related ion currents appeared as peaks with widths of 3-30 s. Isotope ratios were determined by comparison to signals produced by standard gases. In spite of rapid variations in signals, observed levels of performance were within a factor of 2 of shot-noise limits. For the 10-kV instrument, sample requirements for standard deviations of 0.1 and 0.5 parts per thousand were 45 and 1.7 pmol, respectively. Comparable requirements for the 3-kV instrument were 900 and 36 pmol. Drifts in instrumental characteristics were adequately neutralized when standards were observed at 20-min intervals. For the 10-kV instrument, computed isotopic compositions were independent of sample size and signal strength over the ranges examined. Nonlinearities of <0.04 parts per thousand/V were observed for the 3-kV system. Procedures for observation and subtraction of background ion currents were examined experimentally and theoretically. For sample/background ratios varying from >10 to 0.3, precision was expected and observed to decrease approximately 2-fold and to depend only weakly on the precision with which background ion currents had been measured.
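The shot-noise comparison above can be sketched numerically. The sketch below assumes a simple Poisson model: the standard deviation of a delta value is limited by counting statistics on the rarer isotopic ion beam (roughly 1.1% abundance for the mass-45 beam of CO2), with the variance doubled for the sample-versus-standard comparison. The abundance value, the factor of sqrt(2), and the function name are illustrative assumptions, not the authors' exact treatment.

```python
import math

AVOGADRO = 6.022e23  # molecules per mole

def shot_noise_limit_permil(sample_pmol, molecules_per_ion, minor_abundance=0.011):
    """Estimated shot-noise-limited std. dev. (parts per thousand) of a delta value.

    Assumes precision is set by Poisson statistics of the minor ion beam,
    with an equal, independent variance contributed by the standard
    measurement (hence the factor of 2 under the square root).
    Illustrative model only.
    """
    molecules = sample_pmol * 1e-12 * AVOGADRO
    ions_total = molecules / molecules_per_ion
    ions_minor = ions_total * minor_abundance
    # relative std. dev. of a counted ratio ~ 1/sqrt(N_minor);
    # doubled variance for sample-vs-standard; expressed per mil
    return math.sqrt(2.0 / ions_minor) * 1000.0

# 10-kV instrument: 45 pmol at 1 ion per 700 molecules
print(round(shot_noise_limit_permil(45, 700), 3))
# 3-kV instrument: 900 pmol at 1 ion per 14 000 molecules
print(round(shot_noise_limit_permil(900, 14000), 3))
```

Under these assumptions both stated sample requirements (45 pmol at 700 molecules/ion; 900 pmol at 14 000 molecules/ion) yield a shot-noise limit near 0.07 parts per thousand, consistent with the observed 0.1 parts per thousand being within a factor of 2 of the limit.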