This article presents experiments with a real-time navigation system driven by two cameras pointing laterally with respect to the direction of travel (Divergent Stereo). Similarly to what has been proposed in (Franceschini et al. 1991; Coombs and Roberts 1992), our approach (Sandini et al. 1992; Santos-Victor et al. 1993) assumes that, for navigation purposes, the driving information is not distance (as obtainable with a stereo setup) but motion: more precisely, qualitative optical-flow information computed over nonoverlapping areas of the visual fields of two cameras. Following this idea, a mobile vehicle has been equipped with a pair of cameras looking laterally (much like a honeybee's eyes), and a controller based on fast, real-time computation of optical flow has been implemented. The control of the mobile robot (Robee) is based on the comparison between the apparent image velocities measured by the left and the right cameras. The solution adopted is derived from recent studies (Srinivasan 1991) describing the behavior of freely flying honeybees and the mechanisms they use to perceive range. This qualitative information (no explicit measure of depth is performed) is used in a number of experiments that show the robustness of the approach, and a detailed description of the control structure is presented to demonstrate its feasibility for driving the mobile robot within a cluttered environment. A discussion of the potential of the approach and its implications for sensor structure is also presented.
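To make the flow-balancing idea concrete, the sketch below illustrates a controller in the spirit of the one described: the robot steers away from the side whose lateral camera reports the larger apparent image motion, without ever computing depth. This is a minimal illustration, not the authors' implementation; the function names, the gain `K`, and the use of Farneback dense flow in place of the paper's fast flow estimator are all assumptions made for the example.

```python
# Illustrative flow-balancing controller (assumed structure, not the
# paper's code): compare average image motion seen by the two lateral
# cameras and steer away from the faster-moving (i.e., nearer) side.
import numpy as np
import cv2

K = 0.5  # steering gain (assumed; would be tuned on the real vehicle)

def mean_flow_magnitude(prev_gray, curr_gray):
    """Average optical-flow magnitude over an image pair.
    Farneback dense flow stands in for the paper's real-time estimator."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.mean(np.linalg.norm(flow, axis=2)))

def steering_command(left_pair, right_pair):
    """Positive output: turn right (larger flow on the left suggests
    nearer obstacles there); negative: turn left.  Only the qualitative
    left/right comparison is used -- no explicit depth is recovered."""
    v_left = mean_flow_magnitude(*left_pair)
    v_right = mean_flow_magnitude(*right_pair)
    # Normalizing by the total flow makes the command largely
    # insensitive to the vehicle's overall speed.
    return K * (v_left - v_right) / (v_left + v_right + 1e-9)
```

Fed with consecutive grayscale frames from each lateral camera, the normalized difference yields a bounded steering signal that centers the vehicle between obstacles, mirroring the honeybee centering behavior the approach is modeled on.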