The authors describe a research experiment in which the autonomous robot HERMIES-IIB is placed at an arbitrary indoor location with no prior specification of the room's contents. The robot successfully discovers and navigates among both stationary and occasionally moving obstacles, picks up and moves small obstacles, searches for and locates a control panel, and reads the meters found on that panel. All computation is performed onboard the robot, which contains an eight-node NCUBE parallel processor. Available sensors include an array of sonar transducers and two cameras. The robot uses an expert system for real-time navigation, while machine vision algorithms for panel recognition and meter reading run in parallel on the NCUBE nodes. The authors discuss dynamic replanning, rapid decision-making under uncertainty, and computational capability within the context of an indoor laboratory environment.
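The abstract does not say how the vision workload is divided among the eight NCUBE nodes; the sketch below is only a hypothetical illustration of one simple data-parallel scheme, in which a camera frame is split into one strip per node and each strip is processed independently. The names `process_strip`, `parallel_threshold`, and `NUM_NODES` are assumptions introduced here for illustration and are not part of the original system.

```python
import numpy as np
from multiprocessing import Pool

NUM_NODES = 8  # stands in for the eight NCUBE nodes; purely illustrative


def process_strip(strip: np.ndarray) -> np.ndarray:
    """Stand-in for a per-node vision stage: threshold one horizontal strip."""
    return (strip > strip.mean()).astype(np.uint8)


def parallel_threshold(image: np.ndarray) -> np.ndarray:
    """Split the frame into one strip per 'node' and process the strips in parallel."""
    strips = np.array_split(image, NUM_NODES, axis=0)
    with Pool(NUM_NODES) as pool:
        results = pool.map(process_strip, strips)
    return np.vstack(results)


if __name__ == "__main__":
    # Synthetic camera frame; the real system would use images from the onboard cameras.
    frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
    binary = parallel_threshold(frame)
    print(binary.shape, binary.dtype)
```

A strip-per-node split is just one plausible decomposition; the actual panel-recognition and meter-reading algorithms described by the authors may partition the work differently.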