Flying animals accomplish high-speed navigation through fields of obstacles using a suite of sensory modalities that blend spatial memory with input from vision, tactile sensing, and, in most bats and some other animals, echolocation. Although much previous research has focused on the role of individual sensory modes in animal locomotion, our understanding of sensory integration and the interplay among modalities remains limited. To understand how bats integrate sensory input from echolocation, vision, and spatial memory, we conducted an experiment in which bats flying in their natural habitat were challenged, over the course of several evening emergences, with a novel obstacle placed in their flight path. Our analysis of reconstructed flight data suggests that vision, echolocation, and spatial memory, together with a possible capacity for predictive navigation, are mutually reinforcing components of a composite perceptual system that guides flight. In light of recent developments in robotics, our findings point to the interpretation that, while each stream of sensory information plays an important role in bat navigation, it is the emergent effect of combining modalities that enables bats to fly through complex spaces.