Courbon et al. 2009: This paper presents a vision-based navigation strategy for a vertical take-off and landing (VTOL) unmanned aerial vehicle (UAV) using a single embedded camera observing natural landmarks. In the proposed approach, images of the environment are first sampled and stored as an ordered set of key images (a visual path), which together constitute a visual memory of the environment. The navigation task is then defined as a concatenation of visual path subsets (called a visual route) linking the currently observed image to a target image belonging to the visual memory. The UAV is controlled to reach each image of the visual route in turn using a vision-based control law adapted to its dynamic model, without explicitly planning any trajectory. The framework is substantiated by experiments with an X4-flyer equipped with a fisheye camera.
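To make the visual-route idea concrete, the following is a minimal Python sketch, not the authors' implementation: it assumes a placeholder global descriptor per key image and a hypothetical dot-product similarity standing in for the paper's feature matching. The route is simply the contiguous slice of the ordered key images linking the image that best matches the current view to the target image in the visual memory.

```python
# Illustrative sketch only: KeyImage, best_match and visual_route are
# hypothetical names, and the descriptor similarity is a stand-in for
# the image matching actually used by Courbon et al.
from dataclasses import dataclass
import numpy as np


@dataclass
class KeyImage:
    index: int                 # position along the stored visual path
    descriptor: np.ndarray     # placeholder global image descriptor


def best_match(current_desc: np.ndarray, memory: list[KeyImage]) -> int:
    """Return the index of the key image most similar to the current view."""
    scores = [float(current_desc @ k.descriptor) for k in memory]
    return int(np.argmax(scores))


def visual_route(memory: list[KeyImage], current_desc: np.ndarray,
                 target_index: int) -> list[KeyImage]:
    """Concatenate the key images linking the current view to the target."""
    start = best_match(current_desc, memory)
    step = 1 if target_index >= start else -1
    return [memory[i] for i in range(start, target_index + step, step)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    memory = [KeyImage(i, rng.normal(size=64)) for i in range(10)]
    # Pretend the UAV currently observes a view close to key image 3.
    current = memory[3].descriptor + 0.05 * rng.normal(size=64)
    route = visual_route(memory, current, target_index=8)
    print([k.index for k in route])   # -> [3, 4, 5, 6, 7, 8]
```

In the paper, each successive key image of such a route serves as the reference for the vision-based control law; the sketch above only covers the route-selection step, not the control itself.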