Werner et al. 1995: A pilot support system that performs navigation and control tasks to guide a helicopter autonomously along a flight track based on visual machine perception is currently under development at UBM. The machine perception system uses conventional measurement data as well as CCD image sequences for state estimation and for landmark / landing-site recognition and tracking. A control module uses the state estimates to perform the given guidance task.
Before real flight tests can be undertaken, intensive testing and optimization of the algorithms are required; this is performed through simulation. The simulation environment runs in real time, covering helicopter dynamics, sensor-data communication to the machine perception system, control-output computation, and perspectively mapped synthetic computer images representing the external world.
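The sensor fusion for ego-state estimation mentioned above can be illustrated with a minimal one-dimensional Kalman-filter sketch that fuses a conventional sensor reading with a vision-derived measurement. All numerical values, noise variances, and the constant-altitude process model are illustrative assumptions; the paper's actual filter design and state vector are not specified here.

```python
# Minimal 1-D Kalman-filter sketch: fusing a conventional measurement
# (e.g. barometric altitude) with a vision-derived one. Hypothetical
# values throughout -- not the filter used in the paper.

def kalman_predict(x, p, q):
    """Predict step: constant-altitude model, process noise variance q."""
    return x, p + q

def kalman_update(x, p, z, r):
    """Fuse one measurement z (variance r) into estimate x (variance p)."""
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # corrected state estimate
    p = (1.0 - k) * p        # corrected estimate variance
    return x, p

x, p = 100.0, 25.0           # initial altitude estimate [m] and variance
measurements = [(102.0, 99.0), (101.5, 100.5), (103.0, 101.0)]
for z_baro, z_vision in measurements:
    x, p = kalman_predict(x, p, q=1.0)
    x, p = kalman_update(x, p, z_baro, r=4.0)    # conventional sensor
    x, p = kalman_update(x, p, z_vision, r=2.0)  # vision-based measurement

print(round(x, 2), round(p, 2))  # fused estimate; variance well below initial 25.0
```

Fusing both sources each cycle drives the estimate variance well below that of either sensor alone, which is the point of combining conventional data with image-based measurements.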
The paper describes the simulation environment for real-time hardware-in-the-loop simulations. As many of the hardware components intended for real flight tests as possible are included in the test environment. The design of the machine perception system, which performs sensor fusion for ego-state estimation, is presented, and its data interfaces to the simulation environment are discussed. A combined feedback / feed-forward command-generation control module uses the state estimates for guidance along the planned flight trajectory. Results from real-time simulation runs are described, using simulation data as ‘ground truth’.
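The combined feedback / feed-forward command generation can be sketched in its simplest scalar form: a nominal command taken directly from the planned trajectory, plus proportional-derivative feedback on the estimated tracking error. The gains and the single-axis structure are illustrative assumptions, not the controller described in the paper.

```python
# Hedged sketch of combined feed-forward / feedback command generation
# along a planned trajectory (single axis, illustrative gains).

def command(ref_pos, ref_vel, est_pos, est_vel, kp=2.0, kd=1.0):
    """Feed-forward term from the planned trajectory plus
    PD feedback on the error between plan and state estimate."""
    feed_forward = ref_vel                                    # nominal command from the plan
    feedback = kp * (ref_pos - est_pos) + kd * (ref_vel - est_vel)
    return feed_forward + feedback

# One control step: the state estimate lags 0.5 m behind the plan.
u = command(ref_pos=10.0, ref_vel=2.0, est_pos=9.5, est_vel=2.0)
print(u)  # 3.0 : nominal velocity command plus corrective feedback
```

The feed-forward term lets the vehicle follow the planned trajectory without waiting for an error to build up, while the feedback term corrects residual deviations reported by the state estimator.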