Aircraft landing by monocular vision in HIL-simulation 1982 to 1987
- BVV2 vision system with a few Intel 80x86 microprocessors; attention control directed to predicted regions of interest.
- Vector graphics generating edge pictures of runway and horizon line in correct 3-D perspective projection;
- no inertial sensors, so winds and gusts could not be handled as unpredictable perturbations.
Aircraft landing with binocular vision and inertial sensors in HIL-simulation 1987 to 1992
- Rotation rates from inertial sensors reduce time delays in perception;
- vision-based perception of runway and horizon;
- lateral ego-state from symmetry of runway image.
- Look-ahead ranges (bi-focal) up to ~ 100 m.
Flight experiments with bi-propeller Do-128 of University of Brunswick 1991
Only the autonomous visual/inertial perception part was tested. For safety reasons, a pilot controlled the aircraft until shortly before touchdown; a go-around maneuver then followed. The machine vision system also generated control outputs, which were compared with the pilot's control outputs after the mission.
(for details see [Schell 1992]).
Robot Technology Experiment of DLR-Oberpfaffenhofen in Spacelab D2 onboard Space Shuttle Orbiter ‘Columbia’, May 1993
Remote automatic visual control of grasping a small ‘free-flyer’ with two robot fingers; all computers for data processing were on the ground, resulting in a time delay of ~6 seconds between measurement and action on board.
Video – ROTEX – GraspingInSpace 1993