Air and Space mission elements (1982 – 1993)
Landing by monocular vision in HIL-simulation: BVV2 vision system with a few Intel 80x86 microprocessors; attention control directed to predicted regions of interest. Vector graphics generated edge images of runway and horizon line in correct 3-D perspective projection. No inertial sensors were used, so winds and gusts could not be handled as unpredictable perturbations.
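To illustrate the perspective mapping used to generate such edge images, here is a minimal pinhole-camera sketch in Python; the runway dimensions, camera pose, and intrinsic parameters (f_px, cx, cy) are hypothetical assumptions, not values from the original BVV2 setup.

import numpy as np

def project(points_w, cam_pos, f_px=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of 3-D world points (x right, y up, z along runway)
    into image coordinates (u right, v down); camera looks along +z."""
    R_cw = np.diag([1.0, -1.0, 1.0])        # world y-up -> camera y-down
    p_c = (points_w - cam_pos) @ R_cw.T     # world -> camera frame
    return np.stack([f_px * p_c[:, 0] / p_c[:, 2] + cx,
                     f_px * p_c[:, 1] / p_c[:, 2] + cy], axis=1)

# Hypothetical 60 m x 2000 m runway; camera 50 m up, 300 m before the threshold
runway = np.array([[-30., 0., 0.], [30., 0., 0.],
                   [30., 0., 2000.], [-30., 0., 2000.]])
print(project(runway, cam_pos=np.array([0., 50., -300.])))

The far runway corners project closer to the horizon than the near ones, yielding the converging trapezoid that the edge-picture generator had to draw.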
Landing with binocular vision and inertial sensors in HIL-simulation: rotation rates from inertial sensors reduce time delays in perception; vision-based perception of runway and horizon, exploiting the symmetry of the runway image. Look-ahead ranges (bifocal) up to ~100 m.
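The benefit of the inertial rotation rates can be sketched as a feed-forward correction: shift each tracked image feature by the rotation accumulated during the processing delay. The small-angle model below is an illustrative assumption; the signs depend on the chosen camera and body axis conventions.

import numpy as np

def compensate_delay(u, v, rates, dt, f_px=800.0):
    """Shift an image feature (u, v) by the camera rotation accumulated
    during the perception delay dt, using small-angle approximations.
    rates: (p, q, r) body rotation rates [rad/s] (roll, pitch, yaw)."""
    p, q, r = rates
    du = f_px * r * dt            # yaw rate -> horizontal image shift [px]
    dv = f_px * q * dt            # pitch rate -> vertical image shift [px]
    phi = p * dt                  # roll rate -> in-plane image rotation [rad]
    c, s = np.cos(phi), np.sin(phi)
    u2, v2 = c * u - s * v, s * u + c * v
    return u2 + du, v2 + dv

# Hypothetical: 80 ms perception delay, 2 deg/s pitch-up during the flare
print(compensate_delay(40.0, -25.0, rates=(0.0, np.radians(2.0), 0.0), dt=0.08))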
Flight tests with the twin-propeller Do-128 of the University of Brunswick: only the autonomous visual/inertial perception part was tested. A pilot controlled the aircraft for safety reasons until shortly before touchdown; then a go-around maneuver followed. The machine vision system also generated control outputs, which were compared with the pilot's control outputs after the mission (for details see [Schell 1992]).
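A minimal sketch of such a post-mission comparison between the logged machine and pilot control time series; the chosen metrics (RMS difference, bias, correlation) and the synthetic signals are illustrative assumptions, not the original evaluation procedure.

import numpy as np

def compare_controls(machine, pilot):
    """Compare two logged control-signal time series sampled at the same instants."""
    machine, pilot = np.asarray(machine), np.asarray(pilot)
    err = machine - pilot
    return {"rms_diff": float(np.sqrt(np.mean(err ** 2))),
            "bias": float(np.mean(err)),
            "corr": float(np.corrcoef(machine, pilot)[0, 1])}

# Hypothetical elevator commands [deg] over a 10 s segment
t = np.linspace(0.0, 10.0, 501)
pilot = 2.0 * np.sin(0.5 * t)
machine = pilot + 0.2 * np.random.default_rng(0).standard_normal(t.size)
print(compare_controls(machine, pilot))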
Landing approach performance in HIL-simulation (1992): transputer system with binocular, gaze-controlled vision; simulated Brunswick airport with …
Helicopters with a sense of vision
Planar (2-D) visually guided docking by air-jet propulsion in a laboratory setup (satellite model plant on an air-cushion vehicle); the best combination of 4 corner features was to be selected automatically for the approach and docking maneuver.
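Automatic selection of the best 4-of-N corner-feature combination can be sketched as an exhaustive search over all 4-subsets; the scoring rule below (weakest detection confidence times geometric spread) is an illustrative assumption, not the criterion of the original experiment.

import itertools
import numpy as np

def spread(pts):
    """Sum of pairwise distances: a simple measure of geometric spread."""
    d = pts[:, None, :] - pts[None, :, :]
    return np.sqrt((d ** 2).sum(-1)).sum() / 2.0

def best_four(corners, confidences):
    """Exhaustively score every 4-subset of detected corners and return
    the index tuple maximizing (weakest confidence) x (spread)."""
    best_idx, best_val = None, -np.inf
    for idx in itertools.combinations(range(len(corners)), 4):
        sel = list(idx)
        val = confidences[sel].min() * spread(corners[sel])
        if val > best_val:
            best_idx, best_val = idx, val
    return best_idx

# Hypothetical corner detections (image coordinates) with confidences
corners = np.array([[0., 0.], [10., 1.], [9., 8.], [1., 9.], [5., 5.], [2., 2.]])
confidences = np.array([0.9, 0.8, 0.95, 0.7, 0.99, 0.4])
print(best_four(corners, confidences))   # indices of the chosen 4 corners

Widely spread, reliably detected corners give the most robust pose estimate, which is why a spread term is a natural ingredient of such a criterion.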
Vision-guided grasping in space: experiment of DLR-Oberpfaffenhofen in Spacelab D2 onboard Space Shuttle Orbiter ‘Columbia’ (1993). Remote automatic visual control of grasping a small ‘free-flyer’ with two robot fingers; all computers for data processing were on the ground, resulting in ~6 s time delay between measurement and action on board.
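One way to cope with a ~6 s round-trip delay is to forward-predict the target state so that commands computed on the ground match the situation on board when they arrive. The constant-velocity model below is a deliberately simplified, hypothetical illustration of that idea, not the prediction scheme of the D2 experiment.

import numpy as np

def predict_state(pos, vel, delay):
    """Forward-predict a drifting free-flyer under a constant-velocity model,
    so a ground-computed command is matched to the on-board state on arrival."""
    return pos + vel * delay

# Hypothetical: free-flyer drifting at ~1 cm/s, 6 s round-trip delay
pos = np.array([0.10, -0.05, 0.30])    # [m], position measured on the ground
vel = np.array([0.01, 0.00, -0.005])   # [m/s], velocity estimated from images
print(predict_state(pos, vel, delay=6.0))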