The ThirdEye project aims to develop and characterize innovative support technologies that allow the operator to safely and efficiently approach and capture uncooperative targets in orbit.
Recognizing and understanding the relative position and attitude of two spacecraft during rendezvous and docking is a challenging task for the operator, all the more so when communication time delays, high image compression ratios, and low image refresh rates severely limit the amount of information available.
The ThirdEye project investigates technologies that provide the operator with a larger amount of sensor data and present the available data more effectively, i.e., in a form that is easier to grasp and interpret.
One line of investigation is head-up displays (HUDs) that intuitively show the position of the remote-controlled satellite in space, so that the operator always has a spatial reference. The HUD also predicts the satellite's flight path from the current control inputs. This helps the operator cope with the unintuitive relative trajectories in orbit and compensates for the negative effects of signal propagation delays between the ground station and the satellite.
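The text does not specify how the HUD's flight-path prediction is computed; a standard model for relative motion near a target in a circular orbit is the Clohessy-Wiltshire equations. The sketch below shows how such a prediction could be derived from the closed-form in-plane CW solution (the orbit parameters and burn magnitude are illustrative assumptions, not project values), and demonstrates why relative trajectories feel unintuitive: a burn in the direction of flight ultimately moves the chaser *behind* the target.

```python
import math

def cw_propagate(x0, y0, vx0, vy0, n, t):
    """Closed-form in-plane Clohessy-Wiltshire solution.

    x: radial offset (away from Earth) [m], y: along-track offset [m],
    vx0/vy0: initial relative velocities [m/s],
    n: mean motion of the target orbit [rad/s], t: prediction time [s].
    Returns the predicted relative state (x, y, vx, vy) at time t.
    """
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t) * x0 + y0
         + (2 / n) * (c - 1) * vx0
         + (1 / n) * (4 * s - 3 * n * t) * vy0)
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = 6 * n * (c - 1) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    return x, y, vx, vy

# Illustrative scenario: chaser starts at the target's position and applies
# a 0.1 m/s burn in the direction of flight (+y).
n = 0.00113                 # mean motion of a ~500 km LEO orbit (assumed)
period = 2 * math.pi / n    # one orbital period
x, y, vx, vy = cw_propagate(0.0, 0.0, 0.0, 0.1, n, period)
# After one full orbit the along-track offset y is negative: the prograde
# burn raised the chaser's orbit, slowing it relative to the target, so it
# drifts backward -- the kind of counterintuitive behavior a predictive
# HUD overlay can make visible before the operator commits to an input.
```

A HUD could evaluate this propagation over a grid of future times to draw the predicted path as an overlay, updating it whenever the operator's control inputs change.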
Furthermore, the project explores ways to provide the operator with additional information about relative position and attitude without resorting to complex virtual-reality models. To this end, changes in the performance of the teleoperation system are measured, for example when the operator receives an additional camera view that can be freely positioned on a robot arm during the final approach.
Funded by the DFG within the framework of the Collaborative Research Centre SFB 453 "Realistic Telepresence" from April 2008 to 2010.