NASA’s autonomous robotics group has created a 3D robot user interface that could be the first step towards robots with independent minds.
The group is currently developing new technology to improve how humans explore the solar system, and how robots can help.
Terry Fong, senior scientist for autonomous systems at the NASA Ames Research Center, said the plan was to have humans interact with autonomous systems and build trust in those systems.
When NASA began working with remotely operated robots several years ago, Fong said, the scientists needed a piece of software that would let them look at terrain and sensor data coming from autonomous robots. That led to the creation of the VERVE interface, which allows scientists to see and grasp the three-dimensional world of remotely operated robots.
VERVE has been tested with NASA’s K10 planetary rover (a prototype mobile robot that can travel over bumpy terrain), with its K-Rex planetary rover (a robot that measures soil moisture), with SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellites) on the International Space Station (ISS), and with the new Astrobee, a robot that can fly around the ISS.
In 2013, NASA carried out a series of tests with astronauts on the ISS, during which astronauts flying 200 miles above Earth remotely operated the K10 planetary rover in California.
Maria Bualat, deputy lead of the intelligent robotics group at the NASA Ames Research Center, said that because of the time delay, astronauts can’t just “joystick a robot” and need the bot to complete tasks on its own.
“On the other hand, you still want the human in the loop, because the human brings a lot of experience and very powerful cognitive ability that can deal with issues that the autonomy’s not quite ready to handle.”
Together, human and robotic capabilities make a powerful combination.
One of the goals at NASA is to transfer its technology to the commercial sector, such as supporting autonomous vehicles in partnership with Nissan.