Scientists Will Teach Robots To Independently Perceive Space - Alternative View

Researchers from the Sorbonne and the French National Center for Scientific Research (CNRS) have studied how simplified spatial concepts can emerge in robotic systems from the robot's own sensorimotor flow. Their work, published in the arXiv.org preprint database, is part of a larger project in which scientists examine how fundamental concepts of perception (body, space, object, color, and so on) can arise in biological or artificial systems.

Until now, the development of robotic systems has mainly mirrored how humans perceive the world. As a result, robots guided solely by human intuition may be limited to perceiving only what people themselves experience.

To create fully autonomous robots, researchers may have to step back from conventional methods and allow robotic agents to develop their own perception of the world. According to the team from the Sorbonne and CNRS, a robot should gradually develop perception by analyzing its sensorimotor experience and identifying the regularities that make sense of it.

The agent can move its sensors in the external space using a motor. Although the configuration of external agent X may be identical, its sensory experience varies significantly depending on the structure of the environment.

Alexander Terekhov, who worked on the project, and his colleagues showed that the concept of space as a phenomenon independent of the environment cannot be derived from exteroceptive information alone, since that information varies greatly depending on what is happening in the environment. The concept can be pinned down more reliably by studying the functions that link the agent's motor commands to the resulting changes in its external stimuli.

“An important insight comes from old research by the famous French mathematician Henri Poincaré, who was interested in how mathematics in general, and geometry in particular, can arise in human perception,” Terekhov says. “He suggested that the timing of touch could be crucial.”

Poincaré's idea is easier to explain with a simple example. When we look at an object, the eye captures a specific image, which changes if the object moves 10 centimeters to the left. However, if we then also move 10 centimeters to the left, the image we see is restored almost exactly: our own motion compensates for the object's motion.
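This compensation principle can be sketched in a few lines of code. The following is a deliberately minimal, hypothetical 1D world (the function name and positions are illustrative, not from the paper): the sensor reading depends only on the object's position relative to the agent, so the agent's motion can cancel the object's motion.

```python
def sensor_reading(agent_pos: float, object_pos: float) -> float:
    """The sensed value depends only on the relative position (1D toy model)."""
    return object_pos - agent_pos

# Initial scene: agent at 0, object at 50 (say, centimeters).
before = sensor_reading(0.0, 50.0)

# The object shifts 10 cm to the left: the sensed "image" changes.
object_moved = sensor_reading(0.0, 40.0)

# The agent also moves 10 cm to the left: the original reading is restored.
both_moved = sensor_reading(-10.0, 40.0)

print(before, object_moved, both_moved)  # → 50.0 40.0 50.0
```

The invariance under compensated motion is exactly the kind of sensorimotor regularity that, in Poincaré's view, gives rise to the notion of space.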

To apply these ideas to the development of robotic systems, the scientists programmed a virtual robotic arm with a camera at its end. Each time the robot received an image, it also recorded the measurements taken from the joints of the arm.
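The setup can be sketched as follows, assuming a simple two-joint planar arm (the link lengths and function are hypothetical, not the paper's actual model). The point is that the camera's position and orientation are a deterministic function of the joint configuration, even though the robot never reads the pose directly; it only pairs joint measurements with images.

```python
import math

# Hypothetical link lengths for a two-joint planar arm (arbitrary units).
LEN1, LEN2 = 1.0, 0.8

def camera_pose(theta1: float, theta2: float):
    """Forward kinematics: joint angles -> camera (x, y, heading).

    The simulated robot has no direct access to this pose; it can only
    observe the joint angles and the resulting camera image.
    """
    x = LEN1 * math.cos(theta1) + LEN2 * math.cos(theta1 + theta2)
    y = LEN1 * math.sin(theta1) + LEN2 * math.sin(theta1 + theta2)
    heading = theta1 + theta2
    return x, y, heading

# In a static scene, joint configurations that yield the same pose produce
# the same image; that regularity is what the learned abstraction captures.
print(camera_pose(0.0, 0.0))  # → (1.8, 0.0, 0.0): arm fully extended
```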


“By combining all these measurements, the robot builds an abstraction that is mathematically equivalent to the position and orientation of its camera, even though it has no direct access to that information,” explains Terekhov. “Most importantly, even though this abstract concept is derived from images, it ultimately becomes independent of them, which means it works in any environment. Likewise, our concept of space does not depend on the specific scene we see.”