New AI algorithms could allow robots to learn to move by themselves, mimicking animals – ScienceDaily




For a newborn giraffe or wildebeest, birth can be a dangerous introduction to the world – predators lie in wait for an opportunity to make a meal of the herd's weakest member. This is why many species have evolved ways for their young to find their footing within minutes of birth.

It's an amazing evolutionary feat that has long inspired biologists and roboticists – and now a team of researchers at the USC Viterbi School of Engineering believe they are the first to create an AI-controlled robotic limb, driven by animal-like tendons, that can even be tripped and then recover within the time of the next stride – a task the robot was never explicitly programmed to do.

Francisco J. Valero-Cuevas, professor of biomedical engineering and professor of biokinesiology and physical therapy at USC, working with USC Viterbi School of Engineering doctoral student Ali Marjaninejad and two other doctoral students, Dario Urbina-Melendez and Brian Cohn, has developed a bio-inspired algorithm that can learn a new walking task by itself after only five minutes of unstructured play, then adapt to other tasks without any additional programming.

Their article, the March cover story of Nature Machine Intelligence, opens exciting possibilities for understanding human movement and disability, and for creating responsive prosthetics and robots that can interact with complex, changing environments such as those found in space exploration and search-and-rescue.

"Nowadays, it takes the equivalent of months or years of training for a robot to be ready to interact with the world, but we want to achieve the quick learning and adaptations seen in nature," said senior author Valero-Cuevas, who also holds appointments in computer science, electrical and computer engineering, mechanical and aerospace engineering, and neuroscience at USC.

Marjaninejad, a doctoral candidate in USC's Department of Biomedical Engineering and the paper's lead author, said this breakthrough is akin to the natural learning that happens in babies. Marjaninejad explains that the robot was first allowed to understand its environment through a process of free play (known as "motor babbling").

"These random movements of the leg allow the robot to create an internal map of its limb and its interactions with the environment," said Marjaninejad.
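The article describes motor babbling only at a high level; the following is a minimal illustrative sketch, not the team's actual algorithm. It assumes a hypothetical one-dimensional limb whose dynamics (`limb_response`) are unknown to the learner: the robot issues random commands, records the outcomes, fits a crude internal map, and then inverts that map to reach a target.

```python
import random

random.seed(0)

def limb_response(command):
    # Stand-in for the real limb dynamics (hidden from the learner):
    # displacement roughly proportional to the motor command, plus noise.
    return 2.0 * command + random.gauss(0.0, 0.05)

# Phase 1: motor babbling — issue random commands, log (command, outcome) pairs.
experience = [(c, limb_response(c))
              for c in (random.uniform(-1.0, 1.0) for _ in range(200))]

# Phase 2: build an internal map — here, a least-squares estimate of the
# command-to-displacement gain (the "map of its limb").
num = sum(c * o for c, o in experience)
den = sum(c * c for c, _ in experience)
gain = num / den  # learned gain, close to the true value of 2.0

# Phase 3: exploit the map — pick the command expected to reach a goal.
target = 0.6
command = target / gain
print(f"learned gain {gain:.2f}; command {command:.2f} "
      f"for target displacement {target}")
```

The real system learns a far richer, nonlinear map for a tendon-driven limb, but the three-phase structure – babble, model, exploit – is the idea the quote above is pointing at.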

The article's authors say that, unlike most current work, their robots learn by doing, without any prior or parallel computer simulations to guide the learning.

Marjaninejad added that this is particularly important because programmers can predict and write code for many scenarios, but not for every possible one – preprogrammed robots are therefore inevitably prone to failure.

"However, if you let these [new] robots learn from relevant experience, they will eventually find a solution that, once found, will be used and adapted as needed. The solution may not be perfect, but it will be adopted if it is good enough for the situation. Not every one of us needs, wants – or is able to spend the time and effort – to win an Olympic medal," Marjaninejad said.

During this process of discovering their bodies and environments, the robotic limbs designed in USC's Valero-Cuevas laboratory use their own unique experience to develop a gait that works well enough for them, producing robots with personalized movements. "You can recognize someone coming down the hall because they have a particular footfall, right?" asks Valero-Cuevas. "Our robot uses its limited experience to find a solution to a problem that then becomes its personalized habit, or 'personality.' We have the dainty walker, the lazy walker, the champion... you name it."

The potential applications of this technology are many, particularly in assistive technology, where robotic limbs and intuitive exoskeletons that fit a user's personal needs would be invaluable to those who have lost the use of their limbs. "Exoskeletons or assistive devices will need to naturally interpret your movements to accommodate your needs," Valero-Cuevas said.

"Because our robots can learn habits, they can learn your habits and mimic your movement style for the tasks you need in everyday life – even as you learn a new task, or as you grow stronger or weaker."

According to the authors, the research also has strong applications in space exploration and rescue missions, allowing robots to do what needs to be done without being escorted or supervised as they venture onto a new planet, or onto the uncertain and dangerous terrain left behind by a natural disaster. Such robots could adapt to low or high gravity, to loose rocks one day and to mud after rain the next, for example.

The paper's two other authors, doctoral students Brian Cohn and Dario Urbina-Melendez, weighed in on the research:

"The ability of a species to learn and adapt its movements as its body and environment change has been a powerful driver of evolution from the start," said Cohn, a doctoral candidate in computer science at the USC Viterbi School of Engineering. "Our work is a step forward in enabling robots to learn and adapt from each experience, just as animals do."

"I envision muscle-driven robots capable of mastering in minutes what an animal takes months to learn," said Urbina-Melendez, a doctoral student in biomedical engineering who believes in robotics' capacity to take bold inspiration from life. "Our work combining engineering, artificial intelligence, anatomy and neuroscience is a strong indication that this is possible."
