MIT robot could help people with reduced mobility get dressed

Robots have a lot of potential to help people with reduced mobility, including models that could help those who can't dress themselves. Dressing is a particularly difficult task, however, requiring dexterity, safety and speed. Now, scientists at MIT CSAIL have developed an algorithm that strikes that balance by permitting harmless impacts, rather than ruling out contact entirely as earlier approaches did.

Humans are hardwired to adapt to and anticipate other humans, but robots have to learn all of this from scratch. It's relatively easy for a person to help someone else get dressed, for example, because we instinctively know where to hold the garment, how an arm can bend, how the fabric behaves and more. A robot has to be programmed with all of that information.

In the past, algorithms have simply prevented robots from making contact with humans for the sake of safety. However, that can lead to what's known as the “frozen robot” problem, where the robot essentially stops moving and never completes its task.

To overcome this problem, a team at MIT CSAIL led by PhD student Shen Li developed an algorithm that redefines the safety of robotic motion to allow “safe impacts” in addition to collision avoidance. The robot can make harmless contact with a person to accomplish its task, as long as the force of that contact remains low.
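
To make the distinction concrete, here is a minimal sketch (not the team's actual code) contrasting a strict no-contact check with a relaxed check that tolerates contact whenever a simple impact-force estimate stays under a harmless threshold; the force model, parameters and threshold are assumptions chosen purely for illustration.

```python
import math

# Hypothetical threshold below which contact is treated as harmless.
MAX_SAFE_FORCE_N = 10.0


def predicted_impact_force(relative_speed, effective_mass, stiffness):
    """Crude spring-contact estimate of peak impact force: F = v * sqrt(k * m)."""
    return relative_speed * math.sqrt(stiffness * effective_mass)


def collision_free_ok(min_distance):
    """Strict constraint used by earlier planners: reject any contact at all."""
    return min_distance > 0.0


def safe_impact_ok(min_distance, relative_speed, effective_mass=2.0, stiffness=500.0):
    """Relaxed constraint: contact is acceptable if the impact would be harmless."""
    if min_distance > 0.0:
        return True
    return predicted_impact_force(relative_speed, effective_mass, stiffness) <= MAX_SAFE_FORCE_N


# A slow brushing contact fails the strict check (the "frozen robot" case)
# but passes the relaxed one, so the dressing motion can continue.
print(collision_free_ok(0.0))                    # False
print(safe_impact_ok(0.0, relative_speed=0.05))  # True: 0.05 * sqrt(1000) is roughly 1.6 N
```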

“Developing algorithms to prevent physical damage without unnecessarily affecting task efficiency is a critical challenge,” Li said. “By enabling robots to have a non-harmful impact on humans, our method can find efficient robot trajectories to dress humans with a guarantee of safety.”

For a simple dressing task, the system worked even when the person was doing something else, like checking their phone, as shown in the video above. It manages that by combining multiple models of how the person might move, rather than relying on a single model as before. “This multifaceted approach combines set theory, human-aware safety constraints, human motion prediction and feedback control for safe human-robot interaction,” said Zackory Erickson of Carnegie Mellon University.
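
The quote points to a set-style check over several motion hypotheses. The sketch below illustrates that idea in the simplest possible form; the predictors, the one-dimensional geometry and the is_safe test are assumptions for illustration, not the published method. A planned robot motion is accepted only if it stays safe under every prediction of how the person might move.

```python
# Toy 1-D example: accept a robot plan only if it is safe under *all*
# hypotheses about the human's arm motion (a worst-case, set-style check).

def predict_arm_still(arm_pos, horizon):
    return [arm_pos] * horizon                           # person holds the pose

def predict_arm_drifts(arm_pos, horizon, drift=0.01):
    return [arm_pos + drift * t for t in range(horizon)]  # e.g. reaching for a phone

def is_safe(robot_pos, human_pos, clearance=0.05):
    return abs(robot_pos - human_pos) > clearance         # placeholder safety test

def plan_is_safe(robot_plan, arm_pos, predictors):
    horizon = len(robot_plan)
    for predict in predictors:
        human_traj = predict(arm_pos, horizon)
        if not all(is_safe(r, h) for r, h in zip(robot_plan, human_traj)):
            return False                                   # unsafe under some hypothesis
    return True                                            # safe under every hypothesis

robot_plan = [0.50, 0.45, 0.40, 0.35]                      # planned positions, metres
print(plan_is_safe(robot_plan, arm_pos=0.10,
                   predictors=[predict_arm_still, predict_arm_drifts]))  # True
```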

The research is still at an early stage, but the ideas could apply well beyond dressing. “This research could potentially be applied to a wide variety of assistive robotics scenarios, with the ultimate goal of enabling robots to provide safer physical assistance to people with disabilities,” said Erickson.
