This prototype robot will let you feel what it is feeling




The prototype robot expresses its "anger" with both its eyes and its skin, which changes texture via fluidic actuators inflated beneath it, according to its "mood." Credit: Lindsay France

In 1872, Charles Darwin published his third major work on the theory of evolution, "The Expression of the Emotions in Man and Animals," which explores the biological aspects of emotional life.

In the book, Darwin writes: "Hardly any expressive movement is so general as the involuntary erection of the hairs, feathers and other dermal appendages … it is common throughout three of the great vertebrate classes." Nearly 150 years later, the field of robotics is starting to draw inspiration from those words.

"The aspect of touch has not been explored much in human-robot interaction, but I often thought about how people and animals have skin that expresses their internal state," said Guy Hoffman, assistant professor and Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering (MAE).

Inspired by this idea, Hoffman and students in his Human-Robot Collaboration and Companionship Lab have developed a prototype robot that can express "emotions" through changes in its outer surface. Its skin covers a grid of texture units (TUs) whose shapes are controlled by fluidic actuators, based on a design developed in the lab of Hoffman's colleague Rob Shepherd.

The design is described in a paper, "… for social robots," presented in April at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was the lead author; the paper was featured on May 16 in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.

Hoffman, whose TEDx talk on "Robots with 'soul'" has been viewed nearly 3 million times, said the idea of a robot that gives off nonverbal cues through its outer skin comes from the animal world, and from the notion that robots should not be thought of in human terms.

"I've always felt that robots shouldn't just be modeled after humans, or be copies of humans," he said. "We have a lot of interesting relationships with other species. Robots could be thought of as one of those 'other species,' not trying to copy what we do but interacting with us with their own language, tapping into our own instincts."

Part of our relationship with other species is our understanding of the nonverbal cues animals give off, such as the raised fur on a dog's back or a cat's neck, or the ruffled feathers of a bird. These are unmistakable signals that the animal is somehow aroused or angered, and the fact that they can be both seen and felt strengthens the message.

"Yuhan put it really well: she said that humans are part of the family of species; they're not disconnected," Hoffman said. "Animals communicate this way, and we have a sensitivity to this kind of behavior."

At the same time, a lot of technology is being developed around active materials, which can change their shape and properties on demand. In fact, one of the innovators in that field, Shepherd, head of the Organic Robotics Lab, works five doors down from Hoffman in Upson Hall.

"That's one of the nice things about being here at Cornell," Hoffman said. "Rob is down the hall, and that's how I found out about this technology. That kind of close collaboration is a big part of what made me so excited about joining Cornell."

Hoffman and Hu's design features two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are embedded in texture modules, with fluidic chambers connecting bumps of the same kind.

The team tried two different actuation control systems, aiming to minimize size and noise level. "One of the challenges," Hoffman said, "is that many shape-changing technologies are quite noisy, due to the pumps involved, and those also make them very bulky."
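To make the idea concrete, the mapping described above, from an emotional state to a bump shape and an inflation level, can be sketched in a few lines of code. This is purely a hypothetical illustration: the state names, shape labels, and pressure values below are invented, and the paper's actual control scheme is not reproduced here.

```python
# Hypothetical sketch: mapping a robot's "mood" to texture-unit actuation.
# All names and pressure values are invented for illustration only.
from dataclasses import dataclass


@dataclass
class TextureCommand:
    shape: str       # which bump type to inflate: "goosebumps" or "spikes"
    pressure: float  # normalized inflation pressure, 0.0 (flat) to 1.0 (fully raised)


# Each emotional state selects one bump shape and an inflation level;
# fluidic chambers connect bumps of the same kind, so one command drives them all.
MOOD_TO_TEXTURE = {
    "calm":    TextureCommand(shape="goosebumps", pressure=0.0),
    "excited": TextureCommand(shape="goosebumps", pressure=0.8),
    "angry":   TextureCommand(shape="spikes",     pressure=1.0),
}


def actuate(mood: str) -> TextureCommand:
    """Return the actuation command for a mood; unknown moods leave the skin flat."""
    return MOOD_TO_TEXTURE.get(mood, TextureCommand(shape="goosebumps", pressure=0.0))


print(actuate("angry"))  # spikes, fully inflated
```

A real controller would also have to schedule the pumps and valves that pressurize the chambers, which is exactly where the size and noise challenges Hoffman mentions come in.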

Hoffman does not have a specific application in mind for his robot with texture-changing skin mapped to its emotional state. At this point, simply showing that it can be done is an important first step. "It's really about giving us another way to think about how robots could be designed."

Future challenges include scaling the technology to fit in a self-contained robot, whatever shape that robot takes, and making the technology respond to the robot's immediate emotional changes.

"At the moment, most social robots express [their] internal state only by using facial expressions and gestures," the paper concludes. "We believe that the integration of a texture-changing skin, combining both haptic and visual modalities, can significantly enhance the expressive spectrum of robots for social interaction."
