Beware of robots' emotions: "Simulated love is never love"




SAN FRANCISCO (AP) – When a robot "dies," does it make you sad? For many people, the answer is "yes" – and that tells us something important, and potentially troubling, about our emotional responses to the social machines that are starting to enter our lives.

For Christal White, a 42-year-old marketing and customer service director in Bedford, Texas, that moment came several months ago with the cute, friendly Jibo robot perched in her home office. After more than two years in her house, the little humanoid with its round, welcoming screen "face" had begun to grate on her. Sure, it danced and played fun word games with her kids, but it also sometimes interrupted her during conference calls.

White and her husband, Peter, had already begun talking about moving Jibo into the empty room upstairs. Then they heard about the "death sentence" that Jibo's maker had imposed on the product as its business collapsed. The news arrived via Jibo itself, which announced that its servers would be shut down, effectively lobotomizing it.

"My heart shattered," she says. "It was like an annoying dog that you do not really like because it's your husband's dog. But then you realize that you like it from the beginning.

The Whites are far from the first to experience this feeling. People took to social media this year to say goodbye to the Mars rover Opportunity when NASA lost contact with the 15-year-old robot. A few years ago, many concerned commenters weighed in on a demonstration video from the robotics company Boston Dynamics in which employees kicked a robot to prove its stability.

Smart robots like Jibo obviously aren't alive, but that doesn't stop us from acting as though they were. Research has shown that people tend to project human traits onto robots, especially when they move or act in even vaguely human ways.

Designers acknowledge that such traits can be powerful tools for both connection and manipulation. That could become a particularly acute problem as robots move into our homes, especially if, like so many other home devices, they also become conduits for data collected on their owners.

"When we interact with another human, a dog or a machine, the way we treat it is influenced by the type of mind we think we have," said Jonathan Gratch, a professor at the University of Toronto. Southern California, which studies virtual human interactions. "When you feel that something has emotion, it now deserves to be protected."

The way robots are designed can influence people's tendency to project narratives and feelings onto mechanical objects, said researcher Julie Carpenter, who studies people's interactions with new technologies – especially if a robot has something resembling a face, a body that looks human or animal-like, or simply seems self-directed, like a Roomba robot vacuum.

"Even if you know that a robot has very little autonomy, when something moves in your space and that it seems to have a reason to be, we associate it with something that has an awareness or inner goals, "she said.

Such design decisions are also practical, she said. Our homes are built for humans and pets, so robots that look and move like humans or pets will fit in more easily.

Some researchers worry, however, that designers are underestimating the dangers of attachment to increasingly lifelike robots.

Sherry Turkle, a longtime AI researcher and professor at MIT, is concerned that design cues can trick us into thinking some robots are expressing emotion toward us. Some AI systems already present themselves as socially and emotionally aware, but those reactions are often scripted, making the machine seem "smarter" than it actually is.

"The performance of empathy is not empathy," she said. "The simulated thought may be thinking, but the simulated feeling is never felt. The simulated love is never the love. "

Designers at robotics startups insist that humanizing elements are critical as robot use expands. "There's a need to reassure the public, to show that you are not disrupting public culture," said Gadi Amit, president of NewDealDesign in San Francisco.

His agency recently worked on designing a new delivery robot for Postmates: a four-wheeled, bucket-shaped object with a cute, though abstract, face; rounded edges; and lights that indicate which way it is going to turn.

It will take time for humans and robots to establish a common language as they travel the world together, Amit said. But he expects it to happen within the next few decades.

But what about robots that work with children? In 2016, Dallas-based startup RoboKind introduced a robot called Milo, designed specifically to help teach social behaviors to children with autism. The machine, which resembles a young boy, is now in about 400 schools and has worked with thousands of children.

Richard Margolin, co-founder of RoboKind, said the company is sensitive to the possibility that children could grow too attached to the robot, which features human-like facial expressions.

RoboKind therefore builds limits into its curriculum, both to keep Milo interesting and to ensure that children are able to transfer those skills to real life. It recommends that children meet with Milo only three to five times a week, for 30 minutes each time.
