When Mangesh Gururaj's wife left home to pick up their child from a math lesson one recent Sunday, she started the family's Tesla and activated "Summon," an automated parking feature.
But as the $65,000 sedan backed out of the garage, Gururaj said, the car suddenly hit a wall, tearing off its front end with a sharp crack. The mangled Tesla would have kept driving, Gururaj said, if his wife had not hit the brakes.
No one was injured, but Gururaj was shaken: the car had failed disastrously during one of the simplest maneuvers, using one of the most basic features of the self-driving technology he had trusted with his family.
"It's just an accident in the garage, you can fix it, but what if we were summoning and there was a child he did not see?" said Gururaj, a computer consultant in North Carolina, who bought the car last year. "I had a lot of confidence in Tesla, as a car, but it's gone … You talk about great responsibility and your life is at stake."
The crash is an embarrassing incident for Tesla, whose chief executive, Elon Musk, unveiled Summon in 2016 with fanfare, claiming that it would soon allow owners to click a button and call their car to them from across the country.
But the crash also highlights a growing trust problem with driver-assistance technology and autonomous cars. The almost magical promise of robot-assisted, self-driving electric cars has given way to a more nuanced reality: cars that can also become confused, stall or crash, often with little warning or explanation.
This is not the first time the safety and capabilities of the Summon feature have been called into question. In 2016, a Tesla owner in Utah said his Model S had gone rogue after he parked it, lurching forward and impaling itself under a parked trailer. Tesla said the car's logs indicated the owner was at fault, but it later updated Summon with a new feature that could have prevented the crash.
When asked for details about Gururaj's accident, a Tesla spokesman pointed only to the car's owner's manual, which calls Summon a "beta feature" and says the car cannot detect certain narrow objects, such as bikes.
Driver-assistance systems such as Tesla's Autopilot have been involved in only a tiny fraction of the country's car crashes, and the companies developing these technologies say that, in the long run, they will improve road safety and save lives. Scrutiny of rare crashes, they add, is misplaced in a country where more than 40,000 people died on the roads last year.
But the causes of the collisions are often a mystery, leaving drivers like Gururaj deeply troubled by the possibility that they could happen again. Companies restrict access to the cars' internal computer logs and generally reveal little about what went wrong, saying that information about how the sensors and computers interact is proprietary and must remain secret in a competitive industry.
This uncertainty has contributed to drivers' apprehension about technology that has not yet been proven for public use. Two public polls released in July by the Brookings Institution think tank and the nonprofit Advocates for Highway and Auto Safety found that more than 60 percent of Americans surveyed said they did not want to ride in a self-driving car and were afraid to share the road with one.
Tesla says car owners need to constantly monitor their vehicles' movements and surroundings and be ready to stop at any time. But the company at the same time promotes its self-driving technology as more capable than human drivers: Tesla's website advertises "full self-driving hardware on all cars" and claims they can operate at a safety level greater than that of a human driver.
Cathy Chase, president of Advocates for Highway and Auto Safety, said Tesla's strategy of beta testing its features with everyday drivers on public roads was "incredibly dangerous."
"People feel flouted in a sense of security" about the safety and capacity of cars, Chase said. "Tesla's approach is risky at best and deadly at worst."
Tesla's Autopilot has been involved in high-profile accidents. In 2016, a Tesla owner in Florida was killed when his Model S, driving on Autopilot, struck a trailer crossing in front of him on the highway. The car neither slowed nor stopped to prevent the crash, but federal road-safety investigators did not fault the company, saying that Autopilot requires the "continual and full attention" of the driver.
In California this year, Tesla vehicles operating on Autopilot have hit the back of a parked police vehicle and a parked fire truck. The National Transportation Safety Board is also investigating an Autopilot crash from March, in which a California driver was killed after his Model X automatically accelerated to 70 mph in the three seconds before it hit a highway barrier.
Tesla has blamed the Autopilot crashes on human error, suggesting that the people in the driver's seat accidentally hit the pedal or were not paying attention. The company has also designed the cars to repeatedly warn drivers to remain alert, flashing notifications when, for example, the driver's hands cannot be detected on the steering wheel.
Gururaj said Tesla retrieved the car's computer logs to investigate the crash in his garage. But the company told him it would not share any information about what had happened, adding in an email: "You are responsible for the operation of your vehicle even in Summon mode."
Gururaj's family, he said, had used Summon hundreds of times over the past year: "We thought it was the coolest feature." But he said he would stop using the feature for fear it would not work properly with no one at the wheel. He also said he was baffled that Tesla's response focused on why the human had not intervened quickly enough, rather than on why the car had driven into a wall.
"They want us to trust technology because its response time is faster than that of humans, which is the very concept of automation," he said. "For them to absolutely tell the customer to stop, it's really worrisome, if the car does not smell something in front or on the side, then they should not put this in the spotlight. thu. "