When Mangesh Gururaj's wife left home to pick up the children from math class one Sunday early this month, she started the family's Tesla and activated "Summon," the feature that lets the car pull itself out of the garage.
But as the $65,000 sedan backed out of the garage, Gururaj said, the car suddenly slammed into a wall, tearing off its front end with a sharp crack. The mangled Tesla would have kept driving, Gururaj said, had his wife not hit the brakes.
No one was hurt, but Gururaj was shaken: the car had failed disastrously, during one of the simplest maneuvers, while using one of the most basic features of the self-driving technology he had trusted with his family.
"This is only an accident in the garage.You can solve this problem.But if we summoned and there was a child that he had not seen?" " said Gururaj, a computer consultant in North Carolina, who bought the car last year. "I had a lot of confidence in Tesla, as a car, but it left … You talk about great responsibility and your life is at stake."
The crash is an embarrassing incident for a Tesla feature that chief executive Elon Musk unveiled with fanfare in 2016, claiming it would soon allow owners to click a button and summon their car from across the country.
But the crash also highlights a growing trust problem with driver-assistance and autonomous-car technology. The promise of almost magical, robot-assisted driving has given way to a more nuanced reality: cars that can also stall, get confused or crash, often with little warning or explanation.
This is not the first time the safety and capabilities of the "Summon" feature have been called into question. In 2016, a Tesla owner in Utah said his Model S had gone rogue after he parked it, lurching forward and impaling itself under a parked trailer. Tesla said the car's logs indicated the owner was at fault, but it later updated "Summon" with a new feature that could have prevented the crash.
When asked for details of Gururaj's accident, a Tesla spokesman pointed only to the car's owner's manual, which calls Summon a "beta feature" and says the car may not detect a range of narrow objects, such as bikes.
Driver-assistance systems such as Tesla's "Autopilot" have been involved in a tiny fraction of the country's car crashes, and the companies developing these technologies say they will ultimately improve road safety and save lives. Scrutinizing rare accidents, they add, is misguided in a country where more than 40,000 people died on the roads last year.
But the causes of the collisions are often a mystery, leaving drivers like Gururaj deeply troubled by the possibility that they could happen again. Companies restrict access to the cars' internal computer logs and generally reveal little about what went wrong, saying that information about how the cars' sensors and computers interact is proprietary and must remain secret in a competitive industry.
That uncertainty has fed drivers' apprehension about a technology that is still unproven for public use. Two surveys released in July, by the Brookings Institution think tank and the nonprofit Advocates for Highway and Auto Safety, found that more than 60 percent of Americans polled said they were unwilling to ride in a self-driving car and were afraid to share the road with one.
Tesla says car owners must continually monitor their vehicle's movement and surroundings and be prepared to stop at any time. But Tesla at the same time bills its self-driving technology as more capable than human drivers: the company's website promises "full self-driving hardware on all cars," claiming they operate "at a safety level substantially greater than that of a human driver."
Cathy Chase, president of Advocates for Highway and Auto Safety, said Tesla's strategy of beta-testing technologies with regular drivers on public roads was "incredibly dangerous."
"People feel flouted in a sense of security" about the safety and capacity of cars, Chase said. "Tesla's approach is risky at best and deadly at worst."
Tesla's Autopilot has been involved in high-profile accidents. In 2016, a Tesla owner in Florida was killed when his Model S, driving on Autopilot, struck a tractor-trailer crossing the highway ahead of him. The car did not slow or stop to prevent the crash, but federal road-safety investigators did not cite the company for safety defects, saying Autopilot required the driver's "continual and full attention."
In California this year, Teslas driving on Autopilot have rear-ended a parked police vehicle and a parked fire truck. The National Transportation Safety Board is investigating another Autopilot crash from March, in which a California driver was killed after his Model X automatically accelerated to 70 mph in the final three seconds before it hit a highway barrier.
Tesla has blamed Autopilot crashes on human error, suggesting that the people in the driver's seat had accidentally hit a pedal or were not paying attention. The company has also designed its cars to repeatedly warn drivers to remain alert, flashing notifications when, for example, the driver's hands cannot be detected on the steering wheel.
Gururaj said Tesla pulled the car's computer logs to investigate the garage crash. But the company told him it would not share any information about what had happened, adding in an email: "You are responsible for the operation of your vehicle even in Summon mode."
Gururaj said his family had used "Summon" hundreds of times over the past year: "We thought it was the coolest feature." But he said he will stop using such features for fear they won't work properly. He also said he was baffled that Tesla's response focused on why the humans had not intervened quickly enough, rather than on why the car had driven itself into a wall.
"They want us to trust technology because its response time is faster than that of humans, which is the very concept of automation," he said. "For them to absolutely tell the customer to stop, it's really worrisome, if the car does not smell something in front or on the side, then they should not put this in the spotlight. thu. "