Elon Musk said a Tesla could drive itself across the country by 2018. One just crashed backing out of a garage.



A Tesla Model S charges at a charging station in Switzerland. (Stefan Wermuth / Bloomberg News)

When Mangesh Gururaj's wife left home to pick up their child from a math lesson one Sunday earlier this month, she started the family's Tesla and activated "Summon," the car's automated parking feature.

But as the $65,000 family sedan backed out of the garage, Gururaj said, the car smashed into the garage's side wall, ripping off part of its front end with a sharp crack. The mangled Tesla would have kept driving, Gururaj said, if his wife had not hit the brakes.

No one was hurt, but Gururaj was shaken: the car had failed spectacularly during one of the simplest maneuvers, using one of the most basic features of the self-driving technology that he and his family had often trusted at far higher speeds.

"It's just an accident in the garage.You can solve this problem.And what if we invoked and there was a child that he did not see?", Said Gururaj, a consultant computer in North Carolina. "I had a lot of confidence in Tesla, as a car, but it's gone, you talk about great responsibility and your life is at stake."

The crash is an embarrassing incident for a feature that Elon Musk, Tesla's chief executive, unveiled with fanfare in 2016, saying it would soon allow owners to tap a button and have their car drive itself across the country to them.

"In about 2 years, summoning should work anywhere connected by land and not blocked by borders, for example, you're in Los Angeles and the car is in New York," Musk said. tweeted in 2016.

But the crash also highlights the growing trust problem with driver-assistance technology and autonomous cars. The promise of almost magical, robot-assisted self-driving has given way to a more nuanced reality: cars that can stall, misfire or crash, often with little warning or explanation.

This is not the first time the safety and capabilities of the "Summon" feature have been called into question. In 2016, a Tesla owner in Utah said his Model S had gone rogue after he parked it, lurching forward and impaling itself beneath a parked trailer. Tesla said the car's logs showed the owner was at fault, but it later updated "Summon" with a new safeguard that could have prevented the crash.

When asked for details about Gururaj's crash, a Tesla spokesman pointed only to the car's owner's manual, which calls Summon a "beta feature" and says the car cannot detect certain narrow objects, such as a bike.

Driver-assistance systems such as Tesla's Autopilot have been involved in only a tiny fraction of the country's car accidents, and the companies developing these technologies say that, over the long run, they will improve road safety and save lives. The scrutiny of rare crashes, they add, is misguided in a country where more than 40,000 people died on the roads last year.

But the causes of such collisions are often a mystery, leaving drivers such as Gururaj deeply troubled by the possibility that they could happen again. The companies restrict access to the cars' internal computer logs and generally reveal little about what went wrong, arguing that information about how their cars' sensors and computers interact is proprietary and must remain secret in a competitive industry.

This uncertainty has contributed to drivers' apprehension about a technology that has not yet been proven for public use. Two public polls released in July, by the Brookings Institution think tank and the nonprofit Advocates for Highway and Auto Safety, found that more than 60 percent of Americans surveyed said they were unwilling to ride in a self-driving car and feared sharing the road with one.

Tesla says car owners must constantly monitor their vehicle's movements and surroundings and be ready to stop at any time. But Tesla at the same time pitches its self-driving technology as more capable than human drivers: the Tesla website promises "full self-driving hardware on all cars," claiming they operate "at a safety level substantially greater than that of a human driver."

Cathy Chase, president of Advocates for Highway and Auto Safety, said Tesla's strategy of testing beta features with everyday drivers on public roads was "incredibly dangerous."

"People feel flouted in a sense of security" about the safety and capacity of cars, Chase said. "The Tesla approach is risky at best and deadly at worst."

Tesla's Autopilot has been involved in several high-profile accidents. In 2016, a Tesla owner in Florida was killed when his Model S, driving on Autopilot, plowed into a tractor-trailer crossing its path on a highway. The car neither slowed nor stopped to prevent the crash, but federal road-safety investigators did not fault the company, saying Autopilot required the driver's "continual and full attention."

In California this year, Tesla vehicles on Autopilot have crashed into the back of a parked police car and a parked fire truck. The National Transportation Safety Board is investigating another Autopilot accident from March, in which a California driver was killed after his Model X automatically accelerated to 70 mph in the final three seconds before hitting a highway barrier.

Tesla has blamed Autopilot crashes on human error, suggesting the people in the driver's seat mistakenly hit a pedal or were not paying attention. The company has also designed its cars to repeatedly warn drivers to remain alert, flashing notifications when, for example, the driver's hands cannot be detected on the steering wheel.

Gururaj said Tesla pulled the computer logs from the car to investigate the garage crash. But the company told him it would not share any information about what had happened, adding in an email: "You are responsible for the operation of your vehicle even when using Summon."

Gururaj said his family had used "Summon" hundreds of times over the past year, saying, "We thought it was the coolest feature." He also said he was baffled that Tesla's response centered on why the human had not intervened quickly enough, rather than on why the car had driven into a wall.

"They want us to trust technology because its response time is faster than that of humans. That's the whole concept of automation, "he said. "For them, it's up to the customer to stop everything, it's really worrying. If the car can not feel something on the front or side, then they should not put that as a feature. "
