Feds say Tesla's Autopilot was partly responsible for a 2018 crash


The design of Tesla's Autopilot feature contributed to a January 2018 crash in which a Model S sedan slammed into the back of a fire truck in Southern California, according to federal safety investigators. This is the second time the National Transportation Safety Board has found Tesla partly responsible for a crash involving the semi-automated feature. The board said it is also investigating two other collisions involving Autopilot.

No one was injured in the 2018 crash, but investigators found that the driver had switched on Autopilot about 14 minutes before the collision and had not actively steered for the final 13 minutes. Investigators said the driver's inattention and overreliance on Autopilot were the probable causes of the crash. During those final 14 minutes, the car warned the driver to put his hands on the steering wheel four times, but not in the four minutes immediately before the crash, investigators found.

Investigators said the driver used Autopilot in a way that was "inconsistent" with Tesla's guidance. The driver said he learned to use Autopilot from a Tesla salesperson but did not read the owner's manual, which tells drivers exactly when and where Autopilot should be used.

The incident underscores what industry observers and even Tesla itself have long said: Autopilot is not a self-driving technology. It requires drivers' attention, even when the road ahead looks like smooth sailing.

But investigators also seem to believe Tesla is not doing enough to make Autopilot safe. In its report, the NTSB pointed to a recommendation it issued after another Autopilot-involved crash that killed a driver in Florida in 2016. The board asked automakers to "develop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking" while "automated vehicle control systems" are in use. Tesla has since changed how Autopilot works, requiring drivers to put their hands on the steering wheel more often while the feature is engaged. But the NTSB appears to believe that is not enough.

"Deceive me once, shame on you, deceive me twice, shame on me, deceive myself four, five or six times now, that's too much," says David Friedman, former acting chief of the National Highway Traffic Safety Administration and now Director of Advocacy at Consumer Reports. "If Tesla does not repair the autopilot, then [the federal government] should do it for them. "(The NTSB can only recommend security improvements, NHTSA may adopt regulations.)

Tesla said in a statement: "Tesla drivers have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance. Our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, but we have also introduced numerous updates to make our safeguards smarter, safer, and more effective across every hardware platform we have deployed. Since this incident occurred, we have made updates to our system, including adjusting the time intervals between hands-on warnings and the conditions under which they are activated."

The vehicle in the 2018 crash, in Culver City, Calif., was a 2014 model. Tesla has since revised its vehicles' hardware, including forward-facing cameras, radar, and ultrasonic sensors. (CEO Elon Musk has said today's Teslas have all the hardware needed to drive themselves, and the electric carmaker is still working on the software.)
