Tesla's Model 3 used autopilot before crashing fatally into semi, NTSB says




In a preliminary report on the March 1 crash, the NTSB said the Tesla's data and video indicate that the driver activated Autopilot about 10 seconds before the accident on a divided highway with turn lanes in the median. For roughly the final eight seconds before the crash, the driver's hands were not detected on the steering wheel, the NTSB report says.



Neither the data nor the video indicates that the driver or the Autopilot system braked or tried to avoid the semitrailer, the report said.

The Model 3 was traveling at 68 mph when it struck the trailer on U.S. 441, where the speed limit was 55 mph, the report said. The driver, Jeremy Beren Banner, 50, was killed.

Tesla said Thursday in a statement that Banner had not used Autopilot at any other time during the drive before the crash. Vehicle logs show that he took his hands off the steering wheel immediately after activating Autopilot, the statement said.

Tesla also said it was saddened by the crash and that drivers have traveled more than a billion miles with Autopilot engaged.

"When used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance," the company said.

The circumstances of the Delray Beach crash closely resemble those of a May 2016 crash near Gainesville, Florida. Joshua Brown, 40, of Canton, Ohio, was traveling in a Tesla Model S on a divided highway with the Autopilot system engaged when he was killed.

Neither Brown nor the car braked for a tractor-trailer that turned left in front of the Tesla, crossing its path. Brown's Tesla also went under the trailer, and its roof was sheared off. After that crash, Tesla CEO Elon Musk said the company had updated its system so that radar would play a larger role in detecting objects.

David Friedman, who was acting head of NHTSA in 2014 and is now a vice president at Consumer Reports, said he was surprised the agency did not declare Autopilot defective after the Gainesville crash and demand a recall. The Delray Beach crash, he said, shows that Autopilot is being allowed to operate in situations it cannot handle safely.

"Their system cannot literally see the broad side of an 18-wheeler on the highway," Friedman said.



The Tesla system is too slow to warn the driver to pay attention, unlike systems Consumer Reports has tested from General Motors and other companies, Friedman said. GM's Super Cruise driver-assistance system operates only on divided highways without median turn lanes, he said.

Tesla needs a better system to detect more quickly whether drivers are alert and to warn them if they are not, Friedman said, adding that some owners tend to rely too heavily on the system.

"Tesla has for too long been using human drivers as guinea pigs, and this is tragically what is happening," he said.

To force a recall, NHTSA must conduct an investigation and show that a vehicle's design falls outside industry standards. "There are multiple systems on the roads right now that take over some level of speed and steering control, but we keep hearing about only one of them in connection with crashes and deaths. That kind of system stands out," Friedman said.

NHTSA said Thursday that its investigation is ongoing and that its findings will be made public once it is complete.

The Delray Beach crash casts doubt on Musk's statement that Tesla will have fully self-driving vehicles on the road next year. Musk said last month that Tesla had developed a powerful computer that could use artificial intelligence to navigate roads safely using the same cameras and radar sensors currently installed on Tesla vehicles.

"Show me the data," Friedman said. "Tesla has a long record of making bold claims with the evidence to come later. They are literally demonstrating how not to do it by rushing technology to market."

In a 2017 report on the Gainesville crash, the NTSB wrote that limitations of Autopilot's design played a major role. The agency said Tesla told Model S owners that Autopilot should be used only on limited-access highways, mainly freeways. Despite upgrades to the system, the report said, Tesla did not incorporate safeguards against its use on other types of roads.

The NTSB found that the Model S's cameras and radar were not capable of detecting a vehicle crossing its path. Such systems are instead designed to detect vehicles ahead of them in order to prevent rear-end collisions.
