NTSB says Autopilot design flaw, driver error caused 2018 Tesla crash




DETROIT – A design flaw in Tesla's Autopilot steering system and driver inattention combined to send a Model S electric car into a fire truck parked along a California freeway, a government investigation has found.

The National Transportation Safety Board determined that the driver was overly reliant on the system and that Autopilot's design allowed him to disengage from driving.

The agency on Wednesday released a brief report outlining the probable cause of the January 2018 crash in the carpool lane of Interstate 405 in Culver City, near Los Angeles.

The findings raise questions about the effectiveness of Autopilot, which was engaged but failed to brake in the Culver City crash, as well as in three other crashes in which drivers were killed since 2016.

According to the report, no one was injured in the I-405 crash, which involved a 2014 Tesla Model S traveling at about 50 km/h at the time of impact.

The crash occurred after a larger vehicle ahead of the Tesla, described by the driver as an SUV or pickup truck, moved out of its lane. The Tesla then hit the fire truck, which was parked with its emergency lights flashing while firefighters handled a different crash.

The probable cause of the rear-end collision was the driver's failure to respond to the fire truck "due to his inattention and overreliance on the vehicle's advanced driver assistance system; the Tesla Autopilot design, which permitted the driver to disengage from the driving task; and the driver's use of the system in ways inconsistent with the manufacturer's guidance and warnings," the NTSB wrote in the report.

Tesla

TSLA, -1.92%

has repeatedly said the semi-autonomous system is designed to assist drivers, who must pay attention and be ready to intervene at all times. The company says Teslas with Autopilot are safer than vehicles without it, and that the system does not prevent all crashes.

CEO Elon Musk has promised a fully autonomous system next year, using the same sensors as the current Teslas, but with a more powerful computer and software. Current Teslas have more sensors than the 2014 model in the crash.

The report says the Tesla's automatic emergency braking did not activate, and the driver, a 47-year-old man commuting from his home in Woodland Hills to Los Angeles, did not brake. In addition, the driver's hands were not detected on the steering wheel in the moments leading up to the crash, the report says.

Cell phone data showed that the driver was not using his phone to talk or text in the minutes before the accident, but the NTSB could not determine if any applications were used.

A statement from a driver in a nearby vehicle provided by Tesla indicated that the driver appeared to be looking at a cell phone or other device before the accident.

The conclusion of the NTSB is another black mark against the autopilot system, which has been activated in three fatal accidents in the United States, including two in Florida and one in Silicon Valley.

In the Florida crashes, one in 2016 and the other in March of this year, the system failed to brake for semitrailers turning across the road in front of the Teslas, and the cars went under the trailers. In the other fatal crash, in Mountain View, California, in March 2018, Autopilot accelerated just before the Model X SUV slammed into a highway barrier, killing its driver, the NTSB found.

The NTSB investigates traffic crashes and makes safety recommendations to another federal agency, the National Highway Traffic Safety Administration, which has the power to seek recalls and issue regulations.

David Friedman, a former acting NHTSA administrator who is now vice president of advocacy at Consumer Reports, said Tesla has known for years that its system lets drivers check out of the driving task, yet it hasn't taken the problem seriously.

The autopilot can steer a car in its lane, change lanes with the driver's permission, stay at a safe distance from the vehicles ahead and automatically brake to avoid an accident.

Some drivers will always rely too heavily on assist systems, and the system must be programmed to handle this, Friedman said. Autopilot, he said, warns the driver at varying intervals if it does not detect torque on the steering wheel. But unlike a similar General Motors system, it does not watch the driver's eyes to make sure he is paying attention, Friedman said.

"It's unrealistic to try to train people to automate," Friedman said. "You have to train automation for people."

Tesla's sensors couldn't see the side of an 18-wheeler in previous crashes, he said. "Is it so shocking that it can't see a fire truck? We've known about this for at least three years," said Friedman, who called on NHTSA to declare Autopilot defective and force Tesla to recall it so that drivers remain engaged.

The Center for Auto Safety, another advocacy group, also called for a recall.

"In simple terms, a vehicle that allows a driver to not pay attention, or to fall asleep, while speeding up in a parked fire truck is faulty and dangerous," the group said in a statement. "Any company that encourages such behavior should be held accountable, and any agency that fails to act bears the same responsibility for the next fatal incident."

NHTSA said it would review the NTSB's report and "will not hesitate to act if NHTSA identifies a safety-related defect."

Tesla said in a statement on Wednesday that Autopilot repeatedly reminds drivers to remain alert and prohibits use of the system when warnings are ignored.

"Since this incident, we have updated our system, including changing the time intervals between the practical warnings and the conditions in which they are activated," the statement said. Tesla said the frequency of warnings varies depending on speed, acceleration, surrounding traffic and other factors.

In the Culver City crash, the larger vehicle ahead of the Tesla changed lanes three to four seconds before impact, revealing the parked fire truck, the NTSB said.

"The system was not able to immediately detect the danger and accelerated the Tesla to the truck at a standstill," the report says. The system spotted the fire truck and sent a collision warning to the driver a little less than half a second before the impact – it's too late for a driver to act, writes L & R. # 39; agency.

The NTSB found that the stationary vehicle in the Tesla's path made it difficult for the system to assess the threat and brake. It noted that detecting stationary objects is a challenge for all makers of driver-assistance systems.
