Tesla to Expand Full Self-Driving Beta, but Top Safety Official Says It Must Tackle ‘Basic Safety Issues’ First




Photo: Justin Sullivan (Getty Images)

Tesla is preparing to roll out a major update to its “Full Self-Driving” mode that would expand beta testing of the feature to more customers and areas. But first, the automaker needs to address some “basic safety issues,” Jennifer Homendy, chair of the U.S. National Transportation Safety Board, said in a recent interview with The Wall Street Journal.

Full Self-Driving is a more advanced version of Autopilot, Tesla’s driver-assistance system designed for highway navigation. Despite their names, none of the versions of Tesla’s driver-assistance software are fully autonomous, and Tesla cautions that a human driver should remain vigilant behind the wheel, ready to take over at any time.

Homendy called it “misleading and irresponsible” that Tesla advertises its software as “full self-driving,” adding that the company has “clearly misled numerous people into misusing and abusing the technology.”

A beta version of Full Self-Driving mode launched in October 2020 to a small group of select Tesla drivers. After announcing plans for a wider release by the end of September, Tesla CEO Elon Musk said on Friday that drivers who want to try the latest version of Full Self-Driving will get access to a “beta request” button around October 1.

“The beta button will ask for permission to assess driving behavior using the Tesla insurance calculator,” he wrote on Twitter. “If the driving behavior is good for 7 days, beta access will be granted.”

The update is also expected to add new tools to help drivers navigate city streets as well as highways. But Homendy thinks the move is dangerously premature:

“Basic safety issues have to be addressed before it’s expanded to other city streets and other areas,” she told the Journal.

The NTSB, which can conduct investigations and issue recommendations but has no regulatory authority, has previously investigated three fatal Tesla crashes involving the company’s Autopilot system. It launched a fourth investigation on Friday after two people were killed in a crash involving a Tesla Model 3 in Coral Gables, Florida. In February 2020, the board determined that Tesla’s Autopilot software was one of the probable causes of a fatal 2018 crash in Mountain View, California, in which the driver had been playing a mobile game when the collision occurred.

In 2017, the NTSB advised Tesla and five other automakers to improve the safety of their semi-autonomous vehicles to make it more difficult for drivers to misuse them. The other five companies responded and agreed to adopt stricter safeguards. Tesla alone ignored the NTSB’s recommendations, though it has tweaked some of its safety features in the years since, such as increasing the frequency of alerts when a driver using Autopilot takes their hands off the steering wheel.

Tesla did not immediately respond to Gizmodo’s request for comment. The company has largely stopped responding to media inquiries since the disbandment of its public relations department in October 2020.


