- The new NTSB chief called Tesla's use of the term "Full Self-Driving" "misleading and irresponsible," the Wall Street Journal reported.
- Elon Musk admitted that FSD was “not great” and could improve.
- The moniker has been criticized by regulators who say it can trick drivers into thinking cars are fully autonomous when they are not.
A leading US safety regulator has said Tesla needs to address some major safety concerns, and has criticized the company's use of the term "Full Self-Driving" for its driver-assistance technology as "irresponsible," according to a Wall Street Journal report published on Sunday.
Tesla CEO Elon Musk announced last week that Tesla drivers can soon expect a new version of FSD, an enhanced version of Autopilot, the driver-assistance software that comes with every Tesla vehicle. FSD does not make the car fully autonomous, but it allows the vehicle to change lanes, park, and recognize traffic lights and stop signs.
However, Jennifer Homendy, the new head of the National Transportation Safety Board, told the Journal in an interview that the upcoming release is premature.
"Basic safety concerns need to be addressed before it is expanded to other city streets and other areas," Homendy told the Journal.
Homendy called the electric carmaker's use of the term "Full Self-Driving" "misleading and irresponsible." The moniker, along with the name Autopilot, has been criticized by regulators and lawmakers who say it can trick drivers into thinking cars are fully autonomous when they are not.
"It has clearly misled numerous people to misuse and abuse the technology," she told the Journal.
Last month, Musk said the FSD software was "not great" and could be improved. While Musk noted the most recent version of the technology has been "much improved," he said in a July tweet that drivers should be "paranoid."
Tesla did not respond to Insider's request for comment on the upcoming FSD update.
Experts advise drivers to understand the software's limitations before getting behind the wheel. Homendy also said that agencies with regulatory and enforcement power, which the NTSB does not have, should move aggressively to regulate driver-assistance technology for the safety of consumers.
US safety regulators have launched an investigation into Autopilot after a number of Tesla vehicles collided with vehicles at first-responder scenes. A driver was arrested on Thursday for impaired driving after crashing into a Southern California freeway wall while her car was on Autopilot, Insider reported.
Earlier this year, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into Autopilot's role in 30 crashes that killed 10 people. The NHTSA has already ruled out Autopilot in three of those crashes.