By Hyunjoo Jin, David Shepardson and Tina Bellon
BERKELEY, Calif. (Reuters) – The fatal crash of a Tesla with no one apparently behind the wheel has cast new light on the safety of semi-autonomous vehicles and the nebulous U.S. regulatory terrain in which they operate.
Police in Harris County, Texas, said a Tesla Model S crashed into a tree at high speed on Saturday after failing to negotiate a turn and caught fire, killing one occupant found in the front passenger seat and the owner, found in the back seat.
Tesla Chief Executive Elon Musk tweeted on Monday that preliminary data recovered by Tesla indicates the vehicle was not operating on Autopilot and was not equipped with the automaker’s Full Self-Driving (FSD) capability.
Tesla’s Autopilot and FSD, along with the growing number of similar semi-autonomous driving features in cars made by other automakers, present a challenge for officials responsible for motor vehicle and highway safety.
The National Highway Traffic Safety Administration (NHTSA), the U.S. federal agency responsible for road safety, has yet to issue specific regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles (AVs).
There are no NHTSA rules requiring automakers to ensure the systems are used as intended or to prevent drivers from misusing them. The only significant federal constraint is that vehicles must have steering wheels and human controls, as required under federal rules.
In the absence of performance or technical standards, systems such as autopilot inhabit a regulatory gray area.
The Texas crash follows a series of accidents involving Tesla cars driven on Autopilot, the company’s partially automated driving system that performs a range of functions such as helping drivers stay in their lanes and steer on highways.
Tesla has also rolled out what it describes as a “beta” version of its FSD system to around 2,000 customers since October, allowing them to test how well it works on public roads.
Harris County police are now seeking a search warrant for Tesla’s data and said witnesses told them the victims intended to test the car’s automated driving.
Adding to the regulatory confusion, NHTSA traditionally regulates vehicle safety while departments of motor vehicles (DMVs) in individual states oversee drivers.
When it comes to semi-autonomous features, it may not be obvious whether the onboard computer or the driver is controlling the car, or whether supervision is shared, according to the U.S. National Transportation Safety Board (NTSB).
California has introduced AV regulations, but they apply only to cars equipped with technology that can perform the dynamic driving task without active physical control or monitoring by a human operator, the state’s DMV told Reuters.
It said Tesla’s Full Self-Driving system does not yet meet those standards and is considered a type of advanced driver assistance system, which it does not regulate.
That leaves Tesla’s Autopilot and its FSD system operating in regulatory limbo in California as the automaker rolls out new versions of the systems for its customers to test.
NHTSA, the federal body responsible for vehicle safety, said this week it has opened 28 investigations into Tesla vehicle crashes, of which 24 remain active; at least four, including the fatal Texas crash, have occurred since March.
The NHTSA has repeatedly argued that its broad power to require automakers to recall any vehicle that poses an unreasonable safety risk is sufficient to address driver assistance systems.
So far, NHTSA has not taken any enforcement action against Tesla’s advanced driving systems.
White House spokeswoman Jen Psaki said NHTSA was “actively engaged with Tesla and local law enforcement” on the Texas crash.
The NTSB, a U.S. government agency charged with investigating road accidents, has criticized NHTSA’s hands-off approach to regulating cars with autonomous features and self-driving vehicles.
“NHTSA refuses to take action on vehicles with partial or lower levels of automation and continues to wait for higher levels of automation before requiring that AV systems meet minimum national standards,” NTSB Chairman Robert Sumwalt wrote in a Feb. 1 letter to NHTSA.
“Because NHTSA has no requirements in place, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the limits of AV control systems,” the letter said.
REVIEW OF REGULATIONS
NHTSA told Reuters that, with a new administration in place, it is reviewing autonomous vehicle regulations and welcomes the NTSB’s input as it advances policies on automated driving systems.
It said the most advanced vehicle technologies on sale require a fully attentive human driver at all times.
“Abusing these technologies is, at the very least, distracted driving. Every state in the country holds the driver responsible for the safe operation of the vehicle,” NHTSA told Reuters.
The NTSB also says NHTSA has no method of verifying whether automakers have adopted system safeguards. For example, there are no federal regulations requiring drivers to touch the steering wheel within a specific time frame.
“NHTSA is in the process of drafting autonomous vehicle rules, but regulation of semi-autonomous vehicles has been slow,” said Bryant Walker Smith, professor of law at the University of South Carolina. “There is a growing awareness that they deserve higher priority review and regulatory action.”
New York has a law requiring drivers to keep at least one hand on the wheel at all times, but no other state has legislation that could prevent the use of semi-autonomous cars.
As for AVs, 35 states have enacted laws or seen governors sign executive orders covering them, according to the National Conference of State Legislatures.
Such rules allow companies such as Alphabet’s Waymo and General Motors’ Cruise, among others, to test their vehicles on public roads.
But regulations differ from state to state.
Texas AV regulations state that vehicles must comply with NHTSA processes, although no such federal regulations exist. The Texas Department of Public Safety, the body that oversees AVs in the state, did not respond to a request for comment.
The Arizona Department of Transportation requires companies to regularly submit filings to verify, among other things, that vehicles can operate safely in the event of autonomous technology failure.
While most automakers offer vehicles with various forms of assisted driving, there are no fully autonomous vehicles for sale to customers in the United States.
Concerns about the safety of autonomous driving technology, however, have grown in recent years and Tesla has warned of its limits.
In February 2020, Andrej Karpathy, Tesla’s director of autonomous driving technology, identified a challenge for its Autopilot system: recognizing when the flashing emergency lights of a parked police car are on.
“This is an example of a new task that we would like to know about,” Karpathy said at a conference on Tesla’s efforts to deliver FSD technology.
In just over a year since then, Tesla vehicles have crashed into police cars parked on roads on four occasions, and since 2016 at least three Teslas operating on Autopilot have been involved in fatal crashes.
U.S. safety regulators, police and local governments have investigated all four incidents, officials told Reuters.
At least three of the cars were on Autopilot, police said. In one case, a doctor was watching a movie on a phone when his vehicle crashed into a state trooper’s car in North Carolina.
Tesla did not immediately respond to a request for comment.
Accidents and investigations haven’t slowed Musk’s drive to promote Tesla cars as being capable of driving themselves.
In a recent tweet, Musk said Tesla was “almost ready with FSD Beta V9.0. Step change improvement is massive, especially for weird corner cases and bad weather. Pure vision, no radar.”
Tesla also says it has used more than 1 million cars on the road to collect image data and improve Autopilot through machine learning and artificial intelligence.
Tesla’s Karpathy said he rode in his Tesla for 20 minutes to grab coffee in Palo Alto without intervention.
“It’s not a perfect system, but it is getting there,” he said on the “Robot Brains” podcast in March. “I definitely keep my hands on the wheel.”
(Reporting by Hyunjoo Jin in Berkeley, California, David Shepardson in Washington and Tina Bellon in Austin, Texas; editing by Joe White and David Clarke)