Tesla Model S Autopilot Crash—at 80 MPH—Is Subject of New Lawsuit




Tesla’s driver assistance tools are again being challenged in court.

Floridian Shawn Hudson is suing the electric car maker for negligence and breach of duty, following a collision with a disabled vehicle on Florida’s Turnpike that destroyed the front end of his Tesla Model S.

Hudson claims the collision—which happened while he was doing around 80mph on Autopilot—left him with “severe permanent injuries” and is seeking unspecified monetary damages. The lawsuit also claims that Tesla is misleading consumers into believing its Autopilot system can safely transport passengers at highway speeds.

“If you engage it (Autopilot) at over 50 mph it’s got trouble finding stationary objects and parked cars,” Hudson’s lawyer, Mike Morgan, told the Orlando Sentinel. “To me, that’s a big problem. To me, that means you’re selling nothing. You can’t use it in the city and you can’t use it on a highway that has a speed limit over 50, yet we’re gonna charge you $120,000 for a self-driving car.”

Hudson, who admitted to a local paper that he had been glancing at his cell phone periodically before the collision, told the Sentinel, “I was looking up, looking down, looking up, looking down, and I’m looking up and a car’s disabled in the passing lane on the Turnpike.”

In May of this year, Tesla settled a class action lawsuit from drivers who had bought cars with Autopilot 2.0, a feature that cost an extra $5,000 per vehicle, and which the drivers said was dangerous and unusable. In the settlement, Tesla put $5 million in a fund for legal fees and to compensate buyers of the enhanced Autopilot package from 2016 and 2017 with payments of $20 to $280.

While Autopilot’s enhanced features are incremental steps toward Tesla’s goal of a fully self-driving car, it’s worth repeating that the system is not self-driving yet.

Ars Technica points out that while Tesla’s Autopilot system can handle a range of driving conditions, it’s not designed to stop for parked cars or other stationary objects when traveling at highway speeds.

Elon Musk’s company also released a major Autopilot software update last week. Tesla’s new ‘Navigate on Autopilot’ feature “guides a car from a highway’s on-ramp to off-ramp, including suggesting and making lane changes, navigating highway interchanges, and taking exits.” The press release, however, includes the following caveat: “Until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain in control of their car at all times.”

The root of the issue may lie with the way the car is being marketed. Hudson’s lawsuit claims that a Tesla sales representative reassured him he only had to “occasionally place his hand on the steering wheel and that the vehicle would ‘do everything else.’”

In September, a driver from Utah lodged a similar complaint after her Tesla hit a stationary fire truck at a red light while on Autopilot. Heather Lommatzsch claimed Tesla salespeople told her she only had to occasionally touch the steering wheel of the Model S while using the Autopilot mode.

In response to the Florida lawsuit, a Tesla spokesperson told CNET that the company was unable to review the vehicle’s data from the accident because “the car was incapable of transmitting log data to our servers.” The spokesperson added, “However, we have no reason to believe that Autopilot malfunctioned or operated other than as designed.”

Tesla also stressed that driver vigilance remains paramount. “When using Autopilot, it is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not.”
