Tesla Asks Owners To Share Fewer Full Self-Driving Beta Error Clips




Tesla is requiring owners who opt into the controversial beta of its “Full Self-Driving” software to sign nondisclosure agreements, and is also discouraging them from sharing video clips that show the driver assistance system making mistakes.

According to a copy of the NDA obtained by Vice, Tesla tells those signing the document that “there are a lot of people that want Tesla to fail; Don’t let them mischaracterize your feedback and media posts.” The company also says beta owners should “share on social media responsibly and selectively” and “consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared.”

Vice’s report comes as Tesla is working to expand access to the “Full Self-Driving” software, and while the National Highway Traffic Safety Administration is investigating the company’s less advanced Autopilot driver assistance system, which is currently available on its cars.

Tesla has allowed a small group of die-hard owners to test the beta of the Full Self-Driving software for about a year now. Some of them take their role as “beta testers” very seriously and try to find flaws in the system in order to help Tesla improve the software. Many also film themselves driving with the software running. Some compress their longer drives into supercuts, speeding up the footage to emphasize just how far the software can take them without human intervention. Others post the raw footage, warts and all.

(As always, to be clear: This software does not make Tesla’s cars fully autonomous. Tesla CEO Elon Musk has even said he believes the “feature complete” version of the software his company calls “Full Self-Driving” will, at best, only be “likely” to drive someone from their home to work without human intervention, and will still require supervision. That does not describe a fully autonomous car.)

This whole approach – the years of unfulfilled claims about being able to make fully autonomous cars, the idea of beta testing developing driver assistance software on public roads with untrained owners behind the wheel – has drawn Musk and Tesla a lot of scrutiny. Recently, though, a clip from a video originally shot by Tesla owner and investor Galileo Russell went viral and charged the conversation even more.

In it, Russell’s car is supposed to merge left, but it suddenly veers to the right, ultimately pointing straight at pedestrians in a crosswalk. A hedge fund owner shared the clip on Twitter, where many people were (rightly) appalled at how close the car came to hitting the pedestrians.

In a follow-up video, Russell casually mentioned that Tesla “doesn’t want” people in the beta to share clips that look bad, while also explaining why he posted the video in the first place. But it was only when Vice reported on the NDA this week that it became clear what he meant.

Tesla is using this language to try to control public perception of its “Full Self-Driving” software just as the company begins opening up access to a much larger group – even though the software is still in development. Tesla added a button to its cars’ UI last weekend that allows owners to request access to the beta. It also rolled out a “safety score” system, which monitors drivers who apply and rates them on a number of metrics, such as hard braking or aggressive acceleration.

For now, Musk says drivers with a perfect safety score of 100 will be admitted into the beta, though he has tweeted that Tesla will lower that bar. He also said Tesla will soon start adding up to 1,000 new owners per day to the beta, a dramatic expansion of who will be able to test the driver assistance software on public roads.

Expanding access will certainly draw even more attention to Tesla and Musk’s freewheeling approach to deploying the “Full Self-Driving” software. In fact, it already has. Last week, National Transportation Safety Board chair Jennifer Homendy told the Wall Street Journal that she wanted the company to address “basic safety issues” before expanding the program to new owners. Homendy was one of the more outspoken board members at a 2020 hearing that found Autopilot was partly responsible for the 2018 death of a driver in Mountain View, California.

Tesla doesn’t appear to be changing course, however. Over the weekend, in response to a blog post about her comments, Musk tweeted a link to Homendy’s Wikipedia page – which ultimately had to be locked after a sudden wave of edits.



