Tesla expands Full Self-Driving beta, but top safety official says “basic safety issues” need to be addressed first


Photo: Justin Sullivan (Getty Images)

Tesla is preparing to launch a major update to its “Full Self-Driving” mode that would extend beta testing of the feature to more customers and territories. Before that, however, the automaker has to deal with some “fundamental safety problems,” Jennifer Homendy, head of the US National Transportation Safety Board, said in a recent interview with The Wall Street Journal.

Full Self-Driving is a more advanced version of Autopilot, Tesla’s driver-assistance system designed for highway navigation. Despite its name, no version of Tesla’s driver-assistance software is fully autonomous, and Tesla warns that a human driver must remain at the wheel, vigilant and ready to take over at all times.

Calling it “misleading and irresponsible” for Tesla to promote its software as “full self-driving,” Homendy added that the company has “clearly misled numerous people into misusing and abusing the technology.”

A beta version of Full Self-Driving mode launched for a select few Tesla drivers in October 2020. After announcing plans for a wider release by the end of September, Tesla CEO Elon Musk said on Friday that drivers wanting to try the latest version of Full Self-Driving mode will have access to a “beta request” button by October 1.

“The beta button will request permission to assess driving behavior using the Tesla insurance calculator,” he continued on Twitter. “If driving behavior is good for 7 days, beta access will be granted.”

The update is also expected to add new tools to help drivers navigate city streets and highways. But Homendy thinks this step is dangerously premature:

“Basic safety issues have to be addressed before they’re then expanding it to other city streets and other areas,” she told the Journal.

The NTSB, which can conduct investigations and issue recommendations but has no regulatory authority, has previously investigated three fatal Tesla crashes involving the company’s Autopilot system. It opened a fourth investigation on Friday after two people were killed in a crash involving a Tesla Model 3 in Coral Gables, Florida. In February 2020, the board determined that Tesla’s Autopilot software was one of the probable causes of a fatal 2018 crash in Mountain View, California, in which the driver was playing a game on his cell phone at the time of the incident.

In 2017, the NTSB advised Tesla and five other automakers to improve the safety of their semi-autonomous vehicles and make it harder for drivers to misuse them. The other five companies responded and agreed to put stricter safeguards in place. Tesla alone ignored the NTSB’s recommendations, though it has updated some of its safety features in recent years.

Tesla did not immediately respond to Gizmodo’s request for comment. The company has largely stopped responding to media inquiries since dissolving its PR department in October 2020.

