Tesla Removes Full Self-Driving Beta Over ‘Issues’

Tesla Inc. pulled its Full Self-Driving (FSD) beta over the weekend, with CEO Elon Musk stating that testers had been “seeing some issues with [version] 10.3.”

To remedy the issue, the company temporarily reverted to FSD 10.2. Musk made the announcement on social media on Sunday morning, and by the following day he had promised that version 10.3.1 would be released to address problems encountered during the exceptionally short public testing phase.

“Please note, this is to be expected with beta software,” the CEO noted. “It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.”

Let’s get a few things out of the way before we dive into what actually happened. Despite Tesla promising vehicular autonomy via its expensive Full Self-Driving suite for years, FSD may not actually be capable of living up to its name. Elon Musk has even stated that it would probably always require some form of supervision, even when feature-complete. At best, that leaves the completed suite achieving conditional automation (SAE Level 3) while falling short of its promise of total self-driving functionality (SAE Level 5).

Had we teleported directly from 1991, even the worst versions of FSD would be a technological marvel. But we’re living in the years following a decade in which the automotive industry promised that self-driving cars would be commonplace by 2020. It’s also becoming clear that the trade-offs for implementing unfinished versions of these systems may not be worth it. Manufacturers are advancing driver-monitoring protocols, including cabin-facing cameras that track eye and facial movements, which seems like the antithesis of a luxurious automotive experience.

There are also mounting legal questions regarding who is liable when an autonomous vehicle is involved in a crash. Despite reports highlighting the shortcomings of advanced driving aids, the industry would very much like to keep drivers responsible in order to avoid legal action against the business. This has also encouraged an influx of monitoring measures while making it imperative that self-driving systems function near perfectly, the latter of which has proven exceedingly difficult.

Based on the limited time the FSD beta was active, it’s difficult to get a real sense of what went wrong. However, numerous videos of drivers testing the system showcase some recurring issues. Beta vehicles were obviously still having trouble dealing with construction zones and poorly marked lanes. Users also noted that cars became more timid while using similar features, asking the driver to retake control under conditions where they previously would not. A few social media posts also alleged that cars were deactivating certain safety settings without human input.

Reuters reported that drivers were experiencing Forward Collision Warnings when there was no immediate danger, sometimes with their Tesla braking automatically to avoid phantom obstacles. Ironically, that was one of the systems other users claimed their vehicles were mysteriously shutting off.

This may explain why Tesla required beta testers to maintain a qualifying safety score. But we’re not about to claim this is the best way for an automaker to run its software development program. Withholding access to a system that you paid thousands extra for (and that doesn’t even work) because you failed to drive in a ridiculously conservative manner somehow doesn’t seem fair.

[Image: Tesla]

