Self-driving technology is the new space race, with car manufacturers jostling to be the first to achieve fully autonomous vehicles. But the road there has been bumpy, and the track record is far from flawless.
Autopilot made Tesla one of the first automakers to offer an advanced driver assistance system. The feature has been available in Tesla vehicles since 2015 and expanded to all models by 2019.
The system assists drivers behind the wheel: it can detect nearby cars and obstacles, apply the brakes, monitor blind spots and automatically reduce speed. But while Autopilot is helpful, it has been a contributing factor in multiple fatal accidents over the last decade.
The first widely reported fatal incident occurred in Williston, Florida, in 2016. Autopilot warned the driver to keep his hands on the wheel, but he ignored the alerts. The vehicle crashed into a truck, killing the driver.
Reports revealed Autopilot was active for most of the trip, yet the driver held the steering wheel for only 25 seconds. A few months later, Tesla updated the software to require drivers to respond to audible warnings.
While self-driving technology was new and innovative in the mid-2010s, the Florida incident was a sobering warning. Human error bears some of the blame for Tesla accidents, but subsequent crashes have put Autopilot's own shortcomings in the spotlight.
In Mountain View, California, a Tesla Model X struck a crash attenuator and collided with two other vehicles. After the wreck, the car's high-voltage battery caught fire and started a blaze.
Investigators determined that system limitations caused Autopilot to steer the Model X into a gore point, and that the driver had relied too heavily on the partially automated systems in the moments before the crash.
Autopilot failed to effectively monitor the driver's engagement, which contributed to the accident. California also shouldered some of the blame after its highway patrol failed to report the nonoperational attenuator barrier.
By 2021, Autopilot was entering its sixth year of operation. Despite advances in the software, fatal incidents continued. In Spring, Texas, a 2019 Model S went off the road and crashed into trees, killing both occupants.
Initially, officials were uncertain whether Autopilot was active before the crash. An NTSB report said the feature was unavailable on that stretch of road because it requires lane lines to function. Investigators said the driver could have used Tesla's Traffic-Aware Cruise Control, though that feature would only have operated up to the road's speed limit.
This crash underscored the need for better driver monitoring software. After analyzing the event data recorder, investigators determined the driver was in the front seat when the Model S crashed and moved to the rear afterward.
Tesla has improved Autopilot over the years, leading to more advanced offerings like Full Self-Driving (FSD). The feature performs basic driving maneuvers for the operator, including steering and route navigation.
However, the advanced software has caused more problems for Tesla. In 2024, a Tesla Model S struck and killed a motorcyclist in Seattle. Local police said the driver was using his cellphone while FSD was enabled in his vehicle.
FSD benefits drivers by performing automatic lane changes and helping with parking. Despite those capabilities, Tesla says the software requires the driver's active engagement at all times. These vehicles may have autonomous features, but they are not fully self-driving cars.
Autopilot has come a long way since its introduction in 2015. FSD, at face value, suggests the future is bright for autonomous technologies. However, these features have a long way to go before the public can trust them fully.
Improving these technologies is essential for public safety and Tesla’s bottom line. Recent court cases have found the manufacturer liable, awarding plaintiffs millions of dollars in damages.
Current Tesla systems require human attention, although some drivers have felt comfortable enough to take their hands off the wheel. It’s up to the manufacturer to communicate limitations and prevent misuse.
While drivers are responsible for their actions, Autopilot and FSD can do more to save operators from themselves.
