Tesla Autopilot Safety Probe Highlights a Painful Truth

Tesla’s Autopilot is subject to a formal safety probe from the National Highway Traffic Safety Administration (NHTSA), prompted by 11 accidents in which Teslas with Autopilot engaged struck emergency vehicles parked on the side of the road. The painful truth emerging from the NHTSA’s action: cars are dangerous. These 11 accidents are tragic, delivering lifelong impacts to the families involved. Beyond those events, all serious auto accidents are tragic, and they will continue because humans are lousy drivers. More oversight guidelines for autonomous vehicle (AV) and advanced cruise control testing are needed to achieve the long-term goal of using machines to make our roads safer. As a reminder, roughly 35,000-40,000 lives are lost annually in the US to vehicle accidents.

Two takeaways

First, it’s not just Tesla that is impacted by the investigation. Greater scrutiny of autonomy efforts across tech companies will likely delay mainstream availability. We’ve been hoping to see public full self-driving in 2025, and that now looks optimistic. In the end, autonomy will likely take longer than most expect to come to fruition, and it will almost certainly be more transformative than anyone can imagine.

Second, Tesla will likely have to soften its marketing language around FSD, which would have a near-term negative impact on both FSD’s upfront and subscription uptake. Long term, it won’t matter because eventually FSD will be what its name implies, and will be a required feature for all carmakers.

Safety boards exist for a reason, and we understand the catalysts for the Tesla investigation. These accidents are horrible and need to be avoided at all costs. That said, we struggle with lawmakers’ skepticism toward advancing these new technologies. We look for them to formulate a coherent vision for autonomy, including laws that require proper testing and build an environment in which tech companies can advance the technology more safely.

