Tesla Vehicles Operating in Autopilot Mode Involved in 273 Crashes in Under a Year, the NHTSA Reports
Tesla Chief Executive Elon Musk has long touted the company's Autopilot, an SAE level-2 automated driving system, as a safer alternative to having a human take the wheel. But despite Musk's claims about how well it performs, Tesla vehicles operating on Autopilot have been involved in dozens of high-profile crashes, some resulting in fatalities.
Now the latest data from the National Highway Traffic Safety Administration (NHTSA) show that Tesla vehicles operating in Autopilot have been involved in 273 crashes since last July. Of these crashes, 125 occurred in California. The number is much higher than previously reported and shows how far the nascent technology needs to advance to become safer than a human driver.
"The data released today are part of our commitment to transparency, accountability and public safety," said Dr. Steven Cliff, NHTSA's Administrator. "New vehicle technologies have the potential to help prevent crashes, reduce crash severity and save lives, and the Department is interested in fostering technologies that are proven to do so; collecting this data is an important step in that effort."
The crash data published by the NHTSA on Wednesday shows that Tesla vehicles operating on Autopilot accounted for nearly 70% of the 392 crashes involving advanced driver-assistance systems (ADAS) reported since last July, as well as the majority of fatalities and injuries resulting from them.
For Japan's Honda Motor Co., the NHTSA reported 90 incidents. All other automakers, including BMW, Toyota, GM, and Ford, reported ADAS incidents in the single digits, according to the NHTSA data.
Of the 392 crashes, 116 of them were collisions with another vehicle.
According to the NHTSA, crashes are reportable if the Level 2 ADAS was in use at any time within 30 seconds of the crash and the crash involved a vulnerable road user or resulted in a fatality, a vehicle tow-away, an airbag deployment, or any person transported to a hospital for medical treatment.
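The reporting criteria above amount to a simple conjunction: the Level 2 system must have been in use within 30 seconds of the crash, and at least one severity condition must hold. A minimal sketch of that logic in Python (the field names and helper function are illustrative, not an official NHTSA schema):

```python
# Illustrative sketch of the Standing General Order reportability test for
# Level 2 ADAS crashes, as described in the article. Field names are
# assumptions for readability, not an official NHTSA data format.

def is_reportable(adas_active_within_30s: bool,
                  vulnerable_road_user: bool,
                  fatality: bool,
                  tow_away: bool,
                  airbag_deployed: bool,
                  hospital_transport: bool) -> bool:
    """A crash is reportable if the ADAS was in use at any time within
    30 seconds of the crash AND at least one severity criterion is met."""
    severity = (vulnerable_road_user or fatality or tow_away
                or airbag_deployed or hospital_transport)
    return adas_active_within_30s and severity

# Example: Autopilot engaged shortly before a tow-away crash -> reportable
print(is_reportable(True, False, False, True, False, False))  # True
```

Note that a severe crash in which the system was last active more than 30 seconds beforehand would not be reportable under this rule, which is one reason the dataset is a floor rather than a full count of ADAS-related crashes.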
Prior to the report released on Wednesday, the NHTSA collected more limited data on accidents potentially caused by a vehicle's ADAS. Since 2016, the NHTSA said it had probed just 42 crashes where drivers were using ADAS features. Of these crashes, 35 involved Tesla vehicles.
The NHTSA stepped up its oversight of automated driving systems like Tesla's Autopilot in June 2021, when it issued a Standing General Order requiring identified vehicle manufacturers to report to the agency certain crashes involving vehicles equipped with SAE Level 2 ADAS features. The General Order was issued to evaluate the safety of level-2 automated driving systems as they become available as an option on many new vehicle models.
As of June 2021, NHTSA teams had reviewed 30 Tesla crashes since 2016, resulting in 10 deaths, in which Autopilot was suspected of being in use.
It's important to note that although SAE level 2 systems provide some steering and braking input, they still require the driver's attention at all times.
The General Order issued on June 29, 2021, also covers vehicles equipped with SAE Level 3-5 Automated Driving Systems (ADS), whose crash data is collected separately. However, there are currently no level 3 autonomous driving systems on the market in the U.S.
The NHTSA received the first ADAS incident report on July 20, 2021. Prior to the implementation of the General Order, the agency said, timely crash notifications were limited and largely inconsistent across manufacturers. Some incidents were first reported by the media before the NHTSA learned of them.
"These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations," NHTSA Administrator Steven Cliff said during a media call on Wednesday.
Tesla's Autopilot has also been blamed for a number of collisions with emergency vehicles. In August of last year, the NHTSA identified the 12th crash involving a police or fire vehicle, this one in Florida, adding to an ongoing safety probe.
On Aug. 28, a Florida state trooper in Orange County stopped to offer assistance to a disabled motorist. While the officer was helping the driver, a Tesla Model 3 operating with Autopilot engaged struck the patrol car. The incident once again put the safety of Tesla's Autopilot in the spotlight.
In another case, a driver was watching a movie on a phone when his Tesla struck a state police vehicle in North Carolina.
Less than two weeks before the Aug. 28 incident, the NHTSA had opened a formal safety probe into Tesla's Autopilot after 11 prior crashes involving police or fire vehicles.
Tesla pioneered automated driving technology in the auto industry when its Autopilot feature was introduced in September 2014 on the Model S.
But Tesla has also been criticized for not using lidar, a laser-based perception technology, for object detection in Autopilot, as other automakers do in their automated highway driving systems. Most developers of autonomous vehicles use a combination of cameras, radar, and lidar to avoid obstacles and navigate safely.
Musk once referred to the use of lidar as a "fool's errand" despite its widespread use by automakers and dozens of autonomous driving startups. He even called lidar technology "lame."
Instead of lidar, Tesla's Autopilot takes a camera-based approach, relying on computer vision and machine learning algorithms to identify other vehicles and lane markings for safe navigation.
Autopilot is further supported by eight external cameras, radar, and 12 ultrasonic sensors for an additional layer of safety, but only on newer Tesla vehicles. Models built between September 2014 and October 2016 include a single camera and less powerful radar and ultrasonic sensors.
Tesla continues to roll out its more advanced Full Self-Driving (FSD) software to beta testers. FSD expands Autopilot's capabilities from highways to secondary roads and residential streets. Tesla collects data from these vehicles to improve FSD before its official release. FSD is an optional feature that costs an extra $12,000.
Despite its name, Tesla warns customers on its website that FSD requires active driver supervision and does not make the vehicle autonomous.
In October 2020, when the FSD beta was first offered to select Tesla owners, the NHTSA said it was closely watching Tesla's software and stood ready to protect the public against safety risks.
Now that the high number of incidents involving Tesla's Autopilot has come to light, Tesla faces enhanced scrutiny from the NHTSA going forward.