The NTSB Determines Tesla "Autopilot" Was Engaged Before Fatal Crash in Florida
After a fatal accident in Florida involving a Tesla Model 3 operating in "Autopilot" mode, the National Transportation Safety Board (NTSB) launched an investigation into why the vehicle drove at high speed into a semi-truck without automatically applying the brakes or attempting to steer around it.
It was at least the third fatal crash involving a Tesla vehicle while using the automaker's Autopilot autonomous driving feature.
The NTSB's preliminary report said the driver engaged Autopilot around 10 seconds before crashing into the semi-truck, and that from less than eight seconds before the crash until impact, the system did not detect the driver's hands on the wheel.
The March 1 crash in Delray Beach killed 50-year-old Jeremy Beren Banner when his 2018 Model 3 hit a semi-truck that crossed his path in early morning traffic. The NTSB said it had reviewed forward-facing video from the Tesla in the Delray Beach crash.
In addition, the NTSB determined that the Model 3 was traveling at 68 mph (109 kph) on a highway with a 55-mph (89-kph) speed limit, and that neither the system nor the driver made any evasive maneuvers.
Tesla said that soon after the crash it shared information with investigators about the Autopilot status and said after the driver engaged the system he "immediately removed his hands from the wheel. Autopilot had not been used at any other time during that drive."
Tesla Stands Behind the Safety of its Autopilot
Although Autopilot was actively engaged in the seconds before the crash, Tesla stands behind the safety of its semi-autonomous driving feature. The company added that "Tesla drivers have logged more than one billion miles with Autopilot engaged" and insists that the system is safe.
In a statement the automaker wrote, "Our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance."
Regardless, the crash has brought renewed scrutiny to the safety of Tesla's Autopilot, which Elon Musk has said will eventually eliminate the need for a steering wheel or pedals.
In October 2015, Musk boasted, "In the long term, drivers will not need to keep their hands on the wheel. Eventually there won't be wheels or pedals."
In 2017, the NTSB said that Tesla lacked proper safeguards, allowing the driver "to use the system outside of the environment for which it was designed," and that "the system gave far too much leeway to the driver to divert his attention."
Since its launch, Autopilot has undergone a series of software updates. In one of them, Tesla incorporated real-time data from the vehicle's steering angle sensor, which measures minute variances in steering angle and torque resistance to determine whether a driver's hands are on the steering wheel.

If the sensor detects no driver input, a series of dashboard warnings alerts the driver. If the driver still does not respond, Autopilot automatically disengages.
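The escalation described above can be sketched as a simple state machine: sustained absence of steering torque triggers progressively stronger warnings, and continued inaction disengages the system. The following is a hypothetical, simplified illustration of that pattern; the thresholds and stage names are invented for clarity and are not Tesla's actual values or implementation.

```python
# Hypothetical sketch of hands-off-wheel warning escalation.
# Thresholds (in seconds) are invented for illustration only.

def hands_off_response(seconds_without_input: float) -> str:
    """Map continuous hands-off time to an escalating system response."""
    if seconds_without_input < 10:
        return "none"            # brief hands-off periods tolerated
    elif seconds_without_input < 25:
        return "visual_alert"    # dashboard warning appears
    elif seconds_without_input < 40:
        return "audible_alert"   # warning escalates to a chime
    else:
        return "disengage"       # system hands control back to the driver

# Simulated drive with no detected steering input:
for t in (5, 15, 30, 45):
    print(f"{t}s without input -> {hands_off_response(t)}")
```

Running the loop walks through the full escalation in order, from no response to disengagement, mirroring the behavior the article describes.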
In another fatal crash, in Mountain View, Calif., in March 2018 involving a Model X in Autopilot mode, Tesla said the vehicle logs showed the driver had received warnings to put his hands on the wheel but took no action. That incident is being investigated by both the NTSB and the National Highway Traffic Safety Administration.
The National Highway Traffic Safety Administration (NHTSA) is also investigating a fatal incident in Davie, Florida, on Feb. 24, 2019, involving a Tesla Model S that caught fire and burned.
NHTSA has the Authority to Demand a Tesla Recall
NHTSA has the authority to demand a recall if it believes a defect poses an unreasonable safety risk, while the NTSB is the agency which makes safety recommendations.
NHTSA is also investigating a Jan. 23, 2018 crash of a Tesla vehicle, apparently operating in Autopilot mode, that struck a fire truck in Culver City, California. In that incident, the Tesla plowed into the back of the fire truck, which was stopped in the lane ahead. The driver was uninjured.
Another accident occurred in May 2018 in Utah, when a Tesla in Autopilot mode crashed, resulting in only minor injuries to the driver.
The incidents involving Tesla's Autopilot have raised questions about the safety of advanced driver assist systems (ADAS) throughout the automotive industry. Some of these systems include General Motors' Super Cruise autonomous driving feature in Cadillac models and Nissan's Pro Pilot Assist, which can perform automated highway driving tasks in traffic with little or no human intervention.
Tesla CEO Elon Musk continues to insist that Autopilot is safer than traditional cars. However, auto safety experts warn that the semi-autonomous features give drivers a false sense of security, allowing them to become distracted.