On Tuesday, the National Transportation Safety Board (NTSB) offered an update about a 2018 Tesla crash that occurred in Culver City, California.
In the non-fatal accident, the owner of a 2014 Model S collided with a parked firetruck on Interstate 405. Subsequently, the unidentified driver told investigators that he had the vehicle’s Autopilot system engaged. The NTSB confirmed that Tesla’s driver assistance program was indeed active during the crash. Moreover, the agency found that the system remained active for an extended period even though the operator kept his hands off the wheel.
The NTSB’s Findings
The NTSB’s report indicates that the Model S driver had the Autopilot system engaged for 13 minutes and 48 seconds during the fateful highway drive. During that time, the operator had his hands on the steering wheel for only 51 seconds. Notably, the owner kept his hands off the wheel for the entire 3 minutes and 41 seconds preceding the crash.
Thankfully, the fire truck was unoccupied when the Model S hit it.
The agency also discovered that Autopilot’s alert system warned the driver to put his hands on the wheel on four separate occasions. After each alert, the operator briefly complied but returned to driving hands-free once the warning ceased. The car’s owner told the NTSB that he was eating a bagel and drinking coffee right before the crash occurred.
Throughout its history, Tesla has maintained that Autopilot is an advanced driver assistance program, not a fully autonomous driving system. The firm also sells an upgraded version of the feature called “Full Self-Driving.” Moreover, Tesla has promised to unveil a fleet of fully autonomous robo-taxis sometime next year.
According to the NTSB’s report, the carmaker’s self-driving rollout plans may be overambitious. Indeed, the federal government is unlikely to approve driverless operation if the system can’t detect and avoid an object the size of a fire truck.
The NTSB’s new findings will also likely have an impact on the two Autopilot-related lawsuits that Tesla is currently battling.
Wrongful Death Lawsuits
In March, the surviving family of Walter Huang filed a wrongful death lawsuit against Tesla. Last year, Huang died after his Model X collided with a concrete barrier on a California highway. At the time of the fatal collision, the vehicle’s Autopilot system was engaged.
The NTSB found that the car actually raised its speed from 62 to 70 miles per hour in the seconds before hitting the barrier. Huang’s widow filed the suit against the carmaker alleging that Autopilot’s defective design caused her husband’s death. In response, Tesla claimed that the driver’s death occurred because he didn’t heed warnings to take control of the SUV.
Similarly, the family of Jeremy Beren Banner sued Tesla in August for its part in his death. In March, Banner died after his Model 3 drove underneath a tractor-trailer at 68 miles per hour, shearing off the sedan’s top. The NTSB found that the driver engaged Autopilot and took his hands off the wheel seconds before the collision.
As with the Huang case, the Banners’ lawsuit asserts that Tesla’s Autopilot is inherently defective. Currently, the NTSB is investigating both fatal crashes.
Since the NTSB has maintained that operator error was a factor in the aforementioned Tesla crashes, the company will likely prevail in the lawsuits it faces. However, if the plaintiffs win their cases, it will likely be because Tesla overhypes the capabilities of its driver assistance program. Indeed, it’s hard to believe that Autopilot isn’t an autonomous driving system when Tesla’s own video advertising shows it performing that exact function.