Investigators who looked into a 2019 crash that killed the driver of a Tesla Model 3 that slammed broadside into a semi trailer on a Florida highway determined that the crash was caused by the truck driver’s failure to yield the right of way to the car — combined with the Model 3 driver’s inattentiveness while relying on Autopilot, the partially automated driver-assist system.
National Transportation Safety Board investigators also chastised Tesla for failing to limit the use of Autopilot to the conditions for which it was designed, and they cited the National Highway Traffic Safety Administration for failing to develop a way to verify automakers’ system safeguards for partially automated driving technologies.
The NTSB also released images pulled from the Model 3 that show the semi truck obscuring the roadway in the final seconds before the car struck the trailer and passed underneath it, shearing off its roof. This, apparently, is what the inattentive driver never saw and Autopilot never reacted to.
“The Delray Beach investigation marks the third fatal vehicle crash we have investigated where a driver’s over-reliance on Tesla’s Autopilot and the operational design of Tesla’s Autopilot have led to tragic consequences,” NTSB Chairman Robert Sumwalt said.
Autoblog sought comment from Tesla.
The fatal crash occurred just before sunrise on March 1, 2019, when Jeremy Banner, 50, was driving his Model 3 to work on U.S. Highway 441 in Delray Beach, Florida. The semi truck, traveling east, had pulled out into the southbound lanes of the highway when Banner’s car slammed into the trailer, shearing off the roof of the Model 3, which coasted to a stop nearly a third of a mile farther on. Banner died at the scene.
Investigators say Banner was driving 69 miles per hour at the time, did not apply the brakes or take other evasive action, and was using Autopilot, which he had switched on just under 10 seconds before impact.
The system detected no steering wheel torque for the final 7.7 seconds before the crash, and neither the forward collision warning nor the automatic emergency braking system activated. Investigators said the highway where the crash occurred was not compatible with Autopilot because it had 34 intersecting roadways and private driveways within the surrounding 5 miles. Tesla says Autopilot is meant to be used on limited-access highways with no intersecting roadways.
Tesla told NTSB investigators that forward collision warning and automatic emergency braking on the Model 3 were not designed to activate for crossing traffic or to prevent crashes at high speeds, which is why Autopilot did not consistently detect and track the truck as it pulled out into oncoming southbound traffic. The company also said the system did not warn the driver to put his hands back on the steering wheel because the roughly 8 seconds between activation and impact was too short to trigger such a warning.
Tesla advertises Autopilot as a tool that “enables your car to steer, accelerate and brake automatically within its lane,” but it adds that the system features “require driver supervision and do not make the vehicle autonomous.” It also says drivers must stay alert and “keep your hands on the steering wheel at all times and maintain control of your car,” with a visual reminder whenever the system is engaged.
The truck driver, who was uninjured, told investigators he was on anti-seizure medication and had undergone refractive surgery on both eyes in 2012. He reportedly said he was able to “read with my right eye” and “see my distance in my left eye,” a condition commonly referred to as monovision or blended vision.
Banner’s family has filed a wrongful death lawsuit against Tesla, trucking company FirstFleet and the truck driver.