NTSB Criticizes Tesla, Regulators For 2018 Autopilot Crash

The National Transportation Safety Board (NTSB) has critiqued Tesla for a deadly 2018 crash in California that involved its Autopilot system. The NTSB also directed criticism at other regulators, arguing they lacked the oversight to prevent “foreseeable abuse” of Tesla’s Autopilot feature. NTSB Chairman Robert Sumwalt stated, “[g]overnment regulators have provided scant oversight” of self-driving car systems. This was in reference to the National Highway Traffic Safety Administration (NHTSA), which has the authority to enforce vehicle safety standards and order recalls.

Tesla’s Autopilot is “an automated driver-assist feature sold as part of what Tesla calls a ‘Full Self Driving Capability’ package for $7,000 that can speed up, brake and change lanes automatically, although the driver is supposed to pay attention.”

The fatal crash killed the driver, Walter Huang. Huang was using the Autopilot feature while playing a game on his phone when the car crashed into a safety barrier. The vehicle “sped up from 62 mph to 71 mph and plowed into a damaged safety barrier at the end of a concrete wall. The wall divides a left-hand exit ramp that veers away from U.S. Highway 101, known locally as the Bayshore Freeway.” The Tesla then hit two other vehicles and burst into flames. Huang had engaged Autopilot for about 19 minutes prior to the crash, and his hands were off the steering wheel during the final six seconds as the car crashed head-on into a flexible steel “smart cushion” intended to soften the impact of a collision. Further, “the cushion already was severely damaged. After a Toyota Prius crashed into it two months earlier, the length of the attenuator was shortened, offering less protection against the 3-foot-tall concrete median wall behind it. The safety device had gone unrepaired by the California Department of Transportation, known as CalTrans, until three days after Huang’s death.” These factors contributed to the severity of the crash.

Addressing Huang’s family at a hearing in Washington, D.C., Sumwalt stated, “[o]ur goal is to learn form [sic] what happened so others don’t have to go through what you’re going through.”

However, Sumwalt was also critical of drivers who treat Autopilot as a self-driving system. He stated, “[y]ou cannot buy a self-driving car today. You don’t own a self-driving car so don’t pretend you do.” Sumwalt did not want drivers to do anything other than pay attention while driving, even when using driver-assist systems; this includes not “sleep[ing], read[ing], text[ing], [or] eat[ing].” Still, perhaps car companies should not label such a feature “autopilot,” since today it is at best a driver-assist system; more accurate naming could reduce consumer confusion.

A preliminary NTSB report found that, days before the accident, the car’s Autopilot had made a “left steering movement” toward the same area where the crash later occurred; Huang caught it in time and steered back on course. Huang told his family that this had happened several times before at that same location. The NTSB concluded that he was over-reliant on the technology despite knowing its shortcomings.

Huang’s family has filed a lawsuit against Tesla. His family stated that “‘based on Tesla’s advertising and promotional material’ Huang ‘reasonably believed the 2017 Tesla Model X vehicle was safer than a human-operated vehicle because of Defendant’s claimed technical superiority regarding the vehicle’s autopilot system.’” The lawsuit reiterated that the “vehicle should not leave a marked travel lane and accelerate, without the input of the operator, in such a way as to cause damage, harm or injury.” Huang’s family argued that Tesla should have known of the defect that caused the vehicle to leave its lane and strike an object, and thus should have issued a recall.

The NTSB also contrasted Tesla unfavorably with Uber on transparency in relation to their respective crash investigations, noting that Tesla withdrew from its party agreement with the NTSB during the probe. “We chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot — claims which made it seem as though Autopilot creates safety problems when the opposite is true,” Tesla said in a statement.

There were two other Tesla Autopilot crashes: one in 2016, in which the system failed to recognize that the car needed to brake, and another in 2019, in which a car crashed into a tractor-trailer blocking its path.

The NTSB concluded in a 2017 report that such crashes are due in part to both technological and driver challenges: “Many of the crashes appear to involve driver behavior factors, including traveling too fast for conditions, mode confusion, and distraction.” These three Tesla Autopilot crashes are among the 17 crashes the NTSB is currently investigating in an effort to “advance our knowledge of safety issues.” NHTSA is also investigating 13 Autopilot crashes.
