A federal agency is investigating a ‘violent’ Tesla crash that left one person in critical condition, and police don’t yet know whether the driver was using Autopilot software


  • The NHTSA is investigating a crash between a Tesla sedan and a tractor trailer in Detroit.
  • The passenger of the Tesla was hospitalized in critical condition, Detroit police said.
  • It is unknown whether the car was using Tesla’s Autopilot or full self-driving software.

A federal agency is investigating a crash between a Tesla sedan and a tractor trailer in Michigan, which left at least one person in critical condition.

Detroit police told Reuters Monday that the crash happened at 3:20 a.m. on Thursday in southwest Detroit.

A Tesla sedan drove through an intersection, struck a tractor trailer, and became wedged underneath it, the police said.

Both the driver and the passenger were taken to a local hospital, where the passenger was listed in critical condition, the police said.



The National Highway Traffic Safety Administration (NHTSA) told Reuters that it had sent a crash investigation team to look into the “violent” crash.

Detroit Police Sgt. Nicole Kirkwood told the Associated Press that she could not say whether the driver was using Tesla’s Autopilot or “full self-driving” (FSD) software.

Tesla did not immediately respond to Insider’s request for comment.

Tesla’s full self-driving doesn’t make a car fully autonomous

Tesla’s electric vehicles come with Autopilot, which allows the cars to brake, accelerate, and steer automatically.

Tesla also sells its FSD software as a $10,000 one-time add-on and plans to offer it as a subscription this summer. FSD allows cars to park themselves, change lanes, and identify both stop signs and traffic lights.

In October, the company released a beta version of FSD to some Tesla owners and employees, letting them try updates before they’re fully rolled out; around 2,000 users now have access. Tesla plans to release the system widely in 2021.

Neither Autopilot nor FSD makes a Tesla car fully autonomous.

At least three drivers have died while using Tesla’s Autopilot.

Joshua Brown died in May 2016 when his Model S was struck by a semitruck in Florida while traveling at 74 mph using Autopilot. Tesla said at the time that “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

Apple engineer Walter Huang died in March 2018 when his Tesla Model X crashed into a highway barrier in California while on Autopilot. His family filed a lawsuit against the company, alleging the car was “defective in its design.”



In March 2019, Jeremy Beren Banner died when his Tesla Model 3 collided with a tractor trailer at 68 mph while using its Autopilot mode. Banner’s family sued the carmaker, alleging wrongful death.

The National Transportation Safety Board (NTSB) has called for increased scrutiny of self-driving software.

In February, the board sent a letter to its sister agency, the NHTSA, asking for updated requirements for carmakers testing software like Tesla’s on public roads.

The letter mentioned Tesla by name 16 times, and said that Tesla was testing its software on public roads “with limited oversight or reporting requirements.”

Tesla CEO Elon Musk tweeted on Friday that the company had revoked FSD beta access for drivers who didn’t pay close attention to the road, but added that there were “no accidents to date.”
