A Tesla Model 3 struck a Florida Highway Patrol (FHP) trooper’s vehicle on Saturday, just weeks after the NHTSA opened an investigation into the vehicles’ semi-autonomous driving system. While the driver and FHP confirmed the vehicle was operating on Autopilot, it is unclear whether the driver-assistance feature is to blame for the accident.
According to the Orlando Sentinel, a 27-year-old man was driving his Model 3 westbound on Interstate 4 near Orlando at around 5 a.m. Eastern time on Saturday when the vehicle struck a Highway Patrol cruiser that had stopped to assist a disabled automobile on the shoulder.
The driver stated that the vehicle was operating on Autopilot, according to FHP and ABC affiliate WFTV9. The driver of the Tesla and the owner of the disabled vehicle sustained minor injuries; the trooper on the scene was unhurt.
Happening now: Orange County. Trooper stopped to help a disabled motorist on I-4. When Tesla driving on “auto” mode struck the patrol car. Trooper was outside of car and extremely lucky to have not been struck. #moveover. WB lanes of I-4 remain block as scene is being cleared. pic.twitter.com/w9N7cE4bAR
— FHP Orlando (@FHPOrlando) August 28, 2021
Interestingly, the accident occurred just weeks after the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Tesla Autopilot. The agency told Teslarati that it would investigate eleven separate accidents that occurred while Autopilot was engaged. However, several of the accidents in the investigation were ultimately not the fault of the system itself but were instead the result of gross negligence by the driver. Two of the eleven incidents being examined involved an intoxicated driver. Another involved a driver with a suspended license, and four were the result of incorrect Autopilot use.
Because it is unfamiliar to many people, Autopilot has a bad reputation and is often misrepresented and misunderstood by media and critics. Tesla Autopilot is not a fully autonomous driving system; it comes standard on every Tesla built in 2017 or later. Tesla has never indicated that Autopilot is a replacement for human drivers and has said on numerous occasions that the system must be used with the driver fully attentive and focused on the road. To operate Autopilot, the driver’s hands must remain on the wheel at all times in case an intervention is needed, and the wheel has sensors that confirm the driver is still maintaining ultimate control of the vehicle.
While the driver told FHP that the vehicle was operating on Autopilot, it remains the driver’s responsibility to maintain control of the vehicle. The Autopilot and Full Self-Driving suites are frequently abused by some drivers, and when these irresponsible acts result in an accident or injury, Tesla takes the blame rather than the driver. Unfortunately, this gives an inaccurate picture of how safe Autopilot actually is.
Tesla reports the safety of its vehicles every quarter, with the most recent statistics showing that vehicles operating on Autopilot are involved in accidents significantly less frequently than the overall U.S. average. In Q1 2021, the company said:
“…We registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.”
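Put side by side, the quoted figures work out to roughly an eightfold difference between Autopilot-engaged driving and the NHTSA baseline. A quick sketch of that arithmetic (the labels and ratio calculation are mine, derived only from the numbers quoted above, not from either source):

```python
# Miles per reported accident, as quoted from Tesla's Q1 2021 safety
# report and the NHTSA figure cited within it.
rates = {
    "Autopilot engaged": 4_190_000,
    "Active safety features only": 2_050_000,
    "No Autopilot, no active safety": 978_000,
    "NHTSA U.S. average": 484_000,
}

# Compare each rate against the NHTSA baseline.
baseline = rates["NHTSA U.S. average"]
for label, miles in rates.items():
    ratio = miles / baseline
    print(f"{label}: {miles:,} miles per accident ({ratio:.1f}x the U.S. average)")
```

By this simple division, Autopilot-engaged driving logged about 8.7 times more miles per reported accident than the national average, with the caveat (not addressed in the quote) that the underlying driving conditions and fleets differ.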