As US investigators escalate their scrutiny of Tesla’s driver-assistance technology, another problem is emerging for the electric carmaker: complaints among customers that they have been sold an additional driver-assistance option that does not operate as advertised.
Over the years, Tesla owners have paid as much as $10,000 for the package, called Full Self-Driving (FSD). FSD, which can be purchased as an extra on Tesla cars, is a collection of services that adds to Tesla’s Autopilot, the driver-assistance technology that government investigators are taking a look at after a string of crashes.
Critics say FSD has not lived up to its name since its debut more than two years ago. It can help a car navigate off one highway and onto another and respond to traffic lights and stop signs. It also includes a service for summoning a car out of a parking space or parking lot with a mobile app. But full self-driving? Not quite.
When Joel Young paid $6,000 for FSD in 2019, he assumed he would receive a system that could drive anywhere on its own by year’s end. Two years later, that remains beyond the system’s abilities. Young, a lawyer, writer and car enthusiast living in Placitas, New Mexico, recently asked Tesla to refund his money, and it declined. On Wednesday, he sued the company, accusing it of fraud and breach of contract, among other complaints.
“Tesla has not delivered what it promised,” he said.
Young’s suit is most likely the second from a customer aimed at the FSD add-on feature. Two brothers in Southern California have filed a suit that raises similar complaints. And as many enthusiasts on social media platforms like Reddit question whether they have paid for something that does not exist, the California Department of Motor Vehicles recently said it was reviewing Tesla’s use of the term Full Self-Driving.
Also Wednesday, Sens. Richard Blumenthal, D-Conn., and Edward Markey, D-Mass., sent the chair of the Federal Trade Commission a letter calling on the agency to investigate the marketing and advertising of Autopilot and FSD.
Tesla privately acknowledges the limitations of the technology. As the public advocacy website PlainSite recently revealed after a public records request, Tesla officials have told California regulators that the company is unlikely to offer technology that can drive in any situation on its own by the end of 2021.
“If we can’t trust Tesla when they say their vehicles are full self-driving, how can we trust the company when it says they are safe?” said Bryant Walker Smith, an associate professor in the Schools of Law and Engineering at the University of South Carolina who specializes in autonomous vehicles.
Tesla did not respond to several requests for comment.
Complaints about the FSD kit may pale in comparison with the concerns that people are being killed by misuse of or glitches in Tesla’s driver-assistance technology. But they point to a common thread of Tesla’s approach to driving automation: The company is making promises that other carmakers shrink from, and its customers think their cars can do more on their own than they really can.
“One of the downsides of automated technology can be overreliance — people relying on something it may not be able to do,” said Jason Levine, executive director of the Center for Auto Safety, a nonprofit that has monitored the industry since the early 1970s.
Other automakers are being considerably more conservative when it comes to automation. The likes of General Motors and Toyota offer driver-assistance technologies akin to Autopilot and FSD, but they do not market them as self-driving systems.
Backed by billions of dollars from major automakers and tech giants, companies like Argo, Cruise and Waymo have been developing and testing autonomous vehicles for years. But in the near term, they have no intention of selling the technology to consumers. They are designing vehicles they hope to deploy in certain cities as ride-hailing services; think Uber without the drivers.
In each city, they begin by building a detailed, 3D map. First they equip ordinary cars with lidar sensors — “light detection and ranging” devices that measure distances using pulses of light. As company workers drive these cars around the city, the sensors collect all the information needed to generate the map, pinpointing the distance to every curb, median and roadside tree.
The cars then use this map to navigate roads on their own. They continue to monitor their surroundings using lidar, and they compare what they see with what the map shows, keeping close track of where they are in the world.
Tesla is taking a very different tack. The company and its chief executive, Elon Musk, believe that self-driving cars can navigate city streets without 3D maps. After all, human drivers do not need these maps; they need only eyes.
For years, Tesla has argued that autonomous vehicles can understand their surroundings merely by capturing what a human driver would see as they speed down the road. That means the cars need only one kind of sensor: cameras.
Since its cars are already equipped with cameras, Tesla argues, it can transform them into autonomous vehicles by gradually improving the software that analyzes and responds to what the cameras see. FSD is a step toward that.
But FSD has notable limits, said Jake Fisher, senior director of Consumer Reports’ Auto Test Center, who has extensively tested these services. Automatically changing lanes can be enormously stressful and potentially dangerous, for instance, and summoning the car from a parking space works only occasionally.
“These systems are good at dealing with the boring, monotonous stuff,” Fisher said. “But when things get interesting, I prefer to drive.”
Machines cannot yet reason like a human. Cars can capture what is happening around them, but they struggle to completely understand what they have captured and predict what will happen next.
That is why other companies are deploying their autonomous cars so slowly. And it is why they equip these cars with additional sensors, including lidar and radar. Radar and lidar can track the speed of nearby objects as well as their distance, giving cars a better sense of what is happening.
Tesla recently removed the radar from its new cars, which now rely solely on cameras, as the company always said they would. During a January earnings call, Musk said he was “highly confident the car will be able to drive itself with reliability in excess of humans this year.”
This promise rests on a “beta” service, now under testing with a limited number of Tesla owners, that aims to automate driving beyond highways. In a March post on Twitter, Musk estimated that 2,000 people were using the beta, called “Autosteer on city streets.”
But like Autopilot and other FSD services, the beta calls for drivers to keep their hands on the wheel and take control of the car when needed.
Most experts say this is unlikely to change soon. Given the speed of cameras and the limitations in the algorithms that analyze camera images, there are still situations where such a setup cannot react quickly enough to avoid crashes, said Schuyler Cullen, a computer vision specialist who oversaw autonomous driving efforts at the South Korean tech giant Samsung.