Tesla test drivers believe they’re on a mission to make driving safer for everyone. Skeptics say they’re a safety hazard.

Drivers said they are willing to take on the risk even if they have to intervene — believing they are on a world-changing mission

SAN FRANCISCO — Kevin Smith has a love-hate relationship with driving. He was rear-ended twice in a short span of time, his daughter crashed her car weeks after getting her driver's license, and his mother chose to surrender hers after she started missing red lights.

“I felt like I needed better driver assistance or I was going to have a panic attack,” he said.

Smith is now part of a group of at least 12,000 beta testers for Tesla's polarizing "Full Self-Driving" software, which can attempt many everyday driving tasks, albeit sometimes unpredictably. Despite its flaws, Smith believes it's safer. He is willing to take on the risk even though he knows he might have to intervene when the software makes mistakes: running a red light, driving onto light-rail tracks or nearly striking a person in a crosswalk, all scenarios that beta testers interviewed by The Washington Post have encountered on the road.

“It de-stresses me,” he said in an interview. “I observe more. I’m more aware of everything around. I feel safer with it on.”

At the heart of Tesla’s strategy is a bold bet that the thousands of chosen test drivers, many of whom passed a safety screening that monitored their driving for a week or more, will scoop up enough real-world data to rapidly improve the software on the fly. In navigating public roads with unproven software, Tesla’s Full Self-Driving beta testers have not just volunteered to help, but have taken on the liability for any mistakes the software might make.

Safety experts and autonomous driving companies say Tesla's decision to test unproven software with ordinary drivers on public roads is reckless and shortsighted. The National Highway Traffic Safety Administration recently required deployers of autonomous vehicles and advanced driver-assistance systems to report many crashes within a day of learning of them, stoking fears that Tesla's brazenness will invite tighter regulation, slowing progress across the industry.

Though its name suggests otherwise, Tesla’s Full Self-Driving software is not autonomous, and drivers must pay attention to the road at all times. Full Self-Driving is an evolution of the earlier software suite Autopilot, which could navigate vehicles from highway on-ramp to off-ramp, making lane changes and steering within marked lane lines. Full Self-Driving expands those capabilities to residential and city streets.

For Tesla's willing guinea pigs, the promise of Full Self-Driving, even if risky and unproven, offers an immediate antidote to traffic monotony and a glimmer of hope for safer roads: 20,000 Americans died in traffic crashes in the first half of 2021 alone, an 18 percent surge over the same period a year earlier. Many Tesla owners brush aside safety concerns about the software in part because they don't think it could do any worse than human drivers already do.

It's a big gamble that the technology will improve faster than the liabilities of testing it on public roads pile up, said Andrew Maynard, a professor at Arizona State University and director of its Risk Innovation Lab.

“It’s a gamble that may pay off — if there are few serious incidents involving drivers, passengers, other road users [et cetera], consumer opinion continues to support the company, and Tesla stays ahead of the regulators, I can see a point where the safety and utility of FSD far outstrips concerns,” he added.

But drivers say their experience shows that day is far off. Some were startled one day in October when Tesla vehicles began behaving erratically after receiving an overnight software update. The cars started braking abruptly at highway speeds, which Tesla attributed to false triggers of the forward-collision warning and automatic emergency braking systems introduced by the update.

The company later issued a recall, and owners, including Smith, said they were dismayed by how it handled the episode.

Others, meanwhile, have soured on the software altogether.

Marc Hoag, a self-described Tesla fanboy and shareholder, waited a year and a half to get the software. But once he tried it, he was disappointed.

“It’s still so impossibly bad,” he said.

Hoag said the driving experience is worse in person than it looks in the videos he's posted to YouTube, which show the car taking turns too wide, speeding into curves and mistaking a crosswalk sign for a pedestrian, while otherwise hesitating at intersections alongside other traffic. The fidgety wheel and indecisive braking make for an unpleasant ride, and the car's unpredictability makes it scary, he said.

“It seems to me remarkably, shockingly premature to allow it to be tested on public streets,” he added. “You can’t test it and not be reckless at the same time.”

In 2019, Tesla chief executive Elon Musk boldly promised that the company's cars would be capable of driving themselves, turning Teslas into a fleet of 1 million "robotaxis" by 2020. It's now clear that robotaxis weren't on the horizon, and a California DMV memo, reported by Bloomberg and released to the legal transparency site PlainSite, suggested Tesla knew it was overpromising.

Many loyal Tesla drivers don’t fault Musk for promising the impossible.

“It’s understanding the game Elon is playing with all of us. Most of us get it and we’re willing to help out,” said Nicholas Puschak, a Tesla owner who is eagerly awaiting his turn to become a beta tester of the Full Self-Driving software. “Tesla is really pushing the envelope, not only in technology but the way they are releasing it. I don’t think any other corporation would have the guts to do what Tesla is doing.”

Part of the appeal for most beta testers interviewed by The Post is the chance to play a small part in advancing the technology. When the car does something wrong, drivers can hit a button on the car's display that sends data on the incident to Tesla, so that software engineers can analyze it and improve the algorithm.

Though hitting that button is satisfying, it may not help very much, said Mahmood Hikmet, a New Zealand-based self-driving car engineer who works on autonomous shuttles in another corner of the industry.

He and other experts in the field say Tesla can't possibly make use of the fire hose of data from its cars all at once. Other companies instruct safety drivers to test specific problem areas at specific times, rather than having them drive around aimlessly.

“You don’t really need 12,000 or more people,” Hikmet said. “For a company with the sixth-largest market cap in the world, they’d be able to hire 50 to 100 internal testers and run into all the issues they’re running into. … There’s only so much you can gather.”

For Chris, buying a Tesla was a "life changer" because Autopilot eased his grueling commute, which begins on the dirt road leading out of his rural home in Fenton, Mich., and ends in Ann Arbor.

Chris, who asked to be identified only by his first name, citing safety concerns, has seen the car do some things completely wrong during early beta testing, like stopping for no reason or nearly smashing into a curb. But he has intervened and steered the car to safety every time, and he says the car performs more than 90 percent of the driving flawlessly.

“I’m not going to put anybody in danger. I have to be mindful of the cars around me,” he said. “You’re potentially getting to be part of this future of transport as you are basically training this car.”
