Why Tesla’s volunteer army of ‘test pilots’ is helping train Full Self-Driving on public roads

SAN FRANCISCO — Kevin Smith has a love-hate relationship with driving. He was rear-ended twice in a short span, his daughter crashed her car weeks after getting her driver’s license, and his mother chose to surrender hers after she started missing red lights.

“I felt like I needed better driver assistance or I was going to have a panic attack,” he said.

Smith is now part of a group of at least 12,000 beta testers for Tesla’s polarizing “Full Self-Driving” software, which can attempt many everyday driving tasks, albeit sometimes unpredictably. Despite its flaws, Smith believes driving with the software is safer than driving without it. He is willing to take on the task even though he knows he might have to intervene when the software makes mistakes: running a red light, driving onto light-rail tracks or nearly striking a person in a crosswalk — all scenarios that beta testers interviewed by The Washington Post have encountered on the road.

“It de-stresses me,” he said in an interview. “I observe more. I’m more aware of everything around. I feel safer with it on.”

At the heart of Tesla’s strategy is a bold bet that the thousands of chosen test drivers, many of whom passed a safety screening that monitored their driving for a week or more, will scoop up enough real-world data to rapidly improve the software on the fly. In navigating public roads with unproven software, Tesla’s Full Self-Driving beta testers have not just volunteered to help, but have taken on the liability for any mistakes the software might make.

The Post interviewed a half-dozen of the beta testers, who paid as much as $10,000 for the ability to upgrade their cars with the software. All self-described fans of Tesla, the testers were awed by what the software can do but well aware of its limitations and the risks involved. Some have found the software too inconsistent and harrowing to use and faulted Tesla for releasing it too early.

“In the beginning when I heard it was going to be pushed out to the public I was like, ‘Uh-oh, not good,’” said one former Tesla Autopilot engineer, who had early access to the Full Self-Driving beta and spoke on the condition of anonymity, fearing retaliation from the company. He recalls thinking: “It’s not ready to be put into the hands of the public.”

Tesla did not respond to a request for comment. The company has previously argued that its software is safer than normal driving: it has said, citing crash data comparisons, that its Autopilot driver-assistance software is safer than unassisted driving, and Tesla CEO Elon Musk has called Autopilot “unequivocally safer.” Such data, however, is not directly comparable, because Autopilot operates only on certain types of roads and in certain conditions — settings where crashes tend to be less frequent.

The race to develop fully autonomous vehicles, among the most ambitious technological undertakings since the moon landing, is now a multibillion-dollar industry. The players include major automakers, Silicon Valley giants like Apple and Google, and a host of deep-pocketed startups, all pouring money and personnel into the effort.

Each company has its own approach to testing the technology, though most pay “safety drivers” to methodically test the software in the real world. Several also pay hundreds or thousands of “labelers” to painstakingly annotate high-definition maps, ensuring the cars know roads, speed limits and traffic signs even beyond what the arrays of high-priced sensors placed around the vehicles can perceive.

Tesla’s approach stands out as radically different. The software — part of an upgrade whose value Musk himself called “debatable” in July — is sent over the air to cars already on the road and in the hands of ordinary drivers.

Safety experts and autonomous driving companies say that decision is reckless and shortsighted. The National Highway Traffic Safety Administration recently began requiring deployers of autonomous vehicles and advanced driver-assistance systems to report many crashes within a day of learning of them, and some in the industry fear Tesla’s brazenness will invite still tighter regulation, slowing progress across the field.

Though its name suggests otherwise, Tesla’s Full Self-Driving software is not autonomous, and drivers must pay attention to the road at all times. Full Self-Driving is an evolution of the earlier software suite Autopilot, which could navigate vehicles from highway on-ramp to off-ramp, making lane changes and steering within marked lane lines. Full Self-Driving expands those capabilities to residential and city streets.

For Tesla’s willing guinea pigs, the promise of Full Self-Driving, even if risky and unproven, offers an immediate antidote to traffic monotony and a glimmer of hope for safer roads — roads on which 20,000 Americans died in the first half of 2021 alone, an 18 percent surge over the same period a year earlier. Many Tesla owners brush aside safety concerns about the software in part because they believe human driving is already so dangerous that the software could hardly make things worse.

Christopher Hart, former chairman of the National Transportation Safety Board and now a safety consultant, said driver-assistance systems like Tesla’s should be tested more thoroughly before they’re released on public roads.

“It’s making an already skeptical public even more skeptical,” he said of the effort. Hart, who advised Uber on its safety culture after a deadly crash involving one of its autonomous vehicles, said he wants the technology to succeed and thinks it could eventually save lives if it is developed in a careful and considered way.

Tesla is making a big gamble that the technology will improve faster than the liabilities of testing it on public roads pile up, said Andrew Maynard, a professor at Arizona State University and director of its Risk Innovation Lab.

“It’s a gamble that may pay off — if there are few serious incidents involving drivers, passengers, other road users [et cetera], consumer opinion continues to support the company, and Tesla stays ahead of the regulators, I can see a point where the safety and utility of FSD far outstrips concerns,” he added.

But drivers say their experience shows that day is far off. Some were startled one day in October when Tesla vehicles began behaving erratically after receiving an overnight software update. The cars started braking abruptly at highway speeds — the result, Tesla said, of false triggers of the forward-collision warning and automatic emergency braking systems introduced by the update.

The company later issued a recall, and owners — including Smith — said they were dismayed by how it handled the episode.

Others have simply soured on the software.

Marc Hoag, a self-described Tesla fanboy and a shareholder in the company, waited a year and a half to get the software. But once he tried it, he was disappointed.

“It’s still so impossibly bad,” he said.

Hoag said the driving experience is even worse in person than it looks in videos he’s posted to YouTube, which show the car taking turns too wide, speeding into curves and mistaking a crosswalk sign for a pedestrian — while otherwise hesitating at intersections alongside other traffic. Its fidgety wheel and indecisive braking make for an unpleasant ride, and its unpredictable nature makes it scary, he said.

“It seems to me remarkably, shockingly premature to allow it to be tested on public streets,” he added. “You can’t test it and not be reckless at the same time.”

In 2019, Musk boldly promised that the company’s cars would be capable of driving themselves, turning Teslas into a fleet of 1 million “robotaxis” by 2020. It’s now clear that robotaxis weren’t on the horizon, and a California DMV memo reported by Bloomberg — and released to the legal transparency site PlainSite — suggested Tesla knew it was overpromising.

Many loyal Tesla drivers don’t fault Musk for promising the impossible.

“It’s understanding the game Elon is playing with all of us. Most of us get it and we’re willing to help out,” said Nicholas Puschak, a Tesla owner who is eagerly awaiting his turn to become a beta tester of the Full Self-Driving software. “Tesla is really pushing the envelope, not only in technology but the way they are releasing it. I don’t think any other corporation would have the guts to do what Tesla is doing.”

Part of the appeal for most beta testers interviewed by The Post is the chance to play a small role in advancing the technology. When the car does something wrong, drivers can hit a button on the car’s display that sends data on the incident to Tesla, so that software engineers can analyze it and improve the algorithm.
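
A rough sketch of how that feedback loop might work is below — purely illustrative, since Tesla has not published its telemetry format, and every name and field in it is an assumption rather than the company’s actual system:

```python
# Hypothetical sketch of the report-button feedback loop described above.
# Tesla has not published its telemetry format; every name and field here
# is an illustrative assumption, not the company's actual API.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class IncidentSnapshot:
    vehicle_id: str        # anonymized identifier for the test vehicle
    software_version: str  # beta build the car was running
    timestamp: float       # when the driver pressed the report button
    location: tuple        # (latitude, longitude) at the moment of the event
    label: str             # driver-supplied or inferred description

def queue_for_upload(snapshot: IncidentSnapshot) -> str:
    """Serialize a snapshot so engineers could later replay the scenario.

    A real system would bundle camera frames and control logs and upload
    them when the car has connectivity; here we just produce the JSON.
    """
    return json.dumps(asdict(snapshot))

# Example: a tester reports a phantom-braking event.
print(queue_for_upload(IncidentSnapshot(
    vehicle_id="beta-tester-042",
    software_version="FSD Beta 10.x",
    timestamp=time.time(),
    location=(37.7749, -122.4194),
    label="phantom_braking",
)))
```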

Though hitting that button is satisfying, it may not help very much, said Mahmood Hikmet, a New Zealand-based self-driving car engineer who works in another part of the industry focused on autonomous shuttles.

He and other experts in the field say Tesla can’t possibly digest the fire hose of data coming off its cars all at once. Other companies instruct safety drivers to test specific problem areas at specific times, rather than having them drive around aimlessly.

“You don’t really need 12,000 or more people,” Hikmet said. “For a company with the sixth-largest market cap in the world, they’d be able to hire 50 to 100 internal testers and run into all the issues they’re running into. … There’s only so much you can gather.”

For Chris, buying a Tesla was a “life changer” because the Autopilot feature eased his grueling commute, which begins on the dirt road leading out of his rural home in Fenton, Mich., and ends in Ann Arbor.

Chris, who asked to be identified only by his first name, citing safety concerns, has seen the car do some things completely wrong during early beta testing, like stopping for no reason or nearly smashing into a curb. But he has intervened and steered the car to safety every time, and he says the car performs more than 90 percent of the driving flawlessly.

“I’m not going to put anybody in danger. I have to be mindful of the cars around me,” he said. “You’re potentially getting to be part of this future of transport as you are basically training this car.”

