Tesla to Unveil ‘Robotaxi’ While Facing Flood of Lawsuits Over ‘Self-Driving’ Accidents

On Thursday, Tesla will at last reveal its prototype for a fully autonomous “robotaxi,” a vehicle that CEO Elon Musk had originally wanted to unveil in August and envisions as the future of the brand. Certain investors and tech entrepreneurs are extremely skeptical that the company can pull off such a product. Even if a staged demonstration goes well, it could take years to manufacture (like the unfortunate Cybertruck), while Google, Amazon, and General Motors already have self-driving cabs in operation.

But while Tesla navigates those corporate headwinds, it is also facing potential legal fallout from driver-assistance systems it has sold with existing Tesla cars: Autopilot, which includes basic features like cruise control and automatic lane-changing, and the “full self-driving” (FSD) upgrade, which according to Tesla allows your vehicle to “drive itself almost anywhere with minimal driver intervention and will continuously improve.” Both the National Highway Traffic Safety Administration and the Department of Justice are investigating whether Tesla has given customers and investors the impression that its driver-assistance software is safer than it really is.

It’s not just the government probes, either: Tesla is named in more than a dozen pending lawsuits that allege injuries or deaths are directly attributable to Tesla drivers who assumed, based on Tesla’s marketing language (and Musk’s continual overpromising), that their vehicles were capable of fully autonomous driving when Autopilot or FSD was engaged. At least three of these cases are scheduled to go to trial next year.

“It’s not hyperbole to say that they’ve turned our public roadways into their own personal laboratory, and everybody else is now a guinea pig in what can very much be a life or death experiment,” says Brett Schreiber, partner at the San Diego injury law firm Singleton Schreiber, who currently represents “four different families who either suffered injuries or lost loved ones” in accidents involving Tesla’s driver-assistance technology. (Tesla did not respond to a request for comment on these suits.)

‘Tesla has set the stage for this continued bad conduct’

The most significant legal battle over the automated driving features so far began with a suit brought by the family of Apple engineer Walter Huang, who died in 2018 when his Tesla struck a concrete freeway median in Silicon Valley. He’d had Autopilot engaged for nearly 20 minutes at the time of the collision and was traveling at over 70 miles per hour, and Huang’s family alleged that the software caused the accident. Tesla settled that case for an undisclosed amount in April of this year, on the eve of jury selection. (In 2022, as Musk committed to building a “hardcore litigation team” at Tesla, he vowed in a tweet that the company would “never surrender/settle an unjust case against us.”)

The difference between that case and those now proceeding against Tesla, Schreiber explains, is that the latest lawsuits mostly pertain to injured third parties. In suits brought by customers, Tesla can shift liability by pointing out that its user manual instructs drivers using Autopilot and FSD to remain alert and in control of the vehicle, with their hands on the wheel at all times in case human intervention is required to avert a crash. Tesla has won at least two verdicts on these grounds, Schreiber says. But what happens when a “self-driving” Tesla strikes another motorist or pedestrian?

“These are cases about shared responsibility,” Schreiber argues. “Were these drivers bad actors? One hundred percent. But every actor needs a stage, and Tesla has perpetually set the stage for this continued bad conduct.” In addition to representing several clients seeking damages for such incidents, he is collaborating with other firms around the U.S. working on similar litigation, in part to get on the same page about what relevant documents and data Tesla may have. Unlike other auto companies, Schreiber says, “they know what these vehicles are doing and where they are located every moment, down to the millimeter, anywhere in the world.”

‘[The] Tesla Model S had an Autopilot system that was still in Beta’

One case, presently scheduled for trial in June 2025, describes a 2023 collision in which a Tesla driver — named as a defendant along with the company — had the car’s automated driving technology engaged as he drove on a California freeway at approximately 65 miles per hour. The software allegedly did not take action to avoid a motorcyclist stopped for traffic ahead in the lane and “rammed into” him “at freeway speed,” sending him flying into the air. The Tesla then continued, hitting the car that had been in front of the motorcycle hard enough to make it strike the next vehicle ahead. The motorcyclist, the suit alleges, “suffered and continues to suffer severe and life-threatening injuries, including severe traumatic brain injury, multiple fractures, and injuries to internal organs requiring surgical intervention and intensive care.”

Another suit, slated for a March trial, concerns a Tesla driver’s use of Autopilot in 2019 on a county road in Key Largo, Florida. By the driver’s account, he dropped his phone and took his eyes off the road as he tried to retrieve it from the floorboard — in that moment, the Tesla blew past a stop sign, warning lights, and caution signs indicating a T intersection where he had to turn either left or right. Instead, the car went straight, plowing into a truck parked on the shoulder with two people standing nearby. One was thrown 75 feet and killed; the other suffered “severe, permanent injuries,” according to the filing.

The complaint goes on to explain that the “Tesla Model S had an Autopilot system that was still in Beta, meaning it was not fully tested for safety, and, further, the system was not designed to be used on roadways with cross-traffic or intersections. Nevertheless, Tesla programed the system so that it could be operated in such areas.” Furthermore, it alleges, “Tesla programed Autopilot to allow it to be used on roadways that Tesla knew were not suitable for its use and knew this would result in collisions causing injuries and deaths of innocent people who did not choose to be a part of Tesla’s experiments,” including the injured parties.

‘Tesla has refused to use this technology because of expense and aesthetics’

In a lawsuit set for a February trial, over the death of a driver in a truck hit by a self-driving Tesla in Fremont, California, in 2019, a filing claims that “Tesla’s conscious decision to expose members of the general public to its defectively designed product is despicable conduct. Tesla made a conscious decision to manufacture, distribute, market, advertise, and sell a defectively designed product it knew exposed members of the general public to a significant risk of harm purely out of a desire to maximize profits.” Although the complaint accuses the Tesla driver of negligence, it also alleges that the company’s use of terms like “Autopilot” and “Full Self-Driving” obscures the fact that all its driver-assistance features are only Level 2 systems according to engineering standards organization SAE International, offering “driver support” rather than the “automated driving” of systems at higher levels. Vehicles at Level 3 or 4, the filing notes, make use of an “expensive combination of cameras, multiple radar units, and one or more light-detection-and-ranging (‘LIDAR’) units,” whereas “Tesla has refused to use this technology because of expense and aesthetics.”

Many of these lawsuits, covering crashes from New York and Maryland to Texas and Indiana, pertain to incidents in years when Tesla’s driver-assistance features were still in beta, meaning that drivers were effectively volunteering to test them before a wider release. That’s the basis of the recurring claim that Tesla was recklessly using other motorists and pedestrians as test subjects without their consent. “Unlike the drivers of these vehicles, John and Jane Q. Citizen never agreed to sign up for the beta test,” Schreiber says. “Tesla has created a circumstance where, unlike every other major automotive manufacturer since the beginning of time, they test their technology in production vehicles, rather than doing proper [research and development] on the front end.”

FSD officially emerged from beta earlier this year, with Tesla now describing it as “Full Self-Driving (Supervised)” in order to highlight the importance of driver attention. Still, FSD — a version of which will presumably operate Musk’s “robotaxi” — remains an unreliable technology, inadequate at ensuring driver focus and prone to basic, potentially catastrophic errors. It has even been shown to fail the California DMV’s road test for a driver’s license.

As for the Thursday robotaxi event, Schreiber says he’ll be watching, “only because it’s part of the long line of Tesla’s deeply flawed, not-ready-for-primetime vehicles.” Whether it’s an ongoing legal effort to hold Tesla accountable for allegedly unsafe software or trying to keep up with Musk’s latest gambit to dazzle shareholders, Schreiber says, “it’s the same shit, different shovel. That’s what we’re dealing with here.”