‘One of the most dangerous and irresponsible actions by an auto company in decades’: activist Ralph Nader calls on regulators to recall Tesla’s FSD technology


Ralph Nader, a consumer and public rights activist, recently harshly criticized Tesla’s self-driving technology. In a letter to regulators, Nader said Tesla’s Autopilot and its enhanced version, Full Self-Driving (FSD), are dangerous and should not be allowed on the roads. He described Tesla’s large-scale deployment of FSD as “one of the most dangerous and irresponsible actions by an automaker in decades.” Nader is asking regulators to recall this technology.

Ralph Nader is probably America’s most famous consumer and public rights activist, a role that has repeatedly brought him into conflict with corporations and government. Nader made headlines in 1965 with his book “Unsafe at Any Speed,” which blamed the auto industry for producing unsafe vehicles. He became a hero in many people’s eyes when it emerged that General Motors had hired private investigators to harass him; the company later apologized to him publicly during a televised Senate committee hearing.

In his letter to regulators, posted on his own website, Nader sharply criticizes Tesla and blames the company for widely deploying still-experimental technology. “Tesla’s major deployment of so-called Full Self-Driving (FSD) technology is one of the most dangerous and irresponsible actions by an automaker in decades. Tesla should never have incorporated this technology into its vehicles. Today, more than 100,000 Tesla owners use technology that research shows malfunctions every eight minutes,” he wrote.

Tesla has always insisted that the Autopilot and FSD functions of its electric vehicles are not just safe but safer than human driving. “Tesla’s Autopilot and FSD features improve our customers’ ability to drive more safely than the average driver in the United States,” wrote Rohan Patel, senior director of public policy at Tesla, in a letter dated 4 March 2022 to US Democratic Senators who had asked the company to explain the rapidly increasing number of crashes involving Autopilot. However, senators and activists are not entirely convinced.

Tesla CEO Elon Musk has touted self-driving cars as the next big thing for years. In 2015, he said self-driving vehicles would be on the road within two years. Although that timeline hasn’t been met, Musk has yet to give up on his dream or temper his ambitions. In May 2022, he again predicted that fully self-driving cars would be available roughly a year later. But Nader, a four-time presidential candidate and pioneer of modern car safety standards, thinks Musk has done enough damage and needs to be stopped.

“I call on federal regulators to act immediately to prevent more deaths and injuries from Tesla crashes with this technology. The National Highway Traffic Safety Administration (NHTSA) has the power to act quickly to prevent such disasters. NHTSA has been investigating Tesla and its fully self-driving technology for many years. NHTSA must use its safety recall power to order that FSD technology be removed from every Tesla,” he asks the regulator. Nader believes that Tesla should be held responsible for the accidents.

“Our country must not allow this faulty software, which Tesla itself warns can do ‘the wrong thing at the worst time,’ on the streets where children go to school. Together, we must send an urgent message to regulators concerned about the safety of victims: Americans must not be used as guinea pigs for a powerful, high-profile corporation and its famous CEO. No one is above the laws of manslaughter,” he concluded. Recall that, for his contributions to vehicle safety, Nader was even inducted into the Automotive Hall of Fame in 2016.

Nader’s letter is the latest in a growing chorus of voices calling on the government to make a decision on Tesla’s FSD, which critics say pushes the boundaries of what should be available to drivers. NHTSA is currently investigating 16 crashes in which owners of Tesla vehicles using Autopilot crashed into stationary emergency vehicles, resulting in 15 injuries and one death. Most of these incidents took place after dark, with the software ignoring scene control measures including traffic lights, flares, cones, and an illuminated arrow sign.

The investigation was recently upgraded to an “engineering analysis”, the second and final phase of an investigation before a possible recall. Autopilot comes standard on Tesla vehicles; for an additional $12,000, owners can purchase the advanced FSD option, which Musk has repeatedly promised will one day offer fully autonomous capability. To this day, however, FSD remains an advanced “Level 2” driver assistance system, meaning the driver must remain fully engaged in operating the vehicle while it is in motion.

In addition to crashes involving emergency vehicles, NHTSA has also compiled a list of special investigations into other crashes, in which the agency collects data beyond what local authorities and insurance companies usually collect at the scene. By the end of July, 48 crashes were on the agency’s special investigation list, 39 of which involved Tesla vehicles using Autopilot. According to NHTSA data, nineteen people, including drivers, passengers, pedestrians, other motorists, and motorcyclists, were killed in these Tesla crashes.

In early August, the California Department of Motor Vehicles (DMV) accused Tesla of falsely advertising its Autopilot and FSD features, alleging the company made “false or misleading” statements about the self-driving capabilities of its vehicles. The DMV’s action could result in the suspension of Tesla’s licenses to produce and sell cars in California. Note that Tesla has already faced similar complaints in the past. In 2016, Germany asked the company to stop using the term “Autopilot”, fearing that it could suggest that its vehicles are fully autonomous.

Last year, U.S. Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) asked the Federal Trade Commission (FTC) to investigate how Tesla advertises Autopilot and FSD. They claimed the automaker has “exaggerated the capabilities of its vehicles”, which could “pose a threat to motorists and other road users”. Now Nader is lending his expertise and reputation to the fight. The consumer advocate said NHTSA must act before someone else is killed.

Source: Ralph Nader

And you?

What is your opinion on the subject?
What do you think of activist Ralph Nader’s statements?
Do you think regulators should follow his recommendations?
What do you think of the Autopilot and FSD functions of Tesla’s electric vehicles?
What do you think would happen if Autopilot and FSD were banned by regulators?

See also

Tesla Vehicle With Autopilot On Hits Motorbike On Highway And Kills Driver, Crash Again Calls Reliability Of Tesla’s Driver Assistance System Into Question

Tesla’s Autopilot practically slams a Model 3 into an oncoming streetcar, ‘Full Self-Driving’ option was on

Tesla tells U.S. lawmakers the Autopilot system requires ‘constant monitoring’, bolsters criticism that name misleads drivers

Munich court orders Tesla to reimburse customer for problems with Autopilot, after finding safety gaps in automaker’s technology

Tesla Autopilot: US investigates feature after 11 Teslas crash into emergency vehicles
