On Nov. 23, Tesla and Twitter billionaire Elon Musk tweeted: “Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen, assuming you have bought this option. Congrats to Tesla Autopilot/AI team on achieving a major milestone!” The following day, Thanksgiving, a white Tesla Model S changed lanes while traveling east on the San Francisco–Oakland Bay Bridge, then came to an abrupt halt.
The Tesla, which had been traveling at around 55 miles per hour, dropped speed to 20 mph and then stopped directly in front of a car that was traveling in the lane the Tesla was moving into. The result was an eight-vehicle crash that injured nine people, including a 2-year-old. The driver of the Tesla told authorities that he had been using the new self-driving feature Musk had boasted about just the day before.
The Intercept’s Ken Klippenstein has gotten his hands on the highway surveillance footage of the event.
In the video, you can see the car change lanes and then stop directly in front of another car already moving through the passing lane. The lane change was an ill-advised move on its own; stopping made it far worse. The crash was compounded by where it happened: the lane change came just as the road passed from open air into the tunnel through Treasure Island, at around the halfway point of the roughly 4.5-mile span of the bridge. Drivers entering the tunnel can struggle to tell whether brake lights ahead have come on or whether they are simply seeing automatic tail lights. In the footage, the cars pile into one another, and one is up-ended from behind. It is the up-ended car that contained the 2-year-old, “who suffered an abrasion to the rear left side of his head as well as a bruise.”
This is just one of many recently published videos and stories of Tesla automated-driving failures. A warning in advance: Some of the accidents are terrifying and include video of cars accelerating to high speeds beyond the driver’s ability to control the vehicle, resulting in deaths. According to the National Highway Traffic Safety Administration, automakers reported 605 crashes involving cars with advanced driver-assistance systems between July 2021 and October 2022.* Of those reported crashes, almost 80% involved Teslas.
Miraculously, no one died in the Bay Bridge crash. Others have been more harrowing, like the May 2022 crash in which a Tesla traveled at 70 mph through city streets in Columbus, Ohio.
As the Intercept and others have pointed out, Musk himself has always promoted the technological futurism of his product, and more importantly has hung the survival of his electric vehicle business on its ability to do what other car manufacturers cannot do. Over the summer, Musk gave an interview where he explained that developing fully functioning and safe self-driving technology was “essential” to the Tesla business model: “It’s really the difference between Tesla being worth a lot of money or worth basically zero.”
Other manufacturers deploy autopilot-style technologies of their own, but most, if not all, have been careful to avoid the “full self-driving” label for what their systems can actually accomplish. Tesla’s branding, as much as its underlying technology, appears to be the issue here.
Tesla has about 830,000 vehicles with the systems on the road, giving it the highest ratio of crashes to equipped vehicles. Honda reported the next-highest total: 90 crashes involving driver-assistance systems, out of roughly six million Honda vehicles it says are on U.S. roads. Subaru was next with 10, and all other automakers reported five or fewer.
One of the main issues with self-driving vehicles is that the businesses spending all of the money on developing them are trying to make money. That has driven them to promote the idea that fully self-driving vehicles are just around the corner. They aren’t. Even the people who first came up with the science and engineering don’t believe that. Anthony Levandowski, one of the self-driving world’s biggest stars and pioneers, told Bloomberg a few months ago that while he once believed self-driving vehicles were imminent, he now sees a delusion about what artificial intelligence can and cannot do at this moment in time: “You think the computer can see everything and can understand what’s going to happen next. But computers are still really dumb.”
Update Jan. 12, 2023:
*It has only been since June 2021 that the National Highway Traffic Safety Administration has required all automakers to report “timely and transparent notification of real-world crashes associated with ADS and Level 2 ADAS vehicles from manufacturers and operators.”