Tesla is one of the most lauded, and most criticized, automotive and technology companies out there. Every time the company or its CEO, Elon Musk (a man sometimes compared to the comic book hero Tony Stark, a.k.a. Iron Man), makes headlines, or even tweets, the stock moves significantly and more people jump on the Internet to rant and rave about the good and evil of Tesla. Recently, that debate has grown more vociferous, as the National Transportation Safety Board launched an investigation into the Tesla Autopilot crash that killed a Florida man in May of this year.
To understand the debate, we have to go back to October of last year. At that time, to the rejoicing of most of the tech world, Tesla released its "self-driving" system, called Autopilot. The system was made available to all owners of new Tesla Model X SUVs and Model S sedans for around $2,500, and it essentially makes the vehicles semi-autonomous with a simple over-the-air software update. Autopilot uses radar, cameras, GPS, and ultrasonic sensors to do things like keep the car in its lane, maintain a safe following distance, and change lanes. It is also supposed to warn the driver if something is blocking the car's path down the road and, in dire situations, apply the brakes to stop the car safely. This is what is known as Level 3 autonomy. The National Highway Traffic Safety Administration (NHTSA) sets out five levels of autonomy, and each successive level reduces the amount of input and interaction required from the driver behind the wheel. The release of Autopilot was seen as a first step toward fully autonomous driving and lauded as a new dawn for driving. In reality, the features that were released are readily available (and far more reliable) in cars like any modern Mercedes-Benz equipped with the Intelligent Drive system.
Rules governing autonomous vehicles are murky at best. They vary from state to state, and in some places they don't exist at all. Tesla wasn't doing anything wrong by flipping on its Autopilot system, and, as mentioned before, the systems Tesla employs are already in regular use in many modern-day cars. The difference in Tesla's case is that the system allows drivers to take their hands off the wheel for extended periods of time before it disables itself. According to CNNMoney, it still requires that drivers watch the road, but it doesn't force them to. In a car like a Mercedes, by contrast, the system will warn you to put your hands back on the wheel and disable itself if you don't do so within a set amount of time. Autopilot doesn't do that. These feature decisions have been made by the companies themselves, not by a governing body, which is why the systems behave differently.
The other aspect of Autopilot that makes it unique to Tesla is its ability to learn from the "fleet." Each Tesla is outfitted with sensors that send travel information back to the company, where it is turned over to the central Autopilot system, which then uses the anonymized data to create better maps of "where cars go and don't go," according to Green Car Reports. Musk argues that he has warned buyers to "exercise caution in the beginning," since the system is still so new. He has also said that this version is only a "beta" version of the system.
Which brings us to this May, and the first fatality. Back in May, in Williston, Florida, a tractor-trailer made a left turn in front of a Tesla at an intersection. According to Tesla, the car failed to distinguish the white side of the truck against the bright sky, and the brakes were never applied. The car drove under the semi, and the crash killed the driver, Joshua Brown. A second reported crash took place July 1 in Pennsylvania, though that driver walked away unharmed. Following the second crash, regulators began looking into Tesla's Autopilot system. Tesla openly admits that cars and SUVs have crashed while Autopilot was engaged but doesn't say how many.
In many of the reported crashes, owners complain that Autopilot did not engage the brakes or behave correctly. Tesla then responds that the crash was, in fact, the driver's fault, because they pressed the brake or steered to correct in the last moments, disengaging Autopilot. That leaves owners in a strange spot. As one woman who crashed into a parked car on the highway while using Autopilot told the Wall Street Journal, "So if you don't brake, it's your fault because you weren't paying attention. And if you do brake, it's your fault because you were driving." She went on to compare the issue to a bug in an app: "When I have a bug on my app, it crashes. When I have a bug on my car, people die. There is a slightly different burden on the company."
As a result, Tesla is facing a maelstrom of customer and government inquiries. Fans have become foes, and Tesla is under pressure to do something. Other car manufacturers are watching the events closely, too, as the outcome could affect the rollout of future technologies they have planned. Tesla has said it could disable the system in cars whose drivers repeatedly abuse it, and there are plenty of YouTube videos of Tesla drivers doing utterly asinine things like reading a newspaper or playing cards while Autopilot drives. Elon Musk hasn't handled the controversy well either, leading the Guardian to cite him as an example of how not to handle a crisis. The good news is that Tesla plans to roll out the next Autopilot update this month, which is supposed to improve some features and add new ones. We'll have to wait and see what those updates include and whether they can prevent future crashes. Stay tuned to Instamotor for all your up-to-date car info.
Digital media content producer/consultant & former CNN senior producer, now running CN'TRL : Cars, Tech, Real Estate & Luxury.