On May 7th, Navy veteran Joshua Brown was killed when his Tesla Model S collided with a tractor trailer that crossed into his lane. News broke days ago that Brown had been using Tesla's much-praised Autopilot system, which allows the car to navigate highways on its own using sensors, radar, and cameras. Brown was allegedly watching a Harry Potter movie on a portable DVD player at the time of the accident, adding to an already alarming list of people testing the system's limits with stunts such as playing Jenga and sleeping behind the wheel.
Vehicular accidents are common, and unfortunately many are fatal. While car manufacturers have made great strides in vehicle safety, with vehicular deaths falling year after year, we are still far from a completely safe driving experience. Tesla pitches Autopilot as a convenience for those who don't care much for the act of driving itself, but many have been calling it a safety system that can save you in the event of an accident, whether or not that squares with Tesla's position that responsibility ultimately rests with the driver. Brown himself had posted a video just a month earlier praising the system for avoiding an accident he had experienced. Since the National Highway Traffic Safety Administration opened its investigation into the case, Tesla has tempered its comments about the system; needless to say, the company is now walking on eggshells when describing what Autopilot can and cannot do, a far cry from its more relaxed and idealistic pitch before. The question on some minds is whether Tesla is doing enough to educate the public about the system, and whether its marketing, including the choice of the name "Autopilot", creates a false perception of the system's capability.
Tesla's explanation of the system contradicted a statement that Mobileye, which supplies equipment and software used by the Autopilot system, gave to StreetInsider just after the crash. Mobileye's Chief Communications Officer, Dan Galves, issued the following statement:
"We have read the account of what happened in this case. Today's collision avoidance technology, or Automatic Emergency Braking (AEB), is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020."
What will this mean for Tesla and for self-driving cars in general? Similar driver-assistance features are already found in limited form from manufacturers like Mercedes and Ford; while those systems help avoid collisions and ease driving in traffic, they usually require the driver's hands to be on the wheel, though some drivers have found ways to circumvent this. Buyer confidence in such systems will certainly take a hit, but it's very possible that hit won't affect Tesla, despite the company being in the foreground of the incident. Public perception of Tesla remains very high. A company like BMW, which announced its first self-driving car a day after the news blitz on this incident, may have difficulty overcoming buyer uncertainty in its system, but Tesla will likely not have the same issues. Its lack of competition and the perception that its technology stands out from the crowd, despite using many of the same components, will help it weather the storm, just as it did during the earlier controversy over Tesla cars catching fire, the same controversy that caused its lone competitor, Fisker, to go under. It may be cynical to consider the boon Tesla's business may gain from this death, but the incident will hopefully foster an industry-wide effort to pitch such an advanced but potentially dangerous system in more concrete language.