Tech Talk: Laws Supporting Driverless Cars May Come Soon

The first day of the 2016 Automated Driving Symposium recently took place in San Francisco, and while the focus shifted in a number of directions over the course of the conference, one of the main takeaways was that federal lawmakers are hard at work drafting laws to help implement self-driving and, eventually, driverless cars, but still don't quite know how to approach them. U.S. Department of Transportation Secretary Anthony Foxx took the stage to tell the audience where his agency was in the process and what some of the main focus points for lawmakers should be, and his remarks only made it clearer that more concrete data will be needed before self-driving cars, and the laws that govern them, reach the level they should.

At the conference, attendees could find a white truck, looking a lot like the one involved in the fatal accident with a Tesla Model S running in Autopilot mode, bearing a sign that read "Tesla Don't Hit Me!". Since the incident in question was one of the very few thus far in which an autonomous vehicle was definitely at fault, and marked the first fatality involving a self-driving car of any sort, it was already an obvious talking point for both regulators and speakers at the conference. The truck served as a poignant reminder that no system is ever 100% perfect and bug-free, and that safety should take top priority in all dealings with self-driving cars. Of course, these weren't exactly things that conference-goers needed to be reminded of; they were, essentially, the reasons for the conference.

When Anthony Foxx took the stage, he reiterated those points and promised to keep communication open between lawmakers, the public, and the makers of automated vehicles. This, he said, was the best way to keep a close eye on where the line is and to strike the delicate tripartite balance between real safety, perceived safety, and room for innovation. He spoke of "pre-market approval steps" that would help autonomous vehicle manufacturers get their products cleared before they hit the market, which could speed things along while concrete, sweeping laws are still being written. The current state of federal self-driving regulation is a bit of a mess, with the NHTSA having promised not to interfere with state and local laws making their way to the table. If overarching federal laws don't get on the books before a patchwork of state and local ones does, however, regulators and tech firms could find themselves hitting walls and stepping on each other's toes as fully driverless cars begin seeing the light of day around 2020.

Speaking on the Tesla Model S fatality, Foxx laid out responsibility in even-handed fashion. He pointed out that drivers should always follow any rules the manufacturer sets for using its product, but that the manufacturer should have protections of some sort in place for drivers who, for whatever reason, don't follow those rules. Unfortunately, the Model S that crashed essentially thought the road ahead was clear, meaning that no safeguard aside from an attentive human in the driver's seat could have prevented the accident at the time. More testing and a wider variety of experiences in the database might have helped, but Tesla made the decision to go to market early in the hope that consumers would appreciate the implications and gravity of using such a system. Google's stance, meanwhile, is that suddenly asking the person inside the car to take over driving in the middle of an autonomous drive can be dangerous; even if a car can detect a hazard and knows to ask for human help, that help should not be relied on for human safety. It's a sentiment that speaks volumes about human nature and about Google's philosophy toward driverless car development, and it will likely set the tone for future dialogues with lawmakers and the public.