Self-driving cars are getting closer to reality every day, but before people can start using them, the governments and agencies that oversee the roads are going to have to set some rules for how they should be used. Will drivers have the ability to take control? How will they share the road with other drivers? And if a self-driving car gets into a crash, who’s responsible?
Last month, the California DMV took an initial step towards answering these questions by releasing the nation’s first proposed rules to govern the public use of autonomous vehicles. These guidelines have been eagerly awaited, as California’s rules will provide a framework for other states and the federal government to follow as they consider their own policies for self-driving cars.
The key rules under consideration include:
- There must be a licensed driver in the vehicle capable of taking control in an emergency. (The driver must also obtain an autonomous vehicle operator certificate from the DMV.)
- The vehicle must include controls (i.e., a steering wheel and accelerator and brake pedals) to let the driver take control.
- The vehicle must have the capacity to detect and respond to attempted hacking and allow the driver to take control in such situations.
- If the vehicle is involved in a collision, the licensed driver will be considered at fault.
- The vehicle’s compliance with safety and performance standards must be verified via third-party testing.
- Vehicle manufacturers may lease the vehicles or operate their own autonomous vehicle program, but may not offer them for sale.
- Manufacturers will be issued a permit to operate vehicles for three years and must regularly report on how the vehicles are performing and being used.
Keep in mind, these rules are still preliminary and may be revised after the public has had a chance to comment. Nevertheless, odds are good that when self-driving cars are finally made available to the public, the first rules they’ll be governed by will look a lot like these. After all, only a few weeks after California revealed its rules, the federal government announced that, within six months, the Department of Transportation will release a national framework to govern how autonomous vehicles will be tested and regulated.
Giving the People What They Want
One party that’s not a fan of California’s new rules is Google, which has admitted that it’s “gravely disappointed” in the DMV’s decision, further asserting in a statement that it is “hoping to transform mobility for millions of people, whether by reducing the 94 percent of [crashes] caused by human error or bringing everyday destinations within reach of those who might otherwise be excluded by their inability to drive a car.”
Yet as noble as Google’s intentions may be, the outcome it prefers may not fully reflect the desires of the people who would actually use these cars. Indeed, though an international study of consumer habits suggests that 55% of consumers would be ready to buy a self-driving car from Google or Apple at the first opportunity, several recent surveys suggest that what people want is closer to California’s vision than to Google’s. For instance:
- According to a survey conducted by Volvo, 92% of drivers think it would be essential that the driver could take control of the car at any time.
- Moreover, while some modern vehicle technologies hold appeal for 96% of baby boomers, only 31% would consider purchasing one, with many citing fears that drivers could become too reliant on the technology.
- Consumers are already prepared to accept alternative ownership models: a survey by IBM found that up to 42% of respondents would consider new approaches including subscription pricing, car sharing, and on-demand ride sharing.
- Other potential features of autonomous cars that interest consumers include the ability to fix themselves without human intervention (59%), to connect with other vehicles in the area to optimize traffic flow (55%), to learn behaviors (54%), and to adapt themselves to provide a personalized experience for the operator (51%).
How Will Drivers Adapt to Self-Driving Cars?
Ultimately, the main source of contention is whether giving the driver the ability to operate the car should be a requirement. On the one hand, most people who are used to having control over the car don’t want to give that up, and in an emergency few would want to be entirely at the mercy of an uncontrollable machine. On the other hand, since 94% of collisions are due to driver error, we could be sacrificing one of the main safety benefits of autonomous cars if we have to rely on the driver to be in control in the most dangerous situations.
The truth is, in most situations self-driving cars are already proving to be safer than driver-operated ones. Indeed, according to a recent study by researchers at Virginia Tech, self-driving cars being tested have been involved in 1.6 major collisions per one million miles of driving, while cars driven by humans are involved in 2.5 major crashes over the same span; for less serious collisions, these differences were even more pronounced.
In fact, as we discussed in an earlier post, collisions involving self-driving cars are typically the fault of another driver: usually, they’re rear-end crashes where the autonomous vehicle stopped appropriately but the driver behind it didn’t. This points to what researchers at Carnegie Mellon suggest may be the biggest flaw in the current approach to self-driving cars: they always follow the law, whereas human drivers don’t. As defensive drivers, we’re always prepared for someone else on the road to make a mistake, but self-driving cars simply don’t know how to expect the unexpected.
Here’s the thing: if all vehicles on the road were self-driving, the system would almost certainly be safer than the current one. After all, these vehicles wouldn’t be truly autonomous: because they’d be able to communicate with one another as well as sense hazards around them, they’d always be able to maintain a safe following distance, maintain a safe speed, and avoid unanticipated conflicts; in a sense, there’d be no reason for any individual car to ever make a mistake.
Unfortunately, this isn’t the system we’re going to have. Even after self-driving cars are made available to the public, many people will prefer to keep their old cars and continue to do the driving themselves. And when driver-controlled and driverless cars have to share the road, it’s not necessarily going to be safer for everyone. In fact, it may be more dangerous, as drivers will have one more kind of road user to accommodate, and autonomous vehicles will have to respond to the one factor that can never be fully predicted: human behavior. For an easy-to-understand example of how quickly this could cause trouble, check out this clever online game that simulates this very situation.
Even with the option for the driver to take over in an emergency, self-driving cars aren’t necessarily going to be much safer, as most people who want to be chauffeured by their car probably aren’t going to be paying close enough attention to the road to take control quickly enough in an emergency. Or, to put it another way, if you have to stay completely focused on driving even when the car is driving itself, what would be the point of giving up control in the first place?
Nevertheless, if our roads are to be safely shared by drivers and autonomous cars, requiring that the latter have the capacity to give control to the former is the only reasonable way to begin integrating the system and teaching both types of road user to accommodate each other. In its statement objecting to California’s new rules, Google insisted that “safety is our highest priority and primary motivator as we do this.” Yet if this is true, the company should have no problem with California’s rules, which are designed not for the ideal highway system that Google envisions (as safe as that vision may be), but for the one we have now.
These rules provide for a gradual integration of an unprecedented new technology into a system that is already volatile and full of risk, giving us the chance to learn, to discover what works, and to expose unexpected flaws before unleashing cars that are entirely self-driving into a system that isn’t prepared for them. One day, perhaps, all cars will be able to guide themselves. Until that time, the ultimate responsibility for keeping the roads safe will continue to reside with attentive, educated, and licensed drivers who are ready to respond appropriately to danger–whether it’s coming from another car with a driver, or from a car without one.