This week’s question comes from Adrian H. in San Francisco, who asks:
Q: “I read something in the paper about self-driving vehicles the other day. Did the government recently state that these cars can be operated without someone being able to take control in an emergency? What does that mean? Will we be seeing cars on the road without someone in the driver’s seat?”
A: Adrian, there is rapid development in the area of self-driving (autonomous) vehicles. Some, including me, think things are moving too fast without proper consideration for public safety. The federal government and the states, including California, are approaching the future of transportation from different directions. While California proceeds with an eye toward safety and security, the federal government appears to be bending to the will of the major automakers that want to push ahead in a radical new direction.
Throughout the history of transportation, the driver of a vehicle has been understood to be the person sitting in the saddle, holding the reins or sitting behind the wheel of a car. On Feb. 29, 2016, the National Highway Traffic Safety Administration took a radical departure when it published an announcement stating that the “driver” of an autonomous vehicle is the artificial intelligence system directing the vehicle’s movements.
In my opinion, NHTSA’s radical definition is a dangerous and unacceptable move. Ironically, on the very day of the NHTSA announcement, Google revealed that on Feb. 14 its driverless vehicle had caused an accident with a municipal transit bus. The collision happened when the vehicle, in autonomous mode with a human on board, drove out of its lane and then merged back into its path of travel, striking the side of the bus. Although the impact was minimal — the bus was traveling 15 mph and the car approximately 3 mph — it caused damage to both vehicles.
In a DMV report, Google stated the vehicle’s movements were made “more complex” by the presence of some sandbags in the roadway. Google stated the AI believed the bus would slow or stop to yield, allowing the vehicle to merge back into the lane of travel.
“From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future,” the company said in a statement.
If the AI is so unsophisticated that it cannot navigate a “complex” decision regarding sandbags in the roadway, and a vehicle as large as a bus alongside it, then what will it do when it encounters a truly complex situation, such as a child running into the roadway?
Google’s “hopes” that its technology will act more “gracefully in the future” are unacceptable. When it comes to safety, the public has a right to certainty. NHTSA should rescind its determination and instead adopt the approach already established in a number of states, including California — where Google is based and where the most AI-driven miles have been clocked — declaring that a human operator is responsible for assuring the vehicle is driven safely.
Under the current California Vehicle Code, Section 38750(a)(4), an “operator” of an autonomous vehicle is the person who is seated in the driver’s seat or, if there is no person in the driver’s seat, who causes the autonomous technology to engage. Subdivision (b)(2) states “the driver shall be seated in the driver’s seat, monitoring the safe operation of the autonomous vehicle, and capable of taking over immediate manual control of the autonomous vehicle in the event of an AI failure or other emergency.”
Unlike the proposed federal regulation, California law places responsibility on the individual who is in control of the vehicle or who initiates the AI. This allows for an analysis of that decision maker’s conduct in using and controlling the technology, making it possible to determine who is responsible when an accident occurs.
As the feds delegate responsibility to AI, the Department of Motor Vehicles is using human intelligence to craft sound policy balancing safety, security and privacy rights. The proposed regulations, to be employed during a three-year testing program, require manufacturers to meet independently verified safety standards and to have a licensed operator inside the vehicle capable of taking control in the event of a technology failure or other emergency. The DMV regulations are also designed to protect privacy, requiring manufacturers to disclose to the operator whether any information beyond what is needed to safely operate the vehicle is being collected.
As a trial lawyer who handles personal injury actions, I am volunteering to review and comment on these proposals, helping to shape the regulations so that consumer rights and safety are protected before someone gets injured or killed.