The Demon in the Machine

By James A. Bacon

On Oct. 25, 2013, Chris Urmson, a leader of Google’s autonomous car project, proclaimed that legal and regulatory problems posed no major barrier to the commercialization of self-driving cars (SDCs). When accidents did occur, he told attendees of the RoboBusiness conference in Santa Clara, Calif., data collected by the cars would provide an accurate picture of exactly who was responsible. He shared data from a Google car that had been rear-ended by another driver. The annotated map of the car’s surroundings clearly indicated that it had halted smoothly before being struck by the other vehicle.

“We don’t have to rely on eyewitnesses that can’t be trusted as to what happened—we actually have the data,” Urmson said. “The guy around us wasn’t paying enough attention. The data will set you free.”

The very same day, Toyota settled a case in which an Oklahoma City jury had awarded $3 million for a 2005 incident in which a Camry driven by 76-year-old Jean Bookout had accelerated out of control. Bookout had said she tried to use the foot brake and emergency brake to no avail. Toyota lawyers had argued that she must have hit the gas instead. At issue was the performance of an electronic throttle control system that replaced the mechanical links between the accelerator pedal and the throttle found in older models. The jury sided with Bookout, concluding that the electronic throttle was flawed.

Google may have data on its side, but accident victims sometimes have judges and juries on theirs. Toyota had won all previous unintended-acceleration cases, and an exhaustive study by the National Highway Traffic Safety Administration could find no flaw in the electronic throttle system’s computer code, but the judge instructed the Oklahoma City jury that it could find a product defective even if no defect could be identified.

“It opened the floodgates,” says Chris Spencer, a Richmond, Va., attorney who has represented automobile manufacturers in hundreds of cases, including dozens that have gone to trial and reached a jury verdict. “All a lawyer has to do is get his client to say, ‘I did nothing wrong but something went wrong – it must have been the vehicle’s fault.’”

(Cross posted from the Datamorphosis blog.)

Few would dispute that SDCs are safer overall than human drivers. Few would contradict the view that accidents, injuries and fatalities will decline as SDCs make up an increasing share of the automobiles on the road. But the very thing that makes cars safer – the increasing use of software-controlled electronics – may make car makers more vulnerable to lawsuits. Without significant updates to tort law, the inability of auto manufacturers to prove a negative – that their software is not flawed – could hinder the adoption of technology that could save thousands of lives every year.

There are roughly six million traffic accidents every year in the United States. Most of the time, humans are at fault. Sometimes mechanical failure or road conditions are to blame. Before the advent of electronic parts, explains Spencer, if the car failed, the failure usually was visible in the form of a blown tire, worn brake or some other damaged part.

But cars’ mechanical systems are giving way to electronic ones. Electronic sensors flash a light if an engine part is nearing failure. Antilock brakes take over when the brakes are about to lock up. The steering wheel, once connected to the front wheels by a mechanical steering column, can now guide the car by means of electric signals. Cars use cameras and lasers to survey the environment around them; road-recognition software detects when the driver is drifting out of the lane and sends a warning. Soon, cars will be communicating with one another, sensing immediately when vehicles ahead are slowing and slowing instantly in response.
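To make the lane-drift example concrete, here is a minimal, purely illustrative sketch in Python of the kind of check such a warning system performs. The data structure, field names and 0.3-meter margin are invented for illustration – this is not any manufacturer’s actual code, and a real system would fuse camera, steering and speed data over time rather than judge a single snapshot.

```python
# Illustrative sketch only: a toy lane-departure warning check.
# All names, thresholds and values are invented for illustration.

from dataclasses import dataclass


@dataclass
class LaneEstimate:
    """Hypothetical output of road-recognition software for one camera frame."""
    offset_m: float       # lateral offset of the car from lane center, in meters
    lane_width_m: float   # estimated lane width, in meters
    turn_signal_on: bool  # a driver signaling a lane change is not warned


def lane_departure_warning(est: LaneEstimate, margin_m: float = 0.3) -> bool:
    """Return True if the car has drifted close enough to a lane edge to warn.

    The car is treated as a point at offset_m from lane center; a warning
    fires when it comes within margin_m of either edge and the driver has
    not signaled an intentional lane change.
    """
    if est.turn_signal_on:
        return False
    half_lane = est.lane_width_m / 2.0
    return abs(est.offset_m) > (half_lane - margin_m)


if __name__ == "__main__":
    # Centered in a 3.7 m lane: no warning.
    print(lane_departure_warning(LaneEstimate(0.0, 3.7, False)))  # False
    # Drifted 1.6 m toward the edge without signaling: warning.
    print(lane_departure_warning(LaneEstimate(1.6, 3.7, False)))  # True
```

Even in a toy like this, the critical choices – how wide a margin, how to treat a missing turn-signal reading – live invisibly in software rather than in a physical part that can be inspected after a crash, which is exactly the evidentiary problem Spencer describes.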

For the most part, that’s a positive trend. Electronic parts don’t get distracted, they don’t get tired, and they see as well on a rainy day as on a bright sunny day. “Cars can sense things faster and better than you can,” says Spencer. “They can respond faster and better than you can. They have all these wonderful features. The general public sees them as a great leap forward.” The problem is, he adds, “If you build it, they will sue.”

The electronic systems in today’s cars are enormously complex. “The most modern jet fighter in the U.S. arsenal, the F-35, is thought to have 18 million lines of code,” says Spencer. “A modern vehicle has 100 million lines of code when you include the infotainment.”

Writing software is an art. One programmer can’t easily tell what another programmer has done. If a car malfunctions, it is exceedingly difficult to find a flaw in the code – and it’s almost impossible to rule out the possibility that a hard-to-identify defect lurks in the code.

There is a rich vein in the popular imagination of renegade computers and demon-possessed machines. “Autonomous machines scare the hell out of us,” Spencer says. Factor in the experience that most Americans have had with computers – unexplained crashes, disappearing files, degrading computer performance, bizarre error messages, viruses and malware, frequent software updates and patches – and it’s not surprising that some jurors do not regard software code as infallible.

Compounding the problem for defense attorneys is the fact that witnesses are not reliable. “Any trial lawyer will tell you that if you put a person on each corner of an intersection and had a crash, you’d get four different stories, six if you counted the drivers,” Spencer quips. “People see, hear and describe the same things very differently. Try to get an accurate statement from a witness of a crash that unfolded in milliseconds” – it’s almost impossible. Yet jurors may believe a little old lady who swears she didn’t push the accelerator when she meant to push the brake.

The Oklahoma case was settled, so the verdict cannot be overturned, and it does not create a binding precedent. But lawyers do look to history to guide them, suggests Spencer, and the case will encourage more lawsuits. While the tort system is good at providing redress in individual cases, it also can stifle and punish innovation. SDCs and other autonomous technologies have the potential to save tens of thousands of lives.

Spencer offers no specifics on how the tort system should be reformed, but Don Howard, a Notre Dame philosophy professor, and Mark P. Mills, a senior fellow at the Manhattan Institute, do. In a recent Wall Street Journal column, they wrote:

The self-driving-car solution is clear. Congress should pre-empt [the National Highway Traffic Safety Administration] and the trial lawyers and pass a National Autonomous Vehicle Injury Act. The Fords and Nissans and Googles and Qualcomms should voluntarily create an Autonomous Vehicle Event Reporting System. And industry players should also create a National Autonomous Vehicle Compensation System.

This advance in automotive safety is too important to leave to the vagaries of the tort system, Spencer argues. “You can’t stop driverless cars. They can do too much good.”