
All Watched Over by Machines of Loving Grace

Quite a furor surrounded the recent death of Elaine Herzberg in Tempe, Arizona, her body broken by a ton of steel and glass, processors and sensors. The most perplexing aspect of the tragedy, though, was the subsequent frenzy of legislative action, an outpouring of posturing that revealed, if nothing else, the deep denial we as a society are in regarding the dangers that we have created, and ultimately our own place in that society, ethically and functionally.

To wit, in 2016, 40,200 people died as a result of the operation of motor vehicles, a number that has held fairly steady year over year, and 2018 will hardly prove an exception. Ms. Herzberg’s death was unique among them in that she was killed by an autonomous, self-driving vehicle, part of Uber’s test fleet. It was unique in that, for the thousands of times that reporting about vehicle fatalities has invariably stated that the victim was struck and killed by “a car” rather than “a driver,” this time it was actually true.

As a result, while Gov. Mark Dayton’s task force studies the future of autonomous vehicles in Minnesota, state Senator Jim Abeler (R-Anoka) has sponsored a bill to indefinitely ban the vehicles in the state. “Arizona confirmed my concerns,” Abeler said of Ms. Herzberg’s death. “I’ve been hearing about this and am very worried about it. And very frankly, the idea of driving home while you ride in the back seat is just a recipe for trouble.” The nonsensical second part of his statement aside (there are no self-driving vehicles that allow the operator to lounge in the back seat), it raises the question of what Mr. Abeler’s concerns might be. To explore them, a brief review of the incident is in order.

Ms. Herzberg was struck on a 35-mph street. The vehicle was traveling at 38 mph in fully autonomous mode, meaning that the machine was in control. Video footage from the vehicle’s forward camera showed that she was walking her bike, and moved quite suddenly in front of the vehicle prior to being struck. She died of her injuries later in the hospital. Sylvia Moir, the police chief of Tempe, told the San Francisco Chronicle: “It’s very clear it would have been difficult to avoid this collision in any kind of mode,” a summation perfectly congruent with the oft-repeated refrain of drivers that the pedestrian-victim jumped out so suddenly there was no way to stop.

All of which is to say that, at this point, the machine is no less capable than a human. If there is any fault, it is that the site of Ms. Herzberg’s tragedy is just another street in America on which pedestrians are openly exposed to vehicle traffic and the speed limit has been set at greater than 25 mph. Aside from that, the sensors and hardware of the vehicle clearly either malfunctioned or were inadequate to the task, but engineering can fix that rather quickly. What engineering can’t fix is the cognitive limitations of humans when they operate a vehicle around pedestrians at greater than 25 mph, or the decisions they make when they are angry, impatient, distracted, or subject to any number of other emotional simian shortcomings.

So. No legislators, certainly not Mr. Abeler, have moved quickly to lower speed limits, or to pursue any of the legal re-engineering of our roads that might actually have saved Ms. Herzberg in this incident. The data is incontrovertible that it would have saved thousands of the 40,000 who perish annually. But this death, and this one alone, seems to have confirmed some nebulous suspicions, enough to prompt an outpouring of legislative action aimed solely at robot vehicles. The concern that was confirmed was not, for a variety of reasons, the fact that pedestrian exposure to vehicles moving in excess of 25 mph has much greater lethal potential. No one seems to have asked: would the vehicle’s sensors have been adequate to the task at 25 mph? Perhaps, with humans and machines alike, we should act promptly to move our municipal streets in that direction.

Why don’t we? Why propose bans rather than work toward something that would actually save many of the 40,000 who die each year, something of import and consequence? Maybe it is because we know, deep inside, the singular fact that is indisputable yet not fully apprehended: that there is almost no task we humans do that a machine can’t do better. The few tasks remaining will fall in time to algorithms and indefatigable strength. And so we cling to our delusions and our flawed notions about what constitutes worth in a human being. Things like work and “free will,” which encompasses the supposed freedom to drive how we like, and to do it well, because we are human.

New York livery driver Doug Schifter understood with cutting clarity the superiority of the machines, and the cruel hopelessness of a world in which we have determined that work is the value of a man, and where we have accepted that machines are better than some, but not all. He understood it with the clear vision of a shotgun rending his skull as he pulled the trigger. Doug Schifter knew.

If machines can do the simple and ubiquitous task of driving better than we do – the task around which we have built our cities – doesn’t that mean that they actually are better than all of us, not just poor Doug Schifter? Here’s the secret: like almost everything else, machines can, and this might be something for pedestrians and cyclists alike to celebrate.

Despite the flawed deontological roots of Isaac Asimov’s “Three Laws of Robotics,” and the frightening and entertaining breakdowns in the laws that occur in his fiction, imagine for a moment a world of robot cars in which other road users are kept “three laws safe.”

All that remains, at that point, is to engineer our cities to interact harmoniously with the Three Laws. Sensors, for example, are better than the 10-degree focal vision and inattentiveness of humans, and the processor is so much faster than our brains; what limits the machine is the kinetic and thermodynamic capacity of its braking system. In other words, ensuring that the vehicle is slow enough to stop in time, a goal that we should be working toward with human drivers as well.
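That braking limit can be made concrete with standard kinematics. The sketch below computes total stopping distance as reaction distance plus braking distance; the reaction latency and deceleration values are round-number illustrative assumptions, not measurements from the Uber vehicle or any real system.

```python
# Stopping distance = reaction distance + braking distance:
#   d = v * t_react + v^2 / (2 * a)
# t_react (perception/actuation latency) and a (deceleration) are
# illustrative assumptions, not measured vehicle data.

def stopping_distance_m(speed_mph, t_react_s=0.5, decel_ms2=7.0):
    """Total distance in meters to stop from speed_mph."""
    v = speed_mph * 0.44704          # convert mph to m/s
    reaction = v * t_react_s         # distance covered before braking begins
    braking = v ** 2 / (2 * decel_ms2)  # distance covered while decelerating
    return reaction + braking

for mph in (25, 35, 38):
    print(f"{mph} mph -> {stopping_distance_m(mph):.1f} m to stop")
```

Because braking distance grows with the square of speed, the car at 38 mph needs roughly twice the road to stop that it would at 25 mph, which is the physical substance of the argument for lower urban limits regardless of who, or what, is driving.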

Humans, wretched self-entitled animals as they become behind the wheels of their impervious and powerful weapons, will of course violate the speed limit. They will act with murderous aggression over being slowed down. They will perpetrate any number of ethical transgressions. But a robot?

Leaving aside the 38 mph the car was traveling when it struck Ms. Herzberg, which might have been simply a momentary deviation: properly programmed with something akin to the Three Laws, a robot will never willfully break the speed limit. If the limit is 35 mph, or 45 mph, it will travel at that speed (that is, follow the Second Law) unless it possesses some higher-order understanding of the hazards. But it will also drive 25 mph if that is the law.

The machine will never pass me dangerously on my bicycle. It will wait, with its graceful, eternal, robotic patience behind me while I pedal, until it is Three Laws Safe to pass, all while the human inside dissolves into inchoate rage. If it does strike someone (perhaps even if its human operator does), it might even stop, under any of the Three Laws, to report and record the incident and see that medical attention is dispatched to the scene. It will react with superhuman alacrity and precision when a child chases a ball into the street. It will never strike out in anger. It will never rev its engine and try to muscle my 4-year-old out of a crosswalk as we stroll to the store. The robot is literally worlds different from a human being, in that it is essentially decent.

Michael Daigh

About Michael Daigh

You might have seen Michael Daigh riding his bike around the Twin Cities metro. He resides in St. Paul, but only since 2015, so his opinions don't count. Michael holds an MA in History and is the author of the book "John Brown in Memory and Myth." He is also a decorated fighter pilot.