There are two parts to this. The first is that to get from point A to point B you have to trust someone: either you drive yourself and trust the craftsmanship of the vehicle you use, or you trust whoever is driving you and the vehicle they use.
The second part, and this is the part I'm going to focus on a bit more, is that I am generally of the impression that the kind of technology developed over the last fifty years to create self-driving vehicles, vehicles that, once set in motion toward their destinations, go the distance, has had exactly one reliable application.
Fire-and-forget weapons.
I have no problem with an AIM-120 being a fire-and-forget air-to-air missile. I expect a self-directing vehicle to literally hit its target destination, because that's what it's built to do.
The idea of adapting such technology to civilian transit scares me, but not because I think it's horrifying that military technology should somehow stain the civilian sphere. We have all sorts of glorious innovations in medical practice, surgery, and so on thanks to military field innovation.
No, the problem is that I already fear human drivers, who are no less likely to kill someone by accident than a driverless car is. But you can sue the human driver of a car for something.
So when I read about the recent death caused by a self-driving car, my initial feeling was a depressed sense of inevitability. Of course someone was going to get killed; it was only a matter of how and under exactly what circumstances.
… On Sunday night, one of Uber’s self-driving cars struck and killed a woman in Tempe, Arizona.
Elaine Herzberg, a 49-year-old woman, was walking her bicycle across a road when a Volvo SUV, outfitted with Uber’s radar technology and in fully autonomous mode, collided with her. The car was traveling at 38 miles per hour in a 35-mile-per-hour zone, and it did not attempt to brake before striking her, according to Tempe police.
It is the first time that a self-driving car, operating in fully autonomous mode, has killed a pedestrian. Sylvia Moir, the police chief of Tempe, announced on Tuesday that Uber was likely not at fault for the collision. But after her department released footage of the collision on Wednesday, transportation experts said it showed a “catastrophic failure” of Uber’s technology.
The two stories did not perform equally in the press. By the middle of the week, the Uber news had drifted off the front pages of The New York Times, The Washington Post, and CNN. It often sat near the middle or bottom of the page on Techmeme, a website that aggregates technology news from dozens of outlets. The Cambridge Analytica story, meanwhile, consistently clanged around above the fold of every outlet. I found myself asking: Why?
Perhaps it’s because people still mostly believe the hype around self-driving cars. This isn’t surprising: I still mostly believe the hype. Statistically speaking, cars of all types are super-ubiquitous, high-speed murder machines. Automobiles kill about 102 Americans every day, according to government data. “Accidents,” a category which includes car crashes, are the fourth leading cause of death in the United States, according to the CDC.
Nor are nondrivers exempt from the carnage. Nearly 6,000 pedestrians were killed by a car in the United States in 2016. Hundreds of cyclists die every year as well.
So maybe the relative lack of coverage of the Uber crash represents a healthy perspective. It suggests, perhaps, that journalists and the public understand the difference between anecdote and data. Sixteen Americans die every day while walking near a street. Most of us never learn their names. What makes Elaine Herzberg different?
Almost any day I commute to and from work in the Puget Sound area, I think about how I could very possibly die that day because of how people drive. You or I are far, far more likely to die at the hands of an inattentive driver than at the hands of someone wielding a gun. Of those wielding guns, I'm more worried that a cop will be trigger-happy than that some random stranger will show up. It's not that I can't imagine someone being murdered by a person with a gun. The year iMonk died was also the year someone I knew from my college years was murdered by a stalker who had been pursuing her for years.
All the same, I am always more afraid of being maimed or killed by some inattentive driver than of being killed by someone with a gun.
A self-driving vehicle for civilian use doesn't appeal to me. At least with the fire-and-forget weapon, the probable death of whatever is on the receiving end is intended. Maybe I'm being old-fashioned and close-minded about the limited applicability of self-driving vehicles, but I think civilian society would be better off leaving this set of technical advances firmly on the side of weapons tech. If there comes a day when I should, God forbid, get killed by a vehicle, I at least want a moment to see whoever it is that hits me, and not some harbinger of a Stunticon attack.
Or maybe just put the Decepticon logo on every self-driving car and people will know to look for it.