You probably heard that last week, a woman was killed in Phoenix by a driverless car. In 2016, 37,461 people were killed on US roads, which is about 100 per day. So if that day was typical (and the rate probably varies by weekday versus weekend, at least), then about 100 people were also killed that day by cars with human drivers.
As I write this, the precise details of what happened are not clear. The police in Phoenix have already said that the accident was probably not the driverless car's fault. Other reports seem to imply it was a complicated junction that the software wasn't prepared for. When I first heard about it, it sounded like a "cyclist" was killed. But it was a homeless woman, at night, with a lot of bags on a bike she was using as transport. She was pushing the bike, not riding it. She was not crossing at a crosswalk. She apparently stepped into the road so suddenly that the first the safety driver knew of her presence was when the car hit her. There is video, so you can make up your own mind. Presumably, they will test whether the woman had alcohol or drugs in her blood.
By the way, the NHTSA definition of "alcohol-related" is if anyone involved, whether driver, non-driver, or passenger, has been drinking or is believed to have been drinking. I think they do this since it makes the drunk-driving statistics look more dramatic, because the casual reader assumes "alcohol-related" means "alcohol-caused". But if a drunk pedestrian is hit by a sober driver, or a drunk driver is rear-ended by a sober driver, or a passenger has been drinking, it all counts as an alcohol-related accident. NHTSA has been criticized for this since, at the least, it makes the statistics misleading.
Naturally, Uber has suspended their testing of autonomous vehicles until what happened is clear, and some of the other companies with driverless cars have done the same.
Anyway, my point is not to blame or exonerate the Uber vehicle in this particular case. At some point, someone will be killed by an autonomous vehicle due to a bug or malfunction, whether or not this woman's death was that accident.
My point is a different one.
On average, 100 people are killed on the roads in the US every day. In round numbers, roughly a million people are killed on the roads around the world per year (1.25 million in 2013, according to the WHO). In the US, this works out to 12.5 deaths per billion miles driven.

The French economist Frédéric Bastiat famously distinguished between the seen and the unseen. When the baker's window is broken, the seen is the work created for the glazier. The unseen is the meat the butcher didn't sell to the baker, because the baker had spent his money on glass. In the same way, the seen are the people killed by autonomous cars. The unseen are the people killed by non-autonomous cars. If all the autonomous vehicle programs are suspended for a period due to the death in Phoenix, I think it is likely that at least one more person will be killed by a human driver due to all the extra miles driven by regular Uber vehicles in the meantime.

I've seen an analysis showing that many additional people died on the roads in the months after 9/11, when the TSA made airline travel very inconvenient, because so many people drove instead of flying. Driving is much more dangerous than flying by any measure. In 2017, there were no commercial passenger jet deaths worldwide, so you can't even work out just how much more dangerous driving is than flying without dividing by zero (it seems there were some non-jet and non-commercial accidents).
If autonomous vehicles reach the point that they "only" kill 10 people per day (scaled up to the number of vehicles on the road), that would be a reduction of 90% in traffic fatalities. If some technology, like magic airbags, could reduce deaths that much, it would immediately be mandated. We mandated those extra little red brake lights in the rear windows of cars because they were shown to reduce rear-end collisions by 10%, most of which are fender-benders in which nobody is injured, let alone killed.
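As a sanity check, the arithmetic above fits in a few lines of Python. Note that the total US vehicle-miles figure is my assumption, back-calculated from the quoted 12.5-deaths-per-billion-miles rate; it does not appear in the original statistics.

```python
# Back-of-envelope check of the fatality figures quoted above.

us_deaths_2016 = 37_461           # NHTSA, US road deaths in 2016
per_day = us_deaths_2016 / 365    # roughly 100 per day

# Assumption: ~3.0 trillion US vehicle-miles in 2016, inferred from
# the quoted rate of 12.5 deaths per billion miles.
miles_driven = 3.0e12
per_billion_miles = us_deaths_2016 / (miles_driven / 1e9)

# A 90% reduction ("only" 10 deaths per day) would still leave
# thousands of deaths per year.
reduced_per_year = round(per_day * 0.1 * 365)

print(f"{per_day:.1f} deaths/day, {per_billion_miles:.1f} per billion miles")
print(f"after a 90% reduction: about {reduced_per_year} deaths/year")
```

Even the optimistic 90%-better scenario leaves a body count in the thousands, which is exactly why the liability question below matters.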
My point is that if we don't let autonomous vehicles on the road until nobody is killed, then the seen is the people saved by the regulation. The unseen is the 100 people killed every day by normal driving. The US, in particular, is a very litigious society. Yet, except in unusual circumstances, people who cause a fatal accident do not get sued, partly because it is accepted that some accidents will happen, and partly because even insured people don't have deep enough pockets to pay the sort of multi-million-dollar awards that attract trial lawyers. Autonomous cars are not like that, and Waymo and Ford and Tesla risk getting sued since they do have real money. Oh, and it turns out that about half of all fatal accidents are single-vehicle, such as driving into a tree or rolling over.
The risk is that lawsuits drive (!) autonomous vehicles off the road, or postpone their introduction by years, condemning thousands of people to death.
Something similar happened to the vaccine industry. There is not a lot of money in vaccines, since you only need a given vaccine once in your life (and there are only around 4M kids of any given age). The result was that if one in a million kids had a complication, sued, and won a big settlement, then the entire vaccine industry was not viable. As a result, companies withdrew from providing them. However, like autonomous cars, a lot of the good things about vaccines accrue to other people (people not killed in epidemics, pedestrians not killed by driverless cars). Economists call this a positive externality. Since not having any vaccines for US children would be a bad thing, in 1986 the US set up the National Vaccine Injury Compensation Program, funded by a 75¢ surcharge on every dose of vaccine. There is a "vaccine court" (officially The Office of Special Masters of the U.S. Court of Federal Claims, which sounds more like something out of Game of Thrones) which administers compensation. The thing that makes it all work is that it is mandatory: you cannot sue vaccine makers, you have to go through the compensation scheme.
I think that we may need to set up something similar for autonomous vehicles. We want to reduce the nearly 40,000 people killed every year on US roads. If we could instantly reduce it to zero, then there would be no need for such a scheme. But if we can reduce it to 10,000 people per year, that is such an enormous reduction that it should be something to celebrate. Yet those thousands of deaths, and thousands of lawsuits, and thousands of multi-million-dollar awards would otherwise risk making autonomous vehicles as financially non-viable as vaccines were becoming, and leave us stuck with non-autonomous vehicles. And 40,000 deaths per year.
Any economist will tell you that the problem with positive externalities is that they are undersupplied; this is a form of market failure that should be addressed (negative externalities, such as pollution, are oversupplied, which we typically fix with regulation). Autonomous vehicles are an improvement on what we have once they are safer than humans, even if they are not perfect. We need to make sure that we don't prevent their introduction.
The 2017 Turing Award (often casually called the Nobel Prize of computer science) has been awarded to John Hennessy and David Patterson, who wrote the book on computer architecture (literally) and pioneered the concept of the RISC (Reduced Instruction Set Computer). I'll say more about it in an upcoming post. But while you are waiting, see my post on Dave Patterson's 50 Years of Computer Architecture.
Sign up for Sunday Brunch, the weekly Breakfast Bytes email.