Photo: "Take 39 of these and call me in the ... Oops, never mind." Photographer: Dhiraj Singh/Bloomberg

Blame the Machines

Megan McArdle is a Bloomberg View columnist. She wrote for the Daily Beast, Newsweek, the Atlantic and the Economist and founded the blog Asymmetrical Information. She is the author of "The Up Side of Down: Why Failing Well Is the Key to Success."

In late July 2013, 16-year-old Pablo Garcia, who was in the hospital for a routine colonoscopy to check on his congenital gastrointestinal condition, began complaining of numbness and tingling all over his body. Soon he was having seizures. What caused this strange condition? His medication, it turned out: He'd been given 39 times too much antibiotic. How this occurred is the subject of a fascinating piece on Medium, which I urge you all to read. But if I had to condense its five parts' worth of fascinating insights into one sentence, here's how it would read: "Machines make us stupid."

For example, I spent three months traveling last fall, with only a few weekends in the District of Columbia. By the time I returned, I had forgotten the number of our landline. To be sure, we don't use it very often. Still. We've had that number for five years. I forgot it in less than one football season.

But of course, I no longer need to remember phone numbers. I have a cell phone for that. For any other knowledge I need handy, I have a computer. For anything I need to learn, there's Google. In some sense, this means that I have a better memory and wider knowledge than I used to. But if I'm cut off from these tools, I am suddenly a moron. And if something gets entered into the computer wrong, I'm totally helpless. A few months back, people gently emailed to inquire where I was, as the panel I was on was about to begin. Turned out I'd put it into the calendar on the wrong day, which is pretty easy to do with a slip of the finger. Oops.

These are minor errors, easily made up for by something like the very expensive lunch I bought one of the panel's organizers. But of course, these sorts of mistakes are not limited to minor situations like understaffed public policy panels. Consider the software that was supposed to prevent medical errors and instead contributed to a patient's getting nearly 40 times the intended dose.

There were a lot of human errors that led to that horrific outcome, but here are the two that stand out: alert blindness and excessive trust in the automated system. The software did try to warn both the doctor and the pharmacist that something was wrong with the prescription. Unfortunately, the system had also tried to warn them that something was wrong with a huge fraction of the orders that were entered into the computer. Usually, these problems were trivial, so people got used to glancing at the alerts and dismissing them. Otherwise, the hospital would have ground to a halt as everyone devoted their days to reading software alerts. So when a message came along that wasn't trivial, they didn't read it. Or didn't even truly see it.
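To make the alert-fatigue problem concrete, here's a minimal sketch of the triage a better-designed order-entry system might do: only the most dangerous orders interrupt the clinician, while routine quibbles are quietly logged for later review. Everything here -- the names, the tiers, the thresholds -- is hypothetical, not drawn from any real hospital software.

    # Hypothetical sketch of tiered alerting in an order-entry system.
    # All names and severity tiers are illustrative.
    from dataclasses import dataclass
    from enum import IntEnum

    class Severity(IntEnum):
        INFO = 1      # duplicate-therapy notes, formatting quibbles
        WARNING = 2   # unusual but plausible orders
        CRITICAL = 3  # orders outside any defensible range

    @dataclass
    class Alert:
        message: str
        severity: Severity

    def present_alerts(alerts):
        # Interrupt the clinician only for critical alerts; log the rest.
        # If every alert interrupts, people learn to click through all of
        # them -- the "alert blindness" failure described above.
        for alert in alerts:
            if alert.severity >= Severity.CRITICAL:
                print(f"STOP: {alert.message} (must acknowledge to continue)")
            else:
                print(f"note: {alert.message}")

    present_alerts([
        Alert("Order formatted unusually", Severity.INFO),
        Alert("Dose is 39 times the usual maximum", Severity.CRITICAL),
    ])

The point of a design like this is that a hard stop stays rare enough to mean something.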

But the second, and more important, failure is what happened when the system delivered all those pills to the nurse on the floor. She felt like something was wrong. And then she went ahead and administered them anyway, because the bar code system told her that yes, this is what the doctor prescribed.

Lest that sound like I'm castigating the nurse as some special kind of idiot, let me say that I think she did something idiotic, but idiotic in a way that we're all prone to. There's an eerie authority to an automated system. Most of us find it hard to resist. After all, computers are smarter than we are -- at everything from calculation to beating a grandmaster at chess. It's easy to turn off our judgment and hand the decision over to the machine.

But of course, computers aren't exactly smarter than we are. What they are is really, really good at reliably performing single tasks they've been programmed to handle. What makes humans so successful is that we can land in an entirely unexpected environment and probably stay alive. Our comparative advantage is judgment in unforeseen circumstances: "common sense." So when a computer tells us to do something obviously wacky, where is our common sense?

It might kick in when needed if it got regular use. In the old paper record system, errors were relatively common, so people were more vigilant. With automated systems, mistakes are rarer, so we're less likely to be watching for them when they do occur -- which means the errors that slip through can be even bigger. The old system gave a lot of people the wrong medication, in the wrong dose, but it probably never gave anyone a 39-fold overdose of antibiotics. Unfortunately, judgment atrophies just like a muscle.

It's not clear what the solution is to this problem. The author of that Medium article and a related book, Robert Wachter, suggests a number of insights learned from the aviation industry, which has put a lot of thought into making sure that pilots get the alerts they need, without ever getting so many that they start to ignore them. That's surely a good start. And yet he also reports that pilot training experts are worried about an uptick in accidents when automated systems fail, and pilots, who spend a lot less time actually flying the plane than they used to, lack the judgment to take over. This is a problem that is also bound to afflict driverless cars, whenever they arrive, so let's hope that the experts crack it. But they apparently haven't yet.

However, if we don't have a solution, we do have some good starts. One is to work on better machine design, so that it's harder to make these sorts of errors. But another is to remind people that human brains are better than computers at a lot of things -- so when the computer's instructions seem crazy, you should trust your judgment, not the monomaniacal machine.
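To illustrate that first idea, a safer system might treat a wildly out-of-range dose as something to block outright, not something to warn about. Here is a toy sketch along those lines -- the drug name, the weight, and the per-kilogram ceiling are all invented for illustration, not real clinical values.

    # Hypothetical dose guardrail. The drug, weight, and ceiling are
    # made up for illustration, not real clinical values.
    def check_dose(drug, dose_mg, weight_kg, max_mg_per_kg):
        # Raise an error instead of showing a dismissable alert, so the
        # order can't simply be clicked past.
        if dose_mg > max_mg_per_kg * weight_kg:
            raise ValueError(
                f"{drug}: {dose_mg} mg exceeds {max_mg_per_kg} mg/kg "
                f"for a {weight_kg} kg patient; order blocked"
            )

    try:
        # A 39-fold overdose would be stopped, not merely flagged.
        check_dose("antibiotic-X", dose_mg=7800, weight_kg=40, max_mg_per_kg=10)
    except ValueError as err:
        print(err)

A hard block is a blunt instrument, of course -- which is exactly why it should be reserved for orders no plausible patient could need.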

This column does not necessarily reflect the opinion of Bloomberg View's editorial board or Bloomberg LP, its owners and investors.

To contact the author on this story:
Megan McArdle at mmcardle3@bloomberg.net

To contact the editor on this story:
Philip Gray at philipgray@bloomberg.net