Why Negativity Is Really Awesome
Last March, Rocky King, executive director of Oregon’s health insurance exchange, predicted that Cover Oregon was about to become the pride of the state. “This is an incredible project,” he told the Lund Report, a local health-care news site. “When people see what we’re doing in Oregon compared to Idaho, Maine, or Florida, they’ll be proud of what we’ve been able to accomplish.”
He wasn’t the only one brimming with optimism. “The state of Oregon is leading the nation,” Carolyn Lawson, chief information officer for the Oregon Health Authority, told legislators in 2012. “According to the feds, we’re easily nine months ahead of any other state. We have multiple states that are asking if they can participate with us.”
Cover Oregon has certainly attracted national attention since the exchange went live. The wrong kind. Its online enrollment system still isn’t working, though the exchange did try a partial relaunch on Feb. 18. The state has resorted to paper applications and so far has signed up roughly 35,000 people for private plans. Officials originally projected 217,000. In Washington, there’s talk of investigations into whether Cover Oregon misled federal officials about the project’s progress by demonstrating a mock-up of its nonfunctioning system as if it worked.
The most amazing part of this whole story is that King and Lawson were told this would happen. Reporting by the Oregonian newspaper shows that Maximus, the consulting firm hired to provide oversight, warned as early as November 2011 that the project was already running late and had insufficient management controls. Bob Cummings, a technology-oversight analyst working for the state legislature, informed lawmakers in the spring of 2012 that the odds of the project being ready in time were “not good.”
Republican state legislator Dennis Richardson wrote the governor that Cover Oregon “is now in jeopardy of becoming the next state IT fiasco.” Meanwhile, Maximus continued to issue reports on the management problems crippling the implementation of the exchange. In response, Lawson threatened to stop paying them, according to the Oregonian.
As insane as it seems, it’s possible, even probable, that King and Lawson actually believed all was well. It’s not as if they had much to gain by launching a nonfunctioning website. In late December, just before the state started robocalling applicants to tell them their enrollment probably wouldn’t be processed by Jan. 1, Lawson resigned. Shortly thereafter, King announced his retirement. A Cover Oregon spokesman declined to comment, and King and Lawson could not be reached.
Why did they try to shoot the messenger instead of listening to the message? One answer is that’s what organizations do—especially dysfunctional organizations. As a young IT consultant, I sat through more than one meeting where we, or someone, tried to stop a client from doing something obviously crazy. Usually, the client did the crazy thing anyway, and the someone who had objected went looking for another job.
Doctor No, that grating in-house critic, can be your most valuable employee—if you can make yourself listen. That’s surprisingly hard to do. Organizations exist for the purpose of doing stuff. That’s what their staff is hired to do. The guy who says maybe we shouldn’t do that stuff—or the stuff we’re doing isn’t working—is not very popular. There’s a large body of literature on dissenters, and it mostly tells you what you already know if you’ve ever been to a project meeting: Nobody likes a Negative Nancy.
Talk to turnaround experts, or read a little history, and you’ll hear this story repeatedly: There were people who raised the alarm, and they got shouted down. The night before NASA’s disastrous launch of the space shuttle Challenger in 1986, engineers from Morton-Thiokol urged managers to delay. They were worried that the cold weather forecast for the next day would cause problems with the rubber O-rings that sealed the joints in the solid-fuel booster rockets. “I am appalled by your recommendation,” George Hardy, NASA’s deputy director of science and engineering at the Marshall Space Flight Center, told Thiokol’s engineers, according to the book The Challenger Launch Decision by Diane Vaughan. NASA launched the shuttle on a bright cold January morning. The O-rings failed. The resulting catastrophe killed seven astronauts and could have ended the Space Shuttle program.
Investigations into the disaster showed NASA had fallen prey to what you might call “groupidity,” a special form of groupthink in which we collectively become willing to take risks we individually recognize as stupid—because everybody else in the room seems to think it’s fine. NASA had been noticing unexpected problems with the O-rings for a while. At meetings about that issue, they systematically redefined what they considered risky, and concerns about the O-rings were downplayed.
The problem is that the very characteristics needed to resist groupidity often make you a bad evangelist for the truth. We are social animals, and bucking the group doesn’t come naturally to most of us. Dissenters and whistle-blowers come off a little odd when you read their stories—stiff-necked, naive, and not quite in touch with basic social realities. Thiokol’s whistle-blowing engineer, Roger Boisjoly, filed quixotic lawsuits against his employer after the Challenger disaster and was ostracized by co-workers. Henry Chao, senior IT staffer at the Centers for Medicare and Medicaid Services, the federal agency that operates healthcare.gov, raised alarms about growing problems with the federal Obamacare website. He seemed like the cool voice of reason when his e-mails came to light—until he became inexplicably obstreperous with congressional Republicans.
There’s no denying that sometimes Negative Nancy is the problem, not the solution. The people telling you why your project isn’t working—or can’t work—are often the same people who tell you why nothing will ever work. Since most new ideas fail, these people will often be right. But they’ll often be wrong when it counts the most: when you’re embarking upon a major innovation. They can be intensely irritating to the folks in the group who are trying to make things work.
You don’t want to let the perennial Voice of Doom kill every project. But if you listen carefully to the Voice of Doom, you’ll find he’s giving you something extremely useful: a list of almost everything that can possibly go wrong with your plan. Think of the VOD as your defensive coordinator, identifying all the holes you need to plug, and the backup plans you need to have in place, before you launch. Instead of ostracizing your Doctor Nos and asking them to kindly shut up, why not give them a designated role on the team, telling you what’s likely to go wrong, and then pointing out when it does?
The trick is getting other workers to listen. In September 2004, 60 Minutes aired a bombshell story, anchored by Dan Rather. CBS had documents showing that George W. Bush had been AWOL during his Vietnam-era Air National Guard service. This was the sort of story that could easily change the outcome of a tight presidential race against a bona fide Vietnam veteran.
Unfortunately for the staff of 60 Minutes, it almost immediately became clear that the documents were crude forgeries, obviously created in Microsoft Word, a program that didn’t exist until years after the documents were purportedly written. CBS stuck by its story for nearly two weeks, attacking its critics as ideologically motivated partisans. Virtually all the key players in that broadcast were ultimately fired. It turned out that staffers and outside experts had questioned the documents before the story aired, but the producer ignored them. Even worse, she and Rather shrugged off questions about the provenance of the documents after the story aired.
Almost a decade later, 60 Minutes touched off another firestorm with its broadcast of a purported eyewitness account of the attack on the U.S. compound in Benghazi, Libya, that left Ambassador J. Chris Stevens and three other Americans dead. The segment, presented by correspondent Lara Logan, relied on a key source, ex-security officer Dylan Davies, whose account was later discredited. Only a small team at 60 Minutes was privy to the details of the story. Big mistake. An internal review faulted Logan and her producer for not pulling in the broader resources (is there a Dr. No in the house?) at CBS News to assess the veracity of Davies’s story.