Why People Want Less Choice
Andrew McAfee
Posted on Harvard Business Review: November 11, 2010 1:25 PM
I write and talk a lot about Enterprise 2.0 and the new cornucopia of technologies that let people come together, interact, and collaborate with virtually no preconditions. No workflows, no set division of labor or roles and responsibilities, no concept of hierarchy. It would be easy to read my book or listen to a few of my talks and come away convinced that I'm some kind of technohippie, dedicated to maximizing human choice, freedom, and autonomy with digital tools.
So I want to set the record straight on two things. The first is that some of my favorite technologies are ones that take choices away from people. I wrote a couple of posts a while back about how today's brilliant technologists have learned the difficult art of silencing the bells and whistles—leaving apparently important features out of their offerings.
It turns out that we often like having fewer choices instead of more, even though when asked we'll always express a preference for more, more, more. Choice can be confusing, paralyzing, and worse than unsatisfying—it can be dissatisfying. Stripped-down tech products can be hugely popular, as Apple keeps demonstrating to us. As Farhad Manjoo recently reminded us, lots of people predicted the iPad would be a big flop because it didn't do enough. Apple's on track to sell 13 million of them by the end of the year.
In addition to offering more or fewer choices, technologies can offer their users more or less autonomy, or freedom to act as they wish without constraints. While I like the tools of Enterprise 2.0, I also like many enterprise applications that take autonomy away from their users, slotting them into pre-defined roles within standardized business processes. Well-designed ERP, supply chain, and procurement systems bring order and consistency even to large and distributed companies. This can be hugely beneficial, and autonomy-reducing enterprise systems have proved their value over the years.
But their users must hate them, right? After all, don't people value autonomy and self-determination above almost all else? In many if not most cases, they probably do. But the second thing I'd like to be clear on is that people can get a lot of value from having their autonomy taken away, and from technologies that put constraints on their behavior. Web-based commitment contracts show this vividly.
Commitment contracts are just what they sound like. They document a person's commitment to, for example, stop smoking, lose weight, or save more money. They typically specify goals and timelines ("I'm going to lose 15 pounds within 15 weeks") and penalties ("If I don't lose enough weight, I forfeit $500").
The big problem with writing commitment contracts is enforcing them. Careful studies have shown that commitment contracts can be quite effective when participants know up front that the enforcement mechanisms are ironclad, but building such mechanisms has historically been difficult, time-consuming, and expensive. A web-based service called stickK, launched in 2008, makes it easy to set up an ironclad commitment contract.
The economist Ian Ayres, one of the site's founders, used stickK to set up a $500 commitment contract specifying that he would lose a pound a week for twenty weeks. He gave the site this money up front and designated a referee who would tell stickK each week whether or not he'd met his goal. If he hadn't, a $500 penalty would be automatically sent by stickK to a person or charity designated by Ayres; he had no way to stop this from happening. At the end of the 20-week period, he would receive back from the site his original money, minus all penalties sent.
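The lifecycle described above — deposit a stake, let an independent referee report each week, deduct penalties automatically, and settle at the end — can be sketched as a small model. This is purely illustrative: the class, its field names, and the per-week penalty amount are my own inventions, not stickK's actual implementation or API.

```python
class CommitmentContract:
    """A toy model of a stickK-style commitment contract."""

    def __init__(self, stake, weeks, penalty_per_failure, penalty_recipient):
        self.stake = stake                  # money deposited up front
        self.weeks = weeks                  # length of the contract
        self.penalty = penalty_per_failure  # forfeited on each failed week
        self.recipient = penalty_recipient  # e.g. a person or "anti-charity"
        self.forfeited = 0

    def referee_report(self, goal_met):
        """Each week the referee (not the member) reports success or failure.
        On failure the penalty is deducted automatically; the member has no
        way to intervene. Total forfeiture is capped at the stake."""
        if not goal_met:
            self.forfeited = min(self.stake, self.forfeited + self.penalty)

    def settle(self):
        """At the end of the period, return the stake minus all penalties."""
        return self.stake - self.forfeited


# Hypothetical example: a $500 stake over 20 weeks, losing $25 per failed
# week, with two failed weeks (weeks 3 and 7).
contract = CommitmentContract(stake=500, weeks=20,
                              penalty_per_failure=25,
                              penalty_recipient="anti-charity")
for week in range(contract.weeks):
    contract.referee_report(goal_met=(week not in (3, 7)))

refund = contract.settle()  # 500 - 2 * 25 = 450
```

The key design point the sketch captures is that enforcement sits outside the member's control: only the referee's reports change the contract's state, which is what makes the commitment credible.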
Part of the site's brilliance is that it lets members like Ayres designate 'anti-charities' and people they dislike as penalty recipients. This springs from the insight that a staunch gun control advocate might well be more likely to lose weight as specified in the contract if he knew that failure to do so would result in his money being sent to the National Rifle Association.
Ayres steadily lost at least a pound a week, then set up another similar contract on stickK to keep his weight below 185 pounds. More than 30 weeks later, it still was. Given how difficult it is for most dieters to maintain their initial losses, this is an impressive achievement. As he wrote of his experience, "For me, it's been surprisingly easy—five hundred bucks is a lot of money. And while the prospect of losing 25 pounds is daunting, it's not that hard to lose a pound a week when the alternative is to lose $500."
My conclusion from these examples is that it is easy, tempting, and wrong to make blanket statements about psychology and technology, about what we humans "always" want our tools to do. Innovations often occur when someone realizes that the standard story is too simplistic, and offers a technology that goes against the conventional wisdom.
What do you think? What counterintuitive technology success stories are you familiar with? Leave a comment, please, and let us know.