Microsoft's Privacy Czar on the "Trust Model"

Richard Purcell, the man Bill Gates charged with creating security standards, talks about the huge dimensions of the job

By Jane Black

If you're concerned about privacy, you can't ignore Microsoft. The Colossus of Redmond is at the center of setting standards for the way information is used and shared online. Last year, it released a new version of its Internet Explorer browser embedded with the Platform for Privacy Preferences (P3P), a specification that enables the browser to automatically understand a Web site's privacy practices (see BW Online, 12/14/01, Microsoft's Cookie Monster). Then in January, Bill Gates announced the Trustworthy Computing Initiative, a companywide effort to make Microsoft products more secure and privacy-friendly.

Richard Purcell, Microsoft's director of corporate privacy, is the man charged with making it all happen. An avid rock climber, Purcell knows it won't be easy to scale privacy's rough terrain. It's all about trust, Purcell points out. And many consumers, it seems, don't completely trust Microsoft. I recently spoke with Purcell about Microsoft's approach to protecting users' information. Here are edited excerpts from our conversation:

Q: Do you believe that privacy is more at risk than ever before?


It's important right now to understand that privacy issues are not new issues. They're just being talked about in new ways. It's a lot like date rape. Is date rape a brand-new issue? No. It has been going on for centuries. But it's [now] being reported and talked about and disclosed and admitted to -- albeit in an uncomfortable way. But you know what? We're working it out. We're talking about it. Overtly now.

Q: Is there a solution?


I think there are many solutions that are appropriate to different people and different contexts. A solution for your privacy from government is going to be different from the solution for your privacy from commercial enterprises with whom you don't deal, which is different from the one for those with whom you do deal, which is different from your co-workers.

To find those solutions, we all have a role to play. Individuals have to take more charge of their information. Commerce is going to have to get used to having a more open disclosure model, asking for permission, following those permissions more reliably than it has in the past. Government's going to have to figure out what its role is in terms of being open -- but at the same time, not disclosing so much information that it harms its citizens.

Q: It's particularly tough for individuals. I could spend my whole life trying to figure out what's been done with my data, but I have other things to do. How much responsibility should individuals shoulder?


My job is dedicated to transferring control of information back to where I think it belongs -- in the hands of the individual. But that assumes they ever had control. That's not the case. In the offline world, for decades and decades, information has been gathered and shared and used completely outside of the control of the individual. So if I say, "Gee, I really want to turn control back to the individual," what's that going to mean in terms of how much work there is? And what's it going to mean in terms of a person's ability to do it?

Nobody has ever done it before. People have no training in this. How do you know how many companies have your information? How do you know what they're doing with it? How do you know how they're securing it? You don't. So how do you control that? You can't. You wouldn't have a life.

That means, instead of chasing our data, we have to change the trust model. It has to be similar to the way there's a trust relationship between two people.

How is it that I essentially trust another person in some reliable way? It's the vocabulary, the appearance, the references, the recommendations. It's an art, not a science. Our job is to figure out how we can transfer the elements of trust into a machine environment. Your machine needs to know if another machine is trustworthy.

Q: Letting computers make decisions adds convenience -- but it also reduces the user's control.


The computer does the grunt work. Because I can dial up the trust model any way I want, the individual can say, "I'm not real trusting. So I'm going to dial this up and make it really, really hard for people to get stuff." Another person might say, "You know what, who cares? I gave privacy up a while back. I'm a more open person, more adventurous."

So you still get to have your personally defined trust model, just like you do in life.

Q: I understand you're working on a "privacy dashboard." What's that?


Internally, we're using it to change the process of product development. We're instituting a privacy review with multiple steps along the line -- from the design considerations to the functional considerations to the user interface to the transfer protocols.

Privacy considerations are now being integrated into the process. There are stages throughout...where I and other people have the authority to sign off. If a product doesn't meet standards of privacy, security, and reliability, it gets pushed back.

Q: Will there be a privacy "dashboard" for consumers?


One of the things that might result from this is the ability for an individual to use Windows and have a control panel in Windows for privacy. So that any application -- or the operating system itself -- that wants to gather data and send it out of the machine, essentially has to go through a user's preferences. Nothing will happen unless the user has permitted it and is aware of it.

Q: Does this exist yet?


No. It's just an idea of one of the ways that the process we're putting in place for trustworthy computing may result in a product that's substantially different from what we've seen in the past.
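The control panel Purcell imagines amounts to a permission gate: before any application (or the operating system itself) sends data off the machine, the request is checked against the user's stated preferences. A minimal sketch of that gating logic, with every name (`PrivacyPreferences`, `request_send`, the app and category labels) invented for illustration -- this is not a Microsoft design:

```python
# Hypothetical sketch of a per-user privacy "control panel":
# outbound data requests are checked against user preferences,
# and nothing leaves the machine without explicit permission.

from dataclasses import dataclass, field

@dataclass
class PrivacyPreferences:
    # data categories the user has explicitly allowed to be sent out
    allowed_categories: set = field(default_factory=set)
    # applications the user has blocked outright
    blocked_apps: set = field(default_factory=set)

def request_send(prefs: PrivacyPreferences, app: str, category: str) -> bool:
    """Return True only if the user's preferences permit this transfer."""
    if app in prefs.blocked_apps:
        return False
    return category in prefs.allowed_categories

prefs = PrivacyPreferences(allowed_categories={"crash-report"},
                           blocked_apps={"adware.exe"})

print(request_send(prefs, "msn-messenger", "crash-report"))  # True
print(request_send(prefs, "msn-messenger", "contact-list"))  # False
print(request_send(prefs, "adware.exe", "crash-report"))     # False
```

The "dial" Purcell describes earlier would simply widen or narrow `allowed_categories`: a distrustful user keeps the set nearly empty, while a more open one permits most categories by default.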

Q: Institutionalizing privacy in the design process is a massive job, especially at a company as big as Microsoft. How do you approach it?


Well, you have to have a mixed bag of stuff.... There has to be a relevance that has to be articulated. There has to be awareness raised as to the consequences of good behaviors and not good behaviors. There has to be a management directive that says certain behaviors are going to be rewarded. Other behaviors are not only not going to be rewarded but are going to be punished.

You also need tools. For example, we're building a scoring model that's a health index for privacy. Think about when you go to the doctor. They ask: Do you exercise? Do you smoke? Do you have a family history of disease? They go through these kinds of very high-level risk factors that indicate your health index.

We're building a privacy health index that examines the capability of a division of the company to be privacy-healthy. Do you have these processes? Do you have a person in charge? Did you do these reviews on time? Have you gone through the training? Is it part of your performance reviews to do well? When the media exposes a privacy problem in your product, who's on point? How do you handle that?

So once you measure that stuff, then you can get an idea about where an organization is weaker and stronger, where to put resources to fix things. That tool is being put in play right now.
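The "health index" Purcell describes is essentially a weighted checklist aggregated per division. A rough sketch of such a scoring model -- the questions and weights here are invented for illustration, not Microsoft's actual criteria:

```python
# Rough sketch of a privacy "health index": each division answers a
# checklist, and its score is the weighted fraction of healthy answers.
# Questions and weights are illustrative, not Microsoft's real criteria.

CHECKLIST = {
    "has_privacy_process": 3,
    "has_person_in_charge": 2,
    "reviews_done_on_time": 2,
    "staff_trained": 2,
    "in_performance_reviews": 1,
}

def health_index(answers: dict) -> float:
    """Score a division from 0.0 (unhealthy) to 1.0 (healthy)."""
    total = sum(CHECKLIST.values())
    earned = sum(w for q, w in CHECKLIST.items() if answers.get(q, False))
    return earned / total

division = {
    "has_privacy_process": True,
    "has_person_in_charge": True,
    "reviews_done_on_time": False,
    "staff_trained": True,
    "in_performance_reviews": False,
}
print(round(health_index(division), 2))  # 0.7
```

Low-scoring divisions are where, in Purcell's words, you "put resources to fix things."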

Q: In January, Bill Gates sent a memo to staff emphasizing the importance of trustworthy computing. Was it a turning point?


It's Bill's effort to put the focus of the company on one thing.... My job is to take that -- and Bill has given me this job very explicitly -- and transform the company into a different kind of company that produces products and services that are highly reliable, highly dependable, highly manageable, highly safe through security and privacy practices.

Q: Critics would say you're not succeeding. Since the big announcement, a half-dozen security bugs have been found in products such as Internet Explorer and MSN Messenger. How do you explain that?


You can't issue a memo on Jan. 18 and, within two weeks or even two months, have your entire product line consistent with it. Trustworthy computing, as I try to emphasize, is about process change, so that products can then be delivered as a result. And it's a very long-term vision -- 5, 10 years, maybe.

Black covers privacy issues for BusinessWeek Online. Follow her twice-monthly Privacy Matters column, only on BusinessWeek Online.

Edited by Douglas Harbrecht
