Don't Mess With Encryption

Just leave it alone.

Spies and their adversaries have long used technology to obscure their messages, with everything from invisible ink to the Enigma machine to "sensitive compartmented information facilities." They've never been entirely successful. But the fear of unbreakable codes still permeates the world's intelligence agencies.

Strong encryption is their latest concern. Big technology companies, worried about hackers and government surveillance, are starting to encrypt their devices and services more aggressively, often by default. In what's known as end-to-end encryption, messages and other shared data are legible only to the sender and recipient -- not to the companies themselves, or to government spooks who want access to them.
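To make the idea concrete, here is a minimal sketch of how end-to-end encryption works in principle. It uses the open-source PyNaCl library purely as an illustration (the names and message are hypothetical, not drawn from any particular company's product): each correspondent holds a private key that never leaves his or her device, and the service relaying the message sees only scrambled bytes.

    from nacl.public import PrivateKey, Box

    # Each correspondent generates a key pair; the private half never leaves the device.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts with her private key and Bob's public key.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

    # Only Bob can decrypt. The company carrying the message handles only ciphertext.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    assert plaintext == b"meet at noon"

A government "back door" would mean adding a third key, held in escrow, that can also unlock every such message; that extra key is the new point of failure the critics below are worried about.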

Alarmed by this, U.S. intelligence officials want companies to be more helpful. One idea is to create special keys or "back doors" that would enable the government, with a warrant, to access encrypted data when investigating terrorists and other criminals.

On the surface, this seems reasonable: Letting terrorists communicate in ways no investigator can read sounds like an invitation to mayhem. And there's a long-established legal basis for demanding that telecom companies allow law enforcement access to their technology.

But dig a little deeper, and it's clear that the risks of such an effort would substantially outweigh the potential benefits.

Most pertinently, providing special access for governments could weaken digital security for everyone else. As a group of top computer scientists recently warned, it would undo security improvements that have helped ward off hackers, make systems far more complex and thus more vulnerable, and create enticing new targets for cybercriminals. The result could be an increase in fraud, identity theft and other crimes. A back door for cops can be opened by robbers, too.

Creating back doors could also abet the efforts of certain countries to steal intellectual property. It could impose major expenses on U.S. tech companies, provide a boost for their overseas competitors, and slow domestic economic growth. And it would make it awfully hard for the U.S. to continue criticizing other governments for doing the same thing.

For all that, it wouldn't stop terrorists from communicating in secret. Islamic State doesn't need to download WhatsApp to take advantage of encryption; open-source versions are easily found online. And nothing would stop foreign companies from offering such services. The one sure outcome would be that terrorists would stop using American messaging apps -- if they haven't already -- thus rendering the exercise largely moot.

That's one reason, among many, that cryptographers and security researchers have almost unanimously opposed this approach. They argue that even a brilliantly designed system that could minimize harm to law-abiding users -- no guarantee in government work -- would almost certainly not be worth the costs.

And they're not alone. In a recent Washington Post op-ed, a group of former intelligence grandees -- Mike McConnell, Michael Chertoff and William Lynn -- warned that such a requirement could undermine not only digital security but American moral authority. Despite similar worries about encryption in the past, they noted, intelligence agencies have always found ways around barriers to their eavesdropping -- and will surely do so again.

Perhaps the NSA's engineers will perfect the quantum computer they've been working on. Perhaps the U.S. espionage colossus, with its $70 billion annual budget, will come up with something cleverer still. American ingenuity, in the end, will surely be a better bet than degrading a promising technology. 
