The Limits to Apple's Security Improvements


Apple wants to compete on privacy. The phonemaker’s announcement on Wednesday that it won’t be able to unlock iPhones in response to law enforcement requests is an attempt both to ease fears over government snooping and to take a dig at its data-hungry rival Google. Security experts were quick to praise Apple for taking this step while noting some limits to its powers.

To review: Apple’s iOS 8 operating system, which became available this week, encrypts photographs, messages, e-mail, contacts, call history, and other data, locking it all with a user passcode. “Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data,” Apple posted on its website. “So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”

This is the latest in a string of post-PRISM moves by Silicon Valley companies to build technological barriers that keep them from accessing user data; if the company can’t get at files, the government can’t force it to hand them over. Google quickly said Android would follow Apple in encrypting phones by default, although not all Android users will be covered soon, because a much higher percentage of Android phones run older versions of the software. Both Yahoo! and Google are working on end-to-end encryption for e-mail to prevent e-mail providers from accessing the contents of those messages (although potentially valuable information about whom people are communicating with remains accessible). Some companies, such as CloudFlare, do the same thing for cloud storage.

Apple’s move will make its devices more secure while leaving users exposed in a few key ways. First off, just because Apple won’t open a device for law enforcement doesn’t mean that officers can’t open it themselves. “What Apple has done here is create for themselves plausible deniability in what they will do for law enforcement,” wrote security expert Jonathan Zdziarski on his blog. “While it’s technically possible to brute force a PIN code, that doesn’t mean it’s technically feasible, and thus lets Apple off the hook in terms of legal obligation.”

A “brute force” attack is one in which the hacker simply enters every possible passcode, which isn’t unthinkable for four-digit codes. To protect against this, Apple offers a security option whereby a phone’s data is wiped clean after a certain number of unsuccessful attempts. Zdziarski has found an additional vulnerability, also covered by Wired on Thursday: Because Apple has designed iPhones to communicate with users’ computers even when they’re locked, anyone with access to both the phone and the computer could get into the phone. This works only if the phone has been on since its last unlock, because Apple requires the passcode to be reentered when the device boots. Zdziarski sees this as a potentially useful tactic for airport security, and recommends that people who are concerned about it turn off their iPhones before going through security.
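To see why a four-digit passcode offers so little protection on its own, consider that there are only 10,000 possible combinations. The sketch below is purely illustrative: the target PIN and the `check` function are hypothetical stand-ins, and real devices add per-attempt delays and the optional wipe-after-ten-failures setting described above.

```python
# Illustrative only: a four-digit PIN has just 10,000 possibilities,
# so exhaustively guessing it is trivial absent rate limits or wiping.

SECRET_PIN = "7294"  # hypothetical target PIN

def check(guess: str) -> bool:
    """Stand-in for a device's passcode check."""
    return guess == SECRET_PIN

def brute_force() -> tuple[str, int]:
    """Try all 10,000 four-digit combinations in order."""
    for attempt, n in enumerate(range(10_000), start=1):
        guess = f"{n:04d}"  # zero-padded: "0000" through "9999"
        if check(guess):
            return guess, attempt
    raise ValueError("PIN not found")

pin, attempts = brute_force()
print(pin, attempts)
```

This is why the wipe-after-N-failures option matters: it caps the attacker at a handful of guesses out of the 10,000 needed to cover the space.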

A wider issue is that Apple is protecting only a certain portion of the information tied to the device. Some sensitive data, such as call logs, are also kept by wireless carriers, and a great deal is stored on Apple’s servers through its iCloud services, as the world was reminded recently when hackers stole nude photographs of celebrities. Users can manually bar their phones from sending information to iCloud.

Apple could put technical measures in place to protect cloud data in the same way it has with its devices. Such a move would be ideal, says Hanni Fakhoury of the Electronic Frontier Foundation. “There’s some funny tension there with the cloud issue,” he says. “Increasingly, the device isn’t where the data is being stored anymore.”
