Has Apple picked the wrong time to make its stand on smartphone encryption?

Apple chief executive Tim Cook.

Apple Inc. is refusing on principle to help federal investigators obtain locked data files from an iPhone used by a mass murderer. But has Apple chief executive Tim Cook chosen the wrong hill to die on?

While many technologists and privacy activists back Apple’s bold stand, at least one computer scientist says the company should comply in this case, while continuing to fight other efforts to weaken smartphone security.

“I do think that Apple should go along with this,” said Clifford Neuman, director of the Center for Computer Systems Security at the University of Southern California, “but it’s based on my belief that doing so does not further compromise the security of other phones.”

US Senator Edward Markey, Democrat of Massachusetts, shares Neuman’s view. “Apple should try to find a way to work with law enforcement in terrorist cases, and it should work in a way that helps to get the information off of this phone without compromising the privacy of every other iPhone in the United States,” said Markey, a longtime Internet privacy advocate.

On Tuesday, a federal magistrate judge ordered Apple to assist the FBI in cracking encrypted files stored in a phone used by Syed Rizwan Farook, who along with his wife murdered 14 people in a terrorist attack in San Bernardino, Calif., in December. The FBI hopes the files will provide additional information about whether Farook and his wife were aided by accomplices, in the United States or abroad.

The next day, in an open letter on the company’s Internet site, Cook vowed to fight the court order, saying it would amount to creating a “backdoor” to the iPhone that would make it easier for criminals to spy on iPhone users and steal their most sensitive personal data. “The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals,” Cook said.

Since 2014, Apple has boasted that its newest iOS software for the iPhone is so secure that even Apple can’t unlock a passcode-protected iPhone to decrypt any stored files. But Cook, in his letter, did not dispute that the company could write software that would provide access to the encrypted data.

Neuman argues that the ability to modify the phone’s software in this fashion is itself a backdoor that has lurked inside Apple phones for years. “It wasn’t put there intentionally by Apple,” he said. “At least I don’t think it was put there intentionally by Apple. But it does exist.”

The FBI, Neuman said, simply wants to exploit a weakness that’s already there.

But the agency can’t do that without Apple’s help.

Changing an iPhone’s software requires a cryptographic signature from a key held only by Apple. So the FBI wants Apple to develop, and sign, a modified version of the software tailored specifically to Farook’s phone. That modified software would allow the FBI to run its own suite of password-cracking tools against the passcode. Apple wouldn’t crack the phone; it would just make it possible for investigators to do so.
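The gatekeeping mechanism described above can be sketched in a few lines. This is an illustrative stand-in, not Apple’s actual code: real iPhones verify public-key signatures (Apple holds the private key; the device holds only the public key) at each stage of the boot chain, whereas this sketch uses a shared-secret HMAC for simplicity, and the key and firmware strings are invented.

```python
import hmac
import hashlib

# Hypothetical stand-in for Apple's private signing key. On a real device
# this would be an asymmetric key pair, and the phone would hold only the
# public half.
APPLE_SIGNING_KEY = b"hypothetical-apple-signing-key"

def sign_firmware(firmware: bytes) -> bytes:
    """Only the holder of the signing key (Apple) can produce this value."""
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_will_boot(firmware: bytes, signature: bytes) -> bool:
    """The phone refuses to load any software image whose signature
    does not verify against the firmware bytes."""
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"stock iOS image"
modified = b"iOS image with passcode retry limits disabled"

sig = sign_firmware(official)
print(device_will_boot(official, sig))  # True: the signed image loads
print(device_will_boot(modified, sig))  # False: an unsigned change is rejected
```

The point of the sketch is the asymmetry Neuman describes: anyone can write modified software, but only Apple can make the phone accept it, which is why the FBI cannot proceed without the company’s cooperation.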

Neuman said that Apple could cooperate to this extent without fatally compromising its commitment to user privacy. Meantime, the company could ensure that newer versions of the iPhone could not be accessed in the same manner. “If they are making this claim that not even Apple has access to this data,” Neuman said, “they need to follow through on that claim.”

However, federal and state politicians are calling for laws that would mandate backdoors in smartphones. New York and California lawmakers have introduced legislation requiring smartphone makers to be able to unlock data stored on their devices when presented with a court order. And The Wall Street Journal reported Thursday that US Senator Richard Burr, a North Carolina Republican who chairs the Senate Intelligence Committee, intends to offer a bill that would subject companies to criminal penalties for failing to comply with a court order to decrypt data.

“As a technologist, I can tell you that those proposals are just the wrong thing to do,” Neuman said. While he thinks Apple should cooperate in the Farook case, he agrees with the company that adding backdoors on purpose is a big mistake, because criminals, spies, and even terrorists will inevitably find ways to exploit them.

Bruce Schneier, a cryptographer and a fellow at the Berkman Center for Internet and Society at Harvard University, agreed that Apple left itself open to the FBI’s demand by failing to completely lock down its iPhone software. But he said Apple should still resist the court order, to avoid setting a dangerous precedent.

“Does the FBI have the right to dictate the level of security that information security companies provide to their users and customers?” Schneier said. “I hope not.”

Hiawatha Bray is a technology reporter for the Boston Globe. E-mail him at h_bray@globe.com.