In the dim light of dawn, Mohammed Alshamrani burst into a classroom building at the Pensacola naval base in Florida and opened fire, killing three U.S. sailors and wounding eight other people. Alshamrani was mortally wounded by sheriff’s deputies when they arrived at the scene. A second lieutenant in the Royal Saudi Air Force, Alshamrani was among a group of military officers studying at the naval base as part of a program that trains foreign military personnel. The December 2019 shooting has been classified as an act of terrorism.

As part of its investigation, the FBI sought access to the shooter’s encrypted iPhone. It was eager to learn whether Alshamrani acted alone or in concert with other terrorists. If he had accomplices, were other attacks planned? The phone’s data might hold the answers. But Apple refused any substantive assistance, hiding behind a thicket of vague language and hollow protests.

Apple’s Privacy Absolutism

Several years ago, Apple decided to incorporate hyperstrong encryption into its iPhone operating system, beginning with iOS 8. This means that everything on the iPhone—iMessages, e-mail, calendar, contacts, and so on—is locked up tight. The decryption key is linked to the user’s personal passcode. The encryption is seamlessly integrated into the system, so that encryption and decryption happen automatically and by default. Apple also no longer retains any kind of master key to unlock the content on these phones. Only the phone’s user, by means of his or her passcode, has access to this data. The metaphor of “going dark” aptly captures the experience of investigators confronted with such a device.
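
To make this design concrete, consider a minimal sketch of passcode-linked encryption, written in Python with the widely used cryptography library. The passcode, parameters, and sample data here are illustrative assumptions of mine, and the real iOS scheme also entangles the passcode with a hardware key unique to each device, which this sketch omits.

```python
import os
from hashlib import pbkdf2_hmac

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The key that locks the data is derived from the user's passcode.
passcode = b"123456"           # the user's personal passcode (illustrative)
salt = os.urandom(16)          # stored on the device; need not be secret
key = pbkdf2_hmac("sha256", passcode, salt, 600_000)  # 32-byte AES key

# Everything on the device is encrypted under that key, by default.
data = b"iMessages, e-mail, calendar, contacts..."
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, data, None)

# Without the passcode there is no key, and the ciphertext is opaque:
# this is the sense in which investigators "go dark."
assert AESGCM(key).decrypt(nonce, ciphertext, None) == data
```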

The fraught terrain of the Pensacola dispute is familiar ground for Apple. In December 2015, another terrorist, Syed Farook, together with his wife, killed fourteen people and wounded twenty-two others at an office party in San Bernardino. The FBI requested that Apple create encryption-breaking software for that one phone. But Apple demurred, concerned that such software could be manipulated or misappropriated. If that happened, it argued, all encrypted iPhones would become vulnerable to thieves and hackers.

Apple’s principled position also happens to advance its imperial ambitions in the smartphone market. Consumer privacy has emerged as one of the company’s core values. Tim Cook has not been diffident in his critique of the intrusive, ad-supported business models of Facebook and Google. He frequently reminds prospective and current customers that their data is completely safe on the iPhone. Apple’s defiance of the FBI is another potential driver of iPhone sales.

Privacy versus Security

Apple’s high-profile feud with the Department of Justice is ultimately rooted in the sovereign nature of software code. As legal scholars like Larry Lessig pointed out long ago, software code provides powerful regulatory leverage for the companies that design and implement it. Because software code can have the same effect as law, it becomes political. Code, like law, regulates behavior. For example, code can establish the parameters of privacy protection or proscribe some forms of free speech through filtering mechanisms. By adopting unbreakable encryption, with no universal backdoor or exceptional access, Apple gives each of its users a limitless right to restrict access to their data: it inscribes into its operating system an absolute right to privacy. Apple, a de facto regulator, now defines the boundary between privacy and public safety, and its technology is tinged with moral hubris.

But privacy is surely not an absolute right when viewed through the lens of morality or law. The Fourth Amendment of the U.S. Constitution guarantees protection from “unreasonable” searches by the government, but it allows warrants for searches “upon probable cause.” From a moral perspective, it would be exceedingly difficult to make the case that privacy is an absolute right that trumps all others. With very few exceptions—such as the right of the innocent not to be killed as a means or an end—rights are limited by each other and by other aspects of justice and the common good, including public order and security.

The fundamental problem, which we as a society have yet to resolve, is how to deal with the sweeping regulatory effects of software programs. Rather than depend on Congress or the legal system to determine the scope of privacy rights, companies like Apple make unilateral decisions about which values are to be embedded in their code. This typically happens without public debate and without sufficient attention to the common good or to the reasonable laws that limit the unfettered exercise of rights. How can we trust private enterprise to establish software design parameters that are consistent with the common good and moral values?

Bolstered by public opinion, Apple is convinced that it occupies the moral high ground in its recent confrontations with the FBI. It simply refuses to take any action that would imperil the privacy of its users by building a universal backdoor with a master key. Apple contends that it cannot adequately protect a master key. Yet prior to its new policy, it securely possessed a master key that was never compromised or misappropriated. At a minimum, Apple could fully cooperate with law enforcement on an ad hoc basis, but it has offered only limited assistance in both the Pensacola and San Bernardino cases.

Apple may be quite sincere in its commitment to privacy rights, but its technology fails to acknowledge that the right to physical security, which is tied to the right to life and health, can take priority over privacy rights. Who among us would not sacrifice certain personal information to protect ourselves from physical violence that puts our life and health in jeopardy? A person can survive without privacy, but, as John Stuart Mill once observed, “security no human being can possibly do without.” If the infringement of someone’s privacy can prevent loss of life or debilitating injury, that infringement can be justifiable. Moreover, human rights can only be properly exercised within a milieu where people do not live in fear and distress, and are protected against identifiable threats to their physical well-being. The preservation of security is a necessary condition for the exercise of all the other rights, including privacy.

A Plausible Solution?

Is it possible to resolve this polarizing dispute with a reasonable technological compromise? We must acknowledge the risks of universal backdoor solutions, at least as they have been implemented in the past. But there may be other methods of exceptional access whose risks are far smaller and more acceptable.

The technology journalist Steven Levy, a longtime chronicler of the crypto wars, documents the work of Ray Ozzie, who has proposed a novel remedy known as Clear. First, a vendor of mobile devices, such as Apple, generates a pair of complementary keys: one public and one private. The public key is stored in every iPhone, while the corresponding private key remains in a secure vault at Apple. These keys are used to encrypt and decrypt a confidential PIN that each user’s device generates when it is activated. The PIN is encrypted with the vendor’s public key, and it can be decrypted only with the vendor’s private key. If a law enforcement agency such as the FBI has a valid warrant to access the contents of a phone in its possession, it can force the phone into a “recovery mode” that exposes a barcode containing the encrypted PIN. This code is then transmitted to Apple. After confirming the authenticity of the warrant, Apple decrypts the PIN with its private key and returns the decrypted passcode to the FBI, which can then unlock the phone. Once the phone is unlocked, a special chip freezes its contents as an added security measure. Even in the highly unlikely event that a private key were stolen from Apple’s vault, the thief could not manipulate an unlocked phone, since it has been permanently disabled.
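
For readers who want to see the moving parts, here is a minimal sketch of that escrow flow, again in Python with the cryptography library. It is an illustration under assumptions, not Ozzie’s actual specification: the choice of RSA with OAEP padding, the key size, and the variable names are mine, and a real design would rest on tamper-resistant vault hardware rather than a script.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# 1. The vendor generates one complementary key pair. The public key
#    ships inside every phone; the private key stays in the vault.
vault_private_key = rsa.generate_private_key(public_exponent=65537,
                                             key_size=3072)
phone_public_key = vault_private_key.public_key()

# 2. At activation, the phone generates a confidential PIN and encrypts
#    it with the vendor's public key. In Clear, this encrypted blob is
#    what recovery mode would expose as a barcode.
device_pin = os.urandom(16)  # stands in for the device's recovery PIN
encrypted_pin = phone_public_key.encrypt(device_pin, OAEP)

# 3. Law enforcement transmits the barcode to the vendor along with a
#    warrant. After verifying the warrant, the vendor decrypts the PIN
#    in its vault and returns it so the phone can be unlocked.
recovered_pin = vault_private_key.decrypt(encrypted_pin, OAEP)
assert recovered_pin == device_pin
```

The security of the whole scheme plainly hangs on that one escrowed private key, which is why Ozzie pairs it with vault hardware and the self-disabling chip described above.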

Despite the enthusiastic support of some high-tech luminaries—like Bill Gates—Apple, Google, and Facebook have exhibited no interest in following up on Ozzie’s key escrow plan, or in implementing any form of exceptional access. More effective solutions are probably feasible, but these companies are quite satisfied with the status quo. In the world of computer engineering, it is usually possible to construct a solution to even the most intractable problems. If Apple dedicated a reasonable portion of its vast resources and talent to this technical puzzle, it should be able to uncover a low-risk mode of exceptional access. But Apple lacks the motivation to resolve this latest phase of the crypto wars.

Where Are the Leaders at Apple?

The tangled complexities of this moral problem call for prudence, a more careful weighing of the burdens and benefits of the choice for absolute privacy. Apple must discern how much harm it is willing to inadvertently facilitate by deliberately not cooperating with law enforcement authorities, even in the most exigent circumstances. As Aristotle might say, prudent leaders are mature people of practical wisdom who can deliberate about these issues with care and precision. Are there such people at Apple?

It’s always dangerous to peer into the future, but it seems inevitable that a preventable terrorist act will occur with the help of strong encryption that cloaks the terrorists’ nefarious plans. To avert such a catastrophe, Apple must judiciously re-evaluate its commitment to absolute privacy. The challenge for this colossus of Silicon Valley is to take into account the full implications of “going dark,” and to reconcile the competing values of privacy and security in its software design decisions. If Apple and others fail to act, the challenge for policy makers will be to determine the law’s role in setting proper standards for software—like encryption code—that has a material social impact.