FBI v Apple - The War On Privacy

Many of my clients and friends have asked me to explain the fundamental issues at hand in the encryption debate between Apple and the government. I usually preface my remarks with the following preamble: this is not a zero sum game and the winner need not be Apple or the state, but should be both.

Security and privacy are two sides to the same coin. Neither exists in isolation. It would be like postulating employees without employer, or society without individuals. This makes navigating the boundaries of the dispute challenging, but not impossible. What one is after is not the elimination of one for the other, but of striking the right balance.

In the following essay, I explore the San Bernardino case and what it means for the future of our privacy and security.


Disabling the auto-erase feature, which wipes an iPhone after ten failed passcode guesses, is roughly equivalent, if not equal, to building a back door. It allows almost anyone with enough processing power or time to simply guess all the possible combinations until they get into your device.

Disabling auto-erase is the key technique for bypassing encryption, because encryption protects only data at rest: if you can crack the lock screen code, you have effectively cracked the encryption. The government wants it to be illegal for companies to sell this feature on their devices, because the feature makes it impossible for the state to brute-force even weak lock screen passcodes.

This essay will establish:

  • The what-why-where-how of encryption
  • The state’s role in IT and security throughout history
  • The San Bernardino data, legal case & possible outcomes
  • Discussion on privacy and common arguments for & against
  • A plan for the future of privacy and security


What is encryption?

Encrypting data is like taking a book, splitting all its words into letters, and reassembling those letters in a nonsensical way so no one can read the book. Encryption is a technology that can be used to protect any data – government, consumer, corporate, hacker, terrorist, etc.

Its purpose is to address the fact that American (and international) citizens are at increased risk of identity theft and spying, and that companies are losing intellectual property to rogue nations. China, North Korea, Iran, Russia; ISIL, Al Qaeda, Hezbollah, Boko Haram, MS-13, the Knights Templar, Anonymous; regular old frauds, black-hat hackers, and thieves; they all want your data.

They want to spy on you and steal your identity. They want to steal corporate patents, new technologies, and financial information to cheat the stock market. They want to access government systems, steal plans and military technologies, and cause harm to key US infrastructure. Much of this is eliminated via Apple’s technology. But do not be fooled: this technology has been around for some time now.

How does encryption work?

To read an encrypted file, you must have access to a secret key or password that enables you to decrypt it. Encryption only works when data is in transit or “at rest,” meaning not in use.[1] Encryption of data at rest, ceteris paribus, cannot be beaten at the moment, because the secret keys can be so long that there is not enough time to guess all the possible passwords.
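To make the key’s role concrete, here is a deliberately insecure toy cipher in Python. The function names and the hash-derived keystream are my own illustration, not how iOS or any real product encrypts data; the point is only that the same secret key that scrambles the bytes is the only thing that unscrambles them.

```python
import hashlib
from itertools import count

def keystream(key: str, length: int) -> bytes:
    # Derive a pseudo-random byte stream from the secret key (toy illustration only).
    out = b""
    for counter in count():
        out += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        if len(out) >= length:
            return out[:length]

def toy_encrypt(plaintext: bytes, key: str) -> bytes:
    # XOR each byte with the keystream; XOR-ing again with the same key reverses it.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

message = b"Attack at dawn"
ciphertext = toy_encrypt(message, "correct horse battery staple")

# With the exact key, decryption is trivial; without it, the bytes stay gibberish.
assert toy_encrypt(ciphertext, "correct horse battery staple") == message
assert toy_encrypt(ciphertext, "wrong key") != message
```

The asymmetry the essay describes falls out of this: reading the data is instant for the key holder, while an outsider’s only move is to guess keys one at a time.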

Even with supercomputers and unlimited computing resources, it can be impossible to guess the combination to an encryption key set up by someone with intermediate IT knowledge. But this does not mean there are no consequences to eliminating the auto-erase feature, as if you were already safe enough without it. For most consumers, this feature is what truly makes them safe.
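The arithmetic behind claims like “millions of years” is simple division: keyspace size over guess rate. A back-of-the-envelope sketch (the trillion-guesses-per-second figure is an assumption standing in for a well-funded attacker, not a measured number):

```python
# Rough brute-force time estimate: keyspace / guess rate.
GUESSES_PER_SECOND = 10**12          # assumed rate for a well-funded attacker
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(keyspace: int) -> float:
    # On average the key is found after searching half the space.
    return (keyspace / 2) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

# A 4-digit PIN: 10,000 possibilities -- cracked effectively instantly.
print(f"{years_to_search(10**4):.1e} years")

# A 128-bit key: 2**128 possibilities -- around 5e18 years on average,
# vastly longer than the age of the universe.
print(f"{years_to_search(2**128):.1e} years")
```

The same formula explains both halves of the essay’s argument: short lock screen codes fall in moments, while a properly sized encryption key is out of reach regardless of budget.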

How to beat encryption?

The ideal way to beat encryption of data at rest, ceteris paribus, is to turn the device on and log in, since this deciphers 100% of the data in use (i.e., the data is no longer at rest). This is by design. If you encrypted your data both at rest and in use, you could never read or use any of it, which defeats the point of storing data in the first place (the OS would not even run).

This is why the government does not want companies to be able to sell phones with an auto-erase feature: after just ten guesses, the state loses all ability to bypass device encryption (for now; this claim is qualified below). If the state guesses only nine times and then waits out the lockout period, it could take millions of years to guess all the possible combinations.

If the state can guess an unlimited number of times without an auto-erase, there remains a possibility it can crack the lock screen code (if that code is weak), rendering encryption useless. This is partly because encrypted mobile phones have two keys – one for the encryption, and one for entry into the device, called the lock screen code.

The lock screen code is that 4+ digit number, code, fingerprint, swipe, etc. that one uses on the device lock screen. Get past this and you save a million years of guessing the encryption key. It is very easy to crack a tiny numeric code compared to an enormous alphanumeric password. This means it is easy to crack the lock screen, but not the encryption.
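The gap between the two keys is easy to quantify. A quick sketch comparing a 4-digit numeric code against a 12-character mixed-case alphanumeric password (both lengths chosen purely for illustration):

```python
import string

# Keyspace comparison: 4-digit lock screen code vs. 12-character password.
pin_space = 10 ** 4                              # 10,000 possibilities
alphabet = string.ascii_letters + string.digits  # 62 symbols
password_space = len(alphabet) ** 12             # 62^12, about 3.2e21

print(pin_space)                    # 10000
print(f"{password_space:.1e}")      # roughly 3.2e+21
print(password_space // pin_space)  # ~17 orders of magnitude larger
```

This is why the auto-erase feature matters so much: it is the only thing standing between a 10,000-guess problem and an attacker with automated guessing hardware.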

A history of government interventionism in encryption and security

Many people just assume the CIA already has the technology to beat encryption. This makes them indifferent to the privacy debate. The claim is partially true and partially false. It is false because if the state had the ability to crack Apple’s encryption, they probably would not openly sue them in court for obvious reasons (i.e., it would make the terrorist newsletter).

It is true that the government can already beat some forms of encryption (e.g., TrueCrypt), but this is not due solely to radical technological innovation, as many believe. There are two better explanations. First, the government has excelled at coercing partnerships with tech giants, at using court orders to build direct backdoors that bypass encryption, and at undermining industry security standards at the onset.[2] Second, where the state has beaten encryption, it has almost always done so by breaking the encryption software (or the OS or hardware it runs on), not by breaking the encryption itself (set up correctly, that is impossible for the foreseeable future).

A layered approach to cyber security

Everyone – hacker, state, company – faces a major challenge in waging and winning cyber war: a layered approach to security. The auto-erase feature is an excellent demonstration of the state’s quandary: by combining encryption with auto-erase, the state is helpless against two technologies which in isolation might be permeable. Two-factor authentication is another example. Separate passwords for each account, another.
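The quandary can be modeled in a few lines of Python. This is a toy model of my own, not Apple’s implementation, but it shows how even a trivially weak PIN becomes safe once the device wipes itself after ten failures:

```python
class AutoEraseDevice:
    """Toy model of a lock screen guarded by an auto-erase limit (illustration only)."""

    MAX_ATTEMPTS = 10

    def __init__(self, pin: str):
        self._pin = pin
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("device wiped: the data is gone for good")
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True  # the layered defense: brute force destroys the prize
        return False

# Even a weak 4-digit PIN survives brute force when guesses are capped at ten.
device = AutoEraseDevice("7294")
for candidate in range(10_000):
    try:
        if device.try_unlock(f"{candidate:04d}"):
            break
    except RuntimeError:
        break

print(device.wiped)  # True -- the wipe triggers long before the attacker reaches 7294
```

Neither layer is impressive on its own: the PIN alone falls in seconds, and auto-erase alone protects nothing. Combined, the attacker’s unlimited guessing budget collapses to ten tries.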

Each is an instance of a layered approach to security. Today, the businesses most successful against cyber threats layer systems, services, training, education and enforcement. Hackers, rogue governments, and terrorists all have one thing in common: to win against a layered approach to security almost requires that the victim make a mistake. This is why phishing and socially engineered attacks remain the top attack vectors. The White House itself has been breached twice by these methods.[3] [4]


Cracking a phone by flashing a new OS ex post facto

Here is where I think most are led astray. You cannot retroactively upgrade the San Bernardino killer’s phone to a new OS that eliminates the auto-erase feature, because you need the password to get into the device in order to upgrade the OS. That, or you must wipe the data and start from scratch (obviously a nonstarter). There is but one exception: if an existing vulnerability is found elsewhere on the device. So we must get one thing straight: if no OS, firmware, or hardware security gap is found on that iPhone, then, ceteris paribus, it is impossible to get the San Bernardino data.

Once and done?

The idea that you could crack just the San Bernardino phone is false, because you cannot limit the vulnerability created to just that phone (phones operate on shared platforms, or operating systems, meaning if you can crack one, you can crack them all).

This is a plain fact of the IT universe, as gravity is to physics. Operating systems are like books: they are written in language, and anyone can read them. You cannot hide code. If Apple is forced by legal order to design all new phones without the auto-erase feature moving forward, this of course proves that the state does not intend this to be only about the San Bernardino phone.

If the state asks Apple to design a special bit of code to exploit “just this device”, the exploit would by necessity work on every other device in production or use that shares the flaw. It would involve searching high and low for an OS or hardware attack vector, identifying it, creating software to exploit it, and then never patching that surface again. If that surface is left exposed, anyone will be able to hack it.

And, by the way, how might a terrorist know there is a vulnerability – a crack in the Apple dam? If there isn’t one already, the outcome of the Apple/FBI case will tell them, since Apple may be forced to publicly build the flaw by court order.

Could we ever get the San Bernardino data?

Only after an indeterminate period of time, with corresponding increases in computing power and capabilities, might Apple or the government gain the ability to crack increasingly complex encryption. It is plausible Apple could devise a way to get into the San Bernardino phone, but based on the facts of the case it seems unlikely, and it ultimately rests on a security flaw being found.[5] If no flaw is present, then the state must wait for gains in computing power and capabilities, but this could be years or even decades away, and is totally outside the control of any one group, mandate be damned!

Could we ever crack stand-alone encryption?

As alluded to elsewhere, it is easy to crack encryption with a court order and a U.S.-based company – you can build a backdoor and bypass it. You can (continue to) encourage companies to adopt unsafe standards. You can attack the encryption software itself if it is programmed poorly. You can also beat encryption by cracking the lock screen code or hardware. Likewise, if the state has packed monitoring into hardware or software[6], or has a virus or malware already watching processes and data in use, it can see data while it is unencrypted (a man in the middle). Still, what about stand-alone, encrypted data with no other knowable attack vectors?

In theory, we can crack encryption today, assuming the key used is so weak that we can run all the possible combinations in a matter of days, weeks, or months. But this is a spurious presumption that I state more as a disclosure. Borrowing from Dr. Carson, it would be “terrorist malpractice” not to use a reliable key. No one uses a weak key when they are insistent upon encrypting their data; that would be like a hungry person buying food only to throw it away. Today, we cannot crack properly set-up encryption, especially that which invokes a layered approach to cyber security. It may be possible in 1, 10 or 100 years, but this is wholly speculative. Efforts are better spent on cracking the device, OS, firmware, etc., or on coercing companies through court orders, agreements, money payments, etc., than on the futility of cracking good encryption itself.


There are not that many possibilities here if we are to believe the state when it says only Apple can help them get into this phone. As established, the state cannot ask Apple to build software that guesses the encryption key (not possible) nor can they ask Apple to build code to disable the auto-erase feature via ex post facto update (not possible). They might be able to ask them to search for an existing flaw in their work but if no flaw is found no additional recourse can be had.

Therefore, moving forward, the only thing the government can really ask for, regarding all newly produced phones, is that they either:

  1. Not have this feature at all, else
  2. They have this feature but it can be deactivated ex post facto by the state

If the feature is not allowed to exist, ceteris paribus, most lock screen codes will be unsafe, meaning we will be encrypting our data only in spirit, never in practice. If the feature is allowed to exist but can be turned off externally, this is equivalent to not having the feature at all! It is the same thing.


Now keep in mind I’m an IT guy, not a lawyer, but I was a poli-sci major and I am an avid lover of law. Here is how I see it:

The government’s case

The government believes it already has the power to compel Apple to:

  1. Eliminate this feature from all future product sales, and
  2. Force companies to code new applications and methods of spying for the sole intent and use of the government

It is not a mystery what precedent the government feels warrants this expansion of its power: the All Writs Act of 1789. Those who think this is a Fourth Amendment issue may want to re-read the government’s position. They simply claim the right exists under the AWA. Which makes sense: there is no legal precedent in the history of this country in which the state could force a company or person to turn over “persons, houses, papers, and effects”[7] not in their possession.

Ramifications of this expansion

What is particularly disturbing is that this is an unprecedented use of the AWA, one that if successful would eliminate, in a single stroke, nearly all privacy. This same interpretation of the AWA would give the state the legal authority to compel Apple or any other company to build spy technologies that work on other companies’ products! The state could force Apple, or even an individual, to build software at their own cost that hacks into your medical records, enables your TV’s camera, turns on the mic in your phone, and worse.

The real legal issue at hand

Apple does not deny this may be a great tool or asset for the state, one whose benefits might outweigh the costs of making all American citizens insecure. They simply believe this is a matter better legislated than adjudicated. If Apple is correct, the state does not have this power, and giving it to them will not accomplish the ends sought. In fact, it will be counterproductive! The suggested remedy, therefore, is writing and passing a new law. It is worth noting that Apple has even more at stake in this debate: the state is demanding, by show of force, that the company undermine decades of advancements, the security of its clients, and the future of the company itself.

Concerns with the state’s approach

For more than a decade, the state lied to nearly all of Congress, and to all the People, about its spying operations. This is no longer a conspiracy but a fact that even the President can acknowledge.[8]

James Clapper, Director of National Intelligence, under oath in March 2013, was asked whether the NSA collected “any type of data at all on millions or hundreds of millions of Americans” to which Clapper said, “No, sir…not wittingly.”[9]

It was not until after the Snowden release in June of that year that the state admitted to this categorical search and seizure of all American data records, with corresponding cover-up. Without Snowden and the likes of Apple, Google, Facebook and others, we would never have even known about these grave threats to our social contract and security. Today, it is an unassailable fact, recognized by both parties to the debate, that the state secretly spent more than twenty years undermining encryption and security standards.[10]

The statolatrists who remain indifferent to privacy today are the same people who argued it was a “conspiracy theory” that the state was spying on us or undermining us. Today they have shifted to, “It’s worth it.” It is clear the government adopted an approach of secrecy versus public dialogue because it thought this the most prudent approach to its national security interests. It is the hope of historians, economists and civil rights activists everywhere that the state learns from this failure and recognizes that there is no such thing as a “noble lie”.


The result of this case will have profound reverberations felt throughout the world. The US is a case study for freedom related issues abroad.

Who has the burden?

If Apple wins, the state will pursue a legislative decree giving it this power. This subjects the state to the People’s scrutiny and to an open dialogue. If the government wins, this means the courts believe the state already has this power and the only recourse individuals or companies will have is to write a new law or wait for the SCOTUS. In other words, if the court rules in favor of the state, it has successfully transferred the burden of cost, time, public dialogue, coalition building and legislative decree to the People.

Effects on terror and criminal activity

Meanwhile, so long as encryption is rendered useless by government decree, all Americans and all devices will be unsafe.[11] And, as is often the result of government interventionism, terrorists will not be dissuaded at all. Remember, they do not follow our rules or laws. The terrorists will either stop using iPhones, or program the feature into their own open-source OS, allowing them to fully utilize encryption, all the while exploiting the absence of ours. The result of this case, ceteris paribus, will not return the San Bernardino data, nor will it give the government the ability to decrypt future terrorist devices. It only gives them the power to decrypt yours.


Privacy does not matter

We are told our individual privacy does not matter, yet the government and companies alike have whole classifications of privacy (i.e., top secret, secret, confidential, patents, copyrights, etc.). We are told that companies and the state have privacy rights that do not harm the individual, yet somehow the individual’s privacy harms the state. How can privacy matter to companies and governments but not individuals, when companies and governments are nothing more than groups of individuals?

If you have nothing to hide you have nothing to fear

We certainly would not apply this logic to the Fifth Amendment right against self-incrimination, even though doing so would certainly solve more crimes, yet many tout this demagoguery daily. The nothing-to-hide-nothing-to-fear argument is a circular fallacy, relying on the premise that privacy does not matter, or that you have nothing to hide. It’s like saying Oakland is in Ohio, therefore Oakland is in Ohio. “If these are really your principles… what occasion is there for your political testimony if you fully believe what it contains[?]”[12] In other words, why argue against privacy if it is unimportant? If the outcome does not matter to you, why do you openly criticize those to whom it does matter?

Security through insecurity

We have not only the government, but many companies and individuals too, all arguing that the best way to secure this country is to make insecure all its People and infrastructure. Enough said.

Adjudication versus legislation

Some argue that moving this dispute from the domain of the unelected judiciary to the domain of the duly elected legislature is “dooming us to ‘special interests’”. By special interests, I take it they mean the People and the tech companies the law affects. By this logic we ought to have no legislature whatsoever. Any extrajudicial or extralegislative decree by the executive would be a better, more impartial solution than constitutional due process!

Back and front doors

The Director of the FBI, James B. Comey, is adamant the state’s request is not a “back door”: “There’s already a door on that iPhone… Essentially, we’re saying to Apple ‘take the vicious guard dog away and let us pick the lock.’”[13] An admirable but, as I see it, fallacious argument. The door that already exists – we call that the lock screen. This is the front door. If this door is guarded by a dog, and you are unable to incapacitate the dog, you need a second door. Or in colloquial speech: you need a back door. Remember, Apple cannot remove the guard dog – by logical necessity. It would fail to be a guard dog otherwise! It would be a service dog or a social anxiety pet, but not a guard dog.

Evidence-free zones

South Carolina Representative Trey Gowdy asks, “We’re going to create evidence-free zones? Am I missing something?”[14] What he is potentially missing is that these zones already exist! The Fifth Amendment is an evidence-free zone and is actually quite like encryption: you can only know my information if I give it to you. Fruit of the poisonous tree doctrine is another example.[15] Spousal privilege, another. In fact, due process itself falls under this kind of ambiguous scope (i.e., court appointed attorney, Miranda rights, following criminal/civil rules of procedure, speedy trial, humane treatment, etc.).


Normally, individuals have control over what others know about them. Giving information requires reciprocity. It involves a free and willing trade. Information mediates our relationships. Charles Fried says, “…the most fundamental sort [of relations]: respect, love, friendship and trust…without privacy they are simply inconceivable.”[16] What we tell our best friend is dramatically different than what we reveal to our employer or an IRS auditor. The more intimate the relationship, the more we freely share. It is rich, therefore, that we probably more strongly control what our spouse sees than what our government does!


There are many decent reasons for meeting the state somewhere in the middle. Admittedly, this would have been much easier if the state had heard John F. Kennedy when he famously said, “The very word ‘secrecy’ is repugnant.”[17] Through a campaign of secrecy, the state has eroded individual and corporate trust. By propagating poor arguments in its defense, it appears reactionary and propagandistic. This is bad for everyone because it casts a veil over the state’s actual needs.

Those needs include the cache of spy tools needed to wage and win wars against modern insurgencies at home and abroad. The encryption element is probably a non-starter; you cannot, as they say, put the genie back in the bottle. But this should not obfuscate our deliberations on this and other tools, such as monitoring pertinent communications, the collection of some data records, and the use of subversive techniques against terrorists (not US citizens). If the state wants some of these powers, it will need to quickly change the dialogue from force and misinformation to cooperation and collaboration.


If individuals and companies refuse to protect individual privacy it is hard to believe the state will. Or put another way, why should the state care if you do not? William Parent and Judith Jarvis Thomson say the right to privacy is more aptly called a right to liberty.[18] It is fundamentally requisite to the exercise of one’s liberties. In many places, surveillance is still used as an instrument of oppression. Alan Westin says, “Visibility itself provides a powerful method of enforcing norms [and social control].”[19]

The US has a storied history of protecting privacy – privacy from warrantless searches; privacy from the compulsory housing of troops; privacy as established in Griswold.[20] It is why we have doctor-patient confidentiality, attorney-client privilege and HIPAA. Your library records are secret. Even your video rentals are secret![21] Just imagine if every thought you ever had or will have was live-tweeted with complete accuracy. It is hard to believe you would have the right to think freely or to deviate from the norm. And so we care about privacy because, “To lose control of one’s personal information is in some measure to lose control of one’s life and one’s dignity.”[22]

The Universal Need for a Privacy Standard

Absence of a privacy right perpetuates a culture of servitude and quiet desperation. It minimizes individuality and expression and fosters majoritarian rule. You cannot exercise a Constitutional right without privacy. It is like constructing a triangular square; it is a computational error! An absence of privacy is contrary to every ideal we as Americans hold sacred and calls into question whether we are free at all, if we neither have privacy, nor want it.

Welcome to the Future

In a few years, privacy will be as much a buzzword as cyber security is today. People will be buying privacy products and services faster than they can be supplied. Well let’s slow down. What privacy looks like in a few years is actually playing out before us.

On the one hand, we may allow the government to continue its 20+ year trend of watering down IT & security, leaving us uninformed about both our rights and the implications of the state’s actions. On the other hand, we may force the government to engage in a real, substantive debate before Congress about the issues at hand: privacy, security, accountability, efficacy and constitutionality.

People and companies are just now recognizing the threats they face as a direct consequence of the state’s history of obscuring these critical facts. The state has perpetuated and staunchly defended a promotion of default-agnosticism towards individual & corporate security and privacy. As a result, most people will not care about security until they are personally victimized.

Many attacks could have been prevented, and lives saved or improved, had we been educated about these issues decades ago when they originally surfaced as a product of the IT revolution. Luckily, this inertia is countered by one unstoppable datum of the market: as privacy becomes more scarce it becomes more valuable.


The government has a real need to make use of as many tools as it can in its effort to combat terrorism and criminal acts. But the case against Apple is among the least effective ways of proceeding. Not only does it make Americans unsafe, it does nothing to alter the activity of terrorists. If this is truly the most effective way to secure the country, the state should not rely on an inapplicable law from 1789; it should create the law it deems necessary, as it did in enacting the Patriot Act.

At hand is an opportunity; a chance to establish or deny our own basic human rights. A chance to establish or deny the state the legal authority to do its job in the 21st Century. We are all authors of this next chapter and no one relieves us of this responsibility. States, companies and individuals all collect, store, make use of and transmit data. It is therefore relevant to everyone that a privacy standard be defined and enforced.

Establishment of a privacy right will be realized one of two ways:

  1. Through direct legislation, or
  2. Through a lengthy, potentially century-long wait for the Supreme Court

The disestablishment of a privacy right can be realized in one of three ways:

  1. Through direct legislation, or
  2. Through direct state usurpation (i.e., All Writs Act or other rationale), or
  3. Through denial of said right by Supreme Court decision

The best, most efficacious method for the establishment of a privacy right is for citizens to engage lawmakers, now, to legislate a direct protection. In establishing a right to privacy we will be leaving a legacy for our children, one in which their forefathers emphasized freedom and dignity before statism and corporatism.

