I Stand With Apple Against The FBI

The big tech news lately is the court case between Apple and the FBI. The FBI has an iPhone from a murderer/terrorist, and they want Apple to write software that would bypass the encryption on the phone so they can see if it contains any useful evidence.

It should go without saying that the shooting was a tragic event, especially in this particular case, given that it was a mass shooting in which 14 people were killed and 22 injured. Nobody is denying the horrifying nature of the crime.

The debatable part is whether or not Apple can be compelled to help the FBI bypass the encryption. On that matter, I side with Apple. Let it be noted that I don’t even own any Apple products; I’m an Android guy, so I’m not coming at this as an Apple “fanboy”. My concern lies in the security implications of the case.

The main issues I have with the FBI's demands are that complying would create a slippery slope toward more government control over our digital lives, that it would introduce a real security risk, and that I doubt it would even be effective at combating terrorism.

1. Slippery Slope

First, the most common argument is that by forcing Apple to comply with the FBI's demands, the courts would create a slippery slope leading to even greater government intrusion into our digital rights. I agree that this is a real concern.

While the slippery slope argument can sometimes be fallacious, I believe it is a very real concern when it comes to high-profile court cases. The US legal system is based on precedent, in which the outcome of one case is used as a baseline to help decide future cases. Thus, to some extent, slippery slopes are built into the court system itself.

By winning this case, the FBI would gain a legal precedent that lets them say, "Hey Google/Microsoft/Facebook/Acme Inc., a previous case ruled that we can compel Apple to bypass encryption. We'd like to take a look at what you have too." With such a precedent on their side, government agencies would have far more leverage to gather even more data from our online activities.

2. Security Risk

Another point of concern is the security risk of what the FBI is asking Apple to do. They aren't asking Apple to "crack" the encryption; with modern encryption, that is computationally infeasible for all practical purposes, and if it could be done, the FBI would probably have figured it out already. The fact that they are asking for Apple's help is a sign that they haven't found a way to do it themselves.
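To give a sense of why brute-forcing strong encryption is considered off the table, here is a rough back-of-the-envelope sketch. The numbers are my own illustrative assumptions, not anything from the case: a 256-bit AES key and an attacker able to test one trillion keys per second.

```python
# Back-of-the-envelope sketch: why brute-forcing a modern encryption key
# is considered computationally infeasible.
# Assumptions (illustrative only): a 256-bit key and an attacker testing
# one trillion (1e12) keys per second.

KEY_BITS = 256
GUESSES_PER_SECOND = 1e12          # assumed attacker speed
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** KEY_BITS           # total number of possible keys
expected_tries = keyspace / 2      # on average, half the keyspace is searched
years = expected_tries / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"Keyspace: 2^{KEY_BITS} ≈ {keyspace:.2e} keys")
print(f"Expected time at {GUESSES_PER_SECOND:.0e} guesses/sec: {years:.2e} years")
```

The result is on the order of 10^57 years, vastly longer than the age of the universe, which is exactly why the FBI is asking for a way around the encryption rather than through it.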

Instead, the FBI wants Apple to bypass the encryption entirely by creating a “backdoor”, or a “golden key” that would allow anyone with access to it to decrypt the data. In theory, only the FBI would have access to that golden key.

In theory.

Therein lies the problem. There is no good way to guarantee that only the FBI would have access to it. If hackers found out about the key and managed to steal or reproduce it, they would have access to pretty much any device they wanted. Given that such hackers could be state-sponsored attackers from, say, North Korea, China, or Russia, this is not an unrealistic scenario. You want national security? Then putting a backdoor into every device sold in America is a horrible idea.

3. It wouldn’t work anyway

This leads me to my third argument, which is one I haven't seen anyone else mention: the backdoor would not really do much to prevent attacks, and it could even backfire by making it harder to gather digital evidence.

The government might be able to pass laws requiring tech companies to install a backdoor on their devices, but it can't require terrorists to use those devices or that software. Does the FBI really think a terrorist is going to avoid uncrackable encryption because using it would be against US law?

US laws only apply to people in the United States, and terrorists and criminals care less than anyone about complying with them. That's why they are criminals.

If the FBI wins, the more likely scenario is that American tech companies are forced to build security vulnerabilities like backdoors into their devices and software, while the bad guys simply write their own encryption tools. The result? The bad guys route around government surveillance, and the rest of us carry around devices with built-in security weaknesses.

Requiring tech companies to intentionally put exploits into their products is a really, really bad idea. There is literally no way it would end well.