There’s a lot at stake in the ongoing Apple encryption debate.
Apple is arguing that this is a “security versus security” issue: the privacy needs of everyday people are as important as the government’s ability to investigate cases.
The other side is arguing that the rights of the victims require a special “backdoor” method of gaining access to the data on a phone used by the San Bernardino shooter.
Here’s a simple way to understand that.
Imagine there’s a box that contains a detailed plan for another terrorist attack. The problem is that someone lost the key to the box. The only way into the box is to break it open, but there’s a risk that the contents would be destroyed.
Now, the FBI wants a way to open the box that does not exist yet, one that must be newly created, so they have asked the company that made the box to give them a special key that can open any box. Make sense so far?
At this point, there’s an important question to ask. If the box company can make a key that opens any box, is there any way to guarantee that the key is never misused? If the magic key fell into the wrong hands, another terrorist could use that same key to learn how to break into a nuclear power plant or commit some other heinous act. A criminal could use it to find the location of your kids or steal your credit card information. That’s obviously a problem.
Also important: If the box-maker decided to provide that magic key to the FBI, what does that mean for other box-makers? In my mind, it makes any other box much more valuable…and much more secure. We would all decide to use that box instead. It also means that a company outside of the U.S. could make such a box, particularly as a way to differentiate itself from the company that decided to make the magic key and compromise. In this case, Apple has decided not to make the magic key, and I support their decision. In terms of cyber-security, it’s the only way.
And yet, I have one big concern.
What if the box contained the plans for a much more extensive terrorist attack? Wouldn’t we do anything possible to open it? Wouldn’t it be an emergency? Would we just sit around and debate? Isn’t all of the back and forth just taking up more time if the box would prove valuable in stopping another attack?
My fear is that the issue has become too focused on the privacy of every user and not on the act of unlocking the phone to retrieve the information. The focus should be on a solution. I wonder why Apple thinks this “new code” would get out into the wild. It does make sense that it could set a precedent for creating a backdoor in other cases, and in many ways that is the real concern here. My concern is one of urgency. We need to find a way to open the box without making compromises.
It’s hard for me to come down on this issue with a definitive “Apple should never make the key” since this is an issue of national security. It’s also hard for me to say Apple should make any compromises at all. How about you? Have you landed on the perfect answer? Let me know in a public forum like my Twitter feed.