Some people think I tend toward paranoia. If a lack of blind trust in the human condition means I am paranoid, then I guess I am. This topic comes up today as we see the average citizen joining the discussion about encryption, security, privacy and the smartphone. By now you have (unless you are living under a rock) heard that Apple has been ordered to help unlock the iPhone that belonged to the San Bernardino terrorist Syed Farook. Apple has refused.

You can get the details in the previous links, but mainly Apple doesn't even know how to get into this iPhone. They designed it that way. It keeps them from getting into your data and keeps everyone else out as well. Apple would have to write a new operating system for this particular phone, sign it (to prove that it came from Apple and not some hack) and then upload it to the phone so that the FBI could get in. In essence, it is a master key, because the knowledge needed to build it could be used to render any phone insecure.

If you think the FBI would never use this key, or the program written to make it work, on any other phone, well then, I think you can safely say you are not paranoid. Programmers call this a "back door." If you believe that only the FBI will discover, find, or hack into the back door (for good reason, of course), then you can safely say you are not paranoid. Furthermore, there is no reasonable protection for this new back door into our phones. Once there is another way in, someone will find it.
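The signing step mentioned above is the linchpin. A phone will only boot software whose signature verifies against the maker's public key, which is why only Apple can produce the "master key" version of the OS. Here is a toy sketch of that idea in Python, using the third-party `cryptography` library and an Ed25519 key pair; the key, the "firmware" bytes, and the scenario are all made up for illustration and have nothing to do with Apple's actual signing scheme:

```python
# Toy illustration of code signing: a device accepts only firmware
# whose signature verifies against the maker's public key.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The "maker" holds the private key; the device ships with the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# The maker signs a legitimate firmware image.
firmware = b"legitimate OS image"
signature = private_key.sign(firmware)

# The device verifies before booting: no exception means the image is accepted.
public_key.verify(signature, firmware)
print("legitimate image: accepted")

# A tampered image fails verification, so the device refuses to boot it.
tampered = b"hacked OS image"
try:
    public_key.verify(signature, tampered)
except InvalidSignature:
    print("tampered image: rejected")
```

The point of the sketch: the security of every phone rests on that one private key staying private. A court-ordered, signed "unlock" image would be a legitimate, verifiable artifact from the maker, and nothing in the verification math distinguishes it from any other update.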
What good is encryption if it isn’t encryption?
I don't think this is an argument about whether the FBI is justified in wanting to know what is on that phone. It's about how they get it, and whether or not they will be able to get it from anyone else (for good reason, of course) the next time they are curious.
You can see how this plays out in my design fiction scenario: Ubiquitous Surveillance. Check it out.