Apple have hit the news today for refusing to comply with an order from the FBI to produce a modified version of iOS, the operating system that runs on iPhones and iPads. A lot of nonsense has been talked about it, so I thought I’d explain what seems to have been going on (while bearing in mind that neither the FBI nor Apple are the epitome of trustworthiness, and we don’t have any independent sources for this).
The FBI have a mobile phone belonging to Syed Farook, one of the two alleged perpetrators of the mass shooting that took place in San Bernardino, California, last year. Farook is dead, killed in a police shootout, and so cannot enter his passcode for the phone. The FBI believe that if they could get access to it, they could discover information relating to other possible terrorists.
The problem for them is that Farook enabled a setting on the iPhone which means that if the wrong passcode is entered ten times in a row, the data on it is permanently destroyed. And all the data on the phone, as on all iPhones running iOS 8 or later, is encrypted with a key derived from the passcode, so there’s no way to read it without the passcode.
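To make that concrete, here’s a minimal sketch in Swift of those two protections working together. It’s nothing like Apple’s real implementation (which, among other things, entangles the passcode with a key burned into the hardware), but it shows the shape of the defence: a wrong guess simply fails to decrypt anything, and ten failures wipe the device.

```swift
import Foundation
import CryptoKit

// A minimal sketch, nothing like Apple's real implementation: the data
// key is derived from the passcode, so a wrong guess simply fails to
// decrypt, and ten failed guesses "wipe" the device.
struct SketchPhone {
    private let sealedData: AES.GCM.SealedBox
    private var failedAttempts = 0
    private(set) var wiped = false

    init(secret: String, passcode: String) throws {
        // Illustrative key derivation; real devices also mix in a
        // hardware key that never leaves the chip.
        let key = SymmetricKey(data: SHA256.hash(data: Data(passcode.utf8)))
        sealedData = try AES.GCM.seal(Data(secret.utf8), using: key)
    }

    mutating func tryUnlock(with guess: String) -> String? {
        guard !wiped else { return nil }
        let key = SymmetricKey(data: SHA256.hash(data: Data(guess.utf8)))
        if let plain = try? AES.GCM.open(sealedData, using: key) {
            failedAttempts = 0
            return String(data: plain, encoding: .utf8)
        }
        failedAttempts += 1
        if failedAttempts >= 10 {
            wiped = true   // real hardware destroys the key material itself
        }
        return nil
    }
}
```

Ten calls to `tryUnlock` with wrong guesses and the data is gone for good, which is exactly the position the FBI are in.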
(Note that Fox News have been saying “Apple have helped the FBI unlock iPhones forty times in the past, so they should now!” — those phones were running earlier versions of iOS, which didn’t encrypt the data.)
What the FBI want to do is essentially to jailbreak the iPhone, and then install a special new version of iOS on it, one which doesn’t erase the data when the wrong passcode is entered, so they can just keep trying different codes until they find the right one (they want a couple of other things from Apple, but this is the important bit).
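To see why removing that limit matters so much, here’s a minimal Swift sketch (the key derivation is illustrative, not Apple’s real scheme): with no wipe and no delay between attempts, a four-digit passcode falls to brute force in at most 10,000 guesses, which is seconds of work.

```swift
import Foundation
import CryptoKit

// A minimal sketch (illustrative key derivation, not Apple's) of what
// the FBI's modified iOS would enable: with no wipe and no delays, a
// four-digit passcode falls to brute force in at most 10,000 guesses.
func key(for passcode: String) -> SymmetricKey {
    SymmetricKey(data: SHA256.hash(data: Data(passcode.utf8)))
}

// A stand-in for the phone's encrypted storage; "7294" is a passcode
// the attacker doesn't know.
let storage = try! AES.GCM.seal(Data("contacts, messages, photos".utf8),
                                using: key(for: "7294"))

for guess in 0..<10_000 {
    let code = String(format: "%04d", guess)
    if let plain = try? AES.GCM.open(storage, using: key(for: code)) {
        print("Passcode \(code): \(String(data: plain, encoding: .utf8)!)")
        break
    }
}
```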
To do that, they need Apple themselves to create and sign that operating system. Every version of iOS is signed with a cryptographic key held only by Apple, which proves it comes from Apple, and the iPhone will refuse to install any operating system that isn’t signed with that key.
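The check itself is conceptually simple. Here’s a minimal Swift sketch of the idea; the real iOS boot chain is far more elaborate, and the names here are invented for illustration. The phone ships with Apple’s public key baked in, and only Apple’s private key can produce a signature the phone will accept.

```swift
import Foundation
import CryptoKit

// A minimal sketch of signed updates; the real iOS boot chain is far
// more elaborate, and these key names are invented for illustration.
let appleSigningKey = Curve25519.Signing.PrivateKey()   // held only by Apple
let keyBakedIntoPhone = appleSigningKey.publicKey       // shipped in every device

let genuineImage = Data("iOS build from Apple".utf8)
let signature = try! appleSigningKey.signature(for: genuineImage)

// The phone installs an OS image only if the signature verifies
// against the baked-in public key.
func phoneWillInstall(_ image: Data, signature: Data) -> Bool {
    keyBakedIntoPhone.isValidSignature(signature, for: image)
}

print(phoneWillInstall(genuineImage, signature: signature))              // true
print(phoneWillInstall(Data("FBI's build".utf8), signature: signature)) // false
```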
Apple are refusing to provide the FBI with a signed operating system that they can use to do this. The question many people are asking is — why?
To understand this, first of all you have to understand that customer security actually *is* important to Apple. That’s not the case with most of the other big tech companies. Google and Facebook, who people think of as Apple’s competitors, actually make their money from tracking their users. It’s in their interests to have their products be only as secure as they need to be for people to keep using them, the better to collect data.
Apple, on the other hand, have as their big selling point that they’re actually selling something *to* their users, not selling their users to a third party, and they have to persuade you to hand over your money. It’s in *their* interests to make their products as secure as possible within the constraints they’re working in, because otherwise their users won’t pay them.
(NB, Apple do many things I dislike intensely. I own no Apple products and use no Apple software, and think many of their business practices border on the obscene, so this isn’t an Apple fanboy talking…)
But of course, Farook isn’t their customer any more — he’s dead. So what’s to stop them making this special version of iOS?
The problem is that if they make a version of iOS that can be installed on people’s iPhones, and that will let someone get into the system without the passcode, then no iPhone user is safe. Once that operating system exists, Apple will have to supply it to the law enforcement agencies of any country in which they do business — you might be happy with the FBI having this ability (I’m not), but how about the Chinese or Russian governments? They will be forced to supply it for offences far less serious than terrorism, and in cases where the suspect is still alive (and in many cases the suspect will be innocent, but will still have their privacy violated).
But more importantly, the fundamental lesson of the Internet age is that digital data leaks. Always. If you create something in a digital format — it doesn’t matter what it is — sooner or later, no matter what precautions you take, it will get copied, it will get passed around, and it will end up on the Internet somewhere. One copy in Apple’s office becomes a copy in Apple’s office plus one in the FBI lab, becomes a copy in Apple’s office plus one in the FBI lab plus one on an investigator’s laptop so she can deal with cases in the field, becomes…
Becomes, somewhere along the line, a copy on a laptop that gets stolen, or on a fileserver that gets hacked, and then that copy appears online and it’s available to everyone.
And then anyone who wants to can access anyone else’s data, any time, just by getting hold of their phone.
So what happens then? Well, terrorists *know* that someone will try to get their data, so they install separate encryption software, encrypt everything using that, and law enforcement are back where they are now, unable to access their data. Meanwhile the teenager who has snooping parents and doesn’t want them to know she’s gay, or the abused spouse making plans to leave their abuser, or just anyone who loses their phone, will have to deal with the possibility of their private data becoming public knowledge.
There is no such thing as safe encryption with a back door. If there’s a way around the encryption, the encryption is useless. There is no way at all to create a back door that only law enforcement can use. Policymakers — and much of the general public — either don’t understand that or pretend they don’t, and talk about how all these clever people *must* be able to do something, but just don’t want to.
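For the sceptical, here’s a minimal Swift sketch of the simplest kind of back door, a key-escrow scheme (my own construction, not any real proposal). The user’s key is wrapped under a second, “law enforcement” key, and nothing in the maths can tell whether the holder of that second key is an FBI agent, a thief, or a hostile government: whoever has it, has everything.

```swift
import Foundation
import CryptoKit

// A minimal sketch of a key-escrow "back door" (my construction, not a
// real proposal): the user's data key is also wrapped under an escrow
// key meant for law enforcement.
let escrowKey = SymmetricKey(size: .bits256)   // "law enforcement only"... until it leaks
let userKey   = SymmetricKey(size: .bits256)

let sealedData = try! AES.GCM.seal(Data("private messages".utf8), using: userKey)
let wrappedKey = try! AES.GCM.seal(userKey.withUnsafeBytes { Data($0) },
                                   using: escrowKey)

// Whoever holds escrowKey runs exactly the same two lines, whether
// they're an agent with a warrant or a thief with a stolen copy.
let recoveredKey = SymmetricKey(data: try! AES.GCM.open(wrappedKey, using: escrowKey))
print(String(data: try! AES.GCM.open(sealedData, using: recoveredKey), encoding: .utf8)!)
```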
The FBI are right that this is a matter of national security — in fact it’s a matter of *international* security. No iPhone user, or anyone who communicates with them, could possibly be secure if Apple did what the FBI asked. Their communications — including communications with banks, medical practitioners, and anyone else — would not be private any more.
Apple are not your friend — no proprietary software vendor is — but they’re not stupid either, and they’re not going to give up without a fight on this one, because if they give up their customers’ privacy, they lose most of the customers they have.
So for once, a begrudging three cheers for Apple and consumer capitalism. Though if you *really* want to be secure, I recommend using only free software…
This post was brought to you by the generosity of my backers on Patreon. Why not join them?