People died. While we can easily get lost in questions about preventing deaths and understanding why a mass killing happened, there is one fact we’re left with:
The FBI have asked Apple to write a backdoor into the iPhone’s code to allow the FBI to brute-force entry into an iPhone.
What is Brute Force?
Quite simply, it means trying passcodes over and over again until the right one is found. At its heart it’s trial and error, and you can program a computer to do it for you. We call it brute force because, rather than intellectually deducing the password, you simply grind through the possibilities.
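To make the idea concrete, here is a minimal sketch in Python. Everything in it is made up for illustration: check_passcode stands in for whatever is verifying the guess, and a real iPhone adds the delays and wipe limit described in the next section.

```python
# Minimal brute-force sketch: try every 4-digit code until one works.
SECRET = "4926"  # the code we're pretending not to know

def check_passcode(guess):
    return guess == SECRET

def brute_force_4_digit_pin():
    for n in range(10_000):        # 0000 through 9999
        guess = f"{n:04d}"         # pad to 4 digits, e.g. 7 -> "0007"
        if check_passcode(guess):
            return guess           # found purely by repetition
    return None

print(brute_force_4_digit_pin())   # prints 4926
```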
Why does this involve Apple?
You can set your iPhone to wipe itself after 10 incorrect passcodes. After three wrong passcodes, the iPhone also makes you wait a little before letting you try again. I’ve set my phone to wipe after 10 incorrect passcodes because, if someone has my phone and can get in, they can also get at my banking information.
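As a toy model, the policy described above looks roughly like this; the escalating delay curve is my own illustrative choice, not Apple’s actual behaviour.

```python
# Toy model of the lockout policy: a growing delay after 3 misses, a wipe
# after 10 misses. The delay lengths are illustrative, not Apple's real ones.
import time

WIPE_AFTER = 10
DELAY_AFTER = 3

def try_passcodes(guesses, check_passcode):
    failures = 0
    for guess in guesses:
        if check_passcode(guess):
            return "unlocked"
        failures += 1
        if failures >= WIPE_AFTER:
            return "wiped"    # data destroyed; brute force is now pointless
        if failures >= DELAY_AFTER:
            time.sleep(2 ** (failures - DELAY_AFTER))   # wait longer each time
    return "still locked"
```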
With a 4-digit passcode there are 10 choices for each digit, so 10,000 possible codes. With 6 digits, that jumps to 1,000,000.
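Here is the arithmetic, plus a rough sense of how long trying every code would take if each guess cost about 80 milliseconds; that rate is an assumption for illustration, not a measured figure.

```python
# How the passcode space grows, and the worst-case time to try every code
# at an assumed 80 ms per guess (illustrative only).
SECONDS_PER_GUESS = 0.080

for digits in (4, 6):
    codes = 10 ** digits                    # 10 choices per digit position
    hours = codes * SECONDS_PER_GUESS / 3600
    print(f"{digits} digits: {codes:,} codes, ~{hours:.1f} hours worst case")

# 4 digits: 10,000 codes, ~0.2 hours worst case
# 6 digits: 1,000,000 codes, ~22.2 hours worst case
```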
The ten most common passcodes have probably already been tried. And if you want a fun read, check out Why repeating a digit may improve security on your iPhone’s 4-digit lockscreen PIN.
What did the FBI actually ask?
I have read a copy of the summary (not the full 40-page ruling) and many of the news articles. The best summary I can give is this:
On Tuesday, February 16th, 2016, a magistrate judge in Riverside, California, ruled that Apple had to provide “reasonable technical assistance” to the government to recover data from an iPhone 5c. This includes bypassing the auto-erase function (the one that triggers after 10 bad passcodes) and allowing them to submit an unlimited number of passcodes. To do this, the FBI wants a special version of iOS that works only on that one iPhone.
Apple has five days to respond if they believe that compliance would be “unreasonably burdensome.”
Yes, it says the FBI is asking to break into one iPhone, but the only way to do that is to write software that could be used to backdoor any iPhone. This is because Apple intentionally wrote their code so that they couldn’t get at your data: Apple has no way to dismantle or override the 10-tries-and-wipe feature. Only someone with the passcode can turn it off.
Is that technically possible?
Of course. There’s no real question about that. It won’t be easy (so ‘unreasonably burdensome’ may or may not apply here). And to be honest, the technical possibility of this is not the issue either.
Does this mean ‘anyone’ could do this? Yes, but it’s unlikely. This sort of hack is an OS-level one, which means the software needs to be signed by a key only Apple knows, unless there’s some other vulnerability in the phone. You can introduce a vulnerability by jailbreaking the phone, of course, but we don’t really know whether the phone can be hacked from the outside like that; signs point to it being improbable. But if it were going to happen, Apple is the company best placed to do it. They know their own system best.
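For a sense of what “signed by a key only Apple knows” means, here is a rough sketch of code signing using a throwaway RSA key pair and the Python cryptography package. Apple’s real signing scheme is more involved, but the principle is the same: only the private-key holder can produce software the device will accept.

```python
# Rough sketch of code signing. Not Apple's actual scheme; it just shows why
# only the private-key holder can produce an OS image the device will install.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

vendor_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
vendor_public_key = vendor_private_key.public_key()   # baked into every device

def sign(firmware):
    # Only someone holding the private key can perform this step.
    return vendor_private_key.sign(firmware, padding.PKCS1v15(), hashes.SHA256())

def device_will_install(firmware, signature):
    try:
        vendor_public_key.verify(signature, firmware, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

official = b"official OS build"
print(device_will_install(official, sign(official)))             # True
print(device_will_install(b"attacker's custom build", b"junk"))  # False
```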
I want to stress: I believe anything is technologically possible. Human cloning? You bet! Hacking my iPhone? Sure thing. I do not believe these things are easy, or even probable, but they are in the realm of possibility.
Why did Apple Say No?
Apple did say no. They said it publicly in a Customer Letter on their website. And they said no, not because these things are hard, but because they are dangerous.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
[…]
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
You can read the whole thing for yourself, but in essence Apple is saying that if the FBI can insist on this, the government can use it as leverage to demand that anyone’s phone be unlocked the same way. Keep in mind, while this case is certainly above board, do we really trust our government to always have our best interests at heart? Where do we draw the line between a known criminal act and a suspected one? Do you think they will never apply this to a case with only tenuous links to an actual crime? We’ve already had wiretapping issues (Watergate, need I say more?), and frankly the US government hasn’t gotten much better. And once the US has allowed this, many other countries will use it as justification to do the same.
Also, you can’t put the genie back in the bottle. Once this software exists, it will be handed to other agencies, and someone will reverse-engineer how it works. Other countries will get their hands on it. They will use it against innocent people. We know this is true because it already happens now.
Privacy and Freedom
I’m going to give you the quote you’re expecting. The Ben Franklin one:
Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.
From a technical standpoint, the hurdles involved in hacking into a cell phone make me feel safer as a user. It makes me feel better to know that the FBI are failing to break into my little iPhone.
Comments
2 responses to “Apple Does the Right Thing”
Well, I heard something a bit different… Apple only publicly acted as our privacy advocate because the FBI publicly demanded access while Apple asked for confidentiality: http://www.nytimes.com/2016/02/19/technology/how-tim-cook-became-a-bulwark-for-digital-privacy.html?_r=0
Still agreeing that refusing to build in backdoors is a good thing. Although backing up to the iCloud (or anything cloud-like) is kind of a backdoor in itself.
@Floutsch: That’s the only note I’ve seen, and it’s in passing, that anyone wanted the request to be private. Though it does actually emphasise the point. If it’s private, then there’s a chance it doesn’t get out. If it’s public, the genie is out of the bottle.