Craig Federighi, senior vice president of software engineering at Apple, penned this opinion piece in the Washington Post.
That’s why it’s so disappointing that the FBI, Justice Department and others in law enforcement are pressing us to turn back the clock to a less-secure time and less-secure technologies. They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013. But the security of iOS 7, while cutting-edge at the time, has since been breached by hackers. What’s worse, some of their methods have been productized and are now available for sale to attackers who are less skilled but often more malicious.
To get around Apple’s safeguards, the FBI wants us to create a backdoor in the form of special software that bypasses passcode protections, intentionally creating a vulnerability that would let the government force its way into an iPhone. Once created, this software – which law enforcement has conceded it wants to apply to many iPhones – would become a weakness that hackers and criminals could use to wreak havoc on the privacy and personal safety of us all.
I can’t emphasize enough how important it is to stand side-by-side with Apple on this one. In France, they just voted to put technology executives of companies unwilling to decrypt their products in jail.
I’ll try to take an optimistic view:
* there will always be leaks about overreach and immorality – e.g. Snowden, Chelsea Manning, Daniel Ellsberg, … Information wants to be free, especially information about foul play.
* technology will always outpace any attempts to kill it – RSA, PGP, the Clipper chip, OpenBSD, libdvdcss, …
And please don’t worry about the relevance of French lawmakers; they’ve been pursuing their own agenda for a long time. While they voted for a law to sneak into iPhones, they refused hands down to vote for a law that would allow sneaking into companies that do offshore profit optimization. Forget about France for the time being…
Personally, I consider this a flaw in Apple’s security model that has been exposed. Nowadays, you not only have to protect against hacker intrusion, but against government intrusion as well. In regard to the latter, Apple still has a hole that it needs to close up.
Edit: That last link in the article currently results in a 404 error.
Edited 2016-03-07 22:11 UTC
I agree. Security that depends on the will of a 3rd party is just broken by design. I won’t support Apple on this until they acknowledge this design issue and show their intention to fix it.
ocroguette,
Companies tend to say whatever they want to, and sort the facts out later. I wouldn’t really trust what they say, only what they do. Ideally, all the security mechanisms would be open sourced and available for public inspection. Also, we need a way to verify the code our own phones are running. Wishful thinking, I know.
WorknMan,
I’m all for people standing up for apple’s right not to exploit its own technology; after all, this case will set important precedent. But we do need to be extremely careful about this because there are different outcomes, some of which are worse than others. If apple refuses to exploit its backdoor itself, the government might issue a subpoena for apple’s own device keys. Apple may win a shallow victory in that it would no longer need to help with the software to crack its own phones, however I hope everyone understands why this outcome is much, much worse…
I think the argument is that if the binary blob of this “special” firmware ever escapes Apple’s control, then things do change. It’s signed, so anyone could potentially use it. I.e. cooperating with the FBI is exactly as dangerous as cooperating with criminals and hackers, and involves exactly the same risk. You either trust the 3rd party (whoever it is) or you don’t. The root of the question is: should Apple trust the FBI with what is essentially the keys to their kingdom?
But responding to the rest of your post…
I follow your reasoning (I think), but the question is how can Apple legitimately improve the security any further?
They already made this particular “backdoor” ineffective in current products – the passcode retry is no longer software controlled and involves much more sophisticated safeguards. But admittedly there are probably other points of attack they have not thought of yet that could be exploited like this. As they are found I would hope they address them in similar fashion.
Regardless, the entire security model is based on a chain of trust, and Apple by definition has to be in the chain in order to ever update software on the device, don’t they?
Are you saying that by retaining the ability to update software they are creating a “backdoor”? If so, I mean I get it (again, I think), but what is the alternative to that? You can’t not have update-able software…
I’m just saying, calling this a backdoor seems kind of counter-productive. It’s not a backdoor, it is essential – you basically cannot have software updates without trusting someone – unless you expect users to write the firmware themselves…
That chain of trust is between you and Apple; you signed up for it when you bought the phone. It didn’t and doesn’t involve the FBI though – that wasn’t in the TOS… Shouldn’t Apple resist eroding it if they can? Isn’t that their job? Isn’t that, at its heart, what security actually is?
Besides, it’s not really a backdoor if it cannot be eliminated in any practical way. What do you propose as an alternative that doesn’t involve a chain of trust involving someone else? A chain of trust of one is just a secret – very secure, but of little practical use in and of itself…
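To make that chain of trust concrete: the device only flashes firmware that the vendor’s key vouches for. Here is a toy sketch – HMAC stands in for the asymmetric signatures real devices actually use, and the key value is invented:

```python
import hashlib
import hmac

# Hypothetical vendor key; real devices hold only a public key and the
# vendor keeps the private signing key in an HSM.
VENDOR_KEY = b"hypothetical-vendor-signing-key"

def sign_update(firmware: bytes) -> bytes:
    # Vendor side: only the key holder can produce this tag.
    return hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, tag: bytes) -> bool:
    # Device side: flash the update only if the tag verifies.
    return hmac.compare_digest(sign_update(firmware), tag)
```

The point being: whoever holds `VENDOR_KEY` sits in the chain of trust whether we like it or not, because every update flows through that check.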
Edited 2016-03-08 04:38 UTC
galvanash,
Let me know what you think of my idea…
Secret sharing would be a great idea if your goal is to create a power balance. You don’t want to put all your trust in a single party, so you distribute it between multiple parties which have different motivations and goals. I get that.
But how does it help in this scenario? If the FBI can force Apple to comply, they can just force the other key holders to comply as well. It’s the same problem, just instead of it being Apple resisting erosion of the trust chain, it would be Apple plus some other parties. Legally, if the government can force one of the parties they can force them all. It doesn’t solve anything – because the government simply holds all the power in the arrangement. Either the courts see this as an overstep or they don’t, and if it is the latter there simply is no technological solution to it (or in point of fact such a solution will be deemed illegal).
A shared secret scheme might make it easier to “trust” Apple for users who are reluctant to do so, but this only addresses the issue of Apple being viewed as untrustworthy. It does nothing to address the issue of whether or not the government is overstepping their boundaries…
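For what it’s worth, the kind of shared secret scheme being debated here is a well-studied construction (Shamir’s secret sharing): a key is split into n shares so that any k of them reconstruct it and fewer than k reveal nothing. A minimal sketch, assuming a small Mersenne prime field – the function names and parameters are my own illustration, not anything Apple ships:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def _eval_poly(coeffs, x):
    # Horner evaluation of coeffs[0] + coeffs[1]*x + ... mod PRIME
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % PRIME
    return acc

def split_secret(secret, n_shares, threshold):
    # Random polynomial with the secret as the constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, _eval_poly(coeffs, x)) for x in range(1, n_shares + 1)]

def recover_secret(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

The math itself isn’t the disputed part, of course – the argument above is about whether the law can simply compel every share holder anyway.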
My point is I know Apple can (if they ever want to) get at my data. I also know they want to sell me more products in the future, and them mucking with my data in ways I don’t like is going to negatively affect my view of them. I don’t trust them per se, but I do trust they are financially motivated to keep me happy so I give them more money in the future. Honestly I don’t think having some 3rd parties involved in my trust chain with Apple changes much of anything for me – I trust Apple (as far as that goes) because they are greedy bastards and it is in their best interest not to f*ck me over.
I have no such quid pro quo with the FBI. I’m bound to the law of the land just like Apple is. If the FBI wants my data, and the courts say it is constitutional for them to demand it, well I’m in the same boat Apple is in…
You seem to be addressing the problem of Apple being untrustworthy, I don’t see that as the problem at all. I see the problem as the government being untrustworthy.
galvanash,
Actually, this addresses both, as neither would be able to sneak in undetected. If either lies, then the evidence around the world will speak for itself. Public disclosure should not be a problem for a company that truly has public interest at heart. But therein lies the problem with corporations including apple, they just want the appearance of public interest. They don’t really want crypto that forces them to actually commit to it.
Edited 2016-03-08 14:07 UTC
The problem is, who are these other key holders? I’d be darned uncomfortable with the idea of not knowing who holds my trust. With Apple, Google, Microsoft, or whoever, at least I know exactly who I’m dealing with and, should my trust be violated, who not to deal with in future. If the keys were spread across the world, in the hands of unknowns, who’s to say which party screwed us over? Sometimes a known “enemy” is better than an untested “friend”.
darknexus,
Simple, those who did not publish the information like they were supposed to are the ones who “screwed” you over (perhaps because of a gag order), but as long as some of them publish, then the system will still work as intended.
Edit: We can go into more details, but my point is that there are ways to solve galvanash’s dilemma. Crypto can be used in a way that provides much stronger public assurances than if it’s exclusively in the hands of a single entity like apple.
Edited 2016-03-08 15:51 UTC
Capitalism hardly ever favors the honest. That said, anonymity never does.
darknexus,
So you don’t have any objections to crypto that would make/keep them honest? Well that’s great, now it’s just a matter of convincing companies to embrace it; That is the hard part.
I get what you’re saying, but I think you are conflating public interest with customer interest. It would be nice if a company actually embraced the public interest in the way you suggest, but that isn’t what companies do – they are machines to generate profits. Addressing their customers’ interests is a means to an end (more money); addressing the public interest is not. What I mean is there is certainly the possibility of overlap between the two, but ultimately addressing public interests is not (and frankly should not be) a factor in business strategy – unless you’re in the business of promoting the public interest. Apple is in the business of selling phones, though.
Point is, Apple fighting the FBI in this case serves their customers’ interests, which in turn serves Apple’s interests. Incorporating some sort of shared secret scheme as you suggest does neither, so why would they even consider it?
A company voluntarily giving up power over such an important aspect of their business operations to a 3rd party, whose interests do not coincide with theirs (and they wouldn’t, because that is kind of the point of it all), is pretty much the definition of fiduciary malpractice. They would be sued into oblivion by shareholders…
The only way I could see something like this ever happening is if the 3rd party is the government itself. Frankly I don’t know if that is better or worse for the public interest…
galvanash,
Now you’re really stretching it… sued for disclosing the truth?
Will you at least admit there is room for improvement?
Edited 2016-03-09 00:30 UTC
I never said there wasn’t room for improvement. I even admit, if I turn my brain off, I like the sound of your idea. But if I roll it around in my head a while and play it out, I see all kinds of ways it can go off the rails – badly.
galvanash,
Hmm, I don’t think we’re on the same page at all. There are no blessed parties, no secrets, no malpractice, no acts of bad faith or anything even remotely like those things. I must have failed to communicate the idea properly.
Let us step back and go over what we can agree on.
You asked…
– We both recognize that enclave software updates have a legitimate purpose.
– We both recognize that, in the wrong hands, enclave updates could breach a user’s security (even if we don’t want to call it a “back door”).
– We both recognize that, when companies exist under a legal jurisdiction in which they could be compelled to add code or hand over keys, it poses additional risk and uncertainty for customers of the company.
– We both recognize that an update mechanism can send malicious code updates to a target without the target or anyone else becoming aware that this is happening.
– We both recognize that it’s harder to detect malicious code in proprietary products than in open source ones.
So far so good?
Now, the question asked whether there was any way to protect the enclave from changes via the update mechanism that subvert user security while still allowing legitimate updates. The assumption being no, there wasn’t.
Still good?
Now, turn the question around and ask “how can we protect the enclave from coerced security changes while still allowing legitimate changes”? This is the clever part: we recognize that so long as one company holds the only key to update the enclave, that company is the single point of failure for everyone’s security. But the enclave doesn’t need to be designed with a single point of failure. By enlisting the help of others around the world, we can add as much redundancy as we need. Then, not only does the government have to go after apple to break into the enclave, they’ll have to go after the others too.
My idea was that this should be done with public oversight and include full disclosure via the media. It would be a perfect way for apple to prove that there are no hidden functions in the enclave. But if full disclosure is too difficult for a company like apple to swallow, it still works with private partners under NDAs. Customers just lose the benefit of transparency.
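A rough sketch of what k-of-n approval for enclave updates could look like – HMAC again stands in for real signatures, and the party names, keys, and threshold are all invented for illustration:

```python
import hashlib
import hmac

# Hypothetical per-party keys; a real design would burn the parties'
# public keys into the enclave at manufacture.
PARTY_KEYS = {f"party{i}": f"secret-key-{i}".encode() for i in range(5)}
THRESHOLD = 3  # any 3 of the 5 parties must approve an update

def party_sign(party: str, firmware: bytes) -> bytes:
    # One party's approval tag over this exact firmware image.
    return hmac.new(PARTY_KEYS[party], firmware, hashlib.sha256).digest()

def enclave_accepts(firmware: bytes, signatures: dict) -> bool:
    # Count distinct known parties whose tag verifies against the firmware.
    valid = sum(
        1
        for party, sig in signatures.items()
        if party in PARTY_KEYS
        and hmac.compare_digest(party_sign(party, firmware), sig)
    )
    return valid >= THRESHOLD
```

Under a scheme like this, compelling any single party (apple included) is no longer enough to push a coerced update; the government would have to compel at least THRESHOLD of them.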
They could do that without ceding any such control to the 3rd parties – they could just choose to disclose the code for review (under NDA or whatever) and ask them to “sign off” that there are no hidden functions in it. They don’t have to give them pieces of the signing keys to do that…
But that in itself does nothing to address blocking government interference, for that you do need to give the 3rd parties actual power, as you suggest. All I’m saying is ceding power like that can cut both ways.
There is an assumption in your idea that these 3rd parties would behave in a manner consistent with the public interest. That isn’t necessarily true, but let’s be optimistic and say they will. That doesn’t mean they would act in a manner consistent with Apple’s interests. In fact I would argue the likelihood is at some point they definitely won’t.
galvanash,
Exactly, if you get rid of the 3rd party signatures, apple retains full control over the enclave, but we lose all protection from coercive forces. That’s where we are today. While we can relegate our security to companies like apple with a single point of failure, I think it would be very bold and forward-thinking for apple to think outside the box and invest in crypto that can provide collective assurances and eliminate itself as a single point of failure.
Edited 2016-03-09 08:58 UTC
Require the user to enter their passcode before the firmware is flashed.
They do that already. At least, I’ve never been able to update or restore a device without the passcode, or my Apple ID information in the case of a restore.
I don’t think so, because if they did, there’s no way Apple would be able to flash a new firmware onto the device.
I take it back. DFU mode will, indeed, allow the firmware to be flashed. There’s still the activation lock, but I’d imagine that’s trivial for Apple themselves to bypass.
WorknMan,
Indeed. No updates should be possible without the owner’s consent. A vendor would have to wipe and reset the phone before being able to update it.
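That consent-or-wipe policy could be modeled roughly like this – a pure toy sketch with invented names; real devices tie user data to hardware-held encryption keys rather than a Python attribute:

```python
class Device:
    """Toy model: flashing firmware without the owner's passcode
    destroys the user data first, so updates can't be used to
    sneak past the lock screen."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.user_data = {"photos": ["beach.jpg"]}
        self.firmware = "v1"

    def update(self, new_firmware: str, passcode: str = None) -> None:
        if passcode != self._passcode:
            # No consent: wipe (i.e. destroy the data keys) before flashing.
            self.user_data = {}
        self.firmware = new_firmware
```

The update still goes through either way – the vendor just loses the ability to update a device into a state where someone else can read the existing data.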
There’s still the issue of “trust”, how do you verify that? With proprietary products, you can’t. It forces us to take a company’s security claims at face value, but I don’t know how much that’s worth. It’s not like apple hasn’t lied before.
Never sure. Except on the simplest of the devices. How do you know [as a Consumer] about the good use of keys at hardware and firmware?
dionicio,
Haha, I hope you don’t take offense, but do you mind if we try to perform a Turing test on you? It’s just that while your postings are kind of intelligible, some much more than others, they have a distinctly unnatural quality to them. While you don’t owe me an explanation, I’m curious if you’d have one to offer?
I get the feeling I’ve seen you on other forums too.
Ha. I was thinking the same thing. Probably just someone with a unique philosophical viewpoint who learned English as a 2nd language. That is my guess anyway.
My 2nd guess would be someone took a tech sector dictionary and fed it into a fortune cookie generator.
On being suspected small and primitive AI
As if Apple asks for ‘keys’ in order to trace signal ‘glitches’ in some hardware component of their products.
As example
There was a proposal by a deputy to amend the new law.
The parliament voted against it.
(not by a wide margin, alas, and there are several iterations, between the Senat and Assemblée Nationale, etc..)
But “they just voted to put technology executives of companies unwilling to decrypt their products in jail.” is plain false.
Exactly! The article linked by Thom clearly explains that a penal reform bill was receiving its first reading in parliament and the rightwing opposition sneaked in an amendment stipulating that “a private company which refuses to hand over encrypted data to an investigating authority would face up to five years in jail and a €350,000 (£270,000) fine.”
And more to the point:
“It remains to be seen whether the thrust of the amendment can survive the lengthy parliamentary process that remains before the bill becomes law.”
RT.
@Thom Holwerda
How is that different than China/Russia/etc?
The official newspapers in China claim that Apple has backdoors on iPhones.
It’s safe to assume that pretty much every electronic device with a CPU made in China/Taiwan has a backdoor; the Chinese are only telling on themselves if they are claiming such. (Note I’m not saying it’s not true, just pointing out that it goes without saying.)
But but… it was only that one specific case! Right? Just that one…
</sarcasm>
Law enforcement has to focus on victims, if it wants a relevant chance of winning. But there are other victims [kind of stupid: Us, as consumers].
Puts a conundrum on Us:
https://www.techdirt.com/articles/20160306/22252833817/cockpits-phon…
This logistical mistake is akin to having no way to override a President who has gone nuts and is pushing the red button.
It’s such a shame that people are even considering chipping away on their rights and freedom because we are afraid of terrorists. Politicians and government organizations have always used atrocities to reduce our freedom. This has to end!
It will, when humans stop being willing to let others do their thinking for them.
So basically never, then?
Unfortunately, your thoughts parallel mine.
It’s just unfocusing the conversation.
Point is: You don’t have to break-in like an intruder. Not anymore. Time to enforce the Law frame.
http://www.theguardian.com/technology/2016/mar/09/edward-snowden-fb…
“I guarantee you, Google and Amazon will soon have much more surveillance capability with drones than the military,” says Cummings. “They have much bigger databases, much better facial recognition, much better ability to build and control drones. These companies know a lot more about you than the CIA. What happens when our governments are looking to corporations to provide them with the latest defense technology?”
http://www.rollingstone.com/culture/features/inside-the-artificial-…
Worst of the outcomes – from a Consumer perspective – would be that no agreement is reached on what is Private|Discrete|Law-Enforcement-Interest. iPhones would be re-classified as military encryption devices, which the NSA would be more than happy to break the usual way, within their legal frame.
Saddest of all is the Consumer’s delusion that someone else is fighting his fight in this feudal war.
Sustained due in part to strict oversight of Steel Industry.
Could you extend the 5 days limit?
My low-energy-consumption CPUs don’t process that fast.
-a simple, replicable, disposable, replaceable silicon device- to make petitions?
“So, Hannigan doesn’t want a backdoor. He wants another set of keys for the front door and is requesting that all parties work together to decide whether this set should be left in the mailbox or under the welcome mat.”
No, they are asking Apple to go and open the front door for them.
They know that Apple has the keys to that ‘personal’ diary and family photo cabinet. But no, they will not ask Apple to open that cabinet. You, so trustful Apple.
https://www.techdirt.com/articles/20160309/07334533847/gchq-boss-say…
Apple can keep the whole keyring on its belt.
And Consumer can keep her ‘Guest Pass Card’, also.
Leeloo: Multi-pass, Multi-pass!
Me fifth element – supreme being. Me protect you.
😀