Earlier today, a federal judge ordered Apple to comply with the FBI’s request for technical assistance in recovering data from the San Bernardino gunman’s iPhone 5C. Since then, many have debated whether the FBI’s requests are technically feasible given the strong encryption on iOS devices. Based on my initial reading of the request and my knowledge of the iOS platform, I believe all of the FBI’s requests are technically feasible.
A look at the technical aspects involved.
If Apple doesn’t already have such a bypass build of iOS on hand, creating it will require a lot of manpower (not just developers and testers). Is the FBI going to cover that cost, or is this something Apple just has to absorb?
Apple does have to answer to its shareholders, after all.
I think the lower court has said the FBI must reimburse Apple for the expense.
For the engineering and development costs, sure. But will they compensate Apple for the R&D spent on security mechanisms that would be rendered null and void once the precedent is set that Apple can be forced to bypass them?
What about for the marketing dollars they’ve spent touting those features that would be rendered pointless? Will they quantify the loss of goodwill and consumer confidence and the loss of potential future earnings because of that erosion of consumer trust? Given that Apple’s success is built on the loyalty of their customer base, I’m betting they would argue that would add up to a pretty penny.
This was never about just the technical viability of removing certain restrictions. It comes down to:
1. Creating a bypass to security mechanisms in the device that previously did not exist.
2. Risk of the weakened software being adapted and used to exploit other devices in the wild.
3. The precedent it sets: Apple’s entire campaign of “even *we* can’t access your data, so *no one* except you can!” is reduced to a smoldering pile of rubble, as this will be just the first such request, with *many* more to follow.
This incident does not occur in a vacuum. It has *huge* business implications for Apple, and the FBI and the court are either missing that point or telling Apple they simply don’t care about those implications.
justanothersysadmin,
Such claims were rubbish anyway. To Tim Cook’s credit, though, his blog post backs away from that with more of a “well, maybe it can be done, actually, but we choose not to” kind of vibe, which I can respect more than defending bad claims.
Edited 2016-02-18 18:42 UTC
thinkmac,
I suspect the FBI would be paying Apple a healthy premium for its assistance; money is not the issue here. If Apple cooperates, it could actually save investigators the considerable money it would take to reverse engineer and extract the secret keys using an electron microscope.
Although there is a good chance of recovering any individual bit from a drive, the chance of recovering any meaningful amount of data from a drive using an electron microscope is negligible.
Page 243:
https://books.google.com.au/books?id=qji1ilg2-pAC&printsec=frontcove…
That is referring to hard drives. Alfman isn’t talking about hard drives; iPhones don’t have hard drives. He is talking about ICs.
http://semiengineering.com/every-chip-can-be-hacked-with-this-tool/
I say BS. If the FBI had the ability to do this they wouldn’t have gone to a court demanding Apple’s help.
IMO the most likely outcome of an attempted recovery would be an incomplete pile of hexadecimal gibberish.
The fact is that law enforcement agencies want people to believe that their data can be forensically recovered. It is a great deterrent.
Courts tend not to accept any recovery method that has the potential to change the original data. That is why drives are cloned and the tests are performed on the copy.
I’ve seen this done on a state-of-the-art design using public-university-level equipment, so I’m certain it is within the reach of any capable government to do this on a few-year-old iPhone’s flash chip, even if the scale is much larger.
But the problem here is that they don’t even need to do this. According to the article, on the 5C the key itself is stored on the flash chip, possibly symmetrically encrypted with the PIN. Thus, all they need to do is dump the contents of the flash, and this can be done by almost anyone at the bus level, even hobbyists (see how video game console DRM cracking is done).
This suggests that the FBI, like Apple, is doing this just for show. I’m considering donning my tinfoil hat right now…
They would also need to know the on-chip data layout, as well as the encryption methodology used. I’d be more than willing to bet the FBI already knows the encryption (they would have to, in order to officially accept or reject iPhones for use by their agents), but the on-chip data layout is non-trivial to determine without either buying identical devices for comparison or getting it directly from Apple.
ahferroin7,
Emphasis mine. That’s exactly what they’d do. Perfect the method of extracting the ID on other phones, and only once it’s solid would they apply it to the defendant’s phone.
Hackers do this all the time; hell, the FBI probably has some in custody right now who could do this for them.
If criminals were sure of that ability, they would surely look for alternatives.
galvanash,
Additionally, they wouldn’t need to extract gigabytes of data using this technique, only the 256 bits that are unique to the phone. From there, the data would no longer be physically tethered to the phone and it would be a simple matter to brute-force the PIN.
It may take a lot of money for reverse engineering, expertise, sacrificial iPhones, etc. But with patience I see no reason it couldn’t be done. It would be a lot easier for them if Apple cooperated, though. It’ll be interesting to hear whether or not Apple will be coerced into helping (unless the court order is kept secret, in which case we’ll never know).
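To make the claim concrete, here is a rough sketch of that off-device search once the unique key material is in hand. The key derivation below is a hypothetical stand-in (PBKDF2 tangling the PIN with the extracted 256-bit secret), not Apple’s actual scheme:

```python
import hashlib

def derive_key(pin, uid):
    """Hypothetical stand-in for the device's key derivation:
    tangle the PIN with the extracted 256-bit secret."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), uid, 1_000)

def brute_force_pin(uid, target_key):
    """Try all 10,000 four-digit PINs entirely off-device."""
    for n in range(10_000):
        pin = f"{n:04d}"
        if derive_key(pin, uid) == target_key:
            return pin
    return None

# Once the unique secret is physically extracted, the attempt counter
# and escalating delays no longer apply: the search runs off-device.
uid = bytes(32)                      # stand-in for the extracted secret
target = derive_key("4951", uid)     # key the attacker wants to match
print(brute_force_pin(uid, target))  # prints 4951
```

The point is simply that the search space of a 4-digit PIN is tiny; everything that makes it hard in practice lives in the phone’s rate limiting and erase counter, which extraction sidesteps.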
I don’t think it would cost so much to modify the delay between wrong PIN codes and to alter the maximum number of retries.
Presidential candidate John McAfee has offered to do it for free:
http://www.businessinsider.com/john-mcafee-ill-decrypt-san-bernardi…
danger_nakamura,
I disagree here; the FBI wouldn’t be responsible for creating a new weakness. It’s seeking Apple’s assistance in developing an exploit for a weakness that already exists. It would be naive to think this won’t be accomplished by others; using Apple is just a shortcut. I wouldn’t be surprised if hackers at the Chaos Computer Club or DEF CON could pull it off.
(somewhat unrelated link…)
https://www.ccc.de/en/updates/2013/ccc-breaks-apple-touchid
This is a most amusing display of theater. The FBI could simply bypass the middle man and ask the NSA for the data.
Or, instead of breaking citizens’ privacy, just forbid weapons in the first place; there’d be no more shootings out there, and thus no need to break into iPhones.
Do I need to explain the word criminal to you?
LOL :
http://killedbypolice.net/
https://www.washingtonpost.com/graphics/national/police-shootings/
http://www.washingtonpost.com/sf/investigative/2015/12/26/a-year-of…
http://www.huffingtonpost.com/entry/police-killings-no-indictment_u…
or even that :
https://dogmurders.wordpress.com/
https://www.facebook.com/DogsShotbyPolice/
Ever heard of police ?
I see guns, guns everywhere…
Police? You mean legalized criminals, like what we have in this crazy country? Yeah, I’ve heard of them.
But police, no police… doesn’t matter. Criminal, from the word crime. They don’t care what the law is. They don’t care if the citizens, or even the police, have guns or not. What’s important is that they have them and if we outlaw guns and somehow manage to keep criminals from getting them too (an impossible task) they’ll simply resort to some other weapon instead. They don’t care about you. They don’t care about your laws. This is the mistake so many people make by thinking more laws will stop these people. If these people cared about your laws in the first place, then they’d not be criminals at all. I do not understand how such simple logic escapes so many.
Imagine guns forbidden to citizens: 5-year-old kids couldn’t accidentally shoot family members, police would feel less threatened and hence less prone to shoot everything that moves, etc. Get the idea?
Citizenship, buddy. Like in Europe: people are pretty damn cool, not hailing some sort of deity while worshipping weaponry like jewellery. No, I’m not talking about Middle Eastern countries.
Wasn’t there a massive shooting a couple of months ago in Paris? How did the ban on weapons work out for you then?
It is a question of degrees. Such mass shootings are rare in Europe. Unfortunately not in the US.
I don’t think anyone was pretending that gun control was going to wipe out crime overnight. It would, however, have an impact on the many accidental fatalities that happen when handling guns at home. Who needs crime when blundering does just as well?
Some Europeans are not that different from some Americans when it comes to short memory spans and cognitive dissonance. Apparently.
Mass shootings in Europe are most definitely not a rarity. E.g. Europe has had plenty of terrorist groups, both homegrown and imported, operating for decades. There have also been bloody civil wars on European soil recently. Etc, etc.
The body count due to gun violence in Europe is not as insignificant as many Euros love to pretend it is.
This may be true over the long-term, but recent statistics show quite a different picture between the US and Europe. For example, here are the 2011 stats for Finland compared to the US, taken from http://www.gunpolicy.org/
Finland (per 100,000 population)
45,700 civilian guns
3.50 gun deaths
0.33 gun homicides
3.06 gun suicides
2.1 homicides (by any method)
United States (per 100,000 population)
101,050 civilian guns
10.38 gun deaths
3.55 gun homicides
6.42 gun suicides
5.21 homicides (by any method)
I picked Finland for comparison since it apparently has the highest gun deaths per population in the EU. I’m sure the reasons for the difference are far more complex than just throwing around a few statistics suggests, of course.
Your post only reinforces my point: Gun deaths in Europe are not insignificant.
Banning guns is not the magic pill some Euros make it out to be, just like banning drugs did not work out the way many Americans thought it would…
The shooting was not really French vs. French; it was a few French Muslim extremists against some random targets. It’s just meant to spread fear and legitimize politicians voting more laws against freedom, with the fallacious claim that if you’re against these laws, you’re for terrorism. But I guess you know the pitch already; it has become so common these days, like a miracle cure.
So French Muslims are not real French people, got it.
Oh the places dissonance will take you!
Being French is about a whole culture made of history and, you guessed it, secularism. Embracing a foreign (id|th)eology to commit crimes against your mother country isn’t really what I call deserving respect. Hence your own qualification of “not real French people”, which I never even thought of. But thanks for the dissonance.
LMAO, thanks for proving my point.
PS. No need to thank me for your homegrown dissonance, it’s all you really.
I’m glad you see correlations and stuff, proving your points, validating your personal agenda. I’m a comprehensive guy and don’t blame your autism. Good boy, now quiet…
No, he’s right. What you tried to say about what being French means made absolutely zero sense. By your logic, one could say I’m French if I learned the history and embraced your ideals. However, I’m clearly, by both name and looks, of strong German ancestry, so therefore I’m not French and would not be even if I obtained citizenship in France. Your logic simply doesn’t hold up, and you demonstrate it further by lashing out with childish name-calling.
A French Muslim, if born French, is still French whether you agree with them or not and regardless of their actions. Period.
I wrote:
1- FRENCH: native of France, but perhaps with foreign ancestors
2- MUSLIM: native or converted, I don’t care
3- EXTREMIST: someone who tries to enforce his will/power over others
What the fuck have you missed? Because I mentioned history and secularism? Yeah, just like you can feel like a US citizen just by hanging a striped flag on your porch?
Sure, French people, despite their vast diversity, unite under the same conception of citizenship and democracy, and yes, despite our close relationship with the Pope through the ages, France became a secular country by law back in 1905.
We’ve separated church and state since then: no more swearing on the Bible, no more praying every now and then. Faith is left as a personal choice, not a government attribution like in Germany, where you pay taxes according to your confession.
Those extremists are into the Jihad thing, which is not compatible with French values. Nothing prevents them from going to Jihad-land if they feel more welcome there, instead of setting off bombs in their native country.
But that’s a rather long topic that has little to do with operating systems. I was just trying to take the problem from the right end, instead of twisting things into “let’s break into people’s privacy since we left them with the freedom to shoot anyone at will”.
Ah, resorting to mud-slinging and cursing instead of debating. Well done, sir. I will ignore you in future.
Your lack of self awareness keeps you from realizing you are indeed proving my point regarding the utter dissonance in your posts:
– You drone on about how progressive and inclusive your society is, only to openly exclude from it anyone who doesn’t share your religious/ideological background.
– You regurgitated something about my supposed confirmation bias, when you’re the one going out of his way to ignore the obvious flaws in your two-bit, reactionary, naive “understanding” of this matter.
– You talk about being a “comprehensive” guy (that does not make much sense in English, so I assume you meant “sympathetic” or “understanding”) while insulting me and telling me to shut up.
Seems doublethink is your mother tongue. So once again, merci pour prouver mon argument.
PS. In any case, getting back on topic: banning guns does not work, as your country painfully proved a couple of months ago. Just like banning drugs hasn’t worked, or banning political ideologies, or banning sacrilegious speech, or banning pre-marital intercourse, or banning…
Indeed, I think it’s an attempt to set a legal precedent which lowers the bar for future cases. This was a work-supplied phone (at a government job no less) that likely didn’t have any incriminating info on it beyond location data, which would be on Apple’s servers anyway.
Morgan,
It seems so. I doubt they need the phone for a conviction in this case. It’s a government phone and the government is unambiguously entitled to it, so it almost seems the ideal case to go for precedent without any legal/constitutional technicalities getting in the way.
Very interesting article. So if you use an alphanumeric password on a 5S or newer you are safe; Apple can’t break it even if they wanted to.
At least on paper, because nobody really knows if there’s already a backdoor (or a duplicate store of your passcode) in software (iOS) or hardware (the A9).
No, that’s not correct.
If one were to blindly apply a brute-force attack against an alphanumeric passcode, then yes, it would be unlikely to work.
Crackers don’t do that, because it’s dumb. There are much smarter methods than brute force, especially when much is known about the creator of the password.
I meant using a proper password. Dictionary attacks work only if you use obvious passwords.
With a proper password the only way is brute force.
Yes, the majority of people use obvious passwords, but that is not a problem with the iPhone or the algorithm; it’s a problem with people. Apple gives you a proper way to lock your home, but if you don’t lock the door you can’t blame Apple.
Proper passwords sound like true Scotsmen.
Seriously, ‘qeadzcwrsfxv1331’ is easy to crack. That’s not exactly password123
http://arstechnica.com/security/2013/05/how-crackers-make-minced-me…
You’re wrong. First of all, you don’t have the password hash when trying to break into an iPhone, so the method described in the Ars Technica article is not usable here.
And even if you could get it, it wouldn’t be MD5 (a weak hash) but SHA-256 or stronger.
Also, none of the passwords in the article are random; they are all vulnerable to dictionary attacks.
Ask them to crack this SHA256 hash (my iPhone password, less than 16 chars):
35e71157cbd29d0f9b60ccd651b3c17239ef0d55aa6c6953c5c8f57cee309bb9
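For context on what “cracking” means here: an attacker never inverts the hash; they guess candidates and hash each one. A minimal sketch of that loop (the wordlist is purely illustrative), which is exactly what a long random password defeats:

```python
import hashlib

def sha256_hex(s):
    return hashlib.sha256(s.encode()).hexdigest()

# A real cracker feeds millions of leaked passwords, plus mangling
# rules (capitalization, digit suffixes, keyboard walks), through this.
wordlist = ["password123", "letmein", "qeadzcwrsfxv1331", "fabrica64"]

def dictionary_attack(target_hash):
    """Hash every candidate and compare; never 'reverse' the hash."""
    for word in wordlist:
        if sha256_hex(word) == target_hash:
            return word
    return None

print(dictionary_attack(sha256_hex("letmein")))  # prints letmein
# A long random password never appears in any wordlist, so the
# attack simply returns None against it, no matter the hardware.
```

This is why the challenge above is fair only in one sense: an unsalted SHA-256 of a genuinely random string resists wordlists, but the same hash of a guessable string would fall in seconds.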
fabrica64,
Yeah, that’s kind of the point; it’s why they were able to crack a majority of real user passwords in so little time. If you have a long password with real entropy instead of pseudo-random words, that goes a long way toward making it difficult to crack.
Apple should come out and say it explicitly: short passwords or easy words are easily crackable even with strong crypto and obfuscation. The problem is that many people have trouble remembering high-entropy passwords, especially given the number of unique passwords we’re supposed to remember.
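As a back-of-the-envelope check on the entropy argument (assuming each element is chosen uniformly at random, which is the whole trick):

```python
import math

def entropy_bits(alphabet_size, length):
    """Entropy of a secret chosen uniformly at random:
    length * log2(alphabet size)."""
    return length * math.log2(alphabet_size)

# 8 random characters from the full printable ASCII set (~95 symbols)
print(round(entropy_bits(95, 8), 1))    # prints 52.6

# A 4-word random passphrase from a 2,000-word vocabulary:
# the "alphabet" is words, not letters.
print(round(entropy_bits(2000, 4), 1))  # prints 43.9

# A "word plus two digits" pattern is closer to one dictionary
# pick plus a 100-way suffix: far too little to resist cracking.
print(round(entropy_bits(2000, 1) + entropy_bits(100, 1), 1))  # prints 17.6
```

The numbers only hold if the choices really are random; a human-picked “random-looking” string like a keyboard walk has far less entropy than the character count suggests.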
Alfman, I fully agree with you: they were all weak passwords, and that was the weak point the crackers took advantage of.
Even “qeadzcwrsfxv1331” is a weak password, although it doesn’t seem weak; just see where those keys are on a US keyboard.
A proper password will not be crackable.
Do you understand what a true scotsman is?
Well… “a proper password can be cracked only by brute force or by exploiting algorithm vulnerability”
Sounds better?
No, it doesn’t tell you how to create a proper password without expert knowledge of the state of the art of cracking.
Honestly, reasonably smart people think that passwords like Fabrica64 are good and strong. They aren’t, so just telling them to use alphanumeric passwords isn’t doing them much good. Nor is telling them to use one that can’t be cracked. Well, duh. Of course I’m going to use one that I think can’t be cracked; that’s why I chose “Fabrica64”: it’s got both cases and numbers.
If you had said something like:
“With a randomly salted SHA-256 hash, use a minimum length of 12 characters that doesn’t fit these patterns ….”
That would be useful, actionable information that would help people use Apple’s keys to lock their own door correctly.
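For what it’s worth, a salted, deliberately slow hash along those lines might look like this (a sketch using PBKDF2-HMAC-SHA256; the iteration count and salt size are illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Salted, slow hash: a random per-user salt forces attackers to
    crack each account separately instead of using precomputed tables."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    """Recompute and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify("correct horse battery staple", salt, digest))  # prints True
print(verify("password123", salt, digest))                   # prints False
```

The salt defeats precomputation, and the iteration count makes each guess expensive; neither helps much if the password itself is in a wordlist, which is the point being argued above.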
You may use a passphrase with punctuation and spaces; that’s a reasonably secure password (obviously not something like “the cat is on the table”).
The iPhone stores the key in the secure element, which allows only 10 retries before wiping the key, so it’s very hard if not impossible to break. It’s the same concept used in smart cards.
If the attacker can access the password hash (he can’t on an iPhone), it’s better to choose a truly random password, which you can create using a Linux tool like apg.
Honestly, I think people use weak passwords because they have little to protect, but if a password is used, for example, to protect a multi-million-dollar account, I bet even ordinary people would think twice about fabrica64 being a very weak one. I don’t think people are so dumb.
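The 10-retry wipe behaviour described above boils down to a counter guarding the key. A toy model of that logic (not Apple’s implementation; the class and its details are invented for illustration):

```python
class SecureElement:
    """Toy model: destroys the wrapped key after too many bad PINs."""
    MAX_ATTEMPTS = 10

    def __init__(self, pin, key):
        self._pin = pin
        self._key = key
        self._failures = 0

    def unlock(self, attempt):
        if self._key is None:
            raise RuntimeError("key wiped; data unrecoverable")
        if attempt == self._pin:
            self._failures = 0
            return self._key
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._key = None  # key destroyed on the 10th failure
        return None

se = SecureElement("1234", b"\x00" * 32)
for guess in range(10):          # ten wrong guesses...
    se.unlock(str(9000 + guess))
print(se._key)                   # prints None (key wiped)
```

With this in place even a 4-digit PIN survives online guessing; the debate above is precisely about what happens when an attacker can update the firmware and remove this counter.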
I thought the context of this was making an iPhone as resistant as possible even when the secure element firmware has been updated by a determined hacker to remove the auto-erase and time-delay features. If it still has them, then even weak passwords aren’t that bad, which is why it was created to do the things it does.
As far as I know, the secure element in smart cards and elsewhere forbids firmware updates without authentication; otherwise it wouldn’t be a “secure element” at all, and most are also physically tamper-resistant (i.e. resistant to being opened and analyzed with microscopes).
But I really don’t know whether the SE inside the iPhone has been altered by Apple to allow firmware updates without authentication.
In fact, the current dispute is over an iPhone 5C, which has no SE, so a specialized iOS build could avoid the auto-erase and delays. I doubt a corporation like Apple would call something “secure” when it isn’t; it may be, but they’d have a lot to lose if discovered.
Wow. That’s insane.
In any case, the article states that the firmware can be updated without data loss.
Yeah, a lot of things are feasible, but that doesn’t mean Apple should comply with requests to build a backdoor into its OS just to appease an out-of-control government that thinks it has a right to personal information on anyone! Apple should do the right thing and steadfastly refuse to cooperate with this ruling. If the government wants to gain access to an iPhone or any other device, let it build its OWN tools or be damned! I’m tired of the U.S. government thinking that it has a RIGHT to everyone’s data and pretending that “national security” is an excuse to get citizens to agree to practically anything! Get real! The government wants to control the people and it is willing to do anything to achieve that goal.
At some point the Fed’s chickens had to come home to roost. The FBI is basically asking Apple to do something that the government has gone out of their way to criminalize. No wonder the Apple legal counsel told the CEO to not touch this with a 100ft pole.
Exactly! “Any society that would give up a little liberty to gain a little security will deserve neither and lose both” – Benjamin Franklin
Apple, or any company for that matter, should never let these monsters have access to people’s private information, no matter what. People nowadays are so delusional that they are ready to compromise their privacy for the sake of so-called security. Privacy is the most important thing; we should never compromise it for any reason. Once we lose it, we are going to lose every other right we have as human beings, one by one.
I don’t think the question is of technical feasibility. If it were infeasible Apple would have argued that (they didn’t).
What they are arguing is that it is a slippery slope, and that it is dangerous. They are being asked to create firmware that intentionally weakens the security of their own product, something obviously counter to its design; in other words, they are being asked to perform a hack.
The problem is how do they keep it from being used on devices other than the one the writ is for? Does Apple even have a mechanism they can use to limit a firmware to a single device reliably? If so, is it designed securely? Can it be? Is there a way to do so without giving the government access to the firmware itself? If not, would it be feasible for the government to alter said firmware to apply it to other devices?
The point is that while the government is (to their credit) asking for a narrowly tailored remedy, is it possible for Apple to actually tailor it as such, given the vast capabilities of the government?
It’s like they are being asked to build a lock pick that only works on one of their locks, and then hand it to the best locksmiths in the world for analysis. It’s not that they can’t build the lock pick; the question is how they can ensure it can’t be used on other locks.
I for one am glad to see they are pushing back at least a little, in the hopes of being able to comply without giving away the keys to the kingdom, so to speak. Far too many companies would have just said f*ck it and left it up to the judges to watch the watchmen, which they almost never do in practice.
This makes their reasoning behind the design of the secure enclave processor in later devices a lot more relevant: you no longer have to worry about complying with court orders like this if you are no longer capable of doing so…
I believe it’s likely:
– each device has an identifier
– the firmware would contain the identifier of the device it needs to run on
– the firmware can only be loaded on a device if it’s signed by Apple
– the altered firmware, with the built-in identifier check, would be signed
I think it’s all show, or Apple knows their system isn’t as secure as they claim it is. 🙂
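Those four checks could, in principle, be sketched as follows. This is purely hypothetical: HMAC stands in for Apple’s real asymmetric code signing, and binding the signature to a device identifier is an assumption taken from the list above:

```python
import hashlib
import hmac

# Stand-in for a signing secret only Apple would hold.
SIGNING_KEY = b"stand-in for Apple's private signing key"

def sign_firmware(image, device_id):
    """Bind the signature to one device by signing image + identifier."""
    return hmac.new(SIGNING_KEY, image + device_id, hashlib.sha256).digest()

def boot_check(image, signature, my_device_id):
    """The boot ROM recomputes the signature with *its own* identifier,
    so firmware signed for another device fails verification."""
    expected = hmac.new(SIGNING_KEY, image + my_device_id,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"unlocked-firmware"
sig = sign_firmware(image, device_id=b"TARGET-PHONE-ID")
print(boot_check(image, sig, b"TARGET-PHONE-ID"))  # prints True
print(boot_check(image, sig, b"SOME-OTHER-PHONE"))  # prints False
```

Under this scheme the signed image is inert on any other device, which is why the question of how reliably Apple can do this per-device binding matters so much in the thread above.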
Right; and those “keep-this-unique-to-this-device” protections are totally guaranteed to hold up forever. The point is that there are protections in place against compromise and Apple is being asked to *remove* those protections and leave weaker “keep-this-unique-to-this-device” protections in place.
As has also already been mentioned, the point is that this is precedent-setting. If Apple complies, even if the FBI pays for the required development, the barrier to future orders to weaken other phones in the future has been basically completely removed.
It is customary for a lot of companies to complain before they comply. But if the full force of the law is applied, they will comply or close shop (think Lavabit).
You can’t win against the government in the long run. The only other option is to leave:
https://ar.al/notes/so-long-and-thanks-for-all-the-fish/
Unless you want to go to jail, of course. Not a lot of people want that. Some journalists do it to protect their sources, things like that. But who wants to go to jail to protect a (suspected) criminal… I mean, a principle?
I agree with your arguments. Maybe Apple is not so confident in the ‘uniqueness’ of the security solution each individual device is provided with…
This deserved to be quoted!
Facebook ?
Federal Bureau of Investigating your Operating System.
I thought it had the F of Forensic
https://xkcd.com/538/
Let’s all take a moment to realize the guy found a way to counter this: by getting killed.
It wouldn’t surprise me if this is all smoke and mirrors. We already know the US government can force US companies to do or say whatever it wants. For all we know, such backdoors have already been implemented and this is just a dog-and-pony show to make people believe they still have some privacy left, and that US companies aren’t already slaves to the government’s will. After all, people are increasingly becoming more fearful of our militarized police forces and above-the-law government behavior than they ever were of so-called terrorists.
ilovebeer,
Through the Snowden leaks it came out that lots of companies were lying about their cooperation with the NSA. Something I’m curious about: when the legal department gets these court orders, would there be a clause that the order has to be kept secret from the company executives themselves? Does it ever happen that employees who are cooperating are under gag orders not to report it to the executives? It might sound like something a conspiracy nut would say, but I’m genuinely curious how these secret court orders work.
It could provide a kind of plausible deniability, in that executives could say in good faith that the company is not cooperating even when it is.
You’re giving them WAY too much credit; they simply lie.
Compartmentalizing information, as you describe, is for avoiding personal criminal prosecution and/or perjury charges, none of which is going to happen because of cooperating with the NSA. It’s only really useful as insurance against the courts.
It doesn’t help much with public outrage at all, though. For that you just lie through your teeth, or keep your mouth shut completely. If you get caught, you get caught; whether you lied knowingly or not doesn’t really matter in the end, especially when you can just say “the government made me do it”.
galvanash,
Yeah, haha, that’s more than plausible. I was just wondering if some court orders can be secret enough that the executives are left in the dark along with the rest of the public. I don’t know, maybe that’s far-fetched.
It’s not hard to see gag orders like that happening in cases where need-to-know and/or secrecy plays an important role and potential leaks need to be minimized. I do agree that most people, at the end of the day, would feel equally betrayed whether a company’s officers knowingly lied or truthfully didn’t know what was happening. Does anyone really care about plausible deniability in cases involving government abuse, intentional public deception, and circumventing the law because, you know, “national security”? I can’t say I do.
People can debate whether or not they should be afraid of terrorists, but everyone should definitely be worried about what the future looks like if the US Government keeps wiping its ass with the constitution.
Hi,
Eventually, after all the debate and whatever work is involved in unlocking the phone, the FBI is going to get access to the data on the phone, and use that data to convince a judge to imprison a corpse. 🙂
– Brendan
This backdoor already exists in all iPhones, because otherwise the iPhone would not be allowed to be sold officially in countries like China, Saudi Arabia, and Russia.
Does anybody remember why even Google left China?
Basically, Apple plays hardball with the FBI but not with China, Russia, or Saudi Arabia!
http://i.imgur.com/EXlda6Y.gif
😉
This story just keeps getting stranger. It turns out that the Apple ID password for the iPhone was changed while it was in the government’s possession, and they have locked themselves out.
According to Apple, the Apple ID password on the iPhone was changed “less than 24 hours” after being in government hands. Had the password not been altered, Apple believes the backup information the government is asking for could have been accessible to Apple engineers. The FBI has said it has access to weekly iCloud backups leading up to October 19, but not after that date, and it is seeking later information that could be stored on the device.
Apple had been in regular discussions with the government since early January, and proposed four different ways to recover the information the government is interested in without building a back door. One of those methods would have involved connecting the phone to a known Wi-Fi network.
Apple sent engineers to try that method, the executives said, but the experts were unable to do it. That was when they discovered that the Apple ID password associated with the phone had been changed.
Apple says the entire backdoor demand could potentially have been avoided had the Apple ID password not been changed, as connecting to a known Wi-Fi network would have caused the device to start backing up automatically, so long as iCloud backups were enabled.
That speaks to current federal policies not even being adequately communicated; enforcement goes well beyond that.
No intention of blaming those who tried and failed, worsening the case; it was an attempt at cutting through.
I blame the lack of coordination, from the Feds on down.
Reportedly, it was a county employee who reset the iCloud password (not the phone passcode). It was later learned that the FBI had requested the password reset. This prevented a security workaround in which the phone could be backed up to the iCloud account, from which the data could be read.
Apple could have recovered information from the iPhone had the iCloud password not been reset, the company said.
So the lesson learned from this: the phones are secure, but iCloud isn’t.
————-
Additionally, the County had purchased “mobile device management” software but never installed it on the phone.
http://hosted.ap.org/dynamic/stories/U/US_APPLE_ENCRYPTION?SITE=AP&…
That does sound like the U.S government we’ve all come to know over the past decades.
“… it does highlight that we have awesome new technology that creates a serious tension between two values we all treasure: privacy and safety.”
…
“…We shouldn’t drift to a place—or be pushed to a place by the loudest voices—.”
https://www.lawfareblog.com/we-could-not-look-survivors-eye-if-we-di…