To make a long story short, it sounds like Apple is going to be collecting a lot more data from your phone. They’re mainly doing this to make their services better, not to collect individual users’ usage habits. To guarantee this, Apple intends to apply sophisticated statistical techniques to ensure that this aggregate data – the statistical functions it computes over all your information – don’t leak your individual contributions. In principle this sounds pretty good. But of course, the devil is always in the details.
While we don’t have those details, this seems like a good time to at least talk a bit about what Differential Privacy is, how it can be achieved, and what it could mean for Apple – and for your iPhone.
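While Apple hasn't published its mechanism, the canonical building block of local differential privacy is randomized response. The sketch below is illustrative only, not Apple's actual scheme; the honesty probability, population size, and 30% trait rate are made-up parameters:

```python
import random

def randomized_response(truth: bool, p_honest: float = 0.5) -> bool:
    """Report the true bit with probability p_honest; otherwise
    flip a fair coin. Any single answer is plausibly deniable."""
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

def estimate_rate(responses, p_honest: float = 0.5) -> float:
    """Invert the noise over the aggregate:
    observed = p_honest * true_rate + (1 - p_honest) * 0.5"""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_honest) * 0.5) / p_honest

# Simulate 100,000 users, 30% of whom have the sensitive trait.
random.seed(0)
responses = [randomized_response(random.random() < 0.30)
             for _ in range(100_000)]
print(f"estimated rate: {estimate_rate(responses):.3f}")  # close to 0.30
```

The key point is that no individual answer reveals the truth, yet the aggregate estimate converges on the real rate as the sample grows.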
As I started reading the article, my first thought was “Yes, it’s easy to mask the source at first, but the more you use the device the more you can be identified even with ‘mathematical noise’.” Looks like I was right:
Morgan,
Yep, the more bits of information collected, even if some are randomized for privacy, the greater the probability of being uniquely identifiable.
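A quick simulation of that point, using a randomized-response mechanism with illustrative parameters: if the collector can gather the same randomized bit from one user over and over, it can average the noise away.

```python
import random

def randomized_response(truth: bool, p_honest: float = 0.5) -> bool:
    # Tell the truth with probability p_honest, else flip a fair coin.
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

random.seed(7)
secret = True  # one user's sensitive bit, reported again and again

for n in (1, 10, 100, 1000):
    reports = [randomized_response(secret) for _ in range(n)]
    mean = sum(reports) / n
    # The mean drifts toward 0.75 if secret is True, 0.25 if False,
    # so enough repeated reports effectively reveal the bit.
    print(f"{n:5d} reports -> mean {mean:.2f}")
```

This is why serious deployments enforce a privacy budget: once a user has contributed some number of reports about a value, collection must stop or the noise must grow.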
I agree with the author that it needs to be peer reviewed in the open. Companies often provide blanket assurances (aka “trust us”) that turn out to be totally worthless under close inspection.
The question the article did not ask, but I certainly would: what exactly is this information being collected for? Is it really to improve the user service or is it for targeted advertising? Can the user opt out?
Another question that mustn’t be overlooked is whether the data really needs to leave the user’s possession in the first place. As an example, a GPS breadcrumb feature might be implemented in a way that reveals private data to the service provider, where it can be the subject of employee snooping, the NSA, subpoenas, etc. Or it might be implemented in a way that the service provider only sees encrypted data and never plaintext data. This can be a good litmus test for whether a company’s concern for user privacy is genuine or fake.
Oh, I’ll predict the results of that litmus test 100%.
As it happens with Apple, the results of that litmus test will be either 100% or 0% depending on tester, with no litmus wasted.
Not going to happen, it’s not practical.
And also kind of naive.
Any small part of the system can collect information and send it to Apple. So all code would need to be checked, and all binaries would need to be checked to confirm they are based on that source. And it’s Apple’s own compiler and hardware, so those would also need to be checked (also read: Reflections on Trusting Trust). And any change to the code would need to be checked again. Do you have any idea how much work that is? And even then it wouldn’t give you real assurance.
Also Apple has a forced update system in place. They can change any part of the system at will.
It doesn’t matter how sincere Apple is; we can’t check them or keep them in check if they ever change (and companies do change: Microsoft always used to say they were about privacy, and now you have Windows 10). And the pressure to change will only get bigger in the future. If it doesn’t come from more efficient ways of building software, then it will come from their shareholders.
I wish you luck with buying your single vendor integrated solutions…
Lennie,
It’s one thing to say an open peer review is not going to happen (ie because apple doesn’t want its claims to be independently verified). But I completely disagree that peer reviews are not practical, and I believe most security researchers would consider that an excuse.
You are right that apple could deliberately con the peer review process by moving things into proprietary code outside of its official data collection program to avoid being peer reviewed. While this would be a problem, IMHO it’s not a good reason for not having peer reviews. This logic could be used anywhere: oh, it’s too naive and too impractical to peer review the entire debian repo, therefore it’s not worth reviewing this one package. Or: it’s naive to peer review FF or chrome’s data collection policies because mozilla and google could technically run arbitrary code via an updater if they wanted to…
Obviously we need to recognize the risks of deliberate sabotage by our vendors. But at the same time peer reviews aren’t just for adversaries; they’re also about catching mistakes that happen accidentally. Apple should be submitting to the community for peer review and encouraging outside researchers to look, because it’s one of the better ways we have to catch mistakes.
In other words, it’s the same tired “Apple and everyone else should open source everything to the community” argument that completely ignores all common business sense. Start by looking up what a trade secret is.
darknexus,
First off, apple hasn’t claimed that. But it would be rather disingenuous for a company to claim “trade secret” privileges for themselves on the one hand while on the other hand not respecting the exact same privacy rights of end users who want to know the scope and protections being used for their data. When it comes to digital security, “trust us” is the worst answer there is.
Who knows, apple may yet commit to a peer review, and I would be the first to commend them for it.
First: peer code review, security review and privacy review are not the same thing.
Second: I can’t believe Apple wants anyone from the outside to see what they are doing. Just look at how they participate in standards processes. A lot of the time they have someone sit in on all the standards-making processes and basically say nothing, sometimes not even vote. Maybe they will say something some of the time about an existing standard, but almost never about new standards, because someone might infer what products Apple could be making. All the companies that help produce products for them are under very strict NDAs. Does that sound like the kind of company that would open up everything to others from outside?
Also, something like the Debian repo is peer reviewed because it’s being developed in the open. That’s a very big difference, a different way of working.
Lennie,
Yes, that’s all true, and then some. However it’s user data we’re talking about here, so the motivation to verify that data is being collected appropriately isn’t without merit.
Now with that said, it’s “normal” for corporations to keep users in the dark. Is apple more responsible than google, facebook, ms, etc? I honestly don’t know, precisely because these companies are so opaque about what they do. They only volunteer details in vague corporate-speak that puts them in a favorable light. If apple really wanted to stand out from the rest and say “we’re better with privacy than those other guys *cough* *cough* google”, it ought to push for independent peer reviews and publicly shame other corporations into having them too.
This is fine in theory. The trouble is that apple’s plans for monetizing user data may be much more similar to its competitors’ than it wants to admit, in which case it probably doesn’t want to call so much attention to it.
The thing is: Apple can never properly prove what they say they are NOT doing. It’s the same as with (security) bugs: you can’t prove you don’t have bugs in software.
I’ll just leave this little quote here from my fellow countryman. 🙂
“Testing shows the presence, not the absence of bugs”
https://en.wikiquote.org/wiki/Edsger_W._Dijkstra#1960s
So that is why it’s all theory. Audits are a great way to get certificates though. 🙂 A lot of the time certificates make no sense though.
Anyway, that’s my opinion. Others might have another opinion. And that’s a good thing.
Unless Apple owns the entire network as well, you are wrong. There is no single-vendor system in the internet age, and there is no way that Apple can start phoning home en masse without anybody noticing.
Furthermore, no company would risk its credibility so carelessly.
You can’t compare that to the Windows 10 case either, since users can know about the upgrade and choose not to move to Win10.
They now have the perfect smokescreen don’t they ? 🙂
Are you sure about that, seeing as how closing the upgrade window now implies agreement? The only reliable way to block it now seems to be via group policy, which not every version of Windows has. Most of the registry keys seem to be ignored, and how many typical users would know what to do with a registry value anyway?
“Companies often provide blanket assurances (aka “trust us”)”.
It’s always been the case with closed source. Ours is blanket confidence. Same with the closed recipes at our preferred restaurant, or the Coke I drink, etc.
Maybe there’s a little fixation on this. Lots of weaknesses and holes have resided in open code (for decades, in some cases) until new, stricter development and diagnostic tools and platforms came into the game.
Nothing to say about hardware; there the game has not even started yet.
Waiting for my Personal IC printer [Should I wait seated?]
TRANSPARENT integrated circuits, auditable with a conventional microscope.
Transparent ICs: no need for practicality. Need for auditability, teachability, provability.
“GPS breadcrumb feature might be implemented in a way that reveals private data to the service provider”
If it’s not accessible to the telco, it’s not accessible to anyone up the food chain either.
Very good point from Alfman.
Here I’m more extreme. If I’m sketching or typing a personal note or drawing, and not pushing that note into any comm-link buffer, then that note shouldn’t be available to anyone, not even Apple’s cloud.
[Tower triangulation at another Frame].
As you certainly have noticed, targeted advertising is usually advertised as ‘improving user experience’… Thus no difference.
It’s thermodynamics.
Zeroth: You must play the game.
First: You can’t win.
Second: You can’t break even.
Third: You can’t quit the game.
Given apple’s stance on privacy, any leaking would more than likely (can’t be 100% certain) remain within Apple. As they (apparently) don’t sell your data to others (eg advertisers), then overall I’m happy with this approach.
Only time will tell eh?
Two final points:-
1) I’ve never seen the ads that Thom keeps spouting on about
2) Apple clearly ain’t Google.
“At the end of the day, it sure looks like Apple is honestly trying to do something to improve user privacy, and given the alternatives, maybe that’s more important than anything else. “
On Apple starting the conversation on Statesmanship Math.
[The same has been going on for any centrally collected citizen info since WWII times].
Kudos to Apple on Leading.
Bloggers have reason to point out the weakness of the approach [though it’s in the right direction]. In the short term it’s a little more useful to the regular Joe’s profile.
It’s impossible for a user to behave exactly like the average. But user agents should be developed that approximate our EXTERNAL, network-visible behavior to that average.
A ‘proxy server’ appliance averaging our behavior, sitting between our PC and the wall plug?
Of course, such a proxy is only as good as the security of its internal activities.
Companies will spy more and more as they’re allowed to, as they realize their customers are permissive of more things than not.