A large number of security and privacy experts, legal experts, and more, in an open letter to Apple:
On August 5th, 2021, Apple Inc. announced new technological measures meant to apply across virtually all of its devices under the umbrella of “Expanded Protections for Children”. While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.
The open letter contains tons of arguments, scenarios, and examples from experts about just how bad this technology is, and just how easily it can be abused.
I welcome the input of experts in this discussion. One thing missing is the voice of abuse experts and lawyers familiar with the law and the reasoning that goes into creating it. There are obviously the technical and security issues, which you have covered here and which do need discussion, but there are other issues too. I would like experts in the fields of law enforcement investigation and child protection to be heard. They feel this is much more complex than Apple’s marketing bullet points suggest and is governed by laws which exist for good reason. I feel Apple are overselling what they are doing, and in practice it won’t produce much of a result in the real world.

Combating child abuse is a matter of law, provision of services, a lot of training and education, and making sure victims are acknowledged and heard and feel welcome and safe coming forward. Abuse images are another problem, and their proliferation online is a huge one. I don’t think there is a quick fix for this.
See this thread:
https://mobile.twitter.com/alexstamos/status/1424054542879006728
I think the author kind of glosses over the point that Apple has always had this back door into people’s devices and data; they are only now starting to (publicly) exploit that access, which will undoubtedly grow in scope over time. We know that Apple has worked with law enforcement in the past, presumably in the context of legal warrants, which I deem less intrusive. But this is more like warrantless wiretapping, where people’s data is scanned and used for dragnet operations without any judicial warrant whatsoever.
I’ve long advocated for owners being in control of their own data and devices, but over the years I’ve become acutely aware that I’m on the losing side of this battle: corporations are becoming the de facto gatekeepers to our own devices and data. While many privacy/control advocates can see where things are going, we just don’t have the resources to stop it, and our influence on the industry is marginal. The corporations are adept at making resistance futile for the masses. This is becoming the new norm.
Won’t this just ensure that anyone with bad intent will use something other than an iPhone to take their porn pictures? (Wouldn’t that be kind of stupid in any case?) Or not put them in iCloud? (Since this apparently won’t scan on-device photos that aren’t in iCloud, at least not yet.) Or use some camera app that keeps its data somewhere other than iCloud, and “non-Apple encrypted”? Meaning it takes the image data and runs it through the encryption algorithm itself, rather than leaving it to NSImage:CompressAndEncryptImageAndPutInMobileMe(withKey:) or some such that might have an Apple backdoor. Unless Apple wants to build an image porn-scanner right into the T2 chip. (Clipper chip, anyone?)
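The idea in that last comment, encrypting the image data yourself before anything touches iCloud, can be sketched in a few lines. This is a toy illustration, not Apple’s actual pipeline: the `xor_encrypt` helper is a hypothetical stand-in for real encryption, and a cryptographic hash stands in for a perceptual hash like NeuralHash (which matches decoded image content rather than raw bytes). The point it demonstrates is simply that a provider can only match hashes of data it can read:

```python
# Toy sketch (illustrative only, not Apple's actual CSAM-matching system).
# Server-side matching compares a hash of the uploaded bytes against a
# database of known flagged hashes; if the client encrypts before upload
# with a key the provider never sees, the ciphertext hashes to something
# unrelated and the match fails.
import hashlib
import secrets

def digest(data: bytes) -> str:
    """Hash the bytes the provider would see on upload."""
    return hashlib.sha256(data).hexdigest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for real client-side encryption: XOR with a one-time key."""
    return bytes(b ^ k for b, k in zip(data, key))

photo = b"\x89PNG...pretend image bytes..."
known_hash_db = {digest(photo)}  # provider's database of flagged hashes

# Uploaded in the clear: the match succeeds.
assert digest(photo) in known_hash_db

# Encrypted client-side before upload: no match.
key = secrets.token_bytes(len(photo))
ciphertext = xor_encrypt(photo, key)
assert digest(ciphertext) not in known_hash_db
```

Which is exactly why on-device scanning before encryption is the part people object to: it moves the matching to the one place the workaround above can’t reach.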