Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system’s first layer of protection is an undisclosed threshold before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier and confirm the absence of known CSAM imagery. Apple said it would ultimately not report the flagged user to NCMEC or law enforcement agencies and that the system would still be working exactly as designed.
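To make that flow concrete, here is a minimal sketch of the gate being described, with the threshold value, function names, and outcomes all invented for illustration; Apple's actual threshold and internal process are not public.

```python
# Rough sketch of the threshold-then-review gate described above.
# The threshold value and names are assumptions for illustration;
# Apple's real threshold is undisclosed.

MATCH_THRESHOLD = 30  # placeholder; the real value is not public


def handle_flagged_account(match_count: int, review_confirms_csam: bool) -> str:
    """Decide the outcome for an account from its hash-match count and the
    result of the manual review step."""
    if match_count < MATCH_THRESHOLD:
        return "no action"        # first layer: undisclosed match threshold
    if not review_confirms_csam:
        return "no report"        # second layer: review finds no known CSAM
    return "report to NCMEC"      # only then would a report be filed


# Example: enough matches, but the manual review finds nothing -> no report.
print(handle_flagged_account(match_count=35, review_confirms_csam=False))
```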
After yesterday’s news and today’s responses from experts, here’s a recap: Apple is going to scan all photos on every iPhone to see if any of them match against a dataset of photos – that Apple itself hasn’t verified – given to them by the authorities of countries in which this is rolled out, with final checks being done by (third party) reviewers who are most likely traumatized, overworked, underpaid, and easily infiltrated.
What could possibly go wrong?
Today, Apple sent out an internal memo to Apple employees about this new scanning system. In it, they added a statement by Marita Rodriguez, executive director of strategic partnerships at the National Center for Missing and Exploited Children. Here's one of the choice quotes:
I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.
Apple signed off on that quote. They think those of us worried about invasive technologies like this and the power backdoors like this would give to totalitarian regimes all over the world are the “screeching voices of the minority”.
No wonder this company enjoys working with the most brutal regimes in the world.
In the UK there is law governing this area, and thresholds are in place. They are not a state secret. The official view of the police is that they do not encourage vigilante action, and even the Internet Watch Foundation is only allowed to operate on the basis of an understanding. The training and support requirements for staff monitoring reports are a non-trivial exercise. Before we even begin, Apple is constrained by child abuse law, the GDPR, and other rights regarding information they hold on individuals.
As for the topic, one of the first thoughts that crossed my mind was: what if an innocent human rights activist was in a photo standing next to the wrong person? Would that person then be targeted by state oppression? Would they wind up dead?
Putting the law and the real world aside, what if Apple's system just made us more complacent?
That is simply arrogant, patronising, and insulting.
In the UK, though, it's not going to be vigilantism as soon as it can be rubber-stamped (with a few extras, for safety of course), and it may even already be official by default:
https://www.theregister.com/2020/07/23/investigatory_powers_act_commencements/
In other words, as section 12(5)(a)(i) puts it, any surveillance power granted to the public sector under any law other than the Snoopers’ Charter itself or the Regulation of Investigatory Powers Act 2000 (RIPA) now can’t be exercised unless your telco or Royal Mail (or parcel courier, for that matter) can be persuaded to hand over access to your communications.
That's not referring to this, but it perhaps only underlines how non-vigilante this would become (one way or the other, if not the literal case above), and there's a demonstrably ignoble history of this behaviour.
Apple’s greatest ad was the FBI’s obvious frustration with them.
I wonder how long it took for Hungary to supply LGBTQ image hashes and an insistence that’s covered under the same technology.
That internal memo is almost straight from the stereotypical movie supervillain script: someone who has become convinced that their cause is so righteous that there is no price that wouldn't be worth paying to achieve it.
If the outrage won't blow over, Apple will just dial it back and scan _some_ of your pictures _part-time_. Politics 101.
Sadly, the voices are often seen as screeching because they aren't backed up by any action. This will become accepted as normal and people will move on, just as when Apple granted China access and implemented their filters and firewalls.
Not actually true, but it is what Russia claimed, after they were found to be involved in the downing of MH17, to discredit the organisation.
Apple tries to not host photos of children being raped on its servers – bastards!!!
Problem is, it isn’t Apple’s job to police people. What are they going to do next? Report sexting? Recipe exchange? Where does it end?
Whose job is it, do you suppose? You would likely say it is the government's job, right? And so this whole thing is to allow the government to identify those who take and distribute inappropriate images.
The issue is that ‘inappropriate’ is defined by the government, some of which might be oppressive and so could try to use this to identify people using their phones to show government wrongdoing – such as police brutality.
Here's the thing. One thing is for sure: is it Apple's job? Absolutely not. Is it a government's job (or, hopefully, that of a government body voted in by the people)? Then yes. Or, you know, if you see some pedo scoping out kids, give them a swift kick to the nads and phone the police. But corporations have ZERO reason to be scanning our photos. If consenting adults want to share stuff with each other, there should be no issues with it.
First Apple uses it for “protecting the kids” then they are used by oppressive governments to stamp out any dissidents. While in theory the USA isn’t there yet, others for sure are.
This is a slippery slope (as much as I dislike that term).
Apple intends to do two things.
One: if they think someone is sending a child's Apple account sexual images, they will, as long as the child's account is linked to a family account, notify the adults in the family that there may be a problem. Neither Apple itself nor the government will be notified or take action; this is just intended as a sort of fail-safe mechanism for parents. Of course, parents can check what has been flagged and decide that it's harmless, or that there is a problem, and take action.
Two: if you upload images to iCloud for storage, they will be checked against a database of known Child Sexual Abuse Material (known as CSAM) maintained by the National Center for Missing and Exploited Children. If you turn off iCloud storage, then nothing is scanned.
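For what it's worth, here is a minimal sketch of how that second check could work, assuming placeholder hash values and a placeholder database; it only illustrates the behaviour just described, not Apple's actual NeuralHash pipeline.

```python
# Sketch of the on-upload check described above. The hash strings and the
# "known CSAM" set are made-up placeholders, not real values or Apple's code.

KNOWN_CSAM_HASHES = {"a1b2c3", "d4e5f6"}   # hypothetical hash database


def scan_before_upload(image_hash: str, icloud_photos_enabled: bool) -> bool:
    """Return True if the image's hash matches the known database.
    Per the description above, nothing is checked when iCloud Photos is off."""
    if not icloud_photos_enabled:
        return False
    return image_hash in KNOWN_CSAM_HASHES


# Same photo: the check only ever runs when iCloud Photos is turned on.
print(scan_before_upload("a1b2c3", icloud_photos_enabled=False))  # False
print(scan_before_upload("a1b2c3", icloud_photos_enabled=True))   # True
```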
So Apple intends to do two things. Help parents spot possible problems with their children involving predators and groomers, and make sure that Apple itself is not hosting child abuse images.
And people think this is bad?
Is there anyone here who is a parent of a small child that thinks this is bad and if so why?
There is a joke here about not all Apple users being paedos…
Presumed guilt is not good for a western democracy.
The thing to remember is that this is a fairly new type of hash. We don't know what the hash collision rate / false positive rate will be. Heck, we don't even know how to exactly calculate the false positive rate of neural-hash-type solutions.
https://github.com/nikcheerla/neuralhash
This type of hash was developed for doing rapid DMCA strikes, and there have been a lot of false positives using it. Yes, there is a high risk that a parent who takes photos of their small child for valid reasons gets hit by a false positive. The reason for this high risk is that we cannot get a rate for how reliable any neural hash will be until after it's in use.
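To illustrate what measuring that rate even involves, here is a toy experiment using a simple difference hash over random grayscale grids. It is nothing like NeuralHash, and random noise is a poor stand-in for real photo libraries, which is exactly why a lab-measured rate tells you so little.

```python
# Toy experiment: estimate how often two unrelated "images" land within a
# chosen Hamming distance of each other under a simple difference hash.
# This is NOT NeuralHash; it only illustrates that the false positive rate
# has to be measured empirically and depends heavily on the input data.
import random


def dhash(pixels):
    """64-bit difference hash of an 8x9 grayscale grid (list of lists)."""
    return [1 if left < right else 0
            for row in pixels
            for left, right in zip(row, row[1:])]


def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))


random.seed(0)
trials, matches = 10_000, 0
for _ in range(trials):
    img_a = [[random.randint(0, 255) for _ in range(9)] for _ in range(8)]
    img_b = [[random.randint(0, 255) for _ in range(9)] for _ in range(8)]
    if hamming(dhash(img_a), dhash(img_b)) <= 10:   # arbitrary "match" radius
        matches += 1

# Random noise almost never collides; real photo libraries (skies, skin tones,
# similar framing) behave very differently, which is the whole problem.
print(f"pairs within distance 10: {matches} / {trials}")
```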
https://www.ijrte.org/wp-content/uploads/papers/v4i2/B1409054215.pdf
Something else: techniques like NeuralHash have been used in anti-virus software, in what is called a HASH-ANN. Yes, it has been found with anti-virus that you can get HASH-ANN values with a low error rate; you can also get HASH-ANNs with a very high error rate. The worst screw-up with HASH-ANN generation in anti-virus ended up declaring all PE files to be viruses.
Artificial-neural-network-based hashes are not as stable as one would like. Please note that there is also the false negative rate to consider.
Strossen, there is another problem. The hash solution cannot live entirely on a person's own device. Why? Because, just like with anti-virus, the people shipping the bad images could keep altering an image until they work out which key bits of the image to change to render the hash useless.
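As a toy illustration of that evasion point (using the same kind of simple difference hash as in the sketch above, not any real system), an attacker only has to keep nudging pixels until the hash drifts outside the matching radius.

```python
# Toy illustration of hash evasion: brighten pixels a little at a time until
# the perceptual hash no longer matches the database entry. The hash here is
# a stand-in difference hash, not Apple's NeuralHash or any real system.

def dhash(pixels):
    return [1 if left < right else 0
            for row in pixels
            for left, right in zip(row, row[1:])]


def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))


def perturb_until_evaded(pixels, target_hash, match_radius):
    """Apply modest brightness tweaks pixel by pixel until the image's hash
    falls outside the matching radius of the target hash."""
    for row in pixels:
        for i in range(len(row)):
            row[i] = min(255, row[i] + 40)          # small visual change
            if hamming(dhash(pixels), target_hash) > match_radius:
                return pixels                       # no longer matches
    return pixels
```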
The issues with this hash solution are known from anti-virus software: false positives and false negatives, including how to generate both. Please note that a false positive or false negative with anti-virus will not result in the police kicking down your door.
Also, Strossen, there is a horrible fact about parents: the biggest source of paedophile pictures is photos taken by the children's parents themselves. So it's unlikely that the police will want to keep the parents in the loop.
Finally this is very extendable.
There is also another point: if iCloud were properly end-to-end encrypted, the only place that could scan the uploaded contents would be the device itself. So the idea that turning off iCloud storage will 100 percent guarantee no scanning is also unlikely to last.
https://rentafounder.com/the-problem-with-perceptual-hashes/
Strossen, the problem is the way the images in iCloud storage are checked against the Child Sexual Abuse Material (CSAM) database.
So yes, it's possible for a person to get enough matches against the CSAM list just from non-pornographic pictures.
I can see the day someone goes around taking sunsets with their iPhone and gets taken in for having photos from the CSAM list that they never had, purely due to hash failure. The type of hash Apple is talking about using is not that reliable or predictable.
A parent taking photos of their child is even more likely to have photos that will incorrectly match something in the CSAM database via a perceptual/neural-net-based hash.
It is the voice of a minority, though. Most people don't seem to care about privacy and have an "I don't have anything to hide" attitude. You can justify just about anything in the pursuit of abuse images of minors. That said, they should care, because by the time it causes trouble for them it will be too late to stop.
Every single Apple commercial I watched in the past couple of years was all about minorities and how Apple supports minorities. Nice to see, from time to time, what is really going on behind that facade. As for the privacy and surveillance… this won't affect Apple's sales in any meaningful way.