Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple’s own employees have been expressing alarm. The company insists reservations about the system are rooted in “misunderstandings.” We disagree.
We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works.
There’s now so much evidence from credible, trustworthy people and organisations that Apple’s system is bad and dangerous that I find it hard to believe there are still people cheering Apple on.
There are a lot of overpromoted managers and people who just work to tick the boxes. The internet is a very low bar.
To be fair, a lot of people don’t understand encryption or hashing. A lot of people think a one-in-a-trillion error rate is a good error rate, when for hashing that is downright terrible.
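To put rough numbers on that, here is a back-of-the-envelope sketch. The daily upload volume is my own illustrative assumption, not an Apple figure:

    # Why "1 in a trillion" is weak by hashing standards.
    # The upload volume below is an illustrative assumption, not Apple's figure.

    per_image_fp_rate = 1e-12   # assumed chance one innocent image matches
    daily_uploads = 1.5e9       # assumed photos uploaded per day, worldwide
    days = 365

    expected_false_matches = per_image_fp_rate * daily_uploads * days
    print(f"Expected innocent matches per year: {expected_false_matches:.2f}")
    # ~0.55 per year under these assumptions -- and that is for *accidental*
    # collisions only, before anyone crafts adversarial images on purpose.
    # Compare a cryptographic hash such as SHA-256, where the accidental
    # collision probability for a pair of inputs is about 2**-256 (~1e-77).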
What Apple has is a perceptual hash, which is designed to quickly and cheaply match source images that are similar to each other. There are many different ways to do this, but all of them come with a significant level of error.
https://www.mdpi.com/1099-4300/21/5/449
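To make “perceptual hash” concrete, here is a minimal sketch of one of the simplest schemes, the average hash (aHash). Apple’s NeuralHash is a neural-network scheme and far more involved, but the trade-off is the same: the fingerprint is deliberately lossy, so near-duplicate images collide on purpose.

    # Minimal average-hash (aHash) sketch, one of the simplest perceptual
    # hashes. NeuralHash is far more complex, but the trade-off is the same:
    # the fingerprint is deliberately lossy so near-duplicates still match.
    from PIL import Image

    def average_hash(path, size=8):
        # Shrink to an 8x8 grayscale thumbnail; detail is thrown away on
        # purpose so resized/recompressed copies map to the same bits.
        img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > avg else 0)  # 1 bit per pixel
        return bits

    def hamming(a, b):
        # Bits that differ between two hashes; "close enough" == a match.
        return bin(a ^ b).count("1")

    # Matching means accepting anything within some Hamming distance
    # (say, <= 5 of 64 bits) -- which is exactly where the false positives
    # and the deliberate collisions come from.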
But then you have beasts like the deep hash. Yes, this is another form of AI hash. These don’t look at the picture as a whole; instead they look at the people in the picture. Say you want to comb a stack of images/videos to trace a person walking through a mall: you don’t care about similar images, you care about the subject in the picture.
It’s these forms of deep hashes that are far better at hunting down forbidden images. Even a proper deep hash can be abused in some very horrible ways. Note that a deep hash has a much lower false-positive rate because of how it works: it first works out the person/target item in the picture, then spends all of its processing on hashing those restricted pixels. You might still be taking 360×360 pixels out of an image, like Apple’s perceptual hash does, but a deep hash is focused on the subject matter instead of being randomly spread over the image.
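Structurally, the crop-then-hash idea looks something like the sketch below. Here detect_subject() is a hypothetical placeholder for a real person/object detector, not a real API; the point is the shape of the pipeline, which reuses the average hash from the earlier sketch:

    # Sketch of the crop-then-hash idea behind a "deep hash". detect_subject()
    # is a hypothetical placeholder for a real detection model; the point is
    # the structure: localize the subject, then spend the whole hash budget
    # on that region instead of the full frame.
    from PIL import Image

    def detect_subject(img):
        # Hypothetical: a real system would run a person/object detector
        # here and return a bounding box (left, upper, right, lower).
        raise NotImplementedError("plug in a real detector")

    def average_hash_region(img, size=8):
        # Same lossy fingerprint as the aHash sketch above.
        img = img.convert("L").resize((size, size), Image.LANCZOS)
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > avg else 0)
        return bits

    def subject_hash(path):
        img = Image.open(path)
        box = detect_subject(img)   # find the person/target item first
        crop = img.crop(box)        # discard the irrelevant background
        # Every bit of the fingerprint now describes the subject, not random
        # scenery -- fewer false positives, and far better at tracking people.
        return average_hash_region(crop)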
The reality here is that the people who wrote that paper did not fully understand hashing either, but they were smart enough to work out that what they had was not workable. And yes, a more exact deep hash could be abused even worse than a perceptual hash.
“I find it hard to believe there are still people cheering Apple on”
Really? Apple didn’t invent fanbois, but they may have perfected them.
Although I admit Tesla & SpaceX may be elevating them to Mars.
The concerns now are:
1. Other countries control the cryptographic keying material and certificate authorities, specifically countries with horrible human rights records. This means the encryption might as well not exist.
2. The algorithm is also effectively open source now, since it has been reverse-engineered.
3. This will be included in security products ostensibly used to monitor employees. We know the real use is monitoring citizens and dissidents.
The combination of these three means that this process and these algorithms, despite the best intentions of some, are going to be used to suppress free speech and expression. Autocratic regimes will adopt them, others will follow, and human rights violations will result.
I wish people would come out and admit what they are really using it for.