San Francisco’s city attorney David Chiu is suing to shut down 16 of the most popular websites and apps allowing users to “nudify” or “undress” photos of mostly women and girls who have been increasingly harassed and exploited by bad actors online.
These sites, Chiu’s suit claimed, are “intentionally” designed to “create fake, nude images of women and girls without their consent,” boasting that any users can upload any photo to “see anyone naked” by using tech that realistically swaps the faces of real victims onto AI-generated explicit images.
↫ Ashley Belanger at Ars Technica
This is an incredibly uncomfortable topic to talk about, but with the advent of ML and AI making it so easy to do this, it’s only going to get more popular. The ease with which you can generate a fake nude image of someone is completely and utterly out of whack with the permanent damage it can do to the person involved – infinitely so when it involves minors, of course – and with these technologies getting better by the day, it’s only going to get worse. So, how do you deal with this?
I have no idea. I don’t think anyone has any idea. I’m pretty sure all of us would like to just have a magic ban button to remove this filth from the web, but we know such buttons don’t exist, and trying to blast this nonsense out of existence is a game of digital whack-a-mole where there are millions of moles and only one tiny hammer that explodes after one use. It’s just not going to work. The best we can hope for is to get a few of the people responsible behind bars to send a message and create some deterrent effect, but how much that would help is debatable, at best.
As a side note, I don’t want to hang this up on AI and ML alone. People – men – were doing this to other people – women – even before the current crop of AI and ML tools, using Photoshop and similar tools, but of course it takes a lot more work to do it manually. I don’t think we should focus too much on the role ML and AI plays, and focus more on finding real solutions – no matter how hard, or impossible, that’s going to be.
Sadly, the reality of the Web is that its engine is driven by this industry, which has been massively influential in technology adoption. But because of the nature of the industry, we don’t often talk about that.
From helping VHS win the battle with Betamax to effectively defining Web video compression and codecs, the porn industry has been at the forefront.
Now with AI/ML, porn is again at the forefront. What is different this time around is that it’s incredibly easy to not only create, but also share without consent. Look at the Taylor Swift examples recently. And the victims have little to no recourse – that is, if they are even aware they are being exploited.
While a real nude image is no doubt highly embarrassing and damaging, it being so easy to create a fake one is also going to erode the stigma around it.
Whether an image is real or not, you can now easily claim that it’s fake, and people will pay far less attention to it. You will get such a flood of fakes that any real images get lost in the noise.
bert64,
I don’t want to defend the bullies who go around harassing others, which is very wrong. I know they exist and I sympathize with victims of their abuse. I do think your point has merit though. That we have puritans vilifying nudity to the point where we’re meant to feel shame and embarrassment over our body’s natural form is regressive. The shame and stigma are harmful in and of themselves; they lead to a kind of dysmorphic mentality around our own bodies.
Well, in an ironic way, creating a flood of this material would lessen its impact on individuals.
Tell me you have no idea how sexual abuse works without telling me you have no idea how sexual abuse works.
rainbowsocks,
I don’t condone abuse or causing suffering, especially physical abuse. I agree non-physical abuse can be bullying too. However, I don’t think that speaks fairly to the point actually being made: do you disagree that things can be culturally loaded with shame and stigma where there would not otherwise be shame and stigma?
Calling it fake has already been used in the past. Not for nude images, but other embarrassing leaked stuff.
Unfortunately it would also make holding politicians accountable much more difficult. Even if there is a graft operation with actual hard evidence of bribery and recordings of crime, an autocratic leader can easily dismiss them as being fake.
(They were already doing similar things in various countries, but this just makes it even easier).
And those who want to believe it is fake would not even need to question it anymore.
“How do you deal with this?”… I don’t have an answer. Many people will call for the technology to be shut down, but I don’t predict that will be successful long term. Those who want the tech will find a way. Even the courts will struggle to stop it. I think we’re going to have to contend with a technology that’s not going away. As for putting a stop to the bullying, yeah, I can agree in principle, but how do you actually do it? Bullies have been around forever and I’m not sure how to really fix it…
Alfman,
I do not usually condone violence. But against violent bullies, in my experience, that works. Then again, that was only in middle/high school, where the stakes of a fight are very low, mostly just embarrassment.
In other contexts, like work, or when the bullies are not violent themselves but use techniques like this, I am sure it would not work, or would even be counter-productive. And doing the same to them, i.e. spreading their “fakes”, does not sound like a good way either.
And yes, this is a very sensitive topic.
sukru,
Indeed. Life is so complicated. It starts out with small spats, but the long term repercussions of unchecked animosity end up amplifying the vitriol and hatred later in life. I wish we could work towards strong communities where everyone could share a stake in improving the community. But how is this possible when you’ve got those capitalizing on the hatred and spraying fuel on the fire? Those who purposefully divide us are crushing the hope for peace and unity.
>I’m pretty sure all of us would like to just have a magic ban button to remove this filth from the web
No. Why this impulse to always ban things?
The only permanent solution is to cut off the source material – get girls to stop putting pictures of themselves and their friends on the Internet – which in a tech-saturated country would require Afghanistan-level control of their behavior. Given that reality, going after people who use the tools maliciously, like San Francisco did here, is the next best thing.
I disagree. This is not a tech problem nor a content problem. This is a societal problem.
At the root, if you care about what others think of you, then you’re giving others power over you.