The people running the majority of internet services have used a combination of monopolies and a cartel-like commitment to growth-at-all-costs thinking to make war with the user, turning the customer into something between a lab rat and an unpaid intern, with the goal to juice as much value from the interaction as possible. To be clear, tech has always had an avaricious streak, and it would be naive to suggest otherwise, but this moment feels different. I’m stunned by the extremes tech companies are going to extract value from customers, but also by the insidious way they’ve gradually degraded their products.
↫ Ed Zitron
This is the reality we’re all living in, and it’s obvious from any casual computer use, or talking to anyone who uses computers, just how absolutely dreadful using the mainstream platforms and services has become. Google Search has become useless, DuckDuckGo is being overrun with “AI”-generated slop, Windows is the operating system equivalent of this, Apple doesn’t even know how to make a settings application anymore, iOS is yelling at you about all the Apple subscriptions you don’t have yet, Android is adding “AI” to its damn file manager, and the web is unusable without aggressive ad blocking. And all of this is not only eating up our computers’ resources, it’s also actively accelerating the destruction of our planet, just so lazy people can generate terrible images where people have six fingers.
I’m becoming more and more extreme in my complete and utter dismissal of the major tech companies, and I’m putting more and more effort into taking back control over the digital aspects of my life wherever possible. Not using Windows or macOS has improved the user experience of my PCs and laptops by incredible amounts, and moving from Google’s Android to GrapheneOS has made my smartphone feel more like it’s actually mine than ever before. Using technology products and services made by people who actually care and have morals and values that don’t revolve around unending greed is having a hugely positive impact on my life, and I’m at the point now where I’d rather not have a smartphone or computer than be forced to use trashware like Windows, macOS, or iOS.
The backlash against shitty technology companies and their abusive practices is definitely growing, and while it hasn’t exploded into the mainstream just yet, I think we’re only a few more shitty iOS updates and useless Android “AI” features away from a more general uprising against the major technology platforms. There’s a reason laws like the DMA are so overwhelmingly popular, and I feel like this is only the beginning.
That’s an oversimplification. For example, for whatever reason, when I experimented with a locally hosted copy of Stable Diffusion, it took about 125W of my RTX 3060’s 170W maximum as reported by nvidia-smi, and, though I haven’t confirmed it, the fan certainly sounds like the card generates more heat when I play games on it instead.
Now if your argument is against the ridiculous waste of using chatbots to search the web, or the doomscrolling-esque addictive nature of prompting Stable Diffusion and seeing what comes out, those are harder to dispute.
As a clarification, based on what I’ve read, I believe Stable Diffusion’s power consumption maxes out at 125W on an RTX 3060 because it’s bottlenecked on memory bandwidth.
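To put that 125W figure in perspective, here is a rough back-of-envelope sketch of the energy cost of a single locally generated image. The ~5 second generation time is an assumed illustrative figure, not a measurement from this thread:

```python
# Rough energy cost of one locally generated image, using the 125 W
# draw reported above for an RTX 3060. The ~5 s per image is an
# assumed placeholder for illustration, not a measured value.

power_watts = 125        # reported GPU draw while sampling
seconds_per_image = 5    # assumed generation time (hypothetical)

joules = power_watts * seconds_per_image  # energy in joules (W * s)
watt_hours = joules / 3600                # convert J to Wh

print(f"{joules} J per image, roughly {watt_hours:.3f} Wh")
```

On these assumptions a single image costs well under a watt-hour, which is why the per-task numbers only become meaningful once you scale them up or compare them against an alternative way of doing the same work.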
ssokolow (Hey, OSNews, U2F/WebAuthn is broken on Firefox!),
One of my gripes with the “AI wastes power” argument is that too often it ignores efficiency and productivity multiplication. For example, I used to have to painstakingly mask product images, create background fill, etc. by hand. I remember one job where I did this for weeks for a couple thousand products in a catalogue. Today content-aware AI tools can do the same job in seconds. Criticism of the AI’s high instantaneous power consumption, even if accurate, needs to factor in the increased productivity. The same task done manually without AI can take orders of magnitude longer (and use more overall power).
> (and use more overall power)
Show your math please.
Current AI infrastructure consumes in the range of a small country.
Serafean,
“AI infrastructure” is meaninglessly vague. The best I can do is test my own little setup, which is probably not meaningful to you. I would like to see experiments that measure the productivity and power consumption more rigorously. Still, it’s logically clear that we cannot just point at wattage and declare something inefficient without factoring in the time and productivity.
If something takes a minute to do using AI, but an hour or more to do manually, then even though the manual approach’s instantaneous wattage may be much lower than the AI’s, the overall energy used can still favor the AI.
Obviously whether it does or not depends on the specifics, but hopefully I can convince you that productivity has to be factored in for a given task before it would make sense to declare that AI is inefficient at it.
We need to factor in the training costs too, but the marginal energy costs approach zero at scale so even that can be justified in terms of being more energy efficient with AI than without. So can we agree this depends on productivity? This is what I am saying. I don’t mean it as a blanket statement that all AI is necessarily more efficient, only that it “can” be.
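The time-versus-wattage argument above can be sketched numerically. Every figure below is a hypothetical placeholder chosen purely to illustrate the arithmetic, not a measurement:

```python
# Compare total energy for the same task done two ways.
# All numbers here are hypothetical placeholders for illustration.

ai_power_w, ai_seconds = 300, 60            # GPU flat out for one minute
manual_power_w, manual_seconds = 100, 3600  # desktop running for an hour

# Energy (Wh) = power (W) * time (s) / 3600
ai_energy_wh = ai_power_w * ai_seconds / 3600
manual_energy_wh = manual_power_w * manual_seconds / 3600

# Despite the AI's much higher instantaneous draw,
# the much shorter runtime dominates the total.
print(f"AI: {ai_energy_wh} Wh, manual: {manual_energy_wh} Wh")
```

With these made-up numbers the AI run uses a small fraction of the manual run’s energy; with different numbers the comparison could flip, which is exactly why the specifics of the task matter.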
Alfman,
So now you’ve done the task in seconds thanks to AI, you turned off your devices, and spent the next few weeks out in the sunshine…right?
I see the argument you’re making, and while it’s logical, it assumes the total amount of output generated is fixed and the issue is the most efficient way to generate that output. Unfortunately, human societies tend to take any efficiency gain and respond by performing more tasks, so the CO2 cost is the AI doing the task for a few seconds, plus the cost of powering your devices for the next few weeks while you do something else.
Put another way: if we were on a path to reduce aggregate power consumption due to AI, we’d see power plants being mothballed, not brought back into service.
The existential question we’re facing is whether the tasks we’re doing are worth an extinction event. Objectively they’re not, but individually we’re all pushed to do them anyway.
malxau,
Maybe you could make the case that if AI is too efficient then it leads to over-consumption, but that’s a different issue. For example, should we oppose energy-efficient lights, “Energy Star” electronics, and vehicles and jets with higher MPG because they’re just going to encourage more consumption anyway?
I’m not completely disagreeing with your point that consumption may increase, but refusing to justify efficiency on that basis is an absurd policy. Overconsumption is a real issue, but we need to deal with it while still promoting efficiency over inefficiency.
It’s worth looking at, but honestly I think overconsumption is the greater threat, not efficiency.
It’s telling that Google and Microsoft intend to use nuclear power specifically for their AI centres.
Athlander,
What is telling about it? Isn’t it better than power from coal plants? Nuclear has long been considered a good way to complement intermittent power from renewables like solar and wind without carbon emissions.
I’m not privy to the breakdown for Microsoft’s data centers, unfortunately. Many media articles seem to link back to the IEA…
I’ve only seen these numbers reported as totals, inclusive of AI, cryptocurrencies, and other data center applications. I think media outlets have chosen to focus on “AI” in their headlines, but it may be misleading to portray AI as being exclusively responsible as some people are assuming.
I haven’t been able to find the information I want from credible original sources. Ideally MS/Amazon/Google/Apple/etc would report their own energy breakdowns, but I think they’re keeping such information confidential.
The great power usage is only for training.
Once the model is created the power usage is low.
I am running pocketpal on my phone with local models and the power usage is low.
>”I remember one job where I did this for weeks for a couple thousand products in a catalogue. Today content aware AI tools can do the same job in seconds.”
What you’ve just described is not AI – that’s simple pattern matching algorithms at work.
andyprough,
AI has made content-aware fill and selection tools more sophisticated than any of the classic algorithms. Entire objects can be selected with a single click or tap, almost eliminating the need to think about how to do it. The productivity of creative AI tools keeps getting better, and while I concede that I don’t know if this is good for those whose job it was to do the work, as AI developments keep making workers more productive, employers are likely to keep leaning into it and are unlikely to return to a pre-AI economy.
Alfman,
Agreed. It is very easy to understate how helpful these tools are, and how much time they save overall.
Yes, there is the other concern you mentioned, namely that we end up using more of it. This happens everywhere.
Take the common example…
Gas prices go down, people take vacations. They go up, people start optimizing. In any case, the price will go back to normal as dictated by the supply-demand curve.
The same will happen with AI. As it becomes “cheaper”, we will use more of it, and more devices will use it. However, it will reach an equilibrium.
What people miss is the massive efficiency gains we have made in the models. The original Stable Diffusion would take 3-5 seconds to run on a desktop GPU, and much longer on older or weaker devices. Now we can generate images on mobile phones almost instantaneously.
This is true for LLMs as well. ChatGPT API used to be
$0.03 for 1K input tokens, and $0.06 for 1K output tokens
Now it is:
$2.50 / 1M input tokens, and $10.00 / 1M output tokens
A 12× reduction for input tokens (and 6× for output)! And you can get 50% off if you use the batch API.
Basically, it was extremely expensive (think spending about a dollar to check one large source file; now you can do entire codebases for a few dollars).
This is from a combination of better model architectures, better software, better silicon and of course new algorithms.
And it will only get better.
(Who wants to pay basically $10 per minute?)
@Alfman >”AI has made content aware fill and selection tools more sophisticated than any of the classic algorithms.”
That doesn’t appear to be the case. All that appears to be happening is that monumental amounts of computing power, CPU cores, graphics cores, and electricity are being thrown at completely mundane problems, using well-known algorithm types. No one was stupid enough to throw that much resource at easily solved problems in the past.
andyprough,
You can deny it, and I do respect your own choices for yourself, but at the end of the day it’s not going to be your choice that’s significant; rather, it’s going to be the employers’ choices. They are going to make the choices that either implicitly or explicitly improve their bottom line (the AI might not be itemized). People who refuse to evolve for ideological reasons are not going to remain competitive, because that’s the way capitalism works.
Alfman, I’m not sure why you fixated on the merits of nuclear power. I’ve got nothing against nuclear power and I think it’s preferable to fossil fuel and current renewable technology.
Anyway, what’s “telling” is that if there wasn’t a need for more energy then these companies wouldn’t be making such agreements. Indeed, Microsoft acknowledges that their AI systems are resource intensive.
It’s possible for someone to believe, (as I do), that AI has some very exciting uses and great potential, whilst also being concerned (as I am) about the environmental impact, ethical issues, accountability issues, potential misuse and so on.
I get that you are in the “AI at any cost” camp, but if you could apply the same open-mindedness of your nuanced approach to ‘Wayland vs X11’ to AI, you’d see that there are valid concerns about AI as well as valid positives.
Athlander,
I’m not really; I’m not the one who brought it up. However, when others imply these companies going nuclear is a bad thing, I don’t really agree with that. If we can recommission nuclear reactors, it can help lessen the dependency on fossil fuels/coal that would otherwise be used.
We know data centers use a lot of power, but where is the information that specifically breaks down what the power in these data centers is going to? Please link it, because I am genuinely curious. The IEA source that many media outlets have been using does not appear to itemize the applications inside of the data center, and the IEA may not even have this information themselves.
Same here, but there seem to be many people who just want to assume that AI does tasks less efficiently, whereas I think such claims need to be specifically tested before making such assertions. It is plausible that AI can be more efficient than doing without, at least for some tasks.
I don’t have enough public information to make a determination about the macroeconomic impacts on climate change. I can estimate AI’s energy requirements by scaling up small-scale consumer electronics, but this is very hand-wavy. However, I don’t believe that most of those criticizing AI are doing so based on any more information than I have; instead, I suspect they’re making assertions that fit the AI narrative they want.
I don’t actually mean to give off that impression. I do have concerns over AI, however my problem with the prevailing anti-AI narrative is that so far I haven’t seen anyone with strong empirical evidence instead of blanket assumptions.
“Google Search has become useless, DuckDuckGo is being overrun with “AI”-generated slop, Windows is the operating system equivalent of this, Apple doesn’t even know how to make a settings application anymore, iOS is yelling at you about all the Apple subscriptions you don’t have yet, Android is adding “AI” to its damn file manager”
One of these is not like the others
Eh, no use discounting the common causes, that it’s all related.
While I generally agree with the points being made here, I tend to dislike all the complaining about tech companies treating us like “something between a lab rat and an unpaid intern” when we use services that they provide for free. That is, I think it is a fairly explicit and obvious contract that WE are responsible for the decision to BE the product instead of paying for one.
If you do not like Google Search and DuckDuckGo, there are excellent paid alternatives. I highly recommend kagi.com
If we are not willing to pay for services that behave the way we want, we should acknowledge the truth that we are paying in other ways and be more accountable for that.
Now, when we DO pay for a product then I completely agree. I find both Windows and macOS incredibly hostile to their own customers in various ways. It is why I have been primarily a Linux user for so many years.
I would add to this that I think the reason Google and other search engines seem worse is because the web has grown, and there is more crap on the web, so you are just getting a lot more “false positives”.
Google seemed better when there were fewer content mills creating all sorts of junk on the internet, and therefore there was less rubbish to sift through.
You also have more to search through, with more or less the same amount of screen real estate for returning results. The web may have gotten 10 times bigger in the last 20 years, but users are not willing to sift through 10 times as many search results.
Also, people don’t want to pay for services on the internet – it was very funny (to me) that the EU was not impressed when Facebook offered EU customers a paid version in return for a more private, ad-free service.
On the AI point, I hardly think anyone, including the companies that are involved, consider current AI energy consumption to be ideal. However, no one is more inclined to fix that issue than the same companies that have to pay a lot of money for the energy needed for those operations. The costs will come down and AI will become much less energy intensive. And on the six-finger point, well yes, AI is currently not perfect, but nothing ever is in its first incarnation.
I don’t find it surprising. They see negotiating over privacy in that way as a “pay the mafia protection money”-esque dynamic and are working to walk it back by saying that it’s unacceptable to haggle over a price for privacy.
From what I remember, that’s mostly because it does its latent-space calculations at something like 1/4 the resolution of the final image to limit VRAM requirements, which hurts accuracy for any fine details… we just notice it most when it’s getting something about the human body wrong.
>”when we use services that they provide for free”
These are multi-trillion dollar companies. Nothing is “free” where they are concerned. If you think you are getting something from them for free, you are sadly mistaken.
That is exactly my point. If you know that you are not getting it for free though, why all the moaning about the price?
I think corporations are a problem, as is the endless march of enshittification. However, I also see extreme entitlement by non-contributors as a problem too. If you don’t want to pay money, you are agreeing to “pay” in other ways. If you do not like the “free” option, then pay for it or create an alternative (Open Source, ideally). If you paid and the product sucks, now you have a legitimate point and I am all ears.
Agreed.
Facebook tried this, but the EU quickly pushed back. They offered users an “opt out”: pay directly for services.
The bureaucrats went crazy.
https://www.malwarebytes.com/blog/news/2023/11/meta-sued-over-forcing-users-to-pay-to-stop-tracking#:~:text=The%20choice%20for%20European%20users,up%20to%20%24275%20per%20year.
Their aim was to get Facebook to provide valuable services for free, not to ask $275 per year.
The issue is, most people do not realize how expensive these services are, and how well the companies make use of the data.
Part of the problem is we used to like tech, up to a point we were euphoric about it. GNU/Linux, new Windows, new iPhone, new dumb phone, free Gmail, new CPU, motherboard, LCD, SSD … Those days are long gone. Companies behind it are still using those products to force shit on us, shit nobody really asked for, so we all are in a bit of a pickle.
Completely agree, well said. Or maybe we’re just getting old and grumpy. Things were so much better in my day!
Most people don’t care. For example, they will happily use iOS despite the fact that it lacks basic things such as sideloading and that Apple sets capricious rules for its App Store. Again, people don’t care.
Computers don’t belong to us computer nerds anymore. If Apple had locked app installation back in the Mac OS Classic or Mac OS X days, there would have been severe backlash and market failure, because back then computers were still primarily the domain of computer nerds. Those days are long gone; a product like iOS can be a massive market success even if nerds who care about sideloading won’t touch it with a 100-foot bargepole.
Even if the move to shove AI text generators into every product fails, it will be because the mainstream rejects it, not us nerds, so it will be an isolated win that won’t transfer to other issues the mainstream doesn’t care about.
“Most people don’t care”. On that we agree.
“Computers don’t belong to us computer nerds anymore”. On this point, I want to disagree. In my view, there has never been a better time to be a computer nerd. High-quality Open Source software for everything, access to incredible computer languages and toolchains for free, mind-blowingly inexpensive and powerful hobby hardware (e.g. SoCs and SoMs), freely available AI models trained on billions of dollars of infrastructure that appear to almost think, and a global communications network for sharing information and expertise? Tell me that a geek in the ’80s would spend a week in our time and honestly think they had it better. I don’t buy it.
However, I think what you are really saying is that computers are not ONLY for nerds anymore and that regular people outnumber us. On that point, you are of course completely correct. However, the size of the market that is for the nerds is still massively larger than the entire market was in the past ( see above ). Sure, we cannot expect platforms that are marketed to the mainstream ( iOS, macOS, Windows ) to cater to nerds. However, we are not stuck having to modify the mainstream anymore. Instead, we have entire ecosystems that are practically nerd only.
Cory Doctorow’s enshittification theory is not only a succinct description of a pervasive phenomenon, it also has practical recommendations for what to do about it. Any tech monopoly (or cartel) will behave this way, so they need to be broken up, or regulated until they are unable to do so.
Doctorow talks about how we went through a long period in which Milton Friedman economics delusionally said monopolies were good, we stopped using antitrust powers, and that led to this. Doctorow detailed how tech companies made the argument that they needed to be free to disrupt without oversight or regulation (or taxes), and governments bought it.
Ed Zitron’s Rot Economy says Doctorow is wrong not to notice this wider economic trend, but I guess he didn’t read Doctorow very closely? Ed Zitron says that just calling things out as the Rot Economy and raising awareness is the entire solution. I just don’t think that goes far enough.
Raising awareness is only helpful in getting more of the public to understand tech companies’ general enshittification process because it increases pressure on governments to regulate big tech and break up monopolies.
In the meantime, getting more techies to self-host (for both themselves and for their friends and families), and to divest themselves from reliance on Big Tech makes it more viable for everyone.
Directly financially supporting Open Source efforts (like DivestOS), or advocacy groups (like the FSF or EFF) or preferentially buying from open source vendors (even if they are more expensive than the cheapest windows/chromebook) is something non-techies can do. How about funding great websites like osnews that help inform people? Thanks!
Thanks for your really insightful comment. It struck me as odd that Ed Zitron felt the need to explicitly separate his theory from Cory Doctorow’s, but your clarification helps demystify this. I also agree with your conclusions about how to actively go about making a difference. Promoting self-hosting and financially supporting organisations that work against these trends are both very practical and friction-free ways to contribute and, at least for me, help make me feel a little less powerless.
I use almost no online services, and I am happier than the people around me who are dependent on them and think life without them is impossible or some kind of feat of strength. We are all free to ignore…
Oh dear, are “they” with us right now? Are “they” saying any bad things?
Would you like us to fu*k your ass? Yes/Later?
GrapheneOS! I have to purchase an Android device like a Google Pixel phone and install it. I’ve been on the hunt for a privacy-focused phone OS so I can shed my iOS phone.
You are right, of course.
But – to get rid of Google, you supported them by buying a *Google* Pixel (there are other deGoogled Androids; sadly, most supported phones are quite old).
To register here, I have to go through *Google* reCaptcha.
And in a previous post, you wrote that you almost don’t use cash anymore – please check the logo on your card (watch/phone) next time you pay for something (and your bank knows everything)…
As for search, have you tried https://kagi.com/ ? It’s not free, but it’s great; I’ve been using it since June 2024.
I’m hosting my own Nextcloud, email, even domains (yes, that is not for everybody, but hosted solutions exist – if you don’t want to be “The Product”, you have to pay, TANSTAAFL – https://en.wikipedia.org/wiki/No_such_thing_as_a_free_lunch ).
Social sites are a problem; I’m just avoiding them (using https://getsession.org/ to chat with friends).
Merry Christmas to everybody, and good luck getting rid of Google/Apple/MS/…!
I agree with the general thrust of the linked piece and I’m sure Ed Zitron is sincere in calling out the Rot Economy. But I’d find it more convincing if halfway through reading I wasn’t interrupted by two different pop-ups asking me to subscribe. I’d find it more convincing if there weren’t six separate adverts on the page asking for my email. I’d find it more convincing if there wasn’t cross-site tracking content from three different organisations embedded in the site.
Ed laments the fact “every single website has to have 15+ different ad trackers, video ads that cover large chunks of the screen, all while demanding our email or for us to let them send us notifications”. He says he’s “not holding back, and neither should you”. Given this I think it’s fair to point out that his blog is contributing to the problem.
I see this so much. It’s really not hard to create a site that doesn’t track users and isn’t peppered with user-hostile anti-patterns. So it saddens me when, so often, sites that are making the case against these practices undermine the argument by being complicit.
flypig,
I disagree with your points, but we need to take stock of why all innovation throughout history always ends up this way, be it radio/cable TV/email/internet/streaming/video games/sports events/the whole enchilada… it’s always the same enshittification; we might as well coin a Godwin-esque law to cover this inevitability. Almost everyone is pressured to resort to these tactics to make ends meet or to make more profits. All capitalist economies, given enough time, will end up with the spread of enshittification.
I had a website customer who had a marketing/SEO company review their ecommerce site, and they added an obnoxious full-screen popup to sign up for a newsletter. I mentioned to the customer that I thought it was annoying and tasteless, but the marketing company convinced them that everyone else is doing it, so it must be OK.
flypig,
I didn’t catch this when I wrote it, but I meant that I don’t disagree with your points. Very bad proofreading on my part, haha.
The best revenge is something positive. Choose something else even if it’s harder. It’s really all up to you.
“Not using Windows or macOS has improved the user experience of my PCs and laptops by incredible amounts”
THIS!