We’ve already talked about snaps on Ubuntu, but it turns out it’s actually way worse than I initially thought.
On the latest Ubuntu, if you try to install the .deb version of Chromium using either the Software Store or the command line, it acts as an alias for installing the snap version! Essentially, the Chromium snap is shoved down your throat even if you explicitly asked for the .deb. This is not cool, Ubuntu – just because Chromium may be easier to maintain as a snap app doesn’t justify this forced behavior.
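You can see it for yourself, too:

apt show chromium-browser

The output should describe a transitional package whose dependencies pull in snapd – the .deb is just a shim that hands installation off to the snap.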
[…]Snap applications auto-update, and that’s fine if Ubuntu wants to keep systems secure. But auto-updating can’t even be turned off manually – it can only be deferred at best, until at some point, like Windows, it updates anyway. Even on metered connections, snaps auto-update after some time.
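For reference, deferring is about all the stock snapd settings allow. A rough sketch of the relevant commands (option names per snapd’s documentation; exact behaviour varies by version):

snap refresh --time                                  # show when the next auto-refresh is due
sudo snap set system refresh.timer=fri,23:00-01:00   # confine refreshes to a weekly window
sudo snap set system refresh.hold="$(date -d '+60 days' +%Y-%m-%dT%H:%M:%S%:z)"   # postpone outright

Even refresh.hold is capped (on the order of 90 days), after which snapd refreshes anyway.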
I only use Ubuntu on my laptop right now – my workstation and main PC run my distribution of choice, Linux Mint with Cinnamon – because the latest version of Ubuntu supports it better than the current Linux Mint release does. As soon as the next version of Mint is out, which will be based on the current Ubuntu version, I’m ditching Ubuntu right away.
I don’t like snaps, Flatpaks, AppImages, or any of that other nonsense that does nothing but make a clean .deb/APT-based system more complicated than it needs to be. Debian’s package management system is incredibly robust and easy to fix in the unlikely event something does go wrong, so I simply do not have a need for additional application installation methods that I can’t control through APT.
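For what it’s worth, “easy to fix” usually amounts to a couple of standard commands:

sudo dpkg --configure -a          # finish configuring any half-installed packages
sudo apt --fix-broken install     # pull in whatever dependencies are missing

That covers most of the breakage APT ever gets itself into.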
Ubuntu has only just recovered from the Unity debacle, only for the project to now go down yet another route nobody is asking for.
Snaps are slow and awful in my experience. I use several Flatpaks on my Fedora install for things like VLC and Telegram, because for those programs it makes sense to me to use the Flatpak deployment system to stay up to date.
Ubuntu is shipping the System Monitor and Calculator as Snap packages. My calculator doesn’t need to be so up to date or secure that I’m willing to accept the extremely slow start-up time of Snap packages as a trade-off.
It feels like they are doing this with every package to sell users on Snaps, but it’s just a mess. I hated running Snaps when I tried Ubuntu 20.04, and said to myself there’s no way I could ever use Ubuntu with all the Snap stuff they have interwoven now.
Canonical does some good things for Linux, but Snaps don’t feel like one of those things.
If Debian’s package management is so “robust”, why does each distro and every version of a distro need its own set of repositories, with some of them hosting outdated versions of the apps? Imagine if, when using Windows, you had to download everything as an .msi through Windows Update – a different .msi per Windows version, mind you. And a different one for the Home, Pro and Server variants for good measure. And then Microsoft had to package those .msi files, which means some of them would be outdated due to a lack of manpower to maintain them all. Because this is what Linux package management currently is.
Please make Snaps or Flatpaks happen… Linuxeros stuck in the past can use Devuan, which as a bonus also shuns systemd.
Different strokes for different folks? Not every user or setting WANTS the latest version of every package; in fact, there are countless use cases where sticking with a more robust, well-tested set of packages instead of the latest and greatest is greatly preferred. So, some users will stick with a set of packages and repositories that are more conservative, such as LTS releases or Debian, while some might want the latest and greatest and opt for a distribution or version of that distribution that caters to that need.
That’s not a weakness of APT (or RPM), but a strength.
“Robust” and “well-tested”. That’s the mindset that puts people into zero gravity. It gives us IBM, Ada, Go, and Rust.
Not every situation calls for such reliability. But that reliability is sometimes necessary.
“well tested set of packages” is the mindset of someone too deep into Linux compared to the average person. They only care that actual end user software is up to date, and all those “packages” that should really be part of a core OS aren’t their concern. As for user software, I recall Blender used to be horribly out of date in Ubuntu’s repos, even though it typically released every 6-8 months at the time. It’s not that they want something “well tested,” it’s that they want something that just works out of the box, and they want the latest end user software. Meanwhile Linux keeps throwing everything at the wall and hoping something sticks, but the only thing that sticks is stuff that is user unfriendly and wins technical arguments among the userbase. RPM/deb/Flatpak/Snap/etc. will never be a magic bullet that suddenly wins a bunch of users over to Linux, or wins a technical superiority argument amongst existing users. Just another case where Linux has 20 solutions because there’s no unity in the community, and a worse product for end users is the result.
“well tested set of packages” is the mindset of someone too deep into Linux compared to the average person. They only care that actual end user software is up to date
Average Linux users care less about version numbers and care more about whether or not the software is stable and works correctly. In the Linux world, running the “newest version” typically means you’re the alpha or beta tester. Thom is not wrong when he says there are users who prefer using known reliable versions over whatever the newest version is.
The problem is you’re only thinking about Linux users that have learned to tolerate and know all of this stuff. How does that work out for end users being told “you should install LTS instead of the one you installed, and then PopOS instead of LTS later because it’s more ‘tuned’ for nvidia drivers”? “Linux people” will tolerate this; the average user does not. Especially when the “latest version” is user facing software that’s well tested and stable, but waiting on someone to “package” it for whatever distro they picked.
Unfortunately, especially in the case of KDE, old usually means buggy. The number one reason I found myself fighting with the package manager to get a more recent version of some software was annoying bugs that were advertised as having already been fixed by the developer. The “beauty” of the OSS “agile” approach is that 90% of beta testing is shifted to the user base (that’s the price of free). And as with agile in general, it only works if you can release often and get your code to the user fast.
Because they chose a different packaging strategy to Debian. Sticking to an “outdated” version of an app may also be about providing stability. Not so long ago even Ubuntu would refrain from upgrading e.g. Firefox during the lifetime of a release. However, having an older version does not mean it is somehow “outdated” — it might well be supported even by upstream WRT security updates and such.
Unfortunately, by proliferating released versions, Linux maintainers are shooting themselves in the foot, as every new build at least partially invalidates the testing effort committed to the previous package. However, in the Linux world that burden is pushed to the user base, so the effort is free – no problem, right?
At the end of the food chain are paying customers who get the version that guinea pigs have played with long enough.
On the other hand, “stable” should frequently be read as “stale”, because there’s simply not enough manpower to maintain a zillion versions, especially if a given aspect is not what your customers are paying for.
That is the entire point of the concept of Linux distributions. You get a consistent set of software which is engineered to work properly together. And when you need some really up-to-date software, you just install it from the official binary on the official website, which is generally available, and looks mostly like your average setup.exe or foobar.msi on Windows.
Now compare this with Windows, where you manually install 50 pieces of software, and then a third of them have crappy updaters and the rest become insecurely out of date within a matter of months. Yeah, that’s just perfect.
So sure, the situation is not perfect on Linux distributions, but it works rather well.
I am sorry, but I cannot see the point of snaps and flatpaks for the average piece of software. Sure, it solves pretty much every corporate use case where you need severely outdated software, but I do not think you are referring to those uses. I guess you are talking about the enthusiast use case, where you need every single piece of software to be up to date, ’cause it is better, you know. Been there, done that: I used Gentoo for two years, then Arch and Manjaro for two years. I encountered non-working updates and got annoyed at having to seek answers on the web every couple of months or so. This is not fun.
So I got back to Ubuntu and Debian.
loic,
Not really, most software on linux outside of the repos is a tarball with source that needs to be built at the command line using a random assortment of build tools. Over the years new build tools were developed to address both real and perceived issues with the old ones; however, this in itself has resulted in an extremely fragmented, non-standard process for installing software. As a developer I do this frequently, but it’s not for the faint of heart, and if we’re being honest it’s nowhere near the average setup.exe or foobar.msi on windows. When people talk about linux being easy, they are talking about installing software through the repos, which is easy.
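For anyone who hasn’t been through it, the “classic” autotools flow looks roughly like this – foobar is a made-up project, and plenty of projects use cmake, meson, scons, etc. instead, each with its own incantations:

tar xf foobar-1.2.tar.gz            # unpack the source (hypothetical tarball)
cd foobar-1.2
./configure --prefix=/usr/local     # probe the system for compilers and dependencies
make                                # compile
sudo make install                   # copy the results into place

Note that nothing installed this way is tracked by APT, so upgrades and removal are entirely on you.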
The repos can cover most people’s needs, but installing software outside of the repos can be quite difficult, enough to drive away users. This is the motivation behind snap/flatpak, and I appreciate what they are trying to do, but I just wish we could all work together to find a consolidated solution, because if we fail to do that we may well end up dependent on a random assortment of installers in the same way that we depend on a random assortment of build tools that ultimately do the same thing in different ways.
It works rather well… but only for those use cases it handles well.
I’ve been a huge proponent of debian stable for well over a decade because it has such comprehensive repos and doesn’t break too often. It works well so long as your needs are totally satisfied by the software in the repo. For many of us that’s most of the time, yet when you need to install manually it becomes tedious. You iteratively search for and install dependencies, build tools, etc. It’s a form of DLL-hell. This is an area where linux needs some new innovations like snap. I’m not going to suggest “snap” is the best solution, and it probably could be better, but I certainly understand the problems that it’s trying to solve.
If you don’t have those problems yourself, that’s great, but try to appreciate that not everyone is in your boat.
The tarball distribution is more of a statement than a technical obstacle.
It says: I do not take responsibility for how this software works outside of my system; you’re on your own.
It’s a key point defining the relation between the provider and consumer of the software.
And it is the one of creator and integrator. In the Linux world, every user is put in the role of software integrator, because Linux is in essence a very elaborate embedded software ecosystem.
If you embrace it, and most software developers are capable of doing so, it can work better than off-the-shelf solutions.
But once I understood that, I stopped offering it to my relatives or friends, as I’m not willing to take on the role of system integrator for them too.
dsmogor,
It’s fine if this is your opinion; I wouldn’t recommend linux for those who don’t like to DIY either. However, for those who want linux to be more accessible to the masses, such difficulties are part of the problem.
For users who strictly stick to the repos/ubuntu store, the desktop linux experience is arguably on par with consumer mobile platforms like android. Hell, if we’re giving credit where credit is due, then linux distros were doing this stuff before either iOS or android! But… desktop linux software continues to be deficient regarding mainstream support compared to windows, iOS, android, etc. I suppose some linux users don’t care about this, but there’s a lot of room for improvement for users as well as developers. The thing is, it requires cooperation, and some of the archaic methods that continue to be widely practiced are holding things back. People tend to be quite stubborn though, and it’s for this reason I expect linux will continue to be fragmented. Maybe this is the way it has to be, but ironically I think nearly everyone would be better off if only we were willing to work together to make it better.
Hey @Alfman
> For users who strictly stick to the repos/ubuntu store, the desktop linux experience is arguably on par with consumer mobile platforms like android. Hell, if we’re giving credit where credit is due, then linux distros were doing this stuff before either iOS or android!
I would say that’s only the case for vertically integrated solutions like the Linux laptop companies, where the full hardware stack is actually supported by the distro provider. In most other cases there’s almost always some manual integration work involved, work that is frequently a matter of a quick Google search but still beyond the casual customer’s ability.
Regarding mainstream support, the situation is actually getting better than ever thanks to three factors:
– game availability has never been better, thanks to massive dedication by Wine and Steam developers
– HW advancements have made virtualisation-based solutions like snap practical, and commercial software developers have taken notice
– the Linux desktop has finally found its sustainable niche as a software developer workstation. I can imagine that if Apple is really insane enough to follow Thom’s macOS prediction, many devs will follow in his footsteps. Many companies have decided to open source their crucial tools, so now those are properly integrated into the linux ecosystem, snap or not.
dsmogor,
It sounds like you are talking about hardware compatibility, whereas I was talking about the ease of installing software from the repos. Still, you aren’t wrong that it helps to buy hardware from a manufacturer that officially supports linux.
No doubt it’s getting better, though still behind.
Snap isn’t based on hardware virtualization, it uses software containers.
Desktop linux has always been almost entirely comprised of software developers like myself. Maybe it’s finally expanding past our niche though? I’m curious if someone like Thom would be using it if he wasn’t otherwise involved with osnews.
Developers may bemoan the changes, but they will always go where the users are because that’s how we have to put food on the table. Obviously there’s some positive feedback looping involved too (they need apps to get users) but when the companies we’re talking about already have a sizable base, it helps them create a critical mass for their platforms regardless of our protesting restrictions. Microsoft knows this and so does apple. Companies have been realizing that they can get away with more because most consumers are sheep and won’t necessarily act in their own interests, which can be exploited for profit. The trick is to do it in such a way that it minimizes backlash, but it bears keeping in mind that they don’t have to convince us (ie you and me), they just have to convince enough sheep and then peer pressure, popularity, and network effects will do the rest.
I tend to agree, and we should laud Ubuntu for making a sacrifice by shifting manpower to snap maintenance for the benefit of the whole Linux ecosystem. Clearly at the expense of its own user base, as Thom points out. But that’s ok; choices are aplenty, and the company behind Ubuntu is not dependent on desktop revenue.
Having said that, shipping a calculator as a snap is one step too far. Your Windows analogy is spot on here (Windows ships with a calculator after all); the borderline between core OS packages and third-party software should be crystal clear.
On the one hand, you can’t innovate without ignoring what people say they want. On the other hand, you are virtually guaranteed to go down roads that won’t work out if you do ignore the users.
Apt is fine; I run a setup very similar to the one in the commentary, using Linux Mint for my non-MS/Apple machinery.
I stay on a relatively slow update schedule for Linux and generally have very few problems, the irony for me being that my most common problems come from AppImage updates, which are supposed to stop problems with updates.
But that is life in open source, and I won’t bemoan the diversity, because it’s a strength. The alarm bells for me ring when people start claiming standardisation on some preferred solution will be the cure for all your problems – you just know it’s all going to turn sour the minute they head down that track!
cpcf,
Standardization isn’t the problem though. The problem is that, as they currently stand, deb repos don’t handle all use cases equally well. While repos work for the vast majority of software, for some users, including myself, repos don’t always get you what you want or need. In such cases it becomes rather difficult to manually install what you are looking for along with its dependencies. Alternative installers like snap have come in to fill the gaps, which wouldn’t necessarily be so bad, but it turns out snap is very inefficient compared to the repos. Ideally we would have a standard that takes the pros of each approach while minimizing the cons. I think it’s technologically doable to do better than any of the current solutions, but the bigger question is whether there’s a will. Open source is full of politics and ideological differences, which could get in the way of the cooperation needed to pull it off.
The problem is that, as they currently stand, deb repos don’t handle all use cases equally well. While repos work for the vast majority of software, for some users, including myself, repos don’t always get you what you want or need. In such cases it becomes rather difficult to manually install what you are looking for along with its dependencies.
It _can become_ difficult, but that isn’t automatically the case. I’ve found that a lot of the time the dependencies are readily available via repo and the only `real` work is configuring the software you’re going to compile. Even that isn’t much of a challenge most of the time. If you’ve gotten to that point, you probably either have some clue what you’re doing or you’re following some kind of instructions/howto guiding you through it. Yes, compiling things yourself can be a serious pain in the ass. Yes, the dependency hell can be real and make you want to punt the keyboard over your neighbor’s house. _But_, I haven’t found that to be the default. In my experience, and I’m only talking about _my experience_, the majority of stuff I’ve compiled has been little to no problem.
friedchicken,
Sure, the difficulty of installing software changes on a case by case basis. Some software has very few dependencies other than pre-installed linux libraries that are both stable and easy to work with. I always cross my fingers that the dependencies I need are available in (and compatible with) the repos, but when we’re not so lucky it can take an inordinate amount of time to get a working build, because there’s often a cascade effect of more code dependencies. We could just say that’s the way it has to be, deal with it, but I don’t truly believe it has to be like this. Our tools for dealing with this really should be much better.
Something I’ve gotten absolutely sick of dealing with over the years are configure scripts. They should tell you all the dependencies you need up front. The part that checks for dependencies is typically at the end of a time-consuming configure script that invokes the compiler hundreds of times to probe its compatibility, which is already annoying to begin with. And instead of telling us everything that’s missing in a single go, it aborts after the very first missing dependency, forcing the user to install them one at a time and rerun the configure script over and over again for each dependency. This procedure is stupid, and to make matters even worse, sometimes the error messages are too vague to identify exactly what’s missing, in which case you have to debug the project just to see what it’s looking for.
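It wouldn’t take much to do better. Here’s a rough sketch of what a configure script could do instead – check everything first, report everything at once (the package names are just placeholders, and the dpkg check is Debian-specific, purely for illustration):

missing=""
for pkg in libssl-dev libpng-dev zlib1g-dev; do       # placeholder dependency list
    dpkg -s "$pkg" >/dev/null 2>&1 || missing="$missing $pkg"
done
if [ -n "$missing" ]; then
    echo "configure: missing packages:$missing" >&2
    echo "try: sudo apt install$missing" >&2
    exit 1
fi

One pass, one readable error, one command to fix it.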
As with many things involving legacy software standards, a lot of the technology stack foundations we build on are kind of deficient, and yet we keep using them rather than agreeing to fix them. It all becomes part of the permanent cruft baked into our software & infrastructure.
Something I’ve gotten absolutely sick of dealing with over the years are configure scripts. They should tell you all the dependencies you need up front. The part that checks for dependencies is typically at the end of a time-consuming configure script that invokes the compiler hundreds of times to probe its compatibility, which is already annoying to begin with. And instead of telling us everything that’s missing in a single go, it aborts after the very first missing dependency, forcing the user to install them one at a time and rerun the configure script over and over again for each dependency. This procedure is stupid, and to make matters even worse, sometimes the error messages are too vague to identify exactly what’s missing, in which case you have to debug the project just to see what it’s looking for.
Agreed! Dependencies should be presented, with clarity, to the user up front. Required dependencies, and ones that are optional based on intended configuration. I never understood why it’s sooo difficult to do this.
Another thing that bugs me about configuring is when switches don’t have obvious meanings or actions. Some devs seem to ignore the fact that more than themselves compile their software. You’ll have configuration switches that manipulate obscure internal settings & flags, are double-negatives, lie about what’s installed, etc. It’s ridiculous when you have to reverse-engineer how to actually configure things for a desired outcome. I’ve heard some devs argue that the easier the user-facing stuff is, the harder things get to maintain. I think that’s more an issue of bad design than anything else.
I don’t like these flatpaks/snaps/etc. either, but it’s undeniable that they were born out of a problem with the existing distro packaging infrastructures. I’m not sure whether the solution is actually better than the problem, though.
I see Linux Mint is already offering things from Flathub in its Software Manager UI, which I assume indicates a flatpak package… Thom, do you know what you’ll replace Mint with, then?
Snap and friends are good for testing or evaluating a package. As for relying on it for day-to-day use (a browser)? No thanks. Isn’t having the choice great!
Yeah, I was trying out Ubuntu last time I bought a new laptop. It is really about time I return to Debian.
Sounds like it’s time to `apt-mark hold` the snappy components after removing them, and then start investigating either a Chromium PPA or a move to Debian. (For those who aren’t familiar with it, `apt-mark hold` lets you pin a package at its current state. I used it to build some scripting which ensures my nVidia drivers only get updated at startup, so package upgrades can’t introduce a kernel-libGL version mismatch that needs an X restart to fix.)

And to completely prevent snappy from existing on your system, run “apt-get purge snapd”, then add the following to your /etc/apt/preferences.d/10pin file (depending on your apt version, use either the quoted or the asterisk form):
Package: snapd
Pin: origin ""
Pin-Priority: -1

Package: snapd:i386
Pin: origin ""
Pin-Priority: -1

Package: snapd:amd64
Pin: origin ""
Pin-Priority: -1

…or, for apt versions that want the asterisk:

Package: snapd
Pin: origin *
Pin-Priority: -1

Package: snapd:i386
Pin: origin *
Pin-Priority: -1

Package: snapd:amd64
Pin: origin *
Pin-Priority: -1
Then all versions of snapd are blacklisted forever.
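You can verify the pin took effect with:

apt-cache policy snapd

With the -1 priority in place, the candidate version should show as “(none)”, meaning apt will never install it.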
May I ask, why not use Pale Moon instead? Do you really need Chromium for some critical software that does not run in Pale Moon? Pale Moon is faster, uses less RAM and has a much better track record with bugs and security issues. And the biggest boon is that it gets you away from the google code base.
On top of that, it is more standards-compliant than Chrome. For example, when a site is blocked by your employer, it does not attempt to reach it 40 times (hammering the site), nor does it shit itself when starting up an old session with many youtube videos. Also, it has a global dark mode, which makes the web soooo much more enjoyable. Reading wikipedia does not damage your eyes any more.
I might have a look at it. I’ve been on Firefox for a long time.
Couldn’t agree more with you Thom. I’d even accept Ubuntu Software and the Ubuntu MATE Software Boutique (and their analogues in other Ubuntus, if any) installing snaps whenever available, because naive users are those most likely to be using GUI package managers anyway, but installing snaps for things you specifically ask for the .deb for, using the terminal, is just rude. Hopefully, there will be a big backlash over this like there was over Manjaro and FreeOffice.
I fully support the *idea* of snaps and flatpak, however this implementation with the forced auto-update and the tricking you into installing a snap when you thought you were installing a package sounds really annoying.
This is absolutely true. I’ve enjoyed Snaps, but what you mention are two of my biggest issues. The auto-update I kind of learned to accept given that application developers can push out tracks (different versions of software). If you can convince the app developer to have a lot of tracks, you can basically stick to one version.
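Concretely, following a track looks something like this (the package and track names here are made up; “snap info” lists what a given package actually publishes):

snap info somepackage                                 # lists the available tracks/channels
sudo snap refresh somepackage --channel=2.x/stable    # follow a specific track instead of latest

Auto-refresh then keeps you within that track rather than jumping to the next major version.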
The sneakiness of hiding a snap install behind apt is something I really don’t like. Let’s assume good intentions: they’re doing it for ease of use, so people used to installing via apt can keep doing the same.
It just creates complications and indirection making it harder to use tools. Now I have to keep a mental model that apt might use snap in the background.
Hell, bring back Lindows’ CNR instead, for that matter.
I wouldn’t call Unity 7 a debacle. It was and for some still is the best DE that was ever made for Linux. Tasteful front end and capable back end in the form of Compiz.
The AppImage format is something that doesn’t try to replace .deb packages. It provides an option where developers can package their software and dependencies and redistribute that in a way .deb packages can’t be redistributed. You don’t have to install anything and can run the software from a USB key. It’s not fair to throw AppImage in the same basket as Snap/Flatpak and dismiss it due to the fact you don’t like Snap/Flatpak packages.
As for the Snap/Flatpak packages: I don’t feel they are there yet, and both could end up being failures. Still, looking forward, such attempts will result in the Linux packaging situation improving over the next decade. Traditional Linux packaging isn’t all that bad, but it is still far from perfect.
Unity had a breaking bug (#906231) that wasn’t fixed for years, and it was still present after the “fix”. They literally couldn’t get the absolute basics to work properly.
C’mon, you can’t reduce a discussion about a whole DE to a single bug that affects you and that you feel should be fixed. At Unity 7’s peak there were millions of people using it; if a single bug played such a big role, surely others would have heard of it. Unity 7 got a couple of years of usage and was considered a success, not a debacle. That was my point. In general, people using it, and news sites, expressed positive feelings about it. The pressure of Unity 8 (which was a completely different thing compared to Unity 7) and nobody being prepared to port Compiz to Wayland most likely contributed to Ubuntu going back to GNOME Shell.
Now, whether people are happy with GNOME Shell is another story.
Why not just use Debian with your chosen environment?