PolishLinux has an editorial on program installation on Linux systems, and even though it’s a bit hard to wade through (the author’s native language sure isn’t English) it does make a number of very good points in favour of the way most Linux systems handle things. Still, as always in discussions on program installation, it feels a bit like listening to a deaf man and a blind man arguing about whose condition is the easier one to live with.
For the sake of discussion, I usually identify three major ways of installing programs. First, we have the Linux method, using package management tools like dpkg and yum to install .deb and .rpm packages. Then there’s the Windows method, using built-in installers and uninstallers that install a program in the desired location. Lastly, we have the Mac OS X method of using application bundles (mingled, however, with programs that use an installer and cannot be uninstalled).
All of these methods are inherently flawed, are evil, and have front-row seats in hell (and I know because they stood behind me in the queue). I’ve already discussed this in great detail in my article on the utopia of program management, but let me repeat the segment on what’s wrong with current implementations of package management:
Mac users are always quick to point out the benefits of their self-contained applications: one package to download, no installation procedures, easy to remove. While this seems ideal, there are many problems with the actual Mac OS X implementation of this idea. Applications in Mac OS X are generally not easy to remove at all, because they leave a trail of files outside of /Applications that normal users rarely encounter. Over the course of time, this can amount to quite the mess. In addition, Mac OS X provides no way of updating applications centrally, resulting in each application in OS X having its own updater application; hardly the user-friendly and consistent image Apple tries to adhere to.

The Windows world is not much better off – in fact, it is probably worse. Not only does it have to deal with the same problems as OS X, it also has to deal with dreadful installers. Some of them are such usability disasters they make me want to curl up in foetal position and cry. And then, when I’m done crying, I can start all over again because the uninstallation procedure is just as dreadful.
This leaves us with the Linux world. It has the centralised, easy updating application – the update application in Ubuntu, for instance, is an excellent example of a proper balance between providing enough technical information for experts and hiding all that fluff from normal users. However, Linux suffers from other problems. Dependency hell, while not nearly as huge a problem as it used to be, still exists to this day. Downloading a package outside of the repositories is a risky business, but it really shouldn’t be. You are completely dependent on your distributor updating his repositories and keeping them clean – nothing is as annoying as knowing there is a new version of Super Awesome Garden Designer Ultimate Edition, only to realise every distribution except yours has already packaged it.
These limitations still stand. Defending one method over the other comes across to me as a deaf man trying to convince a blind man that being deaf is less of a problem than being blind; while he may be right, it’s a rather pointless discussion for those who are neither blind nor deaf.
What I’m trying to say is this: sure, Mac OS X, Windows, and Linux users can chest-thump over who has the best method of program management, but it’s all for nought: none of those methods will get better this way. The only way to truly move program management out of the dark ages is to start from scratch, and implement something that has been thought through from the start – something that incorporates the needs of the many (easy to use, no fuss), but also caters to the needs of the few who need more advanced functionality.
My utopia of program management (be sure to read the comments, lots of interesting stuff in there as well as more information on the CLI side of my system) is just one way of doing that (Haiku is working on a system quite like what I have in mind), but I’m sure there are others. The point, however, is that arguing over the current methods of management is kind of pathetic. They are all either blind, deaf, or paraplegic.
I must say that comparison seems particularly apt, and not just when discussing package managers.
A click-and-run kind of web interface for newer apps would solve nearly all of these problems.
For example: I want OO.o 3.0 on my Ubuntu 8.10. So I go to the “CNR” site and it adds the PPA and installs OO.o.
That way even updates would work.
Of course there would have to be a lot of PPAs (even more than there are now) and they would have to be properly maintained, but with more and more people using Ubuntu it could work and scale quite well.
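Under the hood, such a site wouldn’t need to do much more than this (the PPA name here is made up, just to show the idea):

    # what the “click” would effectively run on an 8.10 box:
    echo "deb http://ppa.launchpad.net/oo-packagers/ubuntu intrepid main" | sudo tee -a /etc/apt/sources.list
    sudo apt-get update
    sudo apt-get install openoffice.org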
That doesn’t solve the underlying problem, though. There are too many packages and not enough packagers, and even if someone hosted a PPA for each package, that doesn’t mean they will constantly be able to maintain PPAs for each version of Ubuntu they intend to support. Not everyone wants to perform an OS upgrade just to get the latest version of OpenOffice, for example, and the PPA maintainer would have to maintain separate builds for at least one version prior to the current one. The LTS releases in particular would need to be maintained alongside the latest versions.
I don’t see this as an optimal solution at all; it’s just an extension of the current problem rather than a solution to it.
Linux style package managers make managing software easier.
Windows style “Download & setup” makes discovering software easier.
There must be a halfway house between these two methods. Obviously I’m biased towards the “Package directory” method: for example on Syllable you download a single archive (ZIP) of the software and unpack it. There is no package management as such. Even this could be easier and make upgrading a better process.
Ah Vanders, now that you’re here anyway…
The model I put forth in my original article (so not this one, but the one I linked to) – is something like that feasible on Syllable? You guys seem to be working with bundles anyway (bundles are just directories), so how hard would it be to implement the advanced functionality I detailed in my article?
I have limited knowledge of Syllable, so correct me if I’m wrong, but I guess the biggest hurdle is the reliance on metadata – how well does Syllable’s FS support metadata?
You are right in that no package management is required on Syllable, but at the same time, this puts it in the same league as OS X, and it therefore inherits its problems with centralised updating.
AFS (the Syllable filesystem) is comparable to BFS when it comes to metadata: it supports arbitrary metadata streams on any inode, and (in theory) indexing and live queries, although these are not yet implemented.
The idea of storing additional metadata on the application directory is one we’ve discussed and may well implement in the future. Likewise, I’d like to have a central “update server” that applications can register with and which will watch for and download updates on their behalf. Such a system would have to be carefully designed, however, as it could clearly be abused by a nefarious application.
Hmmm… well, I agree with this statement partially.
Let me first emphasize that I find downloading programs from a (possibly untrustworthy) source using a web browser, and then needing even more interaction to get them installed on the system… yes, I think I’d call it somewhat antiquated.
Modern (Linux and UNIX) operating systems provide excellent means to install and upgrade software. In most cases, they even allow you to search for specific software (within the repositories, or whatever you want to call the sources of the software packages). These search functions usually involve text pattern searches within the software descriptions.
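On a Debian-style system, for example, that boils down to something like this (the package name in the last two lines is made up):

    apt-cache search "garden designer"    # pattern search over package names and descriptions
    apt-cache show supergardendesigner    # full description of one of the hits
    sudo apt-get install supergardendesigner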
This search functionality in particular is where I’d like to see improvements. Some screenshots could be added as examples (especially useful for GUI applications). But this would have to be maintained centrally, to preserve the concept’s strengths.
Of course, this system could be interfaced with a web browser, no problem. A CLI utility would always be welcome (from the standpoint of accessibility and automation).
Personally, I’d prefer a kind of (system) application that allows me to select a specific piece of software, shows me what version is installed and what version is available, and then goes on to perform the upgrade, with minimal interaction. But I’m in the minority, I know, because the majority prefers to do this with many clicks in the web browser and next, next, next, reboot. 🙂
Screenshots, you say?
http://screenshots.debian.net/
Things seem to be moving that way, at least – hopefully for the likes of aptitude-gtk and Synaptic.
The problem with getting all of your software from a central repository is that repositories are essentially a “closed” system. It’s easy to say “ah well, someone will package the software for my distribution”, but what if they don’t? There are millions of small and obscure applications that are not in distribution repositories. How can users discover and use these applications? If they’re lucky, the primary developer will provide a package that is suitable for the end user’s chosen distribution, but you can’t expect a single developer to learn three or four packaging systems and to build and maintain packages for a whole bunch of distributions.
Which leaves us right back at ./configure && make && make install – i.e. no package management at all.
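In practice that’s the familiar unmanaged dance – a sketch, assuming the usual autotools setup:

    ./configure --prefix=/usr/local
    make
    sudo make install    # no record of what went where, and no uninstaller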
Zero Install is an interesting solution to the problem, and it’s one we’ve looked at and earmarked for Syllable at some point in the future. However, packaging Linux software for Zero Install could be seen as just as much hassle as packaging for, say, Debian, so why should developers support it?
Developers want people to test developer versions (GIT checkouts, etc), so they need to tell people the dependencies somehow anyway. Zero Install lets you do that in a machine-readable way, which doesn’t need to be harder than the alternative of writing a page of instructions.
Once you’ve got the dependency information in your code repository (where it belongs), actually releasing the package can be very easy. e.g. using 0release I just have to enter my GPG pass-phrase to sign the release and the archive gets exported from GIT, uploaded to the FTP server and the new release is added to the Zero Install feed (marked as “testing” so adventurous users will start using it first).
http://0install.net/0release.html
[ NB: I am the author of 0release ]
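(For anyone who hasn’t seen it, the user side is a single command – the feed URL here is just an example:

    0launch http://example.com/superapp.xml    # fetches the feed, resolves and caches the dependencies, runs the program

and Zero Install will pick up later versions from that same feed.)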
Linux has no package manager. Linux is a kernel.
Ubuntu packages do not have to install on Mandriva, because Ubuntu and Mandriva are two different OSes.
If the latest version of your super program is not available in the package repository you use, you are out of luck, but how is that different from Mac OS X or Windows? How can I easily install the latest SVN version of OpenOffice on Windows? Answer: you install it exactly the same way as on Ubuntu or Mandriva: you get the source, you get yourself a compiler and you compile it. If the binary package is not available, that is the way it works, on any system (except on source-based distros like Gentoo).
Now, you can’t put all “Linux package managers” into one group. There are more package managers for GNU/Linux systems than on all other systems put together. You have Conary, apt-get, Portage, installpkg and dozens of other package managers. Comparing Linux to Windows to Mac OS X is kind of silly.
Actually, talking about package management is talking about GNU/Linux. Mac OS X and Windows have some kind of package management and you are stuck with it. On GNU/Linux systems, you have the choice of several package managers and methods of installation, and it makes sense to talk about it.
Blah blah I’m not going to repeat “Various Linux systems” every time. Don’t be so childish.
I’m not talking about SVN versions. I’m talking about ACTUAL releases that take far too much time to enter the repositories – if at all. Infamous is the problem where various newly released pieces of STABLE software are not available for Ubuntu x – because they are in Ubuntu x+1. This is very fcuking annoying, since there’s no reason why I have to move to an unstable system just to install a piece of stable software. I recently faced this problem with Xfce 4.6. It annoyed the flying monkeys out of me.
In any case, you just proved my point. You are the deaf man trying to convince the blind man that being deaf is less bad than being blind. It’s a pointless discussion.
Software management on any system today is broken, old, counterintuitive, and harder than it needs to be. A clean break is needed, so go read my proposal and see how it provides all that you like in Linux – and MUCH, much more.
http://www.osnews.com/story/19711/The_Utopia_of_Program_Management
This is a fair enough comment, but it could do with some context. For example, a lot of people are using Office 2003 (five years old) running under XP (about 8 years old) … so exactly why having to run OpenOffice 2.4.x (a year old?) on Ubuntu 8.04 or 8.10 versus OpenOffice 3.0 (six months old?), or not being able to run the very latest XFCE 4.6 (1 month old) should be quite so vexing is perhaps open to question.
However, given that it apparently is so, and that there are some people who desire to run cutting edge software rather than more stable older releases … then perhaps the best approach for them would be to use a Linux distribution that employs a “rolling release” policy. A “rolling release” distribution ships the latest versions all the time as opposed to a “scheduled release” distribution (like Ubuntu is) which has scheduled releases with fixed functionality (Ubuntu’s schedule is one release every six months).
Arch Linux, Debian Sid, and Gentoo Linux are perhaps some of the better-known “rolling release” distributions. Each of these distributions would typically support installing of a new version of applications (such as OpenOffice 3) perhaps a week or so after the release announcement.
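On Arch, for instance, keeping the whole system current is a single command:

    pacman -Syu    # sync the package databases and upgrade everything that has a newer version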
Mind you, there is a cost to such “currency”. These distributions do not claim to feature stable, tested software. Install at your own risk.
Obligatory backup link:
http://www.debian.org/releases/unstable/
Not that I disagree with your views – the way that Linux (and others) currently installs software is unsustainable and utterly insane, technically.
However, most big-name software with Linux support provides easy-to-install packages, at least for Ubuntu and Fedora. And that includes OSS like OOo and proprietary software like Skype. So the Windows way is there, if the makers of the software actually want to support your OS.
Within the current system it would be nice if they maintained repositories that could keep you up to date, but they probably just don’t want the extra work and know you are going to go to their home page and download it if you are so desperate.
I think you are wrong about Linux. I think the mere fact that Debian has the largest set of packages of all distros says that it can work. But they aren’t updating as often as, say, Fedora. Different people want different things. Some want ease of use, some want stability, some want the bleeding edge. You can’t have everything, though; some things are mutually exclusive.
You are blind; you didn’t get the point. Package management is very different when you have Slackware and when you have Gentoo. VERY VERY different. More different than between Mac OS X and Windows. All your talk about Linux package managers applies to neither Slackware nor Gentoo. You just talked about Ubuntu, Mandriva and a handful of others. I’m not being childish, I’m just pointing out that various Linux systems have various ways of managing packages.
The SVN version of OpenOffice was an example. It is exactly the same thing with your example. How can you install Xfce 4.6 on Windows? Well, you know what? I managed to install 4.4 on Cygwin, and I not only had to compile it, I had to tweak it to death to make it work – and the Windows package manager didn’t help me one bit, and that was 4.4, which is quite old already. Moreover, you took the worst example possible, because Xfce 4.6 is one of the easiest things to install on any Linux system. There is a nice GUI to compile it on any POSIX system that feels like your Windows installer.
Anyway, do you get the point? If the binary packages are not available, this is a problem on ANY system, and I’m not just talking about OpenOffice or Xfce, I’m talking about ANY package that is not available in binary form for your system. And I guess most Linux systems handle the case where the binary package is not available much better than both Mac OS X and Windows. Try installing Xfce 4.6, for instance, or garden designer (not available on Windows).
It is a nice package management scheme, but not for everybody. For desktop systems it is good, but for my server I want a disk for /var tuned for fast writing, a disk with /etc which is replicated, and a disk with /usr with fast writing. I want to check my /lib for rootkits and viruses, and I want to compile every library with -pie. I also don’t want to compile unnecessary options, for security. Maybe Portage is for me, then.
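A rough sketch of what that looks like in Portage terms (the flags here are purely illustrative, not a recommendation):

    # /etc/make.conf
    CFLAGS="-O2 -pipe -fPIE"
    LDFLAGS="-pie"
    USE="-X -gtk -qt"    # leave out the unnecessary options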
You see, there is no one-size-fits-all solution. Try GoboLinux, which is close to your package manager, I think, or roll your own; I’m sure you will find many people interested.
How about a rolling release schedule (or lack thereof) like Arch Linux?
Either way, for non-critical software upgrades you have the Ubuntu backports, and sometimes the Launchpad PPAs, like with KDE 4.2 (but yeah, I was somewhat annoyed at the fact that they wouldn’t even consider putting up an experimental package for KDevelop 4… though when I compiled it I kinda could see why they didn’t package it).
I think you’re aware that Thom used the word “Linux” here to mean “Linux distributions”. There’s no need to pretend he’s ignorant.
The problem is that there are so many Linux distributions, and each of them needs a slightly different kind of package. If a piece of software is released, it usually comes with a Windows installer, a Mac OS X disk image, and source. Occasionally it has a Debian package. That doesn’t help all the non-Debian-based distros, though.
Application installation on Linux distros is horribly broken. Package managers are great for system software, but horrible for applications.
The world is broken, then. FreeBSD binaries don’t install on Mac OS X, and .deb packages of Xfce don’t install on Windows. Vista software doesn’t install on Windows 3.1, etc. etc…
Just don’t expect all Linux distros to be one OS, and you should understand that it is not that broken actually – no more than the world. The guys at Gentoo don’t care what the guys at Mac OS X or at Slackware do.
Do you get it?
I’m afraid I don’t get it, no. We can’t expect Microsoft and Apple to cooperate and create a unified package format that will work for both systems, because it’s not in their interests.
However, the open source community has a *huge* interest in creating a packaging standard that will work across many systems. Consider Firefox extensions – one zipped file that may contain not just JavaScript, but compiled binary components. One extension file can contain the binary components for as many systems as you care to support. It wouldn’t be that hard to create a similar format for applications on Linux-based distros. This would obviously be of little use for system software, but for applications, it makes installation very easy.
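A hypothetical bundle along those lines might look like this – one archive, several architectures:

    superapp.bundle/
        manifest                  # name, version, declared dependencies
        shared/                   # scripts, icons, other platform-neutral resources
        linux-x86/superapp        # binary for 32-bit x86
        linux-x86_64/superapp     # binary for 64-bit x86

The installer would just pick the matching directory and ignore the rest.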
In my opinion, an application bundle system should be created that can work alongside a package manager. Both paradigms are required for a truly elegant experience.
When I first read the article’s title, “Blind or Deaf: Program Management on Modern Systems”, I expected to read something about how blind or deaf (???) users would install or upgrade their software. I expected something like “on Linux, you can use the command line tools provided by the distribution with a Braille readout; on ‘Windows’, you can’t do anything”. 🙂
Lol, well if you really want to know about that I could always write up an article about it.
Thom: While you bring up some interesting ideas, I have a few reservations about your plan. First, there are security issues that this type of system raises, like using the noexec flag on home partitions. Sometimes you don’t want the user to be able to execute stuff from their home directory. This is especially important as a root user when I am having a look at someone else’s home; I want to be sure I am not executing any malicious binaries from their home while browsing it.
Second is bloat. While hard drive sizes are growing, I still like being able to fit a system into 500MB.
Third is being able to update a single library and have the entire system be up to date.
Given the audience on this site, I may not get such a warm reception with this post, but I believe that surreptitiously a new model has appeared and been adopted with nobody noticing yet: the iPhone application store.
Consider:
1) Built in, standardised install and uninstall.
2) Built in notification and installation of updates.
3) One stop shopping for all available packages (not in the wish list, but an important factor).
I believe that this is important because I am already so comfortable with my iPhone’s built in notification of updates that I am impatient that my Mac does not feature the same seamless facility. I too am frustrated with the orphaned settings and preferences left behind when uninstalling, and the general inconsistency in installers.
So – why not the iPhone App Store as the new software distribution paradigm? I confidently predict that it will not be long before Apple starts a Mac Software store on iTunes (why on earth not?) and I think the market will push the others in the same direction.
The Apple store is basically a repository with money exchange, and Valve’s Steam a repository with DRM.
Someone could make apt support these features. If nobody has done so, it’s because not many people use Linux who are also willing to pay money for closed-source programs and install backdoors into their systems.
However, I could see people buying/downloading a Valvuntu with integrated Valve repositories if there actually were Linux games to buy.
i.e. everyone noticed, but nobody cares, because the Apple iStore is as old as shit as a concept.
Maybe not literally as old as the concept of “shit”, but yes, it’s the exact same principle as distros use for their repositories.
The difference, however, is that while there is one iPhone App Store, there are hundreds of distro repositories.
Contrary to what some seem to think, this cannot be fixed simply by choosing one package manager and forcing every distro to standardize on it. That would just make them seem compatible on the surface. The actual packages would still not work on systems other than the one they were built for. This has to do with naming conventions for the packages, different splitting of bigger software suites, the structure of the file system, boot scripts, decisions made by autoconf at compile time, etc.
The many shapes and forms GNU/Linux systems come in are their blessing and their curse.
Which is why we really need to either standardize the important parts of various Linux systems (not likely), or start thinking of them in terms of what they really are: separate operating systems with some similarities. They’re not distros, they’re operating systems compatible on a source level but not very compatible on the binary level.
True.
In many ways, this is a strength, not a shortcoming.
It means that attention is paid to make sure that the source code makes as few assumptions as possible at the binary level. Linux, for example, is readily available for big endian and little endian machines, and for 16-bit, 32-bit and 64-bit machines.
It means that there can be Gentoo-like distributions, which use source code as the distribution mechanism, resulting in machines with binary packages optimised for that machine, instead of one-size-fits-all binary packages that must assume a lowest common denominator (for example, most binary packages for Linux distributions, like those for Windows, have to assume a 32-bit i386 CPU).
Obligatory supporting link:
http://www.gentoo.org/doc/en/gcc-optimization.xml
It also means that a new architecture with a particular advantage (say, low power drain making it good for running on battery power) will have a ready-made desktop OS available if it wants a way to break in to a new market.
Obligatory supporting link:
http://www.slashgear.com/arm-netbook-new-deal-with-ubuntu-backers-1…
Things like this adaptability are made possible by paying attention to the source code, and having individual distributors worry about particular architectures.
It is both a major advantage and, it seems, an unliftable curse. I agree with all your points regarding source portability; however Linux, being POSIX-compliant like every other UNIX, inherits that in most cases without trying.
Where it becomes a curse is the way Linux is thought of versus what it really is. People see Linux, and believe it to be an operating system. Now, I know you’ve all heard this before, but Linux is no more an operating system than the Mach kernel is. Linux is the kernel, nothing more.
I’m not going to start in on the whole Linux vs GNU/Linux thing, that’s petty and pointless, but it raises a point all its own, and that is how the name “Linux” is seen by the average user. They associate it with the operating system rather than the kernel, and for good reason, as most users don’t know or care what the kernel is. This has led, however, to the misleading impression that all operating systems based on Linux, at least on the PC, are basically the same with, perhaps, a different theme and/or GUI. Consequently, users expect there to be one package that will work for all Linux-based OSes, and this simply isn’t possible, and likely will never happen. Different programmers have different views about how certain things should be done and, because of the system’s open source nature, they can and will do things the way they wish.
So, what is really needed, imho, is a change in the way most people think about Linux. The perspective must be changed so people don’t think “Linux,” but think Fedora, OpenSUSE, Debian, Ubuntu, or whatever. Most end users don’t think “Mach” when they think about OS X, or “ntoskrnl.exe” when they think about Windows.
Linux is a kernel that powers many operating systems. The focus, I think, needs to shift to emphasize this fact, and lessen the end-users’ confusion.
This doesn’t change some of the issues regarding package management, but it puts a slightly different perspective on it. Naturally, each OS may have different package management concepts and formats. The fact that Linux is the kernel of all these various operating systems doesn’t change the fact that they are different OSes, not a coherent single OS, in contrast to the image that has been built up around it.
There is also only one iPhone and one iPhone OS, which makes it possible to have just ONE repository for it that works. Vertical integration, it’s what Apple does well.
Maybe this has been said before, but it seems clear to me that there needs to be a distinction between *System Software* and *User Applications*.
Package managers work brilliantly for system software, but not for applications.
Bundles work brilliantly for applications, but not for system software.
Maybe we should try a split system!
Where would you put the point of difference? As far as I know, (some?) Linux package managers even treat the kernel as a package (that can be upgraded, for example), and I think I would consider this to be system software.
Furthermore, many of today’s user applications may have an impact on system software, or at least require certain functionality in it. To illustrate what I mean, I will mention “Flash”, the thing you use to make web pages unusable and annoy the users. 🙂 It seems to be quite complicated to get it working well on Linux, and it’s much worse on BSD. Would this be the case if it were a simple user application, such as an image viewer or sound player?
I may disagree. I’m using FreeBSD’s package management tools (pkg_*) as well as the additional software portupgrade (especially its tools portinstall, portupgrade and pkgdb) to keep my user applications up to date, without any problems. I don’t even need to compile stuff (my home machine is not the newest one, so I prefer precompiled packages). The system can be updated as well, either in binary form or from sources. If you keep the GENERIC kernel, you don’t need to compile anything here either.
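My routine is roughly this (the -PP flag tells portupgrade to use precompiled packages only, no compiling):

    portsnap fetch update    # refresh the ports tree
    pkgdb -F                 # fix any inconsistencies in the package database
    portupgrade -aPP         # upgrade all installed software from packages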
NB that FreeBSD does have a notion of system software – the kernel and the world – versus additional software, such as can be installed via the ports collection, from source or in binary form.
Exactly. If something that is considered harmless (such as upgrading Firefox, which in the worst case can lead to a defective Firefox – delete and reinstall it) can render your system’s innermost parts unusable…
But as always, keep in mind that computers aren’t easy. The person doing the necessary administration should know what to do, and if it’s an “average user” (those who use Linux are in fact at least advanced users already), he should first educate himself on what to do and how.
Then have a look at how the BSDs do it successfully.
Yes, the kernel is undoubtedly system software. Can’t get much more system than that! Definitely package manager material. Oh, and that’s a big difference between BSD and Linux of course — most distros upgrade the kernel just like any package.
Yes, this is an issue. That’s why I think the app bundle system needs to be aware of the package management system. It needs to be a layer above it.
Fair enough – us geeks can get by with that. I’m a big Arch Linux fan myself – I love the simplicity (the “Arch Way”). However, the simplicity seems to break down when it comes to graphical applications. Package managers are a great help when you’re already at a terminal, but we need something more from a graphical environment.
I was given a Mac for work a while back, so I’ve been using Mac OS X for a while now. It’s far from perfect, but it’s incredible how easy it is to install an application bundle. The problem with OS X is that not all software works as a bundle, and for those packages Apple has opted for an installer system that’s more broken than Windows’.
Yeah, that’s the problem – I’m sick of needing to apply an advanced understanding of computers when I need to install software. It should be such a simple task!
Ze article is perfectly written: the proof is in the understanding. I understood its language fully, though at the same time I totally failed to do so for most of its fine technical points.
Please do not forget that the main international language is no longer English but Globish, and refrain in the future from such unproductive and discriminatory comments. Here is a man who extends his hand to you, and you despise his efforts. Come on, shame on you.
Lastly, the author used a “translator” to express his ideas. The only thing he could probably have done better would have been to use Google for this. It’s such a wonderful tool. Don’t you think so?
…Not really.
I love PolishLinux, although I struggled through the article. I really did. It raises points about freedom, shared code… god, I did get bored.
The bottom line, fun or not, is that on a Microsoft system I run Windows Update, have several programs that use system resources to check for updates, some that check on startup for a new version, and some that require manual patches or updates. This also includes drivers for your devices. This is a major task, and I have never seen a computer in a home/work environment that is current – even my own. It’s simply too difficult to manage, and the TCO is horrendous; we are talking hours of work.
The negative is that if YOU decide a program is stable and you cannot wait, you have to do a little more work – which normally means pre-compiled binaries from a third party – or you could use a distribution with an alternative package manager; either way, there are ways around this. The reality is that most users do not upgrade to the latest products – look at Vista, Office 2007, etc., or even a product that is free thanks to monopolistic abuse, like IE7.
…but improvements can be made, especially for critical open-source showpiece applications like Firefox and OpenOffice; working with the developers to get these released in a timely fashion is critical, especially as these projects do have the source available to them.
The problem lies deeper in Linux. As long as there is no stable system-wide API/ABI (kernel, X.org, libc, etc.) and distros are free to ship any of the components, in any version, there will be no standard Linux binary package. And no package manager will solve this problem, because the problem is not the package format itself; it’s the system under it.
Until there is somehow a standard base which every distro maker has to use, you can forget about common packages between distros. I don’t think I’m being too cynical when I say there won’t ever be such a base, simply because Linux is used for too many purposes. While Haiku and Syllable have a clear focus on the desktop, and the developer crew develops every part of the system, that’s not the case with Linux, as we all know, so they are not good examples. They may have the most user-friendly package manager ever, but they are not general-purpose OSes. Now that I think about it, user-friendliness could mean different things to a desktop user and to a sysadmin.
One solution could be the LSB, of course, but I don’t know whether it’s mature enough, or usable for every piece of software.
Somebody above wrote about an interesting solution: a package manager for system packages and bundles for user apps. I guess this could be a solution if we had a big enough LSB that includes every core package.
Actually, the LSB is not mandatory, but the big ones all try to follow it more or less when they can. The RPM package manager is in the LSB, and it allows distributing RPMs like the Acrobat Reader one. If you want a program that installs on 95% of Linux desktops, make it LSB-compliant and distribute the RPM.
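And on the Debian side of the fence, such an RPM can usually be taken in with alien (the file name here is illustrative):

    sudo alien --install acroread-lsb.rpm    # convert the RPM to a .deb and install it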
Now, there are those distros that choose not to follow it, but they have their reasons. For instance, GoboLinux tries to do something different, and that is a Good Thing (TM). The standard is good for getting everybody on the same line, but if it does not evolve, we are stuck. If nobody explores new paths, the standard won’t evolve.
Actually, there are several standards. One could say that the Windows MSI is the standard, because 95% of the machines in the world use it, and that is not far from the truth. This is why there are projects like Wine that try to implement this standard on POSIX-compliant systems. This is also good as far as I’m concerned. You can call the world a big mess or you can call it a rich and diverse place, it is up to you, but all those who have tried to order it the way they like have failed so far.
Anyway, I’m all with you on this. Distros that can follow the LSB should do so when it does not negatively affect their goals. This is one of the key features I expect from a big distro.
Guys, guys, you apparently didn’t understand any of this. I see lots of deaf men in here trying to convince blind men that being deaf is less awful than being blind.
And with that, program management will remain in the dark ages. Too bad.
What, eh!? Clearly you need to understand irony! You’re trying to convince others of your perspective by calling them all blind.
Package management of open-source programs is excellent, and I look forward to it improving; from a maintenance perspective it’s better than the alternatives. Let’s look at how it can be improved.
I find it strange how we keep talking about applications… but in the end we still treat everything as files.
Below is not a ‘solution’ but a general idea of how a solution would look.
Mac is the one that really got this part right with its app bundles. It’s not perfect, but at least it gets the concept.
Without special rights, an application should only be able to install to one directory. For example, let’s say you download FooApp. By default, if you click the bundle to ‘install’ it, the install is handled by the OS, and FooApp only has access to ‘Application/FooApp’ and ‘Settings/FooApp’.
*I use the generic term Application. In Windows it would refer to Program Files.
A similar concept could be used for shared libraries. Let’s say FooApp needs dep1. It would only ‘install’ to ‘Libs/dep1’.
—-
In terms of handling versions… I would say we need to make sure the system does most of it automatically, but still allows an admin to easily resolve issues. Under each App/Lib you can have different versions. To make it easy to resolve problems, there should be some place to affect the loader.
For example, let’s say BarApp also needs dep1, but it needs a newer version (1.1) which somehow broke FooApp, while FooApp worked fine with dep1 (1.0). You should be able to change a setting to tell the loader to keep using dep1 1.0 with FooApp. Perhaps some kind of optional manifest file in the Application dir, where you can specify library overrides. Maybe per user?
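Something like this, say – the format is entirely made up, just to show the idea:

    # Application/FooApp/manifest
    [libraries]
    dep1 = 1.0    # pin FooApp to Libs/dep1/1.0; the system default is now 1.1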
Updating… well, this is more an issue of coordination than a technical problem. They all pretty much operate the same way; it’s just a matter of who has control and how/when things are approved. I don’t hold out a lot of hope for a centralized body being able to handle everything (sorry, Linux distros). It can be an option developers can leverage if they want, but I think there will always be a way to install apps outside the central way, and as such there will always be other ways to update them.
Your example with the two different dependency library versions reminds me of PC-BSD and their PBI installers. You can download a PBI from the web (or use the much more convenient command line tool to do it for you) and then install it. Although the installation is done to the system (and not locally for the user), all the dependencies needed are inside the created structure. According to your example, FooApp would bring dep1 1.0, and BarApp would bring dep1 1.1, each within its own subtree without any interference with other applications.