The author was rather vague with his solution, though.
Personally, I think that the answer is BOTH forms of software distribution. The central repository style holds all the core software for the distro, plus any other important software that people decide to put in the work to customize and add. The rest is installed through a universal packaging system such as Autopackage.
For instance, if you want to upgrade GNOME, you should do so through the repository. Inkscape, on the other hand, is a simple application and is not depended on by anything in your standard distro, so it is more correctly offered as a downloadable package by its devs.
Of course Autopackage itself, or whatever is used, is included in the central repository and is customized by each distro to use its repository for dependency resolution.
Put it all together and you’ve got a pretty good solution, if you ask me. Benefits of both repositories and universal packaging, and good dependency resolution all around.
When a package is offered for download, in my opinion a version with all non-LSB dependencies statically linked should also be available, specifically for use by those who will not have a good internet connection at installation time for any dependencies.
…since I switched to a certain source-based distro (name withheld because just mentioning it gets people mod-happy). The only downside of building from source is that it takes a little longer. Otherwise, the benefits outweigh the time factor.
@Devon
…in my opinion a version with all non-LSB dependencies statically linked should also be available…
Sounds like a great idea, until you realize there are very large barriers in the way:
1) The various licenses used by those various dependencies (even just one) often mean you can’t statically link
2) A static library often has dynamic dependencies (see Allegro for an example)
3) You can’t statically link to libraries that dynamically load other libraries and expect your software to work on any distribution other than your own (and sometimes not even then)
4) Static versions of libraries sometimes aren’t available at all
5) Statically linking all those extra dependencies bloats your executable by several megabytes, causing a large number of complaints from modem users (especially if the program is updated frequently); see the sketch below
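A rough way to see point 5 (and how points 1–4 bite) from the shell. Treat this as a sketch: the sizes you get depend entirely on your toolchain and which static libs exist.

# dynamic link (the default): small binary, deps resolved at run time
gcc -o hello hello.c $(pkg-config --cflags --libs gtk+-2.0)
# fully static link: every dependency gets copied into the binary...
gcc -static -o hello-static hello.c $(pkg-config --cflags --static --libs gtk+-2.0)
# ...if it links at all; missing static libs (point 4) or dlopen()ed
# modules (point 3) frequently break this step
ls -lh hello hello-static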
@Beavis
…since I switched to a certain source-based distro (name withheld because just mentioning it gets people mod-happy). The only downside of building from source is that it takes a little longer.
Except that this only works for packages that are maintained by your distribution (through a central repository), and only for free software; it’s not a viable solution for commercial applications.
I think you can leave out more dependencies than just the LSB libs; it would not make sense to have the full GNOME or KDE libs in your own installer (note that neither of the two toolkits is in the LSB, but they are very likely installed on user systems).
I am a Debian user and am quite happy with the way our packagers create the dependency tree, but IMHO the version info is too strict.
Last week I upgraded from KDE 3.1.4 to KDE 3.2.3 and apt-get removed all KDE 3.1 applications.
I think this is unnecessary; KDE libs should be binary compatible within the same major release, and all applications I compiled myself continued to work with the new libs.
Of course, if it is not possible to offer a static package for some reason then that option simply would not be there. After all, I only proposed that statically linked packages be made available for the rare situations in which they might be useful, such as a very slow or non-existent internet connection on the system where the software is to be installed.
The vast majority would download the “standard” dynamic package and let Autopackage, or whatever universal packaging system, install any dependencies through their distro’s central repository before installing the package.
— “I think you can leave out more dependencies than just the LSB libs; it would not make sense to have the full GNOME or KDE libs in your own installer (note that neither of the two toolkits is in the LSB, but they are very likely installed on user systems)”
Of course you are correct. Hey, I’m no expert here, just a Linux user with an idea.
You guys get the basic idea though, right?
The certain unnamed distro has commercial apps in its repository (which require serial #’s). Otherwise you install 3rd party binaries just as you would in Windows.
I really don’t see an issue here. The problems with Linux packaging really lie in the fact that most distros are binary. Binary distribution really doesn’t make sense in an operating system whose design stems from an entirely open source base. If software is updated, it and its reverse dependencies must be updated as well. It is a fact of life with such an open and flexible platform. That’s the way the system works. Distros need to look into using Gentoo’s concept of slots, where multiple versions of the same libraries and apps can be installed simultaneously to eliminate broken dependencies after upgrades.
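To make the slot idea concrete, here is roughly how an ebuild declares one (libfoo is a made-up package; the SLOT mechanism itself is real Portage):

# from a hypothetical libfoo-1.2.3.ebuild
SLOT="1.2"
# a later libfoo-2.0.ebuild would declare SLOT="2"; Portage then keeps
# both library versions merged side by side, so packages built against
# the old version keep working after an upgrade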
If you have a problem with the architectural deficits of a binary Linux distro, move to a source-based distro.
If you want to live on the bleeding edge, get off your arse and make it work instead of complaining about it. Otherwise, use the repository that your distro maintains.
If your package manager makes updates difficult and breaks your system, find a new distro.
People don’t get it. LINUX IS NOT A WINDOWS CLONE. Do it the right way, not the Windows way, and you will have few problems, if any.
Just my $0.02.
> You guys get the basic idea though, right?
Absolutely!
Another idea would be to have the dependencies included.
I mean some kind of archive package that contains the program package and packages of the dependencies.
The installing package manager could then use those if some dependency is not installed yet.
This would even allow including dependencies like the KDE libs without getting them installed multiple times or having them linked into each application.
They exist and they work; quit whining. We’ve had these packaging rants ad nauseam. Packaging was solved long ago. Just because it works so perfectly that it gives Windows users a heart attack does not mean that it’s the wrong way to do it.
I think uniformity of packaging is the “next big thing” for Linux distros. It seems to me that when the community starts a-screamin’ someone usually kicks things into higher gear.
At least, this seems true for some past cases: free Qt, better fonts/anti-aliasing, true transparency in X, faster X development, and more consensus between KDE/GNOME (freedesktop.org).
— “Another idea would be to have the dependencies included.
I mean some kind of archive package that contains the program package and packages of the dependencies.”
I thought of that, but the problem is that you’d have to make sure that all those dependencies you packaged installed and worked properly with every single distro out there, and with their repositories, without breaking existing packages, and without resulting in multiple installs that waste resources. Better to let the installation of libraries and other dependencies be handled by those who know each distro best: the repository maintainers.
That’s also why any universal package manager, such as Autopackage, should be customized by each distro to resolve dependencies with its repository, or in whatever way is appropriate for that distro. There are vastly differing philosophies guiding many of the popular distros, and the best way to deal with them all is to simply let them deal with themselves. A sketch of what that customization could look like is below.
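As a sketch only (the hook name and the mapping are hypothetical, not any real Autopackage API), “customized by each distro” could be as simple as a shell hook the distro ships alongside the universal installer:

# hypothetical per-distro dependency hook for a universal installer
resolve_dependency() {
  case "$DISTRO" in
    debian) apt-get install -y "$1" ;;  # pull the dep from Debian's repository
    gentoo) emerge "$1" ;;              # build it via Portage
    *)      echo "Please install $1 yourself." ; return 1 ;;
  esac
}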
The problem is that some distributions have horrible packagers/package managers, or none at all! It is usually users of these distributions that yell “dependency hell!”… “Linux is not ready”… “static linking solves it all”… “do it the Microsoft way”… “do it the Macintosh way”… etc.
Users should not have to go to an application’s website to install the application. It is a backward and broken concept that has thrived because Microsoft Windows dictates our computer usage standards.
Each distribution should have a repository that allows its users to install software easily and correctly (e.g. Gentoo’s Portage and Debian’s APT).
When I tell my friends that installing software is a lot easier on Linux than on Windows they ask me how. I give them this example:
A user wants to install CD burning software on Windows (Nero); these are the steps he might have to go through.
1). Launch your browser
2). Point browser to Nero’s website
3). Search the website for where the download link is, after wading through ads.
4). You might need to register to download some apps.
5). Nero provides links to various versions of their software, look for the one that suits your needs.
6). Download Nero to your system.
7). Click on the installer.
8). Click to agree to the license you didn’t read.
9). Identify the components you need on your system.
10). Click on them and click on next.
11). Identify where the components are going to be installed, if you have a weird setup (many users do).
12). Register the product or click on next.
13). Advertisements; click on next.
14). Click on finish.
Contrast that with a Linux user who needs to install a similar app (K3b). His/her procedure:
1). Launch your terminal.
2). Enter distro’s installation command (“install K3b”)
3). Close terminal.
or
1). Launch gui installer
2). Search for K3B
3). Click on it to install
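For instance, step 2 of the terminal route might literally be one of these, assuming the package is named k3b in your distro’s repository:

apt-get install k3b   # Debian and friends
emerge k3b            # Gentoo
urpmi k3b             # Mandrake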
I don’t know any other operating system that makes installing software easier. I agree with the article. Let’s focus on upstream vs. downstream dev communication. Let’s advertise the benefits of intelligent package managers. Let’s work on making package managers more robust.
I still get surprised when people today claim they experience dependency hell. The days I experienced dependency hell were the days when I had to go hunting for packages all over the internet to install them manually (i.e. years ago). Gone are those days.
I still believe Linux has the most sophisticated method of installing software. It’s just that, as the article clearly demonstrates, new users to Linux are stuck with the bad habits they inherited from other operating systems. They are the ones who propose static linking as a solution to the installation flaws of their distro.
The best package managers out there are Debian’s APT and Gentoo’s Portage. Gentoo’s Portage is particularly impressive because Gentoo is a source-based distro, and installation of packages on source-based systems can be challenging. These two also have the largest software repositories I know of in Linux. Why… oh… why aren’t their package systems Linux standards!?
After all, these two package managers have already solved 90% of software installation issues, and most importantly they have the largest repositories.
Warning: grammatical errors abound, I wrote this in haste. My apologies.
So basically he wants us to depend on a repo system maintained by the distro maker?
1.) OK, so what happens if the maintainer of a certain package takes a vacation? Community-based distros (Debian, Gentoo, etc.) can’t afford to pay people to maintain packages.
2.) Er, portability? What if I want to install s/w on a comp that doesn’t have an internet connection?
3.) Congestion. How much bandwidth/storage are we talking about to keep so many packages (14,000+)?
4.) Repackaging. So let’s see… Fedora, Debian, Gentoo, MDK, etc.
LOL, that’s 5 to 6 distros and 14,000 packages each.
~84,000 packages!!! Bandwidth isn’t cheap!
Conclusion… the guy’s a retard; full steam ahead with distro-neutral packages (Autopackage).
“Some teams get this almost right. The Debian GNOME packagers are certainly a decent example of upstream/downstream cooperation. Bugs get reported upstream, and upstream developers are involved in the packaging process downstream. The result, GNOME 2.6 makes it into Debian about a month after release.”
I tell you what: the creator of Knoppix gave up on GNOME due to its poor support on Debian. Don’t trust me? Check out the Debian mailing list.
About packages: all the dependency resolution, different package formats, and centralization aren’t easy to overcome to make software installation one click away.
And when he talks about Linux distros, he usually talks about a couple of them, but he needs to understand that there are hundreds, and people aren’t going to settle on his two, no matter what.
I thought we already assumed either compatible packages or one master package for each distribution, otherwise even the smallest external dependency wouldn’t work.
Actually, those are all implementation details. I just wanted to point to another solution for having all dependencies in one downloadable file without requiring bad things like static linking.
“1.) OK, so what happens if the maintainer of a certain package takes a vacation? Community-based distros (Debian, Gentoo, etc.) can’t afford to pay people to maintain packages.”
Why would they need to pay someone? And unless he took a ridiculously long vacation without telling anyone, I can’t imagine it being even slightly disruptive.
2.) Er, portability? What if I want to install s/w on a comp that doesn’t have an internet connection?
That would be a fairly uncommon situation. I guess you’d have to fall back on the old “hunt down all the dependencies” method then. This is package management, not magic.
3.) Congestion. How much bandwidth/storage are we talking about to keep so many packages (14,000+)?
Who’s keeping packages? Debian may do it that way, but Gentoo doesn’t. Portage is simply a repository of scripts. The actual source archives are downloaded from the same places everyone else downloads them from. APT4RPM and similar RPM repositories are alike in that they all usually download from the same mirrors. Why would everyone have to do it the Debian way?
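For example, a Portage ebuild only records where to fetch the source; nothing but the small recipe lives in the tree (this is real ebuild syntax, using Portage’s standard variables):

# the tree carries only this recipe; the tarball comes from upstream mirrors
SRC_URI="mirror://sourceforge/${PN}/${P}.tar.gz"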
4.) Repackaging. So let’s see… Fedora, Debian, Gentoo, MDK, etc.
If the repository maintainers are willing to do it, who are you to tell them it’s too much work? It’s their choice where to spend their time.
1). Packages usually have more than one maintainer, yes, even in community-based distros like Gentoo or Debian.
2). Copy the binary from one computer to the other.
3). What congestion? Packages are stored in the distribution’s repository.
4). You lost me.
Autopackage has its issues.
1). Today, it’s unusable.
2). Today, it doesn’t support many packages.
3). Today, it can’t build from source.
4). Today, I can’t install different versions of different packages/libs on it.
5). It can resolve dependencies as well as other packagers.
6). It will still need maintainers who can go on vacation.
7). It doesn’t solve the bandwidth crisis.
So, since Autopackage today is severely handicapped, what are we retards to do?
“I just wanted to point to another solution for having all dependencies in one downloadable file without requiring bad things like static linking.”
Autopackage has support for this (sealed installers). It would work just like the old Loki installers.
What developers should do is provide their app’s source code (where applicable) and a package with all the libs, like the Mac OS X .app, which is just a folder with all the needed executables inside. If you want a “clean” installation you can always build your distro’s package from the sources.
A lot of non-open-source people seem to be doing this already.
I think this is much better than the Autopackage solution. Autopackage just seems to add another distributed packaging system on top of the one that the installed distro already uses.
“I thought we already assumed either compatible packages or one master package for each distribution, otherwise even the smallest external dependency wouldn’t work.”
What makes you think that? Obviously there would have to be some working together to make sure it all worked. As you said, it’s all really just implementation details.
Keep in mind that my goal was not to make everything work in perfect harmony; it was to find the best and most implementable solution that still allowed distros to keep their differing conventions and philosophies.
Why would they need to pay someone?
You expect people to work (for free) *FULL* time maintaining packages?
And unless he took a ridiculously long vacation without telling anyone, I can’t imagine it being even slightly disruptive.
“Ridiculously long”? What do you consider long? If a package is more than 1 week old, I’m pissed and I just think: hey, if I was on Windows I wouldn’t have to wait.
I guess you’d have to fall back on the old “hunt down all the dependencies” method then. This is package management, not magic.
This is magic? Well, I guess the guys at Autopackage must be magicians. Autopackage supports including ALL (non-LSB) dependencies in a single package!
Portage is simply a repository of scripts. The actual source archives are downloaded from the same places everyone else downloads them from.
You got me there. I was talking about Debian.
If the repository maintainers are willing to do it, who are you to tell them it’s too much work? It’s their choice where to spend their time.
They can do whatever they want. I just think it’s ridiculous to have 84,000+ packages just to support 5-6 distros!
Packages usually have more than one maintainer, yes, even in community-based distros like Gentoo or Debian.
Yes, that’s the point I was making. Debian has social contracts and has individuals that maintain specific packages, right? What if one of those maintainers goes on vacation, is sick, etc. and a major release of a package he or she is in charge of is made?
Copy the binary from one computer to the other.
If you can help me get a Debian package to work perfectly on Vector with no dependency problems, I’m listening.
What congestion? Packages are stored in the distribution’s repository.
What?!? They get unlimited bandwidth for free? 😮
You lost me.
OK
- Lots of redundancy: repackaging the same s/w many times over.
- Waste of bandwidth hosting all these different packages.
Autopackage has its issues.
Of course it does; it’s not even API stable! I’m just saying that the folks at Autopackage are on the right track.
What developers should do is provide their app’s source code (where applicable) and a package with all the libs, like the Mac OS X .app, which is just a folder with all the needed executables inside. If you want a “clean” installation you can always build your distro’s package from the sources.
One huge problem with that: those precompiled libraries often DO NOT work correctly across distributions, due to filesystem layout differences, kernel differences, and base system library differences (gcc, etc.). So this isn’t a silver bullet solution either.
-: If a maintainer is on vacation in a community-based distro, another maintainer takes over. Or a user interested in the package steps up and provides the necessary script, which is then submitted to Bugzilla, forums or other community resources. This is the least of a packaging system’s problems.
-: You can get many statically linked packages to work across many distros. I can’t, however, guarantee they will work. Some distros provide facilities to generate statically linked applications. You should acknowledge that, technically speaking, Debian and Vector are two different operating systems; just because they both use the Linux kernel doesn’t mean packages designed to work on one will or should work on the other. Such an expectation is unreasonable, as are many other expectations for Linux.
-: A distribution’s repository is mirrored on servers all over the internet. In a community-based distro like Gentoo, the bandwidth is sponsored by commercial interests, users, and well-wishers. I still don’t see how Autopackage solves the bandwidth problem.
-: The distribution’s maintainer is responsible for packaging software for his/her distribution. I don’t see the redundancy here. All the application developer needs to do is provide the source. Distributions usually have scripts that convert the source into a form compliant with their needs. You keep bringing up this bandwidth issue, but I don’t see how Autopackage saves me, application developers, or distributions any bandwidth.
-: They might be on the right track, but the track is unproven. Portage and APT, today, have proven that software installation can be easy, reliable and consistent, while Autopackage is still a work in progress that might never be fruitful. I have used Portage to build a whole Linux system from scratch (every single package). When Autopackage can do that, it will begin to tickle my fancy. Until then, it is unusable, period.
I am just not a big fan of unproven technologies. I do wish Autopackage developers the best of luck. Let’s work on improving what we have right now. If, eventually, Autopackage solves all our installation woes, I’m sure we will be quick to dump our current solutions, which work well at the moment.
I find it laughable that there are still people who complain about software installation on Linux – it’s easier than on any other OS I’ve tried. As for packages not being updated immediately once a new version is released, that’s a big inaccuracy – Arch releases new packages the day the program is released (in most instances – bigger packages like X.org or Firefox take a day or two so they can be tested).
You expect people to work (for free) *FULL* time maintaining packages?
Nope. Why would they have to work full time though? They can if they want to I suppose, but I believe the majority of them do not.
“Ridiculously long”? What do you consider long? If a package is more than 1 week old, I’m pissed and I just think: hey, if I was on Windows I wouldn’t have to wait.
Perhaps a few months? Hmmm, yeah, I’d say that’s unusually long. Besides, the “not telling anyone” part is the important part. If the other maintainers know he’s gone, they can see to it that things get taken care of in the meantime. Besides, new releases are hardly surprises. Why would he wait until an impending release to take his vacation?
This is magic? Well, I guess the guys at Autopackage must be magicians. Autopackage supports including ALL (non-LSB) dependencies in a single package!
Well, if you found a way to download dependencies without an internet connection, that would be magic.
Of course, including them with the software package could work too, and when Autopackage is ready it will be another possible way to solve that problem. Still, this is an uncommon situation, and there is no solution that is best for EVERYONE.
You got me there. I was talking about Debian.
But… you can’t say that the APT repository won’t work, ’cause it’s been working fine for years!
I thought you were referring to the idea that ALL distros have repositories, and that this would result in internet congestion (storage was never an issue; it’s too cheap to matter).
They can do whatever they want. I just think it’s ridiculous to have 84,000+ packages just to support 5-6 distros!
So you think that Linux needs less software?? Personally, I would want as many packages as possible to be available for my distro.
Aug 17: Developers of “foo” complete work on foo-3.75
Aug 17: “Foo” developers work with Debian, Fedora, SuSE, Gentoo to create .deb, .rpm, ~x86 packages which are fully tested with “foo”’s co-dependent packages.
Aug 20: Basic requirements found – changes required in “bar”, “baz”, “giz” projects.
Aug 21: “bar”, “baz”, “giz” developers notified.
Aug 25: “bar” developers reply “impossible”; “baz” developers submit patches; “giz” developers reply “technically unlikely to work; fix goes against the giz roadmap”.
Sep 01: Distributors start work patching “bar”, “baz”, “giz” code.
Sep 15: half-broken “bar”, “baz”, “giz” patches available
Sep 27: Debian, Fedora, SuSE, Gentoo packages available for “foo”. One month after the “foo” source was available. “bar”, “baz”, “giz” packages variously broken.
Sep 27: Users of Slackware, Yoper, JDS, [insert-fave-distro-here] complain “why isn’t “foo” available for me?” (answer 1: your distro isn’t part of the cabal). (answer 2: It’s been available for over a month, as source)
Sep 28: Developers of packages “bar” release new source (based on their own tree, not the developer-cabal tree). This breaks “foo”.
Sep 29: Developers of “baz” release new source (as Sep 28, but more so).
Sep 30: Developers of “giz” do the same (as Sep 29, but even worse).
Nov 01: Users complain “What is this crazy Linux packaging system?!”
(this post doesn’t necessarily represent the position of the posting IP address)
I run Gentoo, and Gentoo has these problems too; I’m still waiting for a mysql-4.1 ebuild… (QA doesn’t allow it in the tree)
Thought one:
1. The shared source tree is never released as a package.
2. Distro CVS works as an overlay to the shared source tree.
Thought two:
Recognize that computerized dependency resolution can be a distributor/OS-agnostic web service. Design an RDF/OWL language to describe deps and other metadata about source trees. Use URIs as the package/feature namespace.
Thought three:
The distro package management system should just as easily install the latest CVS, either anonymous or read-write, as it installs prepared packages. Installation from CVS code should go through the distro package management.
Thought four:
Gigantic P2P based CVS system…
Thought five:
install(1) could be extended to do more, registering md5, mtime and such in a db for uninstallation/updating.
I don’t know about other distros, but some of the features of Portage could be incorporated into install(1). A crude sketch of the record-keeping idea is below.
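Using today’s tools, that might look something like this (the manifest path is made up for the example):

# install the file, then record enough metadata to verify or remove it later
install -m 755 foo /usr/local/bin/foo
md5sum /usr/local/bin/foo >> /var/lib/pkgdb/manifest
stat -c '%Y %n' /usr/local/bin/foo >> /var/lib/pkgdb/manifest   # record mtime too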
I make a link /usr/bin/mozilla-firefox to it, and boom, I have it updated. I know that’s a bit complex for most users. But Mozilla is a REALLY EASY package to install; that’s why you don’t need an rpm for it.
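Something like this, assuming the tarball was unpacked to /opt/firefox (the path is whatever you chose):

# new tarball unpacked over /opt/firefox; the link keeps pointing at it
ln -sf /opt/firefox/firefox /usr/bin/mozilla-firefox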
And most packages that don’t come with your dist are small enough to build within 10 minutes. I think Autopackage’s source builder is exactly what we need. Basically just a check to make sure you can install by “./configure && make && make install && make clean”. Oh, and deps checking as well (which can be done universally using each package’s configure script, always in root’s path).
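For a well-behaved tarball, that whole “build from source” path is just the classic autotools dance (foo-1.0 is a stand-in name):

tar xzf foo-1.0.tar.gz && cd foo-1.0
./configure            # missing deps are detected and reported here
make                   # build
make install           # as root
make clean             # tidy up the build tree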
Maybe users should learn to live with a bit more difficult installation? I mean, making installs harder also makes adware installs harder…
The only install I’ve done that I don’t think the average person could figure out involved fixing some errors in a guy’s code. But it was a cheesy panel app, so I don’t think anybody’d cry about not having it.
– When Autopackage can do that, it will begin to tickle my fancy. Until then, it is unusable, period.
It’s unusable for you. I installed the new version of the GIMP from their site without any trouble, and I liked the idea of Autopackage. Why are you trying to force your way of installing software on everyone? You use a package manager like APT, and that’s a really fine solution for you, but I like having a choice here.
Package managers like APT or Portage are not going anywhere; they will still be one of the strong points of distros like Gentoo and Debian, but it’s really cool for some people to also have the choice of packages with installers. I really don’t see anything wrong with that. If autopackages are not your choice of software installation because they remind you of the Windows way of doing things, then you should rather use APT and the central Debian package repository, or better yet, use tarballs and compile it yourself. Everybody happy.
Let the users choose what they want; they’re not geeks or developers.
I wish Mike Hearn and the autopackage project the best of luck.
<< I still get surprised when people today claim they experience dependency hell.
>> It still exists for the people who package the stuff up for the users, but they minimize the problem. The problem people would rather talk about, and what Autopackage is meant for, is the inconsistency and differences between distributions. Remember that every package of every distribution needs to be maintained, and that costs time. When you add it up, a lot of time, and I agree that it would be awesome if such double (more than double) work could be minimized.
For users who want software the way they want it, when their distributor doesn’t provide it:
* Bleeding edge.
* Newest stables.
* 3rd party patches.
Source distributions like Gentoo are a good way to solve this problem but it comes with other implications one might not prefer.
For example, the author says:
<< The way it works now: Mozilla releases Mozilla Super Edition. Users weep with joy, Debian and Fedora users check apt and yum, but no software is available. Stuck with outdated software and the overriding human nature to get new stuff to play with, users flood the bug tracking systems (if you’re lucky)
>> I totally disagree; I wouldn’t want to run a new Mozilla version right after it was released. And I would not want to force others to do that either, unless they explicitly understand and agree with the risks.
Why I disagree: I do not *want* bleeding edge. My current version works *fine*. If I want a newer version, I compile it from source. But for people who want to get work done, it’s less about “want” and more about “need”. I prefer the people who want bleeding edge to test the software, so that I know for sure the new version works without any difficulties, and if there are any, I know what is going to happen. As I said, I wouldn’t want to force other people (for whom I decide) to use something new while they’re not aware of it and haven’t asked for it. You’d rather want to inform your user base about what’s going on, and why.
Neither Debian Sid nor Gentoo allows me this. Woody does, and Sarge is a middle ground. My point is: I don’t want Sarge (Testing) to become like the author described. That counts even more for Woody (Stable).
<< How it should work: Mozilla releases Mozilla Super Edition. Users weep with joy, Debian and Fedora users check apt and yum, and lo and behold, the software is updated to the latest version. Users spend the evening rejoicing how awesome Mozilla is, instead of yelling at their maintainers.
>> Okay, well, that’s interesting for some home users (geeks?), but not when time means money. A worker shouldn’t care about what features the browser has; a worker should care about what he can do with the browser. The boss cares more about what is done with the browser than about what the browser is able to do. He or she might even want the browser to be able to do exactly what it should, so that the right things are done with it.
I think it is more common that when a new version of Mozilla is installed, the user shrugs and thinks “whatever”, than the reaction described here. I think that’s more common at the commercial level, but it isn’t uncommon in home user environments either. I mean, does your mother really care whether it’s Windows 98 or Windows 98SE she’s got? Only if either does enough or not enough of what she wants, and such a decision is up to her.
<< Distributors are just as bad as we are of course. Debian’s release process is so broken that we have to rely on the concept of the “backport” as an excuse to explain a 2 year release cycle.
>> It ain’t broken, it is different from what you want. If you have a problem with it, you have other choices, but I think many people don’t have a problem with it.
(Maybe the distribution would have a better release cycle if people cooperated more with it instead of creating a similar alternative.)
To sum up: I think that, in general, people who are not happy with the packages they have in their distribution are mostly using the wrong distribution, or even the wrong OS.
Now back to my subject, which is “Misses the point”. The author misses the point in various ways because he misses some steps in his research:
* I am seriously wondering whether there was contact with Fedora or Debian developers before he offered his opinion that their release schedules and packaging manners are “broken”. Maybe the current way it works in Fedora and Debian is how they want it to be?
* Is the author really unable to understand that “how it is now” has reasons? Why isn’t that analyzed?
* Why does the author think that his solution is able to fix a problem for distributions, while some of us (I, at least) don’t see the problem he sees as a problem?
* Why is the diversity between the versions of apps in distributions a problem? Is it? What are the positive and negative implications of this?
Those are all points I’d like to have seen addressed.
<< Contrast that with a Linux user who needs to install a similar app (K3b). His/her procedure:
>> Nice post, and a fine point.
[The following is more something for a usability thread, but still]
Except that users don’t have a clue what “k3b” is. When one wants to do something, it isn’t a given that they know which application to use. This is true on Windows as well. I’d like a local Freshmeat-like, database-driven application which guides a user to the application he/she wants. Such a thing could be achieved with something like GNOME Storage; Freshmeat is pretty good at this.
For example, say you want a CD writing application. You go to multimedia, then cd-writing. You see a number of applications installed on your OS. You click on one, and there you see what the current version is. You see the difference in features between that version and yours, but you also see the features of that application versus another one. You also see applications which are available in the repository but not installed. After you’ve gathered such information, you’re able to decide which one you want to use. But what if you don’t know that cd-writing is part of the parent group multimedia? That’s where a search comes in. Heck, the features the apps provide could even be linked to a tech database à la Wikipedia. IMO an interactive solution like this is a good step in helping a new (or even old) user get something done when they’re not sure how.
Utopian ideas like this are doomed to fail. If only we could get every application and distro working together, then all package management would work as it should. It’s kind of ironic this is found on the same day as the “Linux != communism” article. (Not saying it is… but referring to the utopian ideals mentioned in the article.)
I’d love to see this magical kind of coordination happening. Poor Joe Coder could not release his piece of software to users unless it got approved by their distro gods. It’s a recipe for failure. There are just times you can’t control everything in the world.
There’s a reason why the Windows model works fairly well: it’s practical. Applications can count on binary compatibility for a few system DLLs. The rest they package with their app. You want application foo? You go to http://www.foo.com. Download and install. MS has adopted the centralized approach for drivers to an extent, because that’s a reasonable subset of the possibilities.
The Windows model is not perfect. There used to be issues, but I haven’t had one since the Win95 years.
However, I’ve mentioned this before: if distros really want to use this utopian model, they could do it, by choosing for users. Only 1 windowing system, 1 text editor, 1 browser… that’s all distro X supports. That would be manageable with this model. It certainly would NOT be a distro everyone (home user, power user, corporation, server) could use, but each of those could have its own distro/sub-distro.
1). Today, it’s unusable.
This has not stopped over 3,000 people downloading and installing the Inkscape packages using it, apparently without much issue.
2). Today, it doesn’t support many packages.
Correct, but API stability (a prerequisite for lots of developers using it) is our number 1 priority alongside stability. We’ll get there soon enough.
3). Today, it can’t build from source.
So what? Some people seem to be under the mistaken impression that because Linux is open source, that makes source installs more “pure” or something. This is just funny. There should never be a need to compile from source. Often people think there is a need, when really there isn’t. Give me credible arguments for why source installs are “better” and maybe I’ll listen, but I’ve yet to see any.
Be warned: binary portability issues you may have had are not an argument here. We have already researched and fixed a great number of these.
And sorry but Windows/MacOS have install times of a few seconds for most programs – having that suddenly bumped to 30 minutes is a serious downgrade.
4). Today, I can’t install different versions of different packages/libs on it.
This seems wrong. Go try installing the GTKmm 2.2 and 2.4 packages at once: GTKmm 2.2/2.4 are parallel installable, and so are the autopackages. Or at least they should be; if they aren’t, then I messed up the specfile (likely, as I was in a hurry).
There is certainly no theoretical reason why this can’t work.
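A quick way to check the parallel install from a shell, assuming the usual pkg-config names for the two API series:

pkg-config --modversion gtkmm-2.0   # the 2.0/2.2 API series
pkg-config --modversion gtkmm-2.4   # the 2.4 API series, installed alongside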
5). It can resolve dependencies as well as other packagers.
I assume you meant “can’t”. This depends on your perspective. autopackage will pass a dependency check if the software is installed, period – unlike RPM, apt and so on, which will ignore source installs or copied files. The flip side is that nobody has written the code to integrate with native package managers yet, so most dependency checks won’t be able to resolve a dependency if it’s missing. You do get a straightforward message with a tip on how to get it, though.
Fixing this is just a matter of interest/manpower.
6). It will still need maintainers who can go on vacation.
No, the whole point is that the upstream maintainers themselves build these packages. So when that is the case, by definition they cannot get out of sync, as the same person releases them alongside the source tarballs.
7). It doesn’t solve the bandwidth crisis.
It decentralises packaging, which means you no longer need huge mirror servers for all the packages: popular packages get mirrored using individual facilities in the way it was always done, and less popular software doesn’t take up space on the mirrors at all. This isn’t exactly rocket science.
Maybe I missed something, but he offers no solution, although he eloquently states the problem. A smarter installer that takes into account any differences between distros, “one ring to bind them all”, is one. How about the other side of the coin,
Zero-Install? Or better yet, a Zero-installer that takes into account different distros and links appropriately in the zero-installed pkg directory?
1). Congratulations! However, I don’t see anything spectacular about that. I don’t consider Inkscape to be a particularly challenging package to install. When Autopackage can correctly install/upgrade glibc on my system without borking it, you have my ear. Until then, it’s not feasible. Installing software is not all there is to package management. Removing and upgrading software intelligently is where the challenge is at. Last I checked, Autopackage couldn’t do those successfully, and this is where true packagers excel.
3). All the binary distros I have used suck! All of them. All of them have been unstable. All of them have binaries compiled with either too few or too many options. All of them have generic optimizations. Many of them compile their binaries with debugging symbols included (I can do that myself if I want to test). Many of them compile their binaries with silly patches that make the binaries unstable. I end up compiling packages manually from source even while using binary-based distros. I experienced the most gruesome cases of dependency hell on binary-based distros. They are horrible. All of them.
Contrast that with source-based distros, where you can choose which options get compiled into the package (i.e. my media player actually plays mp3s!). I can optimize the binaries for my CPU arch; I can compile the binaries without debugging symbols. I’m free of silly patches that add needless functionality to a package, thereby making it unstable. Bye bye, dependency hell. It is more secure than depending on some third-party binary you found somewhere on the internet. It is smaller; it is faster. (A sketch of such build options follows this point.)
Finally, the majority of software packages actually take only a few minutes to compile. Inkscape compiles in 3 minutes on my machine. While it’s compiling, nothing stops me from using the computer, or even from using the older version of Inkscape. When the new version is done compiling, all I need to do is restart the application and voila, I have the new Inkscape on my system.
If you read my comment on the first thread, you’d realize that installing software on Windows/Mac OS X can take longer than installing it on Linux. In brief, it takes at least 15 steps to install software on Windows. It takes at most 5 steps to do the same on Linux, with a good package manager.
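To illustrate the compile-options point above, this is the kind of thing a Gentoo user sets once in /etc/make.conf (the flags shown are examples for one particular CPU, not recommendations):

# /etc/make.conf
CFLAGS="-O2 -march=athlon-xp -fomit-frame-pointer"   # tune for this box's CPU
CXXFLAGS="${CFLAGS}"
USE="mp3 dvd -debug"   # compile in what you want, leave out what you don't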
4). Good news.
5). Alas, you bring up another reason I can’t stand binary-based distros. RPM? RPM was the reason I gave up on binary-based distros. Anyway, back on topic. This is another reason I think Autopackage isn’t usable. I’m not trying to be a jerk here; I’m just being reasonable and truthful. If I still have to go hunting for dependencies all over the internet because they are not on my system, then Autopackage doesn’t solve my dependency hell problem! To be blunt, Autopackage can’t truly resolve dependencies. Today, packagers exist that can truly resolve dependencies, and do so elegantly. This is another reason I think repositories play a great role in package management. But I digress.
6). Upstream maintainers shouldn’t be burdened with packaging. Packaging should be left to the distros, who know how packages will run on their systems. In my opinion, upstream maintainers should provide the source and let distros handle the packaging.
7). But it also means I have to hunt all over the internet to install dependencies and packages not on my system. Some packages can have up to 15 dependencies; I’d rather pull those dependencies from a repository than hunt for them all over the internet, like I’d have to do with Autopackage today.
If I have offended you, it is unintentional and I apologize.
Of course Autopackage itself, or whatever is used, is included in the central repository and is customized by each distro to use its repository for dependency resolution.
This is a cool idea, in that it would not only allow the devs of these small projects to spend more time coding and less time packaging, but would also (potentially) still allow the individual distros to make sure that all installed packages are placed where they like ’em.
“Contrast that with source-based distros, where you can choose which options get compiled into the package (i.e. my media player actually plays mp3s!).”
So, just out of curiosity, are you saying that when you emerge something with Gentoo you automatically needn’t worry about dependencies, or that you take care of them when compiling the source?
I didn’t understand your question. However, Gentoo’s package manager, Portage, automatically resolves dependencies. Gentoo is a source-based distribution.
So, yes, the user needn’t worry about dependencies. At the same time, the user has the facility to control which options get compiled into packages. For example, a user can decide not to compile flac (an audio format like mp3) support into xine (a media player/library).
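In Gentoo syntax, that decision is a one-liner (the flag choice here is just illustrative):

# build xine-lib without FLAC support, with mp3 support
USE="-flac mp3" emerge media-libs/xine-lib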
I’ve used binary distributions where media players wouldn’t play mp3s because the application wasn’t compiled with the right options. Or where media players couldn’t play back DVDs for similar reasons. Or distros that configure binaries with a broken backend instead of the better alternative. Either way, you have to recompile the packages in question from source.
To make matters worse, these same binary distros suck at installing software from source. So there commences your journey to a totally borked system. Hence my rant in that paragraph.
<< So, yes, the user needn’t worry about dependencies.
>> It’s not that easy; as a statement, this is pretty shortsighted. What you forget to keep in mind is that recompilations, and compilations of dependencies, need to take place, and that certainly isn’t always a positive thing. And I mean that both for the concept itself and case by case.
The author was rather vauge with his solution though.
Personaly, I think that the answer is BOTH forms of software distrobution. The central repository style holds all the core software for the distro, and any other important software that people decide to put the work into to customize and add. The rest is installed through a universal packaging system such as Autopackage.
For instance, if you want to upgrade Gnome, you should do so through the repository. Inkscape on the other had is a simple application and is not depended on by anything in your standard distro, so more correctly should be offered as a downloadable package by its devs.
Of course Autopackage itself, or whatever is used, is included in the central repository and is customized by each distro to use its repository for dependancy resolution.
Put it all together and you’ve got a pretty good solution if you ask me. Benefits of both repositories and universal packaging, and good dependancy resolution all around.
When a package is offered for download, in my opinion a version with all non-LSB dependancies staticly linked should also be available, specifically for use by those who will not have a good internet connection at the time of installation for any dependancies.
since I switched to a certain source based disto (name withheld because just mentioning it gets people mod happy). The only down side of building from source is that it takes a little longer. Otherwise, the benefits outweigh the time factor.
@Devon
…n my opinion a version with all non-LSB dependancies staticly linked should also be available…
Sounds like a great idea, until you realize there are very lage barriers in the way of such:
1) Various licenses that could be used by those various dependencies (even just one) often mean you can’t static link
2) A static library often has dynamic dependencies (see Allegro for an example)
3) You can’t statically link to libraries that dynamic load other libraries and expect your software to work on any other distribution than your own (and sometimes not even then)
4) Static versions of libraries sometimes aren’t available at all
5) Static linking all those extra dependencies bloats your executable by several megabytes causing a large number of complaints from modem users (especially if the program is updated frequently)
@Beavis
since I switched to a certain source based disto (name withheld because just mentioning it gets people mod happy). The only down side of building from source is that it takes a little longer.
Except that only works for packages that are somehow maintained by your distribution (through a central repository), and only for free software, that’s not a viable solution for commercial applications.
I think you can leave out more dependencies than just LSB libs, it would not make sense to have fully GNOME or KDE libs in your own installer (note that not even the two toolkits are in LSB, but very likely installed on user systems)
I am a Debian user and are quite happy with the way our packagers create the dependency tree, but IMHO the version info is too strict.
Last week I upgraded from KDE3.1.4 to KDE3.2.3 and apt-get removed all KDE 3.1 applications.
I think this is unnecessary, KDE libs should be binary compatible within the same major release and all applications I compiled myself continued to work with the new libs.
Of course if it is not possible to offer a static package for some reason then that option simply would not be there. After all, I only proposed that staticly linked packages be made available for the rare situations in which it might be useful, such as a very slow or non-existant internet connection for the system on which it is to be installed.
The vast majority would download the “standard” dynamic package and let Autopackage, or whatever universal packaging system, install any dependancies through their distros central repository before installing the package.
— “I think you can leave out more dependencies than just LSB libs, it would not make sense to have fully GNOME or KDE libs in your own installer (note that not even the two toolkits are in LSB, but very likely installed on user systems)”
Of course you are correct. Hey, Im no expert here, just a Linux user with an idea.
You guys get the basic idea though right?
The certain unnamed distro has commercial apps in its repository (which require serial #’s). Otherwise you install 3rd party binaries just as you would in Windows.
I really don’t see an issue here. The problems with Linux packaging really lie in the fact that most ditros are binary. Binary distribution really doesn’t make sense in an operating system whose design stems from an entirely open source base. If software is updated, it and its reverse dependecies must be updated as well. It is a fact of life with such an open and flexible platform. Thats the way the system works. Distros need to look into using Gentoo’s concept of slots, where multiple versions of the same libraries and apps can be installed simultaneously to eliminate broken dependencies after upgrades.
If you have a problem with the architectural deficits of a binary Linux distro, move to a source-based distro.
If you want to live on the bleeding edge, get off your arse and make it work instead of complaining about it. Otherwise, use the repository that your distro maintains.
If your package manager makes updates difficult and breaks your system, find a new distro.
People don’t get it. LINUX IS NOT A WINDOWS CLONE. Do it the right way, not the Windows way, and you will have few problems if any.
Just my $0.02.
> You guys get the basic idea though right?
Absolutely!
Another idea would be to have the dependencies included.
I mean some kind of archive package that contains the program package and packages of the dependencies.
The installing package manager could then use those if some dependency is not installed yet.
This would even allow to include dependecies like KDE libs without getting them installed multiple times or having them linked into each application.
They exist and they work, quit whining. We’ve had these packaging rants ad nauseum. Packaging was solved long ago. Just because it works so perfectly that it gives Windows users a heart attack does not mean that it’s the wrong way to do it.
I think uniformity of packaging is the “next big thing” for Linux distros. It seems to me that when the community starts a-screamin’ someone usually kicks things into higher gear.
At least, this seems true for some past cases: free QT, better fonts/anti-aliasing, true-transparency in X, faster X development, and more concensus between KDE/GNOME (freedesktop.org).
— “Another idea would be to have the dependencies included.
I mean some kind of archive package that contains the program package and packages of the dependencies.”
I thought of that, but the problem is that you’d have to make sure that all those dependancies you packaged insalled and worked properly with every single distro out there, and with their repositories, without breaking existing packages, and at the same time not resulting in multiple installs that waste resources. Better to let the installation of libraries and other dependancies be handled by those who know each distro best: the repository maintainers.
Thats also why any universal package manager, such as Autopackage, should be cusomtized by each distro to resolve dependancies with its repository, or in whatever way is appropriate for that distro. There are vastly differing philosphies guiding many of the popular distros, and the best way to deal with them all is to simply let them deal with themselves.
The problem is that some distributions have horrible packagers/package managers or none at all! It is usually users of these distributions that yell “dependency hell!“…”linux is not ready“…”static linking solves it all“…”do it the Microsoft way“…”do it the Macintosh way“…etc.
Users should not have to go to an application’s website to install the applications. It is a backward and broken concept, that has thrived because Microsoft Windows dictates our computer usage standards.
Each distribution should have a repository that allows their users to install software easily and correctly(e.g Gentoo’s Portage and Debian’s Apt).
When I tell my friends that installing software is a lot easier on Linux than on Windows they ask me how. I give them this example:
A user wants to install a CD burning software on Windows(Nero), these are the procedures he might have to go through.
1). Launch your browser
2). Point browser to Nero’s website
3). Search the website for where the download link it after wadding through adds.
4). You might need to register to download some apps.
5). Nero provides links to various versions of their software, look for the one that suits your needs.
6). Download Nero to your system.
7). Click on the installer.
8). Click to agree to license you didn’t read.
9). Identify the components you need on your system.
10). Click on them and click on next.
11). Identify where the components are going to be installed, if you have weird setup(many users do)
12). Register the product or click on next.
13). Advertisements click on next.
14). Click on finish.
Contrast that with a Linux user who needs to install a similar app(K3b). His/Her procedures.
1). Launch your terminal.
2). Enter distro’s installation command (“install K3b”)
3). Close terminal.
or
1). Launch gui installer
2). Search for K3B
3). Click on it to install
I don’t know any other operating system that makes installing software easier. I agree with the article. Lets focus on upstream vs downstream dev communication. Lets advertise the benefits of intelligent package managers. Lets work on making package managers more robust.
I still get surprised when people today claim they experience dependency hell. The days I experienced dependency hell were the days when I had to go hunting for packages all over the internet to install them manually(i.e years ago). Gone are those days.
I still believe Linux has to most sophisticated method of installing software. It’s just that, like the article cleary demonstrates, new users to Linux are stuck to the bad habbits the inheritted from other operating systems. They are the ones who propose static linking as a solution to installation flaws of their distro.
The best package managers out there are Debian’s Apt and Gentoo’s portage. Gentoo’s portage is particulary impressive because Gentoo is a source based distro and installation of packages on source based systems can be challenging. These two also have the largest repository of softwares I know in Linux. Why…oh…why aren’t their package system Linux standards!?
Afterall, these two package managers have already solved 90% of software installation issues and most importantly they have the largest repositories.
Warning: Grammatical errors abound, I wrote this in a haste. My apologies.
So basically he wants us to depend on a repo system maintained by the distro maker?
1.)OK so what happens if the maintainer of a certain package take s a vaction? Community based distros(debian, gentoo, etc) can’t afford to pay people to maintain packages.
2.) Er portability? What if I want to install s/w on a comp that dosent have internet connection.
3.) Congestion. How much bandwidth/storage are we talking about to keep so many packages(14,000 +)
4.) Repackaging. So lets see….Fedora,debian,gentoo,MDK,etc.
LOL what thats 5 to 6 distros and 14,000 packages each.
~84,000 packages!!! Bandwidth isnt cheap!
Conclusion…guys a retard, full stream ahead with distro-neutral packages (Autopackage).
“Some teams get this almost right. The Debian GNOME packagers are certainly a decent example of upstream/downstream cooperation. Bugs get reported upstream, and upstream developers are involved in the packaging process downstream. The result, GNOME 2.6 makes it into Debian about a month after release.”
I tell you what. The creator of Knoppix gave up on GNOME due to its poor support on Debian. Don’t thrust me? Check out the Debian mailing list.
About packages: all the dependence resolution, different package formats, and centralization, aren’t very easy to overcome and make the software installation one click away.
And when he talks about Linux distros, he usually talks about a couple of them, but he needs to understand that there are hundreds, and people aren’t going to settle on his two, no matter what.
I thought we already assumed either compatible packages or one master package for each distribution, otherwise even the smallest external dependency wouldn’t work.
Actually that are all implementation details, I just wanted to point to another solution for having all dependencies in one downloadable file without requiring bad things like static linking.
“1.)OK so what happens if the maintainer of a certain package take s a vaction? Community based distros(debian, gentoo, etc) can’t afford to pay people to maintain packages.”
Why would they need to pay someone? And unless he took a rediculously long vacation without telling anyone, I can’t imagine it being even slightly disruptive.
2.) Er portability? What if I want to install s/w on a comp that dosent have internet connection.
That would be a fairly uncommon situation. I guess you’d have to fall back on the old “hunt down all the dependancies” method then. This is package managment, not magic.
3.) Congestion. How much bandwidth/storage are we talking about to keep so many packages(14,000 +)
Whos keeping packages? Debian may do it that way, but Gentoo doesn’t. Portage is simply a repository of scripts. The actual source archives are downloaded from the same places everyone else downloads them from. APT4RPM and similar RPM repositories are similar in that they all ususally download from the same mirrors. Why would everyone have to do it the Debian way?
4.) Repackaging. So lets see….Fedora,debian,gentoo,MDK,etc.
If the repository maintaners are willing to do it, who are you to tell them its to much work? Its their choice where to spend their time.
1). Packages usually have more than one maintainers, yes, even in community based distros like Gentoo or Debian.
2). Copy the binary from one computer to the other.
3). What congestion? Packages are stored in the distribution’s repository.
4). You lost me.
Autopackage has its issues.
1). Today, it’s unusable.
2). Today, it doesn’t support many packages.
3). Today, it can’t build from source.
4). Today, I can’t install different versions of different packages/libs on it.
5). It can resolve dependencies as well as other packagers.
6). It will still need maintainers who can go on vacation.
7). It doesn't solve the bandwidth crisis.
So, since Autopackage today is severely handicapped, what are we retards to do?
“I just wanted to point to another solution for having all dependencies in one downloadable file without requiring bad things like static linking.”
Autopackage has support for this (sealed installers). It would work just like the old loki installers.
The best thing developers can do is provide their app's source code (where applicable) and a package with all the libs. Like the Mac OS X .app, which is just a folder with all the needed executables inside. If you want a "clean" installation you can always build your distro's package from the sources.
A lot of non-OSS people seem to be doing this already.
I think this is much better than the Autopackage solution. Autopackage just seems to add another distributed packaging system on top of the one that the installed distro already uses.
“I thought we already assumed either compatible packages or one master package for each distribution, otherwise even the smallest external dependency wouldn’t work.”
What makes you think that? Obviously there would have to be some working together to make sure it all worked. As you said, it's all really just implementation details.
Keep in mind that my goal was not to make everything work in perfect harmony; it was to find the best and most implementable solution that still allowed distros to keep their differing conventions and philosophies.
Why would they need to pay someone?
You expect people to work (for free) *FULL* time maintaining packages?
And unless he took a ridiculously long vacation without telling anyone, I can't imagine it being even slightly disruptive.
"Ridiculously long"? What do you consider long? If a package is more than 1 week old, I'm pissed, and I just think, hey, if I was on Windows I wouldn't have to wait.
I guess you'd have to fall back on the old "hunt down all the dependencies" method then. This is package management, not magic.
This is magic? Well, I guess the guys at Autopackage must be magicians. Autopackage supports including ALL (non-LSB) dependencies in a single package!
Portage is simply a repository of scripts. The actual source archives are downloaded from the same places everyone else downloads them from.
You got me there. I was talking about Debian.
If the repository maintainers are willing to do it, who are you to tell them it's too much work? It's their choice where to spend their time.
They can do whatever they want. I just think it's ridiculous to have 84,000+ packages just to support 5-6 distros!
Packages usually have more than one maintainer, yes, even in community-based distros like Gentoo or Debian.
Yes, that's the point I was making. Debian has social contracts and has individuals that maintain specific packages, right? What if one of those maintainers leaves on vacation, is sick, etc., and a major release of a package he or she is in charge of comes out?
Copy the binary from one computer to the other.
If you can help me get a Debian package to work perfectly on Vector with no dependency problems, I'm listening.
What congestion? Packages are stored in the distribution’s repository.
What?!? They get unlimited bandwidth for free? 😮
You lost me.
OK:
- lots of redundancy: repackaging the same s/w many times over
- waste of bandwidth: hosting all these different packages
Autopackage has its issues.
Of course it does; it's not even API stable! I'm just saying that the folks at Autopackage are on the right track.
@dr_gonzo
The best thing developers can do is provide their app's source code (where applicable) and a package with all the libs. Like the Mac OS X .app, which is just a folder with all the needed executables inside. If you want a "clean" installation you can always build your distro's package from the sources.
One huge problem with that: those precompiled libraries often DO NOT work correctly across distributions, due to filesystem layout differences, kernel differences, and base system library differences (gcc, etc.). So this isn't a silver-bullet solution either.
-: If a maintainer is on vacation, in community-based distros another maintainer takes over. Or a user interested in the package steps up and provides the necessary script, which is then submitted to Bugzilla, forums, or other community resources. This is the least of a packaging system's problems.
-: You can get many statically linked packages to work across many distros. I can't, however, guarantee they will work. Some distros provide facilities to generate statically linked applications. You should acknowledge that, technically speaking, Debian and Vector are two different operating systems; just because they both use the Linux kernel doesn't mean packages designed to work on one will or should work on the other. Such an expectation is unreasonable, as are many other expectations for Linux.
-: A distribution's repository is mirrored on servers all over the internet. In a community-based distro like Gentoo, the bandwidth is sponsored by commercial interests, users, and well-wishers. I still don't see how Autopackage solves the bandwidth problem.
-: The distribution's maintainer is responsible for packaging software for his/her distribution. I don't see the redundancy here. All the application developer needs to do is provide the source. Distributions usually have scripts that convert the source into a form compliant with their needs. You keep bringing up this bandwidth issue, but I don't see how Autopackage saves me, application developers, or distributions any bandwidth.
-: They might be on the right track, but the track is unproven. Portage and Apt, today, have proven that software installation can be easy, reliable, and consistent, while Autopackage is still a work in progress that might never be fruitful. I have used Portage to build a whole Linux system from scratch (every single package). When Autopackage can do that, it will begin to tickle my fancy. Until then, it is unusable, period.
I am just not a big fan of unproven technologies. I do wish Autopackage developers the best of luck. Let’s work on improving what we have right now. If, eventually, Autopackage solves all our installation woes, I’m sure we will be quick to dump our current solutions, which work well at the moment.
http://www.archlinux.org
I find it laughable that there are still people who complain about software installation on Linux – it's easier than on any other OS I've tried. As for packages not being updated immediately once a new version is released, that's a big inaccuracy – Arch releases new packages the day the program is released (in most instances – bigger packages like X.org or Firefox take a day or two so they can be tested).
You expect people to work (for free) *FULL* time maintaining packages?
Nope. Why would they have to work full time though? They can if they want to I suppose, but I believe the majority of them do not.
"Ridiculously long"? What do you consider long? If a package is more than 1 week old, I'm pissed, and I just think, hey, if I was on Windows I wouldn't have to wait.
Perhaps a few months? Hmmm, yeah, I'd say that's unusually long. Besides, the "not telling anyone" part is the important part. If the other maintainers know he's gone, they can see to it that things get taken care of in the meantime. Besides, new releases are hardly surprises. Why would he wait until an impending release to take his vacation?
This is magic? Well, I guess the guys at Autopackage must be magicians. Autopackage supports including ALL (non-LSB) dependencies in a single package!
Well, if you found a way to download dependencies without an internet connection, that would be magic.
Of course, including them with the software package could work too, and when Autopackage is ready it will be another possible way to solve that problem. Still, this is an uncommon situation, and there is no solution that is best for EVERYONE.
You got me there. I was talking about Debian.
But… you can't say that the apt repository won't work, 'cause it's been working fine for years!
I thought you were referring to the idea that ALL distros have repositories, and that this would result in internet congestion (storage was never an issue; it's too cheap to matter).
They can do whatever they want. I just think it's ridiculous to have 84,000+ packages just to support 5-6 distros!
So you think that Linux needs less software?? Personally I would want there to be as many packages as possible available for my distro.
Interesting, but apparently confused article.
The “solution”, in summary, appears to be:
Aug 17: Developers of “foo” complete work on foo-3.75
Aug 17: "Foo" developers work with Debian, Fedora, SuSE, Gentoo to create .deb, .rpm, ~x86 packages which are fully tested with "foo"'s co-dependent packages.
Aug 20: Basic requirements found – changes required in “bar”, “baz”, “giz” projects.
Aug 21: “bar”, “baz”, “giz” developers notified.
Aug 25: “bar” developers reply “impossible”; “baz” developers submit patches; “giz” developers reply “technically unlikely to work; fix goes against the giz roadmap”.
Sep 01: Distributors start work patching "bar", "baz", "giz" code.
Sep 15: half-broken “bar”, “baz”, “giz” patches available
Sep 27: Debian, Fedora, SuSE, Gentoo packages available for “foo”. One month after the “foo” source was available. “bar”, “baz”, “giz” packages variously broken.
Sep 27: Users of Slackware, Yoper, JDS, [insert-fave-distro-here] complain “why isn’t “foo” available for me?” (answer 1: your distro isn’t part of the cabal). (answer 2: It’s been available for over a month, as source)
Sep 28: Developers of packages “bar” release new source (based on their own tree, not the developer-cabal tree). This breaks “foo”.
Sep 29: Developers of “baz” release new source (as Sep 28, but more so).
Sep 30: Developers of “giz” do the same (as Sep 29, but even worse).
Nov 01: Users complain “What is this crazy Linux packaging system?!”
(this post doesn’t necessarily represent the position of the posting IP address)
I run Gentoo, and Gentoo has these problems too. I'm still waiting for a mysql-4.1 ebuild… (QA doesn't allow it in the tree)
Thought one:
1. The shared source tree is never released as a package.
2. Distro cvs works as an overlay to the shared source tree.
Thought two:
Recognize that computerized dependency resolution can be a distributor/OS-agnostic web service. Design an RDF/OWL language to describe deps and other metadata about source trees. Use URIs as the package/feature namespace.
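To make that concrete, such metadata might look roughly like this (Turtle syntax; every URI and property name here is hypothetical, only meant to show the shape of the idea):

@prefix dep: <http://deps.example.org/schema#> .
<http://packages.example.org/foo/1.0>
    dep:requires <http://features.example.org/libpng> ;   # depend on a feature URI, not on a distro package name
    dep:minVersion "1.2" .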
Thought three:
The distro package management system should just as easily install the latest CVS (either anonymous or read/write) as it installs prepared packages. Installation from CVS code should go through the distro package management.
Thought four:
Gigantic P2P based CVS system…
Thought five:
install(1) could be extended to do more, registering md5, mtime, and such in a db for uninstallation/updating.
I don't know about other distros, but some of the features of portage could be incorporated into install(1).
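The gist, as a sketch (the manifest path and format are hypothetical; a real implementation would live inside install(1) itself):

install -m 755 foo /usr/bin/foo                               # install as usual
md5sum /usr/bin/foo >> /var/lib/installdb/manifest            # record the checksum for later verification
stat -c '%Y %n' /usr/bin/foo >> /var/lib/installdb/manifest   # record the mtime for update detection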
I untar it into /usr/share/firefox
I make a link /usr/bin/mozilla-firefox to it, and boom, I have it updated. I know that's a bit complex for most users. But Mozilla is a REALLY EASY package to install; that's why you don't need an rpm for it.
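Spelled out, the whole upgrade is something like this (version number and archive layout assumed):

tar -xzf firefox-1.0.tar.gz -C /usr/share                     # the tarball unpacks into /usr/share/firefox
ln -sf /usr/share/firefox/firefox /usr/bin/mozilla-firefox    # point the old command name at the new binary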
And most packages that don't come with your dist are small enough to build within 10 minutes. I think Autopackage's source builder is exactly what we need. Basically just a check to make sure you can install with "./configure && make && make install && make clean". Oh, and deps checking as well (which can be done universally using each package's configure script, always in root's path).
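That is, the classic source install being described (package name and version hypothetical):

tar -xzf foo-1.0.tar.gz && cd foo-1.0
./configure                          # aborts with an error if a required dependency is missing
make && make install && make clean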
Maybe users should learn to live with a bit more difficult installation? I mean, making installs harder also makes adware installs harder…
The only install I've done that I don't think the average person could figure out involved fixing some errors in a guy's code. But it was a cheesy panel app, so I don't think anybody'd cry about not having it.
http://bugs.gentoo.org/show_bug.cgi?id=34600
– When Autopackage can do that, it will begin to tickle my fancy. Until then, it is unusable, period.
It's unusable for you. I installed the new version of Gimp from their site without any trouble, and I liked the idea of Autopackage. Why are you trying to force your way of software installation on everyone? You use a package manager like APT, and that's a really fine solution for you, but I like to have a choice here.
Package managers like apt or portage are not going anywhere; they will still be one of the strong points of using distros like Gentoo and Debian. But it's really cool for some people to also have the choice of using packages with installers. I really don't see anything wrong with that. If autopackages are not your choice of software installation because they remind you of the Windows way of doing things, then you should rather use apt and the central Debian package repository, or better yet, use tarballs and compile it yourself; everybody's happy.
Let the users choose what they want; they're not geeks or developers.
I really wish good luck to Mike Hearn and autopackage project.
<< I still get surprised when people today claim they experience dependency hell.
>> It still exists for the people who package the stuff up for the users, but they minimize the problem. The problem people would rather talk about, and what Autopackage is meant for, is the inconsistency and difference between distributions. Remember that every package of every distribution needs to be maintained, and that costs time. When you add it up, a lot of time, and I agree that it would be awesome if such double (more than double) work could be minimized.
For users who want software the way they want it which their distributor doesn’t provide:
* Bleeding edge.
* Newest stables.
* 3rd party patches.
Source distributions like Gentoo are a good way to solve this problem but it comes with other implications one might not prefer.
For example, the author says:
<< The way it works now: Mozilla releases Mozilla Super Edition. Users weep with joy, Debian and Fedora users check apt and yum, but no software is available. Stuck with outdated software and the overriding human nature to get new stuff to play with, users flood the bug tracking systems (if you’re lucky)
>> I totally disagree; I wouldn't want to run a new Mozilla version right after it was released. I would not want to force others to do that either, unless they explicitly understand and agree with the risks.
Why I disagree: I do not *want* bleeding edge. My current version works *fine*. If I want a newer version, I compile it from source. But "want" is more like "need" for people who want work done. I prefer the people who want bleeding edge to test the software, so that I know for sure the new version works without difficulties, and if there are any, I know what is going to happen. As I said, I wouldn't want to force other people -for whom I decide- to use something new either, while they're not aware of it and haven't asked for it. You'd rather want to inform your userbase about what's going on, and why.
Neither Debian Sid nor Gentoo allows me this. Woody does, and Sarge is a middle ground. My point is: I don't want Sarge (Testing) to become what the author described. That goes even more for Woody (Stable).
<< How it should work: Mozilla releases Mozilla Super Edition. Users weep with joy, Debian and Fedora users check apt and yum, and lo and behold, the software is updated to the latest version. Users spend the evening rejoicing how awesome Mozilla is, instead of yelling at their maintainers.
>> Okay, well, that's interesting for some home users (geeks?), but not when time means money. A worker shouldn't care what features the browser has; a worker should care what he can do with the browser. The boss cares more about what is done with the browser than about what the browser is able to do. He or she might even want the browser to be able to do exactly what it should, so only the good things get done with it.
I think it is more common that when a new version of Mozilla is installed, the user shrugs and thinks "whatever", than the reaction described here. I think that's more common at the commercial level, but it isn't uncommon in home-user environments either. I mean, does your mother really care whether what she got is Windows 98 or Windows 98SE? Only if it does enough, or not enough, of what she wants, and such a decision is up to her.
<< Distributors are just as bad as we are of course. Debian’s release process is so broken that we have to rely on the concept of the “backport” as an excuse to explain a 2 year release cycle.
>> It isn't broken; it is different from what you want. If you have a problem with that, you have other choices, but I think many people don't have a problem with it.
(Maybe the distribution would have a better release cycle if people cooperated more with it instead of creating a similar alternative.)
To sum up, I think that, in general, people who are not happy with the packages their distribution offers are mostly using the wrong distribution, or even the wrong OS.
Now back to my subject, which is "misses the point". The author misses the point in various ways because he skips some steps in his research:
* I am seriously wondering whether there was any contact with Fedora or Debian developers before he offered his opinion that their release schedules and packaging manners are "broken". Maybe the current way it works in Fedora and Debian is how they want it to be?
* Is the author really unable to understand that "how it is now" has reasons? Why isn't that analyzed?
* Why does the author think that his solution fixes a problem for distributions when some people (I, at least) don't see the problem he sees as a problem?
* Why is the diversity between the versions of apps in distributions a problem? Is it? What are the positive and negative implications of this?
Those are all points I'd like to have seen addressed.
<< Contrast that with a Linux user who needs to install a similar app (K3b). His/her procedure.
>> Nice post, and a fine point.
[The following is more something for a usability thread, but still]
Except that users don't have a clue what "k3b" is. When someone wants to get something done, it isn't true that they know which application to use. This is true on Windows as well. I'd like a local Freshmeat-like, database-driven application which guides a user to the application he/she wants. Such a thing could be achieved with something like GNOME Storage. FM is pretty good at this.
For example, say you want a CD-writing application. You go to Multimedia, then CD-Writing. You see a number of applications installed on your OS. You click on one, and there you see what the current version is. You see the difference in features between that version and yours, but you also see the features of that application next to another one. You also see applications which are available in the repository but not installed. After you've gathered such information, you're able to decide which one you want to use. But what if you don't know that CD-writing is part of the parent group Multimedia? That's where a search pops in. Heck, the features the apps provide could even be linked to a tech database a la Wikipedia. IMO an interactive solution like this is a good step toward helping a new (or even old) user get something done when they're not sure how.
Utopian ideas like this are doomed to fail. If only we could get every application and distro working together, then all package management would work as it should. It's kind of ironic this appears on the same day as the "Linux != communism" article. (Not saying it is… just referring to the utopian ideals mentioned in the article.)
I'd love to see this magical kind of coordination happening. Poor Joe Coder couldn't release his piece of software to users unless it got approved by their distro gods. It's a recipe for failure. There are just times you can't control everything in the world.
There's a reason the Windows model works fairly well: it's practical. Applications can count on binary compatibility for a few system DLLs. The rest they package with their app. You want application foo? You go to http://www.foo.com. Download and install. MS has adopted the centralized approach for drivers to an extent, because drivers are a reasonable subset of the possibilities.
The Windows model is not perfect. There used to be issues, but I haven't had one since the Win95 years.
However, I've mentioned this before: if distros really want to use this utopian model, they could do it by choosing for users. Only 1 windowing system, 1 text editor, 1 browser… that's all distro X supports. That would be manageable under this model. It certainly would NOT be a distro everyone (home user, power user, corporation, server) could use, but each of those could have its own distro/sub-distro.
Autopackage has its issues.
1). Today, it’s unusable.
This has not stopped over 3000 people downloading and installing the Inkscape packages using it, apparently without much issue.
2). Today, it doesn’t support many packages.
Correct, but API stability (a prerequisite for lots of developers using it) is our number 1 priority alongside stability. We’ll get there soon enough.
3). Today, it can’t build from source.
So what? Some people seem to be under the mistaken impression that because Linux is open source, that makes source installs more "pure" or something. This is just funny. There should never be a need to compile from source. Often people think there is a need when really there isn't. Give me credible arguments for why source installs are "better" and maybe I'll listen, but I've yet to see any.
Be warned. Binary portability issues you may have had are not an argument here. We already researched and fixed a great number of these.
And sorry but Windows/MacOS have install times of a few seconds for most programs – having that suddenly bumped to 30 minutes is a serious downgrade.
4). Today, I can’t install different versions of different packages/libs on it.
This seems wrong. Go try installing the GTKmm 2.2 and 2.4 packages at once: GTKmm 2.2/2.4 are parallel installable, and so are the autopackages. Or at least they should be; if they aren't, then I messed up the specfile (likely, as I was in a hurry).
There is certainly no theoretical reason why this can’t work.
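The parallel installability comes from the two series using different library and pkg-config module names, so a quick sanity check on a machine with both installed would be something like this (assuming the standard module names):

pkg-config --modversion gtkmm-2.0   # the 2.0/2.2 series
pkg-config --modversion gtkmm-2.4   # the 2.4 series, living alongside it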
5). It can resolve dependencies as well as other packagers.
I assume you meant "can't". This depends on your perspective. Autopackage will pass a dependency check if the software is installed, period – unlike RPM, apt, and so on, which will ignore source installs or copied files. The flip side is that nobody has written the code to integrate with native package managers yet, so most dependency checks won't be able to resolve missing dependencies. You do get a straightforward message with a tip on how to get them, though.
Fixing this is just a matter of interest/manpower.
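The principle is to test for the dependency itself rather than for an entry in a package database. A minimal sketch of that kind of check (not necessarily how Autopackage actually implements it; the library name is just an example):

if ldconfig -p | grep -q libgtkmm-2.4; then   # present, however it was installed
    echo "dependency satisfied"
else
    echo "please install gtkmm 2.4 first"     # the "straightforward message with a tip"
fi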
6). It will still need maintainers who can go on vacation.
No, the whole point is that the upstream maintainers themselves build these packages. So when this is the case, by definition they cannot get out of sync, as the same person releases them alongside the source tarballs.
7). It doesn't solve the bandwidth crisis.
It decentralises packaging, which means you no longer need huge mirror servers for all the packages: popular packages get mirrored using individual facilities in the way it was always done, and less popular software doesn't take up space on the mirrors at all. This isn't exactly rocket science.
Maybe I missed something, but he offers no solution, although he eloquently states the problem. A smarter installer that takes into account any differences between distros, "one ring to bind them all", is one. How about the other side of the coin, Zero-Install? Or better yet, a Zero-installer that takes into account different distros and links appropriately in the zero-installed pkg directory?
1). Congratulations! However, I don't see anything spectacular about that. I don't consider Inkscape to be a particularly challenging package to install. When Autopackage can correctly install/upgrade glibc on my system without borking it, you have my ears. Until then, it's not feasible. Installing software is not all there is to package management. Removing and upgrading software intelligently is where the challenge is. Last I checked, Autopackage couldn't do those successfully, and this is where true packagers excel.
3). All the binary distros I have used suck! All of them. All of them have been unstable. All of them have binaries compiled with either too few or too many options. All of them use generic optimizations. Many of them compile their binaries with debugging symbols included (I can do that myself if I want to test). Many of them compile their binaries with silly patches that make the binaries unstable. I end up compiling packages manually from source even while using binary-based distros. I experienced the most gruesome cases of dependency hell on binary-based distros. They are horrible. All of them.
Contrast that with source-based distros, where you can choose which options are compiled into the package (i.e., my media player actually plays mp3s!). I can optimize the binaries for my CPU arch; I can compile the binaries without debugging symbols. I'm free of silly patches that add needless functionality to the package, thereby making it unstable. Bye-bye, dependency hell. It is more secure than depending on some third-party binary you found somewhere on the Internet. It is smaller; it is faster.
Finally, the majority of software packages actually take only a few minutes to compile. Inkscape compiles in 3 minutes on my machine. While it's compiling, nothing stops me from using the computer, or even from using the older version of Inkscape. When the new version is done compiling, all I need to do is restart the application and voila, I have the new Inkscape on my system.
If you read my comment on the first thread, you'd realize that installing software on Windows/Mac OS X can take longer than installing it on Linux. Read my earlier comment. In brief, it takes at least 15 steps to install software on Windows. It takes at most 5 steps to do the same on Linux, with a good package manager.
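On an apt-based system, for example, those few steps amount to (package name hypothetical):

apt-get update        # refresh the package index
apt-get install foo   # fetch foo plus its dependencies and install everything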
4). Good news.
5). Alas, you bring up another reason I can't stand binary-based distros. RPM? RPM was the reason I gave up on binary-based distros. Anyway, back on topic. This is another reason I think Autopackage isn't usable. I'm not trying to be a jerk here; I'm just being reasonable and truthful. If I still have to go hunting for dependencies all over the Internet because they are not on my system, then Autopackage doesn't solve my dependency-hell problem! To be blunt, Autopackage can't truly resolve dependencies. Today, packagers exist that can truly resolve dependencies, and do so elegantly. This is another reason I think repositories play a great role in package management. But I digress.
6). Upstream maintainers shouldn't be burdened with packaging. Packaging should be left to distros, who know how packages will run on their systems. In my opinion, upstream maintainers should provide source and let distros handle the packaging.
7). But it also means I have to hunt all over the Internet to install dependencies and packages not on my system. Some packages can have up to 15 dependencies; I'd rather pull those dependencies from a repository than hunt for them all over the Internet, like I'd have to do with Autopackage today.
If I have offended you, it is unintentional and I apologize.
I agree, I am not gonna try to understand the specifics, but sounds like a great idea to me.
Of course Autopackage itself, or whatever is used, is included in the central repository and is customized by each distro to use its repository for dependency resolution.
This is a cool idea in that it would not only allow the devs of these small projects to spend more time coding and less time packaging, but would also (potentially) still allow the individual distros to make sure that all installed packages are placed where they like 'em.
Nice.
"Contrast that with source-based distros, where you can choose which options are compiled into the package (i.e., my media player actually plays mp3s!)."
So, just out of curiosity, are you saying that when you emerge with Gentoo you automatically needn't worry about dependencies, or that you take care of them when compiling the source?
I didn't quite understand your question. However, Gentoo's package manager, Portage, automatically resolves dependencies. Gentoo is a source-based distribution.
So, yes, the user needn't worry about dependencies. At the same time, the user has the facility to control which options get compiled into packages. For example, a user can decide not to compile flac (an audio format like mp3) support into xine (a media player/library).
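In Gentoo, that choice is expressed through USE flags. A sketch of the flac/xine example (flag and package names as I recall them; details may vary by Portage version):

echo "media-video/xine-lib -flac" >> /etc/portage/package.use   # turn off flac support for xine-lib
emerge xine-lib                                                 # rebuild it with the new USE flags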
I've used binary distributions where media players wouldn't play mp3s because the application wasn't compiled with the right options. Or where media players couldn't play back DVDs for similar reasons. Or distros that configure binaries with a broken backend instead of the better alternative. Either way, you have to recompile the packages in question from source.
To make matters worse, these same binary distros suck at installing software from source. So commences your journey to a totally borked system. Hence my rant in that paragraph.
<< So, yes, the user needn’t worry about dependencies.
>> It's not that easy. As a statement, this is pretty shortsighted. What you forget to keep in mind is that compilations and recompilations of dependencies need to take place, and that certainly isn't always a positive thing. And I mean that both for the concept itself and on a case-by-case basis.
Thanks. That answered my question. I’ve been considering Gentoo…