Linux, the free operating system, has gone from an intriguing experiment to a mainstream technology in corporate data centers, helped by the backing of major technology companies like IBM, Intel, and HP, which have sponsored industry consortiums to promote its adoption. Those same companies have decided that the time has come to consolidate their collaborative support into a new group, the Linux Foundation, which is being announced today and which Linus Torvalds, the creator of Linux, will assist. The mission of the new organization is to help Linux, the leading example of the open-source model of software development, compete more effectively against Microsoft, the world’s largest software company.
I especially like the idea of folding OSDL and the FSG into this new “Linux Foundation”, rather than starting up (yet) another group.
The war for the server is over. The war for the desktop is about to begin.
The war for the server is over. The war for the desktop is about to begin.
What planet is this on? Last I checked Windows is now (by revenue and server shipments, according to IDC) the dominant server platform, bumping Unix from the top revenue spot in ’05. Linux is in 3rd place behind Windows and commercial Unix.
Linux is great and all, but just because you want something to be true doesn’t mean it is.
You can’t compare the number of servers running Linux with the number running Windows based on revenue figures, since many companies can and do install free versions of Linux (or Unix variants like BSD).
The war for the server is over. The war for the desktop is about to begin.
What planet is this on? Last I checked Windows is now (by revenue and server shipments, according to IDC) the dominant server platform, bumping Unix from the top revenue spot in ’05. Linux is in 3rd place behind Windows and commercial Unix.
Linux is great and all, but just because you want something to be true doesn’t mean it is.
Well, if you go by official revenue, Linux is behind Windows (and Unix). But everyone acknowledges that Linux is the most commonly installed *Nix by an order of magnitude.
But that’s not the issue. Replacing every copy of Windows on a server is the wrong battle – the right battle was making Linux a serious, practical, acknowledged alternative. Whatever the politics and underhandedness of Microsoft’s deal with Novell over Linux, the fact that they made one of any kind shows that THAT battle has been won.
As much contempt as I have for Windows, it would probably be a mistake for Linux to wipe the floor with it completely – all the other server contenders are Unix-like, so having just one architecture is probably bad for the dominant one (as the dominance of Windows and IE has been proven to be bad for them – there’s no incentive to improve).
I’m glad the big vendors have realized some of the mistakes they made with UNIX in the past and are working towards a unified Linux platform that can really compete in the marketplace. The constant fracturing, not to mention the proprietary lock-ins, has helped make UNIX a very confusing, expensive and uncertain area of computing.
If the Linux foundation ends up doing what it plans, the threat to Microsoft’s near monopoly will be very real. This can only be a good thing as healthy competition has been shown to be advantageous for all concerned.
With more companies working toward standards in Linux, Microsoft will have no choice but to react. Hopefully, Microsoft will step up its drive and keep releasing higher-quality software, continuing the trend of the last few years.
Congratulations Linux! Not only have you arrived, you have changed some very ingrained assumptions.
I still think this is the wrong battle. It’s like Windows vs OS/2 again. It doesn’t matter whose operating system is technically superior – all that matters is the applications. I have at least a dozen that require Windows. Linux “the OS” is getting better, but that won’t matter until the developers follow suit – both OSS and commercial.
It doesn’t matter whose operating system is technically superior – all that matters is the applications.
That’s why both the OSDL and the FSG do very little (if anything at all) for desktop development (which is already taken care of by the free desktop projects), but rather concentrate on making the platform more appealing to third-party vendors.
So, in effect, it is “all about applications”.
One of the ways to make Linux more competitive is to come up with better standards within Linux. Software developers will be more likely to write programs for Linux if they don’t have to write several different versions. Personally I would like to see software installation/removal modeled on OS X. As much as I would like to see this happen, I just don’t think it will. Imagine Gentoo, Ubuntu and RedHat representatives sitting down and discussing software installation. I am not talking about compatibility, I mean ONE SYSTEM.
//Software developers will be more likely to write programs for Linux if they don’t have to write several different versions.//
This is a bit of a misconception. The exact same source code would normally compile without change on all Linux distributions. There is no need to re-write, and often not even a need to recompile. Normally it is only necessary to re-link against the different versions of libraries and re-package into the different package formats in order to produce versions of your application installable on different Linux distributions.
You simply don’t have to re-write applications for different Linux distributions.
If you are just a little careful, it is even possible to make a “one size fits all” binary for all Linux distributions.
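In practice, the thing most likely to break a “one size fits all” binary is the glibc baseline it was built against, since glibc is backward- but not forward-compatible. As a rough sketch (not code from any project mentioned in this thread), here is the kind of runtime sanity check such a binary could perform; the 2.3 baseline and the program name are just assumptions for the example:

/* A minimal sketch of a runtime glibc baseline check for a
 * distribution-neutral binary. gnu_get_libc_version() is glibc-specific;
 * on other C libraries you would need a different probe.
 * Build: gcc -o glibc_check glibc_check.c
 */
#include <stdio.h>
#include <stdlib.h>
#include <gnu/libc-version.h>

int main(void)
{
    const char *version = gnu_get_libc_version();   /* e.g. "2.3.6" */
    int major = 0, minor = 0;

    if (sscanf(version, "%d.%d", &major, &minor) != 2) {
        fprintf(stderr, "could not parse glibc version: %s\n", version);
        return EXIT_FAILURE;
    }

    /* The "2.3" baseline here is only an illustrative assumption. */
    if (major < 2 || (major == 2 && minor < 3)) {
        fprintf(stderr, "glibc %s is older than the 2.3 baseline "
                        "this binary was built for\n", version);
        return EXIT_FAILURE;
    }

    printf("running against glibc %s, baseline satisfied\n", version);
    return EXIT_SUCCESS;
}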
Exactly. The very thing that makes Linux unique (ie. NOT Windows) is all the different versions and the fact anyone can change anything and call it a new distro. As long as that mentality stays, it will never be a Windows replacement. Don’t even get me started on package managers…
The very thing that makes Linux unique (ie. NOT Windows) is all the different versions and the fact anyone can change anything and call it a new distro. As long as that mentality stays, it will never be a Windows replacement.
I disagree, I don’t think this has actually hampered Linux adoption in the least. That doesn’t mean there shouldn’t be increased standardization, as the LSB and freedesktop.org have done (and will continue to do).
Don’t even get me started on package managers…
Package managers are great, and they can work alongside standalone installers/statically-linked installs.
If Package Managers sucked, then why would Microsoft have adopted a form of them for Windows?
Imagine Gentoo, Ubuntu and RedHat representatives sitting down and discussing software installation. I am not talking about compatibility, I mean ONE SYSTEM.
That would be horrible. There is a reason their package management is different.
It’s not even a problem that needs solving, because each distribution manages and maintains its own packages, and the software developers don’t really need to worry about it.
As for applications not being available on GNU/Linux, I actually find myself annoyed at the fact that most of the applications I use aren’t available on Windows.
Imagine Gentoo, Ubuntu and RedHat representatives sitting down and discussing software installation. I am not talking about compatibility, I mean ONE SYSTEM.
IMHO this would be theoretically possible and not so bad.
The starting point and pivot of all this would be Gentoo.
Imagine something like this:
– Gentoo (source based, USE flags customizable etc.) is the base. It’s hyperflexible, and it can be really anything you like. So use it as a starting point.
– Gentoo, Ubuntu and RH then figure out what kind of patches, adaptations etc. are characteristic of their packages.
– +ubuntu and +redhat USE flags are introduced for Gentoo packages. These flags override other USE flags (and, if necessary, CFLAGS) for the given packages: they are more like meta-flags.
– src-deb Ubuntu source packages and src-rpm RH source packages would now be *equal* to the Gentoo ebuild emerged with standard CFLAGS and the ubuntu/redhat USE flags.
– Binary Ubuntu and RH packages become just compiled versions of these Gentoo-ized packages.
– In the meantime, Portage, apt-get and rpm are made to work together seamlessly. That is, each package manager gets transparent access to the others’ databases and features. In the end, the community would probably settle on one.
– This way Ubuntu and RH releases are really no more than binary installs of a particular Gentoo disk image. Everyone could start from an Ubuntu disk, install it vanilla, and then customize it to the end by using Portage. On the other hand, someone who badly needs full RH compatibility on his Gentoo install can add +redhat to his USE flags and emerge -eav system (and the kernel).
Not that this wouldn’t require A LOT of work, but it would be fun and maybe useful.
//I have at least a dozen that require Windows.//
You have at least a dozen for which the vendor has released a version only for Windows.
That does not necessarily mean they actually require Windows, it probably means only that the only commonly available binary copies are Windows executables.
If Linux makes as big a splash as it is starting to promise, it will become more routine for companies to release applications in three versions: one for Windows, one for Mac OSX, and one for Linux.
Rather like this commercial application:
http://www.softmaker.com/english/of_en.htm
http://www.softmaker.com/english/ofl_en.htm
… oops! No version for Mac OSX!
//I am not talking about compatibility, I mean ONE SYSTEM.//
If a given package will install unmodified on all flavors of Linux distribution, is that not then ONE SYSTEM for all practical purposes? You can have both unification and freedom of choice at the same time that way.
> That does not necessarily mean they actually require
> Windows, it probably means only that the only commonly
> available binary copies are Windows executables.
You forget that Linux and Windows are not source-compatible. Those applications still have to be “ported”, which can mean a major overhaul. Cross-platform compatibility isn’t a simple problem, and can mean a lot of work if not considered from the beginning.
> If Linux makes as big a splash as it is starting to
> promise
Exactly: It has to be a *BIG* splash to justify the porting effort.
Amen!
Desktop users don’t care how superior the OS itself is; provided that it’s not a crashy mess, they only care about accomplishing tasks.
Give them superior software to accomplish all of their tasks and promote it thoroughly.
And I have many that require Linux: Kile, TeXmacs, MonoDevelop, Dillo, GV, gFTP, K3b and so on – really useful, really productive. On my Windows laptop, almost all my productivity apps are open source, with the exception of Visual Studio and Windows itself, of course.
I’m sick of hearing that ‘Linux’ should have standardised package management, click and install software or be kinder to newbies.
The whole point is there is no such thing as ‘Linux OS’. Linux is just a kernel. You add what you need to make an OS. You can have a CLI or a variety of desktop environments. You can run the kernel on a PDA, laptop or cluster.
Linux isn’t a desktop environment designed to compete with OS X or Vista, though some distros have been built on a Linux kernel to do so.
I’m using PC-BSD to write this. It looks and feels exactly like a KDE Linux distro except there ain’t no Linux kernel. The other difference is that it is faster and more stable than any of the many other Linux distros I’ve used.
There is no technical reason why SUSE or Ubuntu or any other distro can’t be based on any POSIX compliant kernel.
“The whole point is there is no such thing as ‘Linux OS’. Linux is just a kernel. You add what you need to make an OS. You can have a CLI or a variety of desktop environments. You can run the kernel on a PDA, laptop or cluster.”
And that attitude is exactly the problem.
Linux should be a standardized platform! The more you see Linux as “just a collection of stuff which allows you to create your own OS”, the harder it will be to create software that “just works” on all (or at least most) Linux distros, and the more it will scare off third party software vendors.
As an open source developer, I’m sick of not being able to provide binaries that ‘just work’ for most users. I don’t want to create a RedHat RPM, Mandrake RPM, SuSE RPM, Debian deb, Ubuntu deb, Slackware TGZ, etc etc just to make it easier for my users! I want to create one package which is easily installable by all my users! This, however, is an uphill battle because of people with a mentality like yours.
Of course I’m not advocating that all distros must be merged. But similar distros must at least be compatible with each other.
Nobody expects that a package for desktop Ubuntu would work on a PlayStation distro. But look at Fedora, SuSE and Ubuntu – functionally they’re pretty similar: they’re desktop distros. Why shouldn’t I be able to create one package that works on all three of them?
Bah, we don’t even need GNU/Linux. We can get rid of MacOS too. What we need is a single standard platform.
As a web developer I’m sick of having to make my web applications work in all web browsers, so I think we need a single standard web browser.
I hate how web pages all have different navigation menus, we should have a single standard template for all websites.
“Bah, we don’t even need GNU/Linux. We can get rid of MacOS too. What we need is a single standard platform.”
Wrong comparison.
A more correct statement would be this: “GNU/Linux and Mac OS X should be compatible with each other.” And to an extent, they are – they’re both based on POSIX.
Most Unices are based on POSIX, which makes it easy to port one Unix program to another Unix. Are you against POSIX standardization?
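To make the POSIX point concrete, here is a minimal sketch (purely illustrative, not taken from any of the projects mentioned) of a C program that sticks to POSIX interfaces and should therefore build unchanged on Linux, the BSDs and Mac OS X:

/* A minimal sketch of POSIX-only C: nothing here is Linux-specific,
 * so the same source should build on Linux, FreeBSD or Mac OS X
 * with a plain "cc -o whoami_host whoami_host.c".
 */
#include <stdio.h>
#include <sys/types.h>
#include <unistd.h>     /* gethostname(), getuid() - POSIX */
#include <pwd.h>        /* getpwuid() - POSIX */

int main(void)
{
    char host[256];
    struct passwd *pw = getpwuid(getuid());

    if (gethostname(host, sizeof(host)) != 0) {
        perror("gethostname");
        return 1;
    }
    host[sizeof(host) - 1] = '\0';

    printf("%s@%s\n", pw ? pw->pw_name : "unknown", host);
    return 0;
}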
“As a web developer I’m sick of having to make my web applications work in all web browsers, So I think we need a single standard web browser.”
Another wrong comparison.
What we need is that all web browsers are compatible with each other. In other words: they should all support the W3C standards correctly. Are you against supporting W3C standards?
“I hate how web pages all have different navigation menus, we should have a single standard template for all websites.”
Strawman argument. Different website templates don’t cause incompatibilities.
So far all you’ve done is use strawman arguments.
“So far all you’ve done is use strawman arguments.”
I believe jessta was being sarcastic.
Pardon my ignorance, but I have to disagree.
Firstly, how will you enforce the standardization? While it may be possible to bring some major Linux distributions together to support a common standard (a.k.a. the LSB, see the recent article), it is not possible to forbid distro developers from straying from the LSB standard while still calling their product a Linux distro. So yes, the situation is pretty much like what the OP tried to describe.
Secondly, I am myself an open source (co)developer (although at the moment I don’t have enough time to pursue most of my projects at a non-glacial pace 🙁 ), and my practice for distributing my software is to
– depend on libraries/applications that are common among major distributions and have “civil” licenses, so that they can be included even in strictly FSF/free distributions
– provide binaries for the operating systems (i.e. Linux and BSD distributions, in my case) that I happen to run on my own computers
– provide the source tarball, so that distribution managers / volunteers can roll their own version
While it would definitely make things more convenient (although not necessarily simpler overall) for developers if one true standard were established, this is simply not the way our development system works. And – counterintuitively, I have to admit – the more standardization efforts like the crowds at freedesktop.org or the Portland group succeed, the more heterogeneous the whole ecosystem grows. I can use KDE applications within my XFCE desktop that use the system tray, no matter what UNIX-based operating system powers the computer. I can copy & paste between apps that use different toolkits, again because of standards like freedesktop.org. But not all libraries/frameworks have synchronized release schedules across hardware and software architectures, so a “vanilla” binary is IMHO not very realistic.
Linux is a family of OSes; a certain distribution is an OS. And sooner or later, the kernel should make less and less difference between different (UNIX-based) OSes, because most people will run KDE/GNOME/XFCE/Fluxbox/… on top of them.
And we will not have the ability to restrict them to certain choices. And, honestly, we shouldn’t do it, even if we had the ability.
Just my opinion
I’m not answering the question of how to enforce standardization. I’m just saying things should be standardized and compatible. I’m fighting against the mentality that Linux distros should not be compatible with each other.
People also seem to confuse “standardization” with “the same implementation for everyone”. That is not what I’m saying. I want the interfaces to be compatible so that it doesn’t matter what implementation you use. But frankly, that isn’t the case – witness the fact that you can’t easily compile Inkscape on Fedora and run the same binary on SuSE and Ubuntu, without many tricks.
// I want the interfaces to be compatible so that it doesn’t matter what implementation you use. But frankly, that isn’t the case – witness the fact that you can’t easily compile Inkscape on Fedora and run the same binary on SuSE and Ubuntu, without many tricks.//
Sigh! You don’t need the binary to be compatible, you need the source to be compatible.
Take the same source and compile it once on a Debian-based system and make a .deb package, and then compile the same source again on a Fedora/SuSe/Mandriva system (all systems LSB compliant) and make an RPM package.
Those two binary packages should be enough to cover most Linux distributions.
“Sigh! You don’t need the binary to be compatible, you need the source to be compatible”
No, you do need the binary to be compatible. Do you really think the average user knows how to, or even wants to compile software? They don’t care what binary compatibility is, they want their software installed in the quickest way possible. You are just searching for excuses to ignore this very serious usability problem.
They don’t care what binary compatibility is, they want their software installed in the quickest way possible. You are just searching for excuses to ignore this very serious usability problem.
What usability problem? Do you even know what you’re talking about?
Users want their software installed in the quickest way possible, which is why they’ll fire up the Add/Remove Programs app, check the little box next to the software they want to add, then click on the “Install” button. How much more freakin’ simple can it be?
“What usability problem? Do you even know what you’re talking about?”
Read this use case: http://osnews.com/permalink.php?news_id=17023&comment_id=204409
I read it, but I don’t fully agree with it.
For one, Klaas’ friend should have told him that the version he had installed could not read .mkv yet, and that he would have to wait until it’s ready.
His friend should also have installed a popular distro, for which packages would be available quickly.
Next, the Totem team could have provided packages for the most popular distros, or a standalone installer.
In any case, Klaas shouldn’t have to compile anything.
What if .mkv support is only available in some codec package that is not supported on that distro?
Should Klaas reinstall the whole OS to get a simple app? What if later he needs another package that is only supported on a different “popular distro”?
No distro has packages for every Linux program. Besides, limiting the landscape to only popular distros is not what Linux was meant for.
If the packages are popular, then chances are that they *will* be available for popular distros. This situation is only problematic for obscure/highly-specialized programs.
E.g. for a highly-popular distro like Ubuntu you’d be hard-pressed to find actual examples to support your hypothetical cases.
//No, you do need the binary to be compatible.//
No, you do not need the binary to be compatible, because it is fairly trivial to make a number of different binary packages from the one source.
//Do you really think the average user knows how to, or even wants to compile software?//
I don’t say that users should compile software; I say that the vendors should compile the software – a couple of times, for the main variants of distros.
//They don’t care what binary compatibility is, they want their software installed in the quickest way possible.//
Yes. That way, for Linux, is to find a package compatible with your distro. There are only a very few different types.
//You are just searching for excuses to ignore this very serious usability problem.//
You are ignoring that it is not a problem.
Vendors can easily make a binary-compatible cross-distribution Linux application (such as this one, for example: http://www.softmaker.com/english/ofl_en.htm), or at the very worst make a few different binary packages from the same source.
SoftMaker is a special case because they use their own toolkit – no GTK+ or Qt.
I’m not answering the question of how to enforce standardization. I’m just saying things should be standardized and compatible.
But how? Don’t get me wrong: if you can come up with a way to enforce compatibility between distributions without sacrificing the flexibility of a modular, decentralized development model, I would be more than willing to hear your suggestion and aid in implementing it.
I’m fighting against the mentality that Linux distros should not be compatible with each other.
Well, nobody says that distributions should be incompatible. But without deciding which versions of Fedora or OpenSUSE should be compatible (both release rather often, without a synchronized schedule, so what amount of compatibility is achievable between two targets moving at different speeds and not necessarily in the same direction?), I don’t see a practical way to ensure cross-platform binary compatibility. In particular, should a binary compiled on, say, an OpenSUSE 10.1 system be compatible with a Fedora Core 6 system? What about OpenSUSE 10.0 and FC4? What about Debian Sarge?
That’s a big part of the problem!
Most of the time, binaries don’t work across distributions because of incompatible versions of the libs/compilers used, and because of non-standard patches/locations/modifications. While the latter could probably be attacked using a standard à la LSB, the problem of different (incompatible) versions is much harder to tackle.
Either you force the developers of the libs/tools to freeze their API/ABI so that a certain amount of compatibility is possible even across version numbers (but this is not a problem within the scope of the Linux distros; it is an issue one should address with the lib developers), or you force distro makers to provide certain versions of crucial libs/tools. Again, this is a target of the LSB standard, but it is (again) not mandatory for distros to stick to it.
(side remark, my two cents on a related topic brought up by another poster in this thread):
This is only my personal opinion, but static linking against a system-wide lib is definitely NOT good practice for developers of open source applications, especially for security reasons. Don’t do it if it is not really, really necessary!
If a security alert comes out for lib xyz, chances are good that you (as a Joe User type of end user) will receive a patched version over the usual update channel (package management, announcement, etc.). Chances are also good that the typical user will have no idea that his statically linked, third-party, non-repository app is also in need of an update/recompile. Speaking of security holes…
Don’t get me wrong: if you can come up with a way to enforce compatibility between distributions without sacrificing the flexibility of a modular, decentralized development model, I would be more than willing to hear your suggestion and aid in implementing it.
There is no way to enforce something in the sense of making it legally binding, but, for example, being LSB compatible does not mean sacrificing flexibility, since the LSB only describes how a system should look from the point of view of an LSB-compatible application, not how it ought to look to applications provided by the distribution itself.
Of course it requires less effort to be LSB compatible across the board, so distributions with long release cycles (“enterprise” editions) might adjust their base system to the LSB requirements, while fast-releasing distributions can implement the LSB requirements more in the sense of a compatibility layer.
Again, this is a target of the LSB standard, but it is (again) not mandatory for distros to stick to it.
Quite true; however, LSB compatibility is becoming increasingly important for distributions and is usually an official goal, even for community-driven ones like Debian.
I think this goal has become more popular recently, as the LSB began including desktop-related things; earlier versions of the LSB concentrated on server matters and were thus mainly interesting for “enterprise” distributions, which tend to prefer direct ISV certification over certifying for standards compatibility.
As an open source developer, I’m sick of not being able to provide binaries that ‘just work’ for most users. I don’t want to create a RedHat RPM, Mandrake RPM, SuSE RPM, Debian deb, Ubuntu deb, Slackware TGZ, etc etc just to make it easier for my users! I want to create one package which is easily installable by all my users!
Then use a stand-alone installer! Sheesh…
By the way, the binary will work on all these distros (as long as it’s on the same type of processor), it’s the linking to libraries that will vary from one distro to another. So why don’t you take your little app and provide a standalone, statically-linked installer version on your web site, and let the distro makers make the distro packages – which is something *they* are supposed to do anyway…
The fact that you seem to be unaware of this makes me wonder how real your claim to be an FOSS developer really is…what software have you produced, exactly?
“Then use a stand-alone installer! Sheesh…”
Hey, if it’s that easy I wouldn’t have bothered typing all this! See below for more info.
“By the way, the binary will work on all these distros (as long as it’s on the same type of processor), it’s the linking to libraries that will vary from one distro to another. So why don’t you take your little app and provide a standalone, statically-linked installer version on your web site,”
Let me give you some numbers. Here’s an example WxWidgets C++ app. All binaries are stripped of debugging symbols.
– Source code: 136 KB
– Dynamically linked Windows binary with WxWidgets DLLs included: 2.5 MB
– Dynamically linked Linux binary: 104 KB
– Statically linked Linux binary: 35 MB
Do you see the problem? It’s already bad enough that I have to include WxWidgets on Windows. On Linux (if I statically link the app) I not only have to include WxWidgets, but also an entire copy of glibc, libstdc++, pthreads, GTK and Xlib!
“and let the distro makers make the distro packages – which is something *they* are supposed to do anyway… The fact that you seem to be unaware of this makes me wonder how real your claim to be an FOSS developer really is…
I am well aware of that, but I don’t agree with it. Did you somehow miss the recent OSNews article on Linux software installation? Read http://osnews.com/story.php/16956/Decentralised-Installation-System… and see what’s wrong with current centralized installation systems.
Here’s a simple use case for you:
Klaas is not very technical – he doesn’t know Unix commands and never used DOS on Windows. But he’s competent enough to be able to install programs on Windows with no problems. Furthermore, Klaas is an anime fan and downloads anime fansubs (video files) from the Internet very regularly.
Klaas’s computer has Windows XP and is riddled with viruses. He calls his friend Joe Geek. Joe installs Ubuntu 6.10 for his friend Klaas. Joe tells Klaas that he will never get viruses again. Klaas happily uses Linux. One day he downloads a .mkv video file and wants to play it. He finds out that the Totem video player (version 1.2.3) doesn’t support it. He goes to the Totem website to see whether there is a new version which supports it – and there is! Version 4.5.6 was just released. He opens Synaptic (as Joe told him earlier) and looks for Totem. But too bad – the latest version in the repository is 1.2.3. Klaas downloads the Totem source code (even though he doesn’t know what source code is) but can’t figure out how to install Totem. After 10 minutes, he reads the INSTALL file, but still can’t figure it out.
As time passes, more and more fansub groups release anime as .mkv, but Klaas can’t watch them. For some reason Totem 4.5.6 still isn’t in the repository after a month. Disgusted with Linux, he asks Joe to either upgrade Totem for him, or remove Linux and install Windows.
This is a very common problem. If you look at Linux forums you’ll see that things like this are among the top 10 of Linux questions. If you read Slashdot and OSNews you’ll see that software installation is among the top 10 complaints about Linux.
“what software have you produced, exactly?”
I have contributed some things to GNOME. One of my more recent projects is Autopackage, a cross-distribution software installation framework. Our goal is to make software installation in Linux “just work” and as easy as possible. In the past 4 years, we’ve encountered many, many problems regarding binary compatibility and desktop integration on Linux.
Let us use Inkscape as an example. Inkscape, a well-known high-profile program, is written in Gtkmm and C++, and uses the Boehm conservative garbage collector. Read the Autopackage mailing list archives to see all the problems that Inkscape has had with providing cross-distribution binaries. It’s an uphill battle.
As time passes, more and more fansub groups release anime as .mkv, but Klaas can’t watch them. For some reason Totem 4.5.6 still isn’t in the repository after a month.
Well, that’s the crux of the problem, isn’t it? The key is to use a distribution that is up-to-date (yet stable). The onus is on the distro makers to make sure that up-to-date packages are available for their distro.
I have contributed some things to GNOME. One of my more recent projects is Autopackage, a cross-distribution software installation framework.
All right, my apologies. I have used Autopackage and it is in fact an impressive piece of work. Sorry for doubting you.
I understand that the package manager way can be frustrating, and that statically-linked universal installers make for big downloads…but the fact of the matter is that for the vast majority of users these issues are not a priority.
If you look at Linux forums you’ll see that things like this are among the top 10 of Linux questions. If you read Slashdot and OSNews you’ll see that software installation is among the top 10 complaints about Linux.
The fact that software installation is often criticized doesn’t mean much by itself. Some complain that it is different from Windows. Well, yes it is, but that’s not a flaw in itself. As I said, I’ve never heard anyone say that this was a reason they wouldn’t switch to Linux – in that sense, availability of key apps and hardware compatibility are much more pressing issues.
I apologize too. I sounded very frustrated, but that’s because I’ve been typing things like this for the 300th time.
I understand. It’s true that you guys are fighting an uphill battle… perhaps some of your effort should go into communicating with the new Linux Foundation to get some help from them? After all, as commendable as your efforts are, unless it becomes some sort of standard, Autopackage is just another way of doing things. This is the hard part of trying to solve this particular issue – if you do it on your own, without participating in the current consolidation efforts, in some sense you’re part of the problem, not the solution! I strongly urge you to approach the new LF once it has been incorporated, and see what you can do to make Autopackage the “official” alternate method for standalone application installations.
> I understand that the package manager way can be frustrating, and that statically-linked universal installers make for big downloads…but the fact of the matter is that for the vast majority of users these issues are not a priority.
For the vast majority of the *current* Linux user base they are not critical. For everyone else – well, at least in my experience they come right after hardware support.
> As I said, I’ve never heard anyone say that this was a reason they wouldn’t switch to Linux – in that sense, availability of key apps and hardware compatibility are much more pressing issues.
This is closely connected: not only do installation/compatibility problems limit the availability of OSS apps in most scenarios, but ISVs won’t come until the installation/binary-compatibility problem is solved. Currently their barriers to entry are too high compared to the potential gains from Linux’s market share, especially on the desktop.
This is closely connected: not only do installation/compatibility problems limit the availability of OSS apps in most scenarios, but ISVs won’t come until the installation/binary-compatibility problem is solved. Currently their barriers to entry are too high compared to the potential gains from Linux’s market share, especially on the desktop.
I disagree. It’s certainly not an obstacle for Google, Openoffice, Adobe or Codeweavers, who have all provided universal, statically-linked binary installers.
I think the small market share vs. the cost of porting popular applications from Windows to Linux is a *much* bigger hurdle for ISVs than competing package managers…
The package manager format is the smallest of the issues that ISVs have to face.
Take a look at the Autopackage dev list to see that it’s only the tip of the iceberg.
Others are:
– lax binary compatibility between distros (down to the glibc level)
– lack of a standardized platform (a multitude of frameworks, incompatible/missing libraries), which virtually forces you to make bloated, statically linked binaries that integrate poorly with the rest of the desktop
– immature or non-existent system-wide services/APIs that are present on Windows/Mac, such as colour management, document embedding (that works everywhere), etc.
– lack of standardized APIs for augmenting the system: from simply installing a menu shortcut (that, at least, is being addressed) to adding things like new control-center icons or fake printer drivers
– inability to add third-party kernel drivers for specialized devices
Besides, the lack of a central resource and guidelines for devs hurts smaller developers, who would otherwise have more incentive to support Linux in order to differentiate themselves from the gorillas.
Why don’t you release your binary as LSB packages? They will work on KDE/Ubuntu.
//Why don’t you release your binary as LSB packages? They will work on KDE/Ubuntu.//
Exactly.
Here is the comment of one dev for Linux on this topic:
http://www.groklaw.net/comment.php?mode=display&sid=200701220535195…
“I am a developer for the Gnu/Linux environment. I program to the glibc APIs, which are in all distros by default, and use toolkits like gtkmm, which are optional for some distros but which can be chosen by the user for any distro. Consequently, any distro can run my stuff. This is true for most application programmers in the Gnu/Linux environment. We do not have to ‘tweak’ anything.”
There you have it. “Any distro can run my stuff”.
This whole topic is a storm in a teacup.
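For what it’s worth, here is a reduced sketch of the kind of program that developer describes: plain C against glibc and GTK+ (the C library underneath gtkmm), with nothing distro-specific in it. It is only an illustration, not code from the linked comment:

/* A sketch of a "glibc plus a common toolkit" application:
 * plain C, GTK+ 2.x, no distro-specific calls. Build (assuming
 * pkg-config and the GTK+ dev package are installed):
 *   gcc -o hello hello.c $(pkg-config --cflags --libs gtk+-2.0)
 */
#include <gtk/gtk.h>

int main(int argc, char *argv[])
{
    GtkWidget *window, *label;

    gtk_init(&argc, &argv);                       /* connect to the display */

    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_window_set_title(GTK_WINDOW(window), "Hello");
    label = gtk_label_new("The same source builds on any distro.");
    gtk_container_add(GTK_CONTAINER(window), label);

    /* quit the main loop when the window is closed */
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}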
“And that attitude is exactly the problem.
Linux should be a standardized platform! The more you see Linux as “just a collection of stuff which allows you to create your own OS”, the harder it will be to create software that “just works” on all (or at least most) Linux distros, and the more it will scare off third party software vendors.
As an open source developer, I’m sick of not being able to provide binaries that ‘just work’ for most users.”
This is simply not going to happen on Linux, just as you’re not going to get websites to follow consistent UI guidelines.
It’s like the Second Law of Thermodynamics. Once you start with disorder in a system, it is much harder to bring it to a state of greater order and much easier to bring it to a state of further disorder. The objective would be to keep the system the same, stasis, or come up with a new system.
The whole point is there is no such thing as ‘Linux OS’. Linux is just a kernel.
You’ve lost this argument. For the sake of simplicity, any platform based on Linux is called Linux. If it looks like Linux and it isn’t, then it had better act like Linux, because, well, that’s just the way things are moving.
> Linux is just a kernel. You add what you need to make
> an OS.
I think you are a bit confused here. You argue that Linux should not use a unified package management system, yet you argue that Linux should use a unified file system model, or a unified process model.
To support this claim, you essentially state that package management occurs in user mode, while process management and file system management happen in kernel mode. By relying on such implementation details, you make an arbitrary choice about what should be standardized and what shouldn’t.
Wouldn’t it be better to concentrate on the things that are important to third-party application developers? Standardized file formats (including configuration files) and APIs, high-level (semantic) APIs, …
…in an age where 95% or higher of the computing populace runs a variety of compatible hardware from God knows how many different vendors, a variety of compatible software from God knows how many different vendors is somehow a problem?
I agree with unclefester: Sick and tired of hearing this bull.
“a variety of compatible software from God knows how many different vendors is somehow a problem?”
They aren’t compatible, which is the whole problem! Compile Inkscape on Fedora and see whether that same binary works on SuSE and Ubuntu.
They aren’t compatible, which is the whole problem! Compile Inkscape on Fedora and see whether that same binary works on SuSE and Ubuntu
OK, then, PC’s aren’t compatible, either. Try using an Nvidia driver with an ATI graphics card – on Windows – for example.
“OK, then, PC’s aren’t compatible, either. Try using an Nvidia driver with an ATI graphics card – on Windows – for example.”
Users don’t expect to be able to use ATI drivers with NVidia cards. But why are you even comparing binaries with video cards? On Windows, a developer can compile a binary on Windows XP and it’ll work pretty much everywhere. Why should Linux be any different? Give me a good reason why a user shouldn’t expect that Linux binaries work on most Linux distros, just like Windows binaries work on most Windows systems.
Users don’t expect to be able to use ATI drivers with NVidia cards. But why are you even comparing binaries with video cards? On Windows, a developer can compile a binary on Windows XP and it’ll work pretty much everywhere. Why should Linux be any different? Give me a good reason why a user shouldn’t expect that Linux binaries work on most Linux distros, just like Windows binaries work on most Windows systems.
Well given that all Windows systems are produced by one developer, I should hope so. But that’s just the point: Windows lacks the flexibility to be used by everyone from command-line loving, installation-tweaking geeks to secretaries to GUI-loving administrators – which is why everyone who installs Windows has so much extraneous baggage. The different package managers are targeted towards the different markets that Linux is suitable for; this is no more wrong than not installing g++ and lint on a system that’s only ever going to be used to run amaroK and OpenOffice.
Most binaries /do/ work on most Linux distros – with the distribution’s package manager. Even proprietary software like VMware runs on relatively obscure distros like Gentoo.
//They aren’t compatible, which is the whole problem! Compile Inkscape on Fedora and see whether that same binary works on SuSE and Ubuntu.//
They aren’t necessarily binary compatible. They are quite likely to be.
If you have the source for Inkscape (or whatever program) for a Linux target platform, then of course you do not need the binary to be compatible. Instead of compiling it on a Fedora install, compile it on your own distribution.
“If you have the source for Inkscape (or whatever program) for a Linux target platform, then of course you do not need the binary to be compatible. Instead of compiling it on a Fedora install, compile it on your own distribution.”
And that is exactly the mentality which keeps Linux from succeeding on the desktop. “Oh we don’t need binaries because everybody can recompile!”. Yeah, try explaining to my grandmother how she’s supposed to compile software. As soon as you mention the word “compile”, you’ve already lost her.
Really, thousands and thousands of people on OSNews and Slashdot have complained about this for years now, yet people like you are still ignoring the problems.
And that is exactly the mentality which keeps Linux from succeeding on the desktop. “Oh we don’t need binaries because everybody can recompile!”.
What the heck are you talking about? Just use Ubuntu, for Pete’s sake: all of these apps are available. I never compile apps for my Ubuntu laptop, unless I want to try some beta software that hasn’t been released yet (and that no grandma should ever run, being unstable).
The myth that you *have* to compile stuff in Linux is completely false. Shame on you for perpetuating that FUD.
Also, you should know that the binaries are compatible from one distro to the other; however, the libraries aren’t always in the same place or don’t have the same name. That is the main source of incompatibility, not binary incompatibility, which means that ISVs are free to make “universal installers” if they want to.
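One common workaround for the “same library, different name or location” problem is to load the library at runtime and try several candidate sonames instead of hard-linking against one. A minimal sketch – the library name and soname list below are purely illustrative assumptions:

/* A sketch of coping with libraries whose names differ across distros:
 * try a list of candidate sonames at runtime instead of hard-linking
 * against one. The names below are illustrative only.
 * Build: gcc -o loadlib loadlib.c -ldl
 */
#include <stdio.h>
#include <dlfcn.h>

static void *open_first(const char *const names[])
{
    int i;
    for (i = 0; names[i] != NULL; i++) {
        void *handle = dlopen(names[i], RTLD_NOW);
        if (handle) {
            printf("loaded %s\n", names[i]);
            return handle;
        }
        fprintf(stderr, "failed %s: %s\n", names[i], dlerror());
    }
    return NULL;
}

int main(void)
{
    /* different distros may ship different soname versions */
    static const char *const candidates[] = {
        "libexample.so.2", "libexample.so.1", "libexample.so", NULL
    };

    void *lib = open_first(candidates);
    if (!lib) {
        fprintf(stderr, "no usable libexample found\n");
        return 1;
    }
    /* ... look up symbols with dlsym() here ... */
    dlclose(lib);
    return 0;
}

dlopen()/dlsym() trade compile-time type checking for runtime flexibility, which is why ISVs tend to reserve this for one or two troublesome libraries rather than the whole dependency tree.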
Again, this has nothing to do with Linux adoption. I’ve never heard anyone say “I’d try Linux, but I’m concerned about cross-distro compatibility.” Never.
“What the heck are you talking about? Just use Ubuntu, for Pete’s sake:”
So you’re saying “Everybody must use Ubuntu. If you don’t, then you deserve to die.” Do you honestly not see the problem?
Sure, it’s easy for you to say that, but as a developer it’s my moral duty to make installation as easy as possible for all my users, including the non-Ubuntu users.
So you’re saying “Everybody must use Ubuntu. If you don’t, then you deserve to die.” Do you honestly not see the problem?
…or use Gentoo, or use a distribution that has up-to-date packages, like Cooker.
Is this an ideal solution? No, it isn’t. But it’s not the catastrophe that some make it out to be.
That said, as a developer of an alternate way to package apps, I can understand how frustrating it is to try to keep up. Still, I do think there are more pressing issues with Linux right now.
Sure, it’s easy for you to say that, but as a developer it’s my moral duty to make installation as easy as possible for all my users, including the non-Ubuntu users.
The problem is that you’re trying to do it as an afterthought, i.e. trying to provide installers for programs that weren’t meant to be installed in a stand-alone way. While this is commendable, it’s an uphill battle, as you say.
A distro-neutral solution such as Autopackage should not seek to replace package managers, because that simply cannot be done (and shouldn’t be done). It should be seen as an alternative way to install commercial/stand-alone apps, and in a way I think that’s what you’re aiming for. It seems to me the *developers* should be the ones considering these solutions as early as possible in the development process.
As for providing packages, it’s up to the distros to do this, but of course there’s nothing preventing developers from giving a helping hand.
“So you’re saying “Everybody must use Ubuntu. If you don’t, then you deserve to die.” Do you honestly not see the problem?”
I want to add something about this. Obviously, I don’t want everyone to use the same distro. However, the truth is that there are two ways in which standardization can happen.
The first is through negotiations between all involved parties to set forth common ways of doing things. That’s the preferable way, but for some reason (egos, the NIH syndrome, honest divergences of opinion) it doesn’t always work.
The second way is for a particular way of doing things to become so prevalent that others start adopting it because they don’t want to become marginalized. It could very well be, for example, that Ubuntu becomes so popular that many developers only provide .tar.gz and Ubuntu debs (that is already happening in some cases, sometimes with RedHat rpms as well).
The fact is that if Ubuntu continues to enjoy the growth it’s currently experiencing among Linux users, Ubuntu debs could very well become the de facto standard (in addition to tarballs, of course). I’m not saying this is a good thing, but it is certainly a possibility…
Anyway, sorry again for snapping at you. I stopped smoking yesterday and I think I’m a little on edge! 🙂
All in all I think this is a healthy debate to have – though it is somewhat off-topic for this particular article.
Ubuntu doesn’t have much presence in the enterprise, where the money is.
Still, limiting support to two branches (say, Ubuntu & SUSE) would be a kind of solution.
For developers, sure…but remember that ultimately it’s up to distro makers to make sure the packages are available (and up-to-date) for their distros.
In any case, we’re already seeing the emergence of a kind of “de facto” standard in ISVs supporting SuSE and Ubuntu. That said, it doesn’t mean that a debate about the future of software installation isn’t relevant, or needed.
Ok, I can name a few I’ve run into:
libsidplay2 (compiles incorrectly, I submitted a patch I found on launchpad that fixes the issue)
xsidplay (doesn’t support libsidplay2, I recompiled it to do that and gave instructions on launchpad)
filelight (crashes on close. Version 1.0, which fixes that issue, was released in August; Feisty Fawn still has the previous buggy version)
Now, these are niche applications, the sidplay ones especially so… but these are known bugs with known fixes and nothing’s been done, really.
If there were a single cross-distro-compatible build system, maybe the problem could be settled (as the Xfce installer tried to do), but in many cases compiling a non-trivial software package (in generic tgz form) is quite involved even for a seasoned Unix user – much more so than the advertised ./configure; make install:
– you have to figure out all the build dependencies, each with the correct version needed for your package; if they are not reported by configure, prepare for a long search in obscure places
– you need all the runtime dependencies to run it, and have to figure those out too (source and binary package names don’t map perfectly)
– thank god if they are all packaged for your distro; if not, you have to start over recursively (that alone disqualifies user-executed source installs, IMO)
– the compiled executable is not known to your packaging system but sits in /usr/local; there is no easy way to get rid of it after you delete the compiled source, and upgrading is a hurdle (you have to manually recompile all dependent parts) – RedHat charges for RHN for a reason
– the autotools/libtool version installed on your system may be incompatible with the one the package requires, and configure bails out with an obscure message
– some packages don’t like others’ dev headers sitting in /usr/local and bail out during the configure stage
This all used to work in the days of simple, standalone, KISS Unix command-line utilities, but we now live in an era of multi-million-LOC frameworks.
you have to figure out all the build dependencies, each with the correct version needed for your package; if they are not reported by configure, prepare for a long search in obscure places
This to me is more often than not the biggest issue with compiling software. It would be great if developers at least indicated which dev packages are necessary in the tarball’s README. I know that the dev packages don’t all have the same names across distros, but *any* indication would help…
the compiled executable is not known to your packaging system but sits in /usr/local; there is no easy way to get rid of it after you delete the compiled source, and upgrading is a hurdle (you have to manually recompile all dependent parts) – RedHat charges for RHN for a reason
That’s not really an issue with Checkinstall. I highly recommend you give it a try, as it basically solves all the issues you’ve indicated.
As ArcheeSteel said, there are many binary compatibility issues on Linux, mostly stemming from a lack of interest in the issue within the core developer community (going as deep down as gcc and glibc).
You have to be aware of many undocumented, obscure low-level issues to develop a reasonably portable binary without resorting to static linking.
Actually, it often does. You usually just have to know how to get the files out of the package without the proper package manager.
With KDE applications I’m not sure (C++ has a nightmarish ABI), but with GTK+ programs it’s usually not a big deal to build once for an arch and run everywhere.
Low level applications that work with kernel interfaces are a whole different story though.
There was an article last week about making a standard update API that all of the distros can plug into:
http://www.linux.com/article.pl?sid=07/01/10/2045258
But a lot of the problem with installing across many distros, for some projects, is a lack of discipline in adding dependencies to their software. If a developer is careful enough, he can avoid unnecessary dependencies, and write portable code that avoids cross-distro incompatibilities.
And often, it might be advisable to link statically with one or more libraries, rather than with a shared lib. It will make the executable slightly larger, but that might be a small price to pay for the relief of headaches. This can be especially true for C++ libs, where the C++ ABI can change between versions.
We need to emphasise the use of open standards rather than standardising platforms.
Thousands of different models of cars, motorcycles, trucks and buses from numerous manufacturers coexist on our roads. That is because transport is an ‘open standard’ with well defined protocols.
However many people think that we should all drive identical white Honda Civics because it would be simpler for mechanics and motor repairers to stock parts and repair them.
The ruthless Darwinian world of Open Source means that only the best survives. Go to sourceforge and see the thousands of stagnant projects that have been sidestepped or forked.
I wouldn’t be surprised if the Linux kernel goes the same way as Hurd in a few years, and it would be no tragedy if something better replaced it.
I have recently entered the post-Linux world of FreeBSD without any real dramas after 2 years full time on Linux. Prior to that I was an XP, 2K, Win 98, OS 7.1 – OS 8.6, DOS, Commodore 64 and PDP-11 user.
The world constantly changes and we need to change with it.
+5!
” The world constantly changes and we need to change with it. ”
Amiga. Er, I mean “Amen”!
“We need to emphasise the use of open standards rather than standardising platforms.”
Aren’t they the same thing?
Not really. A standardized platform is a standardized implementation. Windows or OS X are standardized platforms, for example. An open standard is a standard interface, presumably supported by a number of different implementations.
That said, at some point you do have to standardize the platform. If you encode everything that an application might depend on into a standardized interface, you just end up with a big, ugly, complex standard. In practice, there is a de-facto standardization of implementations. While X11 is an open standard, basically every major desktop distribution uses X.org’s implementation. Anybody could come along and implement GTK+’s API, but in practice everyone uses GTK+. Sitting down and saying “you shall use X.org, GTK+, Cairo, blah, and you should put them in these locations” doesn’t really reduce the flexibility of distributions in any practical way. Who really cares what the GTK+ shared library is called, after all?
Of course, there is no way to enforce standardization between distributions. What might be a more doable approach, and one that would be almost as useful for developers, would be to synchronize the schedules of the major APIs. You should be able to point to a “Linux Platform 1.x, which is guaranteed to have GTK+ 2.8+, Cairo 1.2+, etc”. Even if the package formats and library configurations aren’t standardized, having a stable target for all your major APIs would be tremendously useful.
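As a sketch of how an application could verify such a baseline at startup – keeping in mind that “Linux Platform 1.x” is the poster’s hypothetical, not an existing standard, while the version-check calls themselves are real GTK+/Cairo APIs:

/* A sketch of checking a hypothetical "Linux Platform 1.x" baseline
 * (GTK+ >= 2.8, Cairo >= 1.2) at startup. The platform is hypothetical;
 * the version checks use existing GTK+/Cairo functions.
 * Build: gcc -o baseline baseline.c $(pkg-config --cflags --libs gtk+-2.0)
 */
#include <stdio.h>
#include <gtk/gtk.h>
#include <cairo.h>

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);

    /* gtk_check_version() returns NULL if the running GTK+ is compatible
     * with the requested version, otherwise a human-readable message. */
    const gchar *mismatch = gtk_check_version(2, 8, 0);
    if (mismatch != NULL) {
        fprintf(stderr, "GTK+ baseline not met: %s\n", mismatch);
        return 1;
    }

    /* cairo_version() returns the linked Cairo version as an integer
     * encoded the same way as CAIRO_VERSION_ENCODE(). */
    if (cairo_version() < CAIRO_VERSION_ENCODE(1, 2, 0)) {
        fprintf(stderr, "Cairo baseline not met: running %s\n",
                cairo_version_string());
        return 1;
    }

    printf("platform baseline satisfied (GTK+ %u.%u.%u, Cairo %s)\n",
           gtk_major_version, gtk_minor_version, gtk_micro_version,
           cairo_version_string());
    return 0;
}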
You should be able to point to a “Linux Platform 1.x, which is guaranteed to have GTK+ 2.8+, Cairo 1.2+, etc”. Even if the package formats and library configurations aren’t standardized, having a stable target for all your major APIs would be tremendously useful.
Yes, I agree, this would be a good step forward. And again, this doesn’t need to be enforced; rather, it should be a certification (like an ISO certification), i.e. a particular distro could be said to be compatible with “Linux Platform 1.0″…
If the standardization has something behind it, it’ll bring in distributors. You can try it politically, and I’d guess you’ll get mixed results. Instead, I think the best way to get distributors to abide by standards is to provide a central package that works on any distribution following your standard.
You might work from an existing attempt (autopackage) or you might start over. But you have to:
1.) Make it easy to build packages so that developers will use it.
2.) Make sure it works if the distributors follow your well documented standard.
“The ruthless Darwinian world of Open Source means that only the best survives. Go to sourceforge and see the thousands of stagnant projects that have been sidestepped or forked.
I wouldn’t be surprised if the Linux kernel goes the same way as Hurd in a few years, and it would be no tragedy if something better replaced it.
I have recently entered the post-Linux world of FreeBSD without any real dramas after 2 years full time on Linux. Prior to that I was an XP, 2K, Win 98, OS 7.1 – OS 8.6, DOS, Commodore 64 and PDP-11 user.”
I think you’re right about the Darwinian enviornment, and it would certainly be interesting to see Linux replaced by something better.
(as opposed to Linux BECOMING something better?)
But FreeBSD? Not that it’s not a great OS in its own right, but what has it got that promises to send it beyond Linux?
Will they be handing out new ferrari laptops?
“I’m sick of hearing that ‘Linux’ should have standardised package management, click and install software or be kinder to newbies.”
Then what the heck is the group for? Why compete with Windows with “just a kernel?”
Windows is a (sort of) complete experience and geared toward “newbie” users. Why bother forming a group to compete with the “NT Kernel?”
Is all I have to say to the Linux Foundation!
Microsoft takes very good care of its developers – so does Apple.
When the Linux Foundation can run a goddamned business and make a profit like Apple and Microsoft, I might pay some attention. These guys come off as panhandlers constantly asking for paypal donations.
Sorry this battle is OVER! Linux has lost and Mac and Windows have won hands down!
When the Linux Foundation can run a goddamned business and make a profit like Apple and Microsoft, I might pay some attention.
I’m not sure you understand what a foundation is…
These guys come off as panhandlers constantly asking for paypal donations.
Who are you talking about, exactly?
Sorry this battle is OVER! Linux has lost and Mac and Windows have won hands down!
Thank you for this utterly useless comment. Please get a clue.
Linux orgs were fragmented, which prevented a unified challenge to MS FUD; we had to compete on merit – unfortunately, Microsoft does not.
I especially like the idea of combining the OSDL and the FSG.
If we look around, we can see such wide variety in everything, and a lack of duplication in everything other than what man has manufactured via industrialization.
No two tulips are exactly the same; the creative mind of a developer is reflected in his OSS and, finally, in a distro.
Notwithstanding these facts, a little sacrifice on the part of the people who release distros – adhering to a common guideline – is necessary for Linux (GNU/Linux) to succeed commercially.
This is also necessary because, if a person is trying to use any distro x, y or z to get a task done, he should not waste his time the way a Linux enthusiast does, experimenting with various permutations and combinations to set up his system. That is, distros, in addition to keeping their individuality (especially in package management), should also open their doors to a common package management system across the entire spectrum of Linux distributions.
People might frown: why two package management systems? The answer is – let the individuality die a slow death. Let the Debians and Ubuntus have their debs, and let the Fedoras and SUSEs have their RPMs. But let them support a common package management system as well.