Red Hat’s Havoc Pennington blogs about the issue of dependency hell and weighs in on the good and the bad of static linking. He also links to an interesting article by Jono Bacon.
Fact is that Linux and its programmers will never, ever replace Microsoft on the desktop because of issues like this and many more. This is why Linux is nothing more than a niche OS with very few users. No matter how Linux zealots try to spin it, Linux as a whole is one giant meshed-together ball of an OS mess.
Isn’t this issue rendered moot by sane package management, with dependency tracking and easy-to-write specs/ebuilds/etc.?
The managers of large monolithic blocks would have to do this internally anyway.
But you normally don’t see it because everything usually comes with everything! You have multiple copies, over and over again, of the same DLLs all over the system. But try to install programs that need some special DLL and don’t come with it, and you have to track down some obscure filename to figure out where to get it from.
As long as there are shared libraries, there are going to be dependency problems. The point of package management is to provide a common way of addressing which programs need what to run. If you start compiling and installing your own things, and you start needing specific libs that aren’t included with your distribution, well… you’re kinda on your own. Hopefully, if you are doing that, you already know how to resolve those types of issues.
Anyway … I have no problem using apt/dpkg, yum/rpm, emerge, whatever to handle these types of things. Except for the occasional error where someone made a package that had incorrect dependencies, I don’t see a problem.
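For what it’s worth, this is roughly all a dependency-aware tool is doing for you (a minimal sketch; the package is just an example):

apt-get update              # refresh the package index from the repositories
apt-get install gnumeric    # apt computes the dependency closure and installs everything needed

The same idea holds for yum, urpmi, or emerge; only the syntax differs.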
What about DLL hell? It’s just as valid as dependency hell. Check out Joel Spolsky’s essay on why MS has lost the API war…
http://www.joelonsoftware.com/articles/APIWar.html
Now, stop trolling and let the grownups talk.
Fact is that Linux and its programmers will never, ever replace Microsoft on the desktop because of issues like this and many more. This is why Linux is nothing more than a niche OS with very few users. No matter how Linux zealots try to spin it, Linux as a whole is one giant meshed-together ball of an OS mess.
Even if you are right (and you may be), the desktop isn’t everything. Embedded devices are growing faster and the server market yields more cash for less headache.
If Linux never manages to grow on the desktop but continues its present rate of growth in the other two markets, both the Linux movement and the companies who support it will be sitting pretty, and MS will be in bad shape.
Of course that’s not any more inevitable than your proposition. A decade ago Unix was supposed to be dead and the Alpha was supposed to be the future.
“Fact is that Linux and its programmers will never, ever replace Microsoft on the desktop because of issues like this and many more. This is why Linux is nothing more than a niche OS with very few users. No matter how Linux zealots try to spin it, Linux as a whole is one giant meshed-together ball of an OS mess.”
You can’t spell D-L-L Hell, can you? Shared libraries are a good thing. Applications could come packaged with their deps, but that’d be silly. The solution has been around for ages:
Ports
portage
pac
apt
yum
Must we solve this problem again and again? Download apt for rpm and let it rest.
If you’re going to troll from my IP range, at least do a good job.
BTW, whoever this “Havoc” (if that’s his REAL name) character is, he needs to stop whining. Open source is all about fixing your own problems. In case this person didn’t notice, the source is available! Feel free to fix it yourself instead of whinging!
“Must we solve this problem again and again? Download apt for rpm and let it rest.”
A lot of problems disappear when discipline is followed by people further up the chain. Package managers and shared libraries can only do so much in the face of a “laissez-faire” attitude.
>whoever this “Havoc” (if that’s his REAL name)
> character is, he needs to stop whining.
And you need to get a clue.
I’ve used Windows and Linux for a while now. The fact is that DLL Hell just doesn’t exist anymore. It hasn’t existed since the days of Windows 95, when programmers were still trying to get their stuff onto the new Win32 architecture. Why is this? Because Microsoft has made sure to keep compatibility. Of course, they sacrifice a lot of progress because of this: they can’t make changes that would break compatibility.

I’m running GNOME on Debian. Installing software is a breeze – if it’s in an apt repository and all the dependencies are there. The fact is that you don’t get dependency problems on Windows now because everything is so unchanging and stable. With Linux, new versions come out all the time and sometimes they break compatibility. Frankly, it doesn’t bother me too much because I know what I’m doing (to a point). I like having the most cutting-edge technology. I like that OSS programmers break compatibility to make things better. I don’t mind using apt as long as everything is there. Of course, the average schmo is going to find this a problem. He’s not going to understand it beyond it not working.
You can’t have every program bring everything with it. That leaves you with two options: have a set of packages that programs can depend on that rarely (and by rarely I mean AT MOST once per year) change and therefore rarely progress, or have package management solve the dependencies that programs have.
Package management works great for me – well, except when I try to install something it can’t solve the dependencies for (like Muine, which is currently broken in Debian unstable). I prefer it to having libraries that don’t change, but the fact remains that most people aren’t cutting-edge types. They don’t want to know that software development is a process. They want to see something and be told that it is finished – like a car or a blender. Package management is better technologically, but doesn’t sit as well with most people.
“No matter how Linux zealots try to spin it, Linux as a whole is one giant meshed-together ball of an OS mess.”
Linux=lego blocks
Some of us like ripping out individual blocks, or groups of blocks, and rebuilding our systems as we please.
And you must be clueless. Ever heard of a thing called URPMI? Works just as well if not better than apt-get or emerge without the 50 hour compile time.
Havoc Pennington is one of the most famous developers in the GNOME world. For example, he is the author of Metacity, the default GNOME window manager. Anyone who wrote “whoever this Havoc is, he needs to stop whining” seriously needs to get a clue.
It is not abuse at all.
> And you must be clueless. Ever heard of a thing called URPMI? Works just as well if not better than apt-get or emerge without the 50 hour compile time.
And you are clueless. apt-get doesn’t compile at all.
.deb… .rpm… FreeBSD packages…
We have these things for a reason.
I usually hate it when I install something and then get hit with all these dependencies required post-install.
And either you do not know how to correctly read a sentence, or you are trying to put words in my mouth.
“or emerge without the 50 hour compile time.”
This part refers to emerge and not apt-get. Please do us all a favor and buy a clue.
Please block IP: —.client.comcast.net. This is just annoying. Surely Havoc is his real name. Whining? Please, give him the respect he deserves for the mountains of code he has written.
OK, you only referred to emerge and not apt-get. My apology. But I think it *could* be read to refer to both, couldn’t it? OK, English is not my native language. Anyway, I am sorry again for misunderstanding.
I’ve used Windows and Linux for a while now. The fact is that DLL Hell just doesn’t exist anymore.
Well, I wouldn’t be ready to say that. I recently had to hunt all over the Internet for the Visual C/C++ 7.1 runtime libraries. OK, I must admit that whoever packaged the program probably didn’t do his job, but it’s not something *completely* over… There are also many libraries provided by Microsoft that probably break backward compatibility, as some developers are including specific versions of some DLLs in the directory of their applications. Basically, it might be over for users, but not for users.
Basically, it might be over for users, but not for users.
Heh… Of course, it should read “it might be over for users but not for developers”.
Eh, some people react defensively to criticism. Frankly, if it were someone else writing the criticism, I might have reacted that way too, but this is his job. As a Red Hat employee, he has to deal with this stuff. Red Hat has customers. They don’t want to be in ‘dependency hell’. Thinking about better ways of doing things with Linux is what Red Hat does. Linux is far from the be-all and end-all of operating systems. It works really well, but there is always room for improvement and always a chance to make it easier for new people.
Even with a good package manager, you still end up with 4-5 GUI libs, 8-9 networking interfaces, 3-4 shells, 5-6 scripting languages…. and each one takes up memory and needs to be loaded individually. Now, if you could do everything with 1-2 libs / interfaces / shells / scripting languages / etc, you’d end up saving memory and app load times. Remember, dep. hell isn’t the only problem here. Does desktop Linux ever feel slightly sluggish? Now you know why.
I totally agree with Nathan O. The problem with Linux is that it is basically too free. There should be a group that actually manages these issues and tries to standardise the basic libraries, like POSIX did for the Unix architecture. IMHO Freedesktop should be in charge of this, since they are already in charge of standardization of the DEs. The Linux kernel is great because most people use Linus’ tree. The Linux environment, on the other hand, is terrible. I know many Linux zealots are going to rant about how this is taking away their freedom, but in order for Linux to go into the mainstream market, we need more standardization of the base Linux environment.
BTW, I really like the way OS X handles dependencies through a mixture of shared libs and static libs.
DLL hell… Such a flaw in logic to bring this up in this discussion. Why? Let me explain.
First of all, as someone else already pointed out, .dll hell doesn’t exist anymore and (though that’s just my experience) never really existed at all (note: FOR ME), and I’ve been using Windows since 3.11.
Second, companies selling Windows software include all the required .dll’s, and if they “forget” one .dll, it’s their fault, and not Microsoft’s or Windows’. In Linux, this is completely different: it’s the mentality that creates dependency problems: everything has to be small, program packages should be as small as possible, and everything progresses so damn fast no one can keep up. This is fundamentally different from the non-existent .dll hell in Windows.
Okay, I’m ready to face the conservative side of the Linux community once again…
All those claiming that apt-get or portage or whatever is the solution: it’s not.
More than once, I’ve seen portage upgrading / downgrading autoconf because one package requires version X and another version Y.
I’ll take Subversion as an example – not because it’s exceptionally broken, but because that’s the one package I’ve been fiddling with the most. While I really like that VCS, it has also given me numerous headaches. There are packages for Red Hat 7.3 that depend on a glibc version that wasn’t around before Red Hat 8.0. If you update your Neon package because of some other app, suddenly your Subversion breaks. I had to set ~x86 in portage to get a halfway up-to-date Subversion because the portage maintainers didn’t keep up with Subversion releases.
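(For anyone wondering what “set ~x86” means, the incantation is roughly this – a sketch using 2004-era portage conventions, and the exact category/package name may differ:

echo "dev-util/subversion ~x86" >> /etc/portage/package.keywords   # accept testing-branch builds for this one package
emerge subversion

The blunter alternative, ACCEPT_KEYWORDS="~x86" in make.conf, puts the whole system on the testing branch, which is usually asking for trouble.)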
And if your flavor of Linux isn’t supported with a Subversion package, you’re out of luck anyway. And so on and so forth.
This is by no means belittling SVN. It just shows that package management is a crude patch over a system that’s just as fundamentally broken as Windows DLL hell.
Hmmm, FreeBSD’s solutions and Debian’s solutions are the best in my opinion. I never liked RPM that much.
Software providers could just bundle all the required software: run a script that checks whether the required software is already installed, installs it if not, and then proceeds to install the target application, thus reducing confusion when installing a lot of applications that refuse to work due to dependencies… It would not be hard to do at all; very easy. I highly doubt a lot of programmers will distribute applications like this, though, especially open source programmers, since most installs are done from source and not packaged up.
I’ve used Windows and Linux for a while now. The fact is that DLL Hell just doesn’t exist anymore.
Windows DLL issues are not the same as dependency issues in Linux. The only commonality they share is that both are shared-library issues, but they are still BOTH very much real-world issues.
In Windows one can still occasionally be confronted with missing or out-of-date DLLs, but more frequently the issue is a lack of central installation. By central installation I mean that it is not uncommon for app A to install a common DLL in one directory, only to have app B install the same DLL in a different directory – app C is later installed and then complains because it can’t find said DLL. Not only is it not uncommon to have multiple copies of the same DLL floating around on the same drive, but one then faces the certainty of garbage accumulating on the drive, with regular serious cleanup as an absolute necessity.
In Linux the problem is on the flip side of the coin… central management and storage of the majority of libraries results in less overall garbage building up, but can quite easily result in dependency hell because applications are not including libraries, or adequate documentation of libraries or dependencies. Linux is best used by people willing to learn a little about computers, but one of its weakest points in many cases is still the lack of good documentation for applications. Even if an app didn’t include the necessary libraries, good documentation of dependencies, maybe including links, with perhaps tips on how to acquire needed libraries using apt-get, urpmi, and yum, would go a long way to smoothing things out.
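To make that concrete, here is the sort of thing such documentation could point users at – a sketch, with a made-up library name, and command availability varies by distro and era:

apt-cache search libfoo          # Debian: search package names and descriptions
urpmf libfoo.so                  # Mandrake: find which package ships a given file
yum provides libfoo.so.1         # yum: ask which package provides a library

Each of these answers the question a README usually leaves open: which package do I actually install?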
Neither Windows nor Linux has perfect library management; it’s as simple as that. I continue to use and/or learn about a dozen different OSes and I have yet to find one that does everything perfectly. Often, though, when the problem is DLLs/dependencies, it’s not the OS that is the problem but specific apps.
Just my $.02
“Even with a good package manager, you still end up with 4-5 GUI libs, 8-9 networking interfaces, 3-4 shells, 5-6 scripting languages…. and each one takes up memory and needs to be loaded individually. Now, if you could do everything with 1-2 libs / interfaces / shells / scripting languages / etc, you’d end up saving memory and app load times. Remember, dep. hell isn’t the only problem here. Does desktop Linux ever feel slightly sluggish? Now you know why.”
Well, first of all, if you’re smart, you just put on your hard drive what you need. You as a user are responsible for what you want, or don’t want, on your machine. No one else.
Second, as I’m certain you’re aware, when you load up an app, “4-5 GUI libs, 8-9 networking interfaces, 3-4 shells, 5-6 scripting languages…” don’t all get loaded. And I’m willing to bet that the majority of people use only a small number of any of those.
PLUS, since the article mentions “shared libraries”, that keeps the whole situation under a degree of control.
[Hon Weng Chong (IP: —.swiftdsl.com.au)]
It’s being standardized where it applies. Or, to be zen about it: the reed bends but does not break in the wind.
[Thom Holwerda (IP: —.quicknet.nl)]
“In Linux, this is completely different: it’s the mentality that creates dependency problems: everything has to be small”
Small is good for maintainability, understandability, and not everyone’s on broadband.
“…and everything progresses so damn fast no one can keep up.”
You DO realize that you can set your own pace of upgrading. In fact, so can everyone else.
“This is fundamentally different from the non-existent .dll hell in Windows.”
Well, “non-existent” isn’t quite accurate. Try “greatly reduced”. Anyway, we have a different development process than Windows does, and I’m more than willing to let the results speak for themselves.
I wouldn’t call it hell. If you install the specified required dependencies, then there’s usually not a problem. Linux has DLL versioning in the file name. This prevents one DLL from being superseded by another, incompatible DLL. Windows had DLL hell because it had no versioning (or was not using it) and third-party DLLs were superseding native DLLs.
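For readers who haven’t seen it, this is what “versioning in the file name” looks like on a typical system – illustrative paths and numbers, not taken from any particular distro:

/usr/lib/libfoo.so.1.2.3                  # the actual library
/usr/lib/libfoo.so.1 -> libfoo.so.1.2.3   # soname symlink; what programs load at run time
/usr/lib/libfoo.so -> libfoo.so.1.2.3     # dev symlink; what the linker uses at build time

A libfoo.so.2.x can sit right next to these without disturbing anything linked against soname 1.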
But it could be better. There are too many little DLLs out there. Some consolidation into fewer, larger DLLs would be helpful. That may also uncover (a lot of) functions that are duplicated or should not be in a DLL in the first place. But the DLLs have many origins, and who should manage a consolidation? And will the developers participate, knowing that it is going to cost some freedom?
Nowadays disk and memory sizes are not in the way of larger DLLs. Slow download links still are, but they will disappear over time.
“IMHO Freedesktop should be in charge of this since they are already in charge of standardization of the DEs”
OSDL, which already manages the kernel, has a mission statement along the same lines: http://www.osdl.org/lab_activities/desktop_linux/
Now, if OSDL just had a clone of Andrew Morton to manage the DLLs as Andrew Morton manages the kernel, then the issue would be in good hands.
Before someone hits the report abuse button, I can tell you that NOBODY uses plain rpm anymore. On Fedora I use yum; on Mandrake, urpmi; on Slackware, swaret; on Debian and derivatives, apt-get, or Synaptic for a GUI. The only time I had to endure dependency hell was when installing packages for obscure distros. I’ve been using Linux since 2001; it was ready for me then, and it’s only gotten stupefyingly easier. Using SuSE 9.1 right now, with GNOME installed via GARNOME. Couldn’t be easier!
Eugenia should check her facts before posting flamebait, and I recommend EVERYBODY view moderated-down comments to see the dark side of OSNews.
I can confirm that; I’ve not experienced it in a long while. The last time was when compiling a package from source on Mandrake (and not an SRPM).
But no such problem with binary packages, neither on Mandrake nor on Debian/Libranet.
Havoc was merely commenting on the seemingly good idea of statically linking applications…
I wish you would go read the book Linus wrote, Just for Fun, where he explains why he wrote Linux and says that some people in the community take it too seriously and promote their own ideals. He just writes it because it serves his needs well. Frankly, we don’t need people like you in the community spewing crap and flaming people for their opinions. Please change for the better.
It’s hard to change for the better when people are lying on purpose. If the anonymous Comcast guy went to http://fedorafaq.org and got yum set up properly and simply typed yum install popfile, he would be happily on his way, but no, he decided to lie on OSNews.
Most people just don’t get it. It’s not about the packaging format, nor about the manner in which you pull them in. It’s all about the policy.
http://www.linuxmafia.com/faq/Debian/policy.html
[Thom Holwerda (IP: —.quicknet.nl)]
“In Linux, this is completely different: it’s the mentality that creates dependency problems: everything has to be small”
Small is good for maintainability, understandability, and not everyone’s on broadband.
That is exactly the problem! It is good for your maintainability! A lot of Linux users seem to forget that 95% of the world’s population cannot or does not want to maintain their computer at all! These people want to download an app, click it, and finish the damn install! They don’t want to maintain their computers; they want their computers to be maintained.
“…and everything progresses so damn fast no one can keep up.”
You DO realize that you can set your own pace of upgrading. In fact, so can everyone else.
Again, I realize that. But the general end user doesn’t. Accept, once and for all, that end users don’t want to be confronted with version issues. And, if there’s a critical flaw in a program, a flaw that restricts someone from using the app, he’ll have to upgrade no matter what. In the end, upgrading some apps is never in your own hands.
“This is fundamentally different from the non-existent .dll hell in Windows.”
Well, “non-existent” isn’t quite accurate. Try “greatly reduced”. Anyway, we have a different development process than Windows does, and I’m more than willing to let the results speak for themselves.
First of all, don’t say “we”; I’m part of that “we” too, you know. I’m a Linux user myself.
But anyway, I’m confident enough to say that when you use Windows like an end user does, .dll hell does not exist. Period.
Does anyone know of any more articles about real solutions to dependency hell?
For anyone who wants easy-to-install software, with no command-line tools or dependencies, try one of these two apps. In Knoppix there is also a program called live installer. Synaptic can run on most distros, but you have to pay for Click and Run (you’re paying for convenience).
What a lot of people need to remember is that Linux is the kernel. Dependencies are not handled by Linux; they are handled by the apps. As I said before, most new distros acknowledge dependency problems and have gone a long way toward solving them.
You could write RPM for BSD if you wanted to.
Anyway, this thread should be deleted; it’s going on about a problem that was solved three years ago. I also remember DLL hell, having to boot into DOS mode to restore my backup copy of OLEaut32.dll, but that’s solved too.
Let’s post some constructive stuff rather than obsolete stuff. SuSE 9.1 has gone a long way toward silencing the critics.
http://zero-install.sourceforge.net/faq.html
… the best solution for dependency hell is called Slackware & handmade compiling. It’s bulletproof!
When I switched over recently to a Mac, I couldn’t believe how simple it was to install an application. All that was required was to copy the application somewhere on the HD and voila, the thing just works. I wonder why Linux is not able to have something like that. It has a higher chance of working on Linux than on Windows, because Windows has that stupid registry. Maybe this is the model that the big distros, e.g. Red Hat, Novell and Mandrake, should be looking into to simplify the process of installing programs on Linux.
I have always found it hard to teach someone how to install something on Linux. Either they keep stuffing up on the dependencies if they use binaries, e.g. RPM, or compiling from source seems too much of a challenge for them.
As I said before, Synaptic or Click-and-Run. I know from experience that it is easier than OS X. Even better, download Knoppix and try the Live Install system.
“Just because you are in denial, or haven’t experienced the problem, doesn’t mean it doesn’t exist.”
Then I may as well say that just because people are in denial, or haven’t experienced DLL hell, doesn’t mean it doesn’t exist anymore.
“That is exactly the problem! It is good for your maintainability!”
That is exactly your problem! Just because you don’t care, you’re acting like the rest of the world doesn’t care either! Look around and take a look at the huge number of 56k users!
“A lot of Linux users seem to forget that 95% of the world’s population cannot or does not want to maintain their computer at all! These people want to download an app, click it, and finish the damn install!”
Another problem with you OSNewsers/Slashdotters is that you try to force a “solution” down everyone else’s throat, thinking it’s the Only Right Way.
What people need is an easy-to-use GUI for tools like apt-get, or things like autopackage. There’s no reason why small & modular can’t mean easy to install.
No dependency hell on Windows? Well, most of the time this is right, but then only because the company that made the installer is using programs such as Wise Package Studio, which is not exactly free software. If you have ever tried working with MSI files or Microsoft installer code, then you know just how hard it is to write an error-free installer for Windows.
Basically you have to compile against the oldest version of Windows you want to support, then figure out which DLLs you can count on being on the system (such as the VB runtime), then figure out which DLLs you can redistribute and which ones you cannot. This is hard work, and Microsoft does not make it easy.
So yes, because of very expensive programs and hard work, the end user does not feel this problem. It is not some genius technical Microsoft thing that just removed the dependency hell; it is hard work.
I remember last year…
I was installing a program on Windows XP. The program needed a reboot. So I rebooted my PC and tried to load Windows XP again.
But, it wouldn’t start.
Hal.dll was corrupt.
I searched the internet for the solution.
From a lot of articles I read that it was a serious bug in Windows XP that was solved in SP1.
A couple of months ago, my sister had problems with IE.
IE would consume 100% CPU power and her PC was unusable.
So, I went to the software panel, clicked to remove IE.
I got a warning, like I expected, that IE would be installed again once the PC was rebooted. No problem; this is what I wanted.
Uninstalling IE… rebooting… nothing.
Couldn’t log back in to Windows. The login DLLs were all corrupted.
A couple of weeks ago, I wanted to update my Win2000 PC to SP4 and the latest security patches. I had a program running to mount ISO (and other image) files.
While updating, Windows informed me that SP4 could not be installed because a DLL (atapi.dll, I think it was) was in use.
There have been several times that I installed a program and got an error message when starting it.
The reason?
MFC42.dll
It seems that there are lots of different versions.
THIS is DLL hell.
It does NOT exist in that magnitude on Linux.
I use Gentoo.
When I want my system updated, I open a command line and type:
emerge -uDv world
and sometimes, when config files need to be updated also:
etc-update
That’s it.
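(For the curious: -u asks for updates, -D follows the whole dependency tree rather than just the world list, and -v is verbose; etc-update then walks you through pending changes to protected config files under /etc.)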
I never experienced dependency hell on Gentoo.
And I can imagine that apt-get and similar are a great idea.
Note: there was the Subversion problem of wanting to install a certain autoconf version. But that problem can be solved very easily.
Suppose that I write a small calculator.
I add support for GNOME into it, for KDE, for Windows and for Mac.
Do I need to include the full GNOME libs, full KDE libs, full Windows libs and full Mac libs with my program?
My users would just love downloading a complete CD or two.
Shared libraries are a great concept.
Especially when you go back in time, when Internet connections weren’t fast. Even today, many people still use a 56k modem.
It also strikes me that when a program on Windows breaks dependencies, it’s the fault of the creator of that program. If it’s on Linux, it’s the fault of everyone who works on Linux. A little bit ridiculous, if you ask me.
And why is this? Because not all software is available in your favorite package format. This is especially true for closed source packages. emerge foo works great until that one day you find a foo which isn’t available in portage, and then you realize that portage and all these other solutions don’t solve the problem; they hide it. If Linux is to grow into a significant desktop solution, this problem has to be solved properly, not covered up by a package manager.
Standalone (closed) packages use /opt, just like Windows software uses %ProgramFiles%.
As usual, Havoc is 100% right on. For the basic, average user, installing software in Linux is much, much harder than in Windows, unless the user has a distro with its own repository of software, like Lindows, Lycoris, Xandros etc.
I was on the Fedora forum the other day and I read some posts by a complete and total Linux newbie. The poor guy was trying to install the Sword Bible software and various Bible translations. Lots of people were trying to walk him through it, but he just couldn’t get it all to work. He was working with tarballs, RPMs, etc. It was a classic example of how difficult things can be.
The fact is that it is much easier to install software on Windows. This same guy could go to a website, download an *.exe program, double-click it, and that’s it. It just is not as easy under Linux.
Dependencies are a legitimate problem, and all these package managers do is help you not see it. Just because URPMI, emerge, YUM and apt more or less track/manage dependency problems doesn’t mean they don’t exist. It just means that every distro has to have a package maintainer for every package to make sure that, with each new version, the packages and their ever-changing deps are kept up to date.
Apt, yum, urpmi, emerge and the rest all fail if the package isn’t in the repository, because they’re the lamest possible solution to the problem. All you’re doing is creating a new layer between the developer and the user: the package maintainer. The result is one or more persons for each piece of software for each distro to make sure that the package is built, stored, dependency-managed and working. Has anyone thought about the amount of people-hours required to make sure that KDE is available and working on every distro? It’s staggering.
I LOVE static builds because they always work. OpenOffice and a lot of third-party games (Quake, UT, etc.) always install correctly and usually work. Just like in Windows. Complain all you want about wasted hard drive space, but at 50 cents a gig I’ll gorge myself on static packages like I’m Caligula feasting on .so’s.
I’m sorry to say this, but you’re lying!
1. There is no “atapi.dll”; there’s an atapi.sys, but that’s a device driver, so it would never give you that error message. Drivers are always being used; you don’t need to stop using them to install new ones. How would it be possible to install new drivers for your graphics card if that were the case? Besides, if a program was “mounting” ISOs, I think the DLLs that might bork would be the ASPI DLLs.
2. HAL.DLL is a kernel module (Hardware Abstraction Layer) and no program needs it to install, and even if it got corrupted, windoze keeps a copy of it in your C:\Windows\Driver Cache and would restore it.
3. IE, as much of a piece of crap as it is, has nothing to do with the “login DLLs”.
4. There is only one version of MFC42.dll. If it was called mfc.dll, then yes, I might buy that.
And yes, Gentoo has better package management.
Correction. Standalone packages should use /opt and be completely self-contained. This isn’t always the case, though. Also, just because a package is closed doesn’t mean it doesn’t have dependencies. There are also a lot of non-closed packages out there which, for whatever reason, aren’t available for your package manager of choice, and for these you have to manage dependencies by hand. The bottom line is still that package managers are not the solution.
Package managers don’t always install a new version of a program when all that’s needed is a minor upgrade. It would be fine if they just installed the new dependency and solved the problem. But the real result is often: upgrade the whole system!!!
Even if you are right (and you may be), the desktop isn’t everything. Embedded devices are growing faster and the server market yields more cash for less headache.
So what?
Who cares if Linux makes it in embedded devices and servers?
Server-side people and embedded device manufacturers!
I only care about what I have on MY DESKTOP, not what is running my microwave oven or sending me back Slashdot pages.
That aside, the argument is wrong:
1) Linux is still a minor player in embedded devices.
2) If Linux loses the desktop, it will lose the server too, because MS will be able to push a proprietary server-side technology as a prerequisite to interact with their desktops, and bypass the HTML-compliant web.
Desktop is where it’s at.
Your desktop is running Windows. If you’re so convinced that Linux will fail then why do you even care? Because you like insulting people, perhaps?
Shared libraries can only be a positive thing, if managed correctly. Am I wrong? What needs to happen is that people move away from antiquated package management systems like RPM (which was great in its day, but has now been surpassed). RPM is improved by many other pieces of software, but as I see it, things like URPMI are only propping up a dying technology. More new distros are using apt than RPM at this point, so most users of these distros (or users of Arch, like me) never have to deal with dependencies unless we want to. Exactly like WinXP. Only in WinXP, you can have a multitude of copies of the same DLLs used for different programs. Huh?
Quote:
“I’m sorry to say this, but you’re lying!”
However unbelievable it might sound to you, I’m not lying.
Quote:
“1. There is no “atapi.dll”; there’s an atapi.sys, but that’s a device driver, so it would never give you that error message. Drivers are always being used; you don’t need to stop using them to install new ones. How would it be possible to install new drivers for your graphics card if that were the case? Besides, if a program was “mounting” ISOs, I think the DLLs that might bork would be the ASPI DLLs.”
You’re correct; it was not atapi.dll but atapi.sys that was in use.
I thought it was a DLL; my bad.
quote:
“2. HAL.DLL is a kernel module (Hardware Abstraction Layer) and no program needs it to install, and even if it got corrupted, windoze keeps a copy of it in your C:\Windows\Driver Cache and would restore it.”
It is a known problem on Windows XP (versions prior to SP1). It has nothing to do with the program being installed. It’s a “do you feel lucky today” problem.
And yes, however unbelievable it might sound… it happened to me. What did I do to try to solve the problem?
I used my Windows XP install CD, went to the recovery console, backed up the hal.dll in my installed Windows system folder, expanded the one on the CD, copied it to the system folder, and restarted the PC… the problem was still there.
quote:
“3. IE, as much of a piece of crap as it is, has nothing to do with the “login DLLs”.”
Exactly…
Then why, oh why, did my sister’s computer display tons of warnings about corrupted DLLs when restarting the PC?
No, I didn’t touch anything else. I just removed IE and restarted the PC.
Note: since IE was taking so much CPU power, it might have been spyware that also infected those DLLs.
quote:
“4. There is only one version of MFC42.dll. If it was called mfc.dll, then yes, I might buy that.”
Please, buy it when I say that there are incompatible versions of mfc42.dll.
You do also realise that .NET is a HUGE dependency?
You do also realise that Windows XP will cause a lot of broken dependencies?
You do also realise that Windows XP will cause a lot of broken dependencies?
SHOULD BE:
You do also realise that Windows XP Service Pack 2 will cause a lot of broken dependencies?
“As I said before, Synaptic or Click-and-Run. I know from experience that it is easier than OS X.”
Hahahahahaha!!!! That is a good one, dude. I use both Linux and OS X, and none of the Linux install tools is as easy as drag and drop, which is how you install and uninstall apps on OS X. Gimme a break. I love Linux as much as the next guy, but to say it is easier to install apps on Linux rather than OS X makes you look silly.
2. HAL.DLL is a kernel module (Hardware Abstraction Layer) and no program needs it to install, and even if it got corrupted, windoze keeps a copy of it in your C:\Windows\Driver Cache and would restore it.
Actually, I have had this happen as well on customer machines. I have even done a bit-for-bit comparison between the system restore hal.dll, the borked hal.dll, and the hal.dll on another system which was booting fine. All three were identical, so the problem wasn’t really the hal.dll but, I am guessing, whatever was trying to load it.
Of course, then there is the infamous problem of Windows XP loading up quite a few drivers and system files and then complaining that such-and-such file could not be found (I think it was ntfs.sys, but it has been about 9 months since I was doing benchwork PC repair). I ran into the exact same problem on around 15-18 different customer machines with differing hardware after SP1 came out for XP. I never encountered it prior to SP1; not sure if I was lucky or if SP1 introduced the problem.
Anyway, you could boot the machine in safe mode with logging and watch as it would reach that one file and then bork, saying it couldn’t find it. Take the hard drive and stick it into another machine, browse to the file, do a binary comparison between it and a known good copy, only to find they are again bit-for-bit identical. Cause? Not certain. Cure? An overlay/repair install, or FARI (pronounced like faerie: Format And ReInstall).
I have seen some of the strangest, most unexplainable issues on Windows through the years. These were in the minority, though; most problems can be directly attributed to the user doing something stupid.
All that being said, every OS has its share of issues that crop up. Even though I migrated to using nothing but Linux at home, I won’t lie and say it is perfect, but it suits most of my needs. (I haven’t found an OS that suits all of them.)
The various GNU/Linux distributions aren’t all equally affected by these problems. I for one use Gentoo and hardly ever come across a dependency conflict when emerging software. And if there are any conflicts the ebuild is usually fixed in no time because of the very active community.
As for the RPM-based distributions: some can be a pain in the ass when it comes to unresolved dependencies, but the majority of them have pretty good package management these days. If the distribution you’re using is giving you too much trouble, try another one.
Most of the posts here show no evidence of having read Pennington’s article, no understanding of his role in the Linux community and the credibility that might lend to his opinions, and no evidence that the posters are anything other than adolescent fanboys who think they’ve got it all figured out.
Dependency problems exist in any OS because developers assume that certain libraries and other code collections will be available on the machines of the people running their software. If they aren’t, someone or something (apt-get, yum, emerge, whatever…) has to go looking for them and install them.
The problem arises when someone needs two packages: Package A and Package B. Package A depends on Library foo.1.001, so a dependency resolver finds it and installs it. Now Package A works. Then someone wants to install Package B. It depends on foo.1.002, so a dependency resolver goes out and installs it. Because foo.1.001 and foo.1.002 are incompatible, the dependency resolver removes foo.1.001. All of a sudden, Package A is broken.
Note that this will happen with any automatic dependency resolver or with a human being handling the dependencies. Two packages depend on mutually exclusive software.
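The standard mitigation, for what it’s worth, is to make the two versions parallel-installable, the way Debian-style library packages do – a sketch with hypothetical package names:

apt-get install libfoo1    # ships /usr/lib/libfoo.so.1; Package A depends on this
apt-get install libfoo2    # ships /usr/lib/libfoo.so.2; Package B depends on this

Since the two packages own different files, neither install removes the other. The hell described above bites when both versions claim the same soname or file, or when a packager only offers one of them.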
The problem is aggravated by (1) the fact that code installed to resolve a dependency can affect any of the hundreds of previously installed libraries and packages, because no one knows what assumptions each developer made; and (2) different distributions adding their own problems by altering source and/or its location in the filesystem, gumming up the works even more.
The smaller the pool from which new software is installed, and the more someone enforces development standards and discipline, the fewer dependency problems show up. That’s why OS X has fewer problems.
Finally, fanboys hyperventilating about their favorite distribution and their favorite packaging scheme need to understand the banality of their assertions: any single distribution or any single packaging scheme will reduce the dependency problem if that’s the only one you ever use. E.g., we’d see fewer problems if everyone used Red Hat and pulled software only from official Red Hat servers. Is that what you want?
Well, OK, you explained it right this time. I guess I owe you an apology for saying you were lying: I’m really sorry about that.
BTW, those are really weird problems.
There are two unrealistic notions played out in this thread.
The first one is that Joe and Jane Sixpack will (inevitably) end up in dependency hell in GNU/Linux. Highly unlikely. Joe and Jane, IF they use GNU/Linux, will go for a polished distribution and find everything they need on the install disks. If they need to upgrade a package, they will use the online updater from their distributor.
If a (newbie) user gets into dependency hell, it is because he is trying to install an app from an unknown source which didn’t cater to his expertise. There is nothing fundamentally wrong with that, it’s how people learn, but if you go off the beaten paths in the woods, accept the rude reality that you are on your own and need to survive by yourself.
The second one is that there is a “solution”. Dependency hell is a symptom. The cause here is the “online distribution model” of FOSS software. There are two trade-offs to consider. To accommodate fast and convenient downloads, one has to link dynamically and use shared libraries, risking (severe) dependencies. To accommodate foolproof, hassle-free running of apps, one has to link statically, but package sizes will rise accordingly, thus hampering fast and easy downloading.
The only “solution” to the trade-off symptoms would be to remove the cause of the problems. Stop distributing software online, and both problems, dependencies and statically linked code bloat, are gone. Without downloading, you have to package everything on disk, and if that thing doesn’t work, it’s the fault of the aggregator/distributor. This of course is no solution at all either. It’s throwing the baby out with the bathwater.
Overall, the main problem here seems to be this weird notion that GNU/Linux absolutely has to be better than perfect. Every “design” decision has its negatives. Yet these decisions need to be made to get something out the door at all. The difference between bad software and good software is to what extent developers have mitigated the negatives of the design decisions while retaining the positives. On GNU/Linux, the negatives of dynamically linked software seem to have been mitigated quite adequately with package management.
Why not just expand the LSB to include more generic libs? I agree that some package makers are really stupid. For example BitTornado: who the hell has wxPython installed on their system by default?!?
That, and there should also be an unspoken rule that a distro will not exceed 2 CDs! Yeah, I know you get a $hitload of packages that you won’t find on a default install of Windows, but if I want OOo I’ll download it from their website.
Oh yeah, what about autopackage [www.autopackage.org]? There’s a lot of buzz around that project.
Your desktop is running Windows.
Wrong, my laptop is running Windows. My desktop runs Linux.
I’ve been using Linux since RedHat 5.2 (IIRC), and SunOS and Solaris since 1996. So there you have it. As a matter of fact, in a couple of months my new laptop will run OS X.
If you’re so convinced that Linux will fail then why do you even care?
I’m trying to warn people! Psychics who are convinced that, say, a plane will crash still care about it!
Because you like insulting people, perhaps?
Yes, it gives me a fuzzy feeling. However, in my comment I haven’t insulted anyone. Check again.
Hahahahahaha!!!! That is a good one, dude. I use both Linux and OS X, and none of the Linux install tools is as easy as drag and drop, which is how you install and uninstall apps on OS X. Gimme a break. I love Linux as much as the next guy, but to say it is easier to install apps on Linux rather than OS X makes you look silly.
Not quite true. Not all applications for OS X can be installed using drag and drop; many of them use Installer.app and put files outside the /Applications folder. And these applications cannot be uninstalled by dragging them to the trash can, unless you want to leave garbage behind. In fact, uninstalling these applications means going through /Library/Receipts, figuring out which packages belong to the application, and manually removing the app’s files. Everything must be done from the command line.
Obviously, rpm/deb beats that. So much for OS X superiority.
Have your cake, or eat it.
Either the software will depend on you having certain libraries and have a small disk footprint, or statically link everything and be bloated. One or the other.
If you use a distribution with a decent package manager, like Gentoo, you don’t have to worry about libraries conflicting with each other. Portage introduced a concept called slots, which allows multiple versions of libraries and programs to be installed simultaneously. This eliminates the vast majority of problems involving dependencies. There are even ebuilds for commercial apps, like VMware. For example, VMware was a nightmare for me on Debian and Mandrake. On Gentoo, it only required a single command to install, and it worked perfectly. I hate to be a fanboy, but the author is just using the wrong distro.
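A quick illustration of slots, hedged as a sketch since names and version numbers vary: each ebuild declares a slot, e.g. SLOT="3.3" vs SLOT="3.4" for sys-devel/gcc, and portage treats different slots of the same package as co-installable. So

emerge =sys-devel/gcc-3.3.3 =sys-devel/gcc-3.4.1   # hypothetical versions

leaves you with both compilers side by side, and a library that breaks compatibility between major versions gets a slot per major version for the same reason.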
There are certain disadvantages to using a platform as infinitely extensible and customizable as GNU/Linux, dependencies are one of them, and only in some cases.
Frankly, I have always seen package management as a strength of Linux.
“Not quite true. Not all applications for OS X can be installed using drag and drop; many of them use Installer.app and put files outside the /Applications folder. And these applications cannot be uninstalled by dragging them to the trash can, unless you want to leave garbage behind. In fact, uninstalling these applications means going through /Library/Receipts, figuring out which packages belong to the application, and manually removing the app’s files. Everything must be done from the command line.
Obviously, rpm/deb beats that. So much for OS X superiority.”
Hardly.
Sure, some OS X apps have an “installer” where you have to click through dialog boxes rather than drag and drop, but that’s not really the point. The point is ease of use. You can still install anything, without dependencies, either by drag and drop or by going through a couple of dialog boxes.
See, you are thinking like an experienced Linux user when you talk about going through /Library/Receipts. You say you can’t drag those apps to the trash to uninstall unless you want to leave garbage, which is true. Your statement implies that you can drag those apps to the trash to uninstall if you don’t care about leaving garbage, which is also true. That’s how I uninstall apps from OS X all the time and it always works. Never a single problem. Not once.
I’m talking about the average consumer who does not understand all that. My grandmother could not navigate through apt, tarballs, RPMs or whatever. She really could not. She doesn’t even understand right-clicking! She also couldn’t give a crap if there is leftover garbage in /Library/Receipts from uninstalled programs. For that type of person, they need to download a program and double-click it, or drag and drop. That is all they can do. You cannot expect that type of person to do _anything_ else. It “just works” on OS X. No dependencies. Maybe it isn’t as clean to you, but for my grandma it works.
I have been using Linux in many flavors (currently Debian Sarge and Slack) for several years, and I’ve been using OS X for a little over a year, and for the average, non-computer-literate person, OS X beats the pants off Linux in terms of installing and uninstalling programs. There is no way, and I really mean NO way, you can argue otherwise.
Of course this problem is a non-issue for apt-get, emerge and urpmi users, but there should be a standard GUI tool for this. I think Open Carpet is the answer. All distros should include it with their default repositories activated.
Fact is that Linux and its programmers will never, ever replace Microsoft on the desktop because of issues like this and many more. This is why Linux is nothing more than a niche OS with very few users. No matter how Linux zealots try to spin it, Linux as a whole is one giant meshed-together ball of an OS mess.
I have a few doubts about the credibility of what you say. My first clue is the cliché “never, ever”; it just sounds so “story-telling time” from the beginning. My second clue is that I have noticed much more than a “niche” of Linux users. We’re talking about colleges spending large sums of money to put together curricula suitable for teaching Linux-related topics, and not just for research. More and more of the non-research and liberal arts colleges are making this investment. It’s money well spent in their opinion, seeing that so many large enterprise transitions are being made. My last, but not least, clue would be your presumptuous use of the contrived term “Linux zealot”, which covers some rather remarkable people in our world today. What exactly is it about them again? Linux is perhaps lacking in a few areas, but then again, so are we.
Dear Mr. Anonymous (IP: —.client.comcast.net)
Posted on 2004-06-17 04:33:11
You said, “Fact is that Linux and its programmers will never, ever replace Microsoft on the desktop because of issues like this and many more. This is why Linux is nothing more than a niche OS with very few users. No matter how Linux zealots try to spin it, Linux as a whole is one giant meshed-together ball of an OS mess.”
Allow me, a humble user, to explain, in my own small way, the business and economics of GNU/Linux. (1) GNU/Linux, like many alternative operating systems in the marketplace (man, you have to love the freedom of the marketplace), started with the simple notion of solving a problem. Necessity being the “mother” it is, GNU/Linux represents a paradigm shift in IT. This paradigm shift fills an IT need in business and actually represents a new business model created by alternative operating systems, like GNU/Linux, and open source.
The GNU/Linux you speak of indeed had its humble beginnings as an academic project and has since grown, in ten short years, to become a major competitor in academia, business, and individual desktop use. The reason GNU/Linux became so big was the Internet boom and hundreds of freedom-loving, dedicated coders. Service providers and businesses saw a way to reduce the operational costs associated with licensing (the old paradigm) and a shift toward service. GNU/Linux has struck a fatal blow to the old business model used by the “other-platform” marketing giant based in Redmond. It may take a generation or so for the old business model to finally push up daisies, but it will occur nonetheless. The giant realizes that it has been wounded and sees its own business model’s demise on the horizon.
Sorry to sound Darwinistic (wait, maybe I’m not), but the operating system that adapts the fastest to the needs of business, academia, and individual users wins. The “other platform” is more of “one giant meshed-together ball of an OS mess” than GNU/Linux ever is. Apparently, someone has never taken the time to look at the GNU/Linux design philosophy.
By the way, I am not in dependency hell. I have learned how to keep my system “nice and shiny” with excellent updated code using yum, apt-get, and Synaptic. Since I went “other platform”-free in October of 2003, I haven’t had one single hardware failure or software error. Talk about a paradigm shift. I can now focus on my business and not on “where I will have to go today” with the “other platform” operating system and its daily upgrade-to-repair.
@bsdrocks, and everybody else who is actually interested in solutions that are being worked on: please take a look at the autopackage project at autopackage.org. In my opinion this is probably the best solution I’ve seen so far.
If anybody is interested in this topic please check out the Poll/thread on LinuxQuestions.org where a long heated discussion on this topic has been going on for the past few days. The link is http://www.linuxquestions.org/questions/showthread.php?threadid=543…
To say that there is ‘no DLL hell’ or even ‘reduced DLL hell’ is absolutely wrong! As many have already stated, DLL hell exists even on the latest of M$ desktops. Heck, I’ve used Windows from the days of 3.1 to the test bed of Longhorn, and guess what: it’s still there. It’s not so easy to just click ‘program.exe’ and be done with it! A lot of the time it could be a missing .dll file, or something else could be bundled with it that corrupts the .dll files in your /system32/ directory or even destroys the registry itself.
I have had that happen to me repeatedly, no matter what version of Windows I was on!
Second, there are package managers that already look after the dependencies for you. On SuSE there is YaST. One time I didn’t have gxine and it didn’t come with the distro. The package manager told me what I needed and then I downloaded the recommended files. That’s it! It installed perfectly. That’s on top of all the programs that come automatically with Linux which you would have to install separately on Windows, which of course will put you into ‘DLL hell.’ On Windows you also have to look at what type of file it is: whether it’s for XP, 2K or some sort of 9x. The one thing M$ has never been good at is compatibility! If a program works on Win2K it might not work on WinXP! Again, a thing that has happened to me, where I had to fork out extra money so I could keep that compatibility somewhat in check. Whereas applications like OOo, for example, can install on Linux no matter what version you use!
Also, saying that the package managers will only pull from one install mirror is absolutely absurd! In apt-get there are plenty of repositories for you to use, with options for adding or changing where the source may come from. So far all I see, yet again, is misinformation brought on by Windows fanboys!
If people complain about it on Windows then it’s always a “third-party app” or “just a weird problem.”
If you’re going to make an argument against Linux, at least have it fixed in Windows first, since it is supposed to be the number 1 desktop, right?
At least try to make a valid claim, not this FUD about trying to ‘warn average Joe user.’ Instead of spreading this FUD, why not warn average Joe user about the dependency hell, among countless other things that Windows brings, so that he or she can then purchase even more software to keep their Windows box running longer than six months!
By the way, I am not in dependency hell. I have learned how to keep my system “nice and shiny” with excellent updated code using yum, apt-get, and Synaptic.
This will always be true if you stick to running applications that reside in distro-specific repositories. But sometimes people need an application that isn’t in the repositories and have to do stuff by hand, and that’s where you can run into dependency problems.
Example: I just finished taking a motorcycle training course to get my license. They had a training manual in protected PDF format, and I couldn’t print out the PDF. One of our assignments was to answer the 100 questions in the back of the manual, detailing the page number you found the answer on, etc. Well, it is a pain to have to jump around all over the place between the questions and the answers when the document is so protected even search doesn’t work, so I wanted to print it out. I found an app that would disable the protection so I could print the PDF. I downloaded the .tar file, untarred it and tried to configure, make, make install it, but found I needed certain libraries on my machine. So I went out and found each of those, tried to install each one, and found more needed libraries, etc. Eventually I ran into the problem where I needed a library older than the one I had installed. When I finally got everything to where it would install, the program didn’t even work correctly. I wasted almost 4 hours doing it.
Plus, you can run into problems using great tools like apt-get. I used to run MEPIS. After a fresh install once, I did an apt-get update; apt-get dist-upgrade. I looked at what it was going to do, and it was going to uninstall a lot of files that made KDE work correctly. If I hadn’t looked through the list of what apt-get was prepared to do, I could have really screwed up my install. I found out later to only do an apt-get upgrade; dist-upgrade overwrites files that are specific to the distro.
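The distinction, for anyone following along:

apt-get update        # refreshes the package index; changes nothing installed
apt-get upgrade       # upgrades packages, but never installs or removes other packages to do it
apt-get dist-upgrade  # may add or remove packages to satisfy changed dependencies

dist-upgrade isn’t wrong as such, but on a distro that customizes its packages it can pull the rug out, which is exactly what the poster saw.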
People say that’s easy… no, with the tools on Linux it’s even easier. I do agree that installing an RPM is not a hard thing, even for a newbie in the Linux world.
Here’s what I experienced with Red Hat: I downloaded the software, cool, read the “read me” file. There were files listed (here starts the dependency hell): a certain kernel was needed, certain libs. Well, just imagine how much fun it is to download all those files :)
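If you haven’t seen it, trying to install an RPM with unmet requirements fails with something along these lines (package and library names invented for illustration):

    rpm -ivh foo-1.0-1.i386.rpm
    error: Failed dependencies:
        libbar.so.2 is needed by foo-1.0-1

And then you get to go hunting for whatever provides libbar.so.2.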
What’s even better: if you are using dial-up to connect to the internet, the size of a file really matters. If it’s too big, you wait for a CD or kindly ask a friend with a faster connection. But on Linux you can’t even estimate up front how much you will be downloading.
Don’t you people think it’s insane to download a new kernel and a new gcc for a simple 1.5 MB piece of software? :) And if the file you are downloading is an old one, how are you supposed to compile the kernel, I mean, if you are new to Linux? :) Now please tell me how Mandrake, Red Hat, Xandros… are any easier.
Yes, I haven’t yet tested urpmi or the other tools mentioned, but I’m really doubtful of them as well, because before I tried Linux I read a million articles saying how easy it is to use, how fast it is, how secure and stable it is.
My experience with Linux, speed-wise, was not so promising. How can it be more secure (other than being about as secure as a Windows system with a decent firewall)? And as for stability, I can’t count how many of the bundled programs were buggy.
But sure, someday I’ll give Linux one more try, I guess Mandrake that time, just because I’m already sick of the viruses, trojans, and license issues on Windows. Just that.
“How can it be more secure (other than being about as secure as a Windows system with a decent firewall)?”
First off, a good firewall would have to be downloaded off the net. Second, that firewall can easily mess up your DLLs or even the registry. Third, if it does work, it uses a lot of resources. And lastly, most viruses and trojans can kill software firewalls built for Windows very easily. So much for Windows being “secure.”
Mr. J. Sane,
In an earlier era of computing, not really that long ago in the cosmic scheme (time being relative to our interpretation of its passing), computers were built from the ground up, with custom solutions coming close to meeting our computing needs to a proverbial “tee”.
Vendors came along later and built basic programs to fit our needs. These “basic” programs were expensive. The solutions were not perfect then and they are not perfect now, but the costs have dropped dramatically. Hence, as a cost to a business, an academic institution, or even an individual, the GNU/Linux solution may not be “perfect”, if there is such a thing, but it is worth the trade-off against the “other platform” and the licensing costs associated with it.
The ability of GNU/Linux to more closely “adapt” and align to the needs of business, academia, and the individual is the overall cost benefit of using GNU/Linux and open source.
“I found an app that would disable the protection so I could print out the PDF. I downloaded the .tar file, untarred it and tried to configure, make, make install it, but found I needed certain libraries on my machine. So I went out and found each of those, tried to install each one and found more needed libraries, etc. Eventually I ran into the problem where I needed a library older than the one I had installed. When I finally got everything to where it would install, the program didn’t even work correctly. I wasted almost 4 hours doing it.”
A couple of things. One: that protection is rather easy to get around (what’s up with that, Adobe?). Two: look high and low for your preferred package format first. Three: even if you have a tarball, not all tarballs are the same. Four: if you’re familiar with Makefiles, you can do everything that’s needed in one particular directory, pulling in from the main system only what’s needed. And last: at least with Linux you have the option of doing any of the above. If there were no Windows program to do what you wanted, what would you have done? Freedom means exactly that, and like all freedoms it comes with a cost. I hope you at least asked the community for help.
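On the “one particular directory” point: the usual autoconf trick is to build into a private prefix so nothing touches the system tree (the directory name is just my habit):

    ./configure --prefix=$HOME/local
    make && make install

Then the app and any libraries you had to build for it live under $HOME/local, and removing them later is a single rm -rf.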
Then you are really shooting yourself in the foot. urpmi, apt-get, yum and other tools easily solve this problem for you: they do the dependency management when you install a package from a software repository. They are no-fuss solutions for any user who does not like doing things the hard way.
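On Mandrake, for instance, the whole job is one command; urpmi works out and fetches the dependencies itself (gxine here is purely an example package):

    urpmi gxine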
I had a lot more problems with DLLs in WinXP alone than I ever had with dependencies in Linux. Actually, I never had a single problem with apt.
In Windows I of course deleted all the terrible bloat named Internet Explorer, Outlook Express, MSN, etc. Then I installed Miranda IM to chat. But it wouldn’t connect to MSN. After a while I found out it needed an IE DLL. It isn’t even documented which DLLs the program uses. I found out most Windows programs don’t list at all which DLLs they need to work. The Debian packages are far better organized than that chaos in Windows.
If you touch it, the window breaks.
I agree 100%
I did the same thing and got the same results!
Heck, updating one of my m$ boxes with a DAT file for the antivirus, or with an m$ patch, would make things worse.
[I’m the guy from 194.blah, just typing at a different computer now]
Sure, some OS X apps have an “installer” where you click through dialog boxes rather than drag and drop, but that’s not really the point. The point is ease of use. You can still install anything, without dependencies, either by drag and drop or by going through a couple of dialog boxes.
My point wasn’t installation; that really is very easy under OS X for a moderately experienced user. My point was uninstallation. That’s what is hard under OS X.
See, you are thinking like an experienced Linux user when you talk about going through /Library/Receipts. You say you can’t drag those apps to the trash to uninstall unless you want to leave garbage, which is true. Your statement implies that you can drag those apps to the trash to uninstall if you don’t care about leaving garbage, which is also true. That’s how I uninstall apps from OS X all the time, and it always works. Never a single problem. Not once.
Yes, I’m an experienced Linux (and other Unices) user. However, that does not mean I want to do things the hard way. I would welcome some clickety-click tool to remove apps without leaving garbage. You see, I don’t uninstall applications according to my current mood, but when I need to; for example, when I’m running low on hard drive space and I see some big application that I barely use. When I was writing this, I had a specific app in mind: GarageBand. I needed some extra space; GarageBand is almost 2 GB and I had run it once or twice, just to see what it was about. But when I remove GarageBand.app, I get only 55 MB back. Where is the rest? In /Library/Application Support/GarageBand: 1.81 GB. This is a prime example of what’s wrong with leaving garbage behind. I pointed to /Library/Receipts precisely because it is the hard way; the average user is not going through it. If it were at least as “easy” as rpm -e … (not that it would help the average user, but it would at least help the power users).
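For what it’s worth, the shell is how I found that number, and it’s also the only way I know to clean it up by hand (the paths are the ones from my machine; the rm is obviously at your own risk):

    du -sh "/Library/Application Support/GarageBand"   # shows the 1.81 GB
    sudo rm -rf "/Library/Application Support/GarageBand"

An average user is never going to do that, which is exactly my point.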
I’m talking about the average consumer, who does not understand all that. My grandmother could not navigate through apt, tarballs, RPMs or whatever. She really could not. She doesn’t even understand right-clicking! She also couldn’t give a crap if there is leftover garbage in /Library/Receipts from uninstalled programs. That type of person needs to download a program and double-click it, or drag and drop. That is all they can do. You cannot expect that type of person to do _anything_ else. It “just works” in OS X. No dependencies. Maybe it isn’t as clean to you, but for my grandma it works.
Exactly. The average consumer does not understand all that. I may be fortunate enough to understand it all, but I shouldn’t have to. That was my point.
Also, I doubt that your grandmother would be able to install an app on a Mac. I don’t have a grandmother anymore, so I will use my mom as an example, and I can tell you that the concept of downloading a .dmg file and mounting it is beyond her capabilities. When she needs some app installed, she asks for a recommendation on which app to get for a given task, and for help installing it. And yes, she doesn’t get right-clicking either, and she has a hard time with double-click vs. single-click.
I have been using Linux in many flavors (currently Debian Sarge and Slack) for several years and OS X for a little over a year, and for the average, non-computer-literate person, OS X beats the pants off Linux in terms of installing and uninstalling programs. There is no way, and I really mean NO way, you can argue otherwise.
I’ve been using Linux since Slackware 3, but again, that doesn’t mean I like doing things the hard way. My PC currently has Fedora Core 2 installed and I like yum very much. I’ve also been an OS X user since 10.0; in fact, I’m typing this on a Mac running Panther (the one with the unused GarageBand). However, I will argue that Linux _can_ be easier than OS X, and point you to Lindows’ Click-N-Run store. That’s something my mom would be able to use.
I hear what you’re saying but I have to disagree.
First of all, ease of installation and uninstallation, and whether you run into dependency problems when you install apps, is a separate question from how cleanly, for lack of a better term, the OS uninstalls those apps. Your GarageBand example is a good one. I haven’t tried it on my Mac, but I’m sure you’re correct in your numbers. However, you were able to uninstall it via drag and drop, which you can’t do in Linux. Advantage: OS X.
Second, with your claim that “the concept of downloading .dmg file and mounting it is beyond her capabilities”, you are making it sound harder than it is. Virtually all OS X apps are downloaded by clicking a link on, for example, the software’s homepage. The dmg is downloaded to the Desktop and then automatically mounted. The mounted dmg then opens a new Finder window which usually says something like “To install this application, drag the icon to your Applications folder.” To the average user, it is totally transparent. Rather than click ’n run, it is drag and drop ’n run. It really couldn’t be easier. Advantage: tie.
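And under the hood there is nothing magic: the drag-and-drop flow is roughly equivalent to this in Terminal, with “MyApp” as a made-up example:

    hdiutil attach MyApp.dmg                       # mount the disk image
    cp -R /Volumes/MyApp/MyApp.app /Applications/  # "install" = copy the bundle
    hdiutil detach /Volumes/MyApp                  # eject the image

Not that the average user ever needs to see any of that.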
Third, comparing OS X to Click-N-Run is not fair. Click-N-Run is a dedicated repository of software maintained by one company. It is highly unlikely that you are going to run into any problems within that environment. Apple could do the same if it wanted to, but it doesn’t. So the real issue is installing third-party apps. I’d be willing to bet anything that if you took a vanilla Lindows box, edited the sources.list file, pointed it at Sarge and started installing software, you’d run into all sorts of problems. Xandros has the same issue, which is why they recommend not doing it. And in terms of ease of installing third-party apps without dependency problems, OS X is in a class by itself. Advantage: OS X.
I hear what you’re saying and I understand you. Linux can be easy, and Lindows is the best example of that; it’s the easiest Linux out there. But it is not as easy as OS X. With Lindows, you are at the mercy of the Click-N-Run repository to keep things easy; otherwise you’re back to regular apt-get and dependency issues. With OS X, if you come across a third-party app, it is generally a drag-and-drop installation, and for some apps you have to click through some dialog boxes. But you *never* run into dependency problems, and you can, albeit perhaps inefficiently, uninstall programs with drag and drop.
I love them both, and certainly Linux does a lot of things better than OS X, but in terms of ease of use, OS X wins hands down.
Even with a good package manager, you still end up with 4-5 GUI libs, 8-9 networking interfaces, 3-4 shells, 5-6 scripting languages…. and each one takes up memory and needs to be loaded individually.
Did you read Joel Spolsky’s article on Windows APIs? This is exactly the case for Windows. The standard you’d like Linux to live up to is out of reach even for Windows.
Now, if you could do everything with 1-2 libs / interfaces / shells / scripting languages / etc, you’d end up saving memory and app load times.
Actually, you wouldn’t save that much memory, since libraries are only loaded when something actually uses them, and their read-only pages are shared between all the processes that use them. Again, this is the same in Windows.
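You can check this yourself on Linux: the libraries a process actually has mapped show up in /proc, and the same .so appears in many processes while its pages are shared rather than duplicated. A quick look (the process name is only an example):

    pid=$(pidof -s nautilus)
    grep '\.so' /proc/$pid/maps | awk '{print $6}' | sort -u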
Remember, dep. hell isn’t the only problem here. Does desktop Linux ever feel slightly sluggish? Now you know why.
Actually, the Linux desktop doesn’t feel sluggish to me on my Athlon 900. Then again, I do use kernel 2.6, which has improved desktop performance.
The one thing X lacks (though not for long) is desktop double-buffering. This can give some people the impression that the Linux desktop is a bit less responsive than Windows or Mac OS X. But that has nothing to do with multiple libraries…
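In case anyone wants to try it early: assuming “not for long” refers to the new Composite extension, recent X.org servers can already switch it on in the config file, roughly like this (it’s still experimental, so treat it as a sketch):

    Section "Extensions"
        Option "Composite" "Enable"
    EndSection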
There’s one thing that we can all agree on, and that is choice. Choosing which OS you want, whether it’s Mandrake, SuSE, Windows or Mac: they’re all great OSes in my book, and having a choice is a beautiful thing. I guess I just love computers as a whole.
It hasn’t existed since the days of Windows 95, when programmers were still trying to get their stuff onto the new Win32 architecture.
You’re joking, right? WinME, the last version that came out before I made the switch to Linux, had a ton of problems with dependencies. I was constantly searching for DLLs, or uninstalling and reinstalling programs to try and alleviate the problem.
In the ol’ days, you had bad days messing around with msvcrt.dll versions 4 and 5 and so forth.
OK, I admit that kind of DLL hell is very rare these days… however, we STILL have multiple copies of the same DLL version scattered all over the system: foobar.dll in C:\progra~1\foobar and C:\winnt\system32 and C:\winnt and C:\progra~1\barfoo and C:\foobarr and C:\fooobarrr2 and so on and so on… and that sucks if you have lots of apps installed that make extensive use of shared libs.
>>linux as a whole is one giant meshed together ball of a OS mess.
Nice troll, and a nice opinion.
You are obviously not a programmer, and you obviously have not seen the Windows source code.
Crawl back under your rock.
>>Havoc Pennington is the author of Metacity
Thanks.
I always wanted to know who was responsible for nearly destroying GNOME single-handedly.
Forcing us to use opaque windows with GNOME equals tutti-frutti all over the screen.
Good thing we can still shoehorn Sawfish in.
I thought a little before posting, but here is a long message anyway, sorry for that :”)
>> And why is this? Because not all software is available in your favorite package format. This is esp. true for closed-source packages. “emerge foo” works great until the day you find a foo which isn’t available in Portage, and then you realize that Portage and all these other solutions don’t solve the problem, they hide it. If Linux is to grow into a significant desktop solution, this problem has to be solved properly, not covered up by a package manager. <<
]- Then compile it yourself; it is stated on the Gentoo site that Gentoo is for power users.
If not, then you can pay for RH or MDK; AFAIK, if you pay, you can “push” the MDK team to add the packages you want to the distro and solve all your dependency problems.
>> That is exactly the problem! It is good for your maintainability! A lot of Linux users seem to forget that 95% of the world’s population does not, or does not want to, maintain a computer at all! These people want to download an app, click it, and finish the damn install! They don’t want to maintain their computers; they want their computers to be maintained. <<
]- Linux is the effort of its users. Those 95% of people who don’t want to be bothered with dependency hell will have to pay commercial distros to solve their problems for them. People often forget that the idea behind OSS is not to conquer the world for the sake of conquering (or making a lot of bucks) but to do things right. (And if, along the way, that leads to world domination or some bucks for a living, great, but it isn’t necessary.)
As you can see, BIG names have already stepped up to solve these problems for end users. Give them some time :”)
I use Gentoo and I’m happy with the way it handles dependencies. If I have a problem, I google; if the answer is out there, I fix it. If not, I ask on the forums/mailing list; if there is a solution, I apply it. If not, I try to solve the problem myself. If I find the answer, I share it with the community.
If I can’t, I will pay someone to fix it for me; hasn’t happened till now :”)
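Half the reason Portage rarely surprises me is the pretend flag: you can see the whole dependency list before anything gets built (“foo” again just an example):

    emerge --pretend foo    # show what would be installed, build nothing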
SO….
If you don’t want to bother with solving computer problems, PAY SOMEONE to solve them for you; otherwise, learn more about the programs and the OS you use.
Do you repair every problem that happens with your car? Do you build your house by hand? No! You pay someone to do the things you can’t do.
If you don’t want to use Linux, then fine; no one forces you to use it. Use Windows, we Linux people are fine with your decision.
I will be happier if you use what suits you best.
In fact, in most cases people are FORCED to use Windows.
With Linux, in most cases, people use it because it is technically superior or does the job better; those who use Linux feel the difference.
Of course it does not perform best in all areas, but people are different, so it is really stupid to think there will be Windows and only Windows. The same goes for Linux.
If after 10 years Linux throws Windows in the trash and eats up 95% of the desktops and servers, then something else will show up to displace Linux. This is called evolution. :”)
OK, this became too long… :”)
For me, Linux rocks, Perl rocks, and Gentoo rocks too (even though Portage is written in Python).
And Windows sucks, and VB sucks too.
I hope this gets caught by the suck-o-meter :”)