“Every year or so I like to see how Microsoft is doing in its attempt to make a desktop operating system as usable as Linux. Microsoft Windows XP, Home Edition, with Service Pack 2, is a tremendous improvement over previous Windows versions when it comes to stability and appearance, but it still has many glitches that…” Read more at NewsForge.
Poor bundled software, poor documentation. Windows must work on these things to be considered a serious player in the OS world.
Just goes to show that it’s not what you’ve got, it’s how you present it!
That was fairly clever and didn’t come across as a flame.
Well, I’m a GNU/Linux “supporter,” but your article is not fair.
Probably XP didn’t recognize your NIC because it was released after the release of XP, hence the driver is not included.
You should have tested with a Linux distro of the same age as XP.
It doesn’t matter whose fault it is. The fact is a user tried to install winXP and it failed on this point.
Linux has simply progressed to where there is better general driver support and hardware autodetection. As for 3D video cards, Windows requires you to download additional drivers to get 3D to work as well. As a matter of fact, SuSE was able to get 3D hardware acceleration working with my ATI Radeon by requiring me to check just one checkbox! I’m impressed.
I agree that this was clever (even though it was written by roblimo, ugh). It’s funny because it turns an argument on its head, and the article has a lot of truth: today’s Windows could learn a lot about usability and value from Linux.
Here’s something I don’t see Windows matching any time soon:
$ apt-get update && apt-get upgrade # refresh package lists and update the entire system! usually without rebooting!
$ apt-get install firefox # or whatever program
So yes, one of the reasons I use Linux is that it’s easier.
“to make a desktop operating system as usable as Linux.”
That makes me laugh. As usable as Linux? Sure, Windows may not be very intuitive or user-friendly at times, but calling it less usable than Linux is just funny. I’ve yet to find a distro where hacks and workarounds aren’t required to get basic drivers and software running. Ubuntu, for example, has a habit of freaking out whenever nVidia drivers are installed. I had to learn Vim (itself not user-friendly at all) just to get my 3D card working. Every distro I’ve tried so far has had problems like these.
>Probably XP didn’t recognize your NIC because it was released after the release of XP, hence the driver is not included.
>You should have tested with a Linux distro of the same age as XP.
I can see that too, but on the other hand, consumers have the choice of a modern Linux system. It’s important to approach this from the consumer’s point of view, because it’s the consumer that counts. No consumer is going to pick an old Linux.
In other words, frequent releases are yet another strength of Linux.
How usable is a desktop system that is constantly being hit with viruses and spyware?
A Most Excellent Troll, this is. It will bring in a large catch. You are to be congratulated, Sir.
“English-language password, but a 20-character string of apparently random letters and numbers. It took me several tries”
I believe it’s 25 characters; can anyone back me up?
When installing software on Windows is as easy as it is on Linux, then we can begin to talk. I’m not even gonna talk about spyware, adware, crapware, viruses, trojans, etc. And how about a consistent user interface? The Windows development team, and Windows developers in general, should spend a good year studying GNOME to see how it is done. Yep, these days I cringe when I have to use Windows.
“You should have tested with a Linux distro of the same age of XP. ”
Or MS should have shorter release cycles. MS hasn’t released a new OS since 2001.
And to point out some other faults: MS stopped putting feature enhancements into W2K in approximately 2002, two years after release. It was relegated to the scrap heap, despite being used primarily in the corporate universe. Feature enhancements went only into XP. XP users get a full five years of bug fixes and feature enhancements, while W2K had about one year of feature enhancements.
Their business practices leave a lot to be desired. FYI: I am a home user running W2K (retail).
It’s a fair enough article. Everyone knows XP is not perfect. I still use it for the ease of use, the familiarity, and the fact that I can get a crapload of software otherwise unusable on Linux unless I resort to using virtual machines.
sarcasm
He has a NIC older than spring 2004 in a machine that’s a few years old? I doubt it. Remember, Windows XP SP2 is only a year old.
Pretty funny. He does this about once a year. I guess he’s still bitter that the open source desktop isn’t progressing as fast as he expected.
I don’t think anything gets easier than Click-N-Run or Synaptic.
How many Windows installers will keep you up to date with the latest upgrades and security fixes? OK, there’s Windows Update, but that only covers MS stuff.
Nothing else to say, besides it’s a funny article (for both sides!)… =]
“I had to learn Vim (itself not user-friendly at all) just to get my 3D card working.”
Ahhh, Vim, kind of like Vi. Which would be a distant, feature-rich cousin of Notepad (I think). Learning vi or Vim is a good thing, since it is quite portable. I actually run WinVi and Vim on my W2K box.
Now if that is the basic trouble with Linux, just compare a simple text file to the registry. Sorry, mate, a simple text file is way easier than the registry. The problems with the registry:
1) Not fully documented
2) Registry bloat can slow system performance
It’s easier to redo or create a text file. And that one text file typically affects only the one program in question.
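To make that concrete, here’s a tiny sketch (the config file name and its keys are invented for illustration): one command changes one setting, and nothing else on the system is touched.

```shell
# Hypothetical plain-text config; the file name and keys are made up.
printf 'resolution=800x600\nfullscreen=no\n' > app.conf

# Flip one setting with a single in-place edit; only this file,
# and therefore only this one program, is affected.
sed -i 's/^fullscreen=no/fullscreen=yes/' app.conf

grep '^fullscreen' app.conf
```

Compare that with hunting through regedit for an undocumented key: the text file can be read, diffed, backed up, and re-created with any editor.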
Just my humble opinion.
I have been trying to get XP Pro on a big-name dual Xeon box. The worst part is having to call and wait for a new 8-billion-digit number to enter just to get it to work.
While Linux is no joy, it works, and I can reload it when I want to. Soon I will replace MS products with open-source ones. Might want to consider that honest people don’t need to be treated like crooks when they reload their OS.
Haha, I reread the article, and it is truly funny, or should I say, ironic?
Oh well, personal preference.
I think for a seasoned Linux user, Linux does tend to be easier than Windows... but I think us Linuxers tend to lose touch with the non-power-user world. Think of it like this: I run Gentoo, and by the time I get it tweaked and all set up, it pretty much runs hands-free and installs anything I want with a simple emerge command. However, what did it take to get it to that point? A lot of BS, that’s what. And the average Windows user would never want to go through all that.
And for those of you who run Debian, I know what you are thinking: there are all sorts of Debian distros out there that are easy to use: Xandros, Ubuntu, MEPIS...
Well, apt-get breaks easily in the hands of someone who does not know what they are doing, a.k.a. the average Windows user.
So, for the person off the street with no computer knowledge, Windows probably is easier to use (till it gets filled up with adware and spyware).
That was very funny, but also very true.
The last time I installed Windows, I had to go searching for graphics card drivers, an audio card driver, a webcam driver, a cable modem driver, DVD player software, CD burner software, and TV card software; it took about an hour AFTER the OS install to get those extras in.
ANY distro has all that and more after a 20-minute install, all set up and ready to go.
Also, I have a plain vanilla TV card. Under Windows I can watch TV on it... Under Linux, it is a TiVo substitute: watching TV, recording, scheduling, etc.
It is not just that Linux is starting to support more hardware than Windows out of the box; it is the extras that it does with it.
Oh, and another thing, Lumburgh: just because people have jobs and get wages does not mean that said wages, which people have worked damned hard for, should just be handed straight over to Microsoft, does it?
I just got a new laptop that Linux has some problems with, so I’ve been working with Windows XP Home SP2 for a couple of weeks.
First off, let me say that XP Home is definitely a nice OS. The wireless is flawless, switching automatically to whatever network is around, and handling keys and encryption without third party utilities. Suspend and resume and hibernate work great, I find it’s very rare that I have to reboot, mostly I just hibernate (takes a while with a gig of RAM!) and then it comes up again very quickly.
But the one thing that will bring me back to Linux is the small features that make my computing environment fun. Dragging and resizing windows from anywhere within them, true network transparency, quality free applications without nagware, and processes that don’t break for no discernible reason.
Yes, to get Linux installed and working right sometimes takes quite a bit of fiddling, but once it works, it will continue to work for as long as you want it.
“Important Notice: OSNews is not just about operating systems, it is a computing site in general.”
Take that down; it’s very unprofessional. The smarter ones will realize that the title says “OSNews.com – Exploring the Future of Computing.” If people complain, they complain. Don’t let it bother you.
“Or MS should have shorter release cycles. MS hasn’t released a new OS since 2001.”
Why? So the usual trolls can say Microsoft is trying to bilk customers by releasing so frequently?
Does anyone take *anything* from NewsForge seriously anymore? Honestly? Ugh, there’s 3 minutes of my life I’ll never get back.
Zealotry will kill any chance of Linux getting anywhere in the *minds* of your “average home user.” Linux zealots have a completely unrealistic grasp of what people will actually pay for and use.
About installing software: sure, if what you want is packaged for your distro, fine. But if you’re looking for something lesser-known, on Windows at least such programs come with an installer of some sort.
On linux, you grab the `.tar.gz’ and `compile’ the program yourself. With `make’. To do that you open an `xterm’. And then you get a `prompt’ to `type’ `commands’ into. Yes, that’s typing as in what you usually do in Word. No clickety-clickety.
And sure, that’s fine for me, and you won’t see me switching to windows anytime soon. But not for the average joe….
oh no!
Yet Another “Is Windows Ready For The Desktop” Article?
😀
“Linux zealots have a completely unrealistic grasp of what people will actually pay for and use.”
Really? You seem to think people buy Windows PCs because they’re impressive, and not because it’s the only thing on the shelf at Fry’s, Best Buy, Walmart, etc. They buy it because it’s there and because they don’t know the difference, and half of them are just as lost with Windows as they would be with any other OS. I spend half my time showing people how to copy and paste, figure out where they saved their files, etc. Fact is, no OS is user-friendly enough for the common man.
He’s joking, right?
I guess he’s still bitter that the open source desktop isn’t progressing as fast as he expected.
Nah, he’s just poking fun at those who spread FUD by saying that Linux isn’t as usable as Windows. In fact, Linux IS as usable as Windows, however that doesn’t necessarily translate into market share. Case in point: Mac OS. No one would claim that it’s not ready for the desktop, and yet it has a market share similar to Linux’s…
Both OS X and Windows are leaps and bounds ahead of Linux. Given the opportunity (OS X on x86), I would dive cold-turkey into OS X just to play around with it some more. It’s an elegant system. Windows is elegant, but it’s a no-bullshit platform. You just install it, install your drivers, install your applications, and you use it.
Linux is a platform for those whose time is worth $0/hour.
All this talk of apt-get and rpm ignores the real issue. Let me give you an example:
Johnny installs FooLinux 3.0. He uses GAIM for chatting to his friends. One year on, the MSN protocol changes, and he needs a new version of GAIM to continue chatting.
But his distro only provided security fixes, and is no longer supported. There is no new GAIM package for his distro. What can he do? ‘build’ the ‘tarball’ from ‘source’, potentially with different file locations, and parameters that’d complicate things? Oh, and he’d need a ‘compiler’ and ‘make’ installed. WTF?
Oh, or he could install FooLinux 4.0, which has the new GAIM. So just to keep chatting to his MSN buddies he has to re-install his entire OS, bringing new bugs/changes/complications along the way, altering the software he currently uses and likes, and possibly increasing hardware requirements.
Oh, or he could try the GAIM ‘package’ from FooLinux 4.0 on his 3.0 installation. But then he’d have to work out the ‘dependencies’, which could include a new ‘GTK’, and in turn ‘atk’ and ‘pango’, and well, it goes on and on.
Basically, just to keep chatting, Johnny has to go through all manner of complications. It’s insane. And this is just an example – it affects hundreds of apps.
Contrast this with Windows: you install the OS, eg Windows 2000, and it’ll run programs for years and years – just double-click the .exe. This really is true; 99% of apps doing the rounds will install on a 5-year-old OS. Now try this with a Linux distro from 2000 – try installing, say, the latest Gnumeric. You’d have to completely overhaul the system, changing everything you’re familiar with, introducing new bugs and higher system demands.
THIS IS THE PROBLEM, PEOPLE. The whole concept of ‘package managers’ and ‘repositories’ doesn’t solve the fact that users are far too highly constrained in what software they can install, and when.
And sure, that’s fine for me, and you won’t see me switching to windows anytime soon. But not for the average joe….
…which is irrelevant, since the average joe will not install packages that are not available for his distro. Heck, I’m an advanced user and I’ve barely ever had to compile anything for my desktop Linux PC. Mandrake (sorry, Mandriva) has upwards of 7,000 packages in its repositories. All the good programs are in there.
I built a box for Linux and haven’t run into many problems with it. Heck, I can even pop in a Linux LiveCD and it will just work (thus far, three out of three distros work). Windows was a different story, though. There were four sets of drivers which had to be downloaded and installed after the OS was installed. Then I had to install applications to actually make the machine useful. As the author pointed out, many of the applications which I take for granted under Linux cost money. Even the toy applications (never mind the ones which take man-years to develop).
If you are given a bare machine, a good Linux distro will usually be much easier to handle than Windows. If you are given a box with the OS and applications preloaded (with a restore CD to boot), then Windows will probably be easier since it does have some nice applications. (Yes, I miss WordPerfect and CorelDRAW, but they simply aren’t worth the headache that is Windows.)
Windows is elegant, but it’s a no-bullshit platform. You just install it, install your drivers, install your applications, and you use it.
It takes a lot longer to have a functional Windows install than a functional Linux install. I know, I recently had to install WinXP on a friend’s new PC. Good thing I had Knoppix, too, so I was able to partition the disk beforehand. So not only is Windows very long to install, but the installation was made easier thanks to a Linux LiveCD.
Linux is a platform for those whose time is worth $0/hour.
Ah, yes, more FUD. Meanwhile, costs due to malware on Windows reached more than 150 billion dollars last year. So if you have money to burn, go with Microsoft. If you want to save money, Linux is for you.
WTF are you using a Linux LiveCD to partition your friend’s drive for? I hope you realize that the installers for NT 4.0, 2K, and XP have a built-in partitioning utility. Do you? If not, then what you have to say about XP taking longer/being more difficult to install is null and void, because you’re obviously not as competent as you should be.
PS: Show me a quote for $150b and I’ll believe you.
“All the good programs” almost never means all the programs. Years ago, trying to get Bochs 2.1 on SuSE: there was no package, compilation gave errors, and an RPM from another distro pulled in a lot of other dependencies. No apt-get of today, or any other package manager working in a similar way, would solve that issue either, because the real problem remains.
And comments too, especially regarding easy Linux installation: who cares if it’s easy to install if it’s barely usable as a desktop OS? With a limited choice of poorly written end-user software? While Windows usually comes preinstalled, easy to use, and with a broad choice of applications?
And, I believe, it’s not a problem to check what kind of hardware is in your system, see whether it’s in the HCL, and get the drivers before starting the installation. And, to make life even better, integrate those drivers into the installation disk?
And the author is definitely a talented provocateur.
Nun’s backup….
http://www.vnunet.com/news/1159778
WTF are you using a Linux LiveCD to partition your friend’s drive for? I hope you realize that the installers for NT 4.0, 2K, and XP have a built-in partitioning utility.
A graphical utility that allows non-destructive resizing of partitions? Really? How do I access it before installing WinXP?
If not, then what you have to say about XP taking longer/being more difficult to install is null and void, because you’re obviously not as competent as you should be.
It is still valid, because it takes so much longer to install XP + Office + DVD Burner + all the other programs. Heck, even installing only the WinXP OS takes longer than installing an entire Linux system, OS + apps.
PS: Show me a quote for $150b and I’ll believe you.
Look for it yourself, buddy. Just Google “global cost of malware 2004”. The actual figure is 166 billion (US$), though other estimates place it between 169 and 204 billion. That’s between $281 and $340 worth of damage per machine, according to the article…
Linux is a platform for those whose time is worth $0/hour.
Really now? I make decent money, and upper management loves it when you build a new server with free software out of some free junk already lying around the IT department.
Erm … that was more like a rebuttal of what nun said.
According to newly published research from IDC, the need to identify and eradicate these parasitic programs will drive anti-spyware software revenues from $12m in 2003 to $305m in 2008.
1. Not only was it not $150 BILLION, it wasn’t even CLOSE to $1b. It won’t be close to $1b in 2008.
2. The numbers quoted are revenues for anti-spyware software, not losses.
What kind of crack is nun smoking? Thanks for the counter-evidence though.
>THIS IS THE PROBLEM, PEOPLE. The whole concept of ‘package managers’ and ‘repositories’ doesn’t solve the fact that users are far too highly constrained in what software they can install, and when.
I do agree with your entire post.
I would also add that we can’t expect GNU/Linux distros to grow forever.
I mean, Debian, for example, is getting insanely big! It is not clever to pretend that every single piece of software written in the world over the next 10 years will enter (say) Debian.
This means that “having the software included in the distro” is a condition we cannot expect to hold for much longer.
(Also because not all software is free.)
Why would the average joe want to install Bochs? Win4Lin is a much better solution (and comes with its own installer).
Not all software is available for Windows XP either. Final Cut Pro is a good example. Nor is Konqueror, the best file manager/web & ftp browser combo out there.
Obviously, if one specifically needs software that is not available for Linux, then one should probably stick to Windows. That doesn’t mean one has to waste one’s time (which allegedly is worth something, according to Zealot) and spread FUD about Linux on Internet forums…
It’s just that Bochs is an example you can apply to a lot of other software if you get into that situation.
What kind of crack is nun smoking?
I guess you posted this before reading the link that I provided… you know, the one that provides the 166-billion-dollar figure (but claims that it could be as high as 204 billion)?
Would you like some salt with that crow, Mr. Zealot?
A bad example. Joe Average will not need to compile software.
BTW, SuSE may not be the best distro to pick as an example. Debian and Mandriva both have lots more packages, including ones for Bochs.
Not everything here is mission-critical, and thus not worth spending craploads of money on, but when you’re asked to do something, and do it in the cheapest possible way, how can you beat not costing the company any more than what they’re paying you? Sure, the hard drive might fail tomorrow. So what? We’ve got more, and we’ve got backups where it counts. It takes no time to rebuild what you lost in these cases, and you still come out cheap, and that’s what the people signing the paychecks like. We drop the money where we need it and save it whenever we can.
Fair enough. I’m just not used to the idea of building “servers” for a business out of “free junk” lying around. LOL
Why stick to specific examples? Joe Average might Google for his software and find something, then look in the repository, and it might not be there. Even a large package count only makes it probable, not certain, that everything is really available. Under Windows, I can just download a program from its page when the author offers a download. For Linux, the author can’t really offer that service; or if he does, it’s probably only for the distro he uses. Or he might try to be LSB-compliant, which would still be the best solution, except for the question of which package manager he would use.
Now even if Debian has 100 times the packages of SuSE, it doesn’t change the basic problem.
“Every year or so I like to see how Microsoft is doing in its attempt to make a desktop operating system as usable as Linux.”
Wowowow, am I on acid? Did that dude say that? Now I am a Linux supporter but I’m not that zealous.
Under Windows, I can just download it from the page when the author allows to download.
If it’s available for Windows, that is. What if it isn’t? Even if Windows had 100 times the amount of software that SuSE has, it doesn’t change the basic problem.
If the software isn’t easily available for your system, don’t install it. End of story. It’s as much a problem on Windows as it is on Linux.
204 billion because of malware? Strange that I didn’t get my share of it…
For those who scream because Windows includes only a minimum of applications:
Last time I checked, MS was sued because they included a media player. The same happened some years ago when they had the great idea to include a web browser.
Yes and no: if it supports Windows, it supports Windows.
If it supports Linux, that doesn’t automatically mean it supports the specific distribution you run.
Not every piece of software supports every OS, that is true, but the Linux community, which wants to be one big movement, is internally shattered on that point, into a lot of pieces.
This is a one-sided article, but I haven’t liked Windows for a long time. I prefer Linux; it’s a matter of taste…
“Linux is a platform for those whose time is worth $0/hour.”
Well, $10 per hour at a first job is not very bad in my country…
Windows XP has been around since 2001; well, you can install Gentoo now, and in 2010 it will still be updated.
For me, the best OS concept is Gentoo/Linux. Let’s see what Gentoo/BSD will be like.
cheers.
Sérgio Machado
Good point. When MS bundles extra applications, they’re leveraging their monopoly. When Linux distros do the same, they’re applauded.
Oh the double standards.
>> Microsoft is doing in its attempt to make a desktop operating system as usable as Linux <<.
Does this mean that I have to fiddle around with a lot of ugly config files and console commands to tune just a single behaviour of my system?
Well, no thanks. Muharharharhar, …
Final Cut is a nice app, but for Windows there are plenty of professional-quality NLE tools available, so you have the choice. As contrasted with Linux (Cinelerra? Don’t make me laugh).
Maybe I used the wrong words to describe it.
We’ve got plenty of high-dollar equipment. We also have a Sun E450 (old, but expensive) that I kind of consider junk, since we’re not using it for anything at the moment. Though Linux doesn’t really apply to it, because we’ve already got Solaris licenses for it anyway.
Not every piece of software supports every OS, that is true, but the Linux community, which wants to be one big movement, is internally shattered on that point, into a lot of pieces.
For one, the “Linux community” does not want anything. It’s not a monolithic movement, but rather a nebulous group that encompasses all kinds of users and developers. Making such generalization about the “movement” is misleading.
Second, there are lots of distros out there, but only a few really big ones. The vast majority of software is available for those big distros, and “joe average” is likely to find everything he needs, even if it isn’t the exact piece of software he’s looking for.
I don’t think this is a serious problem – personally, it’s never prevented me from having a fully functional Linux system.
“Johnny installs FooLinux 3.0. He uses GAIM for chatting to his friends. One year on, the MSN protocol changes, and he needs a new version of GAIM to continue chatting.
But his distro only provided security fixes, and is no longer supported. There is no new GAIM package for his distro.”
Heh, I had to laugh at this example…we just released an MDV bugfix update for kdenetwork yesterday to allow kopete to work with the recent MSN protocol change…
…ZealotHater was quick to say that I was smoking crack when he thought I had been proven wrong, but he now completely ignores me after I’ve given him the link that proves me right…
200 billion dollars, baby!
“Under Windows, I can just download it from the page when the author allows to download”
You missed a bit out of that sentence:
“…and then go googling for six-year-old Visual Basic runtime DLLs”.
😉
“Heh, I had to laugh at this example…we just released an MDV bugfix update for kdenetwork yesterday to allow kopete to work with the recent MSN protocol change…”
That’s not the point. Your distro is still supported. Once that ends, what does the user do? Upgrade his entire distro, changing the software he knows, complicating things, and increasing requirements?
Or ‘compile’ from ‘source tarball’? Or mess around trying to get kdenetwork from the next MDV to work on his current one?
It’s great that you’re keeping people up to date, but it’s only a temporary solution for 12 months or so. Users should be able to install software on installations several years old, without worrying about dependencies, distros, repositories, filesystem layouts etc.
Windows 2000 users don’t have to worry for 99% of apps. Red Hat 6.2 users don’t stand a chance in hell of installing the latest Kopete without massive overhauls or days of fiddling.
That’s the point. Once your support ends, and a user needs new software, s/he can’t just double-click and install. It’s a massive job – it’s a problem with Linux software installation, and the current ‘solutions’ are short-term papering over the cracks.
“Good point. When MS bundles extra applications, they’re leveraging their monopoly. When Linux distros do the same, they’re applauded.
Oh the double standards.”
No. When Microsoft bundle a Web browser, you need a _specially designed tool_ to uninstall it. When Microsoft include a media player, they only include one, which advertises Microsoft content at you which you can download over only the Microsoft DRM platform for playback on your Microsoft-approved media player.
Linux distributions, notoriously, bundle about fifty three web-browsers and seventy six media players, each of which can be removed with a fairly simple command and none of which ties you into an entire philosophy of acquiring content…
Final Cut is a nice app, but for Windows there are plenty of professional-quality NLE tools available, so you have the choice. As contrasted with Linux (Cinelerra? Don’t make me laugh).
You want professional? How about Piranha from interactive FX? Yes, it runs on Linux. For the lower end, MainActor runs very well on Linux. It doesn’t have all of Premiere’s bells and whistles, but take it from an ex-Film student: bells and whistles don’t make a good edit!
Cinelerra isn’t bad, but it’s unstable and the UI sucks. However, once you learn how to use it, it’s quite capable, and it can use a local distributed network for background rendering (quite a powerful feature).
But the point (there was a point, you know – just re-read the exchange between me and Legend) is that Final Cut Pro isn’t available for Windows. For the sake of the discussion, I really don’t care about alternatives: the fact is that not all software is available for Windows. Please try to stick to the point. Thanks.
“… and cursing the application developer for not including them …”
He/She is the guilty one in that case.
I wasn’t saying your entire point was invalid, just that I happened to find the example funny. To an extent I agree with your point…Microsoft have a stable platform on which you can build a program and reasonably expect it to work on any Microsoft OS released since 1998, and it’s called Win32. Linux has a (graphical) stable platform on which you can build an app and reasonably expect it to work on any Linux distribution released since 1998, and it’s called xlib.
Now go run xedit.
(time passes…)
See why people don’t build apps on the only really stable, long-term platform available on Linux?
I agree that this is a bit of an inconvenience for Linux. There really isn’t a fix at the moment, though. It’s a situational thing that cannot be changed in the short term.
Most don’t even want it to be addressed …
Oh, and in case it wasn’t clear from my last post, the problem doesn’t have anything to do with installation methods, IMHO. The reason you have to rebuild the entire OS to install the latest kopete on Red Hat 6.2 has nothing to do with RPM or tarballs or whatever. It has to do with the fact that large amounts of things whose existence is necessary for kopete to function did not exist when Red Hat 6.2 was released.
I totally agree with what you’ve said here!
Linux needs a standard installation routine for all distros. IMHO this is the key for a broad success for Linux on desktop systems. Autopackage is the way to go!
But then try to just add RPMs and their dependencies; you’ll get into version chaos, and when you think you should upgrade glibc… well, every time I tried, it was “cya, system.”
“This is autopackage, the multi-distribution binary packaging framework for Linux systems.”
Never seen that one before, that seems interesting. Thank you for the info!
“Oh, and in case it wasn’t clear from my last post, the problem doesn’t have anything to do with installation methods, IMHO. The reason you have to rebuild the entire OS to install the latest kopete on Red Hat 6.2 has nothing to do with RPM or tarballs or whatever. It has to do with the fact that large amounts of things whose existence is necessary for kopete to function did not exist when Red Hat 6.2 was released.”
Absolutely, that’s true. I just meant, whenever the problems with Linux software installation come up, some people say “I just apt-get it” etc. like it’s a big solution. Apt (and apt4rpm) is great; it’s not a solution to the underlying problems though.
Like you say, it’s an inconvenience (growing as the range of software and distros increases) and is hard to change. One thing I’ve been really interested in is Autopackage (http://www.autopackage.org).
This solves two problems:
1) Users only need one source for their software, regardless of distro. No need to keep track of repositories or find sites or make sure foolib.so.4 is installed
2) More importantly, MASSIVE reduction in testing. With so many distros all testing their own packages of the SAME software, it’s a huge duplication of effort. ONE well-tested Autopackage GIMP is quicker, easier and saner than 500 less-tested GIMP packages.
I think people seriously underestimate the degree to which “familiarity” gets confused with “usability”. Having not used Windows as my desktop since NT4 w/ Office 97, I have to say XP machines with Office 2003 make me uncomfortable. So much so that I prefer to use a Mac, even though I’ve never owned one, because it’s more familiar (UNIX-y).
What annoys me is that when people say certain things about Linux, they don’t take this familiarity factor into account. They harp on software installation, but don’t consider that Synaptic isn’t really any harder or more time-consuming; it’s just different. A lot of the complaints people level at Linux on the desktop are of this form.
Of course, there are real problems with Linux. But even with these, people refuse to have perspective, or take into account that users can work around them. They don’t take into account that their OS has problems too, and users have to work around them. They don’t take it into account, because they’re so used to those problems themselves, they don’t even notice anymore!
For example, sure, hardware support on Linux isn’t as good as on Windows. Has this ever bothered any real Linux user? Unlikely. It takes me 5 minutes to check how well Linux supports a particular piece of hardware, and I can do it at the same time as I’m Googling for any “don’t buy this, or it’ll kick you in the shins and steal your lunch money” warnings about the product. On the other hand, nobody mentions how Windows can be a bitch about hardware. I’ve never had to dump a piece of hardware because it wasn’t Linux-compatible, but I have had to replace two network cards: after days of a card working on every machine but the one it was supposed to be used on, I just went and bought a new one. Of course, when your hardware isn’t supported on Linux, welcome to text-file madness. Hotplug and udev deserve a special place in hell for being both badly documented and overly “featureful”. To this day, I don’t quite understand the interaction between hotplug, udev, and pcmcia-cs.
My point is, people adapt to things. They get used to things. Reviewing any platform, Linux or Windows, without taking into account the user’s existing familiarity or how the user can adapt to the new situation, is just silly.
No, Autopackage still wouldn’t really solve the problem. The author of Autopackage explicitly says it should really only be used for managing _applications_, not for managing the underlying libraries and really fundamental parts of a distribution, which are what cause the big headaches when trying to install new software on old distributions. The solution is simply to have less dynamically changing, more stable platforms, and this will happen inevitably over time as the platforms get _better_ and need less radical improvement. I’ve already explained why we can’t use the common platforms that already exist; all the old toolkits you can rely on to exist everywhere are underpowered and (equally important) look like ass. Current platforms are a far better cut at it. I can _reasonably_ imagine myself still using some GTK+ 2 apps in five years and them not being too horribly out of place – like 1998-era Windows apps look on XP.
With limited choice of poorly written end-user software?
“Poorly written” is a phrase I’d take exception to. It’ll be a decade or more before your average Windows PC comes with the caliber of software I use on my Linux machine. Right now, I’m a very happy engineer with a combo of Maxima, Octave, TeX, GCC, and Emacs/TexMacs. These five programs do 90% of what I use a computer for, and they do it *very* well. Maxima and Octave are extremely powerful programs with a lot of support. Not as good as Mathematica and Matlab (which run on Linux anyway), but they wouldn’t be taken lightly if they were $600 products competing in the same market. The magic comes in when they’re used with TexMacs, though. I don’t know of any combo as powerful on the Windows platform, save for these applications themselves.
And TeX is what Microsoft Word dreams, deep in its heart, that it could be. Don’t get me wrong, Word is “easy”. But then again, so is a hooker. Easy or not, Word’s output looks horrible, while TeX’s output is just beautiful. As for GCC, yeah, MSVC might be faster and prettier, but can it cross-compile to a microcontroller?
Of course, I’m probably not your average user. But that’s kind of what I’m getting at. “Average users” don’t need anything better than OpenOffice, Gaim, and Firefox. They need good, simple applications with just enough power. The GNOME folks are doing a fine job of addressing that need. However, it’s the high-end users who need high-end software, and there is no shortage of that on Linux.
It’s unfortunate that most people aren’t intelligent enough to avoid getting spyware
Yes, my engineer friends and I are all dumbasses. Just because we didn’t know that plugging a Windows machine into a network without a firewall would get it MSBlastered in 2 hours flat.
There is no “two-minute” lesson that fixes the spyware problem on Windows. It involves a lot of battening down the hatches (changing IE security levels), changing system defaults (why oh why is the user an administrator by default???), updating software, installing service packs, and regular monitoring and maintenance with the proper programs (virus scanner, spyware scanner, firewall).
You missed a bit out of that sentence:
“…and then go googling for six-year-old Visual Basic runtime DLLs”.
Windows XP comes with the VB5 and VB6 DLLs. Of course, you still have to google for older VB DLLs if they weren’t already included in the package of your favourite application, but at least you know your six-year-old app will _work_. Get a six-year-old binary application for Linux and try it on your box.
Microsoft can be blamed for a lot of things, but backward compatibility isn’t one of them.
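To see concretely why an old Linux binary stops working, you can check its dynamic dependencies: `ldd` lists every shared library a binary links against, and any “not found” entry is a dependency the loader can no longer resolve. A minimal sketch, using `/bin/ls` purely as an example binary:

```shell
# List the shared libraries this binary needs. Each resolved line looks
# like "libfoo.so.N => /path/to/libfoo.so.N"; a "not found" line means
# the library it was built against is no longer on the system.
ldd /bin/ls
```

Run the same command on a six-year-old binary and you’ll usually see exactly which ancient library (an old libstdc++, say) has since disappeared from the distro.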
I have set many, many people up on SuSE, versions 9.0 through 9.3. Most have no problems with setup, and most manage nearly all maintenance through YaST without trouble. True, a couple of times I have had to go help one of these people with a problem – but then again, XP users also need to find outside help with their systems. In reality, average Windows users are much more likely to need outside aid than a similarly set up average person using SuSE.
This humorous article pointed out what I have known for a year now: once a person goes through the pain of converting to a quality Linux distro, they will find Linux easier to use and maintain when all things are considered.
A little side note here… try to find a Windows box that is 95% x86-64 optimized. In this respect, Linux blows Windows out of the water… at least right now.
Although that has nothing to do with usability, lol.
Red Hat 6.2 is a poor example, for the simple reason that Linux just wasn’t as mature then as it is now (or as Windows was at the time), and it’s kind of a crummy design to begin with. However, modern Linux has taken a different approach than Windows. It does not allow the idea of running new software on an old OS. That’s a bad idea anyway, both technically (new OSs need to keep all the old libraries) and from a security point of view (security updates to an OS last only so long). In a modern Linux, particularly Debian-based OSs like Ubuntu (though other OSs are heading in that direction too), the OS is always up to date. You don’t “upgrade” an OS, you “update” it, several times a month. That way, the question of “how to install new software on an old OS” becomes a moot point. Your OS should never be old.
I have a feeling that Microsoft is likely to move in this direction as well. If you look at XP, you’ll see that fairly major security and feature upgrades have come out in service packs over Windows Update. Back in the Win9x days, they’d have been rolled into a new release. Between their desire to sell you an “OS maintenance subscription” and their frustration at the slow diffusion of the .NET runtime, I’d say such a move is not unlikely at all in the future.
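The “always up to date” model described above comes down to a couple of lines of configuration on a Debian-style system. A sketch of an /etc/apt/sources.list – the release name and mirror hosts here are illustrative, not a recommendation:

```
# Hypothetical /etc/apt/sources.list: one line for the release itself,
# one for its security updates.
deb http://archive.ubuntu.com/ubuntu hoary main
deb http://security.ubuntu.com/ubuntu hoary-security main
```

With something like that in place, a single “apt-get update && apt-get upgrade” keeps the whole installed system current, applications and base OS alike.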
“Good point. When MS bundles extra applications, they’re leveraging their monopoly. When Linux distros do the same, they’re applauded.
Oh the double standards.”
Well, the Linux distros don’t have monopolies.
I liked that article. I’ve been on some Linux or other since 1996, and I also try out Windows from time to time. But for me it can’t keep up with my Linux workstation. And that’s apparently not just me anymore 🙂
“In a modern Linux, particularly Debian-based OSs like Ubuntu (though other OSs are heading in that direction too), the OS is always up to date. You don’t “upgrade” an OS, you “update” it, several times a month. That way, the question of “how to install new software on an old OS” becomes a moot point. Your OS should never be old.”
Huh? That only makes sense with Debian Testing/Unstable, which are NOT recommended for real-world production use (moving target, not officially supported by the Security Team).
Where’s the stability in that? Constantly changing foundations is NOT a good thing. It means shorter testing, more difficulties in developing and targeting software, and more hassle.
If a user is perfectly happy with his/her setup, why should he/she have to keep upgrading/updating just to use one new app? See the example above. If Johnny is happy with his system, why should he have to be on this constantly-rolling upgrade just to chat to his friends?
And what if you’re on dialup?
It’s all absurd. Users in Windowsland are familiar with installing an OS, keeping it for several years and adding new apps as they please. This is NOT possible on Linux without a LOT more work (mostly out-of-reach of newcomers), and it’s a GENUINE problem.
Look around message boards on the Net for Linux newcomers. Speak to some. They’re not happy that their 18-month-old Linux box has to be fiddled with and upgraded just for one app. Constantly shifting platforms won’t help – they’ll add more problems.
By the Debian reference, I mean that if you’re using an official release (same with Ubuntu etc.) you’ll only get security and major bugfix updates. Eventually you’ll need to upgrade the whole system to be able to use new apps.
And therein lies the problem. Users want a stable platform they can work with – not things shifting under them all the time. They want to install an OS, and two years down the line just double-click a new app.
Again, try installing the latest Gnumeric on a two-year old distro. It’s a mammoth effort.
Well, I must agree – there is excellent software for certain tasks, but let’s separate serious professional software from “general use software”, as well as high-end workstations from PCs, o.k.?
And I don’t think you’ve ever seen TeX used as an office suite, right? And if we talk about “office”, try asking somebody who really uses MS Office to switch to OpenOffice.
In any distro with a modern package manager, the concept of an “old OS” is outdated; the OS is a constantly evolving piece of software, and point releases are just a convenience when installing.
$ tar zxvf foo.tgz
$ cd foo
$ ./configure --prefix=/usr
$ make
$ make install
*installed*
—————-
[double-click] foo-installer.exe
*installed*
Which I think might have something to do with the data execution thing…
just to be an ass:
[double-click] foo-installer.exe
[click] “I Agree”
[click] “Next”
[click] “Next”
[click] “Next”
[click] “Finish”
*installed*
——————
$ emerge foo
*installed*
“Right now, I’m a very happy engineer with a combo of Maxima, Octave, and TeX, GCC, and Emacs/TexMacs. These five programs do 90% of what I use a computer to do, and they do them *very* well. Maxima and Octave are extremely powerful programs, with a lot of support. Not as good as Mathematica and Matlab (which run on Linux anyway), but they’d not be taken lightly if they were $600 products competing in the same market. The magic comes in when they’re used with TexMacs, though. I don’t know of any combo that is as powerful on the Windows platform, save for these applications themselves. ”
I believe the Windows equivalents are:
http://sourceforge.net/projects/octave
http://maxima.sourceforge.net/download.shtml
http://www.texmacs.org/tmweb/download/windows.en.html
Although I admit I don’t know of a free Windows equivalent of GCC (aside from the obvious Cygwin non-solution).
As for installing things on Windows: someone a couple of pages back mentioned the six-year-old VB DLL. Well, I’ve never run into a DLL problem that couldn’t be fixed by googling the DLL, downloading it, and putting a copy in the same folder as the application. That is, I’ve never had to compile and install the DLL or track down dependencies for it. I can’t say the same about libraries on Linux.
Of course, that assumes that when you successfully install the library, the next time you run the “configure, make, make install” trilogy it will actually find it.
The difference is that when you download a program on Windows, if it doesn’t come with its own installer, it can 99.9% of the time just be decompressed to any old folder and run, no compiling and all that, because Windows, being proprietary, has consistent file locations (consistent from system to system, that is, not logically consistent – there are serious issues there). Inconsistent file locations from distro to distro (compare, say, Arch to Ubuntu sometime) mean you sort of have to run the holy trilogy of compiling if your distro doesn’t have a package. It’s a problem inherent in an open system (don’t get me wrong, I wouldn’t have it changed for the world, I LIKE open), but it IS a problem that Windows doesn’t really have.
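The path inconsistency is easy to see firsthand. This sketch just asks where the C library actually lives on a given box; the directories searched are an assumption that holds on most, but not all, distros:

```shell
# Where is libc on this system?  The answer differs between distros
# (e.g. /lib vs /usr/lib vs /lib64), which is exactly why a prebuilt
# binary can't hardcode paths the way a Windows installer can.
find /lib /usr/lib -maxdepth 2 -name 'libc*' 2>/dev/null | head -3
```

Run it on two different distros and compare: same library, different homes.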
@Watcher:
“Huh? That only makes sense with Debian Testing/Unstable, which are NOT recommended for real-world production use (moving target, not officially supported by the Security Team).”
Even if you’re tracking Debian stable, you never have to reinstall your OS. When the next Debian stable comes out, you just do “apt-get dist-upgrade”. Ubuntu is the same way, and a better example for a desktop user. Every six months there is a major update (i.e., a new release), and in between there are minor updates (i.e., security patches and fixes). This way you’ve got a stable base, and at the same time your OS is never out of date, so new apps always run.
“Where’s the stability in that? Constantly changing foundations is NOT a good thing. It means shorter testing, more difficulties in developing and targeting software, and more hassle.”
It’s hardly unstable. Most updates during that six-month timeframe are bug fixes and patches. They’re not “changing the foundation”, merely touching it up. Major updates are tested to make the transition very smooth, which is eased by the fact that the user’s system is usually in a known state (i.e., the most recent versions of core software, rather than some 3-year-old version patched with a random assortment of fixes). Anyway, Apple releases a new OS every year, and it hasn’t seemed to make OS X unstable!
“If a user is perfectly happy with his/her setup, why should he/she have to keep upgrading/updating just to use one new app?”
Security! If you’re a good little Windows user, you’re *already* doing something like this. You *do* run Windows Update regularly to get patches, do you not? And you do install service packs regularly, do you not?
“And what if you’re on dialup?”
Sucks to be you.
“It’s all absurd. Users in Windowsland are familiar with installing an OS, keeping it for several years and adding new apps as they please.”
If they update it properly, it’s not the same OS. You ever wonder why Service Packs are 200MB? Because they change pretty much every system file. And if they don’t keep it up to date, well, now we know whose 0wn3d machine is infecting all the others on the network now, don’t we.
18-month-old software rarely has to be updated to install most non-development applications. I can’t remember the last time I saw Gaim not supporting two releases back for its libraries (I think the newest version still runs on RH 8.0).
Most software doesn’t have to be upgraded, and there are distributions catering to this thinking (like RH, Mandrake, SuSE, Slackware, Debian stable). However, the software is going to be older, and in the case of Debian stable (which is never recommended for desktop or X11 workstation use, except by Debian leaders), that may be 4-year-old software. For example, I use RHEL on my laptop. It uses GTK 2.4 (which hasn’t been the stable GTK release for probably 6-8 months). But there are very few apps that don’t run on GTK 2.4, and most of them haven’t had truly stable releases yet.
Many Linux geeks like to live on the cutting edge, though. And in the interest of serving the community, and not the slack-jawed yokels who don’t care anyway, a large number of distributions take a highly update-driven policy. When there are a lot more non-geeky people using Linux, you will see more non-geeky distributions crop up. But there are already several. And yes, they update “more often than Windows.” This is because Microsoft has fallen behind on updates. For once they are giving customers 5 years between desktop products (a first for them). Microsoft used to ask people to update at least every couple of years (Remember 98-98SE-ME-XP? How about NT4-2K-XP?).
Also, I’d like to say that Debian unstable is fine in most production environments. It’s running similar software to RHEL, but with a bit less testing. This is why people are yelling at Debian about keeping unstable… well… unstable.
Sigh, this has been addressed since, what, before Linux? It’s always been autoconf and automake. And now there’s even an alternative: Autopackage, which is actually so easy you just double-click!
But since the linux community rocks there are a host of other great package utilities:
pkg_add
pkgtool (is this what Slack calls theirs?)
pacman
rpm (note apt is just a medium, not a packager)
dpkg
etc!
And because the linux community totally rocks, you don’t have to use any of them! Look at that, free tools! Free not to use them too!
” Linux has a (graphical) stable platform on which you can build an app and reasonably expect it to work on any Linux distribution released since 1998, and it’s called xlib.”
Try 1995 or maybe before. Anybody remember when X11R6 came out? Even then, moving from R5 and earlier to R6 isn’t exactly rocket science. But yes, nobody wants to write apps in raw Xlib… that’s why they use GTK, which has been around (and really stable) for a good 6 years. In that time it’s had one major revision, and there are some great porting tutorials too… Then there’s Qt; I know very little about coding for Qt…