Despite the constant predictions that “this year will be the year of the Linux desktop”, such predictions have yet to become reality. While the reasons for this are numerous, they all tend to boil down to Linux being built as a server and workstation OS rather than a home system. This article will focus on how a distribution might be designed not only to make Linux a competitive desktop solution, but to propel it into a leadership position in the desktop market.
The Linux Desktop Distribution Of The Future
183 Comments
“No surprise Linux has only 1% market share.”
Where do you people get that figure from? It is at least around 3%, not even counting people who dual boot or people who are planning to use linux in the near future.
And as A nun, he moos said, linux users are still so few simply because people don’t even know that another OS exists.
When I introduce new people to Linux I get positive reactions at least 5 times out of 10: that makes for a potential 50% market share.
I do believe some of those comments are meant as either sarcasm or irony. Problem is, I can't tell the two apart these days…
> Why nobody wants Linux?
It seems some companies are having trouble convincing their users to update / upgrade. Tsc, tsc, no more money…
Linux users, OTOH, update regularly, just because they can. It’s free, you know. Free as in speech, and free as in beer, though many choose to pay and have support. That’s why Red Hat is doing well.
Linux is an option for all, because it’s independent.
Cost-free products, OTOH, are not always for everyone. Internet Explorer, for example, is free — but only for those having a Windows license. Thus, poor developers cannot use IE, because they cannot afford Windows — nor even a new computer with a pre-installed Windows (I know many pirate Windows, but this is another problem).
> Linux has 1% market share.
Oh, come on, you can do better. Say 1.83% or 2.5%. No one believes in 1%, people are not that gullible!
And, think about it, Linux has no marketing machine supporting it.
How come so many people talk about it? How come so many countries are promoting it?
How come companies with rich marketing departments cannot reverse this trend? Is it bad marketing, or is this trend really irreversible?
I, FWIW, think modern corporate marketing is fairly competent. But then again, the question lingers: if proprietary products are that good, why can’t marketing stop Linux’s progressive growth?
Is there a way out? ;-P
I’ve been using Linux on the desktop since 1997. I can’t do proper administrative work on Windoze at all.
How come companies with rich marketing departments cannot reverse this trend? Is it bad marketing, or is this trend really irreversible?
Microsoft has launched a biased advertising campaign called “Get the facts”. This campaign is aimed at stopping/reversing the trend of more and more companies using LAMP as a viable server solution. By no means is the “Get the facts” campaign aimed at preventing people from using Linux on the desktop. No one uses Linux on the desktop; it’s a server OS. Two different markets.
The trend you’re talking about is happening on the server end, not on the desktop.
By no means is the “Get the facts” campaign aimed at preventing people from using Linux on the desktop. No one uses Linux on the desktop; it’s a server OS. Two different markets.
The problem is your mindset. You refuse to see that there are people who use Linux as a desktop. One of them is me.
Your suggestion for a base Linux install is a good one. There is one Linux distro that caters to that with a CLI base install: Arch Linux.
Now if others followed a similar approach then I’d be happy. The only thing I didn’t like with Arch Linux was the hand configuring, but it is an elegant system and it teaches you a lot about the underlying OS.
I am torn between Arch and Ubuntu, but neither is the golden egg for me. I prefer GNOME on Linux to anything else, including OS X and WinXP. Do I use Linux exclusively? No, but I use it for 85% of my computing tasks, and as more software becomes available for it I’ll use it more.
Configuring is a pain for full hardware support, but OS maintenance is a piece of piss, which cannot be said for Windows. Performance on the same hardware for gaming (the best illustration I can think of that pushes the system fully) favours Linux.
All horses for courses. Now all of you shut up and go out and get some fresh air.
I’d be just as happy if Linux stayed as-is on desktops. I don’t use Windows; I know better. Linux is easy to use and powerful to me. Why do I care if anyone else thinks so?
I’d be just as happy if Linux stayed as-is on desktops. I don’t use Windows; I know better. Linux is easy to use and powerful to me. Why do I care if anyone else thinks so?
Because you’re a selfish person, like most Linux developers and users. If it works for them, they don’t care if it doesn’t work for regular users. Nice.
It’s funny, you know, but a lot of Windows users like their desktop to look like Mac OS X. Have you seen what it takes to get it to look that way?
Well, in GNOME it’s just a case of dragging the icon and GTK2 theme tarballs into the theming options and you have a Mac OS X look. That’s one of the many things that are much easier to do on the Linux desktop; in Windows you have to install about 5 programs and mess about for god knows how long.
People forget just how easy it is to do tasks in Linux, while at the same time protecting the user with the inconvenience of a root password.
If it works for them, they don’t care if it doesn’t work for regular users. Nice.
Except that it does work for regular users. And so your entire argument crumbles down to the ground…
Except that it does work for regular users.
Proove it. Linux fails miserably on the desktop for regular users. Every week or so, we have negative feedback from customers who are “forced” to use Linux. There are some companies, call centers, libraries and other institutions that are implementing Linux on their computers, users aren’t able to use it, and they complain all the time.
Please verify your information before stating something you have no clue about.
Proove it.
No, you made the original assertion, you have to “proove” it first!
Getting negative feedback is no proof – do you know how much negative feedback I get from people who are forced to use Windows? Using your own logic, one would say that Windows doesn’t work for regular users – in fact, nothing would work for regular users, including toasters and VCRs.
Meanwhile, my ex-girlfriend, a complete non-geek and novice computer user, used a Linux desktop for a whole year and didn’t find it any harder than Windows. In fact, she liked it much better than her old Win98 computer, which kept crashing.
Same story with my current roommate, an intermediate computer user who has used Mac and Windows PCs a lot. While he sometimes asks how to do specific things that are done differently on Linux, he recognizes that it is a mature and powerful operating system.
There are some companies, call centers, libraries and other institutions that are implementing Linux on their computers, users aren’t able to use it, and they complain all the time.
Please prove what you are saying by giving us the names of those companies where people aren’t able to use Linux, and what the specific problem is.
Personally, I don’t believe a word of it, because all of the examples you gave are focused on a very specific set of applications (in a call center, for example, workers will use a set of 2 or 3 applications at the most). So in fact I am quite certain that you’re simply spreading FUD here. Linux/BSD/Solaris is ideal for the kind of work you mention – well, ideal except if you work for Microsoft, that is…
Please verify your information before stating something you have no clue about.
I have
“Proove it. Linux fails miserably on the desktop for regular users.”
This is rather subjective, and is the usual argument we hear about Linux. The fact is, you could say the same thing about Windows when you factor in all the headaches with security and other issues. Users complain all the time about Windows too. So what’s your point?
Dang it… A nun beat us all… well, anyway,
I was going to point out the complaints-all-the-time argument as well…
Well, I am posting anyway.
If these things were implemented in Linux, I think it wouldn’t be a vast improvement, but a nice bit of polish on an already “pretty good” desktop OS.
While package managers are neat, sometimes they don’t have the package I’m looking for. I use Ubuntu, and while I’m savvy enough to realize I need to look for Debian Unstable .deb files when the package I want is unavailable, I couldn’t expect an average user to do this. If we could have a distro-independent packaging system adopted for Linux-based operating systems, that would be awesome.
And I do find it annoying that sometimes my newly-installed program doesn’t have a menu icon, and I have to make one myself. But I can’t count on the executable or the icon being in a specific directory … the icon could possibly be in:
/usr/share/appname/pixmaps
/usr/lib/appname/share
/usr/games/share
/usr/games/share/pixmaps
Yadda yadda yadda.
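For what it's worth, the only reliable workaround I know is a brute-force search. Here is a minimal sketch; `find_app_icon` is a hypothetical helper (not part of any distribution), and the second argument exists only so the search root can be overridden:

```shell
# Hunt for an application's icon across a prefix tree, since packages
# scatter icons over inconsistent locations (share/<app>/pixmaps,
# lib/<app>/share, games/share, ...).
find_app_icon() {
    app="$1"
    prefix="${2:-/usr}"    # search root; defaults to /usr
    find "$prefix" -type f \
        \( -iname "$app*.png" -o -iname "$app*.xpm" -o -iname "$app*.svg" \) \
        2>/dev/null
}
```

Running e.g. `find_app_icon frozen-bubble` then prints every matching icon file under /usr, wherever the package happened to put it.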
I know a lot of people are upset that someone would have the audacity to suggest Linux may not be perfect; but although it already is a pretty good system, I think the things mentioned in the article would be “icing on the cake” and only help the adoption of desktop Linux.
Extra cooperation and work will help far more than righteously screaming in the average users’ faces, “USE LINUX BECAUSE IT IS PERFECT!”
While I don’t think anyone has yet claimed that Linux is the perfect OS, the claim is that it’s usable by Joe Sixpack and family.
To that I will add that to test this fully, one should duplicate Joe at the atomic level, delete any knowledge Joe has about using computers, then put one copy in front of a preinstalled Windows and one in front of a preinstalled Linux, both running on fully working hardware, BTW.
After about a month of containment they each rate the experience.
I’m guessing that under these conditions the scores will be about equal overall, but different areas will score differently from system to system.
Basically, there are not enough preinstalled Linux boxes sold for any statistically valid rating to be done.
There is also the fact that 99% of the computer users out there will have their first experience with a computer running Windows, and this will color their view of things unless they also experience Linux soon after.
Humans are creatures of habit. We do not like to change our habits. So when a person that has had some months or years using Windows then has to use Linux, they will complain endlessly that they can’t find anything where they expect it to be. It’s just like having someone come into your house and move stuff around, so that what you believe you left in the living room is now in your bedroom, and so on.
I totally agree with the article. Linux is not ready for desktop use. I pity those being ‘sold’ the Linux message for desktop use. It’s a big lie.
I migrated from Monopoly$oft to Linux a couple of years ago. I spent 12 months running Linux exclusively on my desktop boxes, eventually settling on Slackware after trying many many distros.
Dependencies, package management & the lack of uniformity in directory structures/file placement almost drove me insane when trying to install/uninstall applications.
I found I was spending as much time coping with these massive design flaws as I had on my previous Monopoly$oft boxes.
Linux zealots told me ‘oh, no problem, just compile the app yourself’ – yeah right!
Just try it & see how much time you waste trying to figure out why it WON’T compile: inconsistent directory structures/file placements. Hell, even symlinks are a free-for-all from one distro to the next & from one packager to the next.
Eventually I gave up & bought a Mac… ahhhh, I felt right at home… all the familiar Unix commands in a Unix terminal… one (mostly) consistent GUI… and installing applications is an absolute dream.
Installing an app on MacOS X:
1. open the archive containing the app.
2. drag app to ‘applications’ folder.
Uninstalling an app on MacOS X:
1. drag app from ‘applications’ folder to the trash.
2. (optionally) drag the preference file of the app from the ‘preferences’ folder to the trash.
This holds true for 99.9% of Mac apps.
A few Mac developers insist on creating installers that splatter files all over your OS (a la Windows), but they are in the minority & there are easy strategies to undo the mess they make.
Why anyone in their right mind would think that the fragile, super-complex mess presently embodied in Linux is OK for desktop use is totally beyond my comprehension…
Well I just stumbled across Gobo Linux in my readings.
At last !!
A distro that acknowledges Linux’s shortcomings & sweeps away all the arcane nonsense, reimplementing Linux in a more ‘Mac-like’ way!!
Now that Macs are switching to x86 CPUs, Gobo Linux is one I will be keeping in mind for installation on my ‘MacTel’ next year hehehe
http://www.realtechnews.com/posts/1511
Still feel good about giving someone you care about a Windows infection?
-Gnobuddy
After actually having switched to Linux, I like it much better, *because* I understand how it works. It is a lot different from Windows. Some things are simpler, some things aren’t.
Some of the problems I have with Linux are minor things.
Like installing software that is not in the Ubuntu repositories. I actually figured out how to compile some of these programs; I was quite proud of myself for that one.
Installing drivers is HARD. I still cannot figure out how to update the ATI drivers at all, and couldn’t get it working after I got it installed either. And then it was really hard to find the uninstaller for the ATI driver, and nothing was even said about the uninstaller on ATI’s website.
Configuring X is also hard. There are probably a million different settings for various devices, and they are not all standardised. IMO, it’s a mess; it totally requires someone to read something to configure it. I was able to get it configured all right, but it’s not a good system at all. It makes USB mice impossible to just plug in and use. It would be better if input devices were not configured there, and users never had to touch the xorg.conf file.
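For readers who haven't seen it, the kind of stanza being complained about looked roughly like this in the xorg.conf of that era (a sketch only; the exact device path and protocol vary by system):

```
Section "InputDevice"
    Identifier  "Configured Mouse"
    Driver      "mouse"
    Option      "Protocol"      "IMPS/2"
    Option      "Device"        "/dev/input/mice"
    Option      "ZAxisMapping"  "4 5"
EndSection
```

Get any of those options wrong and the mouse simply doesn't work, which is exactly why hand-editing this file was such a common stumbling block.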
I use Ubuntu Hoary, and honestly I love it. I still dual-boot to WinXP sometimes, but except for the times I use my webcam, Ubuntu is where I feel more at home. I am neither a computer engineer nor especially talented. Yeah, it takes time to customize your desktop and configure things to work the way you want in Linux, but at the end of that time you get your reward. I do agree GIMP is still far from being something as professional as Photoshop, and I agree most applications on Linux lack functionality compared to their Windows equivalents. But nobody can deny the progress and positive change in Linux over the last year, and I guess it has been faster than for any other OS. Synaptic is really great for installing software, and I can’t complain anymore about dependency hell, or being dazzled by terminal installation screens. However, one problem still continues: you can’t know what exactly you are installing. Unlike in the Windows world, you can’t simply go to the site of the software, read, check screenshots, then install. And software names are still cryptic. Even on that aspect there has been nice progress; at least you can go to skype.com and download from their site after reading and learning more about the software.
There is only one thing I would complain about. Even though Ubuntu has detected the hardware on my brand-new Compaq laptop incredibly well, it’s still hard to make changes where drivers are concerned. I can’t easily install ATI drivers for my ATI card… I told you I am not an expert, and Synaptic is not that helpful on that front; the terminal and coding are still necessary when it comes to hardware. The Philips webcam I can’t install at all.
ATI, Philips, MP3: all of those are proprietary, and the Linux community still avoids the proprietary. Well, I think proprietary software, and distros supporting it, is the next big issue. Somehow there must be a middle way for this proprietary question, a way to negotiate; otherwise it will still be problematic for Linux to become a desktop OS for daily use.
With KDE as my desktop of choice I can easily say that Linux is indeed ready for the desktop. Compared to the features WindowsXP offers, I can say that KDE is far ahead of WindowsXP in many areas, which makes it a pleasant desktop experience.
“However, one problem still continues: you can’t know what exactly you are installing. Unlike in the Windows world, you can’t simply go to the site of the software, read, check screenshots, then install.”
Sure you can… why can’t you?
Linux fans often claim that Linux hasn’t become a huge success on the desktop because people are used to Windows and don’t even think about changing. That’s true to a certain extent, but the usability problems and complexity of Linux are just as much to blame IMO.
I know plenty of people who’ve switched from Mac, Acorn and Amiga to Windows (or vice versa). Generally it’s an easy transition, initial problems because of lack of familiarity are quickly overcome. I also know a number of people who’ve tried Linux, I’ve tried several “desktop” distributions myself. But I don’t know anyone who’s switched to Linux full time, IME the frustrating problems drive most people back to an easier to use OS.
I use Windows and Mac OS every day, I’ve also run OpenStep and BeOS on my PC perfectly happily, but every experience with Linux has been extremely frustrating and unpleasant. From trying to get my hardware working to installing apps and dealing with Linux inconsistency I constantly wish I was using a different OS. I hope eventually Linux will reach the same level of usability as Windows/Mac OS, having a free OS is very appealing but Linux has a long way to go before it’s a desktop OS I’m willing to use.
I can see Eugenia’s horns growing…
Use whatever you like, pleeeeze…
I mean, users who have never used Linux, really used Linux, gotten adjusted to Linux, learned Linux, etc., just don’t GET Linux and probably never will. So you will never convince a Windows user who hasn’t REALLY used anything else that anything is better, because their argument will always be that it isn’t Windows-like enough…
Now as for us Linux users, we have beaten our heads in and figured it out… well, we will never go back to Windows, and since realizing how great Linux is, nobody will ever convince us that the “Windows” way is better…
I like Linux for a lot of reasons, but one of the coolest is that I can make it whatever I want, from a server to a light desktop to a heavy desktop and anywhere in between.
Oh, and my mother uses Linux and it has been a LOT less trouble. She has an icon on her desktop that says “internet browser” and ones for “email”, “address book” and “calculator”, and of course OpenOffice is installed, and she is good to go…
I personally found Windows VERY hard at first, but now Windows is easy; same with Linux…
In Windows, go to START to stop? Makes sense from a usability standpoint to me… NOT!
“frustrating problems”
could you be a bit more specific?
1. Install the Linux distro of your choice.
2. Start the included email program and configure your account.
3. Count the number of functions (new letter, reply, reply-to-all) in the program. They should be like 10 or 15 of them.
4. Go to preferences/settings (there should be two or three of them, sigh).
5. Count the number of preferences/settings for your email program. There are probably 100 or 150 of them. Maybe more than 200.
When the preferences/settings outnumber the functions more than ten to one, you know you are using a lousy system built by idiots.
“I have been a linux user for over 5 years ( currently gentoo ), windows since 3.11 and a few experience with osx, and i have to say that i am 100% agree with this guy. linux is not yet a DESKTOP OS.”
Of course, I understand that. If you use Gentoo you can only say that Linux is not ready for the desktop. That is why at the moment there are never-ending threads on the Gentoo forums where people complain about Gentoo being difficult and tedious to install, and taking forever to compile…
Take a look at something like SUSE, Mandrake, Xandros, Kanotix, just to mention a few, and perhaps you’ll change your mind.
“When the preferences/settings outnumber the functions more than ten to one, you know you are using a lousy system built by idiots.”
Not hardly – you know you are using a system that allows for more flexibility – the USER of the software can select options appropriate for them.
> When you said Linux, which distribution do you use?
>>We use Ubuntu. It IS supposed to be easy to use. It’s not. Install Ubuntu, the network isn’t configured automatically. You’ll have a hard time installing software you DO need to have your work done such as MS Office (OpenOffice isn’t compatible with some MS Word documents, they are all messed up), a PDF converter, and other software. Configuring WINE is really complicated and takes much time. There are always errors.
It is common to waste a whole day configuring something that would take 20 mins on Windows or OS X. I hate Linux. I’d rather pay 10x the price of a Windows license to stay away from Linux. And Ubuntu’s Gnome is ugly. Arghhh!!!!<<
__________________
Yes!!! You believed the hype and you got what you deserved. The Emperor is naked! Take a look at something else, like SUSE, Mandrake, Kanotix, Libranet, Xandros, just to mention a few, and you’ll change your mind in no time.
I’ve read some posts that talk about the low market share of the Linux OS, but isn’t that because Windows gained momentum at a time when there was no real competitor?
Drivers: well, I’ve had some experience setting up drivers for Windows without having the driver disks, and I don’t find any major difference (except maybe having to use the terminal). And for ease of use, well, on Windows 98 I remember screwing up the whole system by deleting stuff I shouldn’t have (and I didn’t even know how to set up a dial-up connection; plus I had a customer last week who did the same thing I did years ago, but on WinXP).
As for other hardware, I don’t mind doing some research (before buying) to check that a given piece of hardware is supported by my favourite Linux distro.
“I’m pretty sure that I’ve never even defragged the hard drive.”
Yes, I know that you Windows users do that. Your system badly needs maintenance but you are in denial. Would you read a book whose pages have been scattered all over the floor?
Not hardly – you know you are using a system that allows for more flexibility – the USER of the software can select options appropriate for them.
Why develop applications at all? Just give users compilers and the USER can build applications appropriate for them.
Same here.
I tried SUSE, MDK, Linspire… all are pretty useless and a waste of MONEY.
I work with the developmentally disabled with mild MR. One client has Fedora 2 configured to boot straight into KDE. Every once in a while he hides the desktop and needs me to remind him of the password. I have set up some icons in the taskbar for him. He plays only the KDE games. If I were to show him Windows, I do not think he would know the difference. Kicker and a start button on the lower left, icons for favorite apps on the taskbar. Come on, folks, KDE and Windows are interchangeable to momsy and popsy. AND many moms and pops never look at a file system beyond “My Documents”, “My Pictures” and “My Music”.
My my my, me me me. I’m somebody. I digress.
Another client has Mandrake 10.1 with KDE direct boot. I’m still teaching her the buttons that will send that e-mail, buttons located pretty much where they are in Outlook Express. She has very little worry about the many viruses. The many virtual desktops can be used to keep the different apps wide open for her; she need only traverse the virtual desktops to get to different chores.
NOW I have another client with Win2000, and he is my most trouble. Weekly it is a chore to rid him of hijackers, adware, spyware. OH, THE MESS!! And those mysterious charges he finds on his phone bill. I’m tempted to rip that 2000 right out.
You want to keep your mom and pop happy (and out of trouble) and get on with the rest of your life? Show them Ubuntu, Mandrake or another Windows clone. Yes, Linux is very nearly ready.
The rest of us will use BSD.
“I tried SUSE, MDK, Linspire… all are pretty useless and a waste of MONEY.”
And why did you try, if you didn’t want to make the effort of learning something which works just slightly differently? I have an 8-year-old here who is very happy with SUSE: plenty of games. He has WinXP as well, but he hardly ever uses it.
And why did you pay? SUSE has been making free ISOs available since 9.1. Mandrake has always been free. Linspire very often has free offers. And ever heard of BitTorrent? It is *not illegal* to download Linux distros. So stop trolling.
And why did you pay?
You can’t buy everything but you can’t get everything for free either.
Besides it’s nice to have in a box.
“You can’t buy everything but you can’t get everything for free either.
Besides it’s nice to have in a box.”
Oh yes, I agree with you.
My point was that “mythought” didn’t *have* to pay and complain thereafter.
If you sort through the 119 previous posts, the rational conclusion is (a) Desktop Linux isn’t ideal for everyone (what the heck ever *is*?) and (b) Desktop Linux is entirely adequate for many.
Perhaps a few facts about the state of desktop Linux today are in order:
1)”The German National Railway made a second major move towards open source Linux software when it successfully moved 55,000 of it’s Lotus Notes users onto the Linux operating system.”
(full story at http://htmlfixit.com/?p=389 )
2)”the Extremadura government announced last month that it had successfully deployed 80,000 LinEx computers in schools, or one system per two students..”
(An old story from 2002, the Linux rollout is far larger today. Full story at http://lwn.net/Articles/41738/ )
3)”At a November conference, IBM exec Sam Docknevich revealed that IBM would deploy 50,000 desktops in a year’s time, up from 14,000 technical users.”
(2004 story, read the full article at http://www.desktoplinux.com/news/NS2663717532.html )
Those three stories alone refer to 185,000 Linux desktops that either already exist, or soon will.
Research firm IDC thinks Linux hit 2.8% of the (US) desktop market in 2004, up from 1.5% in 2000 (story at http://insight.zdnet.co.uk/software/linuxunix/0,39020472,39118695,0… ). So it took some nine years (1991-2000) for Linux to get to the first 1.5%, and only four years to pick up the next 1.3%.
The only way anyone can claim desktop Linux is not already a success today is to either be ignorant of the facts, or to willfully disbelieve the ample evidence. After all, we have people who still refuse to believe in evolution despite a century of solid scientific data backing the concept.
-Gnobuddy
That speaks to the huge success of KDE on the Linux desktop.
There is one base repository that effectively dictates how compatible other repositories will be – the one you use to update the OS
True. Having more than one repository for the base system would be a very bad idea.
Additional repositories should only contain additional software.
No one install of a distribution is going to be exactly the same as another.
That doesn’t matter as long as each installation is using the same repository and dependency tree.
Any not-yet-installed package can then automatically be retrieved from there, making it equal enough for the additional software in question.
F*** yes I am! The current implementation of Linux distributions is what’s out there, since that’s what is actually being discussed!
Well, I had the impression you believed that the software distribution model in question is broken by design, which I doubt.
IMHO that is a false generalisation.
Of course, if you just mean that the current implementations of said model are broken, I have no problem agreeing with that.
But I still think that the model itself is superior to any other currently available one.
If the pre-requisites are OK then it is OK
That’s what I said, isn’t it?
but the point is that from Linux distribution to Linux distribution, version to version, and especially with multiple repositories, no ISV can ever guarantee those pre-requisites at all
I already agreed multiple times that the current implementations fail at that, but I still insist that this doesn’t make the model broken by design.
No, it’s just you have absolutely no clue what you’re talking about and you’re going round in circles
No, it’s just you making an untenable generalisation based on experience with current broken implementations.
an ISV will simply not support umpteen package managers, end of story
Multiple package managers are only true for Linux distributions.
A single source operating system like OS X or Windows or Solaris would very likely only provide one package manager, not a single one more.
Your arguments fall apart totally right there then.
No, I never claimed that the current implementations of the repository based software distribution model are working.
I just countered the logic that because current implementations are broken, the system itself cannot work under any circumstances.
I strongly believe it can given a good implementation.
No it isn’t independent of the distribution model because software that ISVs create is tied to each individual distribution and even version.
A Windows program is tied to the existence of the Windows API, a Cocoa program is tied to the existence of the Cocoa API; ergo a program is always tied to the existence of the platform it is built on, absolutely independent of how it gets deployed.
The repository based software installation system cannot possibly work on a wider basis for ISVs to support Linux on a large scale.
Thank you for again over generalising from the current broken implementation.
Just because a broken car cannot move doesn’t mean cars can’t move by design.
People who base a judgement of some technology on experience with broken implementations of said technology are usually proven wrong by history.
and there’s no way of configuring the software through an installer
While this is unrelated to the discussion about the distribution mechanism, it is also untrue, because any even remotely sane implementation of a package manager can prompt for user input during configuration of the package.
Even dpkg can do this.
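To illustrate the idea without dragging in the real debconf machinery: a package manager's "configure" step is ultimately just a script that can ask a question and fall back to a default. This is a simplified stand-in (the package name, question, and `configure_package` helper are all made up; Debian's real mechanism speaks a protocol via /usr/share/debconf/confmodule):

```shell
# Simplified stand-in for a package manager's configure step that
# prompts the user, in the spirit of Debian's debconf.
configure_package() {
    pkg="$1"
    printf 'Port for %s to listen on [8080]: ' "$pkg"
    read -r port
    port="${port:-8080}"    # empty input falls back to the default
    echo "$pkg configured to listen on port $port"
}
```

Piping an empty line in (`echo '' | configure_package mypkg`) accepts the default, which is also how fully non-interactive installs behave.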
“The only way anyone can claim desktop Linux is not already a success today is to either be ignorant of the facts, or to willfully disbelieve the ample evidence. After all, we have people who still refuse to believe in evolution despite a century of solid scientific data backing the concept.”
Exactly. And yours was, could only be, a small selection of facts. The truth is that we read of Linux success stories every day.
That’s what I said, isn’t it?
Read the bit afterwards – namely that no ISV can rely on it. Don’t just pick parts out to suit yourself.
I already agreed multiple times that the current implementations fail at that, but I still insist that this doesn’t make the model broken by design.
I’m afraid it does, but I think you’re either not understanding it or one of these people who persist in thinking that the repository method is the way forward. There’s been similar discussion on the Ubuntu bugs site about Autopackage, and they just cannot see what is in front of them.
But I still think that the model itself is superior to any other currently available one.
You haven’t described the model you imagine, or why what current Linux distributions do is wrong, or how it is going to solve any problems for an ISV. You’ve just fleshed out your comments with bollocks.
You still need to get something installed first, and an ISV still needs to rely 100% on what base system they are installing on top of. No repository based system can guarantee that because you’ll have umpteen companies all pointing to their own repositories all getting you to download different versions of components that break your system.
No, it’s just you making an unholdable generalisation based on experience with current broken implementations.
Don’t cop-out with the broken implementations, unholdable generalisations crap. If repositories don’t work today then they don’t work, period. You’re talking out of your backside.
I speak with many people like you all the time – all that happens is that you talk absolute bollocks for as long as possible thinking that someone will think you know what you’re talking about. You don’t.
A single-source operating system like OS X, Windows, or Solaris would very likely provide exactly one package manager, not a single one more.
Since this is a Linux discussion that’s totally irrelevant. Besides, you still need to target multiple systems, but with Linux they should all be the same and that’s the point here.
Even then, every package has the call-home syndrome unless you have a central repository. ISVs cannot be tied to a central repository. As an option you’ve then got every ISV pointing to their own repository with their software trying to update whenever they like. The only way to control that madness is to have a centralised repository, which is what the Ubuntu guys elaborated on.
It’s either one or the other, and they’re both not good enough for widespread desktop and ISV use. I’m afraid you don’t have some mythical solution to the problem.
This cannot work.
I just countered the logic that because current implementations are broken, the system itself cannot work under any circumstances.
No you didn’t because you haven’t described any of that, nor responded directly to any of the points made. You’re simply not sticking to the issues at hand.
A Windows program is tied to the existence of the Windows API, and a Cocoa program is tied to the existence of the Cocoa API; ergo, a program is always tied to the existence of the platform it is built on, entirely independent of how it gets deployed.
Correction – it is tied to a particular version, or snapshot. That’s where the problems start. With a repositories system you’ve then got ISVs wanting to update things that either knowingly, or unknowingly, break other things.
Thank you for again over-generalising from the current broken implementation.
Blah, blah. I’ll just say that everything is broken (and there’s a mythical repository system out there that doesn’t exist that will solve everything) and maybe he’ll think I know what I’m talking about.
If there is no movement out there on solving it then it simply is broken, and you sitting here telling everyone what they have is broken isn’t going to change matters. You’re not doing it.
While this is unrelated to the discussion about the distribution mechanism, it is also untrue, because any even remotely sane implementation of a package manager can prompt for user input during configuration of the package.
It’s not unrelated, because we’re not (just) talking about the distribution mechanism – we’re talking about software installation. That has wider issues.
Still, it is absolutely nowhere near the level of configuration you get from an install wizard, it has to be integrated with the system, and it still doesn’t get around the fact that ISVs will not target umpteen implementations of it.
I’m afraid you just don’t grok any of the issues involved whatsoever.
I didn’t make it up about the 2 NICs. I use an Asus A7N8X-E motherboard. Try it yourself.
Read the bit afterwards – namely that no ISV can rely on it.
On a system where this would be true, the ISV could rely on it.
After all they also rely on Microsoft not removing certain APIs.
So based on a stable system you can rely on that system, no matter how it got installed.
You haven’t described the model you imagine, or why what current Linux distributions do is wrong or how it is going to solve any problems for an ISV
You’d have a base repository for the base system, the X server, and common libs. Additionally, it could contain software built on that.
An ISV would have a repository for their software, which relies on parts from the base system like libc, etc.
An “installer” would add an ISV’s repository to the distribution system configuration.
Any dependencies of the ISV’s package on base packages not yet installed would be fetched from the base repository prior to fetching the ISV package.
Any upgrade by the ISV to their package will be seen the next time the system is updating its package lists, thus enabling the system to trigger an upgrade.
So far this is already working.
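A concrete sketch of that configuration, in APT’s sources.list format (both URLs are hypothetical, and the single base repository is the part that does not exist today):

```
# The one-and-only base repository: base system, X server, common libs
deb http://base.example.org/repo stable main

# Added by the ISV's "installer"; contains only the ISV's own packages,
# which declare dependencies on packages from the base repository
deb http://isv.example.com/repo stable main
```

With both entries present, installing the ISV’s package pulls any missing base dependencies first, and the ISV’s updates show up on the next package-list refresh.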
The problem is: there is no base repository, as you continually point out and as I continually agree with.
So in order to let this work, there has to be one and only one base repository.
Not very likely in the near future, but not impossible.
So far this is only a problem on Linux, not on operating systems with a single distributor entity.
Don’t cop-out with the broken implementations, unholdable generalisations crap
You are the one bringing “unholdable generalisation crap”, because you are extrapolating from the current situation to all possible future situations, and not even Nostradamus could do that.
Since this is a Linux discussion that’s totally irrelevant.
Sure, based on just Linux it is; based on the distribution technology itself it isn’t.
“This cannot work” includes all possible target platforms, including those with proven long-term compatibility and single-source base systems.
“This cannot work on Linux” on the other hand would make other platforms irrelevant, but I don’t see the word “Linux” in the context of this statement in your posting, thus I have to assume you are talking about all current and possible future platforms, including any newly developed assistance technology.
Making predictions about the future of IT is difficult enough for short time periods; making one for any point in the distant future is quite bold.
You’re not doing it.
True. I think I never claimed I could do it. But you boldly claim that no one ever could.
Which either means you know every human being currently living or yet to be born until the human race is extinct or computing has changed into something different, or you are just making a guess.
It’s not unrelated, because we’re not (just) talking about the distribution mechanism
Oh? I am doing exactly that: just talking about the distribution mechanism.
That’s the thing you claimed can never work.
we’re talking about software installation
I see. So we are wasting our time because we are talking about different topics.
Well, it’s Sunday
Still, it is absolutely nowhere near the level of configuration you get from an install wizard
Can you give an example of a kind of configuration item that cannot be asked for by a system-launched configuration wizard rather than a user-launched one?
No, printers will not accept your CD-ROM if your files are not in the .cdr extension. It’s the industry standard, unfortunately.
You’ll quickly refute that by saying gimp is a great alternative, but actually gimp is nowhere near the level that photoshop is at.
It depends. This is where I can tell many of you have never been in business like this at all. When a customer goes to you for business they’re not interested in whether you use Photoshop or not. All they want is an image, or set of images, that you can produce for them for the business, website, literature etc. Whether you use Photoshop or the GIMP, they’re simply not interested.
It’s funny that all these people come out saying the GIMP isn’t good enough, but no customer actually cares what you produce their images with as long as they get done. You can quite clearly get quite a bit done with the GIMP, and whether you need everything that’s in Photoshop once you reach the limits of the GIMP is a considered decision that you make.
For the vast majority of people and businesses, unless they’re making a lot of money out of their Photoshop work to pay for their Photoshop licenses, it is simply too expensive regardless of whether people think that the alternatives are good enough or not. You may decide the GIMP is good enough, or go for Paint Shop Pro.
I’m all for OSS, but please don’t say people don’t need their Win or Apple software until there are real alternatives.
I never said that, so you haven’t read what I’ve written. If you need a couple of Windows machines around for Windows-only software, then fine, but sticking your hands over your ears and saying you can’t use Linux for anything is quite clearly not true. That’s what people have been trying to say here.
Linux is not for everyone. Take, for example, a school in Melbourne, Australia that was using Debian but got fed up and switched to OS X.
They clearly wanted to use OS X to start off with. If they thought Linux wasn’t good enough, and they already had x86 machines, it would have been far cheaper just to move to Windows, and Office, since they ended up using that anyway. Buying large amounts of Mac hardware and software is an expensive business.
From the article:
“It also depends on cost. If you can run your business without paying Microsoft, significant cost savings can be achieved.”
Considering that he’s spent many, many thousands needlessly on Macs and Mac hardware and he’s still paying for Microsoft Office it’s anybody’s guess where the savings come from. That article was just someone making decisions out of ideology and pet likes and dislikes, not someone concerned about cost or common sense.
how about svg? thats vector graphics saved as xml
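For readers who haven’t seen it: SVG really is just XML you can write by hand. A minimal sketch (file name made up), written from the shell:

```shell
# Write a minimal SVG document: a 100x100 canvas with one red circle.
cat > circle.svg <<'EOF'
<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <circle cx="50" cy="50" r="40" fill="red"/>
</svg>
EOF
```

Any modern browser or vector editor should render it.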
No, printers will not accept your CD-ROM if your files are not in the .cdr extension. It’s the industry standard, unfortunately.
I’ve never, ever encountered that, and if you’re the one paying the money you specify the format and ask if they can print what you’ve got. I’ve always had my PDFs printed, and I’ve always had PDFs distributed to me by graphics companies. Corel Draw formats are not an industry standard in any way, shape or form.
If you’re paying the money, you specify the format and what you want printed. That’s the way the world works.
I didn’t make it up about the 2 NICs. I use an Asus A7N8X-E motherboard. Try it yourself.
That motherboard has a built-in Gigabit Ethernet interface. You’ll either need the nVidia drivers installed for it or a modern kernel. It probably won’t work out of the box on Windows either (or very well) because you still need the motherboard drivers installed.
Everyone who has been paid to use *nix, take this simple test. Did you like it?
I have been paid, on several occasions and in each case, without exception, I hated it — just like “Joe User” above.
Since I was getting paid to use it, I should not have hated it. Struggled with it, sure. Had a wry smile on my face as I thought “and they are paying me to waste my time on this!”, sure. But not hated it.
My list of ultimate OS’s is: Win98SE, XP, NetWare, …and the rest don’t rate, including MacOS & *nix. Why?
98SE is the fastest OS, and the best DOS/Windows marriage (with DOS access to the clipboard, which XP took away, grrr).
XP is the best MS OS to date, heavy duty to say the least. Plug in printers or USB sticks and all is well in seconds. I run 15 to 200 applications all the time, 24×7, and I have had a total of one blue screen, on one of ten XP machines, and this was caused (ironically) by MS’s MSN conflicting (sound card wise) with my Yahoo IM.
NetWare was lightyears ahead of NT for years, but I’ve not used it recently so this may no longer be applicable.
MacOS is the most overrated of all OSes — a true pain in the butt. ONE mouse button?!?!?! That should be the start and end of any argument about Macs. What type of insane idiots are happy with that? I need to hit the keyboard to bring up a context-sensitive menu?!?! Puhlease. For much more, search the net for the ACM.org “Anti-Mac” article. The Mac interface is 20+ years old now, and prior to OS X (which I’ve not used but am sure I would not care for, given its *nix underpinnings) largely unchanged. The notion that an application’s menus take over the top of the screen is ludicrous, yet Mac zealots actually think this is better than Windows/*nix. To wrap your mind around this weirdness you then need to run each application full screen — defeating the whole windowing concept of modern GUIs. The only other comparable GUI mistake is the *nix thing where the focus moves as the mouse does.
*nix is the oldest of all of these OSes, and it shows.
Sure, Windows has old underpinnings (DOS) but MS has upgraded/merged DOS in nice ways — e.g. you can launch GUI apps from DOS, can see both short and LFN in DOS, yet can also do most things in the GUI — the perfect marriage. Kind of like the MS decision many many years ago to make everything mouseable also keyboardable.
*nix believes that man pages, vi and the truly awful .tar format are all still viable! man should have been replaced (sure, with backward compatibility for a _while_) with something HTML-based, or PDF, or some other solution that did not involve the MORE prompts (press a key too quickly and you have to start the whole thing again — that super stinks, yet nothing was changed for eons!). vi was awful 20 years ago but *nix types are probably still using it today — an incredible statement but very indicative of the *nix mentality. The .tar format is shockingly bad to use (at the command line anyway, where I do all my archive work). The system should automatically play the Mission Impossible theme music as soon as you type “tar …”. Archiving files in ZIPs (scr*w the .tar ball!) is a huge part of systems work, and yet tar is all but impossible to use __without any syntax errors__ by even the most seasoned power user.
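For reference, so readers can judge the complaint for themselves, here is what everyday command-line tar usage looks like (file and directory names made up):

```shell
# Create a small directory tree to archive.
mkdir -p demo
echo "hello" > demo/file.txt

tar -czf demo.tar.gz demo   # c = create, z = gzip-compress, f = archive file name
tar -tzf demo.tar.gz        # t = list the archive's contents

rm -rf demo                 # throw away the original...
tar -xzf demo.tar.gz        # ...and x = extract it back
```

Whether three single-letter flags count as “impossible to use” is left to the reader.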
I am beginning to think the people posting the most favorable remarks about *nix here are in fact MS employees — trying to lure more suckers to try *nix, so that they too will hate it.
On the plus side of *nix, it is good to give MS competition — this benefits everyone. It is good to see 3rd world countries switching to *nix in a big way — but consider MS’s answer to that — crippled XP Starter Edition that allows just 3 windows — MS would not offer this if they didn’t think it was a fair competitor to *nix!
On the minus side of XP, my HP laptop hard drive is failing — 12 months of 100% utilization will do it — and my weekend trip to Fry’s would have translated to $199 to re-buy XP home (my orig. CDs are in another country right now) + $50 to install a HD (if you can believe it) + $60 to install XP OS, all in addition to the $120 drive itself.
I was outraged, got the $50 install cost refunded, and went looking for an $89 copy of XP Home on the shelves — not to be found, more outrage. Turns out you must ask for OEM XP “behind the counter”, then must refund the HD purchase and re-purchase it so it is on the same invoice as XP, to keep MS off Fry’s back. And MS wonders why XP has seen less adoption. My HD was all that failed — I should have walked out of Fry’s with a new one for $120 + about $25 to ghost a new XP image in 10 minutes. When that is possible, XP sales will pick up. For now, XP’s greatest threat is MS itself.
Props to the person above (around the 80 to 100th comment) who said *nix lovers should be open to IDEAS and not be OS bigots.
about that clipboard in dos/cmd and xp, try rightclicking
it would be interesting to see a system as simple as dos to admin but with a more modern gui that can run on top when the user wants it. maybe throw in the ability to have multiple commandlines like you can do in linux (alt+f1-f5 or more) and proper multitasking + memory protection so that one can kill a driver and restart it without having to restart the whole freaking system (or maybe give the system the ability to drop everything and reread the boot files without doing the ctrl-alt-del combo).
i kinda enjoyed dos in that it was simple to admin. need a soundcard to work? add the command for starting the driver in autoexec.bat and so on. it was easy to get the hang of and very easy to recover when messed up (just have a floppy handy with cdrom drivers and presto. in these bootable cd days you dont even need the floppy).
still, i think that gobolinux covers most of these bases for me. if i want to fire up kde i just type startx…
>Yes, I know that you Windows users do that.
>Your system badly needs maintenance but you
>are in denial. Would you read a book whose
>pages have been scattered all over the floor?
If I had an idiot savant available to hand me the next page when I finish a page then yes, I would read such a book. My computer is my idiot savant.
If my computer does everything I ask it to it doesn’t need any additional maintenance.
That’s not denial, that’s reality.
When you have these people that think repositories are the total answer then you have a problem.
Repositories are good for a base system, but don’t solve the problem for ISVs and other software. In fact, if I grab a random deb or rpm off some site and try to install it, it won’t automatically go out and get the needed dependencies.
So basically it always comes down to the dependency problem.
Personally I like the idea of GoboLinux. I know it’ll never fly as some kind of standard, but I think many people like to have all files centralized so they know what goes with what.
My DOS clipboard need was from a DOS application, not the cmd-line — XP changed how the clipboard worked, breaking older/DOS programs that could access it before. Totally annoying, and none of the XP “compatibility” settings addressed this.
Re: Multiple cmd lines — just run cmd.exe multiple times or did you have more in mind? I run multiple cmd’s when needed (not often).
Proper multitasking, hmmm. The only weakness here that I know of is that some (most, I suppose) DOS apps don’t yield properly, driving up utilization. But XP gives good performance even with a busy DOS app, and DOS gives good performance with a 100%-utilization Windows app, so I am not sure what is missing here.
The “kill a driver” thing is really a server-level wish. I don’t ever want to kill a driver on my personal machines but for sure on servers I don’t want to reboot them ever, if possible. Once in a blue moon (say once a week on average) I have to (or need to) reboot one of my XP machines. This is not a painful experience, again on personal machine. On server it would be of course. [FWIW, I agree about being able to reread boot files.]
DOS was dead easy to admin, and Win 3.x almost as easy (due to a few text INI files, most kept in one place). I accepted Win98SE as an improvement once the Internet came along — the need for always-on net communications killed the pure command line star.
It may be that some newer *nix like gobo covers these things. If so, glad to hear it. Personally, I like the idea of updating the file structure using SYM links. I had to use cygwin for some remote file transfer stuff and (just checked) it created almost 1,000 DIRECTORIES yet the *user* (i.e. data) directories are part of this tree monstrosity. Why oh why does it default to installing all the source files/branches? Put them ALL in a full-path ZIP file and label that appropriately. For the one in 500 of us that wants access to the source, we can extract/edit/recompile/cleanup from there.
>If you sort through the 119 previous posts, the
>rational conclusion is (a)Desktop Linux isn’t
>ideal for everyone (what the heck ever *is*?)
>and (b)Desktop Linux is entirely adequate for many.
Amen.
To decrease the number of folks in category “a” and to increase the number of folks in category “b”, the article has a number of suggestions.
Some of them are good, some are bad, and some are simply uncompelling.
this is like watching a tennis match….. boring ball going back and forth back and forth back and forth….. yawn
fun argument tho….
“I grab a random deb or rpm off some site and try to install it, it won’t automatically go out and get the needed dependencies. ”
that’s the reason you create a package list, list the dependencies, and then instruct people how to add your site to the apt sources — and then it WILL do just what you ask…
of course if I think I have what is required or close to it then I just install it and tell it not to check for dependencies….
not difficult, just different….
I have been paid, on several occasions and in each case, without exception, I hated it — just like “Joe User” above.
Were you using BSD and *not* Linux??? If it *was* Linux, was it one of the known hassle distros like Gentoo, Slackware, pure Debian, etc???
I mean, i *know* vi IS outdated, that’s why many modern linux distros come with nano (for the cmd users) where everything works just by using some CTRL + key combo.
Most modern distributions include enough GUI tools to deal with TAR archives, display help files properly and so on…
Or is it that u ssh to your computers at work and expect it to look like Win NT Server (including XP and so on)?
So far this is only a problem on Linux, not on operating systems with a single distributor entity.
You can’t have a single distributor entity because ISVs need the freedom to distribute their software as and when. That’s one of the demands of a widely used desktop environment.
Any dependencies of the ISV’s package on base packages not yet installed would be fetched from the base repository prior to fetching the ISV package.
Any upgrade by the ISV to their package will be seen the next time the system is updating its package lists, thus enabling the system to trigger an upgrade.
This goes totally against having a solid and stable base system. You’ve got a moving target there. No installation of ISV software should trigger an update of the base system – ever. It falls apart right there and then.
What you’ve described above is working in many Linux distributions (has done for some time), but it just doesn’t work on the scale required for widespread desktop use for the reasons I’ve continually outlined. There’s been many discussions like this within Ubuntu, for example, about Autopackage, central repositories, whether to have them, how to solve it etc. It’s nothing new.
You are the one bringing “unholdable generalisation crap”, because you are extrapolating from the current situation to all possible future situations, and not even Nostradamus could do that.
In terms of all future situations of this stuff, they obviously don’t exist yet, and I’ve outlined why they never will because everything you’ve talked about has been discussed to death on package management – especially concerning repositories. There’s no light at the end of the tunnel.
Nostradamus?! You’re the one who’s writing about mystical future situations that you don’t know about yet.
“This cannot work on Linux” on the other hand would make other platforms irrelevant, but I don’t see the word “Linux” in the context of this statement in your posting
Hint: Have a look at the original post and have a look at the title of this article ;-).
Making predictions on the future of IT is usually already difficult for short time periods, making one for any point in distante future is quite bold.
I’ll just play Nostradamus for a bit and gaze into my crystal ball…… :-). If all else fails, just say no one can predict the future.
But you boldy claim that no one ever could do it.
Yes, and I’ve said why.
Which either means you know ever human being currently living and being born in any time until the human race is extinct or computing has changed to something different, or you are just making a guess.
Err, no. I just look at what’s happening with package management and how many people are proposing to solve it and find a way forward. There’s nothing happening currently. That’s the situation.
Extreme off-topic mystical tosh though. Yawn.
That’s the thing you claimed can never work.
Certainly for the purposes of desktop Linux (and other OSs as well) no, I don’t think it can work at all for the reasons I’ve outlined.
Oh? I do exactly that, Just talking about the distribution mechanism.
The distribution mechanism is just part of software installation, and it has implications for it.
I see. So we are wasting our time because we are talking about different topics.
Well, if you are, you’ve misunderstood the whole thing.
Can you give an example on what kind of configuration item cannot be asked by a system launched configuration wizard rather than a user launched configuration wizard?
Simple graphical configuration. Have a look at the MySQL installer on Windows and any installation mechanism used on Linux. You’ve still got the same problem of many distributions configuring things differently though, ISV packages trigger updates of the system etc. and so configuration becomes a moving target in the same way as the base system.
keep saying that linux will never be the defacto desktop…maybe you are right….
of course, i personally haven’t read the “linux must take over the world” manifesto but maybe it exists, i personally am just happy with where it is at now – on my desktop and couldn’t care less if other users get it… I will still be using my OS in two years when M$ requires a new machine with DRM protection and so forth….
(flame bomb)
🙂
Repositories are good for a base system, but don’t solve the problem for ISVs and other software.
No, they certainly don’t.
In fact, I grab a random deb or rpm off some site and try to install it, it won’t automatically go out and get the needed dependencies.
It should never actually trigger an update of the needed dependencies (only within the context of itself), particularly those already installed, because then your system is a moving target.
yea software installs should be painless just like on windows, it should auto open ports, put in default pages, and auto start services and not tell you about any of it or ask about configuration because that would mean you would have to learn something and might figure something out… keep the easy auto installs I love them, nothing better, none of that crazy linux stuff for you where you are asked questions about whether you want a service started locally or not and so forth… who needs that headaches… windows foreever
signed
the spammer, hacker, spyware guy
….waiting for my 60 sec to pass…..
vi isnt outdated…just cryptic as hell
nano and nano-tiny have been my choice for a while but then I am not leet
but when a windows guy rants about how crazy vi is, well you can suggest nano but they will just ignore you and keep raving about vi…. they do not WANT to see that everything isn’t as hard as they think it is, they just want an excuse to keep using windows and not have to relearn….
You can’t have a single distributor entity
Ok, I am curious: name one other entity that legally and independently of Apple distributes Mac OS X.
I had the impression that only Apple is allowed to do that.
But of course if this is so bad for ISVs, there is very likely at least a second entity doing that which I don’t know of.
No installation of ISV software should trigger an update of the base system – ever
I fully agree with you, but I was referring to upgrades of the ISV package.
As opposed to having the software update itself.
If all else fails, just say no one can predict the future
Exactly!
There’s nothing happening currently
Stagnation doesn’t necessarily prevent development later on.
For hundreds of years the quest to fly made no progress, until one day a couple of geeks managed to fly a couple of yards.
I guess there have been tons of people through the centuries who predicted that this won’t ever be possible.
The distribution mechanism is just part of software installation, and it has implications for it.
It is nevertheless a distinct technology which can be discussed on its own merits.
Just because HTTP is mostly used to deliver HTML documents doesn’t mean HTTP is damaged by the also quite broken situation of HTML rendering.
Simple graphical configuration
Hmm, there could have been changes in what installers do nowadays, but back when I was installing software on Windows, it just used wizards that asked for options, page by page.
Sometimes multiple choice, sometimes requesting user input.
No different from what dpkg does when installing packages without default options, or on systems where the admin has specifically requested to see all questions.
joelito:
>>>Where u using BSD and *not* linux??? If it *was* linux, was it one of the known hassle Distros like Gentoo, Slackware, Pure Debian, etc???<<<
One time it was BSD, the other time HP/UX.
>>>I mean, i *know* vi IS outdated, that’s why many modern linux distros come with nano(for the cmd users) where everything works just by using some CTRL + Key combo.<<<
Sure, and at that time I believe pico was an option. And you wouldn’t believe how often we still used vi. Why? Pathing issues to pico. And different cmd shells requiring different configuration to be done but it never was. Bottom line is we toiled on with vi and unconfigured system issues, thinking that it would be “just this one time”. Like I said, I was being paid for this, so it should have been OK. It wasn’t because unix syntax is crap, i.e. NOT updated after all these years. DOS cmd line syntax is the most intuitive, by far.
>>>Most modern distributions include enough GUI tools do deal with TAR archives<<<
Note, I emphasized that I work with archives at the cmd line. Much easier/faster that way. Much. In DOS I set up batch files for common archive activities or paths that I want to change to and work in. Amazing how efficient it is. Unix is like walking a Great Dane by comparison.
>>>, display help files properly and so on…<<<
No doubt this is updated today. I would expect that to be the case. But, lol, I bet that many 3rd party apps still install man pages. Legacy issues can kill an otherwise modern OS. It takes one or more companies like Linspire making a truly easier/better experience to cause the rest to finally get it. There is progress, don’t get me wrong. But there are still some seriously outdated mindset issues.
>>>Or is it that u ssh to your computers at work and expect it to look like win NT server (Including XP and so on)<<<<
I have never much cared for Windows on the server. But typically when you ssh or hop onto someone else’s system, you get stuck with the lowest common denominator like vi — unless people specially build wrappers that implement something better. DOS (again) did that with their default editor, eventually updating it to be qbasic-in-drag.
anon / tmodns.net:
>>>…but when a windows guy rants about how crazy vi is, well you can suggest nano but they will just ignore you and keep raving about vi….<<<
Not really ranting — does one actually need to rant about vi? I was commenting on my experience at that time. I don’t believe I ever used the word “linux” for example, and joelito figured that out.
>>>they do not WANT to see that everything isnt as hard as they think it is, they just want a excuse to keep using windows and not have to relearn….<<<
Look, I couldn’t be happier with XP. I am not looking at linux in a serious way because I am not seeing any killer linux apps, nor any serious improvement in the experience — and in fact I am seeing the opposite if anything (e.g. laptop issues). Yes I do want to avoid relearning _IF_ there are no benefits for my efforts. Is that really unreasonable?
I am not a Windows ranter, but a DOS+Windows preferrer. And this after computing 22+ years, many spent typing in PC Mag. utilities byte-by-byte, lol. I like Windows because it does 3rd-party devices better, fails better in config. areas (i.e. there are fewer config. issues and better documentation of them on the ‘net, in help forums, etc. due to bigger marketshare, admittedly) and I don’t have to mess with mount, lol. As the comments in this thread reveal, linux is good on servers, ok for novice home users, and is avoided by super users and average techies due to excessive gnarliness. Some company should test their linux with a dozen average techies who don’t now use linux — and every time the techies curse more than 3 expletives in a sentence they should sit down and make changes.
Linux also needs to simplify. This is something that DOS (and PCMag utilities) got very right. The average PCMag app, for years, was just 3 files — an .EXE, .HLP and .DOC. Linspire wins here, providing one package in each software category in the default installation. Yes, you have to pay to get more if you use their system — but (1) you don’t have to, (2) most users won’t mind terribly.
Maybe the mainstream linux companies — suse, ubuntu, linspire, red hat, mandrake, etc. — need to get together and create a single “lite” version of linux that each of them installs by default. Then, in the gui, you optionally select the superset you want to install, if any. They would still differentiate themselves via the supersets they offer, but the entire linux installed base would have a standardized core. The installation of that lite core could get screen resolution right, for a change. It could get sound, dvd, swf, pdf and laptop stuff nailed right off the bat. And for many home users, the simple addition of Open Office or an equivalent would be all they would need — reducing support issues to nothing, minimizing RAM demands and making a $100 to $200 PC desirable, not just doable.
It depends. This is where I can tell many of you have never been in business like this at all. When a customer goes to you for business they’re not interested in whether you use Photoshop or not. All they want is an image, or set of images, that you can produce for them for the business, website, literature etc. Whether you use Photoshop or the GIMP, they’re simply not interested.
I think the person who has no clue what he’s talking about is you. Do you work in computer graphics? I doubt it. Of course the customer doesn’t care what tools you use to get the job done, but he does want the job done, and done perfectly, because competition in graphic design and web design is fierce. You can’t compete if you use the Gimp because the result is very poor and ugly. Believe me, most graphic designers you’ll find in forums on the Internet have tried the Gimp, because it breaks one’s heart to have to pay almost $3,000 for Photoshop, but there’s no way around it if you want a perfect job and if you want to make a difference in the market. The tool makes the difference in this case. If you have friends who work in the field, ask them; you don’t seem well informed. You could use the Gimp, of course, but the result would be poor, while with Photoshop you could pay off the license in less than two weeks, because you can do a great job and charge it to your customer. That’s being smart. Smart investment, good results, satisfied customer, good money.
It’s funny that all these people come out saying the GIMP isn’t good enough, but no customer actually cares what you produce their images with as long as they get done. You can quite clearly get quite a bit done with the GIMP, but as to whether you need everything that’s in Photoshop when you reach the limits of the GIMP is a considered decision that you make.
Again, you have no clue what you’re talking about. You have probably never used Photoshop, nor have you created a masterpiece in the Gimp either! If you don’t know what you’re talking about, I think it’s better not to talk.
For the vast majority of people and businesses, unless they’re making a lot of money out of their Photoshop work to pay for their Photoshop licenses, it is simply too expensive regardless of whether people think that the alternatives are good enough or not. You may decide the GIMP is good enough, or go for Paint Shop Pro.
<sigh /> Photoshop is a good and necessary investment in the graphics industry. All communication agencies have Photoshop. Wondering why? Because the graphic design industry generates big bucks, people fight to grab customers, only the best survive, and you need to prove yourself. Photoshop is the best tool for graphic design, and you get a fast ROI because people pay big bucks for quality. A good designer can’t do a good job with poor software such as the ones you suggest.
You have no clue what you’re talking about apparently, sorry.
You can’t compete if you use the Gimp because the result is very poor and ugly. Believe me, most graphic designers you’ll find in forums on the Internet have tried the Gimp, because it breaks one’s heart to have to pay almost $3,000 for Photoshop, but there’s no way around it if you want a perfect job and if you want to make a difference in the market.
I have used both programs intensively, and both produce the same quality output for Web design (i.e. things that are not supposed to be printed). For print, Photoshop still holds a slight lead over GIMP, but as CMYK support becomes more mature on GIMP this should change.
GIMP doesn’t produce “poor and ugly” designs any more than Photoshop does: that all depends on the designer’s talent and knowledge of the tools.
Oh, and Photoshop doesn’t cost $3,000.
Photoshop is the best tool for graphic design, and you have a fast ROI because people pay big bucks for quality. A good designer can’t do a good job with poor software such as the ones you suggest.
GIMP isn’t “poor software”…apart from CMYK support (which is improving, but only matters for print work) GIMP is just as capable as Photoshop. I know, I use Photoshop extensively in my day job, and use GIMP at home.
Not only that, but GIMP can actually save .PSD files (Photoshop’s format) so you can very easily work in GIMP, then open up your file in Photoshop (and convert it to CMYK or Pantone) – which means you only need one copy of PS in the shop, which can be a real cost-saver in a graphics design shop.
Why do you think GIMP (as Cinepaint) has become so popular in Hollywood animation studios?
You have no clue what you’re talking about apparently, sorry.
Actually, it seems you have no clue about graphic design software. The other poster made some valuable points which you simply ignored. Sorry.
I’m really amazed at the comments about Linux; so many people here live in their Windows world with no actual knowledge of Linux, or with years-old, out-of-date myths.
People come to a forum I moderate still using Red Hat 8/9, so no wonder these comments exist; Linux has moved on, a world away from RH8/9. People who say Linux is not for the desktop simply don’t know what they are talking about. Is Windows really a decent desktop OS? It must have drawn the most complaints of any product in history: a desktop OS with viruses, worms, spyware, and security nightmares, and no standards-compliant HTML browser. Linux has none of this, and it’s free.
It’s funny that Longhorn, which is still over a year away, is getting features that the Linux desktop has had for years: previews of PDF, text and many other formats.
Stop the BS and get on with real facts.
i said dos-like not dos directly…
1-2 files control the basic startup after the kernel has done its bit. want to stop something from starting every time? comment out the line that starts it.
as for multiple shells, most linux distros come pre-configured with about 5-6 virtual terminals. if you’re not running X, then alt+f1 through f6 should bring up a new login prompt. this allows you to have, say, a browser like links or w3m up in one terminal while editing something in a different one, all without starting X. why win2k has to boot into gui mode to give me a safe mode with command line, i don’t know. ok, so there is the recovery console, but that requires additional files installed and adds an extra boot option in boot.ini or something like that.
and if i want to start X, then all i have to do is type startx in one of said terminals and up comes X with my desktop environment of choice. i can even start gui apps from any of the terminals and tell them to go find the desktop (although i wish they were able to do so on their own).
“randomapp :0 &” should do the trick. randomapp is the name of the app, :0 tells it to go to X display 0 on the local machine, and & puts it in the background so i can still use the shell.
from X i can still reach the virtual terminals with ctrl+alt+f1 through f6; ctrl+alt+f7 or f8 should give me X again.
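To make that pattern concrete, here is a minimal sketch. “randomapp” above is a placeholder app name; in this snippet `sleep` stands in for a real X client so it runs even without X installed — the display and backgrounding mechanics are the same:

```shell
# The pattern from the comment above: point an app at X display 0 and
# background it with & so the terminal stays usable.
# With a real X client it would look like:
#   DISPLAY=:0 someapp &     (someapp is a hypothetical app name)
DISPLAY=:0
export DISPLAY        # child processes inherit the display to draw on
sleep 5 &             # & detaches the "app"; the shell prompt comes right back
APP_PID=$!            # $! holds the PID of the backgrounded job
kill "$APP_PID"       # the shell keeps control, so the job can be managed
echo "backgrounded job had PID $APP_PID"
```

Setting `DISPLAY` in the environment works for any X client, whereas the bare `:0` argument only works for apps that parse it; the environment-variable form is the safer habit.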
dos was at one time almost this flexible. but you had to kill win3.x to get back to the dos shell, unless you fired up command.com in a window inside windows. and there was no way to restart a TSR program without restarting the whole machine.
as for installing a random package off the net: i have only tried this in mandrake 10.1, but when you double-click an rpm there, it will be sent to rpmdrake, the mandrake tool that handles package install and uninstall (basically a frontend for the urpmi command-line tool).
there it will be read, and any dependencies checked against rpmdrake’s list of ftp servers and cds to see if they can be fulfilled.
if not, you will be presented with a dialog telling you which rpms are missing. feed these to google and it will most likely point you to rpmfind or similar pages. download and repeat.
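For reference, the same flow can be driven from the command line; a rough transcript of what rpmdrake automates (the package name is illustrative, and urpmi needs root plus configured media):

```shell
$ rpm -qpR somepackage.rpm    # list what a downloaded rpm requires
$ su -
# urpmi somepackage.rpm       # urpmi tries to fulfil those requirements from
                              # its configured ftp/cd media, and names any
                              # rpms it cannot find
```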
waste of time…..
moving on….
you are right…. no reason for anyone to use linux…..
strange that people do, isn’t it…..
Hobglobin:
Thanks for the clarification on how multiple shells are loaded before the gui is (optionally) loaded.
BTW, renaming win.com allowed “just DOS” at startup. Also PC Mag’s “Install” and “Remove” allowed adding and removing TSRs (in DOS, or within a DOS box but not if TSR loaded before Windows admittedly).
Okay, here is my response to the three statements:
1. Installing applications is complicated.
I am one of the systems administrators on a college campus out in the wilds of New Mexico. We have been, until recently, a ‘Redmond’-only campus.
Starting my sixth year, I can honestly say there is some ‘Redmond’-based software that is a pain-in-the-ass to install, manage, and maintain. This is especially true for older software created for 3.1 and the early 95 flavors. Yes, you can ghost, image, and all that jazz. But that initial config can still bite.
For me, installing software in Linux is as easy as opening a terminal window as a superuser and typing the distribution’s update, upgrade, or install command. All the dependencies are taken care of. Done.
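As an illustration of those one-line installs (the command varies by distribution, and “gimp” is just an example package name — any of these resolves and fetches the dependencies automatically):

```shell
$ su -
# apt-get install gimp     (Debian/Ubuntu)
# urpmi gimp               (Mandrake)
# yum install gimp         (Fedora/Red Hat)
```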
2. Directory structures can be confusing to navigate.
This is as easy as Read The “Fine” Manual. Good lord. When I do a scan of my wife’s little XP box it shows 30 visible directories and over 7,000 files.
Need I remind many a ‘Redmond’ system admin of the learning curve they had when they started out driving with ‘Redmond’? I find the ‘Redmond’ registry less complex than the actual distribution of files on a ‘Redmond’ box.
3. Interface is confusing and inconsistent. A steep learning curve required to understand system functions.
It’s all very much ‘straightforward’ once you learn the grammar and syntax. I have found ‘Redmond’ limited and restrictive.
Again. crack open a manual in an area that doesn’t make sense. The ‘Redmond’ resource kit is a wonderful read too on learning the latest and greatest for the latest flavor of the OS.
Finally, while your arguments may sound – well – legit, A+B does not lead to C.
“MacOS is the most overrated of all OSes — a true pain in the butt. ONE mouse button?!?!?!”
Please, not that crap again: plug in a two-button mouse and it just works.
There were a few other stupid remarks you made about OSX; I won’t get into them. You obviously have a personal interest in playing devil’s advocate.
Talking about multi-tasking, security, stability and maintenance (MS did have some good ideas, but their implementations are the worst): compared to OSX, Windows is a sick joke.
I work at an IT company that is mainly focused on Windows, and I am deeply ashamed of what we push down ‘our’ customers’ throats and of the manpower/money it takes just to make it run and keep it running.
But in a sense I do understand you: if you don’t know an OS like OSX, or the difference between OSX and what came before it (which indeed was outdated), you may think that Windows is a ‘decent’ OS. It is not; Microsoft knows this, and that is why Longhorn is taking so long.
You won’t know, until you choose to know, we are living in a world of self-delusion.
> Matt24 (IP: —.solcon.nl) said:
> You won’t know, until you choose to know, we are living in a world of self-delusion.
This is the general rule, I agree with you.
But look at the recurring theme of these posts: old linux versions, statements repeated from 4 (four) years ago, talk of things that no longer apply (Macs can use multi-button mice now)… that reminds me of someone who wrote about strange talk in forums.
These criticisms seem copied from lists!
And the way some have lost their meaning (because developers already fixed the issues), yet are posted ad infinitum, as if the writer doesn’t care about what is going on in the OS scene.
Maybe I’m getting paranoid, huh?
It’s free of charge and widely available on free CD-ROMs. Why no one wants it then?
“It’s free of charge and widely available on free CD-ROMs. Why no one wants it then? ”
Silly useless comment.
Have a look here:
http://www.interknet.net/bt/stats.php
And here I am quoting a SUSE developer about what happened after SUSE 9.3 was released for free:
>>The situation at http://ftp.gwdg.de is worse than ever before, and worse than I ever thought could happen: this time, a new important release does not push the total output sum to a new height; it is even lower than during the “quiet days” before…
This is due to the size of the 9.3 tree: it is 29 GB, including the new iso/split/ directory I like to add, and this is far beyond the available buffer cache (ftp.gwdg.de has “only” 12 GB RAM).
So almost every download really moves the 15 available disk heads, and with 1000 to 2500 users at any moment, the disk heads spend more time moving than reading.<<
So nobody wants linux? Get your facts right before you write nonsense.
That is just *one* of the many dozens of SUSE mirrors.
It’s free of charge and widely available on free CD-ROMs. Why no one wants it then?
Plenty of people are using it. Over 25 million, according to some estimates.
Of course, many more people use Windows on the desktop, but that’s because it comes pre-installed with most new computers. And the reason it comes pre-installed with most new computers is because of MS’s monopolistic practices with OEMs – NOT because of the OS’ qualities.
Have a look here: