I’ve only been using GNU/Linux since 2001, so I won’t claim to be an expert; most of those reading this have probably been using Linux much longer than I have. Still, I have high hopes for the Linux scene. The purpose of this article is to voice my personal opinion on what I feel is keeping GNU/Linux from taking over the mainstream operating system market. My intention isn’t to “badtalk” the open source kernel and its applications, but rather to give constructive criticism on what I personally feel could be done better.
Introduction
Why is it, though, that Linux hasn’t yet taken over the mainstream market? There are quite a few completely free, downloadable distros, so why aren’t more people giving it a shot? In my view, several things need to improve before Linux can take over the market: ease of use, usability and third-party software support are among them.
The Competition: Out of Date?
To the best of my recollection, Windows XP was released in 2001. Predictably, that’s what the majority of PC users run these days, and it’s what I used myself during the brief period when I was taking Linux classes at a local community college out of sheer curiosity. Windows XP is getting old. The successor? None, yet. At least not until late 2005 or 2006 rolls around and brings Windows Longhorn with it.
But what about those who don’t want to wait that long for up-to-date software and the newest technologies? For quite a few of them, the answer is Linux. However, when most “casual users” think of Linux, they either picture geeks with bottle glasses and plaid shirts, or something completely unknown and, to some, scary. There are those, though, who welcome Linux with open arms, some of them your everyday web surfer or chatter, others people who get their kicks out of recompiling kernels or playing with Midnight Commander.
Ease Of Use
Ease of use is one of those areas where Linux actually is improving almost every day; things are definitely looking up, but some things still need work. When I first learned Linux, my distro of choice didn’t even have a “Computer” icon in GNOME (that didn’t arrive until GNOME 2.6, unless you made one yourself). Instead, I had to learn how to navigate to the /mnt folder and throw a few mount commands at the terminal just to view the files on a CD or hard disk.
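For the curious, “a few mount commands” meant something like this (device and mount point names vary from distro to distro, so treat it as a sketch):
    mount /dev/cdrom /mnt/cdrom    # make the disc's files visible under /mnt/cdrom
    ls /mnt/cdrom                  # browse them
    umount /mnt/cdrom              # let go of the disc before ejecting it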
If I had to pick one thing to improve above all others, I’d have to say software installation. No, I am not talking about installing the actual OS (we have Anaconda and many others for that) but rather installing and upgrading software applications.
For example, in order to install KDE 3.4 RC1 on my system, I found some APT repositories and downloaded away, and it worked. However, a new user to Linux (fresh from Windows) is accustomed to double-clicking a setup icon, clicking “next” four or five times followed by “Finish”. That won’t work here. The last time I tried to explain Apt to an avid Windows user, I got a blank stare.
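In practice, “explaining APT” meant walking them through something roughly like this (the repository address here is only an illustration, not a real one), which is second nature to me by now but looks like gibberish to someone expecting setup.exe:
    # add a repository line such as this to /etc/apt/sources.list
    deb http://example.org/kde34 ./
    apt-get update
    apt-get install kdelibs kdebase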
But why shouldn’t installing software on Linux be as simple as it is on Windows? Upgrading KDE made my system download somewhere around 30 packages. Meanwhile, XFCE, another up-and-coming desktop environment, recently released an installer and shocked the Linux world. I was so happy that I logged on to KDE’s bug report wizard and filed a wishlist item saying, basically, that it would be nice if KDE did this too. And how wonderful that would be, right? Nope. I was told in so many words that a KDE installer will never happen and that I should wait for my vendor to upgrade. I was also pointed toward Konstruct.
To me, “Wait for your vendor” is a Linux remark that should get thrown in the trash. The way I see it, without installers, I strongly believe that Linux will never amount to anything in the mainstream market. In addition, I don’t want to wait three to six months for the newest software, whenever my “vendor” releases a new revision. When Microsoft releases its newest DirectX, you don’t have to wait until the next version of Windows to utilize it, you are free to download and install it right away. I do understand that KDE has Konstruct and GNOME has GARNOME, but both require the command line, neither has ever worked for me, and both still need dependencies. (An ideal “installer” involves no command line.) When I say “installer” I mean something that you click on, choose components, and watch a progress bar, because that’s what the typical Windows user absolutely needs to feel at home. At the end of the day, the moment a new user is forced to find the command line is the moment Linux has failed for that user.
Usability
That leads right into the next subject I feel is important: usability. A good case study is at work, where they have just installed a brand-new “Internet Cafe” in the lunch room based on Knoppix. They basically installed the CD onto the PCs’ hard drives, but in such a way that the drives are read-only and reset every time the machines are rebooted. That’s actually a great way of doing it, but they forgot to include Macromedia’s Flash Player in the image.
Since I am the only “Linux guy” in my department, it falls to me to install the Flash Player for everyone, because I am the only one there who knows how. I fire up the command line, download the gzipped tarball, throw a few commands at the terminal, and Flash is ready to go. (A brand-new Linux user couldn’t do that!) I even caught someone who had somehow downloaded the Windows Flash installer, clicked on it repeatedly, and then asked, “Why isn’t this working?” Like I said before, I feel that the very moment Linux requires you to fire up Konsole is the moment Linux has failed. In this case it’s not Linux’s fault, since Macromedia made the installer, but you get my point.
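For comparison, here’s roughly what those “few commands” looked like (the tarball name is from memory, so take it as a sketch):
    tar xzf install_flash_player_7_linux.tar.gz    # unpack the download
    cd install_flash_player_7_linux
    ./flashplayer-installer                        # Macromedia's text-mode installer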
Linspire came up with the idea of Click-N-Run, which is a step in the right direction. With most other distros, however, Synaptic is the best you can get. My point is that this is exactly what we need: more Click-N-Runs and Synaptics. This is what I feel developers should focus on this year: installers a lot like Click-N-Run or Synaptic, but with even easier features. Another possibility is downloadable installers with apt-like capabilities built right in, to automatically resolve dependencies. That could work very well.
Third Party Support
I think it’s really good that more and more non-Linux developers are jumping onto the Linux bandwagon, but we could always use more. As it stands, the majority of new PC game releases are made exclusively for Windows. Perhaps some developers are too scared to program for Linux? At any rate, there are more technologies available now than there were in the past. For example, I find Cedega relatively good for Windows games, but Windows games will always run better in their native environment, no matter how good the compatibility layer is. As great as Cedega and Wine are, they don’t fix the immediate problem, which is that the industry leaders in this market, or any market, don’t develop for Linux. If that changed, I am sure it would give Linux the boost it needs.
Closing Arguments
I don’t apologize for making so many comparisons to Windows. Microsoft’s flagship operating system is the most used today, and that is more than likely because it’s easy to use. As the top dog in the industry, it makes comparisons inevitable. Windows or not, I’m sure we all want our PCs to be user friendly, no matter what the hardware or OS.
If any of you readers have an idea on how Linux will work better, don’t keep it to yourself! Post it at your local bugzilla or developer’s website, because no one will know your ideas unless you make them known. The strength of Linux will always be the people that use it, and in numbers, so use your power to convince those in power to make things better. If you don’t, it may never happen and I cannot stress enough how important it is.
If any developers are reading this, let’s come up with some installers, user friendly applications, and nullify the dependence most distributions have on the terminal.
You don’t get it. Even if Linux were 100 times better than Windows, people would still not switch to it. Not until it comes bundled with their computers, schools begin to adopt it, software and hardware vendors begin to realize its opportunities and corporate America begins to exploit its flexibility.
Linux supports more hardware devices than any proprietary operating system in the world, it runs on anything from wrist watches to supercomputers, it has more full-featured desktop environments than any proprietary operating system in the world, and it is exponentially customizable.
Pray tell me, how is it that an operating system that does all of that isn’t ready for the mainstream? The point is that the world is still coming to grips with Linux. People are overwhelmed by what it has to offer. Many don’t even know where to begin wielding its power. No other platform has presented them with so many options, so much freedom and customizability, as Linux does. They are in shock.
It’s not going to happen overnight, but Linux does have a future on the desktop. It can’t do it without the parties I mentioned above, though. Finally, people spent years investing in and learning Windows; today there are not many incentives to spend the same amount of time learning Linux, even though doing so can quadruple their productivity and reduce costs. The mainstream isn’t ready for Linux. Linux is ready for the desktop.
The way I see it, without installers, I strongly believe that Linux will never amount to anything in the mainstream market.
I see this as a step in the wrong direction. Installers are messy, non-standard, and potentially dangerous. A good package management system standardises package installation and calculates dependencies, which leads to fewer incompatibilities and broken installations. A good package manager can also verify packages with a GPG signature. I can never understand why this complaint still arises. It took me all of two weeks to realize the superiority of package management.
Yeah, we need something like Autopackage for the installer. Don’t expect it to be adopted anytime soon, though, with so many people walled up in their little distro fiefdoms.
The games support won’t happen for a long time, except from a few who do it because it’s cool, such as id and Epic, because (A) you’ll never recover your development costs targeting Linux right now and, more importantly, (B) even when there is substantial market share, customer service will be a nightmare for years and years to come. Linux and Mac have about equal market shares, but many more games are targeted at the Mac. Too many distros, too many configurations. Nobody wants to deal with that mess. Don’t expect many games to be targeted at Linux for at least another decade. And of course there is the chicken-and-egg dilemma for games too (nobody moves over to Linux because there are no games, but game producers won’t do a port because there is no market).
And not related to your comments, but Linux needs good RAD development tools. They need to take a look at Visual Studio and other Windows development environments for inspiration. Vi and Emacs don’t cut it in 2005. Well… vi keybindings cut it, just not vim itself.
The problem is that with package repositories you’re stuck in your own little distro universe, and it makes things a pain for developers too.
You don’t get it. Linux *is* 100 times better than Windows.
…Sorry, couldn’t resist
I believe that one of the biggest problems with Linux preventing ISV/IHV adoption is the number of options that need to be supported, together with API/ABI stability.
Having to support multiple competing and ever-changing technologies can be really difficult. Consider the number of options that are available today for Linux:
– various distros (incompatible paths, startup scripts, security models, software installation models)
– GTK vs Qt vs Motif vs … (Windows has one Win32 API, which has been essentially stable since the early 1990s)
– Gnome, KDE, XFCE desktop environments (Windows has one desktop which in most cases is backwards compatible with what you had back in 1995)
– different versions of glibc
– various threading models (Win32 has one since the early days of WinNT and it’s stable)
– various security patches (there’s one patch source in Windows)
– various hardware-accelerated display options (I cannot even name all X11 extensions that are required to make any serious use of modern graphics cards)
– various compilers (as long as you distribute just binaries this doesn’t matter much, but the open source development model makes you consider it; most ISVs wouldn’t want to open source their applications)
Also, consider API/ABI stability: Windows is known for its backwards compatibility. I have yet to see a six-year-old Linux GUI application that works unmodified on today’s standard Linux desktop configuration. Most of the time you have to deal with recompilation and/or dependency hell and/or patching your application to adhere to ever-changing APIs and ABIs.
Jarek
You’re exactly right on all counts, but there is so much groupthink out there insisting that the current model is the best that you shouldn’t expect things to change.
Someone has to step up to the plate and standardize some stuff – someone with deep pockets. “Bueller… Bueller… Bueller… IBM… IBM… IBM”?
Maybe the only real hope for Linux on the desktop is Mono, since in 7-8 years almost all new Windows apps will be .NET based; that, coupled with Wine.
Or…..someone steps up, takes the kernel and does something radically different with the userspace that distinguishes itself from Yet Another Distro.
> I have yet to see any six-year-old Linux GUI
> application that would work unmodified on any
> today’s standard Linux desktop configuration
Acrobat Reader 5. Not sure exactly how old it is, but it is _old_. Thankfully Adobe just released v7.
Half the article is a result of you not using Ubuntu. The other half is a result of you not using GNOME. Here’s how you install the Macromedia Flash Player for Firefox. Download http://fpdownload.macromedia.com/get/shockwave/flash/english/linux/…
When it finishes, click Open in the download manager (same as on Windows), and the GNOME archive viewer will open (same as WinZip). Click Extract and press OK (same as WinZip), open a file browser window to where you extracted it (just like you would on Windows), and double-click flashplayer-installer. That process was annoying and unnecessary, I’ll give you that, but there are a lot of things like that on Windows. If Macromedia wants to make its download more accessible, it should simply package it for APT.
The more I think about it, the more I believe that the centralized model of software deployment (distro repositories) will have to change. It’s too restrictive.
Development options and ease of development are great.
I think there are two main things stopping linux becoming mainstream in order of most importance:
1. MS Windows comes installed on most PCs, and Linux comes pre-installed on almost none. This needs to change for Linux to become more mainstream. People generally don’t go installing their own OS.
Once this is solved…
2. Installation of software is a nightmare in comparison to Windows. Many dependency problems arise.
With those done, Linux is a far superior beast to MS Windows, which is largely hacked-together, evolved, low-quality software (above the kernel level).
I know the command line is a powerful tool, but in order for Linux to make inroads in the desktop space, distros have to rely less on the command line. I’m talking about Mandrake: I still need to edit some of the config files to get my video drivers and refresh rates accepted.
Installing software is broken on Windows. Windows does not have the concept of a package manager, which is a sophisticated installer. Actually, Windows has a lot to learn from Linux in this regard.
Windows does not dominate the market because it’s easy to use. It’s because there is (was) no choice, all because of the OEMs.
It’s only now (at least in my country) that computer shops are starting to offer computers with Linux preinstalled (or let you buy a computer without an OS and not label you a piracy supporter for doing so); for Windows it has been done for years.
If you really think that regular Joes/Janes use Windows because it was so easy to install, think again: most of these users use it because it came preinstalled on the computer they bought.
If you really think that regular Joes/Janes use Windows because it was so easy to install, think again: most of these users use it because it came preinstalled on the computer they bought.
Be that as it may, I’m a developer, power desktop user and sysadmin, and if I bought machines with GNU/Linux preinstalled I’d format them and install an alternative straight away. Most likely Windows.
For my own machines too.
Quote:
“When Microsoft releases its newest Directx, you don’t have to wait until the next version of Windows to utilize it, you are free to download and install it right away”
That is because Microsoft is the software provider and the release manager.
GNOME or KDE only provide the software, they do not act as release managers for a linux distribution.
Quote:
“When I say “installer” I mean something that you click on, choose components, and watch a progress bar, because that’s what the typical Windows user absolutely needs to feel at home. At the end of the day, the moment a new user is forced to find the command line is the moment Linux has failed for that user.”
Do you have any proof of that? Any scientific studies?
It’s just not true what you say.
Most programs are so small that it’s not necessary to display a progress bar at all. It provides no meaningful feedback to the user in most cases; it’s only useful with large programs.
And remembering one single command is very easy, even for someone who has never ever used a computer before.
There is a reason why, for example, AutoCAD users type in commands when they draw and don’t use the toolbars a lot: it’s speed, and it’s a lot easier to remember a command than to lose your focus searching for a toolbar button.
I have said this before
While open source gives you freedom of choice, and that is great, I do believe that too many options can be a bad thing. I have seen this numerous times: when developers find a tool that does not do what they want it to do, they go off and write their own version instead of enhancing the original tool in the first place. Sometimes this has to do with the fact that the learning curve of making changes to the original source is higher than it would be to develop it from scratch.
But instead of one product that works, with fewer bugs and issues and more features, this leads to two projects with fewer features and more bugs. A prime example of this would be the package management tools.
This is one complaint I have heard numerous times: “… but in Microsoft Windows you only have to do this and this and it works, why do I have to do this, that and this if you want this, or if you want that then you need to do that, this, that, that and this in Linux”. In the end it just confuses the hell out of the end user, and he ends up not using the system at all, no matter how beautiful it is technically. The reasons why Microsoft just works (these days):
* They have unified development team
* One goal
* They have one product (be it a word editor, media player or IDE), but all of these products are feature rich and way less buggy than any open source product I have ever used.
Could you imagine uniting all the open source developers around the globe to work on one operating system (kernel), one desktop environment, one office suite, one mail client, one browser, one database server, one VPN client, one package management, one config tool, one IDE, one etc. You would truly have a force that is able to overthrow the market as we know it today.
Where some people might see the number of different Linux distributions as the success of Linux, I see it as its major downfall.
I am not saying let’s remove competition or choice, but I think the open source market needs a global focus instead of branching at every chance it gets.
Just a thought!
Quote:
“…but Linux needs good RAD development tools. They need to take a look at Visual Studio and other Windows development environments for inspiration. Vi and Emacs don’t cut it in 2005.”
A programmer who can’t program with a normal text editor will not be able to program with a state-of-the-art IDE either.
Try to take a look at KDevelop.
It does not have all the bells and whistles of Visual Studio, but for any normal programmer it’s more than enough.
I always wondered why there couldn’t be a simple GUI interface to do ./configure && make && make install on a tar.gz. Almost all Linux software is packaged like this, and the routine works on any distro and platform.
Maybe there could be some kind of halfway step so that the whole compilation did not have to be done, but it would still work for x86, PPC, SPARC etc.
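The routine itself is trivial to script; even something as dumb as this sketch (which assumes the tarball unpacks into a directory of the same name) could sit behind a single GUI button:
    #!/bin/sh
    # build and install the classic way; $1 is a foo-1.0.tar.gz style archive
    set -e
    tar xzf "$1"
    cd "$(basename "$1" .tar.gz)"
    ./configure && make && make install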
The Autopackage stuff is interesting. There’s a good interview about it at http://www.lugradio.org .
Although I think it could be dangerous for end users to try to install a new X server or something else low-level, unless they know how to fix it from the command line.
It could work if applications came as autopackages and the base system came through the distro. But if they step on each other’s toes it could get messy.
Linux needs to learn some things from OS X, especially with installing software. And are there any security problems with that? I don’t think so!
Actually, the OS X way is equally broken. Statically linked binaries are just backward. They are so 70s.
When I put Ubuntu on someone’s computer, enable universe in Synaptic, and show them how to browse the categories, read the descriptions and click install, then I am confident that they can install thousands of random packages and their system will remain clean, virus free and stable.
Would you tell a Windows user to just search for something on Google, and download and install it?
And with Synaptic, if a new version of random-package comes out, they will be prompted to upgrade by that great new update icon thingy in the system tray.
Sometimes this has to do with the fact that the learning curve of making changes to the original source is higher than it would be to develop it from scratch.
On the contrary: altering existing source, if you know how to code, is a lot easier than starting from scratch, at least when most of the source is properly documented. Starting from scratch means facing the possibility of the same pitfalls the other developer(s) have already fallen into, which means both a higher learning curve and a bigger time investment.
Aaaah, that’s why ISVs release their software in different versions for Windows 98, ME, NT, 2003, XP etc., because it’s all backwards compatible, right.
A programmer who can’t program with a normal text editor will not be able to program with a state-of-the-art IDE either.
Yeah, and you could program using a hex editor too. The 70s are calling and they want their development tools back.
What most people are missing is that Windows, compared to Linux, is only one distribution (don’t tell me XP and 9x are different; I mean for “desktop users”), while in Linux you have a lot of distributions, each with its own configuration. So you can’t think of Linux targeting the desktop market, but rather of distribution X targeting the desktop market (like Ubuntu or Linspire).
Linux may need to learn some things, but not those things.
Installers are the weakest point of Windows. They work okay for installing a program, but try using Add/Remove Programs (bit of a misnomer since you basically can’t add programs using it) to uninstall stuff. You wind up in a mess of individual proprietary installers.
The worst candidate here would be Norton’s: clicking “uninstall” refuses to work. Going through “change” requires two or three reboots to remove all the programs from the list, and your system is still *covered* in cruft from it.
Package management is one of the strongest points of Linux – and the various GUIs have nothing to do with the console. Even Portage has a frontend available…
“When Microsoft releases its newest Directx, you don’t have to wait until the next version of Windows to utilize it, you are free to download and install it right away”
Well not really. You get bugged repeatedly about “validating” your copy of Windows. Having downloaded it, you find out it’s an incredibly awkward web setup that requires more waiting and lots of clicking.
Whereas if I want to install a package on here, I can tell Linux to install it along with many others, and it will do it without bothering me. Windows Update won’t do such a thing because every second package “requires” a reboot.
Linux may have a long way to go, but at least it’s aiming at the right place. Windows will never get there.
I just wish there could be one package standard and not ten; it would make things a lot easier to have a standard package that was supported on all distros.
http://bitsofnews.com
So maybe there needs to be an uber-standard distro? A free one, I might add (not Linspire). One that all companies can develop for: game developers, application developers. That way devs don’t have to worry about starting services with crazy paths, or libs placed in strange locations. Slackware/Debian/Gentoo/Mandrake/Redhat/SuSE/etc. “Linux is about choice.” Well, your “choice” is your downfall. Maybe I’ll create my own Linux distro next week. And make the paths even stranger. And create a new package system. A new installer format. YAY! It’s all about choice. Your “choice” is why Linux is a PITA to port apps to. It sucks saying “yes, it will work on Red Hat and SuSE but not Debian or Gentoo, sorry, not without modification”. That’s why Linux will never be a real contender in the desktop market. Plain and simple. Now go cry about MS being “an evil monopoly”. You praise these fly-by-night distros and put Linux on such a high pedestal, when that is what’s killing you. There’s all this talk about “what if Linux fragments”. Umm, it has. It has shattered into a million different distros. And that is why commercial entities don’t make applications for Linux (I guess I’m going to hear “well IBM loves us, Novell backs us”, blah blah blah, whatever, a handful of companies with deep pockets. Linux is R&D to those guys right now. Nothing more or less. Get your head out of your asses.). Your “choice” will be the death of you.
Quote:
“Yeah, and you could program using a hex editor too. The 70s are calling and they want their development tools back.”
Hmmm, for my needs, GUI design, for example, takes only a very small part of the development cycle of a program.
Most of the time is spent actually coding the program and its algorithms.
Let me tell you a story, a real one, from my own area of expertise:
I’m a mechanical engineer. I design objects.
There are several ways in doing this.
But the trend these days is to use software packages that display nice graphics of where an object has stress points or how it should be moulded etc… The possibilities are endless.
These programs let an inexperienced engineer, or even a person who does not understand anything about mechanical engineering, design things.
Is that a good idea?
ABSOLUTELY NOT !!!
You still need to understand what you’re doing.
You need to know the dirty details in order to interpret the results of such programs / tools.
The same for RAD tools.
They are only a tool, they are not the holy grail of software development.
RAD can be useful in some cases, like designing prototypes or laying out the dialogs, etc.
They do not solve the problems of coming up with good algorithms, of designing a useful database structure, or of maintaining your program during its lifecycle.
RAD is only a tool used in a fraction of the development process.
Like I said before: even the dumbest person can create a small program with a RAD environment. But that doesn’t guarantee that the program has any quality. To create quality software you need to know what you’re doing.
Quote:
““Actually, Windows has a lot to learn from Linux in this regard.” Yeah, what with the rapidly expanding user base of GNU/Linux, I’m sure Windows has loads to learn from GNU/Linux. Dream on.”
Oh, you know, cause the market leader always does everything right and there isn’t a single advantage to any alternative platform, right?
Quote:
“Linux is R&D to those guys right now. Nothing more or less. Get your head out of your asses.”
Linux R&D to Novell? I suppose NetWare is the platform of the future and Novell will always be able to support itself on NetWare, right? I’d say Linux is more like Novell’s hope of keeping its head above water for another 10 years.
Well, IBM has begun to put AIX on the backburner (considering Linux has a higher marketshare right now), has ported all of their software to Linux, offers full support for Linux on their servers, has formed strong partnerships with both Novell and Red Hat, and is making tons of money supporting Linux…..but yes, you’re absolutely right, Linux is simply R&D and IBM is making absolutely no money off of it.
Wow, Novell has always loved Linux and been a major player in its development… Big deal. One almost-washed-up company. IBM will put money into shit if they can sell it. That says nothing about Linux. IBM knows Linux is better than Unix. Like that’s hard to be, anyway.
I know some may think I’m nuts, but it gets confusing with all these different distros.
I know it’s about open source, but it would go a long way; we are talking about making Linux better and more used.
I have heard Slackware is good, but the average user or newbie would be confused trying to install it.
Mandrake was the easiest distro I used.
Ubuntu was a pain in the butt.
All I’m saying is the people in Linux must work together.
That’s the whole key.
I agree completely with the article’s author, but with new distros like Ubuntu we are moving in the right direction.
For general adoption the solution is simplicity.
I would really like to know how installing new programs is handled on Mac OS X; after all, this OS is well known for its simplicity and ease of use.
After all, OS X is Unix at its core, derived from BSD, so how does it work?
Could this approach be used in the Linux world too, or not?
bye
Roberto
Ease Of Use
For example, in order to install KDE 3.4 RC1 on my system, I found some APT repositories and downloaded away, and it worked. However, a new user to Linux (fresh from Windows) is accustomed to double-clicking a setup icon,
KDE 3.4 *RC1*
A package that is a release candidate shouldn’t be directly accessible to newcomers, to prevent them from taking on more obstacles than they can handle at that point. Mostly the OS gets the blame anyway, for unjust reasons, and that doesn’t do anybody a favor. For example: Gentoo testing packages are all masked. An advanced user, or a smart one who always immediately heads for the documentation, would know that you can edit /etc/portage/package.keywords and write something like “=x11-themes/commonbox-styles-extra-0.2-r1” in it to install the masked package commonbox-styles-extra-0.2-r1. Anyone with a fair grasp of reason understands that the described exercise isn’t recommended. The same goes for installing packages or compiling from source obtained outside the official repositories. Debian has divided its packages into three categories: “stable”, “testing” and “unstable”.
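To make it concrete, the unmasking exercise described above boils down to something like this on Gentoo (run as root; the package atom is just the example from above):
    echo "=x11-themes/commonbox-styles-extra-0.2-r1" >> /etc/portage/package.keywords
    emerge "=x11-themes/commonbox-styles-extra-0.2-r1"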
In my humble opinion there’s not really a good reason to focus more on installers. What’s necessary is a standard package manager, in the same sense that the kernel is mandatory in one way or another. A package manager like Smart from Mandriva, for example, but with an extensive environment-variable “database” covering a lot of distros (you can argue about how many). As an addition, perhaps someone could research how installing from a source that isn’t in the various repositories could be better integrated into such a package manager.
Broad unison on elemental but vital key issues, which doesn’t per se have to conflict with the culture and the way the distros look at certain things, is what should be strived for.
To me, “Wait for your vendor” is a Linux remark that should get thrown in the trash.
That’s your good right, but it doesn’t and can’t apply to everyone. With power comes great responsibility. Linux is very powerful: you can in fact do anything with it and configure it down to the last bit, *when you have the skills to pull it off*. Until then, the good distros protect the OS from you.
Usability
That’s proportional to the user knowing what he/she wants and to his/her skills, on any OS.
As with taste, you can hardly argue about usability; it’s up to the user to decide in the end.
Third Party Support
Clear; there are enough great distros with great communities, even with the capability to assist with or handle custom (tailor-made) solutions with support, and you would be surprised at the costs.
Then you might see more games for the Linux platform 🙂
final thoughts
If any of you readers have an idea on how Linux will work better, don’t keep it to yourself! Post it at your local bugzilla or developer’s website, because no one will know your ideas unless you make them known. The strength of Linux will always be the people that use it, and in numbers, so use your power to convince those in power to make things better. If you don’t, it may never happen and I cannot stress enough how important it is.
If any developers are reading this, let’s come up with some installers, user friendly applications, and nullify the dependence most distributions have on the terminal.
Feedback is tremendously important. It’s one of the great strengths of the OSS community.
Linux is a reflection of reality. The world is a complex and confusing place to live in. Nobody can change the way the world works. Nobody can exert undue influence on how Linux works. I think people are overwhelmed by this phenomenon.
Whining about the choice Linux affords you is like whining about the diversity of fish in the ocean. Do you propose we kill all but one species of the fish in the ocean, so that we can have only one to study?
Get real.
Now I don’t think we should kill all the fish in the sea. But then again, if those fish needed to wear pants and there were only one or two sizes, I imagine evolution would sort that out.
As for whining: well, I don’t use Linux on the desktop. I have two Gentoo servers. I used to run Linux on my desktop until I got fed up with it. I want to see Linux grow and mature and take hold of more market share and be seen as a force. I’d love that. But I’m no zealot. I’m not blind to the real issues.
Linux does have installers: the package management systems. They could be made to look a bit better (a bit more like normal installers), but they do the job. The problem is the variety of packaging formats: currently there are Red Hat-compatible RPMs (Red Hat, Fedora, Mandriva), SuSE RPMs and DEBs. This isn’t too bad, but a bit more consolidation would be nice.
Linux is also very easy to use. Some of the advanced features like ioslaves need some advanced knowledge, but then, they’re advanced features.
What’s hard is configuring and administering a Linux system: this is really up to the individual distro to provide quality tools. In terms of ease of use and completeness, I’d rank the best as Mandriva, SuSE, Redhat and others. As it is, KDE provides tools for scheduling jobs (kcron), user and group management (kuser), package management (the slightly unwieldy kpackage) and start-up management (can’t remember the name, krunlevel maybe).
The issues will sort themselves out. They always have. You seem to think the Linux community is blind to the issues that affect its adoption. They are not. What is frustrating is when people assume the role of armchair critic and naively believe they have the solutions to the problems when in reality they don’t.
Unless a universal installer/package format is created and the other distros adopt/support it, no, it will not get sorted out. This has been an issue for a long time, and it has yet to get sorted out. Another huge issue is that the distro of the week may not use RPMs or DEBs. Distros grow in popularity and then fall; it’s like an ocean swell. You may start supporting a distro when it’s almost at its peak (if you’re lucky) and then it falls in popularity and everyone moves on to the next weekly craze. You obviously fail to see this.
However nice and revolutionary KDE might be in itself, it’s still one small link in the toolchain. What use is a good link when there are weak ones that influence the whole toolchain in a negative way?
So, this rocked. I think this article made very legitimate points. And all of this weird static about an installer never working or happening is totally wrong. Why? Because people that you consider sheep want simplicity (I know I do), and they’d rather be playing their new game than trying to install it. Point n’ click, cut n’ paste, plug n’ play, click n’ drag: these things make up the very fabric of our digital experience. And if open source wants to be pre-installed on PCs across the globe, it had better respond to market demands. It’s not that Microsoft thought easy installation would be a good way to control people; I’m sure they learned from their customers that that is what they want. A computer is meant to conform to the human, not the other way around. As of now, Linux and other OS projects still put too many demands on the user to have foreknowledge of commands and processes. I’m sure there are people who use Windows and have never touched the command line. I use an Apple and nearly never need to. Are these people inferior users? No way; they know that a computer should conform to them, that it should have foreknowledge of their habits and workflows. The computer is a working, entertainment, and teaching tool. It should work for me, it should entertain me, and it should teach me, out of the box, the less I have to refer to an “instruction manual” the better. I want the computer to be doing the processing, not me; I have other things to do, like use the software I’m trying to install.
Cheers
There you go. You believe having a universal installer automatically solves the Linux issues and tomorrow everybody will be running Linux. That’s naive.
How can it be more difficult? Most applications running under OS X install by drag and drop into your Applications folder…
No, I don’t think it would solve all of Linux’s problems; it still has quite a few more obstacles to overcome, some very obvious and some unseen. I don’t think it would make people switch overnight. It would be a great start, though, and a much needed fix.
With OS X you drag a single icon into your applications folder or (typically if there are multiple files) it does it for you via an installer that more or less always looks and functions the same and never alienates the user. Which means I do it once and I will always know how to do it henceforth.
Simple, beautiful.
Half the article is a result of you not using Ubuntu. The other half is a result of you not using GNOME.
🙂 You actually confirm the reality behind the article with this line
That’s where I disagree again. Even if Linux had a universal installer and it was 100 times better than Windows, people will still not switch. The problem is not that superficial. The problem is more cultural, political and psychological, than it is mundane.
It took me 3 months to wean my cousin off Photoshop and onto the GIMP. Prior to that, it was either Photoshop or nothing. He would never have considered using the GIMP if I hadn’t forced him to learn it, with the added bonus that it cost nothing.
Today, he finds the GIMP’s interface more intuitive than Photoshop’s, and swears by the scriptability and the plugins available for it. Today he prefers the GIMP to Photoshop.
Why was he so resistant to even try out The GIMP? There were many reasons, but none of it had to do with the difficulty of installing, package management, or the thousand linux distros. All of it had to do with culture and other social issues.
Right.
A single package that can be installed on all systems is one very important thing. I pray Autopackage can provide a good and useful solution for that.
A module for Autopackage in e.g. Yast2, Yum, Red Carpet, … could already be a big step forward.
Although I have to note that most distros include SO MANY applications on their CDs that most desktop users are unlikely to need a package that isn’t on there. But when they do is when the trouble starts, since they have to use an installation method other than the one they have always used.
The other is, like someone said already: systems pre-installed with Linux, with full hardware functionality: working sound, network card, audio, 3D card with full acceleration, … where you avoid all of the frustration people have in not getting their hardware to work, simply by giving them hardware that works to begin with. Brilliant! I know and agree they should do some research, but most don’t realize that, and even when they do, few will bother making the effort.
I’ve used several different Linux distros daily for the past 15 years: Ubuntu, Fedora, Debian, SuSE, Mandrake, just to name a few. I fully agree with the author about usability.
Opening any type of console to do some installing, setup or related work is where Linux fails to live up to its popular counterpart.
If you have to know even one command to install or update a program, your average user will ask for help from a professional.
The purpose of any given OS in desktop use (or any use) is to fulfill the needs of the user. The user doesn’t “speak computer” and the computer doesn’t “speak user”, so the OS is there as a middle man to help them (user and computer) agree on things.
When your average user is expected to type something into the console, it’s like telling the user to learn to “speak computer”. But I think that is a task for the OS.
Sure, I could go into details about the GUI, the mouse, etc., but I don’t think that’s necessary to make a general point like this.
Quote:
“If you have to know even one command to install or update a program, your average user will ask for help from a professional.”
And if they double click an icon, and are presented with questions, they suddenly are computer scientists?
Of course not, even with those gui installers, people still need help.
In fact, with any method of installation, when the user is presented with options, they need help, no matter how the options are presented.
The problem of linux is not the installation of software.
On the contrary.
The problem is that most people are used to Windows, and that most people are conservative and do not like change.
Afterall, once you’re used to it, it becomes easier than anything else because you know how to do it.
>I have heard slackware is good but the average user or
>newbie would be confused trying to install it
>Mandrake was..
>Ubuntu was..
>All im saying is the people in linux must work together
>thats the whole key
The key to what? Desktop domination?
Desktop domination isn’t much of a goal in the Linux crowd, you know. Most of them just try their best to make systems they like, not to please some fastidious Windows convert. Apart from a few commercially oriented Linux companies, almost nobody actually dreams of making the distros pleasing for newbies in the first place, because the distros are “desktop ready” for the people who make them, and have been for some time now.
And most of the people who repeatedly moan about “unusable” desktops, hobbyist developers, “too much choice”, missing “installers” and all that stuff, like you do, for example, most certainly don’t do any Linux development at all, and just scream because they’d like to have Linux as a gratis Windows alternative, but are not willing to change a single one of their Windows desktop habits.
The best proof that people like you have no interest in a community development process, or in actual software freedom, is this ridiculous demand to stop development of most of the lesser-known distributions.
I mean, what’s wrong with taking the first one, like Mandrake, which you seem to like, and simply ignoring all the others? Why do we have to stomp all the other distributions into the ground, just to give the best chances and mind- and market-share to a single one, to compete against Microsoft? Have you been so brainwashed with “one system, one start button, one vendor, one führer” monopoly monoculture slogans that all you can think of is such a highlanderistic fight to the death, until “there can be only one”?
Aaaah, that’s why ISVs release their software in different versions for Windows 98, ME, NT, 2003, XP etc., because it’s all backwards compatible, right.
No they don’t, actually. Most vendors will create unified software packages for 98/NT/XP etc. and it will just work. MSI is actually available for 95 and above, but realistically only works for 98 and above. Binaries that you had on 98 will mostly just work on XP. I’ve got games from about 96/97/98 that still work absolutely fine.
Where this falls down is the drivers side of things, for fairly obvious reasons.
“I’ve got games from about 96/97/98 that still work absolutely fine.”
And I’ve got a couple of games and some language training software that DOES NOT work in XP, no matter what I try. 98, ME or 2000, all worked fine (although I wouldn’t even recommend ME to my worst enemy…)
Agreed. Installers aren’t the way to go. There’s a reason why even Microsoft is pushing ‘installation packages’ vs. stand-alone installers these days.
Enhancements such as an easy to browse Synaptic-style interface (with pictures and more detailed text). Alternatively, use a plugin for Firefox/Mozilla/… that allows you to safely install apps. A whitelist — enforced by the system — would give some protection against the scammers.
Make the installed programs (if not installed as root) go into an isolated environment (Xen, maybe) and many of the potential problems go away.
Standard API
– Linux does not have a standard API that everyone uses, like DirectX or OpenGL/Carbon. Having one would eliminate the need for dependencies, which are, no matter how you look at it, nasty.
Standard Software Installation
– A universal installation package is needed.
– Users need to be able to choose where they install to in the case of programs.
– An uninstallation package is also needed (one much better than Windows’, one which actually removes the programs and their associations).
Standard Hardware Installation
– This is needed if linux is ever to take off.
Centralized and Easy Configuration.
– Installing and configuring a hardware device that is not supported by the kernel should not require editing, in some cases, up to four text files.
People that say the reason linux isn’t used is more a cultural thing are wrong. The reason why it is used is a cultural thing. I have plenty of people that come into the computer store I work at and ask about linux. Microsoft is about the most hated software company in the industry.
You are blind if you want it to be widely used and supported but aim it only toward computer gurus. BTW: the store I work for does sell a couple of Linspire-based Linux systems.
There’s hardly any need to support legacy junk on Linux, because its development model is open, dynamic and evolutionary. This is in stark contrast with proprietary models like Apple’s or Microsoft’s.
One development model is analogous to the moving ocean, the other to a slab of ice. It’s a question of dynamic development versus static development.
What do you think about Kalypso?
Most corporations are moving, or have moved, to laptops, and this is where Linux needs a lot of attention if it wants to be the “desktop”. Until Linux can install and run on a variety of laptop hardware, it will never seriously make it to the desktop.
I do have one annoyance with upgrading my Linux system (Fedora Core 3); if I’m upgrading, say, 100+ packages and 1/2 have unmet dependencies, neither APT/Synaptic nor YUM gracefully drop the 1/2 that can’t be upgraded and continue on with the 1/2 that can be upgraded.
Instead, both UIs throw up errors and stop all updates.
To work around that, I’ve checked for updates, taken the list of updates, and written a simple, dumb script to individually install each package. That way, if something can’t be upgraded it won’t be, but the rest will be upgraded fine.
To give you an idea how simple the script is, it only contains 1 line per package like the following;
yum -y upgrade gedit
The line above reads “run yum, answer Yes to all prompts, upgrade the package gedit”. If you haven’t used yum, it will handle extra dependencies for you.
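If it helps anyone, the whole script boils down to a loop; here’s a minimal sketch (the package names are just examples, substitute your own list):
    #!/bin/sh
    # try each package on its own, so one broken dependency doesn't block the rest
    for pkg in gedit gimp firefox; do
        yum -y upgrade "$pkg"
    done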
The idea of a portable universal packaging standard is wonderful (see autopackage.org). But who is going to do the work? What incentive does a developer have to drop what he is doing, forego his family and personal life and spend endless hours on something which will bring him no personal benefit?
I think that people are missing the point of open source. The goal is to contribute. It is not like commercial software at all. There is no help desk. There is no room full of eager engineers waiting for the phone to ring so that they can have the joy of serving your every need.
If what you want doesn’t exist yet, or is not of the quality you desire, do it yourself. Don’t use the lame excuse that you are too busy. People have already donated countless hours (millions of man-years?) to open source. Don’t say that you can’t contribute. Get a book, learn to code. Write a manual, author a helpset, set up a web site, help distribute. After all, developers are merely users, too, and are doing those things right now.
The spirit of open source is a selfish form of altruism. Make a program to help yourself or to indulge in a hobby. Then let others enjoy it also.
“I think that people are missing the point of open source. The goal is to contribute. It is not like commercial software at all. There is no help desk. There is no room full of eager engineers waiting for the phone to ring so that they can have the joy of serving your every need. ”
If you pay for support, sure. You probably meant proprietary software, since there is nothing preventing open source software from being sold commercially.
No game support, no switching. That is reality. Saying “go buy a console” may as well be saying “go use the Windows XP or 2000 that came with your computer.”
The Wine project has already allowed some businesses with legacy Windows applications to switch over to Linux.
Many older Win9x games now ONLY run with Wine or Cedega. I expect this to continue as windows abandons old software with future versions.
Schools are huge. Getting a Linux distro tailored for an entire school system (similar to K12LTSP) would be an enormous savings for the school, except for the fact that most school system administrators have probably never even heard of Linux; their choices are Mac (OS X) or PC (Windows), and usually, because of cost, it’s a PC setup.
“The mainstream isn’t ready for Linux. Linux is ready for the desktop.”
This is the most common attitude I see among many Linux users; I think it’s wrong, and it’s the reason for the scarce adoption of Linux on desktops.
Yes, Linux the kernel, the OS, the community, the model of development… they have great power.
But think of a Win9x user, a WinXP user, a Symbian user, a MacOS or OS X user: when they have a new thing to do with their equipment (use a new program or new hardware), OK, they do it in one click or two.
This is the “personal computing” way: what do you want to do today? You have a programmable hardware, and I (vendor x, or “the community”) give you the means to do the new funny things.
Simply this, that’s all. If you have the time and knowledge to occupy yourself with dependencies, file versions and paths, to go mad searching for the right drivers or checking the version and path of libraries in your system, etc., maybe you are doing server computing, or professional software installation, or you are setting up boxes. That’s NOT personal computing.
OK, OK, there are RPM and many other interesting installer projects, but tell me, why do even the most Linux-oriented companies make installers for Windows/OS X/Symbian that are WAY simpler and better than the ones on Linux?
Because on Linux you have to check the versions of installed libraries, the paths, kernel customizations and so on…
IMHO, to be successful on the desktop Linux should go through something similar to Darwin/OS X: one thing is the community and the experimental Linux where anyone can touch and move everything; another is “PenguinOS” or something similar, which knows how to auto-check library versions and paths, kernel patches and so on, to give you a really one-click-and-forget installer/uninstaller.
The fragmentation of the Linux-related scene is worrying. It’s a good thing for research and for guaranteeing that no good and worthy work or fork can be “imprisoned” by the majors, but on the other hand it’s a serious obstacle to creating uniform system automation, quite painful for users and more painful for commercial developers (the ones that should supply easy-to-install-and-use products).
As with everything else there is always room for improvement. However, lately Linux is becoming extremely popular. Everyone I know, including myself, has switched to Linux and everyone loves it. Windows has become a virus and spyware ridden disaster and people are tired of infested computers and loss of data. This is certainly a main factor that helped Linux become that popular and once someone gets a taste of Linux he’ll stick with it.
Third party software companies could produce software for the top two distros then the other distros could create a conversion utility to install the software. There are already conversion utilities for DEBs, RPMs and TGZs. If a distro wants to veer off the FHS path then they should be responsible for creating the conversion utilities.
I am surprised that the small distros haven’t done this already. Since the bigger distros have done most of the work packaging the software, why not leech off their repositories?
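The conversion utilities already exist; alien, for example, will happily turn one format into another with something roughly like this (the package names are made up):
    alien --to-deb some-program-1.0-1.i386.rpm    # produce a .deb from the .rpm
    alien --to-rpm some-program_1.0-1_i386.deb    # and the reverse direction
    alien --to-tgz some-program-1.0-1.i386.rpm    # or a plain Slackware-style tarball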
On the GUI front, I work for a company that still uses some DOS programs. The office people use these DOS programs along with programs like Word and Outlook. I am sure when these office workers go home to use their own computers, they could use GUI and CLI programs just as well.
People can cope with differences in GUIs too. For those people who work with Win2K at work then go home and use WinXP, do you encounter differences in the GUIs? There are differences between Win98, WinXP and Win2K but not too many people complain about it. Moving between Windows, MacOSX, Gnome and KDE just takes a little bit of time.
Linux will arrive on the desktop sometime. Those who use linux now do not need to wait.
You don’t say anything new.
ABI, API, and threading models are a bit of a canard to complain about. API compatibility is just as stable under Linux as under Windows or Mac OS X. ABI compatibility has gone through a couple of upheavals for C++ over the past few years, but blame the ISO for taking so long to settle on a C++ standard. It’s pretty stable now and looks like it will remain so for some time. The threading API has remained pretty consistent too. Granted, the semantics of threads have changed somewhat, but they still obey the POSIX rules, and if the programmer adheres to the rules, all will be well. Again, the threading implementation has also stabilized.
That leaves packages… Well, there are graphical package managers out there, and they ship with most distributions. There are two package formats (RPM and DEB) that can take care of all the dependencies for you, provided that you have repositories available for the dependencies. My only addition would be a method for a vendor to temporarily add a repository, if need be, to your list.
What I would like to see: standardization on RPM as a package format, and apt4rpm as a repository handling tool. I’d then like to see a standard set of RPM macros and utilities to go with the packages. For example, Mandrake’s macro set includes an ‘mdkmenu’ macro for adding items to the menus in KDE and GNOME. Why not an ‘add_desktop_menu_item’ macro that has RPM run a utility to do whatever magic the distribution requires to add menu items to the menus? It’s not complex. I’ve been thinking of doing this myself, actually. Coupled with some naming conventions for applications and libraries (which seem to be converging anyway), I think you could easily make RPMs entirely distribution agnostic.
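To sketch what I mean (the macro is purely hypothetical, nothing ships it today), each distribution could define it in its own macro file and map it onto whatever its menu system needs, while spec files stay identical everywhere:
    # in the distro's rpm macro file, e.g. /etc/rpm/macros.desktop (hypothetical)
    %add_desktop_menu_item desktop-file-install --dir=%{buildroot}%{_datadir}/applications %{SOURCE1}

    # in any package's .spec file
    %install
    %add_desktop_menu_item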
Quote:
“When Microsoft releases its newest Directx, you don’t have to wait until the next version of Windows to utilize it, you are free to download and install it right away”
Perhaps the author has a special agreement with Microsoft and he gets the new DirectX version directly from the development team and does not have to wait for Microsoft to release it to the public.
Or maybe he just uses a distribution that does not provide updates to packages when the respective project releases a new version.
Live CD distributions usually don’t maintain their own package repositories, so they release a new version when a sufficient number of new packages is available.
As the first possibility is extremely unlikely, I can only assume he is basing his comment on this issue on observations of the wrong kind of distribution.
The author of this article seems to have made the assumption that making Linux more like Windows would be an improvement, and that Linux distros should concentrate on converting Windows users to Linux.
I don’t believe these assumptions are correct. Don’t get me wrong, I find no fault with Linux distros that try to imitate the Windows way of doing things. Such a distro might be a good fit for many people. However, many people leave the Windows world to find something better, not more of the same. If we liked the way Windows did things, we would have just continued using Windows.
Hello all,
I’m currently thinking of releasing a Linux port of one of my commercial Windows games, a miniature golf game called Garden Golf (http://www.funpause.com/gardengolf).
I’ve been doing kernel development (http://www.osnews.com/story.php?news_id=8162&page=3) but the userland is another story 🙂
For that game, I want to use a bunch of external libraries that don’t allow static linking due to the LGPL license, and that’s where the nightmare begins. It’s entirely up to the authors of these great libraries like SDL and Freetype to pick a license for their code, so no discussion there, but it does create issues for packaging.
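The obvious workaround is to ship the .so files next to the game and point the loader at them from a small launcher script; a rough sketch, with a made-up directory layout and binary name:

#!/bin/sh
# launcher placed in the game's install directory (layout and binary name are made up)
HERE=`dirname "$0"`
# prefer the bundled copies of SDL, FreeType etc., fall back to the system ones
LD_LIBRARY_PATH="$HERE/lib:$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
exec "$HERE/bin/gardengolf" "$@"

That keeps the shared libraries replaceable, as the LGPL requires, but it doesn’t answer how the whole thing should be packaged, updated and supported across distros.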
I’m looking at Autopackage, but if anyone has ideas, I’m a taker — especially since I’m not the only casual games developer trying to support Linux and looking at a development and support nightmare.
Emmanuel
Linux has a lot against it on the desktop right now.
1) What linux are you talking about? I know there is only one Linux, but the mainstream does not see it that way. Right there, that’s a huge problem.
2) The biggest problem of all, and one that the open source community will never solve – after all, how could they, they created it – is this: “Linux is nothing if you don’t stay in the loop”; it’s rolling news for geeks. Did you get the new installer from so-and-so? It’s cool. Did you hear about the new distro? It’s got a new desktop theme; I installed it yesterday, but I am switching back. Did you install Beagle yet? I got it running yesterday after a week of messing with it – it’s the best search for Linux! – this week, anyway.
In Linux you have to “catch up”, and then stay up, or you cannot use Linux. How can you solve this? The only way I can see is for one vendor to just take over and become the standard, and that vendor would have to stamp some sort of version on the product so support vendors would have a clue – until then it’s a geek soap opera for geeks.
Yesterday, I was adding a printer to my Linux computer. I had just added the network printer to a few Windows computers and now I was adding it to my Linux system. It was relatively easy to install, but the graphical interface used to add it was hideously ugly and unprofessional. I looked around and realized all the system tools and setup utilities were the same way. They were inconsistent. They would get in the way of things. They just were not that helpful.
I came to a conclusion last night. Linux will never overtake Windows if it continues its current approach to the GUI. To Linux, the GUI is an afterthought. It is a piecemeal solution, where controls are added one by one. It is a whole bunch of different solutions for controlling certain aspects of your system, all thrown in a bag and mixed together. The Windows GUI is so much more elegant and professional.
IMHO the greatest barrier to mass-market Linux acceptance is gaining access to the distribution itself. Given that there are a lot of people out there who are still on dialup, “Free”, “Easy to Use” and “Hardware Support” have no meaning after they have spent a week downloading a distro, then installing it and finding out that their winmodem won’t allow them to access the Internet. This assumes they know how to burn an .iso and change the boot sequence in their BIOS. What is needed is an easier way to get Linux (put the boxes back in stores, Novell…) and readily available resources for questions, resources that are not online (does your local community college have a Linux for newbies class?). One of the great contradictions has always been that the Linux advocates who want to see wider adoption of the OS ignore the fact that while a free OS should be popular with lower-income people, lower-income people do not have cable or DSL. The download/access, winmodem, and education issues will remain barriers until they are seriously addressed.
If Linux does break into the desktop market in a big way, it isn’t “Linux” that is going to be used; it will be “RedHat”, “JDS”, or “NLD”. That is what the average user will know they are using. It is only those distros with corporate backing that software companies will create apps for, and everyone else will be forced to make the apps work themselves on the distro of their choice. You see it already with enterprise apps only being supported on RedHat or SuSE. I use and prefer Debian, but if I want my app supported I have to use RedHat or SuSE or I am on my own. Most end users don’t want to be on their own, so they will be forced into certain distros, and then we will see posts about how RedHat, Sun and Novell are ruining Linux for everyone and their way sucks and it’s slow, etc…
Just use what you want, younger users have grown up with more exposure to computers and have a higher level of confidence when using them. If Linux is to win out it will be over time through a cultural change of users who want more control over their computers. As for businesses, they will still use whatever will protect their investments.
And I’ve got a couple of games and some language training software that DOES NOT work in XP, no matter what I try. 98, ME or 2000, all worked fine (although I wouldn’t even recommend ME to my worst enemy…)
Well poor you. If it worked in 2000 and doesn’t work in XP I find that very difficult to believe to be honest, but you can’t have absolutely everything.
How many times do we all get “upgrade to the latest version of…”, “just apt-get” or “recompile xxx” as answers when someone’s binary software doesn’t work on Linux? This just doesn’t wash, I’m afraid.
All that whining about installers.
Installers are an implementation detail. What’s needed is conformity to standards, kinda like the Debian derivatives do with the FHS (so yes, standards do exist). That’s why it’s not really a big problem to install FHS-compliant RPMs on a Debian system using e.g. alien or even rpm itself.
There’s not even a single package format for the “standard” Windows OS. Exercise: install a Windows box from scratch, and all the software you want to run on it. Keep a tally of how many installer types and versions you encounter. Native MSI, Wise and InstallShield are just a few I can come up with off the top of my head. The only common ground they seem to have is that they all come as either MSI or EXE. The reason all of these somewhat work for the end user is that a Windows system is *reasonably* standardized. Note that as soon as you dig a bit deeper, you’ll notice that the Linux situation regarding software installation standards and methods is a whole lot better than anything you encounter in Windows.
But all this is drivel. What we need to do is stop comparing Linux distributions to what we’re used to in Windows, and find our own damn market.
It’s about centralized repositories. If centralized repositories are going to be needed to install software then Linux has 0 chance of ever making a significant dent in the consumer home market.
Linux is too much of a mess to make many inroads into the home consumer market. Or, if there are not going to be standards, then someone has to come along and do something in userspace that distinguishes itself radically from Yet Another Distro.
Why is everyone obsessed with dumbing things down?
What happened to the days when you went out and attempted to READ documentation? Heard of a book?
No… you all want to use a very complicated piece of technology like it’s a toaster.
Sure, why not. We can make it that simple. The end result, though, is a generation of dumbed-down computer users that don’t know not to run that trojaned executable, that don’t know that rm -fr / is in fact a bad idea.
Of course, we could always make it so they *can’t* run that trojaned executable, and we could always rm rm, but is this the right way to go? Should developers be trying to protect users from their own lack of education?
At the end of the day GNU/Linux is not for everyone. One size rarely fits all.
As for improvements to Linux? I’d like dm-crypt to advance along a bit (not that it isn’t good now, but I think everyone would agree there’s room for improvement).
My personal wish for improvement in the “Linux” community is that they stop naming the OS after the kernel, heh, but I know that’s not generally going to happen – ahh well, we can wish.
Hmmm, for my needs, GUI design, for example, takes only a very small part of the development cycle of a program.
Most time is spent in actually coding the program and algorithms.
Well, RAD is probably a bad choice of words. What I’m really talking about is an alternative to dumb text editors and source code in files.
http://mindprod.com/projects/scid.html
It’s about centralized repositories. If centralized repositories are going to be needed to install software then Linux has 0 chance of ever making a significant dent in the consumer home market.
Well, centralised repositories are good for some things and not so good for others. With a repository users don’t have to do much to install software, and that works great for fairly static things like drivers. Installing drivers on Windows isn’t a great experience for anyone, it’s time consuming, awful and I hate it.
However, repositories don’t give you the choice of software to install and they sure as hell don’t make anything straightforward that ISVs can adequately support.
Linux distros fall apart at that junction, as the only way to upgrade your packages and apps (let’s face it, realistically) is to rely on your distributor to do it for you. You’re then heavily reliant on your distributor to support you not just in terms of your OS, but for all of the applications you use. For many businesses that is a situation they feel very uncomfortable with and can even make them feel more locked in than with proprietary software!
Strangely, I get the impression that most RPM-using distributions (but not limited to them) really like their awful packaging systems because of this. It means that, realistically, only they can support a repository of updates for their particular distribution, and any competing update repository, free or otherwise, is always playing catch-up. It also means that they can more effectively dictate a cut-off point for the end of their product’s life. If the repository is no longer available to you, realistically, you’re cut off from updating or installing any new software unless you upgrade. What is worse is that some distributions, like Xandros, actively make it difficult for you to install any open source software that competes with the commercial stuff they offer (like for CD burning).
A lot of Linux distributors seem hopelessly addicted to that kind of lock-in and it is putting many businesses and organisations off. I’m sure Microsoft would love to do all that, but they know that it just isn’t practical.
Be as pedantic as you like, but that’s the situation we have with many Linux distributions today.
Or, if there are not going to be standards, then someone has to come along and do something in userspace that distinguishes itself radically from Yet Another Distro.
Quite possibly, yes. However, I don’t see anyone willing to do that.
My personal wish for improvement in the “Linux” community is that they stop naming the OS after the kernel, heh, but I know that’s not generally going to happen – ahh well, we can wish.
Ahh, yes, GNU/Linux. There is an awful lot more in your average system that your average person would think of as part of the whole OS, and it consists of software that mostly isn’t from GNU.
Linux is indeed ready for the desktop!
Sure, there is much to improve – but so do the ‘others’.
All these articles covering Linux and the desktop are written from an advanced user’s point of view. If you say ‘Linux on the desktop’ you have to deal with people WITHOUT the slightest clue.
And these people are lost either way:
Give them OS X, XP or Linux – it really doesn’t matter what they’re using.
I’m doing a lot of support for people (NOT geeks) and I can tell you that it really doesn’t matter if they’re using XP or Linux. If someone is afraid of the OS (which most people are), then no matter what OS they run, they’re overwhelmed by the options it has to offer.
The ‘ease of use’ in Microsoft products is the biggest pitfall they created: every user is root. They can do whatever they want, and people with or without a clue WILL do really stupid things – and nearly all of them do. The biggest advantage Microsoft has is that people think it’s the standard. How else can you explain that most people buy a computer bundled with a so-called ‘recovery CD’? They play around for months, hose their system, put in the CD, and here you go: every bit of the last months is gone. The best part: people DON’T care. No other vendor could get away with things like this. People blindly and stupidly accept it – not because Windows is easier to handle, but because they think it’s NORMAL. They don’t know of the alternatives and therefore don’t know about the advantages of other OSes.
When I had worked a few months with Ubuntu I decided that if people come to me, I install Ubuntu for them and that’s it. Some of these people start to cry ‘oh no, Linux is just for rocket scientists’ or ‘oh no, I don’t have a clue about computers’ and I say: ‘Yeah, that’s why I’m giving you Linux. If you don’t have a clue, in Linux you can’t hose your system that fast, and I don’t have to answer calls every 48 hours to fix your damn XP because it’s full of viruses and trojans.’
And: it works.
In Linux (I take Ubuntu as an example) you install a system off of ONE CD that contains most of the stuff you need to run a geek life. With XP you have…? An ancient text editor and MS Paint. Great. Then some bundled software you can install later, which overwhelms most users anyway.
As I see it, one of the biggest parts of the problem is that nowadays dedicated computer shops are gone. Well, I can only speak for Germany. But people don’t realise that a computer is indeed hi-tech, and I feel it would be rather normal to go to a dedicated shop and buy such things from them. I mean, do you buy cars at the grocery store? Mostly not, but people buy computers in grocery stores.
There is no one who asks what they want to do and tells them what they need. They only see ‘10000GHz CPU’, great for surfing the web… All these big sellers have prices most little shops can’t compete with, so no one goes to a dedicated shop anymore because they think they can save a few bucks.
Packages that want to install without the distribution installer can do it without a hitch nowadays.
But they have to do it in /opt, or else they will mess up the system and we will end up with the Windows DLL hell…
Otherwise you are right. We need an icon like the RSS one to add a repository to apt and start a lightweight window to upgrade/install what is needed (Ubuntu already made this lightweight window and Xandros did the icon/one-click thing).
There is just the need for someone to code the glue… sadly the installer is not the only thing to work on.
If anyone feels in a hacking mood, let’s code it.
Putting the pieces together would take around 10 hours if one knows apt/Ubuntu/HTML.
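The glue itself is tiny; something along these lines is most of it (run as root – the repository line and package name here are only placeholders):

#!/bin/sh
# add a repository if it isn't there yet, refresh, install one package
# (repository URL and package argument are placeholders)
REPO="deb http://repo.example.org/ubuntu hoary main"
PKG="$1"
grep -qF "$REPO" /etc/apt/sources.list || echo "$REPO" >> /etc/apt/sources.list
apt-get update
apt-get install -y "$PKG"

What’s left is the icon, the MIME type that hands a downloaded file to a script like this, and the little progress window on top of it.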
Cheers
Alban
Unfortunately if you want to sell to the “masses” this is what you have to do. The vast majority of people do not want to know how to use a computer effectively, they just want to use it, like a toaster. In order for Linux (regardless of distro) to gain widescale acceptance by Joe Sixpack, it will have to be severely “dumbed down” so that virtually anyone can use it.
I am sure there are a number of people who will disagree with this, but I look at other things besides computers and the same thing is taking place. I was a photographer for almost 25 years; hand a person a Nikon F and they would have no idea what to do with it! What, no autofocus, no auto film load, etc., etc.! Computers are no different: people do not want to read manuals and configure various options, they just want it to work with no thought on their part. Apple had a TV ad that was somewhat effective in selling their products by showing someone with a PC installing a CD-ROM drive. The “installation” took over 12 hours to complete while the Mac install took just seconds.
I see Linux fighting for acceptance on three fronts: the server room, the corporate desktop, and the home desktop. The server room is not much of an issue because in most cases you have experienced personnel and the transition (based on a *nix environment) is minimal. The corporate and personal desktop is another matter entirely. People “trained” on Windows will expect Linux to work exactly the same as Windows. Unless companies adopting Linux are willing to spend some serious money on training their people to use new software, the transition is going to fail. And most people use at home what they use at work. And you will always have people who will reject anything new and different. The next few years will be interesting either way.
“Actually, the OS X way is equally broken. Statically linked binaries are just backward. They are so 70s.”
For the Mac, they make a lot of sense, because the libraries are an integral part of the standard operating system. An OSX program can link to particular libraries, and just know that they will be there.
For a diversely distributed system like Linux, one that’s so friendly to customization, dynamic linking is a bit riskier, because libraries can move around. It’s safer for applications to come with their own basic libraries – even if it creates redundancy, there’s plenty of storage nowadays to fit them. That wasn’t true in the ’70s.
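You can see how exposed a dynamically linked program is on any given box just by asking the loader; for example (the binary and directory names here are only illustrative):

$ ldd ./someapp                         # lists every shared library the binary expects and where the
                                        # loader finds it; anything reported "not found" has moved or is missing
$ LD_LIBRARY_PATH=./lib ldd ./someapp   # the same check, but preferring copies bundled in ./lib

Bundling the basic libraries and pointing the loader at them is exactly the kind of redundancy that cheap storage now makes painless.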
I have to agree. Dedicated shops that aren’t thieves would be nice. But that’s why people are afraid of them, at least in the US: they started out and have continued to be thieves. I know because I worked for one. It survived on return customers and people who just didn’t know better; but mostly I think it was return customers.
Now, compared to BB and CompUSA (our big box stores here), it had the best service. The service was cheaper, and it was faster. But it was still a ripoff… The prices on parts were just outrageous, so outrageous that the highest sales were through the service department! People came back because it was moderately fair. But you saw a slow stream, which adds up over time, move to the big box stores because the computer cost less off the shelf and you could buy a “better” warranty. Now I can tell you one thing about BB’s warranties: their service department is filled with one of these two:
1.) Morons who don’t know that you have to plug the hard disk back into the machine for it to boot.
2.) Assholes who think that leaving disk drives unplugged is a good way to get people who thought they were getting ripped off at the service counter to come back.
Anyway, at least in America, small shops that see you through (i.e., support their products with local service and support and then sell you a new one) died when they started stealing from their customers’ dependence. Customers left, and now you see almost solely big box stores.
Personally, I would like to see every computer user carefully read documentation about things. But there are two problems with this:
1.) There isn’t any that comes with the product. Digital documentation is wasteful; if you don’t print it, almost no one will read it on a boxed product. Anybody read their game’s PDF manual? Now, would you read it before you had a problem or question?
2.) People can’t read. People are quite literate in theory: they can read, and they can do it well (typically). But in practice they are often quite illiterate because they never sit down and read. I’m bad about this; getting me to read something on a piece of paper is hard: I’ll almost invariably skim it and pitch it. But most people won’t even skim a manual – they’ll assume they can just wing it!
This all has little to do with Linux. It has to do with learning new things. There are two ways to learn something new (that has been discovered by another):
1.) Read the work of the person who discovered it, and learn it in time X.
2.) Learn it by deriving it yourself with no outside help, and spend time X^2 (X**2, pow(X,2.0)). It takes a lifetime to discover something, and a single lecture to understand what we need to learn from the discovery. Anyone who has been through a math class knows this: calculus didn’t happen overnight, but you learn it in 1–2 years (at least the basics).
So, the conclusion: Tell people this when they have any computer problem: RTFM
And if they say there is no manual, tell them to get a new product that has a manual.
I agree with the author’s request for installers, but I also see the definite need for package managers. I would like to see software that will recognize an apt package when clicked and run an “apt GUI” complete with check boxes and “Next” buttons and progress bars, etc. This would satisfy the package management need, negate the need to repackage software, and not insult the intelligence of the power Linux user by forcing them to use a GUI app, but it would also make Windows users feel warm and fuzzy. This app would not be tied to a specific distro, like Click and Run, or to a window manager, and could be installed, like Synaptic, on any distro you can get apt packages for.
With a repository users don’t have to do much to install software, and that works great for fairly static things like drivers. […] However, repositories don’t give you the choice of software to install and they sure as hell don’t make anything straightforward that ISVs can adequately support.
Yes, you’re saying that e.g. Debian with around 16,000 packages doesn’t give enough freedom of choice. Yes, you must be right.
To Ubuntu and all-other-distros-based-on-Debian all I have to say is that without Debian’s great package pile you’re nothing but Yet-Another-Base-System-Installer. Thank you.
You really hit the nail on the head concerning lock-in and it being a nightmare for ISVs when all software is in a distro’s repository. And I agree with you that repositories are a good thing for drivers and base-system packages, but fall short for other software.
It’s not a big deal for me to tar -xvzf *.tar.gz and ./configure, make install, but the Joe Average consumer won’t be doing that. Not to mention that it tends to make system management less than great, because the package database doesn’t know about these foreign packages unless you build a package from the source code you built.
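One halfway measure, assuming the checkinstall tool is available on your distro, is to let it stand in for the final “make install” so the package database at least hears about what you built:

$ tar -xvzf someapp-1.0.tar.gz && cd someapp-1.0    # example tarball name
$ ./configure && make
$ checkinstall    # as root: builds and installs a .deb, .rpm or .tgz instead of doing a raw "make install"

It doesn’t solve dependencies or upgrades, but at least removal then goes through the normal package tools.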
I guess the key is for something like autopackage to play well with rpm and deb systems. Ideally, there would be one package system that was smart enough to work across distros. Think how Mandrake and Redhat don’t play well together, and now Ubuntu and Debian.
Of course, as I stated before, there are the little distro fiefdoms that want to protect their turf.
I’m sorry, but as a Linux novice (though a computer expert) I want to give the view of the common user.
Yes, OEM deals and not having a choice are a big reason for Windows dominance on the desktop. But that’s avoiding the real problem with many open source projects.
To keep it short, it’s the “project for developers, by developers” mentality. I’m on numerous forums and it’s surprising how often you see this. I fully understand that many of these projects have limited resources and most of the developers work for free. But be that as it may, if you do not make an effort to listen to your users then you will eventually be out of touch.
I have two Linux boxes at home, but they are relegated to a server and a “test/play around with” box. Why? Because despite the advances, Linux STILL doesn’t do everything I need it to do in an EASY way. I’ll give you a real world example. When my grandparents’ old PC kept getting infected, instead of upgrading the Win98 I installed Linux. I wanted something really basic and simple to just do what they needed (browsing, e-mail, IM and phone calls). I started with Cobind, but even though that was a nice looking distro it was too limiting and it kept crashing/hanging. I then found BeatrIX and talk about a great distro!
Everything went really well for months until they wanted to do something else (doesn’t really matter what for this story, but it was PVR functions). Searching around for Linux alternatives, I quickly ran into a whole bunch. After narrowing them down to Freevo and MythTV I went with MythTV because it seemed the most developed. After two-plus weeks of messing around (even installing various distros that had guides, including Fedora and KnoppMyth) I gave up. The cheap PVR card was a no-go; even the Hauppauge PVR-150 didn’t work!
So I went out and bought an OEM version of WinXP Home and installed various security programs (ZoneAlarm, Avast, various anti-spam and anti-spyware, etc.). While I had a glitch or two with the PVR side of life, I was able to get both Media Portal and GB-PVR up and running fully functional in a day (even with the el cheapo PVR card). Now they have a dual-tuner-capable PC. Yes, it’s still MS and it still might not be as stable or as secure, but it’s been running for over 3 months now with minimal issues.
Bottom line: the typical user will put up with some issues and reboots and such, if they can get things that WORK and do so fairly easily.
Listen, this is not new.
There have been packages for years with installers that are essentially distro-independent. Mozilla had them. Firefox has them and Open Office has them.
Loki games had them and Linux Game Publishing I believe still uses it.
Commercial closed apps like RealPlayer and Acrobat have them.
So what?
I have yet to understand why autopackage as an example is being met with such hostility or why the worm has turned and installers are being seen as a bad thing.
For years, I have been forced to reconcile myself to the fact that many small projects, and even larger ones like Gnumeric, will NOT be bothered to include binary packages because of the hassle involved in doing so for so many different flavors of Linux. I have waited on my distro to provide them, or for outside deb or yum sources to appear.
I think it’s nice that projects like Inkscape, AbiWord and Gaim have begun offering autopackages.
There are two things for which Linux has become too dependent on individual distros:
1) A packaging system
2) System configuration Tools
These things are too important for the future of the OS especially on the desktop to leave in the hands of dueling distros.
I like the fact that Ubuntu, despite the hostility toward autopackage, turns to individual projects like gnome-system-tools for its system configuration needs.
I am beginning, just in the last few months, to relish the day I hope will come when I can install a one-CD distro like Ubuntu and then just download the other packages as I need them in autopackage format.
I hope one day that projects like NetworkManager, gnome-cups-manager and gnome-system-tools become robust enough that I don’t miss the distro-specific tools on my Fedora Core 3 box at all. KDE fans, insert your own projects where necessary.
Maybe that is just a pipe dream but I hope one day that time will come.
No more installers, dammit. I hate those things. They are the reason I get to have fun each day trying to remove spyware/adware crap installed on Windows systems by lowbrow computer users. Really, using a repository is not rocket science.
The only user who should have rights to install anything is the root user. If a person doesn’t know how to use the distro’s repository, or how to build using make tools (or similar, like SCons), they should not be the root user. End of story.
———————-
$> emerge windows
> error: kernel panic
Linux is a reflection of reality. The world is a complex and confusing place to live in. Nobody can change the way the world works. Nobody can exert undue influence on how Linux works. I think people are overwhelmed by this phenomenon.
Whining about the choice Linux affords you is like whining about the diversity of fish in the ocean. Do you propose we kill all but one species of the fish in the ocean, so that we can have only one to study?
Yes.
Because, what you failed to grasp with your little retarded analogy is that an Operating System is not …the world, so it wouldn’t have to be like it.
We DON’T want our software to be “complex and confusing” as to reflect the …complex and confusing nature of the world.
In fact, with politics and with our actions and inventions, we try to make the world a LESS “complex and confusing” place to live.
Technology is key to this. It is invented to SIMPLIFY not to add to the “complexity and confusion”.
Now, maybe the way Linux [1] is built, is like the way the world is run, so it too ends up confusing and complex.
But that is just the way it is built. We can choose to use other operating systems, that use nicer building methodologies. Say, OS X. Or even another free os.
[1] Some idiots will correct me: “Linux is just the name of the kernel”. Well, guess what: USE defines meaning. The “Linux” tag may have been *invented* to name a kernel. But people use it to name every OS built with that kernel. It’s how we use words, not where they come from that defines their meaning. So “Linux is just a kernel” retards, you are in the minority. Get over it.
(In fact, you are even mistaken about the facts. The kernel’s name is “VMLINUX” not Linux).
No more installers, dammit. I hate those things. They are the reason I get to have fun each day trying to remove spyware/adware crap installed on Windows systems by lowbrow computer users. Really, using a repository is not rocket science.
The only user who should have rights to install anything is the root user. If a person doesn’t know how to use the distro’s repository, or how to build using make tools (or similar, like SCons), they should not be the root user. End of story.
Most computers are owned by private individuals.
What you say only makes sense (and little at that) for the computers of a corporation or organization with an administrator et al.
Idiot.
“Well poor you. If it worked in 2000 and doesn’t work in XP I find that very difficult to believe to be honest, but you can’t have absolutely everything.”
Well, it’s true. And it does matter, because I paid for that. And for all I knew, software for earlier Windows versions should keep working on XP. It just annoys the hell out of me when someone makes a valid claim and other people downplay it by saying “oh, that’s not a problem for anybody, really…”.
Well, this is a problem for me, and it’s annoying.
Nice article, has some interesting points. Not sure why you would want Linux to take over the market, after all, most linux users whine and scream about Microsoft trying to take over the market. Anyway, here’s what I think Linux needs to gain a better share:
1. All distros should follow a standard like the LSB. Different distros put files in different places, which makes applications incompatible between distros. More than likely a pain in the neck for commercial vendors.
2. Better naming of applications – this is not necessarily Linux’s fault. But having things that begin with a G or K in every application is not professional. Same goes for OOo with Writer, Presentation and all that. The names are too simple, thus making them sound like simple apps. Complicate it just a wee bit.
3. This one may be just me, but having a speaker icon that actually controls the sound when using only ALSA drivers. Since OSS is obsolete, I don’t use it.
4. Two-pane file browsing – one pane is the tree, the other is the files. I believe Konqueror has this; not sure if Nautilus does or not, but having a single window and then trying to navigate can be frustrating.
5. Commercial applications, not just from Linux distributors, but from third party application makers as well. Although this seems to be the chicken and egg situation we won’t go into.
What will all you Linux zealots say when big billion-dollar companies control Linux like they are starting to do now? What will you say about Linux then? I think most of you are a bunch of software hippies smoking some bad geek dope. If a system does what you need, use it. Be it Linux, Windows, OS X, etc. It’s a damn OS, not a new heart or lung.
Linux is great, there is no doubt, but it is just not as easy to get around to what you want to do and configure right away. Windows is very intuitive for a lot of people. M$ has gotten that right. Like installers and uninstallers… there’s just a few clicks and it does everything for you. Sure, cruft is left behind, but I don’t know if that is a problem with Windows or with the installing application itself, because I have had many installers that clean out the darn registry so you will find nothing of the program associated with the OS anymore.
XP has games, Linux does not.
XP looks better to a lot of people and Linux does not. I am sorry, but GNOME is for nerds. At work I use GNOME and some kind of window manager, Sawfish or something… absolutely appalling. KDE is decent; I like it.
Let’s see: setting up XP from scratch is painless even for a noob. Not Linux… edit this config file, download that library, apt-get this, apt-get that… too much work. If you want a mainstream user whose knowledge about computers is that it is a cool box, runs on electricity, is smart and does what they tell it to, M$ pwns Linux in that regard.
Command line, command line, command line… GET RID OF IT. Everyone hates it except people who are comfortable with it – and even they spent hours and hours trying to figure stuff out – and nerds who love it because it is quite powerful when you get used to it.
You have to realize most people are not smart and they don’t want to spend more than 5 minutes on their computer other than to check their mail, do some internet shopping and so on. Not worry about performance or anything. In that case, I think XP has done a great job of catering to everyone from the hardcore user to the simpleton who just thinks that booting a computer means kicking it.
Yes, XP is quite configurable. There are many tools available for people to use to tweak it, and they are all graphical and point-and-click. No using the keyboard to type in arcane commands and so on. Pipe this, pipe that, add this switch, that flag – NO ONE WANTS THAT. Even you nuts who love this command line stuff: if there was a graphical way, a very nice and intuitive way to do it – and I think OS X has achieved a lot in that regard – would you not use that way? And if you say no then you are lying.
Windows is just so much more natural, IMO, to use than Linux is. Man, I am a (recent) CS graduate and we had to use Linux for our programming assignments. Well, guess what: nobody except those skinny dudes who look like they just slept in a trash can, who don’t shower, have no social skills, play Tux Racer and Tetris, and sit in front of a computer all day programming loved Linux. And boy, in the CS department you find a lot of those kinds of people. Everyone I know used Windows, did their programming assignments, SSH’d into the school computers, tested it there and boom! Done… less hassle… why? Because the Linux boxes were a pain in the ass to figure out and they were slow. Well, most of the people I knew had slow XP installs because they had viruses, malware and so on, and they didn’t know what nLite was.
Look, Linux is good for mission-critical apps, rock-solid stability, robust security, but when it comes to using all its features, and tweaking it and compiling it from the kernel code to extract every bit of performance, all it comes down to is: oh look, my XP right out of the box is almost as good. You cannot really say that it is easier to program in Linux than in XP. Oh, you bring up the command line and commands like grep and piping and so on… well, take a look at the find and findstr commands for XP. Just do a Google search; XP’s command line commands, while not as numerous as Linux’s, still get the job done.
Even if you start bundling it, Linux won’t win. It has to do, as someone pointed out, with psychological reasons. It has to do with the long history MS has had with common folk since the days of Win 95. It has to do with games, because yes, don’t lie, you Linux geeks (converted from Windows) miss gaming, and don’t tell me Tux-based games are better than HL2, Far Cry, FIA GTR and so on.
Look, I could ramble for a really long time, but yes, I know Linux is an excellent OS, I know OS X is an excellent OS – even better compared to Linux, I would say. I would buy Tiger and a phat dual 2.5 GHz system at the drop of a hat if I were not saving for a dSLR system, and also if I didn’t keep hearing of the impending release of the multicore 970 PPC processor. *drools* But the fact of the matter is, Linux is nowhere near mainstream desktop ready. Quit kidding yourselves, people. Sure, M$ is unethical, the devil’s spawn and so on, but you have to admit they keep getting better and better, and people will keep using it unless someone can go back in time and get rid of Gates. My 2 cents. Flame away, Linux fanboys!!
Congratulations. Your considerations and points of view are simply brilliant. The fact is, Linux “must” learn, at any cost, that the number one feature needed to win over users is ease of operation and friendliness. Users hate the command line and shouldn’t need to understand it; the system should do it for them. I have been trying a friendly Linux, the Brazilian edition named KURUMIN, which is easy to operate. I have worked in computing since 1986, and this is the easiest I have known. Even so, it depends on the damned command line, which definitively makes things more difficult. And with so many distros, as you say, which application will run on my system? There are many flavours. This is complicated, since every one claims to have the best distro. Things are not centralized.
In order to get respect and be demystified, LINUX and its troupe MUST LEARN how to be friendly toward MICROSOFT users.
This may sound like heresy, but this is the point.
Congratulations.