“For the last 12 months, I have used Ubuntu 8.04, 8.10, and 9.04 as my primary OSes. I remain a very happy Linux convert, but I worry that Ubuntu is being unevenly developed. Certain areas have seen great improvements over the last 12 months, while other areas have languished or been largely ignored. The purpose of this article is not to whine or rant, but to bring some perspective to the evolution (or lack thereof) that Ubuntu has experienced between versions 8.04 and 9.04.”
I shall start with a disclaimer: Tanner Helland (the author of the article) is a personal friend, but I felt the article was well worth forwarding to the OSnews community regardless of its writer. His blog is small, and this fine article would otherwise be overlooked by the main news channels. I think this article makes a better comparison than I could personally offer, because I do not use Ubuntu full time.
I’ve tried every release since version 4, and each time it doesn’t last long before some bit of bad design irks me too much; I’m already on a very trouble-free system with OS X and don’t want to ‘downgrade’ my experience. That said, Ubuntu has really progressed every year. Version 9.04 was the first to support all the hardware in my MacBook Pro out of the box, a superb achievement. User friendliness has improved, and the speed has improved plenty.
Tanner has taken a much more reasoned approach than I can: he has done what he set out to do, which is to switch to Linux full time and live with the problems until he can overcome them or the community fixes those bugs. Tanner writes this article not as some ‘look-at-me’ jab at the state of Ubuntu, but out of a passion for an operating system that he wants to see go places and that desperately needs attention in some areas if it is to make headway with regular users. This is a man who speaks from the experience of anybody who has had to endure a long and arduous operating system switch, but also that of converting friends and family to Linux and seeing how it stacks up ‘in real life’.
Tanner covers the subjects of:
- Hardware support
- Appearance
- New features / innovation
- Default software selection
- Performance
- Usability
- Community and support
He ranks them school-style from A to F based on how these aspects have improved since Ubuntu 8.04.
The appearance factor, it could be argued, is unfair as all that bling is not what a good number of users want, and Ubuntu is better for not having it.
I personally don’t think that fits within the scope of the article, which has the underlying theme of how Ubuntu stacks up against the other OSes when it comes to regular, non-geek users. Ubuntu is frankly bland and lacks oomph. Users don’t understand what goes on under the hood, so they will associate looks with quality. Nothing in Ubuntu stands out as something pleasant to use every day. It’s too much like a dreary corporate desktop.
Yes, but, you ask—doesn’t Compiz add all that eye-candy and bling? Well, not really. A static screenshot can’t show Compiz off, but in Vista and 7 you have the glass reflections on the windows and the same on the OS X dock.
Ubuntu doesn’t need to sell itself to geeks, it’s already sold. It needs to sell itself to regular users, and yes, bling is annoying to some and it’s tacky and it doesn’t solve real problems, but I have to agree with Tanner when he says:
from 2005 to 2009, almost no major progress has been made on the default theme and initial desktop presentation. I find this less than acceptable because one of the biggest hurdles to Ubuntu adoption, in my experience, is how unimpressive it looks at first glance.
Anyway, you the community—Linux users who all know better than I do—sound off in the comments with your own grades: how do you feel Ubuntu has improved for you over the years, and where does it still need work?
As usual, I find it somewhat annoying that he[1] talks about Ubuntu when most (if not all) of his points are related to Linux generally.
Credit where it belongs.
[1] The original author and not the editor.
To any regular user, it’s _a computer_. The OS is the computer. Yes, it’s Linux under the hood, but Ubuntu is not doing anything illegal by taking free code, packaging it, and calling it Ubuntu. The developers of everything that Canonical uses to make Ubuntu all agreed to that when they accepted the GPL—live with it.
Who said anything about legality?
I think crediting those who do the work is one fundamental aspect of free/open source development. Newcomers and the general public should grasp this too, to say nothing of the editors of OSnews.
I grasp the idea very clearly; I write software myself under free licences.
My angle is that arguing about attribution totally and utterly misses the point: it has absolutely zero relevance to the end user and how they perceive the Ubuntu desktop. They see a 2D screenshot, not a multi-layered, diverse ecosystem.
I agree, GNOME is just too plain boring and depressing.
Ubuntu should go with KDE 4.
Ubuntu with KDE is called Kubuntu….
And it’s not the best KDE distro…
He is talking about how Ubuntu should switch to KDE as its main desktop. GNOME should be dropped entirely; it is too stuck in the ’90s.
Exactly.
Just a simple example.
I have an HP laptop. I had no problems with Ubuntu 8.04; everything worked out of the box.
With 9.04, my ATI X1350 Mobility is no longer supported, and I’m unable to put my laptop into suspend or hibernation. Oh, and on shutdown I just get a blank red screen instead of the Ubuntu splash screen.
I’ve searched a lot and tried everything; nothing helps. So I don’t use Ubuntu anymore, because my laptop is useless with problems like those.
Well, everything works perfectly with Windows 7, so I’ll stick with it.
I made a mistake: it’s not Ubuntu’s fault with my ATI card, but X.Org 1.6. Well, anyway, Ubuntu 9.04 supports only X.Org 1.6.0.
Credit is *not* fundamental to open source, unless you have an original definition for either “fundamental” or “open source” (or unless you are just being rhetorical to show your (understandable) gratitude to the many great open source contributors, in which case anything goes; I love poetry myself).
Hardware support – Linux*
Appearance – nothing to do with Linux, but with GNOME and Ubuntu (they’re the ones setting the default look).
New features – Mainly Ubuntu*
Default Software Selection – Ubuntu
Performance – Everybody.
Usability – Ubuntu
*) They must still make sure it integrates nicely
Weak points in [XKU]buntu:
* Random hardware support. No, not random hardware, but random support. Regressions with each new version are brutal. Even regular updates break millions of home computers, and very likely some servers as well. This is not an issue of lacking support for some mangled proprietary and undocumented device: working devices with vendor-provided OSS drivers stop working altogether. Linux and X.Org themselves get most of the negative credit for that, but network and window managers get special credit for breaking things they shouldn’t be able to touch in the first place.
* Theming. You can argue that looks are not as important as functionality, and you would be right. But being forced to look at some of the themes in XKubuntu, or at GNOME at all, is akin to torture. I use Xubuntu, which lets me have decent looks within the default packages, but I have to retouch theme and font configurations with each upgrade. The blame this time rests entirely with Ubuntu, because the components are already there; they just need to make them the default.
Its main strong points:
* Speed. Linux and the usual slow-application suspects get the credit for fixing their speed bugs.
* Integration. Compared to other Linux distributions, Ubuntu is the one that provides the best average integration and tools. If others claim that Ubuntu is just borrowing from them, then they should be able to provide a better experience, from kernel to desktop wallpaper through package manager; they must beat Ubuntu, and then come back to me.
“Even regular updates break millions of home computers and very likely some servers as well.”
May I suggest looking at Debian proper for your server needs. The Canonical fork may be fine for entry-level desktops, but I wouldn’t trust server uptime to it given the alternatives.
Additionally, I find it somewhat sad that he[1] sees the theme of a desktop as an important area of operating system development.
But then again, I represent a minority here, where wallpapers are starting to become a new metric for evaluating operating systems. Over and out, and sarcasm off.
[1] The editor and not the original author.
It’s not only about the wallpaper, it’s about how the entire thing looks and feels, it’s about the entire DE.
For some people GNOME might be better and for some it might be KDE etc.
Well, I don’t think even GNOME 3 will change anything. The gnome-shell is a rather convoluted concept that makes very little sense to me and does very little to improve the actual look of GNOME.
Ever since I moved to OS X, Ubuntu and Linux in general just aren’t that interesting anymore; considering Ubuntu used to be my primary desktop, that’s big. Every year we get promises of great features that will improve the desktop experience, and every year only about a quarter of those features make it, with a note that the rest will land in Ubuntu+1. I just don’t understand why, in a Linux community with such diverse programmers and users, we haven’t come across a real desktop-oriented distro: one that drops bloated, slow-moving projects like X and develops its own window server like Apple did. The papers and technology are out there; it can be done. How long was Xegl supposed to come out and change the X Window landscape? Instead we went with AIGLX.
Instead we have distros regurgitating each other’s work, with maybe a different theme or two and some differences in configuration. It’s really frustrating to see Linux practically standing still due to its adherence to an aging architecture that was never meant for a desktop configuration in the first place, and to see all of the workarounds they have to implement to get it to behave and look like other OSes. Both Windows and OS X have state-of-the-art windowing and display managers aimed primarily at the desktop, where these things actually matter. Why can’t Linux do the same? Why can’t we get rid of X, or do like OS X and run it in a compatibility layer if need be, and focus on a real solution for a desktop windowing and display server?
I might argue that calling it an “aging architecture” is a bit odd to say; in a lot of places the technology inside has improved almost as much as that of companies shipping competing products, which cost you money.
I will compare OS X with GNOME (2, 3). OS X 10.0 through 10.6 uses a mix of two APIs, Carbon and Cocoa: the first is plain C, the second Objective-C. In GNOME you have a C core which is arguably object-oriented (though not garbage-collected), with a lot of other languages bound on top; JavaScript and C# (from Mono) are some of them.
OS X’s window manager is great, though for a long time it could not run OpenOffice, because porting the codebase was awkward. As of today the problem is mostly fixed, and Compiz exposes similar capabilities, even if it uses a layer built on different technology that you may find odd.
The Quartz 2D API is equivalent to Cairo (the first is PDF-based, the second PostScript-based). Core Animation is equivalent to Clutter (which will be part of GNOME 3). Core Video and QuickTime are equivalent to GStreamer.
Want an antialiased desktop? It’s there, since GNOME 2.16 if I remember correctly. Want middleware support for events like power-off? It’s there (D-Bus). Want ZeroConf/Avahi/Bonjour? It’s there. Want an iPod? It will mostly work.
The big picture on revolution versus evolution is that right now Linux is fairly mature, its target is still to replace expensive Unixes and low-end Windows machines, and looks are not the most important thing there. It is also hugely hard to replace a codebase that takes 8-9 hours to compile (GNOME on Gentoo) and to hack together a new way of thinking about things.
Pixel-perfect graphics are certainly part of OS X, and OS X in particular is one of the desktops driving GNOME’s look and feel forward. But GNOME on its own, eventually with 3.0, will bring the masses a cleaner infrastructure, and hopefully GNOME 3.2 or 3.4 will be in a usable state and deliver all the beauty we “desperately” need.
When will Gnome 3 be ready? Will it look better than KDE4? I highly doubt it.
By the time it is in a usable state, Chrome OS will be out and the netbook manufacturers will have zero reason to use Ubuntu or any GNOME-based distro.
It’s not like the guys that tweak the color theme for distros are the same guys that could hack together a window server that would support existing software.
And, themes are important. The most revolutionary thing about WinXP was the theme and default background. Apart from that, it was just a slightly faster Win2k.
http://icps.u-strasbg.fr/~marchesin/nvdri/fosdem1.pdf
Because it would be a stupid and expensive move at this time. That’s not even near to where the problems currently are.
The actual problems now are:
– Lazy OEMs not verifying their devices with Linux distros, or not contributing the required modifications/drivers.
– Duplicated effort between KDE & GNOME (though we can live with that), and the general failure of the “freedesktop” project…
– Bad video drivers
– Bugs
Overall, all of this will be solved in due time. Time is the enemy of Microsoft and a friend of FOSS (and, to a lesser extent, Apple).
I’d agree. There is absolutely nothing in that article that would influence either my choice of a desktop operating system or my recommendation for a less advanced user.
I would have left the article on the obscure blog, rather than posting it to OSnews.
I migrated my mother’s computer to Ubuntu in order to avoid the maintenance which inevitably came with Windows. At the time a Mac was out of the question.
It’s been just over a year now, with one major upgrade. She has mostly had no problems with it.
Her complaints, by which I mean the things I had to provide ‘training’ on, were:
How to burn music CDs
How to download music (went for Amazon MP3 in the end, although its downloader does not work on 64-bit)
Saving OpenOffice docs as .doc
Setting up a printer
Email (quite difficult to find a simple and clear email client that gives system notifications without LOTS of tinkering)
She had no problems with:
The looks (she really couldn’t care less)
Browsing (she has used Firefox/Safari for years)
IM (Skype + MSN)
Keeping the system up to date
It was a new system and quite a learning curve for her, but she managed it. My main complaints are in two areas:
1. The default mail client is simply too complex for a casual user. It has no status bar notification and has to be kept open at all times to receive email. Daft, really.
2. The audio players are not good enough. They can’t hold a candle to WMP or iTunes. I think the best way to describe them is ‘designed by the programmers’.
Many years ago, I replaced my parents’ PC with a Gnome driven Linux OS in order to make life easier for them. It worked, and without much ‘training’. My observations are quite similar to yours, but let me add a few things:
These sound like the typical “can you tell me how to” questions in “Windows” land that need a ‘trainer’ to answer them. 🙂
Exactly. No further comment necessary. 🙂 Although I simply can’t stand the default look and feel, most users seem to be fine with it, because I don’t see altered settings very often. It does not matter whether I look at Ubuntu, Mac OS X or even “Windows” – it’s a matter of split seconds to determine which OS someone is using just by glancing at the default desktop. People often complain about the “distribution’s artwork”, and I know it’s arguable. Average people simply decide by first visual impression – yes, it is that easy.
My parents insisted on keeping using Opera, but that was no real problem.
Works great automatically. If additional support is needed, I can always SSH into the system. But since I installed the box, I didn’t find any needs to do so.
It simply does not matter which system you switch from or to. Some things and concepts are always a bit different, and they require re-learning and experimenting. Computers aren’t so uniform that they are all exactly alike. 🙂
I could solve this by using fetchmail together with GNOME’s mail notification icon. And because mail storage formats are compatible, it doesn’t even matter which particular program you use to read, compose, send, and manage your messages.
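Here is a minimal sketch of the kind of ~/.fetchmailrc I mean; the server name and credentials are placeholders, and the details depend on your provider:

    # ~/.fetchmailrc -- minimal sketch; server and credentials are placeholders
    set daemon 300                       # poll every 5 minutes in the background
    poll mail.example.com protocol pop3
        username "mom" password "secret"
        mda "/usr/bin/procmail -d %T"    # hand each message to the local MDA

Once mail lands in the local mailbox, the notification icon (or any local mail reader) can watch it without a heavyweight client running all the time.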
Simple solution: I installed gmplayer with all the codecs. In contrast to “WMP” and iTunes, it is capable of playing every imaginable video and audio format.
Don’t get me wrong: iTunes is not bad, and I prefer using it on my Mac over other programs. “WMP” is very restricted in the formats it supports (especially out of the box); free and standardized formats are especially affected.
Anyway, the Ubuntu Linux distribution would not be my first choice, but it is among my favourite choices, simply because I prefer GNOME over KDE; that’s a very individual point of view. Ubuntu itself has come a long way, and it still impresses me with the evolution it embodies.
What I should have clarified: my mother’s usage of the PC is as follows:
Music
Email
Web browsing
Word docs
If I’m honest, she is also quite tech-savvy. Turns out raising children who are geeks means sometimes things sink in!!
I will investigate using fetchmail with Gnome’s mail notification icon.
I did the same with my grandfather. It didn’t work out, for various reasons. His genealogy program didn’t run in Linux. Banking sites didn’t work. There was other stuff I don’t remember. I had just replaced one set of hassles with another, putting him in front of a new OS where he didn’t know how to do things, and where in some cases he couldn’t do them at all.
For every grandparent who does only email and solitaire, there’s a grandparent who does mostly that, but occasionally wants to do something else, and in his case Linux impeded him in that.
What a shame. I assume you tried Wine and similar options to no avail.
Some people are more ‘open’ to this kind of change than others. That’s just the way the world is.
Do you mind me asking when you tried this experiment? (With Linux more than any other OS, this can make all the difference!)
I switched my cat to Linux:
http://www.jfplayhouse.com/2009/08/how-i-switched-my-cat-to-linux.h…
That’s the typical case, in my experience. A regular user does not care much what it looks like, as long as it is not ugly and they can change the wallpaper. The only people who whine on and on about the looks are “power users” (i.e. people who don’t really know quite as much as they’d like you to think they do).
Ugh, yes. Audio players suck, but then again so do WMP and iTunes.
The article author might have more joy if he switched to Fedora instead of Ubuntu. It seems that he knows a fair amount about Linux (having stated that he had to manually configure hardware in earlier Ubuntu releases) and I think Fedora is the best desktop distro out there for knowledgeable Linux users. Many of his moans about Ubuntu are “sorted” with Fedora…
Ubuntu, Fedora, Mandriva: in the end, those are all flavors of Linux, some better, some worse. But that does not solve some of the major problems around the “concept” of Linux.
– X is old news, it would need a complete rewrite.
– Bring some “Bling” to the desktop, without crashing.
– A better Gnome or KDE, or a better integration or whatever would be required to make those simply work.
– Better support for all codecs, with hardware acceleration please.
– A better media player, oh and it would be nice to have support for common hardware, Zune, iPod, …
I know many problems are not “related” to Linux, like the missing support for the iPod or Zune (or whatever) without using hacks that barely work, or proprietary software (codecs), … But it’s all those missing things that make the use of desktop Linux a little less enticing.
And let’s not forget the missing software MADE for Linux (WITHOUT Wine). Don’t tell me that The Gimp is the same as Photoshop, please!!
I do completely understand and mostly do even agree. Let me explain:
In principle, X is a proven strategy that offers a lot. Sadly, often a lot of overhead and dependencies that are not needed in certain settings. For example, a standalone PC hardly needs X over SSH for remote desktop usage, but on the other hand, Linux is a multi-purpose OS, and X is part of this concept. So it has to fulfill many functions, even if they are not needed in settings as mentioned before.
There are things that are much older than X, and still have their right to exist, because they keep things running. Just think about TCP/IP. In some regards, (A)X.25 is “much better”. Change? Rewrite? What about IPv4 vs. IPv6? Abandon the old fashioned v4 completely and use v6 only? You can’t do that. Legacy is very important in most settings; furthermore, concepts that just work don’t get replaced very fast.
The problem is: Where to start? X? A particular part or module of X? A part of a window manager or desktop environment? A separate program? What about other OSes that use X?
I know that the visual first impression (“first sight effect”) is very important for human decisions, and when presented a “boring” DE, people want something more “entertaining”, no matter if the “boring” one has better functionalities, faster usage speed and lower resource requirements than the “entertaining” one.
You need to understand that there are several stages of operations within a system. It has to do with privileges, security concerns and OS concepts. It’s hard to get some “chaining” between the stages, some interaction. DBUS and HAL are subsystems that do so.
But when we are talking about hardware, we should look at the manufacturers first. It’s easy to blame developers who reverse-engineer a product so you can use it with a free operating system (instead of buying something proprietary) when something does not work. The situation would be much different if hardware manufacturers conformed to existing standards and implemented them (instead of producing proprietary crap), or at least released specifications and documentation that would allow developers to make the product usable outside of the “proprietary box”.
This has, as far as I know, much to do with legal restrictions. Of course, it would be great just to install mplayer with all codecs from one binary package. But sadly, the sheer number of different codecs and the restrictions on many of them make it difficult. If users stuck to open formats, which could completely replace the proprietary ones, the problem you mentioned wouldn’t exist.
As far as my individual experiences go, mplayer (gmplayer, kmplayer, and mencoder) is the most fantastic media player. With proper configuration and some work (which sadly often implies compiling), it does play everything that’s imaginable.
I can’t disagree with this statement. But keep in mind that the situation is improving. Sadly, it’s often a case of “working against the hardware”: new devices are sold, people want to use them, but the new devices are not compatible. And as soon as they work flawlessly, there are newer devices, and the story goes on.
I’m surprised you didn’t go into the gaming sector here. But you’re right, of course. There’s very specialized software that does not run on standard (Linux) platforms or simply is not available there. Then again, think about the fact that such software is used on VMS, and on AS/400 and S/390 successor mainframes; some of this software is really old (more than 20 years), and nobody cares that you can’t run the stock exchange’s system, the hospital’s record-keeping system or the subway’s control system on an ordinary x86 home PC.
No. Gimpshop is the same as “Photoshop”. 🙂
Honestly, compare the amount of people you know who use “Photoshop” to the amount of people you know who really bought “Photoshop”. 🙂
In my opinion, “Photoshop” in particular is the most-used example of where “Linux is lacking”. But there are some really great alternatives, and I’m not talking about The Gimp in the first place: Inkscape, Krita and OpenOffice have a lot to offer, and so do Gimpshop and (finally) The Gimp, too. But I should emphasize that I’m not a professional graphics designer who is mentally tied to “Photoshop”, so my experiences don’t matter very much.
Well, maybe someone should think about making Linux (and the various distros) “legal” in some form, God forbid, even if we had to pay a “little” something for it. The reverse engineering used to make things work right now is flaky.
Maybe, just maybe, if Linux was made “Legal”, we would enjoy much better support from hardware makers and big software devs.
Right now, I can’t understand how a Linux user could seriously say that he can do EVERYTHING, with EASE, VS another user running Windows.
Think of Apple and OS X. If Apple could do it (have an OS with a nice GUI, bling, software, and all codecs working legally), I guess Linux could do it too. Windows is not the only OS that can do it all.
I wouldn’t have any problem buying an operating system that does what I need. In the past, IRIX or Solaris were such OSes. There are Linuxes that you can get for free and pay for additional support. And there are even Linuxes that are sold in shops, on the shelf, in a box, with a DVD inside.
See, if I know what I need, I spend money on a specific piece of hardware, e.g. an office-class laser printer with duplex and PostScript compatibility, as well as a built-in NIC. Through the price I can be sure that it is compatible with my IT infrastructure. If I bought a cheap and crappy inkpee printer, I know I would end up in trouble because the device isn’t compatible.
Yes, sadly. It’s important to understand that this applies to hardware that does not conform to existing standards, or, in other words, which is incompatible. You don’t expect a Micro Channel GPU to work in your PCI-based mainboard, do you? 🙂
Hardware manufacturers and software developers will have to understand that Linux is a solid OS with many advantages, and if you learn (oww… learn…) how to use them, you can make money from it. (And as we all know, money is what counts in the end.)
The more end-user systems migrate to Linux at workplaces, the more this trend will materialize in homes, as was the case with “Windows”; users “want the same pictures at home” as they know from work.
To be honest, I can’t relate to that claim with regard to “Windows”, but as you will agree, such a judgment depends highly on individual experiences. I haven’t used any “Windows”, and I’ve been doing EVERYTHING with EASE for many years now. The programs, interfaces, concepts and documentation I depend on simply aren’t available on “Windows” platforms, so I’m using something that fits my needs. And I’m quite average, especially when it comes to home desktop use. For example, I’ve been using FreeBSD on my home desktop for many years now, without any (!) problems. What am I doing wrong? 🙂
For many years now, Linux has been capable of serving very well as an “average user’s preferred system”, but hardware manufacturers keep coming up each day with crappy devices that advertising forces people to buy (“You need it!”), and software manufacturers keep thinking within the restricted paradigms they know (“We’ve always done it that way.”). Bringing Linux to the masses, like Ubuntu does, is a move in the right direction.
I can get everything done with ease in Linux (except gaming, of course) but not in Windows. Quite frankly, every time I use Windows (Vista, XP) my blood pressure rises to dangerous levels from the constant battle you have to wage with the system to get things working.
Which is why the Linux design strategy of making a platform that is hostile to binary drivers needs to be rethought. It’s almost 2010 and Linux has 1% of the market. Time to rethink the benefits of demanding that hardware companies open their drivers when they have little incentive to do so.
The basic concept is not the problem. As in every OS environment, there are some rules to follow:
a) release specifications so others can develop drivers
b) release drivers’ source code
c) release drivers in binary form for every Linux distro (according to packaging system)
You see the problem that comes up: different Linuxes use different packaging mechanisms for adding software in binary form to the system. When we then look at other free UNIX operating systems such as the BSDs, the problem grows.
Hardware manufacturers follow the rules of the market. They produce what customers demand. If they want crap, they get crap; it is that simple. If customers wanted, for example, an inkpee printer with scanner that attaches via USB and works with Linux, and if they were even willing to spend more money on the Linux printer than on the “Windows” version (which is only supported for one specific “Windows” version and will be thrown away as soon as a new “Windows” is released, because there is no longer a working driver), manufacturers’ attitude would change. The problem is given by this implication: first, the customers would have to want something different. But because advertising and education (haha) have trained them to do otherwise, it surely won’t happen very fast.
As opposed to usage share, the oh-so-joyful market share is very important to hardware manufacturers, as you correctly pointed out. It’s hard for well-trained “decision makers” to realize that the popularity of a specific operating system doesn’t necessarily have to manifest in Dow Jones figures…
Those are expectations, not rules. Windows and OSX have fewer expectations of hardware companies even though Linux has a fraction of the market share. That makes Linux far less appealing as a platform for hardware companies.
I think it is funny that there are so many people like you who defend the status quo of trying to pressure hardware companies into open-sourcing their drivers. Anyone with a smidgen of business sense knows that to expand your market position you have to work with potential partners. You don’t make demands of them when they have little financial incentive to support you.
Even when a hardware company is pressured into open sourcing their drivers that doesn’t mean they have to make them high-quality.
Class A example: ATI
http://ubuntuforums.org/showthread.php?t=978967
The Linux world needs to end its puritanical view of open source if it wants to bring in users from Windows or OSX. Open source can be useful, but the benefits do not always outweigh the costs. Most Linux users would rather have a decent closed-source ATI driver than a buggy open-source one. 99.99% of people who run a driver would never look at the source. Of those who do, only a few could actually do something with it, and in that rare case they would still need to be part of a well-organized group of similar people with a surplus of free time to make any significant changes.
Okay, maybe. Or those are requirements that need to be fulfilled to make an incompatible device work on a standard-aware OS.
If a manufacturer says: “We made this product, and it works with our driver on the current ‘Windows'”, then this can be seen as “We don’t want you to run anything else than the ‘Windows’ we support by our driver. Our product isn’t compatible with anything else.” – which, of course, doesn’t matter when nothing else exists, which is a common thought.
If a manufacturer wants you to use his product with Linux, he has to do something (release specifications, release driver source, release driver binary or something similar, or better: follow standards so no driver is needed). If he doesn’t, he does not want you to use his product. He declares that his specific product is incompatible.
That’s correct according to “attractiveness through market share”. But keep in mind that even “Windows”, in all its different versions, constantly needs new drivers. Yet that doesn’t happen when hardware is designed to work only for a short period of time, which is especially true in the home consumer market. For example, there’s a multifunctional inkpee printer with scanner with a driver for “Vista”. It will work two years maximum. There will be no driver for another “Windows” version, nor will there be a compatible device ready for “Windows 7”. This seems to be intended by the market: sell a cheap and crappy product often instead of selling a good product once. This, of course, complies with the concepts of the market.
True.
Maybe I need to express it another way: I don’t demand that manufacturers open their sources. It’s mostly the (Linux) community that requires this, rather than filling the OS with binary blobs without knowing what’s inside.
Personally, I’ve always been comfortable with the closed (but functional) ecosystem of the AS/400 and its successors. The OS isn’t free, it’s not open, the hardware isn’t, but finally, it makes a good productivity platform. Or take the Sun and SGI ecosystems as an example of the past, or even Apple as an example of today.
Furthermore, I’m using a Mac as a secondary computer. I have no problem with not being able to read the GPU driver’s source code.
But… I would really like hardware manufacturers to follow standards that already exist and are implemented in an accurate way, such as PS or PCL for printers, or PTP or DA for cameras. I would even pay more for such a device. But professional IT stuff is still a niche market. 🙂
That’s a good point, of course. But it’s understandable: if a manufacturer considers Linux a joke, why spend money and work on making a driver from which no money would come back?
Security is often mentioned when it comes to Linux. For a more extreme position, think about OpenBSD. Do you trust a manufacturer that gives you a DVD with stuff you need to install for, let’s say, a webcam, without knowing what the software really does? Open source has the advantage that many eyes have a look at it, or at least could do so. Keep in mind that not everyone who wants to make money is an honest guy. 🙂
I’m not sure I would agree with “most”… but in my own case, I would certainly agree. I am “lucky” that I can use an older ATI GPU with a very good X driver right now (at my home desktop). Among my friends, because of the attitude of ATI you mentioned, this brand isn’t common anymore. Most prefer Intel and nVidia GPUs. And most of them think before they buy – that’s an important point, too, especially when you run Linux or any other Non-“Windows”.
But they could, and that’s the point… just to find out that the GPU driver captures keyboard input and sends it to… you get the idea. 🙂
The problem is that there is a good deal of certainty that allowing closed, binary drivers in Linux would be the first step in a dance of death, resulting in “Linux” systems that are non-Free and cannot work without (possibly incompatible) vendor-specific driver sets that must be licensed separately.
This is not paranoia, just good business.
The quandary that results is not good, but at least it’s a stable kind of not good. Most things work via mainline, open drivers. Some things suck. Gradually support expands, but it always lags behind.
If this continues indefinitely it is far better than having better support today, but no Linux tomorrow.
If there were a way for companies to supply closed drivers that would not inevitably lead to the demise of Linux, I would jump at the chance. So far there doesn’t seem to be one.
Good business for who? You’re concerned with the death of Linux when it has been in a coma on the desktop since 1998.
Even if the scenario you described were to happen, the kernel devs could fix it by locking out offensive drivers or requiring that all drivers follow specific standards. Furthermore, the scenario you described could only happen once Linux gained significant marketshare, since those vendors would need a market before they could manipulate it.
You should be more concerned with the possible death of desktop Linux through stagnation and Windows/OSX inertia.
The status quo isn’t working. Linux is in a much better position to take desktop marketshare than it was 10 years ago yet it continues to hover around 1%. The rise of OSX has shown that people are willing to leave Windows for an alternative.
The hostile design philosophy against proprietary hardware and software needs to end, or the Linux evangelists should stop trying to convert people and accept that Linux will be relegated to 1% of the market for another decade.
I never mentioned “Desktop Linux” – I am talking about Linux the kernel and operating system. A stable binary driver API would kill Linux, as surely as anything could. Maybe Linux could take 10-20% desktop market share in the near term, but in the long term it will end up with 0% of anything.
The status quo /sort of/ works, which is better than your alternative.
As far as I can tell you are raising a red herring. There’s no hostility toward ‘proprietary hardware’ (in fact, little non-proprietary hardware exists!) Everyone grumbles when vendors don’t release specs, but they write drivers for the hardware anyway. No hostility beyond complaining about poor hardware/lack of documentation.
90% of “desktop Linux” evangelism doesn’t mention anything about software freedom. Converting people by philosophy hasn’t been a primary focus of anyone I know of in a few years. We tend to evangelize to developers more than end users anyway. Regardless, evangelism has little impact on desktop marketshare.
Most Linux users are not hostile toward proprietary software (with some exceptions) and use it if it’s available and actually works. I use nvidia’s driver in my kernel, I play proprietary games–and I am a bit of a fanatic when you come right down to it! This doesn’t mean I am ready to embrace a scheme whereby Linux is effectively un-GPL’d, which is precisely where what you’re proposing will go.
It’s amazing, you’re right but you don’t know why.
The Linux driver problem is due to the fact that there is no stable kernel API for drivers. Period. Different packaging systems are largely irrelevant!
Think about this: If you want to release a driver for a piece of hardware and have it work on Windows, how many different kernel builds do you need to support? You can say, for sure, that today you need to support XP, XPSP2, XPSP3, Vista, Server 2003, Server 2008 and soon 7. There are probably more ‘kernel builds’ than that, but that’s a rough count: 7 builds. So, a maximum of 7 different versions of your driver. Right? Right.
But it gets simpler. Under Windows, depending on the sort of device, you will probably be targeting *no more than two* different APIs. After doing that, your customers can use your driver on all Windows versions from 2002 through 2010.
Contrast Linux. Each distribution has, at least, one different kernel build. Likely each one has several possible kernels. Each version of the Linux kernel that is released–about every 3 months–has the potential to change an interface your driver uses, or possibly an entire subsystem, in possibly radically incompatible ways. And each kernel is potentially unique–a customer may have custom-compiled a kernel for one reason or another. They certainly cannot be expected to stick with the stock distro kernel just to make your add-on card happy!
I won’t do the math. I shudder to think! It is sufficient to say that if you want to support all Linux since 2002 you’d have to target a lot more than 7 kernels. Even if you restricted yourself to, e.g., just Ubuntu or just Fedora, you’d have to target several times that.
The ‘correct’ solution to this nightmare, for vendors, is this: Release your driver’s code under GPLv2 and get it merged into the mainline kernel. For those not willing to do that, the next best is the nvidia approach: Provide a few prebuilt modules for common kernels and make everyone else build it themselves. For this to work your source code has to be publicly visible (but theoretically no one is allowed to look at it). A lot of companies don’t like either solution, so their options are (1) a nightmare or (2) no support or (3) RHEL support only. They often choose no support.
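To make that concrete, here is a minimal sketch of an out-of-tree module; even something this trivial has to be compiled against each target kernel’s own build tree (typically via "make -C /lib/modules/$(uname -r)/build M=$PWD modules"), because there is no stable in-kernel driver ABI:

    /* hello_mod.c -- minimal out-of-tree module (illustrative sketch) */
    #include <linux/init.h>
    #include <linux/module.h>
    #include <linux/kernel.h>

    static int __init hello_init(void)
    {
            printk(KERN_INFO "hello_mod: loaded\n");
            return 0;
    }

    static void __exit hello_exit(void)
    {
            printk(KERN_INFO "hello_mod: unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);

    /* Non-GPL modules taint the kernel and lose access to GPL-only symbols */
    MODULE_LICENSE("GPL");

Every interface this touches (printk, module_init, the build system itself) is free to change between kernel releases, which is exactly why one binary cannot cover them all.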
The rest of what you say about popularity is true but not important. Linux is sufficiently popular already. If it were *easy* to maintain an out of kernel binary driver for the Linux kernel everyone would support Linux that way.
I’m happy you’ve started enlightening me. 🙂
Thank you for mentioning and explaining this. I’m aware of this fact, but it didn’t come to my mind first, simply because I don’t use Linux regularly, and especially not with incompatible hardware. But from what I know, I can safely agree with your point.
One could answer that there are other UNIXes that offer a stable kernel API for drivers, but on the other hand, when it comes to the oh-so-joyful market share, they look more and more irrelevant.
We can only hope that some standardization that allows this will be possible in the future.
On the desktop.
On the server, Linux has 30-40% of the market and has frankly excellent hardware support for server hardware. The current strategy works for Linux here. For example most NIC drivers are now developed and supplied by the hardware manufacturers because they want to sell lots of NICs to server vendors and they know one of the “Must have” bullet points will be Linux support.
Well of course we are talking about the desktop where compatibility issues are far more problematic.
Linux has been a success on the server; no one doubts that. But it is obviously doing something wrong on the desktop when more people are willing to buy a $1200 MacBook than run Linux.
X is fine. A complete rewrite would be an utter waste of time and give us yet another buggy piece of software worse than the original. Even Microsoft has followed the policy of slow but steady enhancements rather than total rewrites and redesigns.
Not this *again*! Every damn time usability comes up, somebody says “replace X!” as if it would solve some problem.
You complain about “overhead” due to “not needed” features but can you tell me exactly how much your experience is degraded by having the ability to e.g. tunnel over ssh? I’ll give you a hint: not one iota.
X does have some weak points, but not a single one can be solved more easily by throwing it out and writing something else. You’ll throw out the baby with the bathwater, and come back in 2-3 years to the same place you are now… minus a few features.
It’s far easier just to improve X. I know you won’t believe it, but nothing is *systemically* wrong with X.
X has nothing to do with usability in the first place. It’s a base that programs are built on. Those programs (individual applications, window managers, desktop environments, etc.) can offer good or bad usability, but so can text-mode programs and even CLI-driven ones.
Calm down, please. It’s not me complaining. I’m completely fine with X and its development / way of improvement. I wanted to illustrate why there are people who demand X to be replaced because they don’t see any need for a particular function or concept. X is a multi-purpose platform that integrates its parts into an overall concept. It’s hard (or even impossible) to even eliminate a certain part of it just because you “don’t use it” in the foreground.
That would be possible.
Well, I don’t see systematically wrong things in X. You’re obviously talking to the wrong person. 🙂
X has already been rewritten several times over, and it’s still very suitable.
Done. KDE 4 is beautiful to look at and stable. Compiz is stable.
I don’t think even you know what you mean.
“Better” support for all codecs? It’s already at 100%. Hardware acceleration has been here for a couple of years already on NVIDIA cards; it’s called VDPAU. Note that OS X didn’t have hardware acceleration for video codecs until Snow Leopard, and even then it only supports H.264 acceleration.
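For reference, a typical VDPAU invocation with mplayer looks something like this (a sketch; the exact flags depend on your build):

    # Decode MPEG-1/2 and H.264 on the GPU via VDPAU (NVIDIA)
    mplayer -vo vdpau -vc ffmpeg12vdpau,ffh264vdpau movie.mkv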
Better in what way? The iPod is supported. The Zune isn’t, but who uses the Zune? Samsung are currently #2 in market share for MP3 players and all of theirs are supported on Linux.
It’s not, but it’s much more legal than the copies of Photoshop that most of the complainers have…
How many people actually need photoshop?
Is that a real impediment for the adoption of Linux for most users?
There are some missing programs/functionality in most linux distros, but I don’t think that alone explains the lack of adoption amongst most home users.
Noooooo!!! To Serve Man!!! It’s a cook book!!!
I know that Opensync is not Ubuntu.
However, it is a deal breaker for me when syncing my cellphone (in this case, Symbian) produces errors and duplications every time, particularly with recurring all-day events (such as birthdays).
Among other things, for me a computer is a tool to keep me organized. If I can’t do that, I can’t use the platform.
Kudos for the great progress, but a real pity that progress with PIM syncing moves forward at such a glacial pace.
I have tried via Opensync, SyncEvolution etc, and still find the problem is the same.
If Canonical is getting into the habit of delegating its developers to other projects, this might be a worthy one to consider.
I find this guy to be very ill-informed about the current state of GNOME. If he bothered to look at phoronix.com, the Ubuntu package repository, etc., he’d know that GNOME is currently undergoing a massive overhaul with the gnome-shell interface. I’ve got it up and running at the moment, and it integrates all the 3D graphics stuff directly into the desktop and window manager. It basically joins the two programs into a single app, and it changes the way you use GNOME. It took from Windows 95 until Windows XP for the Windows default theme to change from boring grey bars. Linux having some of that corporate familiarity isn’t a bad thing, especially when you consider all the themes available on the internet. The best of both worlds exists in your Ubuntu install today.
Now, that said, Linux does have issues. Running games isn’t really one of them. I’m currently running Mirror’s Edge, UT3, BioShock, Assassin’s Creed, Overlord, Tom Clancy’s H.A.W.X, and Prototype, all in Wine without showstopper issues (this is at 2560×1600 on a 30″ Dell LCD with an NVIDIA GTX 260, btw). Major platform issues are falling away each day.
My two most obscure pieces of internal hardware are TV tuner cards, and both have driver projects underway to get them working. One of them doesn’t work right now only because the LinuxTV guys didn’t know what tuner the card was using; it turns out it’s one they already have a driver for, they just need to init it. Intel wireless works, and works well; NetworkManager is handling network detection/encryption perfectly on my 4965ABGN card. PulseAudio has come a long way and is rapidly stabilising.
Moving from internal to external components, I have two pieces of really obscure hardware. One is a Thrustmaster HOTAS Cougar joystick, and the other is a Microsoft SideWinder Force Feedback 2 joystick. Wine and SDL are both getting support for force feedback, and I have tested that code with my hardware; it works. The HOTAS Cougar is supported by Linux, but its compiler and the Cougar programming language are not. That said, Wine can run the compiler and programming application, and can compile but not upload new configurations to the Cougar… at least not until I install the USB patch for Wine, which claims to support any Windows USB driver. If that’s true, then I will have eliminated all hardware reliance on Windows.
Now, if anyone is interested, I am writing an application in GTK which will detect and calibrate joysticks, similar to the now-defunct jscalibrator application; however, I am planning to add force-feedback effect support to the program. Help with it would be appreciated. I already have it detecting and using device nodes for an unlimited number of joysticks/gamepads.
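For the curious, the detection side is mostly a matter of opening the /dev/input/js* nodes and querying the kernel’s joystick API. A rough sketch in C (not the actual application, and the device path is hard-coded for brevity):

    /* jsprobe.c -- sketch: identify a joystick via the Linux joystick API */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/joystick.h>

    int main(void)
    {
        int fd = open("/dev/input/js0", O_RDONLY);
        if (fd < 0) {
            perror("open /dev/input/js0");
            return 1;
        }

        char name[128] = "Unknown";
        unsigned char axes = 0, buttons = 0;

        /* These ioctls report the device name and its axis/button counts */
        ioctl(fd, JSIOCGNAME(sizeof(name)), name);
        ioctl(fd, JSIOCGAXES, &axes);
        ioctl(fd, JSIOCGBUTTONS, &buttons);

        printf("%s: %u axes, %u buttons\n", name, axes, buttons);
        close(fd);
        return 0;
    }

Force-feedback effects go through the event interface (/dev/input/event*) instead, which is where HAL-based enumeration comes in.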
In closing, I’d like to say that Linux has problems. Lots of problems. But default themes aren’t one of them. Instead of focusing on the irrelevant, try focusing on items that affect how people use their machines. I took gnome-mplayer from being a rubbish app no one wanted to use, added multi-file add/remove to the playlist, added digital TV support, language/subtitle selection and a couple of aspect ratio settings, and now it’s a widely used application and a lightweight Media Player Classic clone. What Linux needs is not more themes but more good ideas for ease of use, implemented in applications. Things like NetworkManager, PulseAudio, a joystick configuration app based on HAL: these are things people want. Frontends that unlock the power of the underlying apps while still allowing users to strip the system down to a bare-minimum configuration are what makes Linux great. If themes were why we used Linux, we’d all have Clearlooks on Windows.
You can bash Ubuntu if you wish, and there are some things that need attention, but for the most part it changed the Linux world. I converted to Linux around 3 years ago and have no thoughts of going back. Each version of Windows makes me think I made an excellent decision. No more driver h*ll, and things just work. I only use Ubuntu 9.04 occasionally now, I use Kubuntu or Linux Mint more, but they’re derived from the original, so I must be fairly happy. When I think of all the trouble I had keeping up with other OS problems and bugs, viruses and BSODs, I think it just ain’t all bad in the Linux world. Let’s face it, Windows is pretty underdeveloped in some areas of UI as well. My kids right now are fighting to get Vista to work the way they’d like, and they’re not having a good time of it.
If you had that many problems … with XP or Vista … in the last five years … you should throw your computer away, because you’re too dumb to use it.
Are you seriously so dissatisfied with your life that you need to troll around in a forum and post comments like this?
XP is nice in some respects, but if you have never had trouble with it then you are not using it seriously, and maintaining it as an admin is a total bitch. I absolutely hate the networking settings in Windows. Best of all, they get worse every release.
Comments like what? Exposing your MS hate for what it is? Exposing your obvious lack of IT knowledge insofar as you can’t even admin a network of XP (or Vista) boxes so that they don’t get hosed up?
Who’s trolling who? You’re an IT nitwit, and I wouldn’t trust you to setup a toaster.
Trolling around, insulting OSNews readers shows a serious lack of tact, intelligence and knowledge.
Go douche around somewhere else, please.
Thanks for the great example of how we should post on these boards. No hypocrisy there, dickhead!
I felt it was warranted.
Exactly. As I felt mine was. So there. Welcome to the big bad Internets.
What a maroon.
The mythical BSOD.
Always that.
BSOD, BSOD, BSOD!
The acronym that is always worth mentioning, as if the Linux kernel never panicked. I’ve never seen a BSOD, but I have seen plenty of Linux panics. I’ve never seen a BSOD, but there are 180295 kernel oops for 2.6.29 in the inadequate tracking system alone.
Fanboys. Every passing one makes me tolerate them less.
And the chorus replies,
BSOD, BSOD, BSOD!
“I’ve never seen a BSOD, but there are 180295 kernel oops for 2.6.29 in the inadequate tracking system alone.”
No matter how you spin it to defend Windows, BSOD is undeniably unique to Windows. Kernel panics don’t involve a *blue* screen. It’s a fact.
I am not trying to defend Windows.
I am trying to talk some sense into Linux fanboys.
Isn’t that kind of like putting common sense into a liberal? Good luck with that.
You are lucky to have never seen a BSOD. I’ve seen a few in my day although they are getting more rare.
I admit that I have experienced more linux kernel panics than BSODs but I wouldn’t consider the BSOD to be ‘mythical’.
Oh that is such bs. Things just work with Linux? No driver hell? How do most cell phones fare with Linux? Wireless always works just fine? No issues with sound?
Would you like me to provide you with a list of threads regarding all the working hardware that was broken with the 9.04 update?
Read the comments on this blog and tell me again that everything “just works” in Linux.
http://www.workswithu.com/2009/08/27/how-to-fix-wireless-on-ubuntu/
If you want to state you are happy with Ubuntu then fine, but enough with the proselytizing already. Yea, you “converted” as you said, we got that. Everyone here has probably tried Linux, it is OSNEWS after all. Find somewhere else to preach.
On 9.04, I have to use a script every time the kernel is upgraded, to download and install a newer version of ALSA, which is still not in the repositories; neither is the newest one.
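For the record, such a rebuild script boils down to something like this (a sketch, not my exact script; the version number is a placeholder):

    #!/bin/sh
    # rebuild-alsa.sh -- sketch: rebuild the ALSA driver after a kernel upgrade
    set -e
    VER=1.0.20   # placeholder version
    cd /tmp
    wget "ftp://ftp.alsa-project.org/pub/driver/alsa-driver-$VER.tar.bz2"
    tar xjf "alsa-driver-$VER.tar.bz2"
    cd "alsa-driver-$VER"
    ./configure          # picks up the currently running kernel's headers
    make
    sudo make install
    sudo depmod -a       # refresh module dependencies for the new build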
On almost every system I have had, I’ve had problems with audio on Linux.
ALSA needs work, or more help. Something must be done!
Pulse should have never been started.
GStreamer: I can never get GNOME’s default audio player to do anything. What a waste of time GStreamer was.
Linux should have gone with the JACK audio server and, perhaps, if it weren’t for the problems it was experiencing, OSS underneath.
In recent years we have seen the infamous Microsoft learning a lot from Linux. GNU/Linux, however, seems to have more difficulty doing the same.
I’m talking about reformulating strategies. Excellent code is usually a stimulus to keep going. But just as sabbatical years are a way to get one’s feet on the ground and re-evaluate one’s perspective, recognizing the need to do so is difficult when so many people are involved.
Ubuntu really made a difference, but that risks being lost soon if re-evaluation is not taken into consideration. And Ubuntu is in the best position to lead the way.
Why is that needed when everything seems to be going just fine? Well… that’s exactly when people become satisfied enough to let themselves slide into problems, when none are expected. A pleasant state is a warning signal, paradoxical as it may seem.
What might the problem be? We all really know it: increased complexity that will, sooner or later, limit development options. Not so? We can see it coming in the excessive build-up of dependencies. Clean structures are, little by little, being lost in favour of code expansion.
And the time a distro build takes, or the related chance that a change may crash something, is also a warning signal. Much time has been lost already on the effort of building up features, where reformulation and simplification, like a sabbatical year, would be the needed step.
Let’s remember that a rational common structure, instead of patched-on features, saves time and avoids problems. I suppose that is the reasoning behind the future GTK+ and the new KDE, but also Xfce, Enlightenment and LXDE.
But let’s also remember that unless they are united by a common messaging system, all of these are dispersing trends where they could be synergetic efforts.
This is not advocating a stop, just a reminder for consideration. As stated, the next GTK+ seems to have recognised a problem, and there is one building up everywhere. We just need to be conscious of it and not fall into a winning, Microsoft-like perspective. We remember where that led the infamous Microsoft: heaviness!
What can be done?
The basic approach is, as always, the KISS rule. One task is the modularization of systems and the standardization of their connections. What does this mean?
Simply that libraries should be interchangeable according to the target computers, or their users.
In other words, factoring the elements in play, the basic philosophy of small interconnected tools must be revived in library usage.
And if someone uses this or that environment, it should not mean a dependency on huge library tools. Libraries should be designed, and built, to be shared and easily replaced, the goal being less code while keeping the traditional efficiency of GNU/Linux.
Adaptability achieved not by having lots of (growing) specialised code to choose from, but by clean designs allowing code to stay small. (Like the Enlightenment environment? Not sure, but it seems so.)
When things get too big, it is time to look back and understand how that was allowed to happen. The point is not to stop the present or go back to the past, but to prepare the future.
Unifying is not mixing everything together to be able to do everything… it is getting the common parts well oiled so the particular blocks have a good foundation to work on, be it a limited block for a small machine or a more extended one for a big machine. The rationale is a common and efficient foundation for whatever is used above.
That involves sharing essentials, not particulars. We already see that problem in the dependency tree.
It’s spring cleaning time.