“The Ubuntu development community announced today the availability of Ubuntu 10.04 alpha 2, a new prerelease of the next major version of the Ubuntu Linux distribution. This alpha is the first Ubuntu release to completely omit HAL, a Linux hardware abstraction layer that is being deprecated in favor of DeviceKit.”
“Dave – I’m losing my mind, Dave – Please stop, Dave”
Very nice catch, sir!
nice one
Nice! I still love that you could (can?) use Hal as a trigger word for Opera voice navigation.
Say “Hal go back” and Opera would go back one page in history.
Dave Bowman: Eject the usb disk, HAL.
HAL: I’m sorry, Dave. I’m afraid I can’t do that.
in favor of udisks
A useless, API-breaking name change
It’s just the ‘disks’ component of DeviceKit that’s been renamed to udisks, not the rest.
Hmm. While I’m of the mind that HAL needed to go, as it just wasn’t efficient, I can’t help but feel that subsystems are deprecated and replaced far too often when it comes to the Linux desktop. It seems we no sooner get one subsystem working reasonably well than the wheel is reinvented yet again. Is it any wonder that most commercial software developers don’t target it?
The only reason it happened was that libudev was originally GPL, which caused a major licensing issue, and HAL was introduced due to limitations that have since been overcome. Now that it has been sorted out I doubt we’ll see changes anytime soon.
Quite honestly, I remember when HAL was first introduced and it was an awful experience I wouldn’t wish upon anyone: buggy hardware handling, issues with ripping CDs and so on. I am happy that HAL has finally been removed and there is a native solution that works out of the box for once. It is also good for other GNOME platforms – the duplication of HAL on OpenSolaris was pointless given that there was already technology in place that could handle everything HAL did. Hopefully there will be an all-round purge of HAL and it will be relegated to the dustbin of really bad ideas.
That’s called learning from your mistakes, and trying to do things ‘better’.
HAL had problems, but they weren’t *really* noticed until people started to use it for lots of things it wasn’t originally designed to handle, aka. ‘feature creep’ (and using XML for the config files didn’t help any).
The people behind the *kits and u* packages are the same ones that were behind HAL, they aren’t inventing a new wheel (where ‘wheel’ here refers to the general idea/function of the software), just refining the old one (different API because the old one was, in hindsight, broken, and making it more modular, breaking the old monolithic package into smaller more flexible pieces). HAL was simply trying to do too much.
Most software apps wouldn’t need to interact directly with HAL. They’d use DE hooks, or a cross-platform lib, rather than talk to HAL directly. I don’t think that’s really significant.
They don’t target Linux because it doesn’t have much of any market share. Over the years, Windows has had various warts and ugliness that coders targeting it had to deal with, but that didn’t stop them, they went to all that trouble anyway because of Windows’s market share.
It always boils down to just the size of the market…
Alternatively, it can be also called NIH, lack of planning, and cowboy coding.
What was it? Insanity: doing the same thing over and over again and expecting different results.
It can be called a lot of things. Opinions are plentiful, because everybody has one…
No, it can’t. Something can be NIH, but if it is, then it’s not the same thing as learning from your mistakes. You can’t learn from mistakes you don’t understand.
NIH is all about not learning the existing solution and implementing your own because you understand your own code more than anyone else’s.
Learning from mistakes is all about understanding the benefits and limitations of existing software.
It’s nice in theory, but realistically the hooks are often broken from version to version, one subsystem to another. Not that I can really expect much, if the kernel team won’t bother to maintain stable APIs why should anyone else? You can replace subsystems to your heart’s content as long as the APIs don’t change. No one, however, seems to care about maintaining a stable API for desktop Linux. They just don’t see it as important for some reason.
I’m sure you’re aware some folks have a different take on that:
http://www.kroah.com/log/linux/stable_api_nonsense.html
Besides, what does this have to do with HAL? It is/was a userspace app…
How long ago did Greg write that? I guess he has been redeemed with Linux being a raging success on the desktop.
Telling hardware companies what they need has really worked out. Video card drivers in Linux are always of top quality and you can always expect them to keep working between updates. The kernel team’s philosophy of “we’ll fning break it if we feel like it” has worked wonders.
That’s bunk. The only reason for stable in-kernel APIs would be to allow outside coders easier maintenance of their patchsets. The Linux developers don’t care about that. They want development in-tree. If you don’t want to get your stuff into the tree then they don’t care about you. Too bad. All kernel -> userspace interfaces have been very stable. As for desktop APIs, they are also very stable. How long has GNOME 2 been around? 8 years. How long did KDE 3 stick around before KDE 4? 6 years.
No it doesn’t, it’s just a factor in the decision to port.
If porting to Linux from OSX was as easy as setting a compiler flag then it would have nearly the same library.
However, the situation is the exact opposite: the cost of porting to Linux is well beyond what it should be for its size.
Linux is not a stable platform for commercial developers. It isn’t even a single platform. It’s a bunch of operating systems that share the same kernel and have software distributions designed around open source.
As I have pointed out before it’s far easier to build your own Linux distro that contains your proprietary program than it is to support a single distro. The people that build the distros don’t at all care about attracting commercial developers. They also don’t care about being compatible with other distros.
Six months after the iPhone was released it had better support from game developers than Linux, even though it had a fraction of the market size. Market share is only part of the equation and doesn’t matter much when the people behind an OS couldn’t care less about the market.
While I may agree with some of the above, here I think you’re mixing things up. The iPhone opened a new kind of market for very casual games, all bundled with hype. Of course it appealed to many developers. It’s not a part of the old gaming market, not yet at least, and in the beginning it didn’t play by the same rules (again, I feel hype played its role).
And I’d be curious to see some numbers about both (phone/Linux) market sizes… We may be surprised.
The old gaming market is there as well. EA Games has ported The Sims 3 and Madden to the iPhone, but they don’t port anything to Linux. The Secret of Monkey Island is another good example: it was ported to the PC, XBLA and iPhone, but not Linux.
Based on Net Applications data, the iPhone/iPod touch has about half the market share of Linux.
http://marketshare.hitslink.com/operating-system-market-share.aspx?…
Game developers stay away from Linux and it isn’t because of market share.
A few other factors that might have played a role:
1) Even the Windows gaming market is a pathetic shadow of what it was 10 years ago – it’s mainly just hand-me-down console ports (with the exception of Valve, Blizzard, and maybe-kinda-sorta id). Linux games have typically been ports of Windows games – with the big commercial titles, at least.
2) Last I checked, Wine does a decent job of running most Windows games – which eliminates the need to port, at least from the perspective of commercial game developers. IIRC, that had a lot to do with Loki Games’ demise (why buy the Linux port if you’ve already played the Windows version through Wine?).
3) Games for cellphones were already a big money maker years before the iPhone ever came into existence. And thanks to the way that carriers nickel-and-dime customers for things like ringtones, there’s a large number of people who would balk at paying for software on a desktop computer – but who don’t think twice about buying a cellphone app.
A lot of 3D pc games these days are console ports but there is a booming casual market for games like The Sims, Warcraft and Peggle. Peggle has been ported to just about everything except Linux.
If you look at the ratio of proprietary games to market share and compare it to OSX, it is clear that there is something very wrong with Linux. When a single developer supports Linux it becomes a headline.
Just look at the direct2drive Mac section:
http://www.direct2drive.com/buy-mac-download
Not as good as Windows of course but there are a lot of new games to play.
The lack of commercial developer support has more to do with Linux having too many incompatibility issues between distros and software management systems that are designed around open source applications.
This is completely wrong. While Linux has a bigger market share than the iPhone, the average iPhone user is much more willing to pay money for some junk game than the average Linux user. Thus, the iPhone market is bigger.
Games on Linux basically never have to worry about incompatibilities if they do things correctly. All they have to do is statically link and use SDL and OpenGL. A game really shouldn’t depend on much more than that. You are correct however that it is hard to develop a commercial desktop app using Gtk+ or Qt. But games shouldn’t need to do that.
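To make that concrete, here’s a rough sketch of the minimal dependency footprint I mean (assuming SDL 1.2 and OpenGL; the window size and other details are made up, not from any particular game). Everything beyond libSDL and libGL can be statically linked into the binary:

/* Minimal SDL 1.2 + OpenGL setup. The only system libraries this
 * touches are libSDL and libGL; everything else the game needs can
 * be statically linked in. */
#include <SDL/SDL.h>
#include <SDL/SDL_opengl.h>

int main(int argc, char *argv[]) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    /* Request a double-buffered OpenGL window before creating it. */
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    if (SDL_SetVideoMode(800, 600, 32, SDL_OPENGL) == NULL) {
        SDL_Quit();
        return 1;
    }

    int running = 1;
    while (running) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT)
                running = 0;

        glClear(GL_COLOR_BUFFER_BIT);
        /* ... draw the frame here ... */
        SDL_GL_SwapBuffers();
    }

    SDL_Quit();
    return 0;
}

Build against the system’s SDL and GL, statically link the rest, and the same binary has a decent chance of running across distros.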
The fact that there are small indie games such as World of Goo that manage to release Linux ports with relative ease tells me that large companies like EA with 1000x the resources should have no problem with it. It is just a matter of market share. I would love to see more commercial software supporting Linux, but I really don’t blame them at all when the market is so small.
Strike 1: Incompatibilities amongst Linux distros.
Strike 2: Distro specific distribution systems.
Strike 3: Open source, since most (game) developers cannot distribute the libraries they depend on (the code is not theirs to begin with).
Strike 4: Users not willing to pay for software.
Strike 5: Tunnel vision and sour-graping from Linux developers (apparently your game does not have sound).
Strike 6: Only BSD-licensed libs can be statically linked without license side-effects (see strike 3).
Strike 7: The Linux economy is based on support, and that is a different business model than most game developers have (WoW-like online games excepted).
Strike 8: WINE since you will have a much better time as a developer by just testing under wine than developing native apps for Linux.
Strike 9: Not a lot of market-share.
Games development for Linux is an exercise in futility.
I already pointed out that the iPhone has more than junk mini games. Why hasn’t The Sims been ported to Linux? Or Braid? Or World of Warcraft?
The incompatibilities exist between distros and versions of those distros. Developers can’t even expect SDL to remain stable in Ubuntu for a single year.
I tried playing my favourite game, Urban Terror, on my new Karmic Koala install. At first it actually worked well, but then suddenly the performance went way down and I had problems with the audio. I couldn’t even quit the game; I had to kill it.
Checking the console output, it seems there was a problem with the SDL sound libraries. I looked in Synaptic and found that libsdl1.2debian-alsa was installed while libsdl1.2debian-pulseaudio was NOT installed. Since Ubuntu uses PulseAudio I installed this library instead, and now it seems to be working fine.
http://ubuntuforums.org/showthread.php?t=1309656&highlight=urban+te…
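For anyone hitting the same thing: SDL 1.2 also lets you pick the audio backend explicitly through the SDL_AUDIODRIVER environment variable (assuming the matching backend – e.g. the pulseaudio flavour of the libsdl package – is actually installed). A program can even set it before initializing audio; a minimal sketch:

#include <stdio.h>
#include <stdlib.h>
#include <SDL/SDL.h>

int main(void) {
    /* Ask SDL 1.2 for its PulseAudio backend ("pulse") instead of
     * whatever it would autodetect. Must be set before SDL_Init. */
    setenv("SDL_AUDIODRIVER", "pulse", 1);

    if (SDL_Init(SDL_INIT_AUDIO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    /* ... SDL_OpenAudio() and friends would go here ... */
    SDL_Quit();
    return 0;
}

The same variable works from the shell, so users can try a different backend without rebuilding anything.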
I’m not sure what you are saying here. Games can be more problematic than business apps due to having to work with sound and video.
You’re reaching the wrong conclusion. The likely answer is that the World of Goo devs wanted to support Linux for non-financial reasons and decided to deal with all the problems involved. I could just as easily point to the Braid developer’s blog; he lost interest in supporting Linux after being disappointed with SDL and running into other issues:
http://braid-game.com/news/?p=364
No, the problem is that distro creators are not trying to work with each other or with commercial developers. Linux is not a single, stable platform with a software management system that appeals to commercial developers. It’s a mess of competing systems that couldn’t care less about providing a stable platform for commercial developers.
Linux is a movement dedicated to open source and making the system appealing to companies like EA Games is not even on the list of priorities. Open source developers don’t have to deal with all the distro inconsistencies since they can just dump the source and let everyone downstream figure it out.
Linux would have a much better game library if it had something like the app store that worked across distros and could be counted on to work after updates.
Nice music selection in the screenshots.
Ed W. Cogburn mostly nailed it when he said that it is mostly about the market share.
Companies will endure a lot of API annoyances and subcomponent churn if the market is worth it.
Several remarks about Linux component changes seem to imply that Windows is not having its own comparable changes. That is silly.
I’ve lost count of the number of database interfaces MS published. Sound system, video, messaging – they all changed over the last decade – sometimes several times.
Sure – old APIs tend to stay around – but there’s plenty of *2 or *Ex versions. And that’s just win32. Of course now it’s often .Net (and don’t think you can get rid of 1.1 just because 3.x is out). In between it was first DDE and then COM.
Technology advances. Demands change. People learn what they should have done differently. The platform adapts. It happens to both Windows and Linux. It would be crazy to keep HAL if a better replacement can be done.
If there is a viable market – companies will deal with the hurdles of targeting the platform.
And a typical program doesn’t have to worry about OS internals like what hardware is un/mounted and when – that’s what the OS is for. The program opens a file – it doesn’t usually care about the details of how that file came to be available there. Stuff like mouse movement, button clicks and other such hardware is dealt with by DE platform libs, most of which don’t change all the time. And the change that happens, happens on all platforms.
So how do you build your market with an unstable platform?
Cue the talk about Windows breaking interfaces once a decade, while ignoring the fact that Linux does it annually.
Going by history, you can build an app that uses sound on Windows and expect it to work for the life of the OS.
With Ubuntu you can’t even expect it to work for a year.
They say that the definition of insanity is doing the same thing while expecting different results, and that seems to describe the state of the Linux desktop. People 10 years ago were saying that broken APIs were no big deal and that Linux would gain share anyway. It didn’t gain share; it’s still at 1%.