“In order to broaden Linux hardware support and simplify the process of acquiring, installing, and updating device drivers, Novell has created a new driver system that will enable vendors to supply drivers to users directly. Linux drivers are traditionally maintained in the kernel itself, and third-party drivers that aren’t available in the kernel often have to be installed manually, a process that generally involves compilation. In many cases, users have to wait for the next kernel release cycle before they can get software support for the latest hardware. Novell’s new Partner Linux Driver Process could potentially resolve some of those problems by providing a simple and consistent process for deploying drivers independently.”
“I hold the opinion that binary drivers are better than no drivers at all, and I welcome any endeavor that improves hardware support on Linux”
Need I quote more?
I’m inclined to agree, but the kernel devs obviously aren’t. Otherwise they could have implemented a stable ABI a long time ago.
Hope Novell’s ready for the flak they’re going to catch over this.
True, they’ll probably take lots of flak, but people like me, who like Linux and want more hardware support without caring that some of the drivers aren’t open source, will be applauding this.
I’m very much endeared to the ideals of open source, but blocking out proprietary drivers hurts the end users. One could argue that proprietary drivers themselves hurt the users, but it’s best to let the users choose for themselves between the two. Companies aren’t all ready to embrace open source, and in some industries making everything open source is bad for their current business.
Not that I follow this reasoning, but to make the discussion fuller, the kernel devs see it as follows:
Only OSS drivers guarantee the necessary quality “certification”. Allowing closed source will poison Linux’s waters with loads of piss-poor third-party drivers that are mostly half-hearted WDM ports, while removing the incentive for continued OSS development by important Linux server partners like Adaptec.
So they fear that the present quality OSS drivers may end up replaced by lots of mediocre-at-best (and mostly useless in reality) closed drivers. So your choice is in fact removed. Additionally, Linux’s image as a stable system could be seriously hurt. (MS has been fighting this problem for years and has only recently seen some positive outcome.)
Linux is still strongest on the server (despite press coverage), and there is a lot to be lost there (which MS awaits eagerly).
As long as you use SuSE, right?
Same with Novell’s Linux client. Only works on SuSE and is binary only.
You might as well just go back to Windows. This is at best a bad idea, and at worst vendor lock-in.
There isn’t that much hardware under Linux which isn’t supported. It’s relatively easy to steer away from those that aren’t supported. Eventually when the marketshare for Linux gets large enough, companies will see the light and will open source their drivers.
From what I understand, the system has been opened and is available via OpenSuSE.
I also submit that a binary driver is better than no driver at all. One word: NVidia. Open source is a development strategy, period. Some manufacturers may choose to use it, some may not. It should not dictate driver availability.
And I’m not that optimistic about the state of drivers for Linux. More and more hardware is being reverse engineered and some (not many) manufacturers “see the light”, but a lot of products still lack perfect Linux drivers. If there’s a framework that will make more drivers appear, I say go for it.
I submit that you’re completely wrong, one word: ATI.
Nvidia doesn’t just ship a binary driver; they ship a _very good_ binary driver. They obviously put effort into making their linux drivers good.
No driver is far better than a bad driver. Second example: Ndiswrapper. Not a driver in itself, but a semi-broken enabler that gives you a tempting view of how things could work until your kernel crashes.
No driver provides strong impetus to get something done. A bad one provides a crutch to get by on.
I would, however, say that a binary driver is better than no driver. But only if it’s a good binary driver that’s actually compatible with the system. Something that’s much easier to do in the Windows world, although by no means a blind process (WinME).
“No driver is far better than a bad driver. Second example: Ndiswrapper. Not a driver in itself, but a semi-broken enabler that gives you a tempting view of how things could work until your kernel crashes.”
Ndiswrapper works just fine for me. Not a single hitch.
Binary drivers _can_ be quality drivers. Compared to ATI, nVidia is doing one heck of a job. nVidia at least supports their cards, and their drivers work exceptionally well.
Didn’t work for me. I used it for a wireless card and it would lock up the system, so I had to go back to Windows and just run Debian in VMware instead (this was while I was developing a daemon on Linux and needed to use it).
No driver provides strong impetus to get something done.
Where “something” more often than not is to reverse engineer the hardware and produce:
A bad one provid[ing] a crutch to get by on.
I really don’t want the driver situation on Windows, where hardware companies love to create really crappy drivers that interact really badly with other parts of the system, to make its way into Linux.
Making it easier for hardware companies to produce binary drivers will reduce the chance of them releasing the source to their drivers.
Sure, these black-box binary blob drivers will work with Linux, but what about helping out the other free OS projects by getting the code and specs for supporting the hardware out there?
– Jesse
Thank you
Yes, drivers are such a pain in Linux. Everything is painful and time-consuming with Linux anyway.
“Everything” is? Surely you must realize that blanket statements like that are seldom ever correct.
I enjoy my Linux install. I don’t have to do a lot of work maintaining it, or keeping up to date. Installing my apps is a breeze (but of course I don’t use MS Office, Photoshop and other apps that aren’t Linux-friendly).
On to the subject of the article: I am all in favor of a way to make installation of driver modules (3rd-party and closed-source, for the most part, I assume) easier. But on my Fedora box, the nVidia driver was a simple yum install nvidia-glx that I did years ago and haven’t had to worry about since. When I update my system, any new nVidia driver is automatically retrieved for me. Seems that there *are* some easy things in Linux, after all…
Better no driver at all than binary-only. Sure, it seems like a sweet deal today to get better support for newer hardware. But what about in 10 years, when today’s new hardware and tomorrow’s new hardware is all that there is? When you effectively *must* use closed drivers to run any system?
There is no harm in open sourcing your drivers. The only reason not to is if you’ve got some kind of legacy patent obligation. Eventually most companies will open source their drivers or release their specs so open drivers can be written. This will not happen if they are given an easy out, a way to continue to follow the old, bad development methods.
I won’t even mention the stability problems involved in closed-source third-party drivers. Even Microsoft will tell you why they’re bad, though they don’t have a better solution.
I do agree with you on some points, but… how is this different from today? Companies are not moving towards open-sourcing their drivers… and those that ARE are doing so because they chose to, considering the benefits they could get from it, so I bet those will continue doing so.
I believe it is an all-win situation, since the “binary only things” will keep getting reverse engineered the same way they are today.
>>”Companies are not moving torwards opensourcing their drives… and those that ARE are because they choose to considering benefits they could get from that”
So you are saying that companies are indeed open-sourcing their drivers, meaning the first part of your statement is false and even you don’t agree with it.
>>”I believe it is an all win situation… since the “binary only things” will keep getting reverse engeneered the same way they are today”
I will apologize ahead of time if you are one of those talented people, but otherwise, you are taking this process for granted, my friend. This IS a curse worse than the disease.
So we go from a model where, using the free and open kernel-provided drivers, users shouldn’t ever have to install their own, to one where, à la Windows, you have to hunt through the net to grab all the latest binary-only releases, hoping they don’t break your system…
(no, don’t mention ati/nvidia. you don’t _need_ to use the closed ones to get working video. you only require them for newer model cards if you want 3D acceleration, which in Linux basically means you can play tuxracer and see fancy screensavers… plus, there is the option of using cards that the stock x.org does provide free DRI acceleration for anyhow)
What is really the difference between installing these and binary kernel modules?
This article does well to explain the problem.
http://www.onlamp.com/lpt/a/6557
It’s about decoupling the driver release schedule from the distro release schedule, so that users can get drivers now, not months from now.
no, don’t mention ati/nvidia. you don’t _need_ to use the closed ones to get working video. you only require them for newer model cards if you want 3D accelaration
You also, at least for Nvidia, need their driver if you want working dual monitors.
I don’t think that’s true. Unless nv has stopped working with dual head nvidia cards? You can use xinerama. I know, it’s not as good. But “not as good” is not equivalent to “can’t.”
Interestingly enough, by “newer model cards” he means ones in the last 4-6 generations of chips from said manufacturers…
I don’t think that’s true. Unless nv has stopped working with dual head nvidia cards?
Got it in one.
I’ve just done both Ubuntu and FC5 installs and neither had an NV driver that would handle the dual headed card in a situation where one head is analog and the other digital.
I should have clarified by saying 3D on _any_ nvidia (boo for them on that), or on post 9250 radeons.
Again though, what’s the point of having some nvidia 7800 on a linux box? It’s not all that useful in windows either, but usually the only thing people can think of is to be able to play some closed source proprietary windows games via wine or something. If that’s what you really want, RUN windows, dual boot or whatever. Remember, all that fancy stuff that redhat is working on for instance with the aiglx stuff is only being tested to work on cards with free drivers, not the closed stuff (good for them).
In terms of free software, like I said, it’s basically tuxracer et al and the gl screensavers… Is that worth compromising the integrity of your system by buying a card which isn’t freely and natively supported? (save you some cash too…)
When you make simple devices that have interfaces that amount to a handful of registers that are memory mapped and only do very simple things, then it’s a no-brainer to release specs.
But graphics hasn’t been that way for a long time. Graphics cards are full blown co-processors with their own RTOSes and ISAs. To get every ounce of performance out of them they use custom protocols to communicate with the host processor, and those protocols expose a great deal of knowledge about how they operate, even to the extent of giving away hardware trade secrets.
The number of people who would buy Nvidia or ATI cards if open-source drivers were available, but won’t buy them now, is too small for Nvidia or ATI to be willing to take the risk of losing market share to the other over them.
The way I see it, this really doesn’t fix anything but rather exacerbates the problem, and will only encourage manufacturers to keep drivers closed.
I can’t believe Novell is getting bitched at for trying to get the whole Linux driver mess to a somewhat more reasonable point. First of all, it is nowhere said that this new technology is going to be closed source (and I am almost certain that this might turn out to be impossible with all the licensing issues). But most of all, keeping every driver possible in the kernel is a bad idea for ANY OS striving to become mainstream. There is no way that you can provide support for the vast amount of hardware out there, especially since Linux runs on ALL architectures in use today. Not even Microsoft with its thousands of developers can take on such a gargantuan task, and this has nothing to do with principles and licensing politics. Only the guys at Novell, unlike most Linux supporters, actually care about this. Lack of drivers = loss of users = loss of profits, and you can ask Apple to give you an example of this one.
So get off your high horses and give Novell some support for finally bringing some common sense to the equation. No company is going to open-source its drivers, especially if they are for some new hardware. Open-sourcing the drivers means telling the world how your hardware works (more or less), and this is plain stupid from an economic standpoint. And as far as going the route Nvidia and ATI have taken goes… well, how is a smaller company supposed to pay for a 2nd team of driver programmers, and why should they? I think what most of you are forgetting is that Linux still has under 5% userbase, and only about 5% of those 5% are in the desktop market. The only time that you can be guided by principles alone is when you are in the position of Microsoft (i.e. a monopolist).
First of all it is nowhere said that this new technology is going to be closed source
I’m pretty sure Novell’s driver downloading system will be Free Software. The problem is that it will be downloading proprietary drivers.
There is no way that you can provide the support for the vast number of hardware out there especially since linux runs on ALL architectures in use today.
Really? There already is a kernel that does that today: Linux. The only devices that don’t have drivers are those that the makers refuse to provide specs for.
Not even Microsoft with its thousands of developers can take on such a gargantuan task and this has nothing to do with principles and licencing politics.
The reason Microsoft doesn’t write drivers for third-party hardware is because the hardware makers write drivers for Windows themselves, and Microsoft doesn’t care about how those drivers are licensed (that’s the user’s problem).
No company is going to opensource it’s drivers especially if they are for some new hardware.
They don’t have to release anything as Open Source. They only need to provide sufficient specs to the developers so that drivers can be written. The specs can be under NDA (as long as the NDA permits drivers to be released).
The last 20 years, people have kept repeating that FOSS won’t work. It’s a nice idea, but it can never work in the real world. You can’t create a Free OS, that’s impossible. Maybe a compiler, an editor and some tools, but not a complete OS! Well, maybe an OS, but there won’t be any Free Software applications. OK, perhaps a dying company like Netscape will release their already no-cost application, but that’s as far as it goes. Besides, there’s no high-quality FOSS widget toolkit. The FOSS developers are better off using Qt. We should be thankful that there’s a GNU/Linux version of it, and besides, it doesn’t cost anything for non-commercial use; isn’t that good enough? You can’t expect Trolltech to Open Source Qt, it’s their bread-and-butter! It would be like opening MySQL, StarOffice or Solaris. Impossible.
The moral of the story: if you stick by your principles and give the companies incentive to provide specs or open-source drivers, there’s a chance it will happen. If you just bend over, they won’t release open drivers just to be nice to you.
You hit the nail right on the damn head. +(Insert all the points I have here)
“You might as well just go back to Windows. This is at best a bad idea, and at worst vendor lock-in.
There isn’t that much hardware under Linux which isn’t supported. It’s relatively easy to steer away from those that aren’t supported. Eventually when the marketshare for Linux gets large enough, companies will see the light and will open source their drivers.”
First of all, why would it be a bad idea to let people use their hardware under Linux? And it’s just plain bullsh*t that there’s not much hardware that isn’t supported… Just go out there and find out for yourself! I have a printer and a scanner, both of which aren’t supported. I have met quite a few people with at least something that doesn’t work. Sure, it would be easy to steer away from such hardware if you knew everything that isn’t supported, but you can’t always know for certain, and especially when you’re a new Linux convert, you might have a bunch of hardware that’s Windoze-only. And no, they can’t open-source everything. If they use something with a license forbidding it, there’s nothing they can do about it, no matter how big the marketshare is!
“(no, don’t mention ati/nvidia. you don’t _need_ to use the closed ones to get working video. you only require them for newer model cards if you want 3D acceleration which in Linux basically means you can play tuxracer and see fancy screensavers… plus, there is the option of using cards that the stock x.org does provide free DRI acceleration for anyhow)”
Not everyone has the option of choosing which cards to use, and especially if you go out and buy a new PC, you most likely get a PCI-E one. As such, you pretty much can’t use any older, supported cards. Besides, 3D acceleration is used for 3D modeling, CAD, etc. Ever thought of that? And yes, it can be used for gaming too, and you’re not limited to tuxracer. I used to play Morrowind (with Wine), Doom3, Quake4, etc.
“Just go out there and find out for yourself! I have a printer and a scanner, both of which aren’t supported.”
and neither of which have much to do with the kernel… (CUPS + SANE)
“And no, they can’t open-source verything.”
Like someone else mentioned, it’s not the driver they need to open-source (which wouldn’t necessarily be all that useful anyhow, since it’s probably a Windows-only driver anyway); rather, they should release the hardware specs so someone else in the community can do it. It’s not that difficult; plenty of companies already do this. Some, however, are run by folk that don’t understand the technical side of things, and can only understand a loss of control (which isn’t really true anyway).
“Not everyone has the option of choosing which cards to use, and especially if you go out and buy a new PC, you most likely get a PCI-E one.”
Valid point, and sad. I know this because at the college I work at, we had to release the fglrx drivers for our distro (as an option) because the hardware that was purchased used the x300/x600 cards. And guess what, stability is awful. Unless you really need it, I’m now recommending folk use the ati xorg driver instead. But about your last point, Windows games: again, why aren’t you running Windows (at least in dual boot) if that’s what’s really important to you? You obviously don’t care much about the freedom part (or stability) of the OS…
Why are you interested in Linux at all if you don’t mind it turning into a proprietary system somewhat like Windows? Linux’s reason for being is FREE software; if you don’t mind using proprietary software, use Windows or OSX, there is nothing evil in it.
If you let Linux be contaminated with proprietary soft, none of its promises will ultimately become true. There won’t be an operating system where the code is available to everybody and anybody, ready to be compiled for any architecture and to be sliced and diced to fit any application, from a $50 Wifi router to a $50M supercomputer.
If proprietary drivers became the norm, the panorama would soon be that of old, unmaintained, single-architecture drivers that would need to be fished all over the net. All that would be left of Linux would be the deep kernel with no means to communicate with the world but through the simplest interfaces.
This is marketing, nothing more. Other distros already do this; e.g. in *buntu, if you install the restricted kernel modules you’ll get your nvidia drivers downloaded automatically with kernel updates. And I remember a similar mechanism from back in my Fedora days.
I’m not concerned about binary drivers. I understand the dogma surrounding the “proprietary” driver issue, but Andrew Morton’s recent musings about the inability of kernel devs to keep their own drivers working on older hardware raise equal concerns. Do we really want the Linux kernel to become a 50GB tarball supporting every hardware device ever created?
What I am concerned about is Novell establishing some sort of “SuSE-certified” standard for drivers, which is what they’re attempting to do here. Novell’s enterprise products will not be swapping kernels every time a new kernel is released, so is this an effort to encourage driver revisions only when Novell updates their kernel? We already live in a world where software vendors equate Linux compatibility with “supports Red Hat RHEL 3/4”.
If the vendors aren’t going to release specs, then at least take a page out of nVidia’s playbook, because they seem to do it well. But then again, that effectively equates to a series of ndiswrappers running on our kernels, binding to generic binary blobs.
Or… call me crazy, but maybe linux *could* commit to a stable ABI over a set period of time. Would it really, truly be the end of the world if they did?
But then again, that effectively equates to a series of ndiswrapper’s running on our kernels binding to generic binary blobs.
I see ndiswrapper as a last resort. If people would do a little investigation before they buy new hardware, ndiswrapper would be quite obsolete.
For example, the manufacturer of Atheros chips has a vendor matrix on its website which enables you to check whether a certain vendor uses the Atheros chipset.
For example, both SuSE 10/10.1 and Ubuntu Dapper support my D-Link DWL-G520 wireless PCI card out of the box, so to speak.
A stable driver ABI, you say? Something like the Extensible Driver Interface?
http://prdownloads.sourceforge.net/glider-kernel/EDI-3.2.tar.bz2?do…
</plug>
Yawn, I get tired every time I hear “binary-only drivers have to be there… otherwise vendors won’t be interested, bla bla bla”. Give me a break! As one article claimed some time ago, vendors are not that stupid. The problem is much more trivial: they don’t have people who write hardware specifications; they don’t even have proper coders for drivers! All their docs are Windows driver source code, and guess what, even that contains a lot of other source for the different chips on the hardware, because they lack the time and money to code them properly!
Binary drivers are unwelcome in Linux, and have been for a very long time. And it has nothing to do with politics and everything to do with the stability of the whole system. Only Nvidia’s drivers have been capable of bringing my PC to its knees, with hangups several times a day. Not only that, improperly written binary drivers are mostly a nightmare for Windows too, even the more stable versions like 2000/XP.
Any vendor should be pushed to have one good documentation writer who produces specifications. Then anyone, even for Windows, could code a proper driver for the hardware. Most vendors don’t have to expose their precious “IP” (i.e. the inner workings of the hardware) to have a good specification and, therefore, a good driver for their hardware.
About a stable API: AFAIK, no one has forbidden anyone to write a stable API wrapper for binary drivers and support it. Distros could do it; vendors could do it. The problem is that the consensus at the specialist level is that binary drivers should not be there.
As a side note, I would like to point out that there are other kinds of blob problems: for example, firmware distribution, decompressors (for webcams), etc. Usually these are distribution problems, and community-based distros like Ubuntu tend to solve them by simply downloading the blob from repositories, after asking the user for agreement.
The only Linux binary driver I’ve tried that’s worth using is nVidia’s, and even that isn’t in any way guaranteed to continue. Support for many older chips has already become a secondary priority.
Most binary drivers barely work and are released once (for one or two popular distros) and then forgotten about. A few years ago I had to set up a certain Lexmark printer that had no drivers available except for some Redhat and Mandrake RPMs that were themselves a year old. It took me three days to get them to work at all with SuSE and CUPS, which would have been easy had the drivers been written properly in the first place. When the drivers finally did work, the printout quality was appalling, the driver kept crashing constantly and there was absolutely nothing I could do about it.
ATI, Lexmark, Ralink, etc. all show that the kernel devs have a point. Why should we tolerate binaries if so far precisely one company has bothered to do them properly?
So here’s a personal example of how an OSS driver > binary.
Next to me sits a Canon LBP-660 laser printer. The thing is old, almost ancient, tagged with a “Designed for Windows 95” sticker, but it still works fine. The latest drivers available from Canon are for Windows 2000 and, by chance, they work in XP too.
Now what am I to do if I want to run a 64bit Windows? What am I to do if I want to upgrade to Vista? How about if I want to run Linux? Or switch to a Mac? Throw a perfectly good printer away???
The printer has one bad design issue: if the data stream from the driver is interrupted for even one full second because of a heavy CPU load, the printout is doomed. With Canon’s driver this meant making sure all other processes were idle while printing. This was a major headache when the printer was networked.
About a year ago someone finally reverse-engineered a decent Linux driver for the printer. Because of the printer’s design, this driver exhibited the same behaviour under heavy loads.
This time, however, since the driver was open source, I could easily go into the source and insert a call to setpriority(). Now I’ve had almost zero print jobs halt because of heavy loads since the driver has a -19 nice value. This simple modification would have taken a hex editor and (with my skills) quite a few evenings of hacking to do with a binary driver.
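For the curious, a tweak like that really is only a couple of lines near the top of the filter. Here is a minimal sketch, assuming a userspace print-filter process; the function name and placement are my own, not the actual driver’s code:

```c
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <sys/resource.h>

/* Ask the scheduler to favor this process so the data stream to the
 * printer is less likely to be interrupted under heavy CPU load.
 * -19 is the most favorable nice value; raising priority normally
 * requires root (or CAP_SYS_NICE), so failure is reported but
 * tolerated. */
int boost_priority(void)
{
    if (setpriority(PRIO_PROCESS, 0 /* 0 = this process */, -19) != 0) {
        fprintf(stderr, "setpriority: %s\n", strerror(errno));
        return -1; /* not fatal: printing still works, just unprotected */
    }
    return 0;
}
```

You’d call this once, before the filter starts streaming the job to the printer. With a closed binary, the same two-minute change would indeed mean hex-editing someone else’s machine code.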
It doesn’t solve any problems; in fact, it exposes new ones. In particular, it encourages vendors to write closed-source drivers for Linux, in addition to the slew of bugs, inconveniences, incompatibilities and instability we have come to expect from binary drivers on Linux. Linux users are going to be worse off in the long run from this move by Novell. Why do you think the Linux kernel hackers are strongly opposed to closed-source binary drivers? Their reasons are technical, practical and philosophical, and on all counts they are sound. Novell is new to free and open source development, thus this short-sighted move isn’t surprising. Here’s hoping Novell makes less boneheaded decisions in the future.
I’m interested in what the kernel developers think about this. Let them decide whether to use it or not. Novell shouldn’t continue on its own with this.
I’m interested in what the kernel developers think about this.
We don’t all think the same thing.
Put me in the pragmatic “eh, nvidia ain’t gonna make an open source driver for their hardware possible, so may as well make it easier for end users” camp.
Put others in the purist “if it ain’t got an open source driver, don’t use it” camp.
Hard to tell how many of each there are, as the purists tend to shout loudly.
And so, yet again we have the line trumpeted that “Linux will never succeed unless there’s a stable kernel API… Linux needs closed-source drivers”. Just one question: why?
I can understand the situation with modern 3D cards, and why neither Nvidia nor ATI wants to release open-source drivers. But I don’t see how the same situation applies to printers or wireless network cards, for example.
So please could somebody explain the following to me: if HP can release full, open-source CUPS drivers for all of their printers, why can’t other manufacturers?
If Intel can release full, open-source drivers for their Centrino wireless chipset, why can’t Broadcom or Texas Instruments?
I just don’t understand.
If Intel can release full, open-source drivers for their Centrino wireless chipset, why can’t Broadcom or Texas Instruments?
Not “can’t”, “won’t”.
Intel doesn’t sell any actual radios, (they sell the chipsets to people who put them in radios,) so they can easily avoid the regulatory problems associated with open source drivers.
Broadcom and TI do sell actual radios. And both sell into the telephony market, which has much stricter regulatory requirements than wi-fi market.
They both decided, long ago, to simplify their life with respect to regulatory agencies, by limiting who can modify the code that runs on their radios.
Intel doesn’t sell any actual radios, (they sell the chipsets to people who put them in radios,) so they can easily avoid the regulatory problems associated with open source drivers.
I’m not disputing this, I’m genuinely curious: why would there be “regulatory problems” with open-source drivers?
Intel doesn’t sell any actual radios, (they sell the chipsets to people who put them in radios,) so they can easily avoid the regulatory problems associated with open source drivers.
I’m not disputing this, I’m genuinely curious: why would there be “regulatory problems” with open-source drivers?
This varies by jurisdiction, so let me use the US, since I’m familiar with how the FCC operates.
The FCC regulations for a radio used in a particular “service” are pretty specific about what a radio that is supposed to operate in that service is allowed to do.
Wi-Fi devices are low powered “part 15” devices, which means, basically, that the manufacturer merely has to claim that they satisfy those rules, and no testing is involved.
Cellphones are not “part 15” devices. They have to be tested and shown to comply with rules about what frequencies they operate on, what power they emit, what kind of signals they produce and so forth. Once tested, they can be sold in the US. (They become “type accepted”.)
Broadcom and TI both make radios that, depending on their software, will comply with the US regs, or will comply with other countries’ regs that differ from the US’s. It is a lot easier to get the FCC to approve such radios for US use if no one but the manufacturer knows how they operate and only the manufacturer can control the software that goes in them.
If I want to build a cellphone using a Broadcom radio, I have to spend a lot of money, sign a serious NDA, and let Broadcom do most of the interface design. Then I have to pay a testing lab a lot of money per hour to put the result through a rigorous test. If I can say to the lab “only Broadcom knows how to change the radio parameters”, the tests are easier, and they cost less money and time.
So, basically, it’s a case of “security through obscurity.”
Google for “cognitive radio” if you want to see more about the FCC’s paranoia on this subject.
>> Just one question: why?
So hardware manufacturers don’t have to waste time rewriting or, worse, recreating their drivers… or hoping somebody in the community will do it, just because of yet another change in the internals of the kernel.
Seriously, you want to ask the question why, how about why would a hardware vendor selling products with $29.99 to $59.99 price points (like wireless cards) have ANY interest in supporting an OS that has in the past three years ALONE required three radically different codebases just to have support? Much less that there’s no guarantee that next year your drivers won’t be broken AGAIN by the next kernel release?
It’s why I get a laugh out of people wondering why companies don’t release drivers for five year old devices that cost <$50 at retail – duh. It costs MONEY to make drivers without giving away how it works.
and that’s what opening up the hardware interfaces IS – giving away trade secrets; ANY arguement to the contrary falls into the provinces of ignorance, naivete or just plain wishful thinking.
Of course, companies wanting to make money on their products and supporting their products, much less PAYING programmers to work on them MUST be evil, right folks? ([i]That’s dripping sarcasm for those of you NOT from New England[/]i)
>> So hardware manufacturers don’t have to waste time rewriting or worse, recreating their drivers… or hoping somebody in the community will do it just because of yet another change in the internals of the kernel.
>> Seriously, you want to ask the question why, how about why would a hardware vendor selling products with $29.99 to $59.99 price points (like wireless cards) have ANY interest in supporting an OS that has in the past three years ALONE required three radically different codebases just to have support? Much less that there’s no guarantee that next year your drivers won’t be broken AGAIN by the next kernel release?
>> It’s why I get a laugh out of people wondering why companies don’t release drivers for five year old devices that cost <$50 at retail – duh. It costs MONEY to make drivers without giving away how it works.
Except that the argument that they’ll have to keep updating their driver for every kernel release only applies if they have a closed-source driver. If they release a GPL driver that’s accepted into the main kernel, then they can just release the code “into the wild” and never have to worry about it again.
And I can understand that hiring a programmer to do the work might not be justifiable. But how much does it cost to get the legal dept to draft an NDA, and send some sample hardware and the spec document to a willing volunteer? I believe this is what Creative does with their soundcards, and it seems to work well.
>> and that’s what opening up the hardware interfaces IS – giving away trade secrets; ANY argument to the contrary falls into the provinces of ignorance, naivete or just plain wishful thinking.
I completely accept that this is the case when it comes to 3D cards. I certainly don’t accept it when it comes to other components. Otherwise, why would HP or Intel or Ralink or Prism (to take some examples off the top of my head, I’m sure there are others) release GPL drivers? Why would Creative help in the process of getting drivers written for their new Audigys?
What have Lexmark got to fear that HP have not?
>> Of course, companies wanting to make money on their products and supporting their products, much less PAYING programmers to work on them MUST be evil, right folks? (That’s dripping sarcasm for those of you NOT from New England)
I’m from Old England. There’s nothing you can teach me about sarcasm.
Anyway, I don’t really see what you’re getting at here. The company will make the same amount of money from me buying the hardware whether I use it on Windows or Linux.
And as I’ve said, they don’t even have to pay a programmer to work on it if they don’t want to. E-mail whoever it is that’s trying to write a reverse-engineered open-source driver (you can bet there’ll be somebody). Offer him the specs document (or, if that doesn’t exist, the source to the Windows driver) under a strict NDA. Job done.
>> So please could somebody explain the following to me: if HP can release full, open-source CUPS drivers for all of their printers, why can’t other manufacturers?
Uhm, because HP hasn’t made a change to how their printers are interfaced in close to a decade? HP-GL and/or PCL were well documented back when most open source fanboys were still in diapers.
>> If Intel can release full, open-source drivers for their Centrino wireless chipset, why can’t Broadcom or Texas Instruments?
Because Intel created a fixed hardware interface implementation for multiple devices, because they can afford to have their products cost more and/or to spend the extra money on making radically different hardware use the same low-level interfaces…
While companies like Broadcom and TI often change the hardware specifications and interfaces completely between models to cut costs, improve speed, or simply to try something different. Technically, it’s the hardware equivalent of the kernel API situation – which is why, when you combine a kernel API that’s not fixed with hardware specifications that aren’t fixed… bad things tend to happen.
With the ‘nv’ driver I have two LCDs and a TV at the same time, not much hassle.
So when the driver crashes your system or has problems, can I debug it? If you get a crappy driver in Windows, what can you do about it?
I’d rather buy hardware that I know will work with the kernel and open source drivers. I wonder whether people even consider looking for hardware supported by the kernel, rather than blindly buying it and hoping it will work.
For those interested, Novell Open Audio had a talk with the people behind the drivers announcement:
http://www.novell.com/podcast/Detailpage.jsp?id=62
Only time will tell where between awesome and lame this falls. On the one hand, I’m thrilled my D-Link wireless card might now work without me trying four or five different kernel patches and/or emulators, and that I might be able to purchase PC hardware on impulse now. On the other hand, I fear this might deter some companies from writing “real Linux drivers”.
I wonder how these drivers will fare performance-wise compared to kernel modules, especially i686 kernels on a separate partition at the front of the HDD like mine =) I also wonder if this new API will be able to handle winmodems and the like.
All in all I think this is pretty cool, but certainly not the be-all and end-all solution to hardware support on Linux. After all, some companies don’t provide Linux drivers because they don’t feel there are enough Linux users to warrant the effort. If this new API attracts new users, however, it is a step in the right direction.
Novell published a Linux DDK but insisted on GPL’d drivers? This would be good. Many vendors would consider open-sourcing their drivers given a stable, user-friendly API. Blobs? Nooooo!!!!!
Binary drivers are clearly at odds with the Linux kernel, and the differences are irreconcilable, even with this system. Read why here:
http://www.kroah.com/log/linux/stable_api_nonsense.html
In short, making binary drivers work with the Linux kernel is a game of mix-and-match. Vendors need to make sure their binary drivers will work on multiple architectures, kernel versions, and kernel configurations. Novell’s new system makes it easy for the user to find the version of the driver appropriate for the particular setup, but doesn’t appear to make it easier for vendors to make all the different versions necessary.
However, Novell is offering to automate the process by obtaining the proprietary driver source code under NDA and putting it in their build tree. This is where it stops being a matter of pragmatism and starts to become a bona fide fork of the Linux kernel. Why should Novell be the steward of “closed Linux”? This means that closed Linux will only be built to link with Novell-built kernels. In fact, I think closed Linux is a violation of the GPL, because even though Novell isn’t distributing a derivative work containing non-GPL code, they are distributing non-GPL code that might only link with the kernels they distribute.
The slow death of the community begins when vendors can supply their proprietary code to certain chosen Linux vendors and have it automatically work with those distributions’ kernels. The community can’t be trusted with the code, so community distributions won’t be able to pull the proprietary drivers into their kernel build trees. It’s easy to shrug this off as purist paranoia and to say this “proprietary kernel tree” stuff won’t discourage vendors from providing open drivers. But what’s happening here is the creation of a class system in the Linux community, where commercial Linux vendors differentiate their distributions not based on innovative value-add software and services, but on the fact that they can enter NDA agreements with hardware vendors.
The main reason why proprietary drivers shouldn’t belong in the Linux kernel is because most of them are crappy and can crash the kernel. But I don’t need to rely on this fact to argue against this Novell proprietary driver initiative. If people want to install proprietary drivers, and if hardware vendors want to provide/support them, then that’s their choice. But Novell wants to provide and support these drivers on behalf of the hardware vendors, greatly reducing the vendors’ desire to continue to provide and support them on their own for non-Novell kernels. Then it stops being a matter of choice. There simply won’t be any proprietary drivers unless you choose Novell.
So, ironically, if you really like proprietary drivers, and if you think all Linux distributions should accommodate them, then you should be absolutely outraged at this initiative.
There wouldn’t be any problem under the BSD licence.
Novell should be slapped for the press release being written in marketspeak. Most of the people in this discussion need to be sent back to remedial reading programs – I doubt anyone ranting about binary drivers destroying Linux actually read Novell’s site on the technology:
http://developer.novell.com/wiki/index.php/FAQ_for_the_Partner_Linu…
How does the Partner Linux Driver Process relate to the kernel community?
As an active member of the open source community, Novell’s position is clear: The best place for partners to develop kernel drivers is upstream in the kernel.org source tree, where kernel driver code benefits from thorough review and community involvement. Novell promotes having all Linux device drivers be a part of the official kernel.org source tree. However, we recognize that some drivers are not there yet or have been integrated only after a kernel release has happened. For this case, we offer a way to get a supportable and certifiable driver anyway using the Driver Process described here.
In short, what they’re trying to get rid of is me having to sift through the web to download driver “X” for POS hardware “Y” that PHB “ID10T” decided we absolutely have to support, and then praying to deity, or deities, unknown, that the particular version I found will compile and survive a modprobe without taking down my entire system.
In short, they’re creating a process whereby if I have part Y, and kernel Z, driver X will be available, and will just work.
This dovetails nicely with their openSuSE “Build system”, which streamlines the process of packaging your application work with their (and a few other) distros. Strangely, I don’t hear people screaming about THAT bringing about the downfall of civilization.
Yes, they’ll be distributing binary modules, just like your favorite Linux distribution does. Nobody said they would all be closed source – in fact, Novell makes it quite clear they’d rather it *wasn’t* closed source.