The 8th annual Ottawa Linux Symposium kicked off Wednesday at the Ottawa Congress Centre in Ottawa, Canada. Jonathan Corbet, co-founder of Linux Weekly News, opened the symposium with “The Kernel Report,” an update on the state of the kernel since last year.
From the article:
“Kernel developers often lack the hardware needed to fix bugs, and so the bug-fix process can require extensive back-and-forth exchanges of tests and results. This process is very slow, and oftentimes one party or the other gets bored of the process and the bug remains in the kernel.
“Another problem, Corbet says, is that there is no boss to direct bug-fixing efforts unless there is a corporate interest in fixing a bug somewhere and that company puts the resources into getting specific bugs fixed. Kernel developers are also often reluctant to leave their little corner of the kernel, he noted.”
This is pretty much in sync with comments Andrew Morton previously made about bugs in the kernel for older hardware that simply weren’t being addressed due to either lack of resources or developer apathy.
I think this provides an interesting counterpoint to the position the devs have taken regarding driver support. The basic message has been that hardware specs should be left open, and the drivers will be written and maintained by the devs in the kernel tree.
I understand the reasoning behind that stance and the discouragement of closed binary drivers. I’m not in total agreement, though, and I think this underscores why: when does the kernel hit a saturation point where the existing code becomes too cumbersome to maintain, let alone troubleshoot or repair?
I’m not a developer, but it seems to me that the kernel devs themselves face the same challenges vendors do in supporting the Linux kernel through changes in its APIs. Changes will lead to things breaking, and although the devs are obviously closer to the process, it’s evident that this model is not the utopian model of stability and support it was implied to be.
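To make that concrete, here’s a minimal, purely hypothetical sketch of the version-conditional gymnastics an out-of-tree driver ends up doing whenever an internal kernel API changes shape. None of the “widget” names below are real kernel symbols; they’re invented stand-ins for some registration API whose signature changed between releases. Only the LINUX_VERSION_CODE/KERNEL_VERSION macros from <linux/version.h> are the real thing.

```c
#include <linux/init.h>
#include <linux/module.h>
#include <linux/version.h>

/* Everything named "widget" here is invented for illustration; these
 * stubs stand in for an in-kernel registration API that changed
 * signature between releases. */
struct widget_ops { int dummy; };
static struct widget_ops mydrv_ops;

static inline int old_register_widget(struct widget_ops *ops, int flags)
{ return 0; }
static inline int new_register_widget(struct widget_ops *ops, int flags,
				      struct module *owner)
{ return 0; }

static int __init mydrv_init(void)
{
#if LINUX_VERSION_CODE < KERNEL_VERSION(2, 6, 16)
	/* Older kernels: the registration call took two arguments. */
	return old_register_widget(&mydrv_ops, 0);
#else
	/* Newer kernels: the hypothetical API grew an owner argument,
	 * so every out-of-tree driver has to follow along or stop
	 * building. */
	return new_register_widget(&mydrv_ops, 0, THIS_MODULE);
#endif
}
module_init(mydrv_init);
MODULE_LICENSE("GPL");
```

In-tree drivers get fixed for free when someone changes an API; everyone else inherits this kind of conditional compilation, multiplied by every internal interface the driver touches and every kernel release.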
As more and more hardware and components get supported at the kernel level, won’t the maintenance of all that code start to draw important resources away from actually developing the kernel further? And how are the kernel devs expected to maintain better driver compatibility than the vendors themselves when in many cases they don’t even have access to the hardware for testing and troubleshooting?
I understand the importance of stable, well-written drivers to the stability and security of the platform. But I think a rational middle ground needs to be found between the overly optimistic view of a community that wants total control of driver development and the deeply cynical view that vendors are money-grubbing, profit-loving providers of the lowest-quality code they can get away with, whose involvement will lead to the death of Linux.
If Linux is going to really expand into a mainstream platform, the devs are going to have to start trusting the hardware vendors at some point and start sharing the development load. They’re also going to have to trust the free market here: ultimately, hardware that provides a high level of Linux support will win out over crappy hardware with poor drivers. Look at how many people consciously seek out nVidia, or even Intel, over ATI for a GPU, for instance.
Not trying to stir the pot or anything, but I think it’s something that needs to be addressed, probably sooner rather than later.
Of two recent security fixes, one was for a one-year-old flaw, and the other for a three-year-old flaw.
*smiles knowingly*