Mesa 7.5 has been released. The main new feature is the Gallium3D infrastructure, a new architecture for building 3D graphics drivers. Phoronix has more details: “Mesa 7.5 also brings support for several new OpenGL extensions, reworked two-sided stencil support, updated SPARC assembly optimizations, initial support for separate compilation units in GLSL compiler, and various other fixes and optimizations.”
Gallium3D is really important for Linux.
Now I only hope Intel will somehow open up “their” PowerVR drivers.
You can live without FOSS Nvidia drivers, but Intel is so much bigger and more important. It needs to be supported out of the box.
How is this really important for Linux?
OpenGL 3.1 support in the Nvidia and AMD drivers is nearing release, along with OpenCL 1.x support.
How is Gallium3D important for Linux’s future?
Having a single consistent library that all drivers hook into makes life easier for developers (no more dealing with vendor-specific bugs in their OpenGL implementations) and also gives the freedom to support hardware on non-x86 platforms if required. With a single library, all developers can link against that one; it also ensures there is no duplication in driver support, and optimisations carry through to all graphics cards, so all yield the benefit.
Avoiding code replication (and simplifying the drivers) is just one benefit, although it is a significant one; driver quality (and quirkiness) have long been problems for OpenGL, and this should help. The other big plus, though, as I understand it, is that this will make implementing other interfaces to the graphics hardware much easier: if someone wants to implement OpenCL or OpenGL ES (or D3D, or any other crazy thing that uses the graphics card), they can just create a state tracker using the Gallium3D interface, and any card with a driver that exposes the Gallium3D interface will instantly support this new nifty API. Imagine if someone could implement D3D against a simplified, abstract interface, without having to worry (much, hopefully) about hardware quirks or how they’re talking to the card, and every platform with a Gallium3D driver instantly offered D3D acceleration under Linux, with no changes to the drivers themselves required. That would be nifty.
Because those drivers are reinventing all the work and they all diverge and create different bugs and issues. They all implement various specifications to varying degrees and ATI’s drivers are basically thrown out every month to see what sticks to the wall. They’re exceptionally poor. A great deal of features are simply not available in IGPs.
Now, if all of that code didn’t have to be reimplemented in different drivers, then you’d have fewer bugs, drivers that could support far more features, and drivers that wouldn’t make certain chipsets second-class citizens.
As far as I can tell, the open source drivers are starting to converge. There is kernel mode setting for Intel chips, and lately it has also been submitted to the kernel for AMD/ATI chips (kernel mode setting may make it possible to run X as a non-root user). There are two different new memory managers for these two sets of chips, but they now use a common interface (the one from GEM), and these too may soon be moved into the kernel. The reverse-engineered open source driver for Nvidia chips is also starting to gain some of these features, but it is well behind due to the lack of information from Nvidia.
Anyway, at the current rate of improvement, at least for AMD/ATI and Intel chipsets (where there is documentation), there should be a stable, reasonably-performing set of drivers with a common API within a few months’ time.
There has been more progress in open source 2D/3D graphics drivers for Linux in the past few months than in all the time before.
References:
http://www.phoronix.com/scan.php?page=article&item=amd_new_linux_st…
http://www.phoronix.com/scan.php?page=article&item=ubuntu_ati_kms&n…
Support for Intel chipsets might be important, but I feel they won’t really help the cause of 3D on Linux. Even the latest X4500HD is a poor performer, especially for shading operations. Good enough for fancy desktop effects, but barely useful for serious gaming.
I wasn’t really familiar with Gallium3D. It looks quite interesting, yet it seems a bit ambitious to promise OS (esp. non-UNIX) and API independence when most open drivers struggle with OpenGL 2.0 rendering. Of course, 3.1 is simpler, thus easier to implement… Guess we’ll see how it turns out in a few months.
Serious gamers aren’t getting Intel graphics cards, and desktop effects are becoming more and more prevalent on all operating systems. I had to spend hours getting my X4500 working properly with desktop effects, so this sounds like it’ll be a nice step forward when it starts hitting distros.
They don’t use Intel graphic chipsets, yet they are the only ones with decent open drivers! Now, I am quite satisfied with the proprietary nVidia drivers, but many people seem to have technical/philosophical issues with them.
From what I understand, the issue with developing open 3D drivers is not the number of APIs to support, but the lack of specifications. Making development easier is a good start, but it doesn’t solve the main issue.
Gallium drivers are much easier to write, port, and maintain which is an important improvement over what has gone before. In addition, open documentation for 3D hardware is much more prevalent today and is no longer the main issue.
It will be nice when we can fully retire the practice of duct taping closed-source binary blobs onto the side of open-source systems; Gallium moves us closer to that goal.
Not ambitious at all. Nowadays the drivers have to implement everything, that’s why most of them struggle to survive. With Gallium3D, only one implementation is necessary. The drivers just have to make the bridge between the hardware and Gallium3D.
It makes everything simpler and easier. The cross-platform thing is just a bonus.
I completely understand the point of Gallium3D, but it won’t progress far until there are more specs available. Reverse engineering is really a pain. I believe the radeon driver is getting somewhere, but I’m not sure about nouveau.
Of course, there is nothing preventing video card manufacturers from writing drivers targeting Gallium3D. I am not sure those manufacturers are willing to change their current infrastructure, but I guess we will see.
Indeed, the radeon rewrite driver recently hit the “3D glxgears” milestone.
http://www.phoronix.com/scan.php?page=news_item&px=NzM4Mg
There is still quite a bit of work to do … but this step represents the “hey, it is starting to actually work now” milestone.
Give it another few months and there should be usable versions beginning to become available.
When it is fleshed out, debugged and polished, this open source driver (in combination with the ATI cards it supports) will become the most capable and best-performing open source graphics system available.
It’s my understanding that the OpenGL 2.0/2.1 APIs are a major pain in the ass to implement fully (and with decent performance); I wouldn’t be surprised if hardware vendors decide that it’s a very good thing to be able to make implementing them someone else’s problem. And we’ll all probably get better drivers for it.
Major hardware vendors already have their own implementations of OpenGL 1.x/2.x/3.0, so I don’t think that’s an issue. Furthermore, Gallium3D will have bugs.
The more I read about Gallium3D, the more it seems like a great idea. Yet, corporations tend to be quite conservative, so I don’t expect they will embrace the technology… If they do, they will surely keep their contributions for themselves, given the MIT licence.
OK, I just noticed that I didn’t finish my last message.
Gallium3D will have bugs, just like the implementations from the hardware manufacturers. However, it might take some time before Gallium3D gets better than these implementations.* Let’s not forget about the Not Invented Here syndrome, which led to the reinvention of the wheel so many times…
Now, I might seem pessimistic. Well, I have used Linux-based OSes for about 12 years (daily usage since late 2003). I have seen a lot of hype that never materialized. Gallium3D sounds cool, but I’ll wait until it’s usable by the average Joe with a “real” graphics card before getting psyched up.
*: I am not really familiar with Mesa3D’s implementation. Back in the day, it didn’t impress me on my R200. However, I just got another computer with an X4500HD, so I guess I’ll find out soon.
To Intel (poulsbo) … Open it up!!!!!!!!!!!
To VIA … I see your point but please give more documentation (especially shaders)
Anyway, great news and congrats.
Who really owns a machine with Intel GMA 500 graphics? Almost nobody. So I don’t think the issue is that big.
Intel probably also can’t release the drivers, because the PowerVR core is licensed.
I don’t know if anyone’s actually tried to compile Mesa 7.5 from source code (I was doing it on an HP-UX machine), but it’s a bit of a broken mess at the moment:
* As usual, plenty of new gcc-isms have been thrown into the Makefiles and C code, making it less portable than it should be.
* Far too many copies of exactly the same long #if are peppered through the includes/source:
#if defined(PIPE_OS_LINUX) || defined(PIPE_OS_BSD) || defined(PIPE_OS_SOLARIS)
which should be replaced by a symbol like PIPE_OS_UNIX and also include other compatible UNIXes like HP-UX or AIX.
* When building a static library, the Makefiles don’t exclude -L or -l flags from being passed to ar, which causes havoc.
* The Makefiles and, more critically, many of the demo programs refer to the GLEW includes and library which are not included in the Mesa 7.5 package. From what I can tell, the source code for GLEW is supposed to be in src/glew, but it isn’t in any of the 3 Mesa*-7.5.tar.bz2 source packages.
My guess is that you’re supposed to unpack GLEW into that dir, but nowhere in the Mesa source or on their Web site does it tell you to do that (the download page specifically says that there are only 3 packages needed for Mesa, which is apparently now wrong).
IMHO, the source changes for this release justified a move to 8.0 in my books: it includes a large gallium tree that’s brand new to the package and hasn’t been anywhere near tested on the same platforms as previous Mesa releases (I got 7.4.4 running on HP-UX in no time at all, since the changes were tiny compared to previous releases), plus the new, accidentally undocumented dependency on GLEW (which it seems needs to be unpacked “in situ” if the Makefiles are to be believed).
AIX, HP-UX, BSD devs aren’t contributing to the development so I guess they have to take what they get and be happy about it.
That’s bullshit – I don’t know about the other BSDs, but FreeBSD’s Eric Anholt is a major contributor to X.org and its drivers (mainly intel, but he works on ATI and supporting technologies – DRM2, UXA, etc.).
http://anholt.livejournal.com/
http://people.freebsd.org/~anholt/
Thanks for the information. I was half expecting the guy to also take credit for XDarwin as well.
I think you may wish to check your facts on that… there certainly are BSD developers who’re working on various parts of X, and not just to integrate it into their systems.
Also on the x.org homepage there are thanks going out to Sun, HP and others for hardware, financial and other support.
Edit (for a link):
http://www.x.org/archive/X11R6.8.0/doc/RELNOTES6.html