3D graphics on X11: XGL vs AIGLX. This article delves into the inner workings of XGL and AIGLX. It shows that there are many similarities between these two competing/co-operating “rivals” and plenty of room for growth.
Hopefully this will be out in some distros before I go nuts waiting for it.
We’re already playing with XGL/compiz on a modified Slackware; Ubuntu and Gentoo have it going too.
Yes, I’ve got XGL working on Gentoo. It’s very unstable, but it all works, wobbly windows and all. http://gentoo-wiki.com/HOWTO_XGL
Ubuntu, Gentoo and SUSE have XGL in their testing branches.
I am personally using it on Ubuntu Dapper and it works beautifully.
Never had a more beautiful desktop.
If you use Ubuntu, check this forum entry:
http://ubuntuforums.org/showthread.php?t=131267&highlight=xgl
Enjoy eye candy.
I also tried it out on Dapper and the effects are beautiful. However, it frequently freezes up X completely and I have to switch off the machine. I have an ATI Mobility X300 and use the fglrx driver. A shame really, so it's not for daily use at the moment (well, not for me anyway).
I'm currently using Xgl with Ubuntu Dapper full-time on an old Athlon 650MHz with 256MB RAM and a GeForce2 MX (32MB), and it runs quite nicely. I haven't had a problem with crashes so far.
This old machine is kicking some serious ass!
AIGLX will be ready in a couple of months since it mostly uses the existing infrastructure, probably with the X11R7.1 release scheduled for April. Xglx is a more long-term goal.
Some healthy debate, more options and something we’ve all been waiting for? No complaints from me.
Actually, the article was pretty well balanced and written; it debunked a lot of what I thought was going on.
I was surprised to learn how extensive Novell's work on their implementation had been. While I have yet to try them both, I'm going to stick with the Red Hat version; more of a drop-in solution?
And while it was impressive, you can see it has a way to go yet.
The effects were great, but I won't be using it until it supports Xinerama.
One thing I didn't like about it (this may be configurable in the NVIDIA driver) is that fonts on the GL textures (while dragging a window, initially opening a menu, etc.) are different from those on static windows. There's a noticeable difference between the fonts on a window while I'm moving it and once it settles into place. I tried it on Mac OS and there's no such problem.
And as for Red Hat's argument that this approach is too big a change: that's just stupid, since the work has already been done, it's more backwards compatible than their solution, and Xgl looks a lot better than theirs.
I noticed that font transition problem too.
Using this approach (compositing as bitmap transformation), it will be hard to avoid font antialiasing artefacts, depending mostly on what a given effect does. It might take a lot of tweaking to get it smooth, e.g. before a window settles down, do a pixel interpolation transition to the steady state (or anything that looks smooth to the human eye).
You can fix the font issue by opening gconf-editor and changing the following value:
Apps → compiz → general → all screens → options → Texture_filter
Set the value to best; this will give you perfect fonts while the screen is wobbling.
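For anyone who prefers doing that programmatically, the same setting should be reachable through the GConf C API. A minimal sketch; note the key path is my guess at how the gconf-editor path above maps onto an actual key (check gconf-editor on your system), and you'd build it against the gconf-2.0 development headers:

```c
#include <gconf/gconf-client.h>

int main(void)
{
    g_type_init();  /* initialise the GObject type system before using GConf */

    GConfClient *client = gconf_client_get_default();

    /* Assumed key path for the Texture_filter option shown above. */
    gconf_client_set_string(client,
        "/apps/compiz/general/allscreens/options/texture_filter",
        "Best", NULL);

    g_object_unref(client);
    return 0;
}
```

If compiz is using its GConf settings backend, it should pick the change up live rather than needing a restart.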
If you don’t have two gfx cards I do believe it is absolutely impossible to run it on two monitors.
I’d read the article, but the site is down….
I have XGL running with NVIDIA's TwinView here, on one GFX card with 2 monitors attached.
The article states that direct rendering is possible with AIGLX, while it is much harder to do with XGL.
I doubt that this is true. If you run a compositing manager and a direct GL program at the same time, both the X server and that program access libGL and the 3D driver directly. This is what should happen in both the XGL and AIGLX scenarios. (If you turn off AIGLX-based compositing, you end up with a traditional X server, where you can run direct GL programs today.)
The same improvements to driver frameworks are needed to get this working for both 'solutions' (e.g. GPU memory management and scheduling). Also, extensions are needed to have the framebuffer of a direct GL program redirected to offscreen memory, without the program knowing it, and available (shared) for compositing (I'm not sure texture_from_pixmap suffices). Fullscreen is easier, however, because you can suspend XGL/AIGLX and let the program have exclusive access to the 3D interface.
I doubt that this is true. If you run a compositing manager and a direct GL program at the same time, both the X server and that program access libGL and the 3D driver directly. This is what should happen in both the XGL and AIGLX scenarios. (If you turn off AIGLX-based compositing, you end up with a traditional X server, where you can run direct GL programs today.)
Nope. XGL does have this problem; AIGLX does not. The difference lies in where the rendering solution resides.
XGL is a separate server and can't take control over a direct GL program.
AIGLX is inside the X server, and when enabled, AIGLX controls the GL (even direct GL programs talk to it without knowing it). Nothing is simpler than redirecting their commands if needed. This was the whole reason for AIGLX, or at least the most important one.
No.
AIGLX can't redirect direct GL programs, i.e. their GL API calls. It (= the compositing manager on AIGLX) can only use a dedicated OpenGL extension to redirect their screen output to an offscreen buffer and read it back so it can be composited with the rest of the desktop. So it requires special driver support, in the form of GL extension(s) (I'm not sure this exists yet). There is no reason Xgl couldn't in the same manner allow that program to access libGL and do its direct rendering, while redirecting output using the described mechanism (this is analogous to the XComposite extension, only it happens in the driver instead of the X server).
The big problem here is memory management. When both the compositing manager and another app use video memory, it will soon become too fragmented, and thus not very usable for new allocations. Some GPU scheduling might also come in handy if you want a smooth desktop experience while something else is choking the GPU.
An AIGLX-based compositing manager, in the end, redirects all X server windows to offscreen buffers (that's what XComposite is for) and uses accelerated GL calls to manipulate them and compose the final screen. This is more similar to the XGL way than many people think.
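To make that concrete, here is roughly what that shared redirection path looks like from a compositing manager's point of view: XComposite sends a window's output to an offscreen pixmap, and the GLX_EXT_texture_from_pixmap extension lets the compositor bind that pixmap as an OpenGL texture. A sketch only; error handling is omitted, and the EXT entry points would really be fetched with glXGetProcAddress:

```c
#include <X11/Xlib.h>
#include <X11/extensions/Xcomposite.h>
#include <GL/gl.h>
#include <GL/glx.h>

/* Entry points from GLX_EXT_texture_from_pixmap; real code obtains
   these via glXGetProcAddress at startup. */
typedef void (*BindTexImageEXT)(Display *, GLXDrawable, int, const int *);
typedef void (*ReleaseTexImageEXT)(Display *, GLXDrawable, int);

void composite_window(Display *dpy, Window win, GLXFBConfig fbconfig,
                      BindTexImageEXT bind_tex_image,
                      ReleaseTexImageEXT release_tex_image)
{
    /* Ask the server to render this window into an offscreen pixmap
       instead of onto the screen: this is what XComposite provides. */
    XCompositeRedirectWindow(dpy, win, CompositeRedirectManual);
    Pixmap pixmap = XCompositeNameWindowPixmap(dpy, win);

    /* Wrap the pixmap in a GLX drawable that can back a texture
       (attributes from the EXT_texture_from_pixmap spec). */
    const int attribs[] = {
        GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
        GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGBA_EXT,
        None
    };
    GLXPixmap glxpixmap = glXCreatePixmap(dpy, fbconfig, pixmap, attribs);

    /* Bind the window contents as a GL texture; the compositor can now
       draw it scaled, translucent, wobbling, or anything else. */
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    bind_tex_image(dpy, glxpixmap, GLX_FRONT_LEFT_EXT, NULL);
    /* ... draw a textured quad where the window should appear ... */
    release_tex_image(dpy, glxpixmap, GLX_FRONT_LEFT_EXT);
}
```

As far as I understand it, this is the same extension compiz already relies on under Xgl, and the one AIGLX exists to accelerate in the stock server.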
Yep, you're correct here. I'm gonna try to restate my point the way I originally meant it in my head.
It seems to me that what is more natural for one is more hackish for the other. AIGLX can do the same thing as XGL, just in a more politically correct way.
Just wondering, since you mention scheduling requests: aren't XFixes and XDamage bound to become part of the complete picture in the end? As I recall they haven't really been active yet. Those two are intended for this kind of work, aren't they?
I think X protocol extensions don’t have much to do with 3D driver specifics (it’s another layer).
Maybe there already is some sort of GPU request scheduling in DRI or some proprietary Linux drivers and I'm just too ignorant to find out (I do know Microsoft proposed something like that in the Vista driver model, because they need it for Avalon).
I think X protocol extensions don’t have much to do with 3D driver specifics (it’s another layer).
They do provide a way to limit redraw requests. But you're probably right; they're probably too high-level to provide support for legacy direct GL apps.
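To illustrate the part they do handle well: a compositor uses XDamage to find out which windows actually changed, so it only recomposites those instead of redrawing everything every frame. A rough sketch of the client side (error handling and the actual redraw omitted):

```c
#include <X11/Xlib.h>
#include <X11/extensions/Xdamage.h>

/* Sketch: get told when a window changes so we recomposite only then. */
void watch_damage(Display *dpy, Window win)
{
    int damage_event, damage_error;
    if (!XDamageQueryExtension(dpy, &damage_event, &damage_error))
        return;

    /* One event per batch of damage, not one per dirty rectangle. */
    XDamageCreate(dpy, win, XDamageReportNonEmpty);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == damage_event + XDamageNotify) {
            XDamageNotifyEvent *de = (XDamageNotifyEvent *)&ev;
            /* Mark the damage repaired, then re-texture and redraw
               just this window in the composited scene. */
            XDamageSubtract(dpy, de->damage, None, None);
        }
    }
}
```

But as you say, this all operates at the level of windows and regions; it knows nothing about the GL command stream of a direct-rendered client.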
Maybe there already is some sort of GPU request scheduling in DRI or some proprietary Linux drivers and I'm just too ignorant to find out (I do know Microsoft proposed something like that in the Vista driver model, because they need it for Avalon).
Now, this might be a shot in the dark. :) As I recall, NVidia was proposing something like that at XDevConf, but I was too lazy to carefully read the complete spec. So in my blurry vision this would fit the needs.
But is it just me, or is X a mess? (no offense) It all seems so convoluted. I can see why nVidia initially wanted to give up on Linux support. Wouldn’t everyone benefit from a rewrite in such a way that OpenGL calls get directly translated to calls to the graphics card, instead of being ping-ponged first?
“But is it just me, or is X a mess?”
It’s just you.
You are not the only one that thinks that X needs a rewrite. There is an interesting article by Jon Smirl that gives an overview of the current X architecture and extensions that are used to accelerate X: http://www.freedesktop.org/~jonsmirl/graphics.html
Jon Smirl is one of the guys working on Xegl, an X server that implements the EGL API. It is basically an X server based on OpenGL (see http://en.wikipedia.org/wiki/Xegl). There is an interesting thread about Xegl on the xorg mailing list (Xegl lives!): http://lists.freedesktop.org/archives/xorg/2005-May/thread.html#801…
However, I think Jon Smirl doesn't work on Xegl anymore, and I don't know if anybody else does.
NVidia, for its part, seems to prefer incrementally updating the existing X server rather than rewriting it on top of OpenGL: see http://download.nvidia.com/developer/presentations/2006/xdevconf/co…
Yes X is a mess. Mostly because of XFree86's desire to make it one massively convoluted program. X.org is slowly modularizing it and breaking it up, so a complete rewrite may be years off before it's actually feasible.
“Yes X is a mess.”
You are confusing X the windowing system and protocol with the implementations XFree86 and X.org.
> “Yes X is a mess.”
> You are confusing X the windowing system and protocol
> with the implementations XFree86 and X.org.
That’s obviously what he meant. I really hate people who play semantic games like this when it is perfectly clear what someone meant. You are in the same group of people who say “Linux is just the kernel”. Yes, technically it is just the kernel, but when most people say “Linux”, they mean Linux as a complete OS and distribution of software. Nobody gets confused, except for pedants like you.
It is indeed so trying and difficult to say what one means and mean what one says, oh the effort.
GNU/Linux is the name of the OS
Linux is the name of the kernel
It's not pedantic; it's to differentiate between the two, because some of us fix both, work on both, contribute to both and advocate both, so we decided that's how it's going to be.
You disagree and want to call us pedantic? Fine, that's how we differentiate between an expert and a moron. A moron says Linux for everything.
Have a nice day, moron
It's convoluted because the underlying problem is a complicated one. First, X supports two mechanisms for rendering OpenGL: direct and indirect. Indirect rendering is usually used for remote clients: OpenGL calls get translated into a GLX protocol stream, which gets sent to the X server and rendered there. The direct method bypasses the GLX stream and has the application render directly, using a userspace GL library and a kernel module.
Neither mechanism makes "calls" to the graphics card, since modern GPUs like to be programmed via command packets rather than direct MMIO register access. Instead, the direct rendering mechanism allows libGL to build command packets in userspace and, when enough data has been buffered, to call into the kernel DRI (or the NVIDIA equivalent) to verify the correctness of these packets and transfer them to the graphics card.
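From the application side the split is easy to see. A small sketch: you request a direct context, but it's only a request, and glXIsDirect tells you which path you actually got (standard GLX calls, minimal error handling):

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) return 1;

    /* The last argument requests direct rendering; the server may
       refuse (e.g. for a remote display) and hand back an indirect
       context that speaks the GLX protocol stream instead. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);

    printf("rendering is %s\n",
           glXIsDirect(dpy, ctx) ? "direct (userspace libGL + kernel module)"
                                 : "indirect (GLX protocol via the X server)");

    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}
```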
One proposal has been to turn everything into a direct rendering client. Remote apps would get a process forked off to service them; that process would then direct-render on behalf of the remote client. Doing this removes the distinction between indirect and direct rendering.
In this model the X server as currently implemented disappears. Everything that draws is direct rendered (including the window/composition manager). Remote clients are handled with external processes (good for security). UI events are routed by the window/composition manager and then sent out to the correct process.
Xlib support is handled by splitting some XGL code out into a library which turns Xlib calls into direct-rendered OpenGL.
I have tried getting Xgl/compiz working on Ubuntu and SUSE. Both work really well, but getting it to work on SUSE was just a matter of downloading a few RPMs from http://en.opensuse.org/Using_Xgl_on_SUSE_Linux
There is a good HOWTO in the SUSE forum (http://forums.suselinuxsupport.de/).
It is very stable with Nvidia.
The article seems to be hosted on someone’s cell phone. I’ve been trying for 10 minutes and still can’t get the last two pages. It easily would have fit on a single page but it has been split up over five pages to increase ad traffic.
It is full of inaccuracies. How about this description of Mesa: “One was the birth of the Mesa library. This included a sort of wrapper library, which exposes a set of OpenGL functions to the graphical program, simply translates them to X calls, and then transmits the data to the X-Server using the X protocol. Although this provides the logical functionality of OpenGL, you lose all of the performance enhancements of a proper 3D renderer, making it too slow to use for the majority of applications requiring graphical acceleration.”
Or this brilliant project (do some multiplication on the bandwidth needed, you SunRay users): “This is the ‘Virtual GL’ project, which is working at supporting GL accelerated applications through a TCP network. The way this works is that a GL application running on a central server uses its GPU to perform 3D rendering, but rather than displaying the resulting pixmap on a screen there (which could be thousands of miles from the person running the program) it decodes and compresses it and streams the display to the remote user’s machine which uncompresses it and encodes it on the screen. This project is still in its development phase, but it looks promising. Running full blown shoot-em-all, fast-car, wizzing OpenGL games on a remote machine may be possible after all!”
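For what it's worth, the flow is simpler than that quote makes it sound: the server renders, reads the frame back from the GPU, compresses it and sends it; the client decompresses and displays. A rough sketch of the server-side readback, where send_frame is a hypothetical stand-in for the compress-and-transmit step:

```c
#include <stdlib.h>
#include <GL/gl.h>

/* Hypothetical: compress the frame and push it over the TCP link. */
void send_frame(const unsigned char *rgba, int width, int height);

/* Sketch: per-frame readback loop for VirtualGL-style remoting. */
void read_and_send(int width, int height)
{
    unsigned char *buf = malloc((size_t)width * height * 4);

    /* ... the application renders its frame on the server's GPU ... */

    /* Pull the finished pixels back from the framebuffer. */
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buf);

    /* Compress and ship to the remote viewer. */
    send_frame(buf, width, height);
    free(buf);
}
```

Multiply width x height x 4 bytes by your frame rate and you see why compression (and the SunRay scepticism) matters.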
It is full of inaccuracies. How about this description of Mesa:
I’m not saying the article is not full of inaccuracies, but you failed to point out what exactly is inaccurate/wrong.
He does tell you what is wrong (namely specific descriptions) but not why. For example, he takes it for granted that the reader knows what Mesa actually is. Brian Paul and the other Mesa contributors surely deserve that much.
XGL is going to come out on top in the end, because it is desktop agnostic. XGL is applicable to the BSDs, Linux, Unix, etc., which means that everyone wins. NVidia is slightly pulling for AIGLX because they know they're in a dominant position driver-wise for alternative OSes (i.e. non-Windows), and AIGLX keeps the basic structure we have today in place. However, Xegl is coming (especially now that XGL is here), and when it arrives everyone will be back to square one; the drivers will have to be re-engineered, and it is very possible that their lead could disappear overnight, because there are a lot of ATI cards out there.
This is good and bad. It's good because companies like ATI and NVidia could almost write one driver to support Windows, Linux, cell phone OSes, game systems, etc., so support across the board could rapidly increase. EGL's design is platform independent, and nothing in the EGL API is specific to a windowing system, unlike GLX/AGL/WGL. OpenGL plus EGL and the Mesa extensions will provide a portable API for accessing many forms of graphics hardware, ranging from current cell phones to the PlayStation 3 to PCs and a lot more. Most importantly, it's all platform independent, so companies would no longer have to be asked if they could write a Linux driver. They would be asked if they could write a driver that simply conforms to the Khronos OpenGL ES standards and the required extensions.
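To see what that platform independence looks like, here's the usual EGL bootstrap. The only platform-specific value in the whole sequence is the native window handle; everything else is core EGL 1.x (a sketch, error handling omitted):

```c
#include <EGL/egl.h>

/* Sketch: the same setup code runs anywhere EGL is implemented. */
EGLContext setup_egl(EGLNativeWindowType native_win, EGLSurface *out_surface)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, NULL, NULL);

    const EGLint cfg_attribs[] = {
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint num_cfg;
    eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &num_cfg);

    /* native_win is the only thing that differs per platform:
       an X11 Window here, an HWND on Windows, something else on a phone. */
    *out_surface = eglCreateWindowSurface(dpy, cfg, native_win, NULL);

    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
    eglMakeCurrent(dpy, *out_surface, *out_surface, ctx);
    return ctx;
}
```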
XGL is going to come out on top in the end, because it is desktop agnostic.
Well… you messed up completely, but only in terms of what is agnostic here. Your comment is heading in the right direction.
Look at the projects separately and you'll see why. What you know as XGL is actually two projects (for now):
1. XGL, which is the rendering server.
2. compiz, which is the WM, and this is the part that is DE agnostic.
What you know as AIGLX is separated into 3 parts:
1. AIGLX, which has nothing to do with DEs or being agnostic. It is an X.Org server extension which enables accelerated indirect rendering. It is the rendering server only.
2. libCM, a composite manager library providing special effects. Again, nothing to do with the DE. If you look at the X.Org mailing list, you can see the possibility that XGL might use libCM in the future too.
3. A modified Metacity using the composite manager. And this one is really not agnostic.
Now, a little bit of speculation. If compiz starts to use libCM (davidr is looking at it, at least as he said in his e-mail), where do you end up?
1. With the server that works best for you: XGL, AIGLX, Xegl, EGL…
2. A common libCM, which enables different WMs to share pluggable effects, handled by whichever server you picked.
3. The WM of your choice. And if compiz supports libCM in the future, compiz is no problem here. Pick the one you like best without losing the eye candy.
In my opinion, most of the people on:
– Linux will use AIGLX, libCM and compiz in the end. Metacity is too GNOME-specific (and I'm a GNOME user).
– handhelds will use Xegl + libCM + (I don't know which WM works best there, but I suspect that Metacity and compiz are too computer-centric).
– other platforms? I don't know, but XGL would probably fit there best, since it is the least dependent on X.Org. But again, libCM and the WM of your choice.
This is good and bad. It's good because companies like ATI and NVidia could almost write one driver to support Windows, Linux, cell phone OSes, game systems, etc., so support across the board could rapidly increase.
Maybe, but you have to port XGL there first.
Most importantly, it's all platform independent, so companies would no longer have to be asked if they could write a Linux driver. They would be asked if they could write a driver that simply conforms to the Khronos OpenGL ES standards and the required extensions.
Yeah, it would be a nice reality. The problem is that OpenGL is only one part of the driver. OpenGL sits on top of an underlying structure that connects to the kernel and hardware, and that structure is different for each platform. So you would still have to ask vendors to support Linux.
Really?
Then if it is such a nice drop-in solution that will work out of the box with all X programs, why isn't there more support in the distributions for it?
Then if it is such a nice drop-in solution that will work out of the box with all X programs, why isn't there more support in the distributions for it?
If you're asking whether the project will work out in the future, about a project that was only started in the present (two weeks ago), then don't use the past as your reference. It sounds like lame trolling, nothing else.
But then again, I'm gonna bite on this one. Why AIGLX for Joe User on Linux:
– NVidia opted for AIGLX instead of XGL (or at least for the spec)
– DRI opted for AIGLX instead of XGL (or at least for the spec)
– less RAM consumption
– direct control over 3D from the inside, meaning an easier plug for legacy 3D software
Meaning most of the 3D drivers selected AIGLX. Support will come, soon:
– NVidia support in the next driver release
– some DRI drivers already provide support for AIGLX
– ATI? They don't provide Linux drivers (or at least their efficiency is near 0). So… probably… no.
– XGL is also changing some parts to combine with AIGLX
If developers of different projects can cooperate (read the mailing list on X.Org; the cooperation between XGL and AIGLX is very visible), and if proprietary companies can cooperate with OSS (NVidia is helping with suggestions too), then answer me this: how is it possible that some users can't cope with facts which are not even their problem?
No one says XGL sucks; it is official, it is a JAW-DROPPER, and davidr deserves to be named one of the “Linux on the desktop” prophets. It will be used where it is a better and more suitable tool than the others. The same goes for AIGLX and Xegl. It is a fscking rendering mechanism, for God's sake.
If you read the mailing list archives, you'll finally realize that you will (or at least should) be able to run compiz under all three (the same goes for any WM). Compiz is just a WM. And in the middle there is libCM, which should become a platform for exchanging effects, used by the WM and rendered by the rendering mechanism.