LinuxForums takes a look at Xgl, and concludes: “In conclusion, Xgl is an exciting technology. It still needs some work, mainly to do with non-NVIDIA/ATI graphics cards and extending support to a wider variety of graphics chipsets. I for one will wait a bit longer (until the holidays) before attempting to install it onto my computer; perhaps by then more animations/plugins will be available and support for other DEs will be available.” Lots of videos inside, boys and girls, so rejoice.
The author has a misconception about rendering performance vs. responsiveness. He mentions that the system with the integrated GeForce 2 graphics and an Athlon XP 1800 was lagging slightly when rendering animations, but tries to attribute that to running off a Live CD.
The medium the system is read from has no impact on 3D rendering performance.
OS X is pretty smooth even on a Rage 128 with 8 MB of RAM, so that kind of saddens me.
There are ways to squeeze extra juice out of older GeForce cards. nVidia owners will notice a good increase in responsiveness under Xgl by adding this to the Device section of their xorg.conf:
    Option "RenderAccel" "true"
Some report stability issues and bugs, but I’ve had zero problems on my rig… and I generally run it 24/7.
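For context, the full Device section would then look something like this; the Identifier is illustrative (keep whatever name your distro generated), and note that RenderAccel is an option of the proprietary nvidia driver, not the open-source nv one:

    Section "Device"
        Identifier  "NVIDIA Card"         # keep the name your config already uses
        Driver      "nvidia"              # proprietary driver; "nv" ignores RenderAccel
        Option      "RenderAccel" "true"  # accelerate XRender operations
    EndSection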
> OS X is pretty smooth even on a Rage 128 with 8 MB of RAM, so that kind
> of saddens me.
Well, the real question you have to ask is: is OS X’s graphical system running with everything at maximum on that machine?
Of course it isn’t.
The author could have set options in gconf (the options for compiz, which produces all the effects) to lower the rendering quality and remove a couple of effects that eat up system resources.
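For instance, something along these lines with gconftool-2 would trim the plugin list; the key path and plugin names varied between compiz builds, so treat this as an illustrative sketch rather than exact keys:

    # Show which plugins compiz currently loads
    gconftool-2 --get /apps/compiz/general/allscreens/options/active_plugins

    # Write the list back without resource-hungry effects like wobbly and fade
    gconftool-2 --type list --list-type string \
        --set /apps/compiz/general/allscreens/options/active_plugins \
        '[gconf,decoration,move,resize,place,minimize,scale,switcher]'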
But I must admit Xorg isn’t snappy at all, though there’s a lot of work changing that rapidly, Xgl and Aiglx being good examples.
I’ve given up on X ever being snappy. All of these “solutions” are just bandaids. The real problems are the inefficiency of the protocol and the poor division of labor between X server, window manager and client, among others. Adding new acceleration architectures like EXA and Xgl just tries to cover up this underlying problem. Windows XP doesn’t use 3D acceleration to get performance, so why should X have to go that route? Why does it feel like I’m back in 1995 when I use X, even on modern hardware (ATI X300 with 64 MB of RAM)? One of the biggest mistakes Linux made was adopting X. X should have been allowed to die in the 90s. Now it’s here to stay…
Poor division of labor? The X Window system is perhaps the most modular system out there. How many operating systems do you know that have a totally agnostic windowing system, toolkits and window managers? Since you seem to be an authority on the subject, can you point to exactly where the inefficiencies in the protocol are and how you’d solve them?
Poor division of labor? The X Window system is perhaps the most modular system out there.
The passing around of calls is what creates the overhead anyway. Everything is separated, and thus not tightly integrated. Great if you want to use the graphical capabilities of a machine elsewhere (which is basically what X was designed for), but this setup isn’t used by 99.9999% of users.
Anyway, I might be off. For a humorous (but dated) look at it:
http://www.art.net/~hopkins/Don/unix-haters/x-windows/disaster.html
I’m sorry, I disagree. The first rule of designing robust complex systems is modularizing the components that make up the system. This has nothing to do with using systems on different machines. It’s just good software engineering practice. The people who say X is broken, poorly designed, or slow are clueless twats. The fact that X is network transparent or capable is just a side effect of its powerful, well-designed server-client architecture. An architecture clearly ahead of its time and embarrassingly under-utilized. I’m not saying the X implementations do not have their drawbacks. But of all the windowing systems out there, it clearly has the most potential to evolve, as it has done for years. And to date it is still the most powerful, modular, network-aware and flexible architecture out there. Eye candy, transparency and wobbly windows are orgasmic geek delights that are only skin deep. And before long all these will be on X. Oh ma bad, I spoke too soon. Have a look at XGL.
You know what’s funny? Almost all those arguments (“omg it’s a beast,” etc.) were only relevant back in the days of sub-486 machines.
Completely irrelevant when run on modern hardware.
Then explain why I need to have super-duper hardware just to get okay performance. Like I said, I have an ATI card with 64 MB of video RAM. The desktop should fly (I don’t care about 3D at this point). But it doesn’t. I’ve upgraded to 7.0 with the new drivers and I’ve tried using COMPOSITE and EXA. EXA is still broken (screen corruption and no noticeable improvement in speed). COMPOSITE is nice, but everything is slower. Again, I’m not using some cheap onboard video card, I’m using a real video card, and 2D desktop performance is still crappy. I used to have a crappy onboard graphics card with shared RAM and it was painful to use X. How come Windows can fly on even crappier hardware, but X, given a really nice video card, can only produce okay performance? Because it’s a pile of shit, that’s why.
No, it’s because ATI drivers are a pile of shit, as you eloquently put it.
I would agree with that except for the fact that on every machine I’ve used Linux on, with a variety of video cards, some well supported, I’ve never gotten performance anywhere close to Windows. That we even have to move to EXA and XGL just shows that even the X devs recognize the shittiness of the project they’ve inherited.
And how did you measure this performance?
Nobody is moving to XGL because X is shitty. After all, XGL is an X server. Linux is moving to XGL to take advantage of your graphics hardware, which can drastically improve the user experience of desktop environments. It’s a natural evolution that has nothing to do with the shittiness of X.
You are wrong on some parts.
First, Xgl isn’t an X server; that would be the Xglx or Xegl implementations.
And the midterm solution will be Aiglx, because Xgl currently lacks a lot.
Nope, XGL is an X Server, read up on it.
http://www.freedesktop.org/wiki/Software/Xgl
Read it. Really do it.
Next time look up your facts.
From your link:
“Xgl is an X server architecture layered on top of OpenGL.”
What is so hard to grok in that statement?
X server architecture != X server
OMG!
Well, let’s discuss this more off the forum; meet me at #[email protected]
Nice try. We are going to end up arguing over semantics, I wish I was in the mood. 🙂
Well, I know what you mean, but I simply wanted to point out that Xgl alone wouldn’t do much.
I myself call the technology Xgl, but I wouldn’t refer to it as a complete X server; rather, I call it by its implementation name, like Xglx or Xegl.
Hopefully I haven’t offended you too much.
No offense taken. I just wanted to point out to the other dude that XGL complements Xorg, it doesn’t replace it. True, if we want to get technical, Xglx is the name of the XGL server, but we all refer to it as XGL. 🙂 My point is that Xorg’s modular design allows developers to experiment, swap and run concurrent servers with relative ease. I’m not aware of any other graphics subsystem that affords developers such leverage. Thus, saying that X is shitty, slow, obsolete or poorly designed is just ignorant. Especially when I don’t see anything better, architecturally. X is certainly not inherently slower or faster than any other graphics subsystem. For the majority of tasks, a GNOME/KDE desktop is almost as responsive as a Windows XP one, if not more so in my experience. I guess in this culture it’s easier to say things suck to make ourselves feel better.
Totally agree. Since Xorg forked from XFree86, everything is doing better; I see the developers moving mountains.
With the modular design, development will go much more easily. And of course the improvements towards better-accelerated rendering etc. make us all go yay (down inside, anyway).
Back in 1994, I ran X on a 4MB 386SX laptop with Linux over the network to connect to my university. It felt very snappy.
I’m not sure what your speed issues are, but here are some things to look into:
* is your graphics card well supported by Linux? There was a time when ATI support on Linux was great, but from what I understand, it’s poor these days.
* is X configured properly?
* is it a distribution issue? The difference in speed between various distributions can be quite big.
* how much glitz and how many gadgets are you using? Some innocent looking panel applets have a nasty tendency to tie up the CPU or use a lot of memory. Some glitz or window managers can also have a big effect on performance.
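For the first two points on that list, there are quick sanity checks (assuming Mesa’s glxinfo utility is installed and the standard Xorg log location):

    # Should print "direct rendering: Yes" if the driver is accelerating
    glxinfo | grep "direct rendering"

    # A software Mesa renderer string here means the card isn't being used at all
    glxinfo | grep "OpenGL renderer"

    # (EE) lines in the X log often explain a silent fallback
    grep EE /var/log/Xorg.0.log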
X has been around since 1987 and has remained essentially unchanged, yet it’s been able to adapt to hundreds of environments and several new features without showing its age. Devices and software from 1987 still work with today’s X servers. This is something neither Mac OS X (OS 9 is different from OS X) nor Windows (its 16-bit, 32-bit and 64-bit windowing APIs are different) can claim, with good reason. Every programmer knows that it’s *very hard* to come up with a stable architecture that works well. No matter how well you plan things, new requirements appear; you can’t predict the future or how important your architecture will become. And it’s always tempting to make short-term hacks to satisfy short-term needs (as Windows and the Mac did). The whole “year 2000” issue seems like a case of bad judgement in hindsight, but considering that memory cost about $1 a kilobyte back in the early 1980s (and likely a lot more back in the 1960s, when the Y2K issue started), every byte you can spare across hundreds of thousands of records is a big savings.
I don’t want to hear excuses. Why do I have to have an nVidia card to have a decent experience in X? We’re talking 2d. Fact is, the X devs have let it languish for a decade and instead of redesigning like anybody else would, they’d rather add a series of bandaids that work on even fewer video cards and are buggy and unstable.
No excuses. X worked well back in 1994 with a Windows XP-like environment (FVWM+themes) and it still does today. If you’re going to make a comparison, make sure that it’s using the same criteria for the comparison.
Comparing advanced features (AIGLX) on barely supported hardware (ATI on Linux) to old technology (Windows XP) on supported hardware (ATI on Windows) isn’t a fair comparison.
I don’t want to hear excuses. Why do I have to have an nVidia card to have a decent experience in X?
Because it is the best-supported graphics hardware if you intend to use Linux?
All I can say (and this is the only ATI card I have) is that ATI on my notebook runs better with the Xorg drivers than with fglrx. And it for sure beats the crap out of the Windows driver (which runs well, but a random series of showstoppers and pauses makes it really annoying).
Fact is, the X devs have let it languish for a decade and instead of redesigning like anybody else would, they’d rather add a series of bandaids that work on even fewer video cards and are buggy and unstable.
Just out of curiosity :) You don’t know what you’re talking about, do you?
X is a network protocol. This is why one can run apps remotely. Using a protocol for local work has its time costs (it is like microkernel versus monolithic), but everyone can agree that X is very efficient.
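That remote use is still a one-liner today (assuming the remote sshd allows X11 forwarding):

    # Run xterm on 'buildhost' but have it draw on your local display
    ssh -X user@buildhost xterm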
How would redesigning produce more drivers? Well, that one is beyond my imagination. If I try to think about redesigning X, I can imagine slow progress, I can imagine a few years of waiting, but more drivers?
Well, I can tell you that the ATI drivers probably are the culprit in your case, as I also have a real video card, but with much more humble specs than yours (nVidia Riva TNT2 with 32MB VRAM), and it gives me quite a snappy desktop experience with both 2D and 3D, excluding of course newer 3D games that require features this card doesn’t have.
Passing calls is not what makes X slow (that part can be very efficient). The old driver model, with its lack of proper video acceleration, is, combined with the latency produced by the ancient design of Xlib and the lack of the Damage/Composite extensions in older versions of X.
XGL is fast, and XCB should reduce latency as well.
The passing around of calls is what creates the overhead anyway. Everything is separated, and thus not tightly integrated. Great if you want to use the graphical capabilities of a machine elsewhere (which is basically what X was designed for), but this setup isn’t used by 99.9999% of users.
Which has nothing to do with it; the protocol by itself is no more or less efficient than TCP/IP or any other protocol, for that matter. The issue is libX11 and the problems within it.
They have a replacement, XCB (http://xcb.freedesktop.org/wiki/), which addresses the latency, threading and other long-standing issues.
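To make the latency point concrete, here is a minimal C sketch of XCB’s request/reply split: requests are fired off first and the replies collected later, so the round trips overlap instead of serializing the way classic blocking Xlib calls do. (The atom names are arbitrary examples; build with gcc demo.c -lxcb.)

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <xcb/xcb.h>

    int main(void)
    {
        xcb_connection_t *conn = xcb_connect(NULL, NULL);   /* uses $DISPLAY */
        if (xcb_connection_has_error(conn))
            return EXIT_FAILURE;

        const char *names[] = { "WM_PROTOCOLS", "WM_DELETE_WINDOW", "_NET_WM_NAME" };
        xcb_intern_atom_cookie_t cookies[3];

        /* Phase 1: queue all three requests without blocking on the wire. */
        for (int i = 0; i < 3; i++)
            cookies[i] = xcb_intern_atom(conn, 0,
                                         (uint16_t)strlen(names[i]), names[i]);

        /* Phase 2: collect the replies; the server round trips overlap. */
        for (int i = 0; i < 3; i++) {
            xcb_intern_atom_reply_t *r =
                xcb_intern_atom_reply(conn, cookies[i], NULL);
            if (r) {
                printf("%s = atom %u\n", names[i], r->atom);
                free(r);
            }
        }
        xcb_disconnect(conn);
        return 0;
    }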
If there is an issue, it’s the fact that Xorg hasn’t merged it in, deprecated libX11, pushed toolkits to XCB, and gradually allowed libX11 to suffer the same fate as the bad fashion of the 1980s.
As for the server/client design: that is how BeOS operates, the same goes for Mac OS X, and heck, even Windows has a client/server model for its display.
Like I say, don’t blame the protocol; blame the Xorg maintainers and their unwillingness to merge XCB and push it over libX11.
The problem is mainly with XAA and the ATI driver. If you try using standard Xorg 7.0 with the Nvidia drivers, things fly. Redraws are very quick, to the point you don’t notice them (i.e. similar to Windows).
With any composed desktop (xcompmgr/xgl/aiglx), redraws will be significantly improved as the application will no longer need to be prompted to redraw when a region of the screen is uncovered. The composite manager pulls the pixmaps directly from memory.
Looking ahead, we really should be looking at moving the infrastructure to OpenGL. Xgl still relies on GLX and the underlying X server, but it’s getting closer to what we want eventually (i.e. Xegl).
Yeah, I second the belief that you are clueless.
Windows XP doesn’t use 3D acceleration to get performance, so why should X have to go that route?
You’re mistaking a double-buffered desktop for performance. Windows isn’t snappier than X; however, it feels that way because the desktop is double-buffered. This has been solved with the Damage extension and the new compositing engines.
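For anyone wanting to try this on plain Xorg without Xgl, the usual recipe of the time is to enable the Composite extension and run a composite manager; xcompmgr is the reference one (some distros ship with Composite already enabled):

    Section "Extensions"
        Option "Composite" "Enable"
    EndSection

then, in a running session:

    xcompmgr -c &    # -c selects compositing with soft client-side shadows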
I tried Kororaa on my work PC, and the Xgl desktop was snappy, responsive AND incredibly cool.
There’s nothing wrong with X.
XGL is a nice technology, I have recently tried it in Gentoo. It was nice for like 10 minutes, then I switched it off because it does not help me do my work more quickly. I think the main reason for eye candy should be to increase usability, and in my opinion the only thing the current XGL stuff does is to look pretty. I hope that once XGL is stable, usability experts start playing with it and do really useful stuff with this technology.
I currently have Xgl running on two computers, my home desktop and the computer at my workplace, both running Ubuntu Flight 6. I also first thought that it’s just eye candy, but for me, it really adds usability. Especially at work, I often have lots of windows open: a lot of terminals for SSHing into all the different servers, and a lot of Nautilus windows for quickly browsing the servers’ filesystems. When you have a lot of windows open, the Scale plugin can really be helpful. I press F12 and have all open windows aligned on the screen, and I can quickly select the one I need. But Xgl even added something that Mac OS X Exposé does not have (afaik): if you press F11 instead of F12, only the windows of the current program are aligned. When I’m in a terminal and press F11, all other terminals are aligned on the screen; the Nautilus windows stay in place. This is honestly pretty useful to me.
Also, the ability to change a window’s transparency with Alt+mousewheel is pretty neat. If I have several windows open, with e.g. Firefox fullscreen on top of them, and I need to read something from the terminal beneath it, then instead of having to search for the right terminal in the task list and bring it to the top, I just press Alt and turn the mousewheel down, which makes Firefox transparent, so I can look through it and read the terminal output underneath.
Tom
But Xgl even added something that Mac OS X Exposé does not have (afaik): if you press F11 instead of F12, only the windows of the current program are aligned.
Exposé has had this since its first inception.
Also, the ability to change a window’s transparency with Alt+mousewheel is pretty neat.
It is pretty handy. I use this on MS Windows quite often for monitoring logs and chat sessions.
Of course, if I had multiple monitors…
It is pretty handy. I use this on MS Windows quite often for monitoring logs and chat sessions.
Windows has this feature? That’s new to me; I’ve never seen it before. Gotta try this out tomorrow @work; it would indeed be useful.
Tom
Windows itself hasn’t, although it has the ability to show translucent windows/widgets (done in software or with the graphics card’s 2D acceleration). It doesn’t, however, support real transparency where, for example, every pixel has its own opacity value.
You can find freeware/shareware tools which use this ability and can mess around with any other window/application. These will then change the opacity value of whatever window you point them at.
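Under the hood, those tools boil down to a couple of Win32 calls; here is a rough C sketch (the choice of GetForegroundWindow as the target is just for illustration; build with gcc t.c -luser32 under MinGW):

    #include <windows.h>

    /* Mark a window as layered, then give the whole window one alpha value. */
    static void set_window_opacity(HWND hwnd, BYTE alpha /* 0..255 */)
    {
        LONG ex = GetWindowLong(hwnd, GWL_EXSTYLE);
        SetWindowLong(hwnd, GWL_EXSTYLE, ex | WS_EX_LAYERED);
        SetLayeredWindowAttributes(hwnd, 0, alpha, LWA_ALPHA);
    }

    int main(void)
    {
        HWND target = GetForegroundWindow();  /* illustrative target window */
        if (target)
            set_window_opacity(target, 178);  /* roughly 70% opaque */
        return 0;
    }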
But Xgl even added something that Mac OS X Exposé does not have (afaik): if you press F11 instead of F12, only the windows of the current program are aligned. When I’m in a terminal and press F11, all other terminals are aligned on the screen; the Nautilus windows stay in place. This is honestly pretty useful to me.
Exposé has had that from the start too:
F9: All windows
F10: Current application windows only
http://www.apple.com/macosx/features/expose/
If you peruse the xorg mailing list or listen around, you’ll know that XGL is basically a hack for a little eye candy. It’s not even close to the end-game. I’ve heard that Xegl is what is really needed, but I’m not positive.
The real question is: when will the real X server be available that provides the infrastructure for all the APIs that will ride on top of it? It doesn’t look like that will be happening anytime soon.
“The real question is: when will the real X server be available that provides the infrastructure for all the APIs that will ride on top of it? It doesn’t look like that will be happening anytime soon.”
You know that statement contains no information, right?
Being on the xorg mailing list doesn’t prove anything.
Xgl is not a hack for a little eye candy and this shows that you presumably don’t know very much about it.
// for the other readers
Xgl isn’t an X server; it’s an X server architecture, the current implementations being Xglx and Xegl.
Xgl is being built to actually fix a lot of the current problems with X and, of course, to put X on OpenGL.
Xglx, which runs on top of the current X server, is for development and testing, while
Xegl, of course being the future, is a very long way down the road. Xgl is yet only alpha software.
//
It’s only compiz that provides the eye candy here, and it will also run on Aiglx (the modified Xorg).
Aiglx continues with the current Xorg with a couple of changes (more specifically, indirect accelerated OpenGL via GLX).
Aiglx is a midterm solution to the problems and will support compiz and all that eye candy as well.
So to answer your question: both Aiglx and Xegl will be there at the end of the road, with
Xgl the long-term solution to fix a lot of issues; but a lot of work has to be done, so get your hopes up for Aiglx being ready for real use first.
Both Xglx and Aiglx are available today.
It runs on lower-spec cards than Vista, and the scope for plugins with compiz is awesome. There are some really nice plugins now, like miniwin, which shows live previews of windows, videos etc. in an OS X-like dock. It puts the thumbnail preview stuff Vista has somewhat in the dark.
I’ve always found XGL to be VERY stable considering how early it is, same with compiz, and I love xwinwrap for putting screensavers playing on your desktop like wallpaper. Again, Microsoft gives nice eye candy, but it’s nowhere near the configurability of XGL, since you can do unlimited stuff with XGL/Compiz. I think that by the time Vista comes, they both will need to play catch-up with the Linux desktop/XGL.
If you have a card with an accelerated driver and your X config is properly configured, X is not slow. I’m presently using it on a PIII-800 with a “0000:01:00.0 VGA compatible controller: nVidia Corporation NV18 [GeForce4 MX 440 AGP 8x] (rev a4)” and it isn’t slow.
The article is wrong when it says Xgl doesn’t work on Intel integrated graphics cards. I’ve got it running on my Acer laptop with an i915gm card, which (currently, due to the crappy BIOS they ship with it) has only 8MB of shared memory. The performance feels comparable to my desktop machine, which has an Athlon64 3700+ and an Nvidia GeForce 5200.
And for those that complain about usability, don’t underestimate the attraction to the masses of “shiny stuff”.
One important thing to remember is that there are lots of different reasons for “slowness”. A LOT of complaints about slowness seem to center around Linux/X/Gnome using too much memory. If lack or overuse of memory is a source of your slowness, XGL won’t make it any better and will probably make things worse because of the in-memory window buffers. Also, you won’t see any benefit for actions like window resizing either, since the application still has to fill in the contents (the frame may resize smoother, but not the contents).
XGL should make moving windows smoother and will allow for cool 3D effects by the window manager. Thats it, at least for now.
Is that for the nv (OSS) drivers or the nvidia (Proprietary) drivers?
I installed the Vida Linux beta, which comes with Xgl and GNOME 2.14. I must say that I really like it; only some things are a bit annoying.
1. Scrolling in Firefox is a bit slow.
2. Bad TwinView support; it acts like one large display over both monitors.
3. Switching to the console and back sometimes kills the X server for me.
But other than that I don’t have any problems. It feels snappy,
and I use it as my main desktop.
I checked out the Kororaa live CD on an Athlon XP 2200+ w/ GeForce 2 MX 400. Given that the video card is almost 5 years old, I was quite impressed at how well XGL worked on the system. For something relatively new and still under development, it felt remarkably feature-rich, polished and stable (after working with it for a few hours).
For some people, it’s all about the eye candy, so I am glad to see X/Linux/etc. is able to compete. Take this with a sandboxed/virtualized Windows system for legacy app compatibility and it’s definitely a very compelling desktop platform.
In terms of raw throughput, X11 wins hands down over Windows and Mac OS X when it comes to blitting images rapidly to the screen. X11 has never had a problem with raw throughput; it is very fast. What we end users lament is the repainting and lagginess of windows when resizing and moving them around. This, however, is due to the fact that each and every application is individually responsible for redrawing itself.
Because of this, there is a lack of synchronicity in expose and redraw between different applications (X11 clients), which is the real culprit when it comes to perceived slowness. One reason X11 was built this way is that, historically, most of the applications visible on any given X11 display were remote applications running on a physically different machine, perhaps on another server elsewhere in the building, perhaps many miles away.
In fact, expose and redraw under X11 are literally too fast; it is the lack of coordination (timing) between applications performing expose and redraw operations that makes things seem sluggish. (I.e., you move one window around the screen; with each change in location, the application you are moving, and each and every background application being exposed by the movement, has to redraw itself. X11 is doing way more work than it should, and in doing so we get redraw artifacts due to the rendering overkill.)
This has nothing, directly, to do with your graphics hardware. You folks certainly remember the *old* Composite extension, i.e. the eye candy fad of yesteryear. What Composite did was synchronize all of the expose and redraw commands of all applications, by performing all rendering offscreen and only showing the final result visually. In fact, X11 with Composite is markedly slower than raw X11, but due to the synchronization of expose and redraw it creates the illusion of totally smooth resizing and movement.
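The offscreen redirection described above is essentially a one-call affair for a composite manager at startup; a tiny C sketch (assuming the Xcomposite development headers; build with gcc cm.c -lX11 -lXcomposite):

    #include <stdio.h>
    #include <unistd.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xcomposite.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        int event_base, error_base;

        if (!dpy || !XCompositeQueryExtension(dpy, &event_base, &error_base)) {
            fprintf(stderr, "no display or no Composite extension\n");
            return 1;
        }

        /* Redirect every top-level window into an offscreen pixmap. With
         * CompositeRedirectManual, a real compositor would paint the pixmaps
         * itself on Damage events; Automatic lets the server keep updating
         * the screen on our behalf. */
        XCompositeRedirectSubwindows(dpy, DefaultRootWindow(dpy),
                                     CompositeRedirectAutomatic);
        XSync(dpy, False);

        pause();  /* a real composite manager would loop on Damage events here */
        return 0;
    }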
Composite does not require any particularly powerful graphics card (no 3D/OpenGL); virtually all cards supported by X11 could perform compositing, provided their drivers adequately support the Render calls upon which Composite is built. Unfortunately, very few drivers provided good Render acceleration. And thus EXA was born, EXA being the quick and easy way to provide good Render acceleration for all existing X11 drivers.
The older XAA-based drivers consisted of thousands of hooks, of which perhaps only a tiny fraction were ever actually used (usually due to lack of documentation). EXA is written to accelerate Render calls only, and is thus much smaller and much simpler to implement. EXA will provide solid Composite support for virtually all existing supported cards, probably within the next 10-18 months.
For those who do not have a 3D/OpenGL-capable card, and for those who do not want or need the added eye candy of Xgl, there will be good solid EXA drivers providing high-quality Render support, enabling very smooth composited screens. I expect that EXA will effectively supplant the older XAA drivers in the next 18 months, and when that happens *no one* will be claiming that X11 is slow or sluggish. Xgl, of course, also provides this compositing, via the two-X-server hack which makes Xgl possible. Xgl is a second X server running on the back of a normal X server: all Render operations (all the drawing on the screen) occurring on the first X server are piped through glitz, an OpenGL implementation of Render, on the second X server; glitz intercepts the Render calls and pumps out OpenGL, which is then handed via Mesa and/or GLX to the 3D hardware of our graphics cards, giving us the nice snazzy effects.
Xegl will begin to materialize as the memory management and low-level hardware configuration of graphics cards are moved out of X11 and into kernel modules that provide a generic framebuffer for accelerated composition and rendering. The rewrite of memory management in Mesa is now being done. Work is also ongoing to allow dynamic reconfiguration of the X server; eventually HAL will allow for dynamic reconfiguration (plugging in a second monitor: autodetecting, setting timing frequencies, initializing, etc.). One of the big issues regarding Xegl is whether or not the Xlib API should be supplanted by an OpenGL API; some advocate this move, most oppose it. However, Xlib has already been rewritten: XCB. XCB can, and probably will, replace traditional Xlib calls in the next 18-24 months. The metamorphosis which Xorg and the X11 community have undergone in the last 36 months is profound and inspiring. Things in X-land certainly have a bright future now.