Seth Nickell posted some screenshots and videos showing the experimental Luminocity window manager and Cairo, which enable an X.org-based desktop to get accelerated GL eye-candy graphics. Update: more here.
Most importantly, it works with a puny video card like the Intel 810, as it should. All these fancy desktop effects are nothing to any post-GeForce video card.
This is why I bitched so long about the full Longhorn interface probably requiring DirectX9 capable cards. Here we see (also in OS X of course) how it can be done with far less hardware.
Currently waimea uses cairo for the creation of its features. The configuration is XML. The person working on cairo is also the author of waimea. Once cairo becomes more stable, he will turn his attention to waimea. That said, he keeps waimea up to date with cairo calls.
“This is why I bitched so long about the full Longhorn interface probably requiring DirectX9 capable cards. Here we see (also in OS X of course) how it can be done with far less hardware.”
That’s pretty damn myopic of you.
Implementing the least-scaled feature set will always use “less” in terms of resources, but this comes at the cost of scalability and future enhancements. Building an infrastructure that uses and supports all of the DirectX 9 specification allows for a lot more than just “wobbly windows” as hardware progresses. Unless you feel that wobbly windows are the pinnacle of desktop “eye-candy” and visual functionality, and no one would ever need anything that might benefit from vertex shaders?
Longhorn isn’t released yet. I can’t purchase Longhorn. Why are you comparing a not-yet-released product that may not ship with that feature (see WinFS/NT Object FS/Cairo) to something that is about to be released and will definitely have that feature?
“Building an infrastructure that uses and supports all of the Direct X9 specification allows for a lot more than just “wobbly windows” as hardware progresses.”
As hardware progresses? No effect that’s usable on a desktop has so far been dreamed of that uses even a fraction of the power of today’s midrange video cards.
Designing a system now, in anticipation of graphical effects which would have to be so ridiculously advanced compared to the current class of ideas, and locking out millions of people in the process, is a mistake.
Do you anticipate Longhorn having any effects requiring vertex shaders? So far the demos we’ve seen are nothing more complex than the usual “windows blowing in the wind” type effect.
To the end user, none of this matters anyhow. They will see the effects OS X does now, on cheap video cards, and they will see the effects that Linux does (assuming this goes anywhere) with the same class of hardware, and then they will see that Longhorn does the same effects, but requires a DX 9 level card.
See, Leo made a comment comparing the implementation requirements listed for luminocity and the implementation requirements listed for longhorn’s UI. Here is the comment since you might have missed it the first two times:
“This is why I bitched so long about the full Longhorn interface probably requiring DirectX9 capable cards. Here we see (also in OS X of course) how it can be done with far less hardware.”
I however felt it was necessary to point out that just because an implementation can do it with lower system requirements doesn’t mean it’s better. There is nothing showing that a DirectX 9 card is needed for performance; it’s quite possible it’s needed for feature set.
Now, in order for me to actually reply to his post, which mentions Longhorn, I too have to mention Longhorn. See?
“They will see the effects OS X does now, on cheap video cards, and they will see the effects that Linux does (assuming this goes anywhere) with the same class of hardware, and then they will see that Longhorn does the same effects, but requires a DX 9 level card.”
You are making a lot of assumptions:
1) OS X, Linux, and Longhorn will all implement the exact same UI eye-candy features.
2) Using the DX9 feature set will not be inherently faster than using the feature set that OSX and Linux use.
Mid-range cards support DX9 currently. It’s not a matter of power; it quite possibly is a matter of features for what they intend to do.
2. advanced osx-like acceleration: 16-32 MB of VRAM and above
3. Super acceleration and support: 64 MB and above
I’d be curious to know what “osx-like acceleration” means. CoreImage and CoreVideo seem pretty hardcore to me (they also require a tad more than 32mb video cards), so if “Super acceleration” is better (presumably, since you’ve ranked it as requiring more hardware), why?
And on-topic: Neat videos, maybe one day I’ll be able to stop laughing when my KDE-using friend insists that he has as much eye-candy as my OS X desktop. Not meant as bashing KDE, less eye candy is fine, I just find it funny that people think it can dance like OS X.
2. Download this January snapshot of xcompmgr (otherwise you will need to check out the latest code using the ARCH revision control which is not very commonly installed on some distros): http://baghira.sourceforge.net/xcompmgr-2.02.tar.bz2
./configure; make; make install-strip
3. At the bottom of your /etc/X11/xorg.conf file add:
Section "DRI"
    Mode 0666
EndSection
Section "Extensions"
    Option "Composite" "Enable"
    Option "RENDER" "Enable"
EndSection
Add this line to Section "Device" if you have an nvidia card:
Option "RenderAccel" "true"
In the Section "Module" make sure GLcore, dri, glx and xrender are loaded.
4. Make sure your Xorg supports your graphics card in an accelerated 3D mode. Set "DefaultDepth 24" in your xorg.conf file under the Section "Screen".
5. Create a hidden empty file on your home folder called .xcompmgrrc
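Steps 3 and 5 above can be sketched as a small shell snippet. This is only an illustration: it writes to a scratch copy at /tmp/xorg.conf.new (a hypothetical path) instead of the live /etc/X11/xorg.conf, so you can review the result before committing it.

```shell
# Append the Composite sections to a scratch copy of xorg.conf (step 3),
# then create the empty per-user file xcompmgr expects (step 5).
# Review /tmp/xorg.conf.new before copying it over the real config.
cp /etc/X11/xorg.conf /tmp/xorg.conf.new 2>/dev/null || touch /tmp/xorg.conf.new

cat >> /tmp/xorg.conf.new <<'EOF'
Section "DRI"
    Mode 0666
EndSection
Section "Extensions"
    Option "Composite" "Enable"
    Option "RENDER" "Enable"
EndSection
EOF

touch "$HOME/.xcompmgrrc"
echo "done"
```

After restarting X with the new config, the extensions should show up in `xdpyinfo`.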
Quote: “ATI has different priorities than eyecandy for fanboys. The Linux desktop market is minuscule and any responsible business has priorities.”
And I urge people not to buy ATI products, or products that use ATI cards. It’s the only way to get through to bastards like this that the Open Source community will not tolerate discrimination. When you start hurting their hip pockets, then they’ll start to listen.
Firstly: What does Linux have to do with this? This is a Gnome project exploiting an experimental X11 extension funded by RedHat; the only tie to Linux here is the money.
Secondly: Comparing the Linux family with the products OS X and Longhorn is just silly. I can name a few Linuxes that don’t ship with a GUI at all.
But most (if not all) GNU/Linux distributions use a GUI, and some use GNOME. I won’t be surprised if a few distros incorporate this eyecandy when it is officially released.
“Or maybe ATI’s drivers, which thankfully have the reputation they deserve among linux users, need work. NVidia drivers have been great in Linux for a long time and I won’t buy an ATI card until they step it up.”
You may want to take a tour of the Nvidia user forums before you break out the pom-poms.
KDE has far more eye candy than default OS X. Eye candy is not synonymous with 3D acceleration (though KDE does have a buggy compositor now). Go look on kde-look.org and notice all the crappy unusable stuff that looks good.
Is there any trick involved to get xcompmgr working in Fedora Core 3? I see the files “/usr/X11R6/lib/libXdamage.*”, yet xcompmgr insists “No damage extension”. Composite gets displayed in the “xdpyinfo” output, but not DAMAGE and XFIXES.
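In case it helps anyone debugging the same thing: the server has to export Composite, DAMAGE, and XFIXES (and ideally RENDER) for xcompmgr to run, regardless of which client libraries are installed. A self-contained sketch of the check, using canned output so it runs without a display; on a live box substitute the real `xdpyinfo` output:

```shell
# Stand-in for `xdpyinfo` extension output; on a real system use:
#   xdpyinfo_output=$(xdpyinfo)
xdpyinfo_output='Composite
DAMAGE
XFIXES
RENDER'

missing=""
for ext in Composite DAMAGE XFIXES RENDER; do
  printf '%s\n' "$xdpyinfo_output" | grep -qx "$ext" || missing="$missing $ext"
done
echo "missing extensions:${missing:- none}"
```

If an extension shows up as missing, the server was built or configured without it; having libXdamage.* on disk doesn’t change what the server advertises.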
The final open source stack will look something like this:
1) framebuffer/DRI device drivers – both will be required
2) mesa with EGL. this is being designed right now.
3) XGL, the X server based on OpenGL.
4) Luminocity or some other window manager.
5) Cairo running on glitz
Nvidia/ATI will have their own versions of 1/2.
Right now XGL is an X server running inside another X server. You are only seeing technology demos. No one has a standalone OpenGL stack working to run XGL on yet.
…makes me wish I hadn’t sold my GeForce FX 5200. I didn’t think I’d need it since I don’t do 3D gaming anymore, and now I’m on a Radeon 7000 VE. The Radeon is fine for regular desktop stuff, but I recently tried enabling composite shadows in GNOME and it was too slow to be usable. It looked great though! I guess I’ll be saving up for another GeForce FX, maybe a 5500 this time.
I agree wholeheartedly about ditching ATI. I tried installing Gentoo on my laptop (Radeon Mobility 9000), and let me just say that getting OpenGL to run was a pain, to say the least! Even after I got it running, after hours of config editing, glxgears only ran at a few hundred frames per second.
Yesterday I installed Gentoo on my desktop (Nvidia GeForce2 Ultra) and the only problem was a compile error emerging nvidia-kernel. I simply attempted to emerge the latest masked version and all of my troubles went away.
My next video card I buy will definitely be an Nvidia, even if ATI does attempt to solve their driver problem.
If you don’t like hours of messing with a driver: don’t run Linux, and for heaven’s sake, don’t run Gentoo. It’s kiddies with this mentality that make people label sects of a community as zealots. It was originally made for hackers to have fun, and sticking a pretty GUI on top of that doesn’t really change that fact. Deep down inside, Linux is ugly like some fat geek’s buttocks. Grandma still isn’t supposed to install her leenucks and get on AOL.
All of that said, by all means have some fun and share it with folks! It’s useful, it can be stable, and it’s just plain awesome. Why complain?
The bottom line: it’s wrong to expect any of it to work. You’d be lucky if ATi donated an hour of their time by paying a 16-year-old dork minimum wage to make your driver. We’re obviously quite a bit luckier than that, given the support that FOSS has.
Eye candy is cool and all but it’s just window dressing as far as I’m concerned. You know what would really be useful? Hardware accelerated OpenGL without X. I maintain software that does parallel rendering on linux clusters and lemme tell you something: dealing with X on a cluster is a pain. I’m actually a bit shocked that this doesn’t exist.
Why are you being so harsh? Just because I am using Gentoo doesn’t mean I want to spend hours getting something to work that shouldn’t take that long. It’s not like the hardware is some obscure piece that only 5 people in the world have.
Ok, so the goal of Gentoo is to have fun. Replace “Gentoo” with “Fedora” and will you have the same response? Drivers are drivers no matter what distro it is. I do have fun making my system run smoothly and to my liking, but installing video drivers shouldn’t take that amount of time, especially when nVidia proves how simple it can be.
And why exactly is it wrong to expect it to work? I paid them for their video card, so I expect it to work. I won’t complain if they give closed drivers. I won’t complain if features like TV-Out don’t fully work. I just want the video card to do its primary task.
If nVidia can make it happen, why can’t ATI? All that I am saying is that I won’t be supporting ATI because of their driver history.
I witnessed EVAS’s power with my very own eyes as Rasterman demoed it at one of the local Linux Expos a few years back. It was the freaking coolest thing I ever saw running on X11.
Actually I would expect more problems in Fedora, but that’s just me. It’s still out of the goodness of their hearts to provide any support. They do make sure the card you buy works in the environment they intended it for, be it OS X or Windows.
Personally, I find it more than satisfying to see the Xorg folks coming up with what they have, given ATi’s or NVIDIA’s support. In good style, it doesn’t surprise me that they’ve come up with the eyecandy they’ve got running off of such a simple machine. (What was it, an Intel box somebody quoted the demos coming off of?) You’re running an OS that isn’t guaranteed, that isn’t really supported.
I would expect my SuSE desktop to run to the specs that SuSE promises, never more than that. I would expect to spend a few hours tinkering with any given item in Gentoo. The Gentoo devs themselves say you shouldn’t use it if you expect things to just work in a production environment.
I’m only harsh because this mentality that Linux is a product and it should work in ____ ways taking ____ amount of time to set up seems to be spreading like wildfire. It’s sick.
I agree that eyecandy can be a good thing, especially when it is not heavily dependent on the power of the video card (as is evident from the i810). However, I am curious to know how much of a burden this is on the system processor and memory. As long as the increase is not outrageous, this is A Good Thing.
Thanks Eugenia. Following the instructions, I was able to enable Composite. However, regarding the step “In the Section “Module” make sure GLcore, dri, glx and xrender are loaded”: you don’t need to load both GLcore and dri if you already have glx.
The test is successful on Fedora Core 3.
@Dennis J. (IP: —.dip.t-dialin.net) –
You forgot to exit X in order to enable Composite. Switch to console mode with Ctrl+Alt+F1, log in as root (or a user with sudo enabled), then do init 3 followed by init 5. Don’t forget to log out of console mode. Hope it helps.
It looks like the OS X environment. It appears to run smoothly without a performance hit. I can’t wait to see what future distros will unleash. Hopefully there will be an option to enable these effects, as Composite appears to be stable.
What about Intel video cards? I’ve experienced the bad side of both NVIDIA and ATI, but if the eye candy can be had with onboard Intel video chipsets, where are the expansion cards? How well do Intel’s X11 drivers support GL?
1) They are only catching up to Mac OS X because Longhorn is not released.
2) Why do Linux users always say that eye candy effects are useless and for “noobs” when they don’t have them, and why are they working on an X server that allows those effects?
Many are using lightweight WMs just because they are more productive with them. When I’m using OS X I disable most of the graphic effects. Some people like FEATURES, not eyecandy (e.g. I can’t stand Metacity not being able to maximize windows in just one direction, plus other features I may, perhaps wrongly, consider necessary to call an application a good WM).
EVAS, RENDER, etc. all perform wrong arithmetic on colors. They all assume RGB values are linear brightness, so they add, divide, etc. directly on gamma-precorrected values. It is fast but wrong. You must do the gamma conversion, do what you need (mul/div/add…), and apply the inverse gamma correction just before writing to the frame buffer.
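To make the point concrete, here’s a minimal sketch (my own illustration, not code from EVAS or RENDER), using the rough power-law gamma of 2.2 rather than the exact piecewise sRGB curve:

```shell
# A 50/50 blend of black (0.0) and white (1.0), done two ways.
# "naive" does arithmetic straight on the gamma-encoded values, as the
# libraries above do; "correct" converts to linear light, blends, and
# re-encodes just before output.
awk 'BEGIN {
  g = 2.2; a = 0.0; b = 1.0; t = 0.5
  naive   = a * (1 - t) + b * t              # straight sRGB arithmetic
  linear  = (a ^ g) * (1 - t) + (b ^ g) * t  # blend in linear light
  correct = linear ^ (1 / g)                 # re-encode for the framebuffer
  printf "naive=%.3f correct=%.3f\n", naive, correct
}'
```

The naive blend gives 0.5 while the linear-light blend gives roughly 0.73, which is why naively blended midtones come out too dark.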
“This is incredible progress compared to what I have ever seen with X. I can’t make use of it (Mac user and I’ve no plans to switch), but I can appreciate it.”
Who says? I have no doubt that GL-enhanced XOrg bridged to QX (as the current XFree and XOrg releases are bridged to Quartz) will be available once it becomes ‘mainstream’, thus giving you similar painting performance for XWindows applications as for native Mac OS X ones (provided you run it within OS X).
“Smooth graphics rendering can really make a difference in users’ perception of the speed of a computer.”
Agreed. As it does the user ‘experience’ of said computer.
“This version of X.org and this window manager are available for anyone now, legally. Seth has step-by-step instructions on how to download and use them. So how does that count as ‘catching up’ to a product which we can’t legally use for another, at least, year?”
Er, a few developers and skilled UNIX alphaware compilers being able to use an alpha, unintegrated demo of a composite manager does not count as “anyone can use it”.
Tiger will retail next month, and Panther already does more of this, stably and more integrated.
Even when Longhorn gets out (2006? 2007?) this still won’t be readily available for linux desktops.
This version of X.org and this window manager are available for anyone now, legally. Seth has step-by-step instructions on how to download and use them. So how does that count as ‘catching up’ to a product which we can’t legally use for another, at least, year?
Looks really good.
The wobble is just a demo of what’s possible, because no one could use that effect and keep their sanity. But the workspace switch effect was nice, and all in all it looked semi-smooth even to move windows while GIMP was loading.
Longhorn had that for more than a year (even for their beta testers only) and Mac OS X has this for 2-3 years now. Market-wise, XOrg catches up to Mac OS X only. But technology-wise, XOrg is third.
If we’re going to be pedantic, EVAS has had an OpenGL-accelerated canvas since 2001…
Considering that Xorg with Composite/Xcompmgr is buggy as hell (and I can’t imagine how buggy the new GL server + Cairo is), yes, by the time Longhorn is out maybe this stuff will be usable.
Linux will always be the underdog.
And everyone likes to favor the underdogs… That’s what people do.
“Considering that Xorg with Composite/Xcompmgr is buggy as hell”
I’ve been running it full time on my system and it works great!
Nvidia 5600 256MB / Driver 1.0-7167
Xorg 6.8.2
xcompmgr v1.1.1
Gnome 2.10.0
xcompmgr -cCfF -r7 -o.65 -l-10 -t-8 -D7 &
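For anyone wondering what that incantation does, here’s my reading of the flags (per xcompmgr’s usage text; double-check against your version, as options have shifted between releases):

```shell
# -c     client-side soft shadows
# -C     but no shadows on dock/panel windows
# -f -F  fade windows on open/close and on opacity change
# -r7    shadow blur radius (pixels)
# -o.65  shadow opacity
# -l-10  shadow left offset
# -t-8   shadow top offset
# -D7    time between fade steps (ms)
xcompmgr -cCfF -r7 -o.65 -l-10 -t-8 -D7 &
```

Needs a running X server with the Composite extension enabled, so it’s not something to try from a bare console.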
I’m trying to compile XGL and waimea ATM.
Is this the guy Novell hired to work with GTK and Cairo fulltime?
Never mind, he works for Red Hat.
This is good from a technological point of view; I just hope the community is still focused on functionality. Eyecandy is eyecandy – it’s nice for the eyes and that’s all. Please don’t repeat the mistake Apple makes nowadays: eyecandy dominating functionality.
No, Seth is a Red Hat employee.
Novell hired:
David Reveman to work on Glitz/Cairo and the XGL server
Tor Lillqvist to work on Win32 GTK+ and porting Evolution
I’m glad to see OSS move in this direction as I thought that Longhorn would be out way before it. Of course, this still isn’t “out” by my definition. Sure, I could compile it and deal with bugs and issues and such, but I’m not going to. There is a big difference between good work and a final project. Now, once Ubuntu/Fedora/etc. include it as the default or good-as-default, then it will be out.
I have a friend who thinks that this will all be done by the middle of the year. I still think that it will be a year before it’s really ready for a production environment.
To all the naysayers: This is incredible progress compared to anything I have ever seen with X. I can’t make use of it (Mac user and I’ve no plans to switch), but I can appreciate it. And those saying that Longhorn will be able to do that forget that Longhorn won’t be out for over a year and is backed by one of the largest companies in the world.
Also, the “just eyecandy” viewpoint is a flawed one, as any long-term use of Mac OS X will show. Smooth graphics rendering can really make a difference in users’ perception of the speed of a computer.
Xcompmgr doesn’t even work with ATI’s binary drivers. So it seriously needs a lot of work.
Or maybe ATI’s drivers, which thankfully have the reputation they deserve among linux users, need work. NVidia drivers have been great in Linux for a long time and I won’t buy an ATI card until they step it up.
I agree that xcompmgr needs work (it won’t work with Xinerama for me…), but I would suggest avoiding ATI cards… their Linux drivers are horrible. Even their Windows drivers leave something to be desired.
I use XOrg’s r200 3D drivers (not ATI’s) and the effect is the same: xcompmgr just doesn’t work correctly. It’s incredibly slow and buggy. And it’s been like that for over 8 months…
Xcompmgr does work with ATI binary drivers, it’s just that ATI has no Render acceleration so it’s painfully slow and buggy to boot.
Hmmm… It’s very snappy with all the bells and whistles turned on with my Nvidia 5600, P4 2.4… My complaint is that I can’t get it to work in Xinerama… which caused me to abandon it.
I wouldn’t think it would work well with ATI cards, though…period. The Linux support is nonexistent.
Cairo rendering/smoothness is damn impressive!
That’s not X’s fault, that’s ATI’s. NVIDIA’s drivers provide acceleration for the RENDER extension, so everything that uses it (including XCOMPOSITE) goes faster. ATI chooses not to provide RENDER acceleration, so you don’t get the speed boost. Complain to them, not about X.
One of the xorg developers commented that GCC 4.0 will speed up software RENDER considerably – whether it’s enough to be usable without hardware driver support is another story.
> Complain to them, not about X
The r200 & radeon 3D drivers (that don’t work with xcompmgr) *are* developed by the DRI guys and *NOT* by ATi. So, I CAN complain about X, as they ship together.
>that don’t work with xcompmgr
That don’t work “properly” that is. The whole desktop becomes very slow, and the shadows look like mud.
I wonder what hardware they were running this on… Somehow I think it’s not an i486 with a Voodoo 1000. I am not trying to be facetious or anything! Was just wondering. I wish they could do real screen captures.
It’s nice anyhow!!! The WOW factor is way up in this one! Great job, Seth!
This is the only effect that requires GL hardware acceleration in Luminocity (and not even much at that, Kristian’s development machine uses an embedded Intel video card ).
Once we start getting these OpenGL powered xservers and window managers we shouldn’t have to worry about RENDER acceleration anymore. I believe these are orthogonal issues.
It’s the driver problem.
We still don’t have enough good drivers.
I know you may yell at me, but there is only one real solution: SUE these companies, or have government lawyers make these companies that only support Microsoft open up their tech specs.
I can’t wait for this to hit mainstream distros.
As someone else pointed out, he was running one of those crappy i810 integrated jobbies so everybody that is using a system bought within the past 5 years will get this goodness.
I’m especially excited about Cairo; we should be seeing that very soon right?
It looks like I’ll be moving back to Gnome sometime in the hopefully near future after just ditching my Ubuntu Hoary partition and going to Gentoo and KDE 3.4 yesterday.
I suspect we won’t see somewhat stable bleeding edge of this stuff for another 6 months though – about the time GCC 4.0 starts getting widely accepted
Things are going to get very, very fast
If you somehow start dragging 3-5 windows at a time, will there be a big performance hit?
>start dragging 3-5 windows at a time
And how do you do that exactly?
If you run Linux, do yourself a favor and get a Nvidia card with plenty of video RAM.
Very nice.
I can only imagine what’s going to happen when KDE gets a hold of this technology
And how do you do that exactly?
Dunno, maybe if you open multiple windows you’ll get the same effect.
Nvidia already has great drivers. ATI is the problem. They don’t offer RENDER acceleration. These problems go away with OpenGL accelerated Xservers, window managers, glitz, etc..
I can’t really believe ATI’s tactics. All they get by not paying attention to the Linux market (and all markets other than Windows) is losing customers. Nvidia has never had this kind of problem. They really care about their customers. There’s no chance I’m gonna buy another ATI card in the next decades…
xcompmgr is a testbed just like Luminocity- don’t expect it to be fast or perfect.
Having said that, KDE 3.4’s kompmgr does an amazing job making x.org pretty and enjoyable to use. All the eyecandy crap is on and it looks neat! I really like being able to see through windows; it has already proven useful.
I have one bug to complain about, and I think it’s xorg’s fault. I get a corrupted display on startup (about the left 3/4 of the 2 screens I have), but a refresh cleans it up. I had this same issue on startup when first testing xorg and xcompmgr; it seems to be a Xinerama-specific problem.
So folks, if you run KDE 3.4 you’re already there, with a USABLE WM that has all of the Composite goodies (no shaky windows, though). The rest of these are sandboxes for ideas and testing.
ATI has different priorities than eyecandy for fanboys. The Linux desktop market is minuscule, and any responsible business has priorities. This was all discussed in a recent interview over at Rage3d.com. At this point, it might not even make sense for them to work on RENDER in their drivers; they may just go ahead and improve OpenGL performance on Linux instead.
My mediocre 9600 (tweaked a bit) plays FarCry on windows at 1024×768 with lots of extra eyecandy with nice framerates. The transistors are there, just not the drivers.
Umm… none of Apple’s eye candy hurts functionality; if anything it makes the environment more pleasant for a normal (read: non-geek) user to work in, which reduces stress and enhances productivity.
eyecandy is grrrrrrrrrrrrrrreat!!!
Does anybody else get messed-up widgets with Composite on? I stopped using Composite because most things would mess up as soon as I scrolled upwards.
“Considering that Xorg with Composite/Xcompmgr is buggy as hell (and I can’t imagine how buggy is the new GL server + Cairo), yes, by the time Longhorn will be out maybe this stuff will be usable.”
Yes, Longhorn has no bugs and its development is not late…
I prefer to wait for a eye-candy free system than buy an expensive, closed, DRM-full, bloated, patented, proprietary and buggy non-operating system like Longhorn.
I’m looking to make a purchase of a dual amd64 system and my institution requires me to order only from the ‘big guys’ (right now this means IBM & Sun). Anyone using the quadro FX 3000, 4000, or 4400 under linux or any problems getting xorg running on say the sun w2100z?
If you use Metacity: Metacity must be compiled without its own RENDER support and composite manager.
After that, xcompmgr starts working as it should.
I’m not sure if I’d have a use for flashy effects like that, but the videos are sure impressive.
Most importantly, it works with a puny video card like the Intel 810, as it should. All these fancy desktop effects are nothing to any post-Geforce video card.
This is why I bitched so long about the full Longhorn interface probably requiring DirectX9 capable cards. Here we see (also in OS X of course) how it can be done with far less hardware.
Currently waimea uses cairo to draw its features, and its configuration is XML. The person working on cairo is also the author of waimea; once cairo becomes more stabilized he will turn his attention back to waimea. That said, he keeps waimea up to date with cairo’s calls.
“This is why I bitched so long about the full Longhorn interface probably requiring DirectX9 capable cards. Here we see (also in OS X of course) how it can be done with far less hardware.”
That’s pretty damn myopic of you.
Implementing the smallest feature set will always use “less” in terms of resources, but this comes at the cost of scalability and future enhancements. Building an infrastructure that uses and supports all of the DirectX 9 specification allows for a lot more than just “wobbly windows” as hardware progresses. Unless you feel that wobbly windows are the pinnacle of desktop “eye-candy” and visual functionality, and no one would ever need anything that might benefit from vertex shaders?
Longhorn isn’t released yet. I can’t purchase Longhorn. Why are you comparing a not-yet-released product that may not ship with that feature (see WinFS/NT Object FS/Cairo) to something that is about to be released and will definitely have that feature?
I don’t understand, please explain.
Building an infrastructure that uses and supports all of the Direct X9 specification allows for a lot more than just “wobbly windows” as hardware progresses.
As hardware progresses? No effect that’s usable on a desktop has so far been dreamed of that uses even a fraction of the power of today’s midrange video cards.
Designing a system now, in anticipation of graphical effects which would have to be so ridiculously advanced compared to the current class of ideas, and locking out millions of people in the process, is a mistake.
Do you anticipate Longhorn having any effects requiring vertex shaders? So far the demos we’ve seen are nothing more complex than the usual “windows blowing in the wind” type of effect.
To the end user, none of this matters anyhow. They will see the effects OS X does now, on cheap video cards, and they will see the effects that Linux does (assuming this goes anywhere) with the same class of hardware, and then they will see that Longhorn does the same effects, but requires a DX 9 level card.
“I dont understand, please explain.”
Ok I’ll point it out for you.
See, Leo made a comment comparing the implementation requirements listed for luminocity and the implementation requirements listed for longhorn’s UI. Here is the comment since you might have missed it the first two times:
“This is why I bitched so long about the full Longhorn interface probably requiring DirectX9 capable cards. Here we see (also in OS X of course) how it can be done with far less hardware.”
I however felt it was necessary to point out that just because an implementation can do it with lower system requirements doesn’t mean it’s better. There is nothing showing that a DirectX 9 card is needed for performance — it’s quite possible it’s needed for the feature set.
Now, in order for me to actually reply to his post which mentions longhorn, I too have to mention longhorn. See ?
http://lists.freedesktop.org/pipermail/xorg/2004-October/003742.htm…
“They will see the effects OS X does now, on cheap video cards, and they will see the effects that Linux does (assuming this goes anywhere) with the same class of hardware, and then they will see that Longhorn does the same effects, but requires a DX 9 level card.”
You are making a lot of assumptions:
1) OS X, Linux, and Longhorn will all implement the exact same UI eye-candy features.
2) Using the DX9 feature set will not be inherently faster than using the feature set that OSX and Linux use.
Mid-range cards support DX9 currently. It’s not a matter of power — it quite possibly is a matter of features for what they intend to do.
Longhorn has 3 levels of support:
1. normal 2D acceleration: 2 MB of VRAM and above
2. advanced osx-like acceleration: 16-32 MB of VRAM and above
3. Super acceleration and support: 64 MB and above
We don’t actually know what Longhorn will require. It’s still a couple of years away. So let’s not worry too hard about it, eh?
Making all in man
make[2]: Entering directory `/home/esalazar/jhbuild/cvs/gnome2/Xext/man’
make[2]: Nothing to be done for `all’.
make[2]: Leaving directory `/home/esalazar/jhbuild/cvs/gnome2/Xext/man’
make[2]: Entering directory `/home/esalazar/jhbuild/cvs/gnome2/Xext’
if /bin/sh ./libtool –mode=compile –tag=CC gcc -DHAVE_CONFIG_H -I. -I. -I. -include config.h -D_XOPEN_SOURCE=500 @XTHREADS_CFLAGS@ -I/home/esalazar/jhbuild/build/include -g -O2 -MT libXext_la-DPMS.lo -MD -MP -MF “.deps/libXext_la-DPMS.Tpo” -c -o libXext_la-DPMS.lo `test -f ‘DPMS.c’ || echo ‘./’`DPMS.c;
then mv -f “.deps/libXext_la-DPMS.Tpo” “.deps/libXext_la-DPMS.Plo”; else rm -f “.deps/libXext_la-DPMS.Tpo”; exit 1; fi
gcc -DHAVE_CONFIG_H -I. -I. -I. -include config.h -D_XOPEN_SOURCE=500 @XTHREADS_CFLAGS@ -I/home/esalazar/jhbuild/build/include -g -O2 -MT libXext_la-DPMS.lo -MD -MP -MF .deps/libXext_la-DPMS.Tpo -c DPMS.c -fPIC -DPIC -o .libs/libXext_la-DPMS.o
gcc: @XTHREADS_CFLAGS@: No such file or directory
make[2]: *** [libXext_la-DPMS.lo] Error 1
make[2]: Leaving directory `/home/esalazar/jhbuild/cvs/gnome2/Xext’
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/home/esalazar/jhbuild/cvs/gnome2/Xext’
make: *** [all] Error 2
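The tell-tale sign in this log is the literal `@XTHREADS_CFLAGS@` reaching gcc: configure never substituted that autoconf placeholder, so gcc treats it as a filename. A hedged way to diagnose this (the sample file below is illustrative; in the real tree you would grep the generated Makefile in the Xext directory) is to scan for leftover `@VAR@` tokens and then regenerate the build system:

```shell
# Simulate a generated Makefile with an unsubstituted autoconf variable;
# in the real tree you would grep ~/jhbuild/cvs/gnome2/Xext/Makefile instead.
printf 'CFLAGS = @XTHREADS_CFLAGS@ -g -O2\n' > /tmp/Makefile.sample

# Any surviving @VAR@ token means configure/config.status must be re-run
# (e.g. ./autogen.sh && ./configure) before make will succeed.
if grep -q '@[A-Z_]*@' /tmp/Makefile.sample; then
    echo "unsubstituted autoconf variables found; rerun autogen.sh/configure"
fi
```

If jhbuild keeps hitting this, wiping the module's build directory and letting it re-run autogen from scratch is the usual cure.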
Anyone find it funny that they have the line “Beating the sh*t out of bad apps” on their site when they have such crappy xhtml?
2. advanced osx-like acceleration: 16-32 MB of VRAM and above
3. Super acceleration and support: 64 MB and above
I’d be curious to know what “osx-like acceleration” means. CoreImage and CoreVideo seem pretty hardcore to me (they also require a tad more than 32mb video cards), so if “Super acceleration” is better (presumably, since you’ve ranked it as requiring more hardware), why?
And on-topic: Neat videos, maybe one day I’ll be able to stop laughing when my KDE-using friend insists that he has as much eye-candy as my OS X desktop. Not meant as bashing KDE, less eye candy is fine, I just find it funny that people think it can dance like OS X.
1. Make sure you have Xorg 6.8.2 or 6.8.3.
2. Download this January snapshot of xcompmgr (otherwise you will need to check out the latest code using the ARCH revision control which is not very commonly installed on some distros): http://baghira.sourceforge.net/xcompmgr-2.02.tar.bz2
./configure; make; make install-strip
3. At the bottom of your /etc/X11/xorg.conf file add:
Section “DRI”
Mode 0666
EndSection
Section “Extensions”
Option “Composite” “Enable”
Option “RENDER” “Enable”
EndSection
Add this line to Section “Device” if you have an nvidia card:
Option “RenderAccel” “true”
In the Section “Module” make sure GLcore, dri, glx and xrender are loaded.
4. Make sure your Xorg supports your graphics card in an accelerated 3D mode. Select “DefaultDepth 24” on your xorg.conf file under the Section “Screen”.
5. Create a hidden empty file on your home folder called .xcompmgrrc
6. startx
7. Open a terminal and type:
xcompmgr -cCfF -r7 -o.65 -l-10 -t-8 -D7
8. Profit.
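For the curious, here is a hedged flag-by-flag breakdown of the xcompmgr invocation in step 7 (meanings taken from xcompmgr's usage text; the numeric values are just the tuning chosen above):

```shell
#   -c      client-side soft shadows
#   -C      no shadows on dock/panel windows
#   -f      fade windows in/out when opening and closing
#   -F      fade when a window's translucency changes
#   -r7     shadow blur radius, in pixels
#   -o.65   shadow opacity (0.0 - 1.0)
#   -l-10   shadow offset to the left, in pixels
#   -t-8    shadow offset from the top, in pixels
#   -D7     time between fade steps, in milliseconds
xcompmgr -cCfF -r7 -o.65 -l-10 -t-8 -D7
```

If fading feels sluggish on your card, dropping -f and -F is a reasonable first tweak.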
>I’d be curious to know what “osx-like acceleration” means.
I meant of 10.2/10.3, where the acceleration was not full.
9. Enjoy Mac OS X 10.0 and 10.1 in all its glory if your card is not an Intel or an nvidia one (aka “slow”).
Quote: “ATI has different priorities than eyecandy for fanboys. The Linux desktop market is miniscule and any responsible business has priorties.”
And I urge people not to buy ATI products, or products that use ATI cards. It’s the only way to get through to bastards like this; the Open Source community will not tolerate discrimination. When you start hurting their hip pockets, then they’ll start to listen.
Dave
Firstly: What does Linux have to do with this? This is a Gnome project exploiting an experimental X11 extension funded by Red Hat; the only tie to Linux here is the money.
Secondly: Comparing the Linux family with the products OS X and Longhorn is just silly. I can name a few Linuxes that don’t ship with a GUI at all.
But most (if not all) GNU/Linux distributions use a GUI, and some use GNOME. I won’t be surprised if a few distros incorporate this eyecandy when it is officially released.
And Solaris, and *BSD, and even OSX can run Xorg…
But yes, this is a technology that will be available to Linuxes too. I just don’t see the relevance in the xorg vs. win vs. linux thread.
“Or maybe ATI’s drivers, which thankfully have the reputation they deserve among linux users, need work. NVidia drivers have been great in Linux for a long time and I won’t buy an ATI card until they step it up.”
You may want to take a tour of the Nvidia user forums before you break out the pom-poms.
KDE has far more eye candy than default OS X. Eye candy is not synonymous with 3D acceleration (though KDE has a buggy compositor now). Go look on kde-look.org and notice all the crappy unusable stuff that looks good.
Is there any trick involved to get xcompmgr working in Fedora Core 3? I see the files “/usr/X11R6/lib/libXdamage.*” yet xcompmgr insists “No damage extension”. Composite gets displayed in the “xdpyinfo” output but not DAMAGES and XFIXES.
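For what it's worth: having libXdamage on disk only means the client library is installed; xcompmgr needs the running X server itself to export the DAMAGE and XFIXES extensions, which requires a recent Xorg with Composite enabled in xorg.conf. A sketch of the check (simulated here against a captured extension list; in a live session you would pipe `xdpyinfo` directly):

```shell
# Captured extension list standing in for `xdpyinfo` output on a live server:
cat > /tmp/xdpyinfo.txt <<'EOF'
    Composite
    DAMAGE
    RENDER
    XFIXES
EOF

# On a real system:  xdpyinfo | grep -cE 'DAMAGE|XFIXES'
# xcompmgr needs both; a count below 2 means the server lacks them.
grep -cE 'DAMAGE|XFIXES' /tmp/xdpyinfo.txt
# prints: 2
```

If the count comes up short on Fedora Core 3, the usual suspects are an older X server or a missing Composite section in xorg.conf, not the libraries.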
From those videos it looks quite speedy. I wouldn’t mind having that on my desktop.
The final open source stack will look something like this:
1) framebuffer/DRI device drivers – both will be required
2) Mesa with EGL – this is being designed right now
3) XGL, the X server based on OpenGL
4) Luminocity or some other window manager
5) Cairo running on glitz
Nvidia/ATI will have their own versions of 1 and 2.
Right now XGL is an X server running inside another X server. You are only seeing technology demos. No one has a standalone OpenGL stack working to run XGL on yet.
…makes me wish I hadn’t sold my GeForce FX 5200. I didn’t think I’d need it since I don’t do 3D gaming anymore, and now I’m on a Radeon 7000 VE. The Radeon is fine for regular desktop stuff, but I recently tried enabling composite shadows in GNOME and it was too slow to be usable. It looked great though! I guess I’ll be saving up for another GeForce FX, maybe a 5500 this time.
I agree wholeheartedly about ditching ATI. I tried installing Gentoo on my laptop (Radeon Mobility 9000) and let me just say that getting opengl to run was a pain to say the least! Even after I got it to run after hours of config editing, glxgears only ran at a few hundred frames per second.
Yesterday I installed Gentoo on my desktop (Nvidia GeForce2 Ultra) and the only problem was a compile error emerging nvidia-kernel. I simply attempted to emerge the latest masked version and all of my troubles went away.
My next video card I buy will definitely be an Nvidia, even if ATI does attempt to solve their driver problem.
If you don’t like hours of messing with a driver: don’t run Linux, and for heaven’s sake, don’t run Gentoo. It’s kiddies with this mentality that make people label sects of a community as zealots. It was originally made for hackers to have fun, and sticking a pretty GUI on top of that doesn’t really change that fact. Deep down inside, Linux is ugly like some fat geek’s buttocks. Grandma still isn’t supposed to install her leenucks and get on AOL.
All of that said, by all means have some fun and share it with folks! It’s useful, it can be stable, and it’s just plain awesome. Why complain?
The bottom line: it’s wrong to expect any of it to work. You’d be lucky if ATi donated an hour of their time by paying a 16 year old dork minimum wage to make your driver. We’re obviously quite a bit more lucky than that given the support that FOSS has.
Eye candy is cool and all but it’s just window dressing as far as I’m concerned. You know what would really be useful? Hardware accelerated OpenGL without X. I maintain software that does parallel rendering on linux clusters and lemme tell you something: dealing with X on a cluster is a pain. I’m actually a bit shocked that this doesn’t exist.
Why are you being so harsh? Just because I am using Gentoo doesn’t mean I want to spend hours getting something to work that shouldn’t take that long. It’s not like the hardware is some obscure piece that only 5 people in the world have.
Ok, so the goal of Gentoo is to have fun. Replace “Gentoo” with “Fedora” and will you have the same response? Drivers are drivers no matter what distro it is. I do have fun making my system run smoothly and to my liking, but taking that amount of time to install video drivers shouldn’t be that difficult, especially when nVidia can prove how simple it should be.
And why exactly is it wrong to expect it to work? I paid them for their video card, so I expect it to work. I won’t complain if they give closed drivers. I won’t complain if features like TV-Out don’t fully work. I just want the video card to do its primary task.
If nVidia can make it happen, why can’t ATI? All that I am saying is that I won’t be supporting ATI because of their driver history.
Rayiner Hashem is right, EVAS had this years ago.
I witnessed EVAS’s power with my very own eyes as Rasterman demoed it at one of the local Linux Expos a few years back. It was the freaking coolest thing I ever saw running on X11.
Too bad more don’t use it.
In response to my own post about using graphics hardware without X: The wheels are already turning. Thank Jeebus.
Mid-range cards support DX9 currently. Its not a matter of power — it quite possibly is a matter of features for what they intend to do.
I believe that one of the needed features is the ability to draw bezier curves. I think DX9 cards can do that, which DX8 cards cannot.
Actually I would expect more problems in Fedora, but that’s just me. It’s still out of the goodness of their hearts that they provide any support. They do make sure the card you buy works in the environment they intended it to be released for, be it OS X or Windows.
Personally, I find it more than satisfying to see the Xorg folks coming up with what they have, given ATi’s and NVIDIA’s support. In good style, it doesn’t surprise me that they’ve come up with the eyecandy they’ve got running on such a simple machine. (What was it, an Intel box somebody quoted the demos coming off of?) You’re running an OS that isn’t guaranteed, that isn’t really supported.
I would expect my SuSE desktop to run to the specs that SuSE promises, never more than that. I would expect to spend a few hours tinkering with any given item in Gentoo. The Gentoo devs themselves say you shouldn’t use it if you expect things to just work in a production environment.
I’m only harsh because this mentality that Linux is a product and should work in ____ ways taking ____ amount of time to set up seems to be spreading like wildfire. It’s sick.
I agree that eyecandy can be a good thing, especially when it is not heavily dependent on the power of the video card (as is evident from the i810). However, I am curious to know how much of a burden this is on the system processor and memory. As long as the increase is not outrageous, this is A Good Thing.
Thanks Eugenia. Following the instructions, I was able to enable Composite. However, regarding this step:
“In the Section “Module” make sure GLcore, dri, glx and xrender are loaded” – you don’t need to load both GLcore and dri if you already have glx.
The test was successful on Fedora Core 3.
@Dennis J. (IP: —.dip.t-dialin.net) –
You forgot to exit X in order to enable Composite. Switch to console mode with Ctrl+Alt+F1, log in as root or as a user with sudo enabled, then do init 3 followed by init 5. Don’t forget to log out of console mode. Hope it helps.
It looks like the OS X environment. It appears to run smoothly, without a performance hit. I can’t wait to see what future distros will unleash. Hopefully there will be an option to enable these effects, as Composite appears to be stable.
mesa-solo will allow for full OpenGL acceleration without X, I believe.
What about Intel video cards? I’ve experienced the bad side of both NVIDIA and ATI, but if the eye candy can be had with onboard Intel video chipsets, where are the expansion cards? How well do Intel X11 drivers support GL?
If you read the article you would know that those videos were run on an intel chipset
I believe that one of the needed features is the ability to draw bezier curves. I think DX9 cards can do that, which DX8 cards cannot.
Really? Seems odd. Bezier curves are not complex or computationally intensive to draw. Got some more info on this?
Rapsey> If you read the article you would know that those videos were run on an intel chipset
Yes, that is why I was asking: “where are the expansion cards? How well do Intel X11 drivers support GL?”
1) They are only catching up to Mac OS X because Longhorn is not released.
2) Why do Linux users always say that eye candy effects are useless and for “noobs” when they don’t have them, and why are they working on an X server that allows those effects?
Instead of CEOs defending their shares online, now we actually see some results. Those videos are nonetheless quite impressive, to say the least.
Many are using lightweight WMs just because they are more productive with them. When I’m using OS X I disable most of the graphic effects. Some people like FEATURES, not eyecandy (e.g. I can’t stand Metacity not being able to maximize windows in just one direction, among other features I may wrongly consider necessary to call an application a good WM).
I think they just shook the camera.
Real hard.
😉
EVAS, RENDER, etc. all perform wrong arithmetic on colors. They all assume RGB values are linear brightness, so they add, divide, etc. directly on gamma-precorrected values. It is fast, but wrong. You must do a gamma conversion, do what you need (mul/div/add …), and apply the inverse gamma correction just before writing to the framebuffer.
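To make that concrete, here is a small sketch (gamma of 2.2 assumed; the pixel values are hypothetical) of how averaging two pixels on the stored values differs from doing the math in linear light:

```shell
# Average a black (0) and a white (255) pixel, gamma 2.2.
# The naive path averages the stored (gamma-encoded) values directly;
# the correct path linearizes first, averages, then re-applies the gamma.
awk 'BEGIN {
    a = 0; b = 255; g = 2.2
    naive = (a + b) / 2                    # wrong: math on encoded values
    lin   = ((a/255)^g + (b/255)^g) / 2    # average in linear light
    ok    = (lin^(1/g)) * 255 + 0.5        # re-encode and round
    printf "naive: %d  gamma-correct: %d\n", naive, ok
}'
# prints: naive: 127  gamma-correct: 186
```

The perceptual difference (middle gray vs. a much lighter gray) is exactly the kind of error the post describes in fast-path compositors.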
There was someone mentioning evas …
I remember that article about evas, and the video looked really cool.
Wouldn’t it be possible to use evas instead of cairo? What are the differences between the two?
(This is a question; it’s not meant to start a cairo vs. evas flamewar.)
lu_zero,
Metacity can maximize windows in one direction.
From: Steve Block (desm.qwest.net)
This is incredible progress compared to what I have ever seen with X. I can’t make use of it (Mac user, and I’ve no plans to switch), but I can appreciate it.
Who says? I have no doubt that GL-enhanced XOrg bridged to QX (as the current XFree and XOrg releases are bridged to Quartz) will be available once it becomes ‘mainstream’, thus giving you similar painting performance for XWindows applications as for native Mac OS X ones (provided you run it within OS X).
Smooth graphics rendering can really make a difference in users’ perception of the speed of a computer.
Agreed. As it does the user ‘experience’ of said computer.
“When I’m using osx I disable most of the graphic effects.”
When I am using OSX I disable all the graphics and switch to the shell. lol
This version of X.org and this window manager are available for anyone now, legally. Seth has step-by-step instructions on how to download and use them. So how does that count as ‘catching up’ to a product which we can’t legally use for another, at least, year?
Er, a few developers and skilled UNIX users being able to compile an alpha-quality, non-integrated demo of a composite manager does not count as “anyone can use it”.
Tiger will retail next month, and Panther already does more of this, more stably and better integrated.
Even when Longhorn gets out (2006? 2007?) this still won’t be readily available for linux desktops.