“We are pleased to announce the availability of the first full release candidate for the upcoming X.Org Foundation release of X11R6.9 and X11R7. This release marks the completion of the development cycle for the modular source tree. We have tagged both the monolithic and modular trees and have prepared tarballs for you to test. We have updated the schedule on the release plan page.”
Any significant changes compared to Xorg 6.8.2?
What about the EXA feature, is it included?
http://wiki.x.org/wiki/ChangesSince68
EXA’s on the list…
Does this mean I can finally have hardware accelerated composite with my Radeon card?
If so, I can’t wait to test it.
no, it will just crash prettier
Our goal is to have releases every 6 months
I really like that. Just like gnome.
X.org will maybe have decent composite (translucency/alpha). X.org is progressing at turtle speeds.
Why are Red Hat, IBM, Novell and others not adding more hackers to this project? It’s a pretty important piece of *NIX.
I’d imagine that the KDE/GNOME projects are getting much more support from these companies.
“X.org will maybe have decent composite (translucency/alpha). X.org is progressing at turtle speeds. Why are Red Hat, IBM, Novell and others not adding more hackers to this project? It’s a pretty important piece of *NIX.”
Actually, the problem is that modularization required more time than expected (IIRC the RC1 was originally scheduled for the end of August/September), so, with the exception of EXA, there were no new features in 4 months. Now that modularization is done, hopefully the existing developers will focus on innovation again, and, more importantly, the entry barrier will be lower. For example, working on a single driver will not require downloading 300MB of sources from CVS nor rebuilding the whole of X.
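To make the point concrete, here is a rough sketch of what that per-driver workflow looks like, assuming each driver becomes a small autotools-style package. The module name, CVS path, and install prefix below are illustrative guesses, not the actual X.Org layout; the commands are printed rather than executed.

```shell
set -e
MODULE=xf86-video-mach64   # hypothetical per-driver package name
PREFIX=/usr/X11R7          # hypothetical install prefix

# Each driver is a small self-contained package, so instead of checking
# out ~300MB of monolithic tree and rebuilding all of X, the steps
# reduce to roughly:
echo "cvs -d :pserver:anoncvs@cvs.freedesktop.org:/cvs/xorg co driver/$MODULE"
echo "cd $MODULE && ./autogen.sh --prefix=$PREFIX"
echo "make && make install   # builds just this one driver"
```

The key difference from the monolithic tree is that `configure` only checks the dependencies of the one module, and `make` only compiles its handful of source files.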
First off, Red Hat, IBM, and Novell are oriented toward providing Linux server solutions. The current X is more than sufficient for that purpose.
X.org is moving at perhaps 10 times the speed that XFree86’s X implementation was moving before the fork.
How long has Vista been in development? Paul Thurrott’s first article on Longhorn development was over three years ago. X.org 6.7.0 shipped on April 4th of last year.
Alpha visuals already have strong support in the X server. Why isn’t it all over your desktop now? Various reasons. Toolkit devs haven’t exploited the potential of composite and alpha-aware visuals. Application authors haven’t either. And most importantly, compositing operations require drivers that, on the whole, the OSS community doesn’t have the information to write. MS does. You can’t blame that on Red Hat, IBM, or Novell. That’s politics and nothing else.
X.org development is only getting faster. Modular builds will make the barrier for new devs smaller. Serious thought is being given to the core driver acceleration architecture.
In just over a year, X.org has half as many stories in OSNews archives as XFree86 has in all of OSNews history. The evidence says that X.org is moving faster than ever before and that X is really innovating for the first time in a long time.
It may not be moving fast enough for your tastes, but it certainly isn’t moving slow.
*sigh*. What do you expect, them to jump out and write nvidia’s drivers and ati’s drivers instantly without docs?
The compositors available are already quite good, and if you’ve ever looked into it: incredibly simple. I believe a year ago the XComposite program was about 600 lines. Now most of the code is sitting in the X server, waiting for you to take advantage of it.
A lot of the problems still lie in driver instability.
And X.org has made leaps and bounds… They just announced the first RC of R7, which will be modular; i.e., it will make room for more devs (because there’s less work in getting to know the code you’re gonna work on) and it will make it easier to distribute their drivers.
They’ve laid the foundation; now we’ll see how they go about building neat features on that foundation.
You’ll also be interested to know that the “legacy” code in Vista (all GDI programs, i.e. every Windows program existing today) will still have its frames rendered on the CPU in main memory and then transferred to VRAM for compositing.
That means those fancy and amazing, not to mention efficient, graphics will only exist for new code. The old code will just get composited (which is still a performance boost).
6 month product cycle for this project is the bomb.
Especially considering that they integrate the DRI OSS Radeon drivers into it.
With the new modularity hopefully it’ll be made easy to install newer drivers.
I also like the fact that(even though they’re experimental) there are OSS drivers for R300 class cards integrated. As fast as this FireGL 8800 is, it’d be nice to upgrade.
I hate product cycles more than 1 month lol. I WANT IT NOW! The changes look good so far…
Where can I find what the changes were for the NV driver? It’s one thing I dislike… NVIDIA and its stupid license complicating installation for n00bies.
All the details on all the changes can be found in the
ChangeLog at http://cvs.freedesktop.org/xorg/xc/ChangeLog?rev=1.1448&view=markup
Is it just me, or do the modular builds take a lot longer than the monolithic ones?
“Is it just me, or do the modular builds take a lot longer than the monolithic ones?”
I assume that having many more build targets with all the associated overhead will slow down the build process.
However, now, if I want to get DRI support for my MACH64 based laptop, I won’t have to download the full X source tree and build it*.
(Let alone the fact that X uses a -very- weird configuration and build environment)
Gilboa.
* Actually, I use the binary snapshots which work just fine on my Slackware. However, someone else had to download 300MB of code, enable DRI and rebuild the damn thing…
“Is it just me, or do the modular builds take a lot longer than the monolithic ones?”
No, it’s not just you. It shouldn’t take a whole lot longer to build, but it will add some time.
Right now I’m working on a tool that generates information about a build environment from the Makefiles. Essentially, I hijack the native compilers with my own script that collects information about the compilation that would have taken place. This fake compiler looks and feels like a real one from the perspective of the build system, but doesn’t even open the source files for reading, much less compile them.
I run this over a codebase that usually takes around 5 days to build, and it takes about 6 hours to run. Almost all of this time represents the overhead of the modular build environment. In other words, I would approximate that the modular tree adds about 5% to the build time of Xorg.
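For anyone curious, a minimal sketch of that “fake compiler” trick might look like the following. This is my own illustration of the idea described above, not the poster’s actual tool; the function name and log path are made up.

```shell
LOG=${LOG:-./buildtrace.log}

# Stand-in for cc: record the invocation and satisfy the build system
# without ever opening a source file.
fake_cc() {
    # Log what would have been compiled
    echo "cc $*" >> "$LOG"
    # Honour -o so the Makefile's dependency checks see an output file
    prev=""
    for arg in "$@"; do
        [ "$prev" = "-o" ] && : > "$arg"   # create an empty output file
        prev="$arg"
    done
    return 0
}

# A Makefile pointed at this wrapper (e.g. CC=fake_cc) would run
# something like this for every source file:
: > "$LOG"
fake_cc -O2 -c foo.c -o foo.o
```

Because the wrapper does no real work, total wall-clock time ends up measuring almost nothing but the build system’s own overhead, which is the point.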
But, on the other hand, most Xorg developers would argue that the modular tree makes it more than 5% easier to develop for Xorg. Therefore, there should be more than 5% more improvements per unit time than there were with the monolithic tree. So, I’d say that the modular tree is a positive development for the end user, even if it makes the build a little longer.
“But, on the other hand, most Xorg developers would argue that the modular tree makes it more than 5% easier to develop for Xorg. Therefore, there should be more than 5% more improvements per unit time than there were with the monolithic tree. So, I’d say that the modular tree is a positive development for the end user, even if it makes the build a little longer.”
I’m not about to argue that modularization is bad for development, and it is something I’ve been looking forward to. I’ve not done any sort of real testing, but it seems to take a fair bit longer than your estimate suggests, IME.
“Alpha visuals already have strong support in the X server. Why isn’t it all over your desktop now? Various reasons. Toolkit devs haven’t exploited the potential of composite and alpha-aware visuals. Application authors haven’t either. And most importantly, compositing operations require drivers that, on the whole, the OSS community doesn’t have the information to write. MS does. You can’t blame that on Red Hat, IBM, or Novell. That’s politics and nothing else.”
That’s why I bought an Nvidia card just to use xcompmgr. Only Nvidia and ATI have the resources to make good drivers, and I’m sick of waiting for OSS drivers to catch up to have a modern Linux desktop.
It’s not a matter of resources or catchup. Only Nvidia and ATI have the specs for their cards. If they gave these out then good OSS drivers would exist.
I’ll go out on a limb here and declare that X11R7 will be the “shot heard round the world” for the graphics chipset vendors providing top-quality driver support. Why?
1) Modular tree enables graphics vendors to, for the first time, ship a graphics driver independent of the release of the X server, using the X server’s driver framework. It also facilitates the development and testing of graphics drivers.
2) EXA is a new attempt to make the acceleration API much clearer than it was before. No longer will graphics vendors need to hunt through MBs of source to find out how to implement acceleration for X.
3) The changing of the guard, so to speak, becomes complete with X11R7. Xorg has asserted its leadership and stewardship of the general purpose X server, producing the first milestone release in over 10 years. Graphics vendors will have confidence developing for X, just as they have confidence developing for WVDDM.
4) The X userbase is expanding, from the workstation and development profile to the desktop profile. People are demanding that they have great OpenGL performance and smooth eye candy on X. They want to play games and see through windows. Three years ago these customers did not exist in any significant numbers.
5) The mass media and public perceptions are changing. Computer technology companies are realizing that being billed as a Windows-centric shop is potentially damaging to their marketability. The OSS VC/IPO bubble is coming to a market near you. Maybe 90% of the startups will fail, but the nature of OSS dictates that these firms will give back to the world more than just used office furniture. Yesterday cross-platform support was a moralism, tomorrow it’s a buzzword.
So, in short, I think this is a good thing.
It’s been a while since I read such an insightful comment on OSNews.
Maybe it’s the backlash from the US vs. UN internet thread. 😉
In any case, thanks; it makes me think this site has hope after all.
As of this moment, the next-generation Xgl and Xegl projects are virtually dead. For the past couple of years, only two dedicated developers have been spending their time on these projects — one of whom recently quit because of lack of interest from other developers.
Every time I launch X.org on my top-of-the-line system, with a 3GHz P4 processor, 2GB of RAM, and an Intel integrated graphics card (that produces absolutely smooth, flowing performance on Windows), I can see why Linux is not further down the line. The GNU command-line suite is excellent, as is the Linux kernel, system performance is fabulous, but the last missing piece is the GUI.
Why are we continuing to rehash and rehack the twenty-year-old sludge of subfunctionality known as X11? More to the point: why do I need to wait half of a second for Firefox to resize? Why do I need to wait SECONDS at a time for Eclipse to redraw itself when moving around the screen? Why am I reminded of the old Windows Solitaire “exploding card” screen when moving something as simple as a terminal?
When Vista is released, I will gladly spend money on an operating system that can squeeze every last bit of functionality and performance out of my top-notch hardware. I can stand a molasses-slow Office suite for the sake of openness, and I can accept a half-assed media player (which I only recently have been able to get playing), but at least grant me as much as a decent GUI system.
Just having the composite extension enabled really helps with all of your complaints. However, I think this only applies to nVidia cards at the moment, unfortunately.
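For anyone who wants to try it: on 6.8-era servers the extension is typically switched on in xorg.conf with something like the fragment below (check your driver’s documentation, since support and stability vary a lot by card):

```
Section "Extensions"
    Option "Composite" "Enable"
EndSection
```

After restarting X, a compositing manager such as xcompmgr actually turns the eye candy on.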
Is EXA the thing that was supposed to help with this problem?
You forgot the ever popular “my CPU usage spikes to 90% when I move a window with Metacity”.
Vista is not going to make your problem any better, since you’re going to need a hefty graphics card. It took Microsoft a very long time to implement the DX stuff in Vista (3 years?).
Why don’t you complain to Intel, since they are the ones at fault here?
“Every time I launch X.org on my top-of-the-line system, with a 3GHz P4 processor, 2GB of RAM”
At the same time you said:
“When Vista is released, I will gladly spend money on an operating system that can squeeze every last bit of functionality and performance out of my top-notch hardware.”
That 3GHz with 2GB of RAM is the *MINIMUM* needed for just the OS. Just the OS.
While my grandma’s 1GHz Duron w/ 384MB RAM still works fine.
No matter how fast AMD and intel make their processors microsoft will find a way to slow it all back down again.
“No matter how fast AMD and intel make their processors microsoft will find a way to slow it all back down again.”
To be fair you could have said:
No matter how fast AMD and intel make their processors microsoft will find a way to utilize it.
It’s not like a modern CPU has anything else to do…
I could’ve been fair and said that, yes.
But in the context of what I was replying to, my statement was accurate.
It’s no secret that MS is the king of bloatware.
His argument was efficiency. In light of Vista’s early/announced system requirements, the argument about efficiency is simply a joke.
I don’t want the OS to be utilizing that processor. I want the OS to be out of the way, so I can utilize that processor for games, rendering, the internet, watching movies, office work, playing music, burning DVDs, and whatever else I may be doing with the computer.
With the system requirements that Microsoft has laid out for vista, there’s a very real and accurate conclusion to be made:
Vista will be in my way if installed. Yours too. That’s simply too much bloat. XP gets in the way now. That’s why I use Linux.
I shouldn’t need 3GHz and 2GB of RAM just for the OS. So if I want to browse the internet and watch a movie I need 3.5GHz and 2.5GB of RAM? That’s pure insanity.
Do you really believe that Vista will fill 2GB of ram with unneeded shit and have the processor run in circles just for fun?
I think the minimum requirement is just to ensure smooth operation in _any_ situation. Thus the CPU would be needed when moving a transparent window around (over a movie?) and playing with the other window management effects. Nothing you would do while playing a game.
> More to the point: why do I need to wait half of a second for Firefox to resize?
Wow! I am running E17 CVS as my window manager on a 1.06GHz Celeron laptop with 256MB RAM, and experience no such thing…
What WM/DE are you using?
I want to know so that I can avoid it! 🙂
> Every time I launch X.org on my top-of-the-line system,
> with a 3GHz P4 processor, 2GB of RAM, and an Intel
> integrated graphics card (that produces absolutely
> smooth, flowing performance on Windows), I can see why
> Linux is not further down the line.
Since when does a “top-of-the-line” system have integrated, shared memory, Intel graphics?
I suggest you do some googling about the current ambitions and efforts of the Xorg folks. The Renaissance of X11 is quite grounded. Keith Packard and others have shown that poor implementation was the primary problem, it had nothing to do with the X11 protocol at all.
The new efforts seem primarily interested in re-architecting xorg with modern rendering extensions, proper integration points for specialized hardware, creating new APIs that lend themselves to better performance and quality, and separating X11 code from kernel-space code to increase stability.
Hmm…I certainly am glad that I’m not so unlucky…
I think what’s REALLY needed is for people to learn how to make use of the pure simplicity of Linux. Somehow I managed to get my 2.2 GHz P4 with 512 MB RAM and a 32MB GeForce Graphics running smoother than it did in windows…with all of the bells and whistles enabled.
I am currently running enlightenment 16.7 with xinerama and composite running at optimal efficiency. I experience almost no lag, unless I load all of my twenty-something games, videos, image editing apps, terminals, etc onto one virtual desktop.
Strange how your “top-of-the-line” computer is doing much worse than my bottom-of-the-shelf one.
I’m looking forward to this release hoping that it will include a “real” driver for the integrated graphics of the VIA K8M800-CE Chipset which I have.
Absolutely true. I’ve had a via CLE266 chipset for about two YEARS now. The unichrome (2D) is finally working perfectly with the via driver instead of the vesa one, but I STILL haven’t got any DRI at all. The 3D seems to be working, but it is not hardware accelerated. What’s the use of 3D at a staggering rate of two fps?
I can maybe get it to work using Mandrake (yes, not Mandriva) 9.2 and a specific XFree86 version (4.4, I believe) and installing the drivers myself. However, I live in 2005, so I am using Ubuntu with Xorg.
If someone from the DRI team is reading this: Please please please finish the DRI for via CLE266
YAY!
You obviously do not have 2D acceleration active on your video card. Those problems do not exist using a card which has a proper Xorg driver loaded.
Where is DRI ??
http://en.wikipedia.org/wiki/Direct_Rendering_Infrastructure
http://keithp.com/~keithp/talks/xserver_ols2004/
Where do I download this?
Will there be binary packages?
I would like a .deb package for Ubuntu Breezy. I want to test this. Those fixes for Radeon have been long wanted.
“Where do I download this? Will there be binary packages? I would like a .deb package for Ubuntu Breezy. I want to test this. Those fixes for Radeon have been long wanted.”
Umm… xorg isn’t something you just download a new package for. You will get it, but only when it comes with the next release. It’s a HUGE part of any desktop Linux, and it takes a lot of work to get everything to be smooth with it.
So basically, you can have it as soon as you are willing to move to the unstable/development version of Ubuntu.
I tried the new xorg CVS not so long ago, and the open source nv driver had 3D support and was MUCH faster with composite. It’s amazing what they have accomplished in such a short space of time; some of the other open source drivers, like ATI’s, have very good 3D with full EXA support.
I think nvidia are waiting for the xorg release so they can release the 80.XX series; it seems like a significant move forward on both fronts. I’ve been using composite in GNOME for a long time: windows move so smoothly, with no redraw whatsoever. Mind you, the -ck kernel patches really help with the latency, and when minimizing, windows no longer leave slight shadow trails on the desktop.
Will this bring exa (or generally composition) to the radeon 9700 cards?
Yes, from what a friend told me when using Xorg CVS with his Radeon.
Does that apply to the Radeon 9700 as well?
I know that all cards up to 9200 are very well supported by the opensource radeon driver…
Yes, more recent Radeons are supported with this release. I think all the 9x00s are now supported.
There is experimental DRI (OpenGL 3D acceleration) support for Radeon chipsets from the 9500 up to the X800 (IIRC someone reported it working with an X800 AGP card).
But there is no EXA support for those chipsets. EXA is supported only up to the 9200. No one has yet written EXA support for the newer Radeon chipsets. Anyway, I am confident that the Radeon 95/7/8/…00 will be supported soon.
You can already use EXA with the 9600+; however, there is no hardware XRender acceleration yet, though it is coming. 3D on the r300 chips is getting better, and once XRender is supported, things will seriously fly.
Will X.org R7 support 3D on an old ATI Rage Mobility M/P (8MB VRAM)? It’s a Fujitsu LOOX laptop.
3D support for the M/P has existed for some time; unfortunately, rendering is only 2-3x faster than in software.
I’m a n00b…how do I enable it? I’m still looking for a good link to some directions. Thanks.
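Not the original poster, but the usual 6.8-era recipe is to load the GLX and DRI modules in xorg.conf and open up the DRI device permissions, roughly like this (your driver still needs an actual DRI module for the card, which for the Rage M/P is hit-or-miss):

```
Section "Module"
    Load "glx"
    Load "dri"
EndSection

Section "DRI"
    Mode 0666    # allow all local users direct rendering
EndSection
```

Afterwards, `glxinfo | grep direct` tells you whether direct rendering actually kicked in.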
DRI will not improve 2D performance (composition), correct?
Glad to hear about these improvements in X. Especially those for ATI cards. I hope these guys can get some nice 3D accel out of my 9600, because my kernel-tainting fglrx drivers really *@#!$% (insert bad word here).
Improvements in X, plus a much faster GNOME 2.14 (though I’m on Xfce 4.2.2 now), plus a hibernating laptop (fglrx is not laptop friendly) will make next summer’s distros quite promising.
BTW: Am I the only one experiencing better 2D performance with the ATI OSS drivers than with the proprietary fglrx?
I have no problems whatsoever using the latest ATI xorg proprietary drivers.
Only thing is that I can’t enable composite with DRI enabled, and I hate that!
Hope this release will change that, although I doubt EXA in this release will work with my Mobility X700 PCI-Express card.
Yesterday I compiled this new x.org release, but it took so much time that I couldn’t test-drive it.
When I woke up this morning and looked at my Slack 10.2 screen, I saw the compile of X was successful, so tonight I will give it a try.
Lol. I personally have no end of trouble with either driver on any modern Linux I’ve used, and have no troubles whatsoever with the very same OSS driver in DragonFly. I’ve not had any luck getting the OSS driver to do anything in FreeBSD since 5.3 and I am baffled.
A couple of Unix-using buddies of mine have the exact opposite experiences from mine, except that they have no issues in DF either. Then there are people I know who’re using Windows with damned near any card without problems.
Irritating ;^)
No you’re not the only one experiencing that. The OSS driver is *MUCH* faster at 2D than fglrx.
“I have no problems whatsoever using the latest ATI xorg proprietary drivers.
Only thing is that I can’t enable composite with DRI enabled, and I hate that!”
So, without DRI, can you get decent composite acceleration with fglrx and Xorg 6.8.2?
I’m currently using radeon driver but composite is so sloooooow that I’ve disabled it.
Ah ok, I didn’t see you were talking about composite.
But you’re right: when you enable composite, DRI will be disabled, and that’s why things get so slow. I have the same problem.
EXA should fix this, so you won’t need DRI anymore to run accelerated composite.
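If I remember right, the new radeon driver lets you pick the acceleration architecture per device, so switching to EXA should be a one-line xorg.conf change along these lines (the Identifier is just an example; the option name assumes the driver’s documented `AccelMethod` switch):

```
Section "Device"
    Identifier "ATI Radeon"
    Driver     "radeon"
    Option     "AccelMethod" "EXA"   # the older XAA path stays the default
EndSection
```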
I read the comment above on Xorg development rate.
I hope it does increase the speed and performance, at the moment Windows is more responsive and faster than Xorg by far.
Which is a huge shame; it’s as if Xorg is letting Linux and its users down.
I have to TOTALLY agree with the poster above.
I have a i686 optimized kernel, nvidia drivers up and running. Very light WM – Openbox.
And the GUI is still not as fast as Windows.
Something needs to be done!
Xorg really IS letting Linux down.
A friend of mine gave up Linux because of this issue!
I was just about to post the same thing. :O Firefox in Linux is unusable because of this, and Opera is much slower than in Windows. Why is Xorg / XFree86 so slooooooooowww? Somebody tell me! Linux would be the perfect OS if not for the slow speed. I have good hardware!
Why oh why do you trolls keep on using the same examples and bad logic?
As always, you trolls choose apps using slow toolkits (basically Firefox, OOo 1.1 and Opera) to then tell us that X is slow. Of course the apps and their toolkits are slow (and NO, Firefox is NOT a GTK app; it just uses some GTK hooks).
I always wonder when I hear people say X is slow, because the default behaviour I see in Windows users is using their apps full screen. I have a 21″ monitor, so I use 1600×1200 resolution; I do not move windows often, and I never experience this behaviour. Because I don’t use Firefox (a very bad app in GNOME or KDE — it does not integrate at all), I don’t use Opera, and when I use OOo, I don’t move the window after each sentence I write in it.
That’s to say you trolls are so obvious, please try to be mature and stop the nonsense.
And try fixing your heads so you don’t spout nonsense like ‘Firefox slow’ => ‘Xorg slow’, please!!
It has been proven time and again that X is not slow, and that the problems lie in basically three things:
– latency
– specs for drivers
– poor implementation of some features
All of this is getting fixed, but I don’t hold my breath for you to stop trolling: as soon as some false showstopper of Linux is fixed (even when it’s worse in Windows), you trolls are pretty quick to find a new one. At least you are an up-to-date troll; others are worse than you, still bugging us with the ‘ugly fonts’ showstopper.
Right, so the problem lies in slow toolkits.
What toolkits do Firefox and Opera use?
And why is it that these “toolkits” perform great on Windows and not on Linux?
And you said “All of this is getting fixed”, where is the evidence?
Are these problems even recognised and taken seriously?
Does anyone know if composite works with Xinerama in Xorg 6.9? In 6.8 you cannot use composite on dual screens because of that.
My laptop has an integrated intel 855 display adaptor. Trust me, you can barely find a graphic card worse than mine these days. And with Gentoo’s latest version, everything is up and good. At least I can smoothly move things around and watch movies.
I personally think many of the performance issues people are now experiencing have something to do with old versions of software installed. The latest version should make many things smooth and easy. And tuning 2.6 kernel will certainly improve the desktop look-n-feel a lot. If you are using some kind of Desktop Environment, please use Gnome 2.12 and / or KDE 3.4, I don’t think they will let you down.
I heard many proprietary drivers have serious performance issues simply because manufacturers are benchmarking against the very early days of XFree86 3.3.x. Hope that’s not true now.