“I’ve written a lengthy article covering what I learned during the last two years building the Xegl display server. Topics include the current X server, framebuffer, Xgl, graphics drivers, multiuser support, using the GPU, and a new display server design. Hopefully it will help you fill in the pieces and build an overall picture of the graphics landscape.”
Stacking layers upon layers of functionality/hacks on top of an old monstrosity like X will never get Linux anywhere near OS X/Vista levels. For example, the current Cairo-enabled stack of X crap: App -> gtk+ -> Cairo -> X Render -> X -> XAA/EXA -> hw
*wow*
No wonder it’s slow. It’s time to start over, guys. No matter how much lipstick you put on a pig, it’s still a pig.
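For reference, here's roughly what the top of that stack looks like from application code, assuming cairo's Xlib backend. The Xlib window setup is boilerplate (a real program would wait for an Expose event before drawing), and everything from cairo_fill() down becomes X Render or core-protocol requests for the server and driver to handle. Just a sketch, not anybody's actual code:

/* build with something like: cc demo.c $(pkg-config --cflags --libs cairo x11) */
#include <cairo.h>
#include <cairo-xlib.h>
#include <X11/Xlib.h>
#include <unistd.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);               /* connect to the X server */
    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0, 300, 200,
                                     0, BlackPixel(dpy, scr), WhitePixel(dpy, scr));
    XMapWindow(dpy, win);
    XSync(dpy, False);                               /* sketch: skip waiting for Expose */

    /* cairo targets an Xlib surface; the drawing below turns into X Render
       (or core X fallback) requests sent over the display connection */
    cairo_surface_t *surface =
        cairo_xlib_surface_create(dpy, win, DefaultVisual(dpy, scr), 300, 200);
    cairo_t *cr = cairo_create(surface);

    cairo_set_source_rgb(cr, 0.2, 0.4, 0.8);
    cairo_rectangle(cr, 20, 20, 260, 160);
    cairo_fill(cr);                                  /* server + driver take it from here */

    cairo_destroy(cr);
    cairo_surface_destroy(surface);
    XFlush(dpy);
    sleep(2);                                        /* keep the window up briefly */
    XCloseDisplay(dpy);
    return 0;
}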
I must admit I'm not a fan of the gtk+/Cairo stuff. gtk+ in itself is slow. But this does not mean that the general X design is bad. On the contrary… it just needed to develop at a faster pace than in the XFree86 days.
Just wait. It will take (I guess) a year for it all to come together: KDE4, EXA, XGL, etc.
Here’s a link to a movie taken at the current aKademy2005 (KDE developer conference). It’s rough, but it shows you the (impressive) things to come…
http://vizzzion.org/stuff/xgl_wanking.avi
http://rapidshare.de/files/4553011/xgl_wanking.avi.html
According to the KDE developer, this is a 4-year-old notebook running Xgl on Radeon hardware. Quite impressive.
How do you know X is slow? What system did you use to validate your point? How do you know Vista level is the best?
Visual/subjective comparisons. I’ve used Vista on my hardware. It’s snappy. The UI is smooth. There are no ugly drawing artifacts. I use OS X on my Mac Mini. While it’s not as snappy as Windows Vista on my 2.4 GHz Athlon 64/X800 XT PE, it’s still pretty damned quick, looks great, and maintains smoothness no matter what I’m doing.
Then you have Linux with X … the last time I used it on my Athlon 64 (1 month ago), it didn’t feel nearly as fast as Vista with its debug code. There was ugly smudging here and there while dragging windows. Some fonts were anti-aliased, some weren’t (and it’s not like the anti-aliasing looks great … it looks like plain blurring and hurts my eyes). There was a visible hitch in UI smoothness when something big had to be drawn (a new web page filling the screen, for instance).
That kind of stuff is what I meant. I don’t experience any of it in Vista/OS X — so that’s the level I expect. Slow and ugly is the current state of things in the X world.
Facts:
– A 1.42 GHz PowerPC G4 / ATI 9200 is not as fast as a 2.4 GHz Athlon 64 / ATI X800 XT PE.
– ATI Linux drivers have poor performance.
Conclusion:
The X design is sh*t and must be redone from scratch.
Just WOW
Yeah, X running on my considerably faster PC is slower and uglier than OS X on much weaker hardware.
Get it now?
Ahhhh the typical Linux fanboys modding me down yet again because I’ve clarified a point that makes them cringe.
You may continue with your massive wankfest.
No, they're modding you down because you are trolling. Try acting a little more mature instead…
Oh, and now you’re modding me down, Linux is Poo. Is that what you consider more mature?
How was a clarification of what I meant a troll? Get your head on straight. Some guy misunderstood me as saying “My much faster PC has a GUI that feels faster than my considerably slower Mac”. Well no sh*t, but that’s not what I meant.
I was referring to your posts in general. Heck, even your nickname is a troll. I guess you must enjoy being modded down…
1. Routinely do everything possible to be offensive or inflammatory
2. Get moderated down
3. Whine
Given your usual behavior, the fact that they don't just delete your account strikes me as a remarkable degree of tolerance.
That said, I got the impression the original implication was that you were lying about the Mini performing better, since it doesn't compare favorably performance-wise. If they were confused in the manner you outline, including it in the mocking deduction makes little sense.
So obviously I’m being moderated down based on my name, and not the content of my posts. The fact that my original post about X being a pile of steaming crap is at -5 is proof enough of the fact that Linuxtards don’t like being faced with the reality that their software is poor.
It’s time to come back to reality. Rather than silencing everyone who brings up points about how it could be improved, you people should try to listen. Living in your little dreamland won’t last for too much longer.
Fucking blind sheeple.
For one thing, talking about X instead of a specific implementation shows how clueless you are.
Erm, my girlfriend runs KDE on a Compaq with a 380 MHz Athlon, 192 megs of memory, a 4.1 GB hard disk, and an on-board SiS POS video card. This is KDE 3.3.2, AKA 'new to Debian' software. Admittedly, it's not pretty (the boot time makes me cry), but with the eye-candy turned off (for the most part) it is quite usable (and this is under a stock config, with a stock kernel, so she doesn't have to do anything wonky to update it).
My GF is the living proof that the Linux Desktop is here. She doesn’t know computers– it HAS TO just work, and it has, for months.
Try saying that about Windows XP on the same hardware! And Windows XP is quite a bit older than Debian testing… And it sure beats the Windows 98 she was using (WTF, OMG, what's wrong with this thing, argh, cry, cry)… and yes, you sir are a Troll.
380 MHz Athlon? Is that some super-secret CPU AMD forgot to release during the K6 days?
And yes, XP will run decently on a ~400 MHz / 192 MB RAM machine. Been there, done that — for 2 years.
“380 MHz Athlon? Is that some super-secret CPU AMD forgot to release during the K6 days?”
To be clear (or not so clear), it's something with AMD printed on it that is clocked at 380 MHz (well, < 400 anyway)… on a Compaq… It may or may not be an Athlon. The computer is a few hundred km away, and I'm not that curious.
Anyway, it works, and works well (everything else being equal).
That post was modded down because it was the first post in a thread that most people here have already read hundreds of times before.
while (linux_userbase < ++linux_userbase) {
A clueless non-developer (you) makes the illogical inference that his desktop is slow + other clueless non-developers have mentioned X being bloated == X and everything around it has to be redesigned.
The people with a little know-how (appropriately cheered on by the other end of the clueless non-developer crowd to make the thread even more annoying) explain why the logic is flawed.
}
“So obviously I’m being moderated down based on my name, and not the content of my posts.”
Oh no, it’s just a coincidence (or is it?) that the content of your name matches the content of your posts.
I suppose for anyone to listen to you about something, you would have to first demonstrate some sort of reasoned insight. If you have any of that, do be sure to share it.
Modularity and layering are an effect of good design, as opposed to hacking graphics code directly into the kernel.
The philosophy behind it is great — the implementation is poor. Layer upon dependent layer, slowness, limited support, and constantly changing APIs are not the way to solid advancement.
As for “hacked graphics code directly in the kernel” — well, let’s just say that these hacks have yet to cause me any problems. My gaming machine is still humming away, rock-solid stable, and performing great for 45 days now. Obviously Microsoft’s “hacks” work, and they lead to a usable OS with awesome video drivers.
Layer upon dependent layer, slowness, limited support, and constantly changing APIs are not the way to solid advancement.
so the correct way is? …oh, let me guess: independent layers or no layers at all, quickness, unlimited support, and frozen APIs are the way to solid advancement.
Obviously Microsoft’s “hacks” work, and they lead to a usable OS with awesome video drivers.
so video drivers (developed by hardware vendors) are awesome thanks to Microsoft’s “hacks”
Let me say just WOW
No, it’s thanks to Microsoft not changing major APIs/architectures with every semi-major update.
rofl 😉
at the moment, the score of your comment is 3 and i think it should be 5. but i'm sure it will soon be -5 because most people out there can't accept that the whole some stuff + X + some stuff + toolkits + some stuff + window managers + some stuff + desktop environment + some stuff + … is completely wrong.
too many layers, yeah. the problem? the people. why? they can't accept it.
linux will never be a wise choice for gui apps for that reason. see, you need a simple gui web browser on your linux box. you have to get X and all its libs. but that's not enough! you have to deal with a specific server for good performance. but still not enough, you need to deal with many font issues (more libs!). then you realize that hmm, your web browser depends on gtk, which needs at least 20 more libs to work. so you get them all hoping you will finally be able to browse the web. but no! you need a basic window manager because X doesn't include one. well, you can live without one but that's painful. ok, let's add another 10 libs for something like gnome. done now? maybe. but not yet if you plan to install a multimedia player, office suite, etc. more and more libs.
the model is completely wrong. code re-use? that's not a pro argument. you can re-use code differently. i think that novell, redhat, sun, etc. should join up and start something else to compete with windows, macos…
why? linux needs a new gui, badly… i mean, c'mon! with all the people wasting their time on gnome, kde, etc. and all the guys at novell, redhat, sun, it wouldn't be that hard to start something new… so why not?
linux needs a gui platform, in fact: a base gui system that includes hardware (driver) management (gfx card, etc.), a graphics subsystem (similar to windows gdi), a universal sound subsystem (alsa would do the job) and finally, a damn basic desktop that could be highly customized so people at gnome and kde can still release their stuff.
finally, the linux gui needs to be simplified. developers need an easy-to-use standard graphics library that would produce nice output directly (unlike the X libs, which are kinda hard to use and produce shitty output by default). on top of that, people could write gtk and qt to backport existing applications.
it would really kick linux in the ass for good. it would be so attractive for developers as well. less complexity = more people.
but i guess that's all in my dream. people will keep arguing that X is the best and it's so well designed that it will live for another 10 years.
You think Windows is that much different? It all starts somewhere…
I agree with some of the stuff that you are saying
but you are confusing a few things.
Windows is a package; they provide all that stuff you've talked about in a neat packaged box that operates in a way you expect it to. With Linux, it's an open box: you can see the cogs and wheels as it works. You can't say the same thing about Windows. It's not a criticism, but it's a different way of doing things.
As for all that stuff you say you need just to get a simple app going (X + WM + firefox + gtk + libs): it's just the same for Windows and MacOS, they're just packaged differently.
X isn't the best, and I am sure most people here would agree, but that would be like comparing apples and oranges. As developers we need to do the best with the tools that we have. Sure, Linux distro GUIs need work, but let's not confuse the need for better UI design (along with a suitable dev platform) with a full architecture overhaul…
most people out there can't accept that the whole some stuff + X + some stuff + toolkits + some stuff + window managers + some stuff + desktop environment + some stuff + … is completely wrong.
Why is it wrong? You fail to give any support to your argument at all.
you have to deal with a specific server for good performance. but still not enough, you need to deal with many font issues (more libs!). then you realize that hmm, your web browser depends on gtk, which needs at least 20 more libs to work. so you get them all hoping you will finally be able to browse the web. but no! you need a basic window manager because X doesn't include one. well, you can live without one but that's painful. ok, let's add another 10 libs for something like gnome. done now? maybe. but not yet if you plan to install a multimedia player, office suite, etc. more and more libs.
Are you kidding me? Linux distributions do not work like that at all. Install any mainstream distro and either Gnome, KDE, or both will be installed by default. They both include web browsers, media players, and all the libs you need for them. You're crazy if you think Windows doesn't have libs too. Look at all the dll files, look at how they are spread out all over the place. You have some libs in several system folders, and some in individual application folders. The library situation is worse in Windows than it is in Linux. If you install Gnome or KDE they both include libraries that allow you to build other applications. It seems your real problem is choice. If you choose to use some Gnome applications and some KDE applications you will need libraries from both desktop environments, which isn't a hassle with most Linux distros because they come with package managers. You don't need to know what goes where; the package manager handles it for you.
the model is completely wrong. code re-use? that's not a pro argument. you can re-use code differently.
What is that supposed to mean? How should we use code re-use differently?
linux needs a gui platform, in fact: a base gui system that includes hardware (driver) management (gfx card, etc.), a graphics subsystem (similar to windows gdi), a universal sound subsystem (alsa would do the job) and finally, a damn basic desktop that could be highly customized so people at gnome and kde can still release their stuff.
First of all, drivers are in the kernel, not in the GUI, and that's the way it should be. There are X drivers for video cards already, if that is what you mean. GDI is nothing special. X performs most of the same functions that GDI does; it interfaces the graphics with the drivers. Gnome and KDE themselves are highly customizable, much more so than Windows.
finally, the linux gui needs to be simplified. developers need an easy-to-use standard graphics library that would produce nice output directly (unlike the X libs, which are kinda hard to use and produce shitty output by default). on top of that, people could write gtk and qt to backport existing applications.
I haven't a clue what you are talking about here. X libs do suck, but most people don't really have to deal with them that much; besides, X is being cleaned up and modularized, which will help tremendously.
but i guess that's all in my dream. people will keep arguing that X is the best and it's so well designed that it will live for another 10 years.
You haven’t really given a good argument as to what is wrong with X. It’s fast, it’s network transparent, and soon it will be modular. So what exactly is the problem?
“No matter how much lipstick you put on a pig, it’s still a pig.”
Slashdot: from the best-dressed category.
Seriously, I've heard that there are differences in implementation across X servers. Just how portable is the "layer of cards"?
Well, it's true, the performance of the whole stack sucks (not the performance of the components, but the performance of the whole thing), but X also has its advantages. It's client/server based, so I can start, for example, any of my Linux (oops… Linus, please don't sue me! :-)) applications on my NeXT station. And it is free. 😉
That’s true, X’s network transparency is cool — but how much does this really matter to most users? Most users are interested in glitzy eye candy and smooth performance on their local machines.
As for it being free … do you mean free, or Free? 😉 Free as in price, sure … but my time still has value, and I can’t be bothered to spend the time getting this house of cards built and operating. If you mean Free as in Freedom … I really couldn’t care less, and neither should you.
I use the best tool for the job, regardless of the philosophy behind the software. Choosing software based on philosophies is stupid. My time is both money and limited. I have a life outside of computers, and I prefer living it, as opposed to getting my computers working.
> Choosing software based on philosophies is stupid. My time is both money and limited. I have a life outside of computers, and I prefer living it, as opposed to getting my computers working.
Since when is anybody forcing you to use X?
You can use something different (OS X, Windows) if you don’t like it. Especially since you couldn’t care less about Freedom.
So why are you ranting about something that you
a) didn’t help creating
b) don’t help improving
c) is giving to you free of charge
Troll??
> My time is both money and limited. I have a life
> outside of computers, and I prefer living it, as
> opposed to getting my computers working.
So why are you wasting your precious time in a blog dedicated to OS news?
<If you mean Free as in Freedom … I really couldn’t care less, and neither should you. >
Though I think that your post is total flamebait, I'll respond to this anyway…
Just how much freedom are you willing to sacrifice?
guess you’ll find out when you load up vista.
As for getting my computers working: after installing SuSE I've not had to do ANYTHING to get it working, whereas with Windows I'd still be fussing with drivers.
Free as in price, sure … but my time still has value, and I can’t be bothered to spend the time getting this house of cards built and operating.
What the hell are you talking about? As a user, you don’t ever have to “build” X (in four years of using Linux, I’ve never compiled X), and the only thing you need to do to get it “operating” is install the proprietary NVIDIA or ATI drivers if that’s the type of graphics card you have.
You may whine and cry that you get modded down all you want, it’ll just keep happening if you keep spewing forth untrue garbage like this.
“If you mean Free as in Freedom … I really couldn’t care less, and neither should you. ”
Speak for yourself. And just make sure you don’t say that to anyone offline.
“If you mean Free as in Freedom … I really couldn’t care less, and neither should you. ”
Does this only apply to “stuff done on computers”, or to any other situations?
If you couldn’t care less, then stop going on about it. And don’t complain when they take away your rights to complain. And force you to use Windows2007 Dictator Edition.
Freedom? not for me thanx!
On a lighter note: I compliment you for being able to say “I COULDN’T care less” in its proper context. Too many people, IMHO, are saying “I COULD care less” to mean the exact opposite.
MOD: +1 (Grammar), -several million (good thinking)
Well that didn’t work. User headspace error while cutting and pasting. Would someone please mod my original post into oblivion? It’s 2:30AM here. Backup system engaged… retrying:
——–
Dear Linux is Poo,
I use X every day. My clients use X everyday, often on low end hardware over a LAN or WAN connection.
We use X locally.
We use X over the LAN.
We use X over DSL using Nomachine’s NX software.
And I have yet to see what people are talking about when they say X is slow. I have never, at any time, gotten a complaint from a client about the speed of screen updates. Complaints about OpenOffice.org start up time, yes. Complaints about screen updates, no.
Not EVER.
I must conclude that people who jump into threads to make the “X is slow” claim are:
1. Trolling. (It seems that people would be able to find better uses for their time and “get a life”.)
2. Repeating what they have heard others say. (Monkey see, monkey do.)
3. Need to upgrade their 386’s to something more modern, or replace their current ISA video card with one that has 2D acceleration. (You can still find these, occasionally, at flea markets and garage sales.)
Judging from your handle, I’m gonna bet that you fit into category #1, with generous portions of #2 and #3 thrown in as fodder. You really might want to check out the flea market.
I have never, at any time, gotten a complaint from a client about the speed of screen updates.
Is it too slow to use? No, or at least very rarely (in odd circumstances). But when you use Windows/Mac OS X on a regular basis, everything from screen updates to scrolling smoothness is generally slower, even with the official nVidia/ATI drivers. Is it so slow as to be unusable? Certainly not. It's plenty fast, but even on the same hardware, such as a dual-boot scenario (both moderate and high-end, mind you), X is generally slower graphically.
Maybe if I sat down with a test suite and stop watch and had identical test boxes side by side running X and whatever other OS, I could detect a difference. Frankly, I have better things to do with my time than tracking down a non-problem.
That doesn't mean the way it does what you like is done as well as it could be. Xgl is still designed to be network transparent, etc.
Just because X does what you want it to do does not mean that it can’t be improved.
And yes, for doing what most people consider next generation desktop graphics, it’s slow.
1. If by trolling you mean I state points that strike too close to home for most Linux fanatics, then yes, I suppose I’m trolling.
2. What I say is what I’ve experienced personally — not heard.
3. Athlon 64, 2.4 GHz, 2 GB of RAM, 360 GB of storage on SATA, Radeon X800 XT Platinum Edition. (Though to be honest, I was more productive on my 386DX 40 than I am on my modern machines :-P)
> 1. If by trolling you mean I state points that strike too close to home for most Linux fanatics, then yes, I suppose I’m trolling.
Honestly, nothing could be further from the truth. I'm a Linux fan, but I try to be quick to point out when I see complacency and denial in the community. (And yes, there's plenty.) It's just that in this case, I don't see a real problem. Sincerely, I don't, and I use X A LOT.
Could X be made faster? No doubt it could. Should it be made faster? Sure. There are known areas where X could improve. Someone posted an excellent link to an analysis in a thread on the story about 2.6.13’s release. What I am saying is that the practical consequences are nearly nonexistent in my experience.
Just for the sake of argument, let's say that X is slow and that I'm just too dense to see it. (Hey, I'm not proud. 😉)
X development is currently entering a renaissance, of sorts, after a long "Dark Age" under XFree86 custodianship. IMO, David Dawes has the dubious honor of holding the record for doing the most damage to the open-source world while nominally trying to "help" it.
So expect rapid changes to X. To be honest, I have more concerns about X’s feature set than its speed, but I’m sure someone, somewhere cares enough to work on any performance issues.
BTW, are you using any of the newer eye-candy features? I tend to turn off eye-candy myself. Not for performance reasons, but because I find it annoying, and maybe that has something to do with our difference in perception. Though I did try out the latest Suse the other day, and it has way more eye candy turned on by default than I like and still seemed pretty zippy.
Opaque window moves and animation when minimizing windows are all I need.
Let me also take this opportunity to shame the Mozilla guys for writing apps that are indeed noticeably slow in doing window redraws. Not unusably, but noticeably slow. I suspect that complaints about X performance may have a lot to do with the performance of Mozilla Foundation apps, and the apparent performance of apps that happen to be unlucky enough to be in the foreground in front of a Mozilla app window. You don't necessarily notice it on halfway decent hardware, but on slower hardware you do.
BTW, I hear you about the 386-40 and productivity. It amazes me how hardware can increase in speed by hundreds of times and we spend longer than ever waiting for apps to come up. (OpenOffice.org, I’m talking about you!)
Windows is just as bad. An Avalon app:
App -> Avalon -> D3D (through COM interface) -> user mode driver -> D3D (again) -> DXG (switch into kernel mode here) -> display driver -> hardware.
X is terrible, I agree. I've read about it in the Unix-Haters Handbook, but the size of the stack is no way to measure that.
You're missing one important thing here. D3D can fail to initialize, forcing things into a fallback mode. This is common with DirectX apps where you've got more than one window: Window A could be accelerated, Window B could be stuck in GDI mode, and you're not guaranteed that either or both or which will be accelerated, so you've got to code for both cases. Oh, and DirectX can trounce the GDI state and GDI can trounce the DirectX state, since there is no driver locking mechanism to synchronize state between separate subsystems.
The Java2D code is a mess because of this.
Very good read. Best on here by far. Only problem: Nvidia drivers kick ass! You want good eye candy? Hardware acceleration in Linux today? You need Nvidia!
How is this a problem…. for NVidia
If ATi can’t get their act together with the quality of their drivers then the Linux market is just going to leave them behind. ATi is doing well so far in the general Windows Gaming market.
hey! check out the new boot screen provided for Breezy after installing today's newly released kernel. Linux is brown!!!!!!
javajazz
X lovers – most are network administrators. They really don't care for speed or eye candy. Their careers depend on X. They don't like people f__king with it.
X haters – people who are desktop users (Win and Mac users) and don't care for network transparency or the network protocol that slows X slightly.
http://en.wikipedia.org/wiki/X_windows
Question for X programmers: is it possible to port the X server code to something like DirectFB or any new windowing system? If we could, that would be pretty advantageous for trying new ideas without rewriting code. It would be nice if there were two versions of X, one for the desktop (like DirectFB) and the other for networks, but both able to interoperate with one another.
Why is it that even people defending X automatically cave in and concede that network transparency makes X slower?
Unix sockets are *extremely* lightweight. Plus, with DRI, XShm, etc., the network transparency can get completely out of the way when appropriate, and jump in again if you happen to be running over a network. OpenGL Quake is playable over a 100 Mbit LAN. Try that with RDP.
The one problem that X has is that over a network, latency kills it. It works great over a LAN, even a slow one. But even 100 ms of ping latency absolutely kills performance. VNC fixes the problem. And NX absolutely flies! Even over a modem it is quite zippy.
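To make the "gets out of the way locally" point concrete: a client can probe for the MIT-SHM extension and, when it is talking to a server on the same machine, hand pixels over through a shared-memory segment instead of the socket; over the network the extension isn't offered and you fall back to pushing the image through the X connection. A rough sketch (error handling and cleanup omitted, and the surrounding Display/Window/GC setup is assumed):

#include <X11/Xlib.h>
#include <X11/extensions/XShm.h>
#include <sys/ipc.h>
#include <sys/shm.h>

void put_pixels(Display *dpy, Window win, GC gc, Visual *visual, int depth,
                int width, int height)
{
    if (XShmQueryExtension(dpy)) {
        /* Local server: the image lives in a SysV shm segment the server maps
           directly, so the pixels never travel through the X socket. */
        XShmSegmentInfo shminfo;
        XImage *img = XShmCreateImage(dpy, visual, depth, ZPixmap,
                                      NULL, &shminfo, width, height);
        shminfo.shmid = shmget(IPC_PRIVATE, img->bytes_per_line * img->height,
                               IPC_CREAT | 0600);
        shminfo.shmaddr = img->data = shmat(shminfo.shmid, NULL, 0);
        shminfo.readOnly = False;
        XShmAttach(dpy, &shminfo);

        /* ... render into img->data here ... */
        XShmPutImage(dpy, win, gc, img, 0, 0, 0, 0, width, height, False);
        XSync(dpy, False);
        /* cleanup (XShmDetach, shmdt, shmctl IPC_RMID, XDestroyImage) omitted */
    } else {
        /* Remote server: the same pixels have to be written over the wire,
           e.g. XCreateImage(...) followed by XPutImage(...). */
    }
}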
Most people who comment in technical forums don't know what they're talking about to any great extent. They have no idea how other windowing systems are implemented under the hood; they don't even understand how the X server is implemented. They haven't done any objective benchmarking or profiling. Whether they're for or against X becomes secondary to them engaging in socialization rather than technical discussion.
About them being non-technical… yeah, that may be true most of the time… some of them just repeat what they hear, too, but even so, you can't ignore people's perception of performance at all.
Maybe the problem they're talking about isn't in X itself, but even so, the problem has to be found. You can't just ignore them. It's like the police not investigating a death because they don't know whether it's suicide or murder, and nobody forwards the case to the correct department…
The fact is: there's a problem in graphics rendering that also involves X (protocol, drivers; there are a lot of things "packed" into your X distribution and also called by the name "X")…
Is the problem in X? Probably not the protocol, probably not the work model/flow, but you can't just blame people if they think so (remember, they're not technical or don't know the whole situation); it's better to explain to them where the problem lies, or discuss with them to find the problem…
Wikipedia had a paragraph on it.
X does more than Windows. If you do more, then there is some overhead, which causes slight lag.
Most of the negative comments are from OSX and Win users who never see any lag.
I think we need two versions of X: one which lives in kernel space with a better communication protocol between server and client, and the other for network people who need regular X.
‘The device independence and separation of client and server does incur an overhead. X’s network transparency requires the clients and server to operate separately.’
X-lovers -> I work with FLTK. IT ROCKS on X , much more than the shitty MS windows system
That would explain a lot. The way I read this article is:
1) Linux Graphics is a bloody mess.
2) X is still an embarrassment, five years behind (at least) what Quartz and Avalon are capable of.
2) Nobody has the time, manpower or inclination to fix it.
Ten years ago, we were having the discussion about X being borken. In ten years time we will still be having this discussion.
Plus ça change…
gypsumfantastic,
I’m actually rather excited about X’s future. As I mentioned in a previous post, X development stagnated pretty badly under XFree86. But things are moving along nicely now that X development is being conducted at X.org.
1. The state of Linux Graphics is not “a bloody mess”. It may not be stellar. But as a sysadmin, it doesn’t keep me awake at night.
2. Again, this doesn’t keep me awake nights. But expect rapid progress.
3. The number 3 comes right after the number 2. Not that I’m in a position to criticize, with as many typos as I’ve made tonight. 😉
The proprietary Unix vendors let X stagnate because they never ever seemed to care about the desktop. They only cared about selling big expensive servers. (They let MS waltz in and take over the desktop, not realizing how important that space was.)
XFree86 let it stagnate because XFree86 is not so much an OSS project as it is an exclusive club.
Fortunately, the proprietary Unix vendors did get the foundations of X right. Not surprising, really. When they deigned to cooperate with each other, they usually got the foundation right and then f*cked up on the implementation. And then the whole thing would fall apart because they couldn’t manage to cooperate long enough to actually get a project finished without one of the parties stabbing the others in the back, or without everyone mutually stabbing everyone else in the back.
I really think that this is the first chance that X has had to really thrive.
I agree that we will likely still be having the X is borken discussion in 10 years. But I also think that the arguments of the “X is borken” advocates will be even more misguided than they are today.
Maybe I an just an incurable optimist. But I believe that there is good reason to expect X to go from being “OK” as a local desktop, to really being stellar.
Note that even for network transparency there is a problem with the way X and the toolkits work: the NX protocol claims a 200% improvement from reducing round trips.
IMHO, having to use yet another layer to be able to use a remote display over a WAN indicates that there is a problem in the way X or the toolkit works.
My favorite graphics system was on NeXTSTEP. The things that system could do: 12-bit graphics, window dragging, fully object oriented (and seamless in that regard, as opposed to multiple clipboards, multiple libs, etc.), Display PostScript… all on 25/33 MHz processors with sometimes only 8 MB of RAM (32 MB was pure luxury).
That environment of course became the basis for MacOS X (even the window server process still has the same name, I believe).
Indeed, Quartz is a direct linear descendant of Display Postscript. It is, for want of a better phrase, Display PDF with an Open GL backend.
It's also proprietary. X is Free, but we seem unable as a community to produce anything of Quartz's quality. It also seems to me that Quartz is pulling so far ahead of us into the distance that it's an embarrassment to the entire Free Software desktop.
I agree totally. As far as networking is concerned, X is well and truly out of date – throwing (up to 32-bit) bitmaps across even a 1Gb/s LAN is just plain silly, and hideously inefficient.
NoMachine have done an excellent job with NX, but the stone cold fact is that even NX isn’t all that great – I know, I’ve used it. Having said that though, it was across an ADSL network, and the upload speed was only 256Kb/s.
What X needs, once it has been modularised properly, is to create an additional stack, which allows X servers to send drawing commands across the network, while keeping the bitmap-based stack for compatibility purposes.
X.org should dust off the archives and look at things like Sun’s NeWS (remember that?), and develop something similar, preferably with XML-based drawing instructions.
I also think that graphics hardware should NOT be controlled by X directly, but by the kernel. It is after all the kernel’s job to manage hardware, and it’s the X server’s job to manage displays. There should be a clear border between the two.
preferably with XML-based drawing instructions
Yeah, XML will surely make bandwidth usage more efficient…
compressing XML is exceptionally effective, not to mention efficient. The ability to easily parse and transform it are further wins.
compressing XML is exceptionally effective, not to mention efficient.
Compared to what?
The ability to easily parse and transform it are further wins.
Seriously, a well defined binary format shouldn’t be much harder to manage.
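For what it's worth, the "compresses well" claim is easy to sanity-check: a throwaway test like the one below (zlib's one-shot compress(), fed a made-up, repetitive drawing-command string) will show the sort of ratio you'd get on the wire. Of course, a well-defined binary format starts out smaller in the first place, so this only tells you compression helps, not which design wins:

#include <stdio.h>
#include <string.h>
#include <zlib.h>

int main(void)
{
    /* Hypothetical XML-ish drawing commands -- highly repetitive, as such a
       protocol would tend to be in practice. */
    char xml[8192] = "";
    for (int i = 0; i < 100; i++)
        strcat(xml, "<rect x='10' y='20' w='300' h='40' fill='#336699'/>\n");

    uLong srclen = strlen(xml);
    uLongf dstlen = compressBound(srclen);
    Bytef dst[16384];

    if (compress(dst, &dstlen, (const Bytef *)xml, srclen) == Z_OK)
        printf("%lu bytes of XML -> %lu bytes compressed\n",
               (unsigned long)srclen, (unsigned long)dstlen);
    return 0;
}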
Is Cairo/Glitz “Display PDF with an Open GL backend?”
Essentially Cairo/Glitz is display SVG.
Apparently I wasn’t being facetious enough for my own good. Quartz 2D and Cairo essentially implement imaging models consistent with PDF, but Display PostScript not only worked with the PS imaging model, but made use of the language itself for rendering.
There's still a lack of good X articles. Good one.
That's the question. I have been expecting it to be fixed for the last 4 years… And still we have to wait another year or more for the thing to be stable, and yet be told that the drivers are the problem, or the applications are to blame, or the toolkit…
X is generally slower graphically.
That's a perceived experience. Because XP prefetches almost everything, users get a perceived feeling of "snappiness". Honestly, X (fluxbox, even KDE) is faster with Gentoo than XP on my AMD64 3000+, 1024 MB DDR, FX5700 128 MB AGP 8x. Windows XP is very fast once installed, but as soon as I have installed all the patches, service packs, and the whole user environment, its perceived speed decreases dramatically. The stupid file hierarchy (does it have any?) doesn't help a single bit, much less its extremely good acceptance of viruses, worms, spy/adware, etc.
someone spoke up.
This layer on layer on layer crap has got to go.
I understand why its needed as of right now.
X is trying to support old-ass hardware, and the old hardware ways. Times are changing, people; get with the program.
Look at DirectX: direct communication with the GPU (in a way) for anything on the desktop.
X needs to take this route with OpenGL.
gtk/qt -> OpenGL -> hardware
Have gtk/qt talk to X for keyboard and such.
Remote X over network, VNC! I run vncserver on my server; it works very, very well. Running an xterm on your local computer that is actually running on another computer is cool and all, but it isn't really needed with today's computers and network connections.
Layers are great for some programs, but when it comes to the Desktop, people want speed and it to look good.
Stop screwing around X.org, gtk, qt.
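Worth noting that the "toolkit straight to OpenGL" path already exists today in the form of GLX, and even that still goes through the X server for window creation and input, much as the post above proposes for keyboard handling. A rough sketch of the plumbing (standard Xlib + GLX calls, error handling omitted):

#include <GL/gl.h>
#include <GL/glx.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);              /* X still owns the window and input */
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 16, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);

    XSetWindowAttributes swa;
    swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                   vi->visual, AllocNone);
    swa.event_mask = ExposureMask | KeyPressMask;   /* keyboard etc. still arrive via X */
    Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, 640, 480,
                               0, vi->depth, InputOutput, vi->visual,
                               CWColormap | CWEventMask, &swa);
    XMapWindow(dpy, win);

    /* Rendering itself bypasses the core X drawing path: a direct-rendering
       context talks more or less straight to the GPU driver. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, GL_TRUE);
    glXMakeCurrent(dpy, win, ctx);

    glClearColor(0.1f, 0.1f, 0.3f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glXSwapBuffers(dpy, win);

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}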
The problem is that people who have no fucking clue vastly outnumber those who do. They chant mantras picked up from marketing departments and extinguish every ray of hope if it doesn't fit the cliche. It is primarily the problem of OSNews, and the only thing I can do is stop reading this shit.
Oh, just remembered: is it legal to say ‘shit’, ‘fuck’, or ‘I am a communist’ here?
That's a perceived experience. Because XP prefetches almost everything, users get a perceived feeling of "snappiness". Honestly, X (fluxbox, even KDE) is faster with Gentoo than XP on my AMD64 3000+, 1024 MB DDR, FX5700 128 MB AGP 8x. Windows XP is very fast once installed, but as soon as I have installed all the patches, service packs, and the whole user environment, its perceived speed decreases dramatically. The stupid file hierarchy (does it have any?) doesn't help a single bit, much less its extremely good acceptance of viruses, worms, spy/adware, etc.
As I'm reading this in Linux, I'm scrolling down a page in Firefox in another window (albeit a complicated one but with no plugins), and Linux (AMD64 dual core, 6800GT PCI-Express 16 256, official nVidia drivers) stuttered through it very slowly.
eBay is a good example where a complicated page (but commonly used) will take longer on the same hardware to draw in Linux than it does in Windows or Mac OS X with comparable hardware.
Meanwhile, an AMD 2500+ system with a ATI 9800 AGP system running Windows XP (SP2 and all patches installed) and Firefox scrolls very smoothly on the same page. The Windows system is dual-boot into Linux, and I see the same generally-slower experience when the identical hardware is in Linux.
Again, it’s not so slow as to be unusable or even terribly annoying, but it is somewhat slower.
UT2K4 is great on both systems, however.
As I'm reading this in Linux, I'm scrolling down a page in Firefox in another window (albeit a complicated one but with no plugins), and Linux (AMD64 dual core, 6800GT PCI-Express 16 256, official nVidia drivers) stuttered through it very slowly.
eBay is a good example where a complicated page (but commonly used) will take longer on the same hardware to draw in Linux than it does in Windows or Mac OS X with comparable hardware.
Meanwhile, an AMD 2500+ system with a ATI 9800 AGP system running Windows XP (SP2 and all patches installed) and Firefox scrolls very smoothly on the same page. The Windows system is dual-boot into Linux, and I see the same generally-slower experience when the identical hardware is in Linux.
This is because RENDER, which draws the antialiased fonts, is *dog slow*. That is a major cause of X feeling slow.
Disable antialiasing and see scrolling become very fast (and fonts become crappy).
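If anyone wants to see that in isolation rather than system-wide, a tiny Xft test program can draw the same string twice, with the only difference being the antialias flag in the font pattern; it's just a sketch (a real program would wait for an Expose event before drawing), but the flag it flips is exactly the knob being talked about here:

/* build with something like: cc xfttest.c $(pkg-config --cflags --libs xft x11) */
#include <X11/Xlib.h>
#include <X11/Xft/Xft.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0, 400, 120,
                                     0, 0, WhitePixel(dpy, scr));
    XMapWindow(dpy, win);
    XSync(dpy, False);

    XftDraw *draw = XftDrawCreate(dpy, win, DefaultVisual(dpy, scr),
                                  DefaultColormap(dpy, scr));
    XftColor black;
    XftColorAllocName(dpy, DefaultVisual(dpy, scr), DefaultColormap(dpy, scr),
                      "black", &black);

    /* fontconfig pattern strings let you toggle antialiasing per font */
    XftFont *aa   = XftFontOpenName(dpy, scr, "Sans-14:antialias=true");
    XftFont *noaa = XftFontOpenName(dpy, scr, "Sans-14:antialias=false");

    const char *msg = "The quick brown fox jumps over the lazy dog";
    XftDrawStringUtf8(draw, &black, aa,   20, 45, (const FcChar8 *)msg, strlen(msg));
    XftDrawStringUtf8(draw, &black, noaa, 20, 95, (const FcChar8 *)msg, strlen(msg));
    XFlush(dpy);
    sleep(3);

    XCloseDisplay(dpy);
    return 0;
}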
> As I'm reading this in Linux, I'm scrolling down a page in Firefox in another window (albeit a complicated one but with no plugins), and Linux (AMD64 dual core, 6800GT PCI-Express 16 256, official nVidia drivers) stuttered through it very slowly.
Aha! Please do not equate the performance of Mozilla apps with the performance of X.
With smooth scrolling turned on I do sometimes see slowness. And on slower hardware, an app window doing an opaque move on top of a Mozilla app will smear, whereas one doing the same over any other app's window, including a GTK+ app, will not. Do Mozilla apps have screen drawing problems? Sure they do. And *THAT* is something that needs fixing? Are we perhaps beginning to move towards agreement, here?
Correction:
“And *THAT* is something that needs fixing?”
was supposed to be
“And *THAT* is something that needs fixing.”
Second major blunder tonight and it's 4:30 AM. I'm gonna stop posting before I embarrass myself any further. 😉
Aha! Please do not equate the performance of Mozilla apps with the performance of X.
Why not? Firefox is one of the most common Linux apps, and is a great browser to boot. It's just that it's a bit faster on Windows (and I'm comparing Firefox on Windows, not IE). Konqueror is even slower, although Galeon seems to be a bit snappier.
With smooth scrolling turned on I do sometimes see slowness. And on slower hardware, an app window doing an opaque move on top of a Mozilla app will smear, whereas one doing the same over any other app's window, including a GTK+ app, will not. Do Mozilla apps have screen drawing problems? Sure they do. And *THAT* is something that needs fixing? Are we perhaps beginning to move towards agreement, here?
It’s jerky on even top-end hardware (this system [AMD64 dual core, 2 GB RAM, 6800GT PCI-E x16] was *just* installed) on pages (eBay, table/CSS-complex, etc.) that Windows on identical hardware has no problem with.
The screen refresh-slowness exhibits in other apps too.
I agree that Firefox is popular, particularly on Linux. I agree that it is a great browser. I am a firefox fan, but I am critical of its screen redraw performance.
I only occasionally use Windows, and I was wondering if it were faster there. The screen update speed is not really a problem on my or my clients’ hardware, but it and Thunderbird are the only apps that do seem at all laggy.
I’ll agree that if it is faster on Windows, that suggests a problem in X. However, I simply do not see the lagginess in any other apps.
I conclude that perhaps there is an interaction between X and Firefox that introduces the problem?
(And I implicitly disagree that X screen refresh is noticeably slow in any other apps… that I use, anyway.)
I agree that wherever the “problem” between X and Mozilla (if it is bad enough to be called that) comes from, it is noticeable enough that fixing it would be beneficial.
We have very similar hardware. I have an AMD64 2500+ right in front of me. My 6800GT is an 8x AGP.
Show me where all these X performance problems exhibit themselves in other apps. Give me some specific tests to run.
XUL on Windows is drawn by Win32, which currently delegates to GDI or GDI+, which use hardware drivers to draw to the display.
XUL on Linux is drawn by GTK2, which currently delegates to GDK/Pango, which delegates to Xlib/XRender, which use hardware drivers to draw to the display.
Had it not occurred to you that the point between XUL and the hardware at which the redraw issues occur might not be X? There are a lot of steps along the way. In fact it has been shown pretty conclusively that GTK2's performance is sub-optimal, particularly in comparison to other toolkits such as Qt3 and, to a lesser extent, FLTK and others.
Perhaps you should do a bit more research before you start calling for something to be completely scrapped.
With all that said, however, that article has dented my previous optimism about the current direction of X11 development. Maybe this will spur some more discussion.
I don’t recall seeing the original message calling for ‘scrapping’ anything completely.
Also, given that this is “OSNews”, not a developer mailing list, you can back off the aggression a tad, and educate instead of flame. It’s quite obvious that the poster is not familiar with the levels of abstraction and delegation involved.
Thanks!
– Kelson
The original post that started this thread very clearly called for scrapping *everything*.
“The original post that started this thread very clearly called for scrapping *everything*.”
Did it now?
“The server would not need to be designed from scratch. Substantial portions of code could be reused from other projects and would be combined in new ways. From what I can see, 95 percent or more of the code needed for a new server already exists.”
Yes. It did. Here is the original thread starter posted by user “Linux is Poo”:
—————
Stacking layers upon layers of functionality/hacks on top of an old monstrosity like X will never get Linux anywhere near OS X/Vista levels. For example, the current Cairo-enabled stack of X crap: App -> gtk+ -> Cairo -> X Render -> X -> XAA/EXA -> hw
*wow*
No wonder it’s slow. It’s time to start over, guys. No matter how much lipstick you put on a pig, it’s still a pig.
Ah. My bad, I thought you were referring to the article.
I recently installed ubuntu on a pentium3.
Scrolling in Firefox was horridly slow.
Then I discovered that the nvidia drivers aren't installed by Ubuntu by default (due to the annoying licensing; why hardware manufacturers don't free their drivers baffles me. It's not as if others will steal them.)
Once I installed the nvidia drivers things were a lot faster.
The slowness of Firefox is a far larger problem than the slowness of X.
I hate the heaviness of firefox. But being that it’s really the only usable browser, I still stick with it.
If only links2 had better support of javascript and css.
Jessta is a real dumbass…
But being that it’s really the only usable browser, I still stick with it.
On Ubuntu, there is also Epiphany. Try it!
What you say about slow scrolling web pages in Firefox… it doesn't even have to be complicated! Complicated pages are rendered and can be scrolled easily in FF, but the ones which include Flash… that's damn slow to scroll!
Mozilla != X performance… Firefox is slower on Linux. I know people are going to laugh, but if you run KDE, try bloody Konqueror! I kid you not… 'Tis faster… And it does a really good job of rendering sites. (Well, maybe it coughs on the Google site of the month, but new Konqs are just around the corner [that should fix this], and I didn't say uninstall Firefox.)
You realize that this is an issue with Firefox, not X, right?
As I'm reading this in Linux, I'm scrolling down a page in Firefox in another window (albeit a complicated one but with no plugins), and Linux (AMD64 dual core, 6800GT PCI-Express 16 256, official nVidia drivers) stuttered through it very slowly.
eBay is a good example where a complicated page (but commonly used) will take longer on the same hardware to draw in Linux than it does in Windows or Mac OS X with comparable hardware.
Meanwhile, an AMD 2500+ system with a ATI 9800 AGP system running Windows XP (SP2 and all patches installed) and Firefox scrolls very smoothly on the same page. The Windows system is dual-boot into Linux, and I see the same generally-slower experience when the identical hardware is in Linux.
The problem is that the best Firefox is the Windows one. For a better experience, install Epiphany on Linux.
i still don't see it… i've been walking back and forth between my wife's computer (pentium 4 2.66 ghz / 512 mb ram running XP SP2) and my own (athlon xp 2600 / 512 mb ram running gentoo with a ck kernel) and i can't see the slowness or the choppiness that everyone is talking about on the linux box. did the ebay test in firefox on both computers.
maybe because i use nvidia's closed source drivers and a ck kernel that's optimized for desktop use on a fairly modern machine?
Remote X over network, VNC!
Well, i would like to see all X over the network eliminated. Indeed, use a vnc server with IPsec and/or a VPN. The less that connects (especially to anything graphical) the better.
As I'm reading this in Linux, I'm scrolling down a page in Firefox in another window (albeit a complicated one but with no plugins), and Linux (AMD64 dual core, 6800GT PCI-Express 16 256, official nVidia drivers) stuttered through it very slowly.
Nice hardware 🙂
eBay is a good example where a complicated page (but commonly used)…
I mostly use the developer extension and disable everything except images. That means I surf with java, javascript, animations, cache, referrer logging, cookies, etc. disabled. Furthermore, I never use online authentication sites that involve *my money*. Fact is, my machine I can control; the server that stores my sensitive data I don't, and it is illegal for me to control it. Yes, I don't bank online either, although I have a unique extra hardware decrypting device. So I mostly surf OSNews, news sites, and everything that's doc/security related. Those sites are not complicated and render very well, especially with Adblock and appropriate filters.
However, my mileage doesn't correlate with average usage, and indeed a lot of people need java, javascript, cookies, and whatnot on by default, as dictated by those websites.
UT2K4 is great on both systems, however.
🙂 yes it does.
App -> gtk+ -> Cairo -> glitz -> GL -> hw
Does anyone know where the cairo/glitz RPMs for Fedora Core 4 are? I have upgraded from Rawhide, but that cairo isn't glitz-enabled, and building glitz is a bitch.
Official Cairo 1.0 has glitz disabled because the latter was not ready.
Kids grow up!
This whole X is slow idea is simply based on WHAT?
How many of you have honestly used X over the network / over NX / over Unix sockets? And can tell the difference?
(speed, technology, overhead-wise)
If X is soooo damn slow, why is something like E (Enlightenment), with lots of eye-candy, really speedy?
(And remember, Enlightenment uses SOFTWARE rendering… because the X11 software engine is the one that is most stable. There is even an OGL engine in the pipe, but there's not much work going on to make it work, because the software engine is fast and there are other topics to work on.)
The whole problem is not that the X architecture is SLOW. The problem is that some of the most used libraries and extensions are slow (randr, gtk, qt, …).
Even though qt seems to do quite well, gtk could use LOTS of speed improvements.
But remember this has NOTHING to do with X.
Thus let us conclude: X is fast, applications are SLOW.
PS: This might sound like I know nothing but E, Gnome, KDE. But many of you will know that other DEs/WMs feel more snappy too, like XFCE. I picked E because E comes with a WM, lots of tools that do NOT depend on either GTK or QT (engage, ibar, …) and ships with its own toolkit: EWL. – Please go and read about the guts of X.
Hum… “gtk could use LOTS of speed improvements”?
<Everything you said about X, but replace X with GTK+ and make other fitting replacements>
If you can’t see the irony in what you just said (basically, “what are you talking about, X is fast for me, it’s because of other things”)… well…
Btw, talking about X in this case is mostly meaningless. If we should be talking about something, we should be talking about a specific implementation. In the comments posted to this article most mentions of X should be replaced with X.org.
But yes, I agree, some slowdowns have nothing to do with X.org per se; however, some do (XAA, anyone?). Anyone who has followed the xorg mailing list on freedesktop for a while should have a pretty good grasp of that and of what is being done about it (or what is planned).
To the rest of you, get a *bleeping* clue before you open your mouths! Anyone can read the mailing list archives and browse the wiki (where some info on the performance work exists).
The arguments about layering are basically crap. Most of the "layers" are APIs (i.e. virtual layers).
When you have cairo -> glitz, that's actually one piece of software. As I understand it, glitz is an implementation of cairo that uses the OpenGL libraries. It's the same thing. It would be better written as app -> gtk+ -> glitz/cairo -> libGL -> hw.
Presumably X Render -> X -> XAA/EXA is all part of the same software package. It's not necessarily one thing calling another, which calls another, and so on. It's probably the top-level implementation hooking directly into the lower-level API. If an X Render implementation is accelerated, it's probably going to be using the low-level accelerated path.
Most of the layering stuff from the article is protocol and API layers. It's about differing implementations using different underlying structures to achieve the same result.
It's even dumber than the network transparency debate. It's true that the network protocol used by X is really dated and pretty crap. VNC and NX are probably a lot better (especially for high latency / low bandwidth connections). However, that means diddly squat for local workstation performance. Even Windows is built on loads of (insecure) network-transparent RPC crap, or don't you read all those security bulletins?
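To put the "most of the layers are APIs" point in concrete terms: with cairo, the drawing code is just handed a surface and doesn't care which backend sits behind it. A rough sketch using the image backend (pure software, no X involved); swapping in a cairo_xlib_surface_create() call would route the identical draw_scene() calls to the X server instead, with no change to the drawing code:

#include <cairo.h>

/* The drawing code neither knows nor cares which backend it is talking to. */
static void draw_scene(cairo_t *cr)
{
    cairo_set_source_rgb(cr, 0.9, 0.5, 0.1);
    cairo_arc(cr, 100, 100, 60, 0, 2 * 3.14159265);
    cairo_fill(cr);
}

int main(void)
{
    /* Image backend: renders into a plain pixel buffer in this process. */
    cairo_surface_t *surface =
        cairo_image_surface_create(CAIRO_FORMAT_ARGB32, 200, 200);
    cairo_t *cr = cairo_create(surface);

    draw_scene(cr);
    cairo_surface_write_to_png(surface, "scene.png");

    cairo_destroy(cr);
    cairo_surface_destroy(surface);
    return 0;
}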
Cairo and Glitz are not one piece of software.
Qt4 and Arthur are.
Funny how many experts emerge every time an article like this is published. For those bashing the modular design of Linux graphics: please read the article again. There's no way anyone will effectively maintain a 16M-lines-of-code beast, ever.
Thanks for the article.
I think those experts want everything coded in assembler and run as part of the kernel.
All these high level languages and system APIs are surely just bloat from their point of view.
Very likely the only thing they know about software development is that it results in software.
Your time "still has value", yet you spend your days and nights on OSNews ready to bash Linux? Ah, ok. You are paid for it. Nice job.
No.
I'm not sure how Apple or MS do it… but it seems to me that OGL at very high resolutions wouldn't work so well. Running E17 with OGL turned on at 800×600 on my machine screams… but at 1680×1050 it shudders and lags. Even simple demos understandably don't run well at such high resolutions. Makes me question this complete reliance people have on OGL as the back end.
If your hardware is not capable of drawing eye candy at 1680×1050, it doesn't matter whether it is being drawn with OpenGL or EXA. Turn off the eye candy or buy better graphics hardware.
OGL is the fastest API we have. If OGL can't do it, none of the other APIs can either. Also, it is not correct to compare a window manager doing translucency with one that is not and conclude that OGL is slower because the translucency is slower. Implementing translucency is computationally more complex than what old-style window managers do. It may well be that OGL is 2x faster but you have given it a problem that is 4x more complex.
X has been the de facto graphics standard in the unix world forever, as opposed to the proprietary and incompatible Microseft Windows graphics subsystem. X was designed to scale down and up, Microseft can’t do that. X is network aware, Microseft added rdesktop not so long ago. X is modular, Microseft isn’t. Features that the proprietary Microseft Windows XP system still doesn’t include have been in X for 20 years. Can’t wait to run a glitz-accelerated desktop while laughing at the ridiculous Microseft proprietary graphics toolkit.
I don't think we are going to get a glitz-accelerated desktop, according to Red Hat.
http://permalink.gmane.org/gmane.comp.freedesktop.xorg/2777
On 8/21/05, Owen Taylor <[email protected]> wrote:
> I can’t speak for other major toolkits, but there is no current plan
> to do this for GTK+:
>
> http://mail.gnome.org/archives/gtk-devel-list/2005-June/msg00166.ht…
>
> Is the mail of mine I keep pointing to. Having all the 30 or so
> apps that are running to make up your desktop be direct rendering
> clients just doesn’t make sense. Translating RENDER primitives
> into GL and speaking GLX over the wire doesn’t make sense.
>
X has been the de facto graphics standard in the unix world forever, as opposed to the proprietary and incompatible Microseft Windows graphics subsystem. X was designed to scale down and up, Microseft can’t do that. X is network aware, Microseft added rdesktop not so long ago. X is modular, Microseft isn’t. Features that the proprietary Microseft Windows XP system still doesn’t include have been in X for 20 years. Can’t wait to run a glitz-accelerated desktop while laughing at the ridiculous Microseft proprietary graphics toolkit.
rofl. incompatible microsoft windows graphics subsystem? where have you been for the last 15 years? windows programming (both api+mfc) has been the standard for gui applications for more than 15 years now, and gdi has always been an awesome part of it. and believe it or not, IT IS STILL #1.
most people still use windows and there are a shitload of windows developers. much more than nix developers actually, MUCH MORE. some of you live in a dream and look so ignorant day after day acting like microsoft never existed.
and hmm, you look even more ignorant by saying:
X was designed to scale down and up, Microseft can't do that. X is network aware
lol. first of all, 15-year-old applications still work perfectly under xp (im talking about win16 applications). microsoft and backward compatibility are GOOD friends. the base always remains intact and they just add stuff around existing work or on top of it. "Microsoft can't do that". haha. well, never heard of .NET (WinForms) or Avalon? stop trolling man, youre making a fool of yourself. dont talk about something you know NOTHING about.
anyway, that's why linux doesn't expand rapidly. people like you are ruining it, sitting on their asses claiming it's already better than everything else while it's not.
“This whole X is slow idea is simply based on WHAT? ”
On the way that all my Linux boxes play 2d games slowly.
Emulators, arcade games, whatever. If it involves blits to the screen then it’s slow.
X will take significant CPU time, and it will run jerkier than the same emulators on Windows or DOS on a slower machine.
Also, watch top while dragging around some browser windows with flash animations playing. The CPU usage for X is massive.
That’s just how it is. I only run Linux on my computers, but don’t like this aspect of it.
> On the way that all my Linux boxes play 2d games slowly.
> Emulators, arcade games, whatever. If it involves blits to the screen then it’s slow.
Then it is a video driver problem, not a problem with the X Window System.
Windows GDI, like AES/VDI and other APIs, was built for PCs, not for workstations and servers. That’s why they were less complicated and less scalable, but certainly faster. So don’t compare them directly.
I knew there would be a discussion like this on this topic. Linux has too many admins and geek users for a revolution in the graphics subsystem to happen. You are afraid of new things; you don’t want to make progress.
About the layers and levels of the graphics subsystem: it would be better, and every programmer would say so, if it were cleaner, easier and more consistent to develop for. That is what makes a platform popular.
I think a lot of why X is perceived as slow comes down to people using i386-based binaries instead of i686 or natively compiled code. For instance, on FreeBSD and Gentoo (I use FreeBSD), X is fast, and the same goes for Arch Linux, which uses i686 binaries.
If there is a solution, perhaps it lies in branching X: having an X for old graphics cards and one for newer systems. You can’t please everyone- why waste your time supporting ancient cards and thus having bloated code? Let those with ancient cards and systems devote their time and coding skills to improving X for themselves. Either branch X or rewrite everything (which seems like a waste of time).
PS. Only ignorant fools with limited vocabularies feel the overwhelming urge to use foul language on forums.
… Mozilla redraw is slow:
Mozilla is not slow because of X.
Mozilla is not slow because of Gtk+ or Qt either (it uses its own toolkit).
Mozilla on Unix uses libart to render everything … in software (GDI/GDI+ on Windows).
Well, that is the case for Mozilla up to 1.7.x / Firefox 1.0.x (included).
In Gecko 1.8 / Firefox 1.5, Cairo will be used for SVG and the new HTML canvas element; still libart and friends for HTML and CSS.
In Gecko 1.9 / Firefox 2.0, Cairo will be used for everything (even HTML and CSS rendering).
So don’t say X is slow just because you see Mozilla redrawing slowly on Unix.
As for Flash performance in web pages, Macromedia’s Linux Flash plug-in uses plain X, not even XRender.
Mozilla uses libart to render *SVG*, and on *nix it uses GTK+ as its low-level toolkit.
2D operations are as accelerated on X as they are on Windows; it’s just a freakin’ blit. The problem is the architecture: X is not multithreaded, event handling totally sucks and responsiveness is awful. Some people don’t seem to care, others do.
“Then it is a video driver problem, not a problem with the X Window System.”
On five different computers, all with different graphics cards, some with proprietary NVIDIA/ATI cards and drivers?
I can play games like doom3 fine on the machines that are up to it, and xv output with mplayer works on all computers. That’s not the problem. It’s not a problem with the X11 standard either.
The problem IS that the current X servers and video card drivers on Linux are slower for 2d than the same hardware under other operating systems running the same software.
2d games and emulators I’m not too bothered about, but they are a useful test case as they make the slowness obvious.
I’m not sure why he didn’t bother to mention the Open Graphics Project ( http://opengraphics.org ). It has lost its corporate backing and things have slowed down, but it’s STILL PROGRESSING. This project is dedicated to developing new graphics hardware that is specifically designed as an open architecture so that open source drivers can fully exploit the hardware. If more people were to get behind it, it would progress a LOT faster.
It was a survey of graphics software on Linux, not of graphics hardware. No one’s specific hardware was covered with anything other than a passing comment. I do think that a survey of graphics hardware for Linux would make an interesting article.
Great article. Lots of information, well written and very comprehensive!
I hope it gets updated from time to time…
The only downside is that the article covers just the Linux x86 situation and not how these changes in the X server affect other operating systems and platforms. (I want to know about the BSDs, Solaris and others too, as the X server is used by them as well and its evolution is tied to them too…)
> The only downside is that the article covers just the Linux x86 situation and not how these changes in the X server affect other operating systems and platforms. (I want to know about the BSDs, Solaris and others too, as the X server is used by them as well and its evolution is tied to them too…)
The article was about the state of Linux graphics, not the state of the X server. It was already twelve pages long. Covering other OS’s would add another two or three pages.
What ever happened to ‘Fresco’? ( http://www.fresco.org/ ) Wasn’t that supposed to replace X?
And then wasn’t there ‘Y Windows’ too? ( http://www.y-windows.org/ ) Wasn’t that supposed to replace X?
Did they all stagnate? Were these not good ideas? Or was there just too much momentum behind X for people to support these other projects?
For Y Windows, the dev team got too busy with real-life things and so development slowed down (stopped?).
Lack of developers is an enormous problem.
But some of the ideas are worth keeping, I think.
Finally, a good explanation of how all of this stuff fits together. Pretty complicated stuff. I’m glad smarter people than myself are working on it. It does sound like there needs to be some high-level cooperation among the major players (RedHat, Novell, Debian, etc) to get this working anytime soon. Otherwise, all these people will be wasting time on projects that only solve pieces of the problem and that don’t work together. I agree with the author that X.org should provide some sort of plan (which projects need work and how those projects fit together).
This article does point out one MAJOR issue that will be facing Linux users who are religiously opposed to proprietary drivers. You’ll either have to use Intel integrated graphics or run the desktop in a “reduced” mode. I don’t see nVidia or ATI changing their stance anytime soon.
http://r300.sourceforge.net/
They are working on OpenSource drivers for newer ATI cards.
Here is an interesting and related thread on LKML:
http://tinyurl.com/9fkrj
> http://r300.sourceforge.net/
> They are working on OpenSource drivers for newer ATI cards.
I’ll believe it when they have a reliable driver with decent performance. R300 was released in 2002 and ATI is getting ready to release a new generation of cards (R4xx was really just an evolution of the R300 architecture, R5xx is a completely different architecture). Without specs, it just isn’t possible to support this hardware in a timely fashion.
Wikipedia seems to think so…
“Common criticisms of X
The device independence and separation of client and server does incur an overhead. X’s network transparency requires the clients and server to operate separately. In the early days, this gave a significant performance penalty on a standalone system compared to Microsoft Windows or Mac OS (versions 1 to 9), where windowing was deeply embedded in the operating system. 4 to 8MB of RAM was recommended for reasonable performance; until the mid-1990s, this was regarded as bloated compared to Mac OS or Windows. In the present day, Windows and Mac OS X Quartz have internal subsystem separation similar to the client/server divide in X and comparable performance and resource usage to X with KDE or GNOME.” (emphasis mine)
Maybe that’s why there’s little perceptible difference between Linux and Windows graphics on a similar modern system, with similar hardware and up-to-date drivers… I know, from using both daily, that I can’t notice any speed drop with X compared to Windows. Of course, Windows has had a double-buffered desktop for a long time, and that gives the impression of faster redraws when moving or resizing windows. This feature, IIRC, is in the newest version of X.
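For anyone curious what double buffering looks like at the X protocol level, here is a minimal, hypothetical sketch using the long-standing DBE (Double Buffer Extension); it only illustrates the draw-to-a-back-buffer-then-swap idea, and is not a claim about how any particular desktop or X release implements it. It would build with something like cc dbe_demo.c -lX11 -lXext.

    /* dbe_demo.c -- hypothetical sketch: draw into a DBE back buffer, then swap. */
    #include <X11/Xlib.h>
    #include <X11/extensions/Xdbe.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int major, minor;
        if (!XdbeQueryExtension(dpy, &major, &minor)) {
            fprintf(stderr, "DBE extension not available\n");
            return 1;
        }

        Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                         0, 0, 320, 240, 0, 0,
                                         WhitePixel(dpy, DefaultScreen(dpy)));
        XdbeBackBuffer back = XdbeAllocateBackBufferName(dpy, win, XdbeBackground);
        GC gc = XCreateGC(dpy, back, 0, NULL);

        XSelectInput(dpy, win, ExposureMask);
        XMapWindow(dpy, win);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type != Expose)
                continue;

            /* Draw into the back buffer; the visible window stays untouched. */
            XFillRectangle(dpy, back, gc, 40, 40, 240, 160);

            /* Flip: the whole frame appears at once, no partial redraws. */
            XdbeSwapInfo swap = { win, XdbeBackground };
            XdbeSwapBuffers(dpy, &swap, 1);
        }
    }

Because the swap replaces the window contents in one step, the user never sees half-drawn frames, which is where the impression of faster redraws comes from.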
Way to selectively quote. You skipped the part right after that:
” In the present day, Windows and Mac OS X Quartz have internal subsystem separation similar to the client/server divide in X “
I’m really sick and tired of these endless X discussions. You know what the main problem of X is? Being a mysterious “X”! I see lots of people discussing “X” when they are actually referring to the Linux desktop environments (GNOME / KDE). X11 is just a piece of the technology that enables us to have those GUIs, not a GUI per se, as someone explained well with the (blazing fast) Enlightenment example.
In desktop environments we are dealing with lots of other components and mechanisms like WMs, toolkits, the ICCCM, etc. Could it be that some of those components lack optimisation, or rather that the particular X implementation (XFree86/X.org) does? That certainly affects overall performance (compare XiG’s Accelerated-X server).
X is bad because it is over 20 years old? Well… go buy an Itanium and explain to the world that those Opterons stink because they are built on the x86 architecture! It’s all a matter of R&D, which in turn is driven by market demand. Sometimes I even think engineers would be capable of making a maid out of grandma, given strong enough market demand. And don’t they do just that in plastic surgery?
Last but not least, desktop snappiness varies greatly between distros. Believe it or not, SuSE 9.3 is far more responsive on my PIII@700MHz with 512MB of RAM than SuSE 9.2 on [email protected] with the same amount of RAM, and further still, Vector Linux on a PII@300MHz with 128MB of RAM is on par with the former setup. And all three are running KDE 3.x, you know.
Linux and the whole OSS community need consolidation and focus on common goals, that’s a fact, but mucking with the X protocol without a decent effort to improve the implementation, drivers and other system components is simply plain wrong.
There are too many people not working together. It will always be behind Vista and OSX.
Yep, you’re right.
TO ALL:
Whether our X is “good” or “bad”, WE could do better. (Because nothing is perfect.)
And Jon wrote a nice article (even if it’s not perfect) about what WE could do better, so let’s do better.
rm -R “flame war” && make X
Working without direction is the highest level of stupidity.
We’re arguing over direction.
Anonymous (IP: 84.164.240.—),
I agree. And I apologize in advance for going on an off-topic tangent. But today, the bickering in this thread, in which I have perhaps participated, seems so very petty. There have been good points brought up, and people who have discussed and not bickered can safely ignore this.
But walking back from my morning trip down to the Conoco convenience store to get a bottle of tea, a little dog, cute little thing, got hit by a car about 25 feet away from me. The driver did not stop. I snatched her up and ran for my apartment complex, intending to get her to my car and to the vet. She died in my arms before I even made it back to my apartment. That has set the emotional tone of my day.
Later, I brought up cnn.com and the tragedy in New Orleans and along the rest of the gulf coast goes on.
So when I look back at some of the things in this thread, it makes me a little sick.
I’m not saying that the future course of X development is not important. Or that performance issues in Linux/X or XP or MacOSX or Firefox or GTK are not significant.
But a more cooperative spirit is always more productive than bickering. And that is a cross-platform idea.
Again, there has been some good discussion. But some of the discussion has been less than constructive.
I know this is coming off as preachy, and I apologize. But I think that we could benefit from a little more consideration for others’ ideas, and yes, their choice of operating systems and GUI frameworks.
Yes, this is a site for discussion of different OSes and their components. Discussion is great. Flaming reduces us all.
Again, sorry for the preachy, off-topic post.
I’ve installed the Athene desktop on Linux without an X server, and it ran fast; even Doom, which is usually very slow on my SiS-based video card with any Linux distro. Can’t this technology be freely ported to Linux?
I run Windows XP on an Inspiron 7000 (P-II 366 MHz, 128 MB RAM, 6 GB disk, 8 MB ATI graphics) and on it AutoCAD, Excel and Firefox with no speed problems in the graphics. I think there is no need to compare the X and Windows graphics systems; instead we must face the fact that X is heavy and slow, and if we want to make Linux an alternative for desktop users then X should be redesigned.
So, your using Windows makes you an expert on X11? I’ve run X on similar systems, not to mention Win95, and it is bearable. We used to run AutoCAD on 350 MHz PIIs in high school. I’d believe Excel is snappy too; it’s some of the best stuff Microsoft has ever shipped. Firefox though, riiiight. Firefox is slow on my 500 MHz K6-2, unbearably slow. It’s noticeably laggy on my 700 MHz Celeron as well. And it was slow on my 350 MHz PII too.
I believe that the big difference between Windows and Linux with X is how they prioritize the GUI. In Windows the GUI seems to get higher priority, which gives the user a sense of speed. In Linux, all processes get even-handed prioritization (unless you, as the user, decide to change their priorities).
You don’t think so? Hmm, then explain why some apps, like Prime95, are a lot speedier on Linux than on Windows. The basic functions are the same, I believe. I might be wrong here, and I’m neither a Windows nor a Linux buff. Please correct me if I’m mistaken.
Historically the X server was given a really low ‘nice value’. As convoluted as this sounds, a low ‘nice value’ means a higher priority. Most Linux distros don’t do this anymore, as it has been advised to just let the kernel scheduler figure out what priority the windowing system needs. While renicing the windowing system to -10 gave a ‘speed boost’, it was determined that this was prone to causing higher latencies and starving other processes. The ‘modern Linux way’ is to ‘get the bloody scheduler right’ so you don’t have to do hacky priority boosting to speed up the windowing system. IMHO, this is the right idea (especially since Linux isn’t all GUI like some other OSes: many GUI apps act as front ends to console apps, etc.), and you really shouldn’t ‘need’ to renice the X server if you are running a 2.6 kernel and pick a scheduler with good interactivity.
Con Kolivas is acknowledged as the “make Linux feel fast on the desktop” guy, and he helped spearhead the “don’t be a doofus and renice X” movement. His homepage is here (though the info on renicing X seems to be gone): http://members.optusnet.com.au/ckolivas/kernel/ His work has helped make the Linux desktop feel quick(er) for many of us.
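For anyone unfamiliar with what that renicing actually did, here is a small, hypothetical C sketch of the same operation at the system-call level (the program name and usage are mine, not anything a distro ships): it reads a process’s nice value and lowers it to -10, which on Linux means a higher scheduling priority and requires root. Running it against the X server’s PID is exactly the boot-time hack older distros applied; the point above is that a well-tuned scheduler makes it unnecessary.

    /* renice_x.c -- hypothetical sketch of what "renice -10 <pid>" does at the
     * system-call level. On Linux, a lower nice value means a higher priority.
     * Build: cc renice_x.c -o renice_x    Usage: ./renice_x <pid>  (needs root) */
    #include <sys/types.h>
    #include <sys/resource.h>
    #include <errno.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <pid>\n", argv[0]);
            return 1;
        }
        pid_t pid = (pid_t)atoi(argv[1]);

        errno = 0;
        int before = getpriority(PRIO_PROCESS, pid);  /* -1 is a legal value, so check errno */
        if (before == -1 && errno != 0) {
            perror("getpriority");
            return 1;
        }

        /* -10 was the traditional "boost the X server" value. */
        if (setpriority(PRIO_PROCESS, pid, -10) == -1) {
            perror("setpriority");
            return 1;
        }

        printf("nice value changed: %d -> %d\n", before, getpriority(PRIO_PROCESS, pid));
        return 0;
    }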
And again… you are referring to the X implementation and other system components like toolkits, high-level drawing libraries etc., and not to the X protocol itself!
Just compare the performance of the XFree86 or X.org server to a commercial one, like XiG’s “Accelerated-X”, and you’ll see that there is more to the implementation of X than is commonly believed. Meanwhile, work on X extensions is being undertaken.
I guess that most commenters didn’t grasp the article’s intent. It wasn’t about X11 being slow but about the great debate between X developers over the future direction of X development.
On one side of the fence are the people who say the venerable X11 architecture has served us well for the past 20 years, but the time has come to redefine what the desktop is and how it should adapt to and exploit the huge advances in GPU technology. The current X architecture isn’t up to the task, and it is easier to take some other framework (one that does a good job now, is open and has industry backing) and build X11 compatibility on top of it than to try to morph X into something completely different.
On the other side are the current maintainers, whose goal is evolutionary development to close the gap between X11 and the current state of technology on the competing OSes.
The author’s concern is not that X is slow (actually, if you run apps using the standard toolkit designed for X, Athena, you will see they are blazingly fast) but that the X model/API poorly matches the capabilities of modern hardware and prevents applications from accessing hardware-accelerated features that, used properly, could make a real difference (and will, if and when WinFX programmers learn how to turn the display model’s power into usability enhancements; they have only just started).
Those are, among others, GPU shader filters (e.g. advanced AA), direct incorporation of video widgets into desktop application windows, and dynamic, very detailed vector graphics.
He also argues that the proposed evolutionary development path (RENDER+EXA) is too little, and that in 2-3 years X will be distinctly obsolete and visually inferior to Windows/Mac, while not providing any sensible path for advancement.
He also proposes a technological and organisational approach (allying with mobile/embedded industry players who have a lot to lose if MS dominates) that makes a lot of sense to me.
IMO it is scalable, consistent, future-proof and unifies the Linux desktop and embedded development fronts.
The rather sad conclusion is that the evolutionary approach has won (as indicated by the failure of Xegl), because most X developers’ employers would rather have something working quickly than focus on the long term. Given the Linux desktop opportunities currently opening up (Asia), we really cannot blame them.
DS.
One can also say the failure of Xegl doesn’t have to be a failure if it now leads to long-term development in which everyone takes part. (Wasn’t that the so-called failure?)
And who besides our community can deliver a good long-term solution? We don’t have the big business pressure that so often derails good plans.
That’s also our chance when you think about the competition.
That’s a good summary. You are also right that the incremental people have won. I think it is very likely that the X server of 2010 will look a lot like the one you have today. Some bits will be better but there will be no major changes.
The incremental people are focused on extremely low-end desktop-like hardware like sub-$100 notebooks. Changes in high-end hardware are being ignored. They seem to be forgetting that today’s supercomputer becomes tomorrow’s desktop.
Perceived difference is worthless. You can’t visually see a few milliseconds of speed improvement; your eyes aren’t that quick.
I have a little image-processing app that I compiled on Windows and then ported to and compiled on Linux using CImg. On Windows CImg renders with GDI, on Linux with X. Compiling with GCC with the same switches, I saw about a 40% speed IMPROVEMENT on Linux over Windows. Same system, same program, same compiler.
Let’s see some real comparisons instead of this “I sorta kinda think this here one might look a bit faster” stuff.
There are many ways to produce a perceived increase in speed while actually slowing the system down overall. More kernel preemption means more context switches, which means less throughput.
POV-Ray would be another good way to benchmark identical system performance between Windows and Linux. Let’s see some numbers!!
(I seriously doubt anyone is still reading comments, I just wanted to give my 2cents)
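In that spirit, here is a rough, hypothetical sketch of the kind of number that could be compared across systems: timing raw full-window XPutImage blits on the X side (a GDI loop doing BitBlt/StretchDIBits would be the Windows counterpart). The window size, frame count and the assumption of a 24/32-bit TrueColor visual are arbitrary choices of mine, and real results depend heavily on the driver and on whether MIT-SHM is used; it would build with something like cc blit_bench.c -lX11.

    /* blit_bench.c -- rough sketch: time N full-window XPutImage calls.
     * Assumes a 24/32-bit TrueColor visual; older glibc may also need -lrt. */
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define W 640
    #define H 480
    #define FRAMES 300

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0, W, H,
                                         0, 0, BlackPixel(dpy, scr));
        GC gc = XCreateGC(dpy, win, 0, NULL);
        XMapWindow(dpy, win);
        XSync(dpy, False);

        char *data = malloc((size_t)W * H * 4);
        memset(data, 0x7f, (size_t)W * H * 4);
        XImage *img = XCreateImage(dpy, DefaultVisual(dpy, scr),
                                   DefaultDepth(dpy, scr), ZPixmap,
                                   0, data, W, H, 32, 0);

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < FRAMES; i++)
            XPutImage(dpy, win, gc, img, 0, 0, 0, 0, W, H);
        XSync(dpy, False);               /* wait for the server to finish drawing */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("%d full-window blits in %.3f s (%.1f fps)\n",
               FRAMES, secs, FRAMES / secs);

        XDestroyImage(img);              /* also frees data */
        return 0;
    }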
@ jonsmirl:
What I’m not able to comprehend is why we can’t use the same drawing API (aka New_Xlib or Cairo) on all systems, with 2D and 3D backends (drivers) available for legacy and new hardware respectively. The old driver model (fbdev/DRM) would accelerate drawing on legacy graphics hardware, and OpenGL drivers would do it for newer hardware through DRI.
Isn’t that the way Microsoft did it with DirectShow, which is able to use two different driver models at the same time (VfW and WDM)?
Users, of course, would be able to fine-tune the amount of eye candy and effects to match their system’s performance.
That way we could progress with only minor alterations to existing code. Speaking of which, I think the Xglx approach is very good for compatibility reasons, and that sort of backend is nowadays an absolute must for any graphics project that wants to gain popularity (just as XDirectFB tries to be for DFB).
P.S. That was one hell of a good article! Many thanks.
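As a rough illustration of the “one API, multiple backends” idea, here is a hypothetical Cairo sketch (my own example, not from the article or from jonsmirl): the same drawing routine is pointed first at an in-memory image surface (pure software rendering) and then at an X drawable via the Xlib backend, without the drawing code changing at all; a glitz/OpenGL surface would slot in the same way. It assumes Cairo was built with PNG and Xlib support and would build with pkg-config flags for cairo, cairo-xlib and x11.

    /* cairo_backends.c -- sketch: identical Cairo drawing code, two backends. */
    #include <cairo.h>
    #include <cairo-xlib.h>
    #include <X11/Xlib.h>

    /* Backend-agnostic drawing: nothing here knows where the pixels go. */
    static void draw_scene(cairo_t *cr)
    {
        cairo_set_source_rgba(cr, 0.2, 0.4, 0.8, 0.6);   /* translucent blue */
        cairo_rectangle(cr, 20, 20, 200, 120);
        cairo_fill(cr);
    }

    int main(void)
    {
        /* Backend 1: software rendering into a memory buffer, saved as PNG. */
        cairo_surface_t *img = cairo_image_surface_create(CAIRO_FORMAT_ARGB32, 256, 160);
        cairo_t *cr = cairo_create(img);
        draw_scene(cr);
        cairo_surface_write_to_png(img, "scene.png");
        cairo_destroy(cr);
        cairo_surface_destroy(img);

        /* Backend 2: the very same calls against an X drawable. */
        Display *dpy = XOpenDisplay(NULL);
        if (dpy) {
            int scr = DefaultScreen(dpy);
            Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                             0, 0, 256, 160, 0, 0,
                                             WhitePixel(dpy, scr));
            XMapWindow(dpy, win);
            XSync(dpy, False);           /* a real app would redraw on Expose */

            cairo_surface_t *xs = cairo_xlib_surface_create(
                dpy, win, DefaultVisual(dpy, scr), 256, 160);
            cr = cairo_create(xs);
            draw_scene(cr);
            cairo_surface_flush(xs);
            XFlush(dpy);

            cairo_destroy(cr);
            cairo_surface_destroy(xs);
        }
        return 0;
    }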
I don’t see that the incremental people have won. Actually, from the look of it, it is the other way around. XEGL and XGL are about accelerating X apps. THIS is the incremental approach (am I wrong about this?).
GTK+ (Gnome) and KDE (Qt) apps are moving in the direction of sidestepping X completely. GTK+ will use Cairo and KDE will use Arthur, both of which render using X. However, they can also render directly to glitz/OpenGL (with hardware acceleration, without X).
Apps written directly to X will be supported with a rootless X server rendering to bitmaps (as with OSX and Cygwin) which are then rendered to the screen by a OpenGL window manager as textures. These X apps won’t see any real hardware acceleration since they are rendering to bitmaps, but this is the legacy path.
A new networked “X-ish” protocol would certainly be nice for the future, but I don’t necessarily see it as important for having an eye-candy heavy accelerated desktop.
The main conclusion I get is that there are currently a lot of projects out there with conflicting goals that don’t really fit well together. Somebody (X.org) needs to try to work out a roadmap and get everyone working in the same direction (hopefully through consensus building).
Am I missing something here?
Yes, you are missing something. GTK+ and Qt are _not_ moving in the direction of sidestepping X completely.
Owen Taylor on GTK+: http://lists.freedesktop.org/archives/xorg/2005-August/009356.html
Lars Knoll on QT:
http://lists.freedesktop.org/archives/xorg/2005-August/009362.html
The mailing lists archives are open to everyone, especially those who open their mouths without knowing what they talk about (sheesh!). And if you’re really interested, subscribe and lurk around for a while. Most comments on OSNews regarding X.org (and X in general) get to be pretty funny if you do.
> I don’t see that the incremental people have won. Actually, from the look of it, it is the other way around. XEGL and XGL are about accelerating X apps. THIS is the incremental approach (am I wrong about this?).
EXA is the incremental approach. XEGL is about switching the core API to OpenGL and providing compatibility. EXA is about incrementally extending the X server API to encompass more features of the 3D hardware.
The LKML discussion mentioned above ( http://marc.theaimsgroup.com/?l=linux-kernel&m=112554740629800&w=2 ) reveals that the problems with the XGL approach come down to a shortage of skilled OpenGL hackers who would optimize the common 2D usage patterns of existing software in the drivers and extend the standard implementation where it falls short.
There are two features that, when exposed in the high-level API, basically exclude older (pre-DX9) hardware: shader programmability and heavily vectorized user interfaces (only really usable when transforms are accelerated). It won’t be easy to write an app that makes use of them optionally. MS seems to be aware of that too, and I guess that’s the whole point behind XAML.