Over the past couple of months, and especially over the past couple of weeks, I’ve been working very hard to write and complete my thesis. I performed all the work on Windows 7, but now that the thesis is finally done, submitted, and accepted, I installed Ubuntu – and immediately I was reminded of why I do not do any serious work on Linux: the train wreck that is X.org.
Yesterday, I was quietly using my Ubuntu 9.04 installation. I have few problems with GNOME as a desktop environment, except maybe for the fact that Evolution and I don’t get along – for some reason, Evolution crashes about five times a day. It has been doing this for years now, on different machines, and I still don’t know why. After a crash, I need to kill all the Evolution-related processes to get the mail client working again. My wish: replace Evolution with a simple, to-the-point email client, and leave Evolution to the big boys that really need it.
Anyway, I had a whole bunch of applications running, but I decided to have some fun and watch an episode of 30 Rock. I browsed to the directory, and double clicked on the episode I wanted to play. Totem loaded up, but playback was unbearable. My entire quad-core 4GB powerhouse was brought to a screeching halt, and the video playback was choppy, audio lagging – it was terrible.
Since I never had much love for Totem anyway, I installed VLC, and I tried to play the video with that media player. VLC performed better, and the video was actually watchable, but it did make the rest of the machine quite slow and unresponsive. Then, I decided to resize the VLC window to make it a little bigger.
Poof.
And here we see why the X.org stack is a steaming pile of dog poo. VLC was using the XVideo output, which is what it defaults to. Apparently, the resize operation crashed XVideo. Which crashed X.org. And as all you educated Linux geeks know but some of you might want to forget: if X.org crashes – so do all of your applications. Evolution. Chrome with a number of tabs open. Pidgin with a number of IM windows open. Twitux. Evince with an insanely cool study open (‘Mathematical Modelling of an Outbreak of Zombie Infection’). OpenOffice.org Writer displaying a friend’s thesis which I was proofreading.
They were all gone.
Now I know what’s going to happen. The Linux fans will come out of the woodwork, and they will start doing two things. First, they’ll say that this is not a problem, as the machine is still running, and X restarts – and if not, I can ssh into it. Second, the finger-pointing will begin. Always with the finger-pointing!
I don’t care if X restarts, and I don’t care that I can ssh into it. I’m a user, and what matters to me are my applications, and the data and documents they’re holding. X restarting or the ability to get my nerd on by ssh’ing into my box is pointless and useless.
And then we get the finger-pointing. I’m sure the usual suspects are already busy churning out comments about how this is the fault of the driver, VLC, myself, the 30 Rock video, planetary alignment, anal probes, whatever. And all of you will miss the point completely: I don’t care where the problem lies. Bugs happen. Crashes happen. That’s a fact of life, and something that you have to accept when using software. However, under no circumstances should resizing a video window result in a complete system failure!
Let’s take a look at operating systems with a modern, advanced, and robust graphics stack: Windows Vista and Windows 7. Both of these have a graphics stack that is so far ahead of X.org that it’s like comparing an Airbus A380 to the Wright Flyer. Do you know what happens when a graphics driver crashes on Windows Vista/7?
Absolutely bloody nothing.
The screen will flicker a few times, you might get dumped back into Aero Basic for a few seconds, and then the graphics driver will be reloaded and everything will be back to normal. A dialog may appear asking you to send debug information to Microsoft. None of your applications will crash, no data will be lost – in fact, you’ll barely even notice. The graphics stack in Windows Vista and 7 is so advanced that even updating graphics drivers does not require a restart of the operating system or even the graphical environment – the screen will flicker, Aero Basic for a few seconds, and poof, new drivers running (note, though, that graphics chip makers do not yet take advantage of this, and will still force you to reboot. Use Windows Update, and you’ll see how they should do it).
The end result of all this is that I simply cannot entrust my work or my documents to a Linux installation. The graphics stack is so badly designed that resizing a video window can bring down the entire stack, taking with it all your applications, work, and documents. It would be like changing the radio station in your car, only to have your entire car explode.
This is simply bad design through and through, and it has been haunting the Linux desktop for a long time now. In the X world’s rush to latch onto the “me-too” bandwagon of GPU acceleration, they completely forgot to actually fix their bloody design and move it from the 1990s into the 21st century. As long as X stays the way it is, I will never advise any of my friends or family to use it, because I know that X.org is simply not capable of robustly powering the multimedia and multitasking machines that computers of today have become.
I’m sure the blame-game will be played thoroughly in the comments, but it will only detract from the real problem here. The Linux desktop needs a modern, robust, and advanced graphics stack, which makes sure that crashes and bugs remain isolated, without them affecting the users’ work. Microsoft has shown us how it’s done, now all the X world needs to do is follow.
Nerd, Geek, what’s next? But yeah, I do agree with you with regards to Xorg 7 needing to improve.
Edited 2009-08-15 18:10 UTC
Any true geek wears that name as a badge of honor.
I *completely* agree with you on this one Thom. You hit the nail on the head.
I agree. Windows 7 just works.
just like Vista
But… it’s not Vista.
Windows 7 does work.
No! No! I agree that X needs to learn a lot from Windows Vista & 7 – namely, how not to write a memory-hogging, slow UI. (X is already light on memory, and it is fast.) We just need to make sure we do not follow the path of Vista & 7.
More like: you hit yourself on the head!
Pro Tip: Get gnome-mplayer and use VDPAU output. XVideo is RUBBISH and the world has moved on. If you use totem you get what you deserve, it’s a bloated piece of crap.
Don’t bother to teach X anything, just replace it with something a lot simpler and more streamlined. Leave X to the big, networked GUIs that really need it and give the average desktop user something much simpler. X is extremely dated, and no amount of work save a complete rewrite can hide the fact that it’s decades old and crumbling under its own weight and obsolescence.
It’s time to do what Apple did, develop a new low-level graphics stack. It’s not easy, and it won’t be done quickly, but it needs to happen. Linux is solid, the GNU userland is solid, GTK/QT and most of the apps are solid… but X is not. Next/Apple understood this and dealt with it appropriately, and I think it’s time for the foss world to follow their lead.
There’s no reason to talk about the “Linux fans” this time, as X is a problem that plagues all variants of UNIX and UNIX-like systems save for OS X, and whose problems are generally applicable to all environments that are cursed to deal with it.
Edited 2009-08-15 18:12 UTC
Care to qualify your dramatic prose with some examples of the crumbling weight of X? Perhaps you haven’t followed how many extensions and how much legacy code have been removed from the X server and libraries, nor have you followed the addition of EXA/UXA, GEM/TTM, Gallium3D, KMS, Pixman, DRI2, RandR 1.2 and 1.3, etc. Part of the problem is the huge amount of churn in new features and architecture: not enough time and manpower to do testing and debugging.
Part of the perceived mess is that we are all out there in the trenches testing this stuff, while Microsoft’s messy phases are happening inside Microsoft; and what you eventually see is really a “finished” product.
The key to doing “serious work” on Linux is being pretty conservative about what you are doing. If you have unsaved data, don’t try anything new (like a video player you haven’t verified to work before).
VLC is a normal application. It’s not some weird, alpha-quality software. It’s a tried-and-true top-notch video player that kicks the arse of most other video players out there.
But you are also missing the point. Even if VLC was brand new, alpha, and not tested to work, resizing its window should still not result in a system failure!
Edited 2009-08-15 19:04 UTC
Uhh, who’s accusing VLC of being at fault? It’s clearly a bug in the driver and there’s no debate about that. The drivers are in a less than stellar state these days.
Clearly, it was not “tried and true” on your setup ;-).
Indeed it shouldn’t, but currently, stuff like this does happen, and not just with VLC. That’s why you should try new stuff only when you have saved your data. Fixing these glitches is a job for other people, and end users pretty much need to route around the damage by being conservative about what they use. For serious work, you often have a limited set of applications you need to use, and video players rarely are part of it; so the situation, if not optimal, is still quite tolerable.
Admittedly, it’s less tolerable when the crash happens to you the first time ;-).
Your flip answer highlights the problem for end-users dealing with Linux. “Oh that? It’s no big deal! Sure you’ve lost a day of work and play, but in the grand scheme of things, it’s nothing.”
You advocate “routing around the damage” by only using apps that you know to work. You know what? Most users will do precisely that. They’ll route around the damage by using Windows or OS X, where they won’t have these kinds of worries. Computers are tools for people to Get Stuff Done(tm). If the tool is not letting users Get Stuff Done(tm), then it is broken. People will move to a working tool instead.
It’s a bit sad. I’m a fan of Linux in principle, and work with it every day, day in and day out. But it has a LOT of warts and issues that make it unsuitable for end-users that want to use the system, instead of having the system use them. I understand where fans of Linux are coming from. They like the principles of Open Source, and see that Linux goes a long way toward providing computing freedom for people. But they are often willing to overlook things that truly are show-stopping bugs for end-users. To them, the principle is more important than the practice. But to end users that don’t have the OSS religion, all that matters is the experience of using it.
Yes, and we can expect there to be a group of people that can’t get stuff done in Linux (because of particular software/hardware needs) for years at least. Still, there are those of us that can do all they need/want to do on Linux already, and can tolerate some pain (on the other hand, I just hate doing any work in Windows these days – so even with some bugs, Linux still wins by a wide margin).
All I’m asking is *reasonable* (Windows 95 level?) functionality in most areas, and I’m happy. To stay on topic, the problem brought up in this article hasn’t bitten me, so Linux behaves “reasonably” in this respect for me.
Hi Thom,
The main problem with X is that the software development lifecycle is a bit of a mess: I suspect there are not a lot of testers, not a lot of developers, and people don’t report bugs in the software.
… did you? If there is a bug and you don’t report it how is it supposed to be fixed?
Not to have a go at you personally – I have done it myself … e.g. I complained to a friend about an ACPI bug in OpenBSD and he said “did you bother reporting it?”, and I had to admit I hadn’t.
There are many on here that say “let’s start again”, but anyone that has ever worked on a halfway sophisticated software project knows that it is not feasible with the developer power available.
The new chairman of Xorg is an OpenBSD guy (Matthieu Herrb), and people from the OpenBSD team are very anal about their release cycle and code quality (which is a good thing), so things should improve. I have already noticed a great improvement in monitor detection by X.
Btw I love the podcasts.
Luke
Although most of the so-called criticism of X11 is plain wrong and the presumed critics (including Thom) haven’t got the foggiest idea about what they are writing about, all those neat things you mention don’t really count, as they don’t work yet. And development has slowed down a lot.
I don’t know what Thom does wrong, but I can play 1080p xvid (Matroska x264 seems too demanding, but 720p works) on my aging Athlon 64 3200+. It’s from 2005, never top of the line. ATI 1950 Pro AGP graphics card. Resizing video works fine as well, but not smoothly unless I turn off compositing. If X.org crashes, it’s due to a bug in X.org, not in X11 itself; if it can’t play video properly, it’s because of a poor driver or poor configuration.
Well, I agree that development isn’t in the best of places right now, mainly due to lack of manpower. Working on X probably isn’t the most glamorous thing (although I find it quite interesting, albeit beyond my abilities to handle at this time) and they really don’t have much in the way of documentation for developers. All of those neat features *could* work if the time and effort were put into making them work and into working with the toolkit devs to tie it all together. And to the credit of the devs, this does happen for some of the more important features and driver issues. But so much more could be done, and it’s sad that it isn’t being done and likely won’t be done.
I don’t know if they’re doing something “wrong” per se, but I too have never had X crash from resizing a window. I too can play videos in a variety of formats (including HD videos) on my relatively low-powered machines, in both windowed and full screen, without any problems whatsoever.
I’ve managed all of the above on a variety of different hardware.
That said, I’m neither an Ubuntu nor a GNOME user (Arch + KDE), so perhaps it was GNOME / Metacity throwing the hissy fit?
Either way, it’s disappointing to read that this happens in any “mainstream” Linux install, even if it’s not something I’ve experienced personally.
Now that I’ve got a better handle on how X works (thanks in no small part to those in this discussion who have set me straight) I am starting to wonder if it really is the distro at fault and not the window server.
I have severe issues playing back the simplest video codecs with Ubuntu. Youtube flv, divx, xvid, even motion-jpeg all in standard definition and less than 1500kbps are choppy and sometimes crash or lock up under Ubuntu 9.04. This is on an Athlon 64 (socket AM2+) with 1GB memory, Nvidia 8100 video and a SATA hard drive. It’s no speed demon but it’s more than enough muscle for such mundane tasks. On Windows 7 I can play back any video I choose, even while running other processor intensive things and it never misses a beat. Slackware 12.2 handles the video fine too. So what’s the deal, Canonical?
I’m dropping Ubuntu for Slackware as soon as 13.0 hits the servers. Maybe it’s the bleeding-edge X, or maybe it’s Ubuntu’s kernel tweaks. I don’t care what it is, I just want my computer to do what I built it for and do it well without being bound to commercial OSes.
None of which has fixed the issues that Thom, myself and many, many others have dealt with for years. Yes, I’m typing this from Linux because I still prefer a *nix environment to Windows. But, just yesterday I had some severe graphical issues that I’ve narrowed down to the version of x.org included with Ubuntu 9.04. I may give 9.10 a go when it’s released, or I may just fall back to Slackware when 13.0 comes out; I’ve been itching to try out KDE 4 anyway. None of which may even solve my specific issues (garbled display and unresponsive input devices when ANY compositing — compiz, metacity, xfce — is turned on).
It’s simply amazing to me that the scores of highly talented developers and testers in the F/OSS community can’t do for X what they’ve done for so many other projects: make it as rock-solid as anything in the commercial world. Perhaps someone out there can come up with a more desktop-friendly UI manager. After all, X wasn’t originally intended to even be installed on a workstation, and that legacy still rears its ugly head today.
In agreement with your post…
None of the distros care about X, really. That’s the problem and it’s sad. If Red Hat or Canonical or IBM would put in serious manpower for X like they do for the kernel (well, except for Canonical), it could be really awesome and would make Linux on the desktop more of a possibility. Right now, it’s a spare engineer here or there, or volunteers. There are a lot of great ideas and some smart people. Just not enough time and manpower.
That is indeed a major issue for me. I’m dying to see something comparable to OS X from the F/OSS world, but in my opinion there’s too much politics on one side and apathy on the other for it to ever happen. It’s “good enough” for the unpaid developers so it’s “f–k you” to the users.
I sincerely hope Haiku will prove my observations wrong.
Edited 2009-08-17 02:40 UTC
This must be emphasized. The X Window System is better suited to non-desktop usage, and a lot of UNIX and Linux users know this by heart.
Only those Linux users who don’t know anything and are looking for something to blame for their problems.
All the other operating systems have gone to the X model with a server process for graphics: dwm.exe on Windows and the compositor on OSX.
If you want me (I can’t speak for other people) to take your X complaint seriously, show us the API usage in Windows or OSX that compares to X and how the other desktop system is better.
Now, believe it or not, I do believe Win7 is currently far more advanced because of the work MS put into multithreaded API usage. But Windows Vista and previous versions are not.
I don’t know how Vista/Win7 works, but Quartz in Mac OS X is nothing like X.org. It does NOT adopt a client/server paradigm the same way X does.
X requires clients to send simple, 80’s-style graphics commands. Quartz has taken a more evolved approach, where applications send it PDF-style commands that are automatically antialiased and resolution independent.
It used to. NeXT had Display PostScript, and I am not sure if the Apple remote desktop in their server edition still uses it. The normal desktop just uses VNC for remote connections.
Anyway, as for X’s remote capabilities: RDP nowadays runs circles around X, and RDP originated from a framebuffer protocol. The reason for this is that the network primitives of X do not scale well – they can easily clog up your network connection if you run a modern desktop. You can add various non-standard compression schemes to it, but in the end there is no real standard for doing this.
I personally think this is a joke in itself. One of the reasons for the complexity of X is that it is network-enabled by default, yet it scales worse on modern user interfaces than all the other remote desktop protocols that do it in a simpler way.
If you don’t like calling it a graphics server with clients, then how would you like to describe it?
Quartz by itself does not have the capability to run client applications over a network socket. X does.
Edited 2009-08-17 08:46 UTC
How does ARD work then?
Remote Desktop is *not* part of Quartz.
If I’m not mistaken Apple’s remote desktop is based on VNC. I’m not sure at all about how Windows’ remote assistance feature works, but I highly doubt it’s low level enough to be a core part of the graphics subsystem.
I’ll bite.
As I stated in another comment, X has its roots in a server/thin client model, way back when it wasn’t economical to have a local windowing system on a workstation. Today, that model is still at the heart of X and in my personal opinion is one hundred percent useless in a desktop OS.
Windows’ and OS X’s UI stacks have never, ever had that ability, because they never needed it.
It’s time for X to move aside and make room for a modern, responsive and rock-solid-stable UI in the vein of Quartz/Aqua.
“That model” is also at the heart of Windows Vista and OS X.
Programs that want to display graphics (aka, “Clients”) send commands to a separate program that actually does the display (aka, “The Server”).
It doesn’t stop being client and server just because the parts run on the same system.
Then why the user-observable latency on X that is not present on the others? Windows, BeOS, OS X, OS 9, OS/2 Warp…none of those have the latency and stability issues of X11, and all of them were designed for desktop/workstation (i.e. standalone) use. If they truly used the exact same client/server paradigm they would be plagued by the exact same issues. You appear to be badly misinformed.
You can say that the others are “similar” in concept but in practice they all are a vast improvement over what X has been. So my question is, why the hell can’t X be as good as even the least of those?
Wrong on both counts.
First of all, the client / server model used by X is almost (but not quite) entirely invisible when you’re using a local server. No networking is involved. Communications between the client and server are handled through unix sockets and shared memory. Furthermore, when using OpenGL on a local machine, the X server is used only to set up the OpenGL context – the application communicates directly with the video drivers.
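To make that concrete, here is a toy sketch in plain POSIX C (not actual Xlib code – `local_roundtrip` and the single socketpair are made up for illustration) of the kind of local transport involved: a Unix-domain socket, which never touches the TCP/IP stack at all.

```c
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Toy stand-in for a local X connection: a Unix-domain socketpair.
 * Real X clients connect to a socket like /tmp/.X11-unix/X0; for a
 * local display, no networking stack is involved. */
int local_roundtrip(const char *request, char *reply, size_t len) {
    int fds[2];
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) < 0)
        return -1;
    /* The "client" end writes a request (including the NUL)... */
    write(fds[0], request, strlen(request) + 1);
    /* ...and the "server" end reads it back, entirely in-kernel. */
    ssize_t n = read(fds[1], reply, len);
    close(fds[0]);
    close(fds[1]);
    return (int)n;
}
```

The real protocol additionally uses shared memory (MIT-SHM) for bulk image data, so even large transfers avoid copying through the socket.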
There’s only one problem with X, as it stands, for local connections. Oddly enough, it’s a problem that’s even worse for remote connections – latency. Specifically, there are a lot of operations (thanks largely to xlib) that are synchronous. An application has to send a message to the server, and then wait for a response. XCB (a replacement for xlib) pretty much fixes that, but it isn’t supported by any UI toolkit just yet, and proprietary video drivers (nVidia and AMD) don’t support OpenGL using XCB.
So that’s not a fundamental design flaw – it’s an implementation problem. And one that can be fixed. The latency problem actually has been fixed for remote connections – NX acts as a proxy, and handles some of the messages directly, rather than waiting for the remote server to respond.
The architecture used by Windows’ windowing system is very similar to X. In Vista, the windowing system is just another process, like any other. Applications communicate with that process using local IPC and shared memory. Direct3D and OpenGL are handled the same way they are in X, except there’s no support at all for accelerated rendering over a network link. RDP is implemented by having a different windowing server, which acts like NX – it sends some of the messages over the network link, and handles some others itself. Windows XP works the same way, except the window server resides in the kernel instead of a user process.
RDP is a fundamental, inseparable part of the Windows windowing system, so you can’t really argue that X is inherently worse than Windows because network transparency is a fundamental part of X.
As for OS X… Its windowing system is implemented much like a local-only X server. It’s still a separate process, and applications use sockets and shared memory to talk to it. It doesn’t have a separate network protocol, so it doesn’t have a fallback for when shared memory isn’t available, but otherwise it’s very similar to X.
The only real difference between Windows / OS X and X11 is that, in X11, the window manager is a client process that talks to the server, while the window manager equivalent in Windows and OS X is built directly into the windowing system.
Other than that, it’s all a matter of implementation.
Now, the implementation of X on Linux may suck, but that doesn’t mean it needs to be thrown away. Especially considering that, for any replacement to be a success, it would have to implement most of the X protocol anyway. If it didn’t, there wouldn’t be any applications that could run on it, so nobody would use it.
You can’t seriously, with a straight face, tell me that because they use processes and threaded messaging that Windows’ and OS X’s GUIs are “just like X”. They are so fundamentally different in nearly every major aspect even a school child could point it out. The latency is the biggest tell-tale; there is no perceivable latency in the commercial OSes because they aren’t wasting time looking for a god damn networking stack to send the commands over! From the ground up it’s a different design. Yeah, there will be similarities because they all want to accomplish the same basic goal: Drawing windows and providing a workspace. However, your entire post may as well have said “a car is really just a horse drawn buggy because they both get you somewhere”. Sorry for the tired car analogy but it’s true.
As Thom said, when the Windows GUI backend crashes it doesn’t take other processes out with it. X does. That’s a flaw, and a f–king major one. I guarantee that if X were either redesigned or replaced with something that left out the legacy networking crap and went with a modern methodology we would see the same performance and stability we see on real window systems.
Really you can. Windows stack, for example, has been redesigned to be a real client-server thing since NT (4.0 at least). And X really does not use the network stack when both the server and the clients are running on the same machine. As previously pointed out, it uses shared memory and/or unix sockets, which are actually exactly the same thing that Windows and Mac OS X are using for communicating between the server and clients.
There is nothing fundamentally wrong in X. It’s almost the best architecture available, and (look for MS documentation in MSDN or ask ReactOS devs) others have been copying it. You really don’t have to send 80’s drawing commands to the server through the network. On the local machine nothing ever hits the net, and XRender has the same cool features as Mac OS X or Windows. And believe it or not, Mac OS X and Windows also have support for commands like “draw line from x1,y1 to x2,y2”.
There are, however, bad implementations, and especially bad drivers, around. It is really unforgivable that there have been bad experiences with X, but in the end it’s a matter of manpower. MS has money and manpower; the graphics card makers have money and manpower. FOSS has neither.
The latency issue is not so much because of the network or the design, but because of bad drivers AND because MS has the graphics drivers in the kernel, not in user space. During the early days of NT 4.0 and W2K this was a really bad decision, but nowadays MS has got it right and working really well. Linux is also moving in this direction with KMS and Gallium etc.
Thank you for explaining that to me so well. I have been going off of what others have said and taken that as fact, and obviously they (and I) are the ones who are misinformed. I formally apologize to anyone I may have offended; I tend to get heated when I think I’m right.
So, if X really is fixable, let’s fix it!! I’m not a programmer beyond the occasional shell script or PHP-driven website, but I can contribute by offering feedback regarding bugs, or perhaps even some technical writing. Let’s do this the F/OSS way and get *nix based systems working better than the commercial giants!
No, they really aren’t.
Your central argument is that X is inherently deficient because of the network support, which is a load of crap.
Windows has the same network support that X does, and it works in very nearly the same way. X does not use the network at all when the client and server are on the same machine. In the local-only case, both Windows and X use equivalent communications mechanisms (local IPC and shared memory) to communicate with the window server.
As far as your thesis goes, how are these systems any different?
Nor is X. When running a local server, the networking system is not involved. At all. It’s entirely possible to run X on a system with no networking stack at all, and it still works.
The latency occurs because Xlib is synchronous. An application sends a command to the X server over whatever communications mechanism you’re using (Unix domain sockets locally, network sockets remotely). Xlib then blocks and waits for the X server to acknowledge the request. The application cannot continue until the X server responds and Xlib lets it proceed.
Running locally, this causes problems because you’re forcing the OS to put the client to sleep while it waits for a response from the server. You need three task switches to deal with a single X message. Since there’s no way to batch them together, this latency ends up absolutely killing performance by causing lots of unnecessary task switches. Even on an SMP machine, the two processes are forced to take turns, rather than being able to run in parallel.
While this is stupid, it’s not a fundamental problem with X itself. It’s a fundamental problem with Xlib.
If you use XCB, you can send these commands asynchronously. The client can fire off a long list of commands, and let the X server get around to it whenever it’s able to. Most of the time, the application doesn’t care when a sequence of commands is finished, so it can just send off the commands, and either get on with something else, or go back to sleep.
Windows handles local IPC mechanisms asynchronously by default, so this problem doesn’t occur on Windows.
It’s an IMPLEMENTATION problem, not a design problem.
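To put rough numbers on it, here is a toy cost model in C (the three-switches-per-round-trip figure comes from the reasoning above; the function names are made up, and none of this is real Xlib/XCB code):

```c
/* Toy cost model for the round-trip problem described above: each
 * synchronous Xlib-style request forces a client-sleep/server-run/
 * client-wake cycle (roughly three task switches), while an
 * XCB-style batch pays that cost once per flush, regardless of how
 * many requests are queued. */
enum { SWITCHES_PER_ROUNDTRIP = 3 };

/* Xlib-style: one blocking round-trip per request. */
long sync_switches(long requests) {
    return requests * SWITCHES_PER_ROUNDTRIP;
}

/* XCB-style: queue every request, round-trip once at flush time. */
long batched_switches(long requests) {
    return requests > 0 ? SWITCHES_PER_ROUNDTRIP : 0;
}
```

For a frame that issues a thousand drawing requests, that is 3000 forced task switches the synchronous way versus 3 the batched way, which is why XCB matters even on a local display.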
It’s entirely possible to write an X server that can recover from video driver crashes in the same way that Vista can.
Following Vista’s example, you could split X itself right down the middle into two processes. You have the server process, which handles incoming connections (both local and network), keeps track of resources, and spawns the driver as a separate process. Should the driver process crash, the server process can restart it without interrupting the client connections.
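A rough sketch in C of that supervisor idea (all names are illustrative, not real Xorg internals): the core process keeps client state and forks the crash-prone driver as a child, respawning it when it dies.

```c
#include <sys/wait.h>
#include <unistd.h>

/* Run the "driver" in a child process; if it crashes (non-zero
 * exit), respawn it up to max_restarts times. Client connections,
 * which live in the parent, survive every driver death. Returns the
 * number of restarts performed, or -1 on fork failure. */
int supervise_driver(int max_restarts, int (*driver_main)(void)) {
    int restarts = 0;
    while (restarts <= max_restarts) {
        pid_t pid = fork();
        if (pid < 0)
            return -1;
        if (pid == 0)                 /* child: the "driver" */
            _exit(driver_main());
        int status = 0;
        waitpid(pid, &status, 0);     /* parent: wait for exit/crash */
        if (WIFEXITED(status) && WEXITSTATUS(status) == 0)
            return restarts;          /* clean shutdown */
        restarts++;                   /* crashed: respawn it */
    }
    return restarts;                  /* gave up after too many */
}

/* Example "drivers" for demonstration only. */
static int well_behaved_driver(void) { return 0; }
static int crashing_driver(void)     { return 1; }
```

The hard part a real implementation would face, of course, is re-initializing the hardware and redrawing client windows after each respawn, not the process plumbing itself.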
No. You’d get a windowing system that nobody would use, because there are no applications available for it, and no drivers for it.
Edit: That last bit is, of course, why no replacement for X has ever been successful – any replacement has to be able to run existing applications. It’s also why we’re pretty much stuck with Xorg – any replacement would have to have at least equivalent driver support.
The problem is that Xorg sucks as an X implementation, for a number of reasons, and some of the lower-level bits used by most toolkits (like Xlib) also suck. At least Xorg sucks less than XFree86 did, but that’s not saying much.
Edited 2009-08-17 14:20 UTC
As I said to another poster in an earlier conversation in this thread, thank you for explaining in such detail. I was going on bad information and my own bad assumptions and I stand humbled and corrected.
Here’s to hoping X can become (or be forked into) something that makes Microsoft and Apple nervous.
Edited 2009-08-17 22:37 UTC
It seems that I almost never agree with your random rants, Thom, but this one I have to say is pretty justified. I think you go a bit over the top with the MS cheerleading, but frankly it’s hard to argue with that – Windows does do this pretty much “right”.
If anyone knows a good technical explanation of why X shouldn’t/can’t be modified to act similarly, I’d like to read it.
Replacing the working display stack of XP with WDDM in Vista caused Microsoft a whole lot of pain.
Careful what you wish for.
Progress always has a price tag attached.
It did cause a lot of pain, but Linux has an advantage over Windows.
In Windows, replacing GDI with WDDM was a complete replacement, and GDI apps run on WDDM.
In Linux, they can pull an OS X – a new graphics framework can run a wrapper like X11.app, and old apps can run in that, new apps can run in the new framework.
Of course, what I don’t get is why an app must die when the X server dies, other than the whole orphaned processes thing. That could be solved by X calling another program to actually launch the app, and then figure out a way for programs to survive disconnection from their X server.
You hit the nail on the head. I agree 100%.
You’re also right about the typical Linux fanboy response, but you missed one: the “if you don’t like it, why don’t you fix it and contribute to the effort?” argument. We might see a few of those as well.
Until the graphical and multimedia stack is replaced/reworked, Linux is great for the server market where a GUI isn’t even required, but it simply cannot hold a candle to Windows or OSX when it comes to the desktop.
Edited 2009-08-15 18:27 UTC
This has not been my experience of Linux, and I am by no means a Linux fanboy. I am just not sure how you arrive at the claim that Linux cannot hold a candle to Windows or OSX when it comes to the desktop.
I have been using Linux as my main desktop for a few years now and have put it under quite a lot of strain, and I can safely say it has stacked up well against the aforementioned … I can’t say the same is true of Windows in terms of stability.
It allows me to run the same applications that I would otherwise use under a Windows environment (e.g. OpenOffice, Firefox, etc) and I have never experienced any problems with stability.
Perhaps my hardware just works with Linux ?? Without wishing to get into any argument, I would be very interested to understand why you don’t think Linux stacks up well against Windows or OSX.
Not sure why I would get modded down for stating that my experience of Linux is different to the parent poster … oh well
I did not mod you down, but I considered it because of this statement:
I have been working with and developing on Windows for about 15 years now and I have never found it to be “unstable,” especially once NT was released.
I’ve never had the problems of crashing that people speak of… but I maintained my systems (just as Linux users have become used to monitoring their systems and performing maintenance if or when needed), kept the file system tight, cleaned out the registry, etc.
Fact is, I rarely find ANY operating system as buggy as 90% of the people on the internet seem to rant about.
I’ve been a heavy user of Windows/Windows NT (3.x all the way to 7), Mac OS/Mac OS X (7.x-10.5), VMS, OS/2 Warp, and my favorite, BeOS. For each of these systems I have used them for at least a solid year, for casual use and for development purposes. None of them ever gave me problems (unless I knowingly went ahead and installed buggy products or drivers – which I have done, but I would not blame the OS if something goes wrong).
People throw up comments like they are tried and true facts, evidence to be pondered, when in reality they are just points of view arrived at through one bad experience which, if they had known the operating system better, might never have happened or never have caused them grief.
“People throw up comments about their experience…”
Right, just like Thom just did. It doesn’t matter that a great portion of people who are Linux-savvy have no such troubles, because Thom’s voice is louder and the happy people don’t have a need to submit to OSNews about the wonderful time we’re having. I will assume that he was “mad as hell and wouldn’t take it anymore” but it was a waste of time and comes off as incredibly whiney.
My advice for Thom (who, for me, has pretty much lost the slow-coming, gradual trust he’s earned for decent, fair reporting since he took over) is to lock up his keyboard until he has something real to report, not just his frustrating (and embarrassing, given his self-proclaimed title of “OS aficionado” or whatever he calls himself) tale of that time he was unable to figure out Linux. He rags on X.org and calls for its dismissal when this is such a naive viewpoint that I pity him. Here are the facts:
* Network transparency has nothing to do with performance or reliability
* X.org might suck, but X11 does not
* Duplicating the amount of work to rebuild a system with the capabilities of X is stupid
* Because of the other points in this list, *all* *other* attempts to build alternative windowing systems have failed. And I know. Because I was involved with some of them.
How’s my experience? 720p video plays fine (on my rather crappy 1.7ghz + 1G RAM and AGP Geforce 7300) *while* running compositing and deforming the whole desktop into a sphere mode, with Compiz and KDE 4.1. (I’ve even done tests with multiple videos with decent screen framerate).
Fact is, it *could* be an X.org bug, but more likely it is bad drivers, configuration, or a poorly supported graphics chipset. The fact that Thom stubbed his proverbial toe and then got mad and decided to lash out against X is a sad one, and unbefitting of someone with the experience he has.
Correct me if I’m wrong, but isn’t Mac OS X a prime example here?
On the other hand there are 199 replies to this topic.
Are you seriously saying that you never had any instability problems with Windows NT 3.51?? I don’t think that’s possible, if you really used it. It was marginally more stable than Win 95 SR3, while requiring three times as much memory.
It was the first release of a completely new operating system, of course it was unstable and buggy as heck.
I think that “marginally more stable than win95 sr3” is a slight exaggeration. I remember using NT 3.51 and Visual C++ (amongst other apps) solidly for about 2 years and it proved very stable for me.
Sure, it had some bugs and it had its moments like any other operating system but IMHO it was a very stable platform to work on and certainly a lot more stable than Windows 95.
I remember wanting the Windows 95 interface but with the stability of NT 3.51 – then came along NT 4.0
To be clear, I was comparing it with win 95 osr3. The most stable of the win95 releases. I had a win 95 osr3 computer, the computer lab had nt 3.51. I found an equal number of crashes with both, while the nt boxes had three times as much memory and had a processor that was twice as fast. The systems seemed to be equally fast and stable.
But yes, NT 4.0 was great. I stopped seeing Dr Watson come up every day ( only to have him crash as well … ).
Ah, I seem to remember I used Win 95 OSR2, but some of the problems could have been down to hardware. I have to say the most unstable OS that I have ever used was Windows 98 .. I don’t know why, but it caused me no end of headaches – could have been hardware related, but I have bitter memories of using it.
There can be instability in any OS, new or old. What I am saying is I never seemed to have the problems that many people report with Windows “Crashing on the hour!” “Crashing at least 3 times/week!” “Constantly hanging/freezing!” Maybe I have a golden touch or something, or maybe other people are cursed.
I loved NT! I actually enjoyed the melding of the old interface with the new kernel.
I would have to agree with you here. The NT based operating systems have all been rock solid on the client and the servers that I have to support.
I didn’t even really have the problems that I hear so many people say Vista has caused them. Sure it wasn’t great, but SP2 really fixed a lot of the tiny nagging issues for me. Windows 7 has been running really well since the public RC and now I’m running the RTM from TechNet. My problem is that hardware support isn’t there yet: my Synaptics drivers aren’t working and the wife’s Dell Alps touchpad doesn’t have drivers either. Tap-to-click should be disabled by default.
Anyway, I couldn’t agree more about the stability and reliability of the NT based operating systems.
I was talking from my own experience with regards to the stability of Windows (95,98,XP), but perhaps should have made that more clear in my posting.
I think you have some very good points, in that people do tend to have a bad experience with an operating system for one reason or another, which then forms their opinion about that OS, when in reality it could have been a particular hardware issue etc.
I have found Windows NT 3.51 / 2000 and Ubuntu 8.x/9.x to be very stable platforms that have given me very little to criticise. I suspect in Thom’s case the problems are mostly hardware related, but he is right to some degree that such problems (e.g. shoddy drivers, etc) should not bring down the whole stack.
Then you are fortunate enough to have never experienced an X crash. The point is that if a crash does happen, X does not handle it well and destroys whatever’s running on your desktop.
Found one only 5 comments down the line
See Comment by Luminair: “The good news is that X is open source so you can fix it yourself”
There are 2 problems with the “if you don’t like it, why don’t you fix it and contribute to the effort?” argument.
Firstly, most Linux users are not capable of fixing it. How many hackers on Firefox or some other application would know how to work with the 30-year-old X codebase? Add to that the influx of general Windows users to Ubuntu and other distros, most of whom can’t program at all. This means most people who complain about the issue can’t contribute.
Secondly fixing X in the way discussed is a big job. We need the X project behind it, not just a few complainers.
Actually, Windows Vista has learned from X. In Vista, drivers are largely in user space instead of in the kernel as in Windows XP, for stability reasons.
This has been Xorg’s approach since its creation.
So, Microsoft – thank you for imitating the X window server, 25 years later.
And these days Linux gets Kernel Mode Setting — moving parts into the kernel. 😉
Moving things into the kernel that should have always been in the kernel (and, in fact, those parts were often in the kernel in other X implementations, just not XFree86/X.org). I don’t understand why anyone thinks it’s safe or a sane idea to have usermode processes directly accessing and probing hardware. That’s the kernel’s job, plain and simple.
It seems to work pretty well for Vista/Windows 7.
What exactly is your objection to probing the hardware etc. from user mode? Are you afraid of hurting it?
Fact is, the hardware causing software malfunction is a bigger problem than vice-versa. Thus, it seems to me that hardware support should be moved out of the most sensitive software area.
Edited 2009-08-16 00:00 UTC
My first objection is abstract: hardware management belongs in the kernel. That’s what the kernel does. That’s really the sole purpose of the kernel, in fact: managing hardware and providing a somewhat abstract view of it for userspace software. It thus does not make sense for userspace code to be managing hardware directly. X is the only exception to this general rule.
The other objection is that it’s actually not that stable and leads to lockups and the need to reboot once the video card gets lost. Userspace simply does not have the features necessary to make full use of the hardware in a safe, secure and multi-tasking aware way. You can’t really run two X servers side by side right now (at least not with full acceleration and stability). Having the kernel manage the base video resources allows video resources to be safely multiplexed and managed. Additionally, things such as video card interrupts and so forth can be dealt with by the drivers, which was impossible before, since there’s no sane way to pass interrupts on to userspace.
Frequently, I have had X server crashes on VT switch. The reason is that VT switches are messy dances which involve taking away or giving back control of the video hardware from/to the kernel/X server. Suspending and resuming frequently don’t work because the X server can’t properly deal with the hardware in step with the kernel as it resumes devices (usually, the suspend script just VT switches away from the X server to alleviate this problem, such as it can).
KMS simplifies the design and puts what should be in the kernel in the kernel and leaves what userspace should do in userspace. And with device-specific junk in the kernel, the userspace drivers can be simplified and may possibly be able to be merged into a single driver, which would greatly enhance the stability and functionality of X (all resources can be pooled into working on one or two stable drivers, instead of a whole bunch of different drivers).
And, I should point out, KMS does not mean moving the whole graphics driver into the kernel. It actually ends up being about the same as what you get in WDDM (at least for the kernel mode portion [KMD in WDDM]). If all apps move to using DRI2 for rendering (equivalent to a lot of the UMD portion of a WDDM driver), then the X server will exist solely for window management and event processing and the model will end up being more or less like WDDM. I believe that is the long-term goal for X.
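The multiplexing argument can be sketched abstractly. These names are illustrative, not the actual DRM/KMS API: the kernel owns the hardware and hands out a single “master” lease, so two display servers can never program the card at the same time, and a VT switch becomes a drop/acquire of that lease rather than a messy dance:

```python
class KernelModesetter:
    """Toy model of kernel-side arbitration over the display hardware."""
    def __init__(self):
        self.master = None     # at most one client may program the hardware
        self.mode = None

    def acquire_master(self, client):
        if self.master not in (None, client):
            raise PermissionError("another client holds the master lease")
        self.master = client

    def drop_master(self, client):
        if self.master == client:
            self.master = None

    def set_mode(self, client, mode):
        if self.master != client:
            raise PermissionError("only the master may set modes")
        self.mode = mode       # the kernel does the actual programming
```

In this model, a VT switch is just `drop_master("X")` followed by `acquire_master("console")`, and if the X server dies the kernel can simply revoke its lease while the console keeps working.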
KMS only handles mode setting and other minor stuff. But when X starts, it takes control.
In the beginning, all the graphics code in NT was user mode.
Also, not all drivers in Vista are user mode.
In case you didn’t notice, we are in the process of moving the drivers /out/ of X and into the kernel; this should make performance suck less, and hopefully we won’t need to depend on X for acceleration.
Edited 2009-08-15 19:40 UTC
Windows NT graphics were originally in userspace, before NT 4. They moved it into the kernel in NT 4 to increase performance. They then moved them back during Vista.
I think MS just learned from their own mistakes, not from Unix/Linux
X.org running in user space isn’t so much a matter of proper design as a consequence of coding to the Linux/Unix structure. Linux makes coding for user space much easier than Windows does, which tends to encourage user-space development in general. I wouldn’t give X too much credit for that. Remember, the argument is that if X crashes, then so do all your apps, which is the real concern.
Windows never ran the window manager or session manager in kernel mode. *Parts* of the graphics driver became kernel mode in Windows 2000, yes, but that part has become very small.
But you’re wrong. All CSRSS.EXE does now is session management, thread management, a few other minor details and the console window. Everything else: window manager (USER), the GDI and the graphics drivers live in win32k.sys inside the kernel. All GDI and USER calls do not go to CSRSS.EXE, but instead go to win32k.sys. I can’t find any documentation that indicates that CSRSS.EXE still has anything to do with window management and graphics.
I never said it did window management. It takes part in session management. It also doesn’t do console windows (it used to, but that functionality has moved to conhost.exe).
Window management / non-client graphics is part of the DWM, which uses a client/server model as I described, and of course is all user mode. GDI is implemented in user mode. Yes, the software version makes calls into the Win32 kernel, but in Win7 the GDI acceleration goes through a new DDI (which are user-mode) to accomplish this.
The very first part of your sentence said that window management was never in the kernel, which is patently false.
I don’t think this is true either (but I’m not as certain). As far as I can see, GDI does happen in kernel mode and there’s a kernel DDI that can be used to accelerate it (there was no such DDI in Vista.. it was added in Win7).
The Window Management and Session Management does run in kernel mode. It’s all handled by the driver win32k.sys.
It’s going to be the X dev types that’ll flame you.
Personally I could probably count my X crashes in 3 years on one hand. However I have seen how much of a mess it is. Window-tearing? Bloated codebase? Granted, Xorg should work without any xorg.conf file these days which is an improvement yet it still isn’t bomb proof.
FOSS can’t be exempt from criticism especially for core things like X.
Oh, whilst we’re at it. Can linux sound be fixed please?
It’s funny, most of his complaints have nothing to do with X at all.
– Evolution crashing is obviously Evolution’s fault (no surprises there).
– The XVideo crash is clearly a driver problem, which could happen on any system (and not infrequently does). I use XVideo and now textured video and I don’t have crashes when I resize windows.
– It’s a little known fact that the X server crashing does not require all X apps to crash. They can actually disconnect and reconnect if they so choose. None of the toolkits, however, support this. So…the problem is, as usual, with the toolkits. On Windows XP and Mac OS X, if the windowing system crashes, apps are toast (and on XP, the whole OS is toast since the window manager, etc. live in kernel space).
X only lacks polish. The fundamental design and architecture is quite sound and is followed by most other windowing systems. Driver bugs, delays in getting new features developed and debugged, and lack of manpower are killing X right now.
You couldn’t have missed the point more if you were on Venus.
It could be a driver problem, and just as I said, this could happen on any system. It’s just that on other systems, such as Vista and 7, a graphics driver crash has zero impact because of proper isolation – as I detailed in the rant/article.
And yes, it’s anecdotal, but that’s beside the point. This article isn’t about the bug – it’s about the consequences of said bug.
This could be solved by having toolkits make good use of the session management features of X (and the ability to disconnect and reconnect). Then, if the server crashed due to a driver bug (or a kernel oops with KMS), the server could restart and the session manager could reconnect all the applications and it would have the same effect as what you describe with Windows 7.
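A toolkit-level version of that idea can be sketched in a few lines (purely hypothetical names – no real toolkit exposes exactly this): the app keeps its own copy of session state, and the connection layer replays it whenever the server comes back:

```python
class FlakyDisplay:
    """Stand-in for a display server connection; its state dies with it."""
    def __init__(self):
        self.windows = []
        self.alive = True

    def create_window(self, title):
        if not self.alive:
            raise ConnectionError("display server gone")
        self.windows.append(title)

class ResilientApp:
    """Keeps client-side session state and replays it after a reconnect."""
    def __init__(self, connect):
        self._connect = connect     # factory returning a fresh connection
        self._titles = []           # our own record of what we created
        self.display = connect()

    def open_window(self, title):
        self._titles.append(title)
        try:
            self.display.create_window(title)
        except ConnectionError:
            self.display = self._connect()   # server restarted: reconnect
            for t in self._titles:           # replay the whole session
                self.display.create_window(t)
```

The design cost is exactly what the thread describes: every app (or its toolkit) has to keep enough state around to recreate its windows, rather than treating the server as the single source of truth.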
Yes, but it isn’t. On the other hand, the developers’ time is much better spent making wobbly windows with Compiz.
Alas, this is true.
Someone may have already stated this, since I don’t feel like reading through all 100+ comments.
But this is some serious crap coming from the “it doesn’t do anything in Vista / 7” crowd – I have had a full-on BSOD from a graphics driver in both of them.
I haven’t had X crash in many many moons. Even on my HP Touchsmart TX2 I don’t have any Xorg crashes running Debian Sid with the fglrx driver (even though it throws up a lot of errors, since it’s not made to work fully with kernel 2.6.30)
Problems with Evolution crashing? Have you been using the same configuration files for a long time? I always use Evolution, and it stays open for days on end, with only a very random crash.
Couldn’t say the same for Outlook, which will just randomly stop receiving / delivering email and most times just requires a reboot of the entire computer to work again….
Most things (like OpenOffice.org, Firefox, Evolution) have a recovery feature.
You probably are running an ATI video card, which has notoriously BAD drivers, yet you blame X.org? My video playback is generally much smoother using Compiz and a nVidia driver than it is under Vista/7. Not to mention that PowerDVD constantly crashes. If it weren’t for having a blu-ray drive, I’d not use it. Compiz works great with Video, yet Aero gets turned off every time I start a movie?
It seems to me that the only thing that improved with Vista and 7’s new graphics driver model is that you generally can use the driver right after installing, instead of needing to reboot… but only to a certain extent.
Pretty much most of this commentary on the inadequacies of X.org are really down to a closed source driver that the X.org developers have NO control over.
Fix your Evolution config (you know you can remove the .evolution directory, but save out your inbox folders first. Or better yet, use IMAP so you don’t lose your email.) And change your video card (or driver) and stop your complaining. Sometimes you come off as a whiny little child, Thom.
http://lwn.net/Articles/248227/ <- ATI has open source linux drivers, so “BAD Drivers” are your fault I’m afraid.
PowerDVD… I have a blu-ray drive and don’t use that program.
>> Pretty much most of this commentary on the inadequacies of X.org are really down to a closed source driver that the X.org developers have NO control over.
I refer you again to http://lwn.net/Articles/248227/
As for whiny little child, really showing maturity levels there.
At the end of the day, systems work for people who learn to maintain them properly, and people from the other camp are generally going to have troubles with them.
That being said, I’ve only had 1 bluescreen with Windows7, and that was in the pre RC leaked edition. I’ve been running 7 on 6 machines ever since, and use 3 of those machines 8 hours a day at work, and another is left on 24/7 as a shared machine for the household + media server. *nix I’ve had many more issues.
Hell, I had to upgrade the bios on my last laptop (2007 model, ASUS G2S) before Ubuntu would even run those fabled NVidia drivers you were so happy with earlier. And then once I did, they still refused to render correctly to my HDTV no matter what configs (or indeed which drivers I used, I tried both the proprietary and the open ones), and even the hardcore *nix fans at work explain to me the issues they were having with nvidia drivers just last week while trying to play secret of monkey island, hardly a hard-core high-end game…
When I use *nix, I stick to console, it just isn’t worth the effort of anything else.
When I want efficiency or speed of production or compatibility, or simplicity, I go Win7. I’m sorry, but the OS wars aren’t going to be won by the OSes; they’re going to be won by the programs that they support, and the compatibility that those programs provide to an enterprise / business market.
The open source drivers that you point to are really only part way there, as the X developers state themselves. I’m well aware there IS an open source driver, but I am also well aware that AMD/ATI have only released SOME of the specs, and mostly only for older chips.
I still maintain that this was about the CLOSED source driver. I haven’t had any video issues with either, but then I only recently got the Touchsmart with a ATI 3200 HD. It mostly works. Compiz gives a white screen on the open source driver (R600/R700 is not fully supported yet)
Calling someone a whiny little child has nothing to do with MY maturity. The original article really is a bunch of “it doesn’t work here!” where not that many others have issues. Seriously, I’ve been using Linux for 10 years, and I know where it has its faults, but X has made huge leaps and bounds.
I agree with nVidia+Linux+HDTV is all sorts of a pain. Having bought a (cheaper) HDTV that doesn’t have a 1:1 pixel ratio setting, I have some huge amounts of overscan. The only part of this though that is nVidia’s fault is that they broke xvidtune a long time ago and don’t support any other ‘easy’ method of changing resolutions. Not to mention the Windows drivers let you resize the desktop, but then don’t state the proper frequencies. This again is still an nVidia driver issue, NOT X.org, which is what people are blaming this on.
By the way, nVidia also failed on some things for the Windows 7 drivers. If you click ‘test’ for a custom resolution, it’ll complain that the resolution isn’t supported, then… blank screen. No escape, or anything will change it back. Reboot? Nope, still didn’t change it back. Had to go into safe mode at 640×480… yikes, that is HUGE on a 42″ TV.
Let’s be honest, all Operating Systems have their little issues, that’s why IT and computer geeks are always in such high demand. The conspiracy theorist in me wants to say that it’s made that way on purpose….
Can you imagine a day and age when the User can just think something and have it accomplished on the computer, via neural interface or whatever? No one could make any money then. Well, unless you want to wipe the bottoms of those who are stuck in a World of WarCraft 3000.
To be clear, not all installs (hardware or user issues aside) are created equal. The reason that the Apple computers (and by extension the Amigas and Atari STs of old) were so stable is that they had ONE hardware platform that they needed drivers for, and could keep things simple. Using generic PC hardware is always a hit-and-miss adventure in stability / features.
With that in mind…. I’ve had Windows 7 BSOD on me at least 3 times (on my tablet) but I think it was due to funky drivers, mostly ones that were written for Vista, since I don’t have drivers for everything for 7.
Either way, the article was nothing but mindless drivel, and really wasn’t relevant. Complain to ATI/AMD for not releasing specs, or making crappy closed source drivers. Submit bug reports to both AMD/ATI and Evolution. I have submitted many bugs over the years; it’s not all that difficult (even under Debian, which someone in other comments on this page was saying should just use Launchpad. It’s really not THAT hard…)
Hi! I’ll clear this up myself. I’m one of the few guys here who likely has enough experience programming X11 directly.
The main issue to solve in being able to restart the server is server memory: you may have pixmaps and other info stored in the server (in video RAM), so if X crashes, all of that is lost. What Windows does is keep a copy of all the video resources in system RAM (which is kind of needed anyway if you need to page video RAM in and out for stuff like OpenGL/Direct3D). Of course, Windows uses more memory for this, because it keeps copies of everything instead of using a paging system, but it allows a clean restart.
This is probably not something best resolved at the toolkit level if you want to keep the client/server nature of X11 (it’s a lot of work and wasted RAM for the client), though maybe it could exist as something like an intermediate layer between the applications and the server (not so difficult to do). Windows has no client/server nature, so it’s a non-issue there. It’s probably not hard to do but, being honest, I’ve had zero X crashes in years and I think most people haven’t either, so it’s kind of a low priority.
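The memory trade-off described above can be sketched like this (illustrative names only; this is neither the X11 nor the Windows API): every “video RAM” pixmap gets a system-RAM shadow, which costs memory but lets everything be re-uploaded after a server restart:

```python
class VideoRam:
    """Stand-in for server-side pixmap storage, wiped if the server dies."""
    def __init__(self):
        self.pixmaps = {}

class ShadowedPixmaps:
    """Client-side shadow copies: the price paid for clean restarts."""
    def __init__(self, vram):
        self.vram = vram
        self.shadow = {}                         # extra system RAM

    def upload(self, pixmap_id, pixels):
        self.shadow[pixmap_id] = list(pixels)    # keep a system-RAM copy
        self.vram.pixmaps[pixmap_id] = list(pixels)

    def server_restarted(self, new_vram):
        self.vram = new_vram
        for pid, pixels in self.shadow.items():  # re-upload everything
            self.vram.pixmaps[pid] = list(pixels)
```

Without the `shadow` dict, a restart would leave `new_vram` empty and every application would have to repaint from scratch, which is exactly the X situation being described.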
Windows Vista kept a copy of the window buffers in system memory, but Windows 7 (with WDDM 1.1 driver) does not. Windows 7 uses device bitmaps that only exist in video memory, and if the driver restarts, these are recreated. This does result in the loss of cached off-screen / non-painting window buffers (i.e. most minimized application windows) when the driver restarts, so if you hover on the taskbar icon and look at the preview of that minimized window, it will be replaced with an icon until you restore it. But that’s a very small price to pay for the reliability improvement.
Second, Windows does use a client/server design. First, there is the session manager which has a server portion (smss.exe) and a client for each logged on user session (csrss.exe). There’s also the separate Desktop Window Manager system, which also has a server process (UxSms service in a svchost.exe surrogate) and the dwm.exe client running inside each composition-enabled user session.
Perhaps you should stick to describing the systems you’re actually familiar with?
Perhaps you should stick to reading the parent posts before answering with something unrelated. I wrote about actual bitmap objects stored in video RAM, not off-screen window buffers… you know, those referenced with an HDC handle. In the X11 model, those are stored in the server and lost after the connection is broken.
Edited 2009-08-17 00:12 UTC
Every word in my post was a direct response to inaccuracies in the parent post.
Your post said: “What windows does is keeping a copy of all the video resources in system ram (then is kind of needed anyway if you need to page video ram in and out for stuff like opengl/direct3D).”
I corrected this. You also said that Windows has no client/server model, so I corrected that as well.
I mentioned off-screen windows because those are a special case on Windows, and that is the only case where they cannot be immediately recreated if the DWM client disconnects (or the server component is restarted).
I’m not an expert on X, just Windows. But it seems to me that the biggest problem for the X model is that there’s too much application / window state that exists in the server process and if it goes down, there’s no way to preserve that state (without explicit app participation). Window buffers / bitmaps are of no consequence and should be able to be recreated just like they are on Windows. That’s why I’m confused that you brought them up.
Yet Windows keeps all your apps open.
Why talk about XP? It’s an almost 10 year old OS, talk about 7. On OS X the windowing system doesn’t crash because the drivers are written by Apple and tested.
For Thom it doesn’t matter where the problem lies. He needs things to work. And they need to be fixed.
Okay, that’s fine. I agree. The drivers are often shit.
The problem is that Thom is making an invalid accusation: saying that X.org itself is a pile of crap and needs to be done away with, when in fact all of his problems are not X problems, but instead driver problems or toolkit/app problems. If you are going to make that kind of accusation, at least get your facts straight.
You don’t get it, do you?
Under no circumstances should resizing a video window result in a complete system failure. That is BAD DESIGN. As I said: bugs happen. however, if your framework design is sound, a bug will not affect critical systems – in this case, my applications and my data.
The bug is not the issue – X.org allowing the consequences that I described is the issue.
Just like a bad app shouldn’t take the whole system down, but in Windows it can and does – games in particular are horrific offenders. That is BAD DESIGN. I don’t understand this holding it up as a holy grail. Sure, it’s less likely to hose you when resizing a video (which, btw, I haven’t had happen), but there are other ways it’ll manage to do it, just as with any operating system.
Edited 2009-08-15 19:09 UTC
That’s only because the game has locked the video and keyboard. If you have an SSH service installed, you could SSH into Windows and kill the game’s process.
A bug in the kernel scheduler will take down the whole system and no amount of good design is going to make this not true.
Siride’s point is that your arguments do not prove beyond doubt that X.org IS the framework with “BAD DESIGN” in the context of the issue in question. Pinning all the blame on X would be rather an oversimplification.
Your central thesis in this post that a bug should not bring down the system is sound. However, the conclusion that X is a “train wreck”/”pile of dog poo”/<expletives deleted> is disputable and rather insulting.
X crashed because he resized a window. How is that not proving that X.org’s a piece of crap?
Edited 2009-08-15 21:01 UTC
It proves that the drivers are crappy. The problem would be the same on Windows or OS X. Low-level driver crash = time to reboot.
Your applications and your data would be fine if GTK and Qt implemented session management under X. It’s as simple as that. Sure, X shouldn’t crash and drivers shouldn’t crash, but they do. And it isn’t just X’s fault here, so why don’t we rewrite Qt and GTK while we’re at it?
So, you say you don’t care where the bug is but then it’s X’s fault?
Let me be clear: I agree with you. This kind of crash shouldn’t compromise application data, especially given X’s architecture. But pointing at X is like pointing at a single pixel in a 10MP picture.
Then the problems run deep and are wide spread?
The problems with the toolkits are well-known and significant. And they could take the steps to make apps fault tolerant. But they don’t, just like they don’t bother with a lot of things. There seems to be very much a lowest-common-denominator approach to using features of the underlying system, rather than work on standards, or figure out ways to use different underlying system APIs on different systems. That’s why making complex apps on Linux is a pain. It doesn’t have to be impossible, nor does it require one API to rule them all. Just needs manpower and there just isn’t enough focus on the desktop to provide that kind of manpower and leadership.
Since we’ve learned from other posters that Win7 can only sometimes restart the driver, it is, by your logic, still BAD DESIGN. Not to mention that it is possible for userland apps on pretty much any OS, given the right circumstances, to take down the system. Again, BAD DESIGN. We’re eagerly awaiting your rock-solid OS that has none of these shortcomings.
Ever heard of autosave?
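The quip is fair: if applications autosaved, an X crash would cost seconds of work, not hours. A minimal sketch of the idea (all names hypothetical, not any real editor’s implementation) – a background thread that periodically snapshots an in-memory buffer to disk with an atomic rename:

```python
import os
import tempfile
import threading

class AutoSaver:
    """Periodically flush an in-memory buffer to disk, so a GUI/server
    crash loses at most `interval` seconds of edits. Illustrative only."""

    def __init__(self, path, interval=1.0):
        self.path = path
        self.interval = interval
        self.buffer = ""
        self._lock = threading.Lock()
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._loop, daemon=True)
        self._thread.start()

    def edit(self, text):
        # Simulates the user typing into the document.
        with self._lock:
            self.buffer += text

    def save_now(self):
        # Write to a temp file, then rename: a crash mid-write can
        # never corrupt the previous on-disk snapshot.
        with self._lock:
            data = self.buffer
        tmp = self.path + ".tmp"
        with open(tmp, "w") as f:
            f.write(data)
        os.replace(tmp, self.path)

    def _loop(self):
        # Event.wait(timeout) returns False on timeout, True when stopped.
        while not self._stop.wait(self.interval):
            self.save_now()

    def close(self):
        self._stop.set()
        self._thread.join()
        self.save_now()  # final snapshot on clean shutdown

path = os.path.join(tempfile.gettempdir(), "thesis_autosave.txt")
saver = AutoSaver(path, interval=0.1)
saver.edit("Chapter 1: Zombie outbreak modelling.")
saver.close()
with open(path) as f:
    print(f.read())
```

The atomic-rename step is the important part: even if the process dies between `write` and `replace`, the last complete snapshot survives.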
Funny that you cite a very recent fix in one of the world’s biggest and best-funded operating systems, and hold it against a group of volunteers and a small group of paid people – who maintain a very backwards-compatible, complex piece of software – that they haven’t done the same?
Sure, it can be done. X.org runs fully in userspace, so it should theoretically be possible to separate the hardware logic from the administrative logic. And that would probably be a good idea, too.
The truth is that this is probably the least of the X.org developers’ worries, as there are many fundamental shortcomings in the whole system, ranging from mode-setting to memory management and the other modern requirements of a graphics system with regard to rendering.
And I agree that a simple resize shouldn’t cause a crash, but “bug A caused symptom B, and that shouldn’t happen!” is about the emptiest statement there is. That’s just the way it is with bugs: they should always have had less impact than they did, they could all have been mitigated, and they all sit in code written by developers who have supposedly been spending their time on exactly the things you want. When the shit hits the fan, people always think the developers should have focused on that point more – but if they had, and feature development lagged because of it, everybody would be pissed that it doesn’t stack up against the big boys.
Sorry Thom, I share your feelings about how bad things look. But nobody is focusing on giving you the worst desktop experience on purpose. And the fact that these problems remain is probably because fixing them would require changes that would otherwise cause people’s wrath – like abysmal 3D performance, slow vector rendering, even worse video handling, etc.
And seeing how much things *have* improved on that front in the last few years – my bleeding-edge X.org with Intel and ATI drivers behaves really well, and can finally cope with animated Flash, lots of open windows, flawless video playback, and even acceptable desktop 3D (Google Earth works fine nowadays) – it would seem that something good is happening.
And sure, everything is almost always in flux, but that is partially the “curse” of open source software. Separate projects with separate timelines and only active support on the trunk or trunk + 1 release branch (which half of the time don’t match your particular distribution).
I really do like open source and use it for almost everything I do, but as much as I love everything I use, I also accept that, having paid nothing, I shouldn’t expect equal quality. That’s not what a lot of fanboys want to hear, nor what your girlfriend/parents/… expect, and it’s also not the goal of the developers – but it is the realistic truth.
Your post is quite on the money, but I have to disagree with the above.
It seems as if open source developers and advocates always claim the superiority of the open source model, claiming it attracts more developers and leads to better code – except when it doesn’t, and then, all of a sudden, we shouldn’t expect better code because there’s no money involved and they have fewer developers!
That doesn’t make any sense to me, and I find it a huge cop-out. It’s easy to brag about winners like the Linux kernel, Firefox, or the desktop environments – but then please, OSS community, own up, and don’t try to hide behind the “but-they’ve-got-money” argument when the going gets tough.
Bingo!
I partially agree with you. Saying that open source is superior is a pretty bold statement and has to be put into context.
The real major developers will hardly ever tell you that they are better, just that the open source model better suits their problem area. For Firefox, they argue that they are better because they follow open standards better (everybody can see how they do it and can influence or fix its behaviour).
Linux kernel developers argue that the Linux kernel model is better because of their commitment to long-term driver support and unified driver support. Not all drivers are equally well supported.
People who say open source is better per se are just ignorant. Both open source projects and their commercial counterparts have their own focus, which makes them different and often the best within their own area. But whether the open source or the commercial alternative is better is just a matter of opinion and circumstance.
And “they’ve got money” is not an excuse; it’s just an aspect that influences how development is done and how much dedicated development power is available. With money come resources, and with demand comes money – and with those comes the ability to make more fundamental redesigns in decisive ways.
So my point about money was more that the focus of a party receiving money is better aligned with your specific needs (otherwise you’d have bought another product). It’s just a question of supply and demand.
And that is where open source is different, for it is not really controlled by supply and demand; it’s oriented more towards itches, ego, and idealism, and not necessarily towards what the end users expect. You get what you paid for, and if that doesn’t suit you, put your money somewhere (else).
Yes, in his case it was shitty drivers – I think it must have been an older fglrx ATI driver; I had the same bug.
But that is no excuse. That the drivers are still shit after all these years mostly means they are hard to implement – there is something weird going on in the stack. Nvidia managed to pull it off properly by replacing major parts of the X infrastructure in their drivers. ATI tries to do it in a clean way and is still fighting after so many years.
An isolation layer is also missing that would kick X back to the VESA driver if a driver fails.
That point does not really matter when you are taken back to the login screen and the applications are not running any more.
I use Ubuntu 9.04 as my main OS and only occasionally have issues. But this “technically working” crap is a bit delusional.
Yes, I am aware of the fact that almost no apps take advantage of the ability to reconnect or use session management to save state. That doesn’t change the fact that the fault lies with the toolkits and the apps for not being fault tolerant, and not with the X server for occasionally crashing (which is true of any piece of software).
I think it’s a lot easier to say X server is at fault for requiring application developers to do something that’s obviously cumbersome, than to say everyone else is at fault. E.g. it’s more rational IMHO to say that one program sucks, than to say that (essentially) every other program written for Linux sucks.
All your points are valid, assuming they are all correct. I don’t know all that much about how X works under the hood, so I’m taking you at your word. But I don’t think this invalidates Thom’s point.
Although there are types of crashes in Vista/7 that can still bring the whole desktop down, the footprint of the critical components that can lead to a system crash has been drastically reduced relative to pre-Vista versions of Windows. Saying that the problem in X is the video driver doesn’t really shift the responsibility – video drivers in Vista/7 can have bugs and can crash too. The difference is that the majority of the driver code is now restartable and the system can recover from such crashes most of the time with no adverse effect on running applications (not always, but most of the time).
X may support applications reconnecting after a recycle, but until the toolkits implement it that feature of X is basically useless… I personally don’t take Thom’s rant as a rant against X specifically. He may have meant it that way, I don’t know, but he did say “X stack” which I interpret as the whole thing including drivers.
The point is that he is pointing out a specific and legitimate problem with desktop X systems, and it is a perfectly valid complaint. No amount of finger pointing changes the fact that it still sucks in this particular area. I personally don’t advocate getting rid of X, but it is a problem that needs to be addressed I think – sticking our heads in the sand won’t make it get better.
Does Xlib support disconnects and reconnects? No it doesn’t. It also makes it impossible for a higher-level toolkit, such as Qt or Gtk+, to handle this. So whose fault is it now?
XEmacs can disconnect and reconnect. If there is a bug in XLib causing abort()s for a disconnect, that can easily be changed (very easily). Now your point is moot.
No it doesn’t:
Ok, it handled the disconnect by saving the file – but it certainly didn’t try to reconnect. It exited, just like any other X app.
How old is Xlib? Why hasn’t this been fixed?
Probably because nobody has bothered to make this work from top to bottom of the client-side stack. No toolkit tries to make it work, so why should XLib devs bother making it work? There have been a few posts on the X.org mailing list regarding XLib’s policy of aborting on failure and how it’s wrong and apps should be able to reconnect, but there wasn’t enough interest to do anything about it. It clearly is a more sane solution, though.
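To make concrete what “reconnect and survive” would actually demand of a toolkit: the client has to keep its own model of every window it created, and on connection loss it must reconnect and replay that model instead of aborting the way Xlib’s default behaviour does. A toy sketch of the pattern – no real X calls, every name here is made up for illustration:

```python
class DisplayServer:
    """Stand-in for the X server: holds windows and can 'crash'."""
    def __init__(self):
        self.windows = {}
        self.alive = True

    def create_window(self, wid, title):
        if not self.alive:
            raise ConnectionError("display server gone")
        self.windows[wid] = title

class ResilientClient:
    """Keeps a client-side model of its windows so it can replay them
    after a server restart, instead of exiting on connection loss."""
    def __init__(self, server):
        self.server = server
        self.model = {}  # wid -> title: the client's own record of state

    def open_window(self, wid, title):
        self.model[wid] = title            # record intent first
        self._send(lambda s: s.create_window(wid, title))

    def _send(self, op):
        try:
            op(self.server)
        except ConnectionError:
            self._reconnect()
            op(self.server)                # retry against the new server

    def _reconnect(self):
        # A fresh server instance stands in for a restarted X server;
        # replay the whole client-side model to rebuild the session.
        self.server = DisplayServer()
        for wid, title in self.model.items():
            self.server.create_window(wid, title)

server = DisplayServer()
client = ResilientClient(server)
client.open_window(1, "editor")
server.alive = False               # simulate the X server crashing
client.open_window(2, "terminal")  # triggers reconnect + replay
print(sorted(client.server.windows.items()))
```

The point of the toy is the division of labour the thread is arguing about: the transport library merely has to report the failure instead of aborting the process; everything else (the model, the replay) is toolkit/application work.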
I believe error handling was one of the things XCB better supports. Xlib was a terrible API with known severe problems for a long time, but most things are switching to Xcb now.
So if all the fundamentals are in place, why aren’t we seeing crash-resilience yet? Is it too hard for toolkits to use it? Is the documentation lacking? Or just laziness?
Whatever the reason, the end-result is still that we don’t have crash-resilience yet…
Edit: KDE seems to have considered it and found X lacking infrastructure: http://bugs.kde.org/show_bug.cgi?id=157354 . Were they wrong/ignorant or is there something else going on?
The one-sentence reply in that bug report is not really indicative of much of anything. The only truly useful thing he said was that it was beyond the scope of Qt to handle (which is partially true, as X must fully support, in a bug-free fashion, the ability to restart connections).
The good news is that X is open source, so you can fix it yourself.
2010 = The Year of Linux
ROTFLMAO!
People who like to point out how Linux is so stable not crashing like Windows are people who don’t use their computer for personal productivity or enjoyment and only run nerdy CLI tools. The truth is that once X crashes, your productive session is gone, much like it is in Windows when you get a BSOD. I don’t care if X may restart on its own – so does a Windows system after a BSOD and in both cases, my session is gone and unsaved work is lost.
X rarely crashes for most people. I haven’t had a legitimate X crash in a long time and I’m even building X from Git master. As with windows, crashes are generally going to be caused by driver bugs.
I am a network administrator for a big company; I use Arch Linux as my main operating system with a virtualized Windows XP. I have never had to do a hard reboot or hard power-off. If need be, there is always Alt+SysRq+REISUB. But even with Arch Linux being a rolling release – always on the latest version of every package – and me using the trashy Catalyst ATI drivers, no hard power-off yet? I don’t know, but I think I use my system for far more productivity than most people. And I manage a network with lots and lots of Windows machines, which are always giving problems.
The only problem with this argument is that Thom would have us believe this happens on every install of Linux/Ubuntu. I use Linux for everyday desktop use and have for years. My Grandma does. My Mom. If the system is such a wreck, then why do they prefer it to the Windows way?
My Linux install is rock solid. It’s not used just for “nerdy CLI” tools as one user claimed. I use it for pr0n and facebook just like he does.
Just as Linux users have claimed for years that their OS is better/ready for the masses, Thom now claims that Windows is better for similar reasons: obscure technical reasons that affect only a small percentage of users.
Most people do care about crashes and for most people, in my experience, Linux crashes less and gives even novice users a sense of control and privacy that they don’t get from Windows. And trust me, I’ve given Ubuntu disks to some stone-newb users, and so far, without fail, they are impressed and delighted. Then again, they wouldn’t know what X is at all.
What’s happening here is bad drivers. The X server runs in user mode, but most proprietary drivers, as you noticed, work in kernel mode – so they can freeze your desktop.
When Vista SP1 improved copy speed, all the sites tested Vista SP1’s copy speed. When Win 6.1 improves how it handles the video driver stack, up pops a post about how great an OS handles that issue.
Why don’t you put the phrase this way: “I would never recommend Windows to my friends – it is still vulnerable to trojans, and they have to bear those ugly UAC dialogs all the time!”? Linux and OS X have never had those issues, even with 8-year-old exploits out there!
The main problem in Linux is the sound stack.
If I were to do a review on an AMD system that I have (AMD mainboard, CPU, video card), I could say that Linux with open-source drivers is way ahead of ATI on Vista 64. Vista gives me around 1-2 BSODs daily (yes, just like in the Win98 launch presentation).
Judging fairly: right now Linux has OpenGL 3.1 support, which is comparable with DX10 (not 11 yet), and vector PostScript graphics (via Cairo); it is interoperable with most networks, and it mostly plays your movies (if your driver/system combination works, same as with Vista). Comparing Win7 with Fedora 11, for me Linux simply means cutting times in half: boot time, shutdown, application startup. And this on a pretty high-end machine (quad-core, 8GB RAM, etc.), and without all the great hype about being optimized for SSDs (as Win7 claims) while failing to show it on my machine.
So extending one bad experience, via logical fallacy, into “the video stack is bad” is neither fair nor professional.
You either have a hardware problem, or there’s something very wrong there with your software setup. I regularly use Vista machines that have been running continuously, without a reboot, for weeks. BSOD crashes are usually caused by bad or conflicting drivers (including antivirus/firewall), or by bad hardware such as faulty ram (or even overclocking of course).
Yes, I really believe that Vista after SP1, and even more so after SP2, is a sound OS.
But here the point of principle is: Thom has a buggy driver for his specific video card, and because of this, instead of writing a bug report, he writes a rant.
Also, moving video drivers out of kernel space was a Vista+ feature only, so the context was wrong too: he pitted his rant against a different architecture, each with its own ups and downs. If the X server crashes, at least on my machine(s), it restarts automatically. That applications cannot continue execution afterwards is an application problem, not an X server one.
His abrupt statement – “I will not recommend Linux to friends” (because of Thom’s buggy video drivers) – is out of context. It’s like me saying: “I would not recommend Windows, as it is still vulnerable to trojans and other problems.” Want a glitch-free experience? Buy a Samsung NC10 and put Fedora 10 (or F11) on it: everything from Flash to Compiz works flawlessly, there are no viruses, and the attached video camera works too. I would not say the Vista or Win7 experience on that Atom machine isn’t worth mentioning, just for the sake of a restarting windowing server. Want an almost optimal experience on both Linux and Windows? Buy a Compaq CQ61 and you will have Intel and NVidia devices that are well supported on both platforms. Want a bad experience with Windows? Buy a machine like mine. Want a bad experience with Linux? Buy a machine like Thom’s.
But no rants needed for that…
You dismiss the ability to easily kill and restart X as pointless/useless, I disagree. Certainly in your scenario, but there are plenty of others. For me the background stuff is more important.
In Windows I’ve had misbehaving apps lock the whole damn system up requiring a complete restart, on top of other things. Different causes/faults but the result is the same. This problem is just as alien to me in Linux, as your problem is to you in Windows. And I encounter both problems in their respective environments just about equally.
Maybe Windows could learn a thing or two from Linux. Better yet, maybe they could learn a thing or two from each other. You’re not wrong when you say the GUI stack in Windows is more fault tolerant.
For the ordinary user, an X crash and a system hang are one and the same. They don’t know what the X server is, they don’t know they can restart it, and they don’t care. For them it’s quicker to just reboot anyway.
Yet the X.org developers themselves decided to go the Microsoft/Apple route and disable Ctrl+Alt+Backspace by default. Genius thinking there. And don’t mention RightAlt+SysRq+K; the original was much easier to remember due to its similarity to the combination Microsoft made famous over a decade ago: Ctrl+Alt+Del. To make it worse, most people probably don’t know what the hell the “SysRq” key even is (let alone its co-existence with “Print Screen”), and the fact that a specific “Alt” key is required (the right one) complicates things even more.
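For what it’s worth, the zap combo can be switched back on. On X.org servers of this era it is controlled by a server flag and an XKB option (option names and config paths can vary by release and distribution, so treat this as a sketch rather than gospel):

```
# For the running session only:
setxkbmap -option terminate:ctrl_alt_bksp

# Persistently, in /etc/X11/xorg.conf:
Section "ServerFlags"
    Option "DontZap" "off"
EndSection
```

Of course, that only helps users who already know the shortcut exists, which is rather the commenter’s point.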
Making stuff up isn’t any way to win an argument. Of course one app can’t bring down a Windows system. That just shows how out of touch you are.
I love 30 Rock!
😀 That is a good show.
Any particular reason you are not using thunderbird? It’s probably the best (Linux) email client out there right now. Even many of us who need to work with Exchange can often happily use it through IMAP (if you can’t, and need to use Linux for your work, contact your company IT personnel).
Because Evolution is a superior application, probably, and fits in much better with a GNOME desktop.
Yeah Thunderbird’s nice, and is the default mail client in Mint 7, which is really great. Ubuntu, but with a few added goodies and overall cohesiveness.
Very good article. Criticizing X.org on popular websites like this one is badly needed, as more people need to be aware of its deficiencies. Only when they’re aware of them may something happen.
My example is even more embarrassing. I had a mouse with a faulty cable. It would sometimes “lose the signal”, which would cause the mouse to freeze or “restart” from time to time. This was no problem in Microsoft Windows (XP, from 2001): when it happened, the mouse driver was simply restarted and it would work again. In X.org, it would sometimes crash or freeze the whole of X.
But my biggest gripe with X has always been the pathetic performance, horrible slowness, jerkiness and obscene inefficiency compared to MS Windows. And it’s only getting slower every year. My X.org desktop in 2009 is so extremely ultragigaslow that I could never have imagined it was even technically possible. I have a rolling-update distro (Arch Linux), and every time I update the system or some component, the GUI gets slower. Upgrade to a newer X.org – it gets slower. Update the video driver – it gets slower. Update the GUI toolkit – it gets slower. Update Emacs to use GTK+ instead of the archaic X GUI – it gets much slower. Update Emacs again to use modern fontconfig – it gets massively slower. And so on and so on.
And it’s more and more frequent. From 1999 to 2004, I could notice a major slowdown maybe every 2 years. Then it was every year. Last year it was maybe every 3-5 months. Then it was every month, and now, I can see my system getting slower almost every day, with each new update.
The whole graphics stack that sits on top of Linux is a gigantic failure. And it always has been – this is a chronic problem. All the inefficient, badly written, unoptimized abstract layers-on-layers-on-layers of superslow, rough, unfinished, buggy stuff.
So, will Wayland save us? Or anything else?
Bad drivers, and wayland won’t save you.
You aren’t using an Intel gfx card by any chance? The drivers have had severe regressions for a while now. I have an ATi and my performance is fine and about on par with Windows XP on the same machine (except for KDE 4, of course).
I should add that Windows also has layers and layers, as does any modern OS. That’s not why it’s slow. If the drivers don’t properly accelerate certain operations, they will be very slow. The overhead from X server internals is pretty low.
“Bad drivers” has always been such a universal excuse that I can only laugh in despair.
It’s total nonsense.
I can see this on every computer with any graphics card with any driver with any X.org settings. Accelerated, unaccelerated, XAA, EXA, XRENDER, noRENDER, whatever. I even replaced the NVIDIA card on my computer with an ATI card to see what kind of difference it makes. Absolutely no difference whatsoever in anything.
That’s a problem with X.org on Linux, not X.org in general. On FreeBSD, where there’s a proper, working, console mouse driver (moused), that handles the low-level mouse connection, things like this don’t happen. If X.org loses the connection to the mouse, you just restart moused (/etc/rc.d/moused restart) and things start working again.
I have similar problems due to a crappy 2-port Belkin KVM switch. Sometimes, switching between the two computers will cause the mouse to lock up. On the Windows side, it’s dead until I manually unplug/replug the PS/2 connector. On the FreeBSD side, I just restart moused and carry on.
Yeah, I had a better experience with X.org on FreeBSD (it was also noticeably faster, with the same driver and settings, although of course still not nearly good enough). But it would be nice if moused was restarted automatically, just like Windows does it.
Okay, Troll, I’m feeding you so you can shut up and go away. Although I think Thom is pointing his ire at the wrong target, at least he has a point.
But, where you’re concerned, the only way you could have such “ultragigaslowness” ( and one wonders why you’d keep on using any such system, unless bitching is what keeps you breathing ), is if you are still running the same machine you were back in ’99.
Go enjoy Win 7 or buy a Mac and leave us poor X-using fools in peace.
We’ll miss ya.
Congratulations, you have won the “Most stupid comment on OSNews” award for 2009.
Sorry, but you’re way ahead of me.
And, the award is not either of ours to give, anyway.
It includes nothing the *X community doesn’t already know — well, except pointless ranting.
Yes, X.org is in bad shape. Everybody knows that X.org is currently the Achilles heel of the free desktop stack.
The problem is that it’s extremely difficult to fix that. Graphic card companies write their own drivers and have no intention in actually working together.
But there’s hope. Tungsten Graphics is currently doing lots of back-end work on their Gallium3D driver layer. I haven’t heard of any “don’t take X.org down on driver crash” work yet, but if anybody implements it, it’s them.
Your “article” represents the worst kind of rant: one in which you vent and then anticipate fictional responses, as if to say, “I’m entitled to make criticisms, but don’t criticize me!”
Well, you’re certainly entitled, but why do so in an OS News article? The whole thing was like a long-winded comment to a real article. Your experiences bring no new content or information. I’ve read the same article about 1000 times in my 12 years of using Linux. It’s just… trite at this point.
Incredible! I wish I could do a thesis on how great Windows is and then flame Linux…
Anyway, as some mentioned, this is ridiculous. What have you or the auto-magical scripts done to your xorg.conf? Really, I NEVER experienced this kind of crash in 10 years! Also, you forgot to mention all the incredible power of X that still doesn’t work on Windows systems – display export, etc. As also mentioned, the development of X.org lacks two things: manpower, and true driver/hardware specifications from vendors – and THEN, yes THEN, we can compare. I did have incredible crashes on Windows ** versions that just suddenly happened for no reason. You know, the “Blue Screen of Death”. Even looking at the so-called event logs gives me nothing.
Sounds like another Microsoft-paid guy explaining how cool Windows is. Sad to hear.
Well, it is easy to reproduce: simply install a pre-9.7 fglrx driver and then switch on XV and composite 🙂
As soon as you resize a video window you will run into a total hang of either X or the kernel.
The latest drivers are better but still crash the kernel at half of the compiz effects.
So? Is fglrx developed by the X / Compiz team? Don’t mix all that up and fire on the ambulance…
I did use X with a triple screen across 2 graphics cards, and I don’t experience any crashes at all.
You want X to reload when a driver crashes? What for? To crash again? And yes, THE big problem in Linux is that vendors (except Nvidia) don’t provide good drivers, OSS or not.
So the vendors are the bad guys because software cannot gracefully recover from a driver failure?
A beta driver? Seriously?
“I installed a beta version and it didn’t work as expected! Boohooo!”
I can’t even count the crashes I experienced on Windows because of buggy drivers.
I’m not saying that X.org is without faults (see my other comments here), but ranting about a beta driver is pathetic.
How many times on a linux-related forum when you post you are having a problem with a driver will the people all scream “YOU SHOULD BE USING THE LATEST TEST DRIVER! It fixes your problems, now SHUT UP complaining already?”
Perhaps not exactly as I worded it, but I think that happens frequently.
About as many times as you would see that on a windows-related forum?
Touche!
If you’ve heard the same ‘rant’ for 12 years on how the graphics system is simply not stable enough, shouldn’t you be more worried that the problem has still not been fixed rather than the fact people are not accepting it?
No. It doesn’t concern me because I’m not affected. When I buy hardware, I actually check to see what level of support Linux offers, since that’s my target OS. The people who have the biggest complaints use Linux as an afterthought.
Yeah, you’re right. I suspect that 90% of the time, when people start defending a comment before they’ve even finished making it, it’s because they already know they’re being totally unreasonable.
I understand why this made you angry, Thom, but did you even wait to calm down before penning this article? I do actually agree that it would be nice for X to be able to recover from a driver crash somehow, but somehow I find any point you were trying to make was washed away by your repellent tirade.
It certainly would be nice for X to have this feature, but there are two things worth mentioning. Firstly, Vista doesn’t always succeed in recovering from a graphics driver crash, google around and you will see that sometimes people are left with blank screens and have to hard power-off their machines. I have actually seen this happen. I guess it’s better than nothing, but it’s not a panacea.
Secondly, though X crashes are annoying I think it’s totally unreasonable to say that X is basically unusable on a modern desktop. With a good driver X works fine, or at least no worse in most cases than the graphics systems of XP or OS X, which many people seem to get along fine with.
Anyway, for all the people with an irrational hatred of X, I suppose you will all have your wish come true when Chrome OS comes out and we have an alternative Linux GUI stack available.
To add insult to injury, he basically self-identified as having ADHD when he rattled off the list of applications that he was running. It looks like he was trying to multitask more than the OS itself. One system couldn’t take it and crashed, and the other system churned out an equally irrational error message on the web. Neither was stable enough to handle the task.
Windows can “restart” the driver only if the driver itself detects some problem on the GPU and decides to “restart” the graphics card. But it can’t fix crashes. A null pointer dereference will crash your Windows box just like it did on Linux. And Windows can “restart” the card only because modern GPUs allow it, not because Windows does something magical in its software.
So why doesn’t the same happen on Linux?
Most of these problems might be solved merely by using an alternative to Gnome.
Seriously, though, comments like this:
“The graphics stack is so badly designed that resizing a video window can bring down the entire stack”
do not a balanced piece of journalism make.
Criticising Linux can be a “risky” business, in that some supporters of it are quite aggressive. But the expectation of being accused of trolling should not be an excuse for actually writing trollishly.
C’mon OSNews, you can do better 🙁
Heh, that comment made sense because IT ACTUALLY HAPPENED. X Window has this issue, and it’s true. No matter how great a toolkit/drivers they can pull off, a change (not just a recode) is needed.
Well, I believe that it actually happened – what’s more, it shouldn’t have happened, and it’d be nice if X were better designed. I just didn’t think the way it was presented contributed much of use. Thom’s work shouldn’t have been lost, but the fact that the window was being resized at the time isn’t really relevant – it made it sound like Thom was claiming X is so flawed it can’t even handle window resizes, which is *not* the point he was trying to make. I’m sure we’d all like software to be *better*, but we could direct the debate about X.org a lot more effectively if we knew what needed to be better (and why).
I’ve had X and Windows and Macs all crash *without my resizing a window!*. The fact that I didn’t even have to resize a window to provoke a crash does not indicate a fundamentally flawed architecture any more than Thom’s experience does, except in the sense that they shouldn’t have crashed at all. The bugs behind that behaviour I saw could have been anywhere from the hardware, to the kernel, to the windowing system.
I just think the debate would have been a lot more interesting and productive if we’d started with a clearer idea of Thom’s central point: that X applications should not die because of a bug in the X server code.
What I do know about X suggests it has a decent chunk of legacy cruft on the side but I don’t see why that should necessarily prevent it from being an effective platform. Pretty much everything we use has legacy cruft, many popular Unix-related codebases have roots going back decades.
Actually, X has three problems, and I am not sure why it is so hard to fix them.
First, the drivers: only one company does it right, and that is Nvidia. While ATI is getting better, I recently had to replace my shiny Radeon 4850 with a slower Nvidia 9600 GT because of the driver situation – the ATI drivers hung at every second Compiz artefact.
(I am fairly sure the original author was using ATI as well; I remember the X hang bug on video resizing. It was fixed in the latest fglrx drivers, but they still crash X in other places.)
Second, the install process – that is up to the distros. Third, why on earth is it so hard to simply ship a graphical mode setter? Almost every distro has one, but none comes with X itself (don’t get me started on the semi-working autoconfig script X provides), and average users are unable to understand even 5% of xorg.conf!
That is one area Windows and Macs definitely get right – well, almost every other OS gets it right.
X is a mess in those three areas!
Hahahaha! I’m using an NVidia GPU and the NVidia drivers totally suck. Yeah, they have accelerated 3D, but that’s it. Power management (display dimming) does not work at all, and several times a day some weird artefacts, consisting of random stuff found in graphics memory, are displayed. The second issue is luckily purely cosmetic.
OTOH the community-developed Nouveau drivers don’t have those issues but then they also have no accelerated 3D. Switching drivers on demand is also not possible, because NVidia installs an incompatible library — replacing the default FOSS one.
The worst part about NVidia is their policy regarding FOSS drivers. The official “nv” driver is so incredibly bad, words can’t describe it. Phoronix once described it as “NVidia’s half-assed attempt of appearing open source friendly”.
The only company that does it right is Intel. All drivers are fully FOSS, developed by Intel itself. (Poulsbo doesn’t count, because it’s a PowerVR design — Intel can’t open source these drivers).
Have you even tried the FOSS “radeon” driver? My experiences with the default FOSS drivers for ATI/AMD cards are quite good. Though I have to add that I only tried older Radeon cards with the FOSS driver.
Hmm. I’m using NVidia drivers too, but for me they’re fine. Screen dimming works fine, I use it with the KDE4 power util thingy so when I pull the DC power out, the screen gets dimmer. Flawless. Also I really appreciate the support for TV-out. Plus the VDPAU stuff for GPU video acceleration is really nice. I do think Intel are very cool for doing all their stuff as open source, but the amount of grief I hear from Intel gfx users puts me off the idea of investing in their hardware for the time being. KMS is pretty sweet though.
Well, my GPU is a 9200M. Display dimming works neither under KDE4, nor GNOME, nor NVidia’s tool, and not even with the CLI tool (nvdimmer or whatever it’s called).
With Nouveau OTOH it works — without 3D. Currently I use my laptop with power plugged in most of the time so I keep NVidia’s official drivers installed. When I know that I have to be on battery for a longer time I actually start Vista, because switching drivers temporarily is more complicated than a reboot. 😉
Actually I just went and checked and I can configure my screen brightness regardless of whether or not the nvidia driver is loaded, so I suppose it must be a feature of my laptop ACPI driver instead. So you’re right, if the nouveau driver can do it then evidently the nvidia driver should be able to as well. Lame.
Since ATI released the 3D documentation for newer cards earlier this year, the open source radeon driver has been advancing at a steady pace. 2D hardware acceleration and Xv now work, but the final bits of 3D have only just begun to work and are still in development.
http://www.phoronix.com/scan.php?page=news_item&px=NzQyNg
Compiz is apparently running now on the latest ATI cards in the development versions of the radeon driver.
They work, but there is no 3D. Actually, I am pretty happy with the NVidia drivers now, but I keep my Radeon because it is faster, and maybe one day either the FOSS drivers will get 3D, or ATI, after now almost 8 years, will finally manage to get working drivers out.
But in ATI’s defense, they try to behave better than NVidia by doing a proper integration, instead of just dropping in replacements for the parts of X which do not work out for them, as NVidia does.
Anyway, the NVidia drivers work pretty well for me, so for now I am happy.
And in the end only one thing counts: the drivers should work, and work stably!
I stopped reading this rant after the first paragraph. My computer is half as powerful as Thom’s, and yet Xorg and Linux in general run smooth as silk on my three-year-old rig. I’ve been using Linux for a very long time now, and the GNOME desktop specifically on various Debian-based distros for about four years. I’ve never once had a single problem with Evolution, which I couple to my work’s Exchange system as well as my personal Gmail and Nezero email accounts. Not once has Evolution crashed on me. As for video playback: again, smooth as silk. I use Totem primarily and play all sorts of video types, including Windows file formats, as well as standard DVD-ROM discs, without so much as a hiccup. Finally, my Compiz Fusion effects are sleek and smooth on my trusty GeForce 9500 GT. Oh, and I’m running Ubuntu 9.04 64-bit with Xserver 1.6.3. While everyone’s mileage with Linux will vary depending on hardware, the very same can be said of Windows. Sure, no software is perfect and most has lots of room for improvement; however, just because your computer has problems with Linux doesn’t mean you should automatically start pointing fingers at Xorg, Totem, etc.
Thom,
I’ve used Linux for almost 13 years and over the years I’ve had funky driver errors and usually the problem has been either
a) An open source driver that’s not in the mainline tree for a wireless card or a video acquisition card; or
b) A proprietary driver for X because I wanted to have 3d eye candy from my video card and the video card doesn’t have an open source 3d driver.
Let me emphasize now, though: “usually” does not mean always. There was one instance, with a particular Debian Linux ALSA driver, where I experienced a sound regression on a Lenovo laptop. I had to do a little research on the appropriate ALSA settings to get sound working again, and the problem has not reappeared in the past 18 months (24 months?).
I’m pointing this out because after 13 years of Linux experience I’ve learned to trust the Linux kernel maintainers to do a better job providing drivers for me than out-of-kernel developers, especially proprietary driver developers. In fact, my usual suspects for kernel panics are the manufacturers of 3D graphics cards.
So I have to ask because you didn’t mention it, and because your description of the symptom is too tantalizing, did you or did you not use a PROPRIETARY driver from Nvidia or ATI?
If you did, I just don’t see how you can recommend that Xorg developers should follow a proprietary model when a proprietary driver hosed you!!!
You should be aware that ATI’s record of providing a stable proprietary driver has been less than stellar, and while NVidia might have a better record I would not consider them to have as good a record as the Linux kernel folks who based on 13 years of experience have done a better job for me.
My laptop is nowhere near as powerful as yours. I have a Mobile AMD Athlon XP 2500+ with a crappy 64MB ATI Radeon Mobility card. Totem plays ALL the media files I’ve thrown at it without issue, and plays them smoothly. Wake me up when Windows Media Player or QuickTime can do that. Oh, and Compiz? Absolutely nothing compares.
I don’t think there’s anything fatally wrong with X. And if there is, you have NOT pointed it out. What needs work is the graphics drivers. Unstable and shoddy drivers will always lead to a terrible user experience.
What’s disappointing about the article is that it’s not constructive. It’s an empty thoughtless rant that provides no path to a solution.
It has been a long time since I have seen such a poorly written rant, which so totally fails to provide any details, evidence, or even a coherent argument appealing to some presumed good faith, to back up the “point” it claims to be making. And I’m not limiting that to just OSNews. I mean *anywhere*. This is exceedingly notable, if not singularly so.
Beyond that basic summation, I see no point in commenting further. And I would recommend that others consider doing the same. There is little constructive discussion which can occur on the foundation (or lack thereof) that this editorial provides.
I rarely find myself defending one of Thom’s articles, but here it goes.
Poorly written… OK, maybe. But details and evidence? Of what? What he is ranting about is a problem X has had pretty much forever. If X crashes, everything goes with it – and there is a LOT of stuff that can cause X to crash. What evidence do you want? I mean, this is pretty much common knowledge.
As things stand now (as far as I understand), while X itself is restartable, the applications running at the time of a crash have no mechanism to reconnect to the new session afterward. Whether it is X, the drivers, the toolkits, or the DEs that are at fault isn’t really relevant to the problem – namely, if enough stuff goes FUBAR in the X stack, you lose _everything_. As others have said, X itself has mechanisms to support applications reconnecting, but since these mechanisms are not adopted by any actual toolkits, they go unused and are therefore useless for desktop systems.
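The missing piece that comment describes can be illustrated with a toy model: a client loop that treats loss of the display connection as a recoverable event and redraws from application-side state, rather than exiting the way Xlib’s default I/O error handler does. This is a hypothetical Python sketch only; `connect`, `work`, and `DisplayLost` are invented names, not real Xlib or toolkit API:

```python
import time

class DisplayLost(Exception):
    """Raised when the connection to the display server drops."""

def resilient_session(connect, work, retries=3, delay=0.0):
    """Run `work(conn)` against a display connection; on DisplayLost,
    reconnect and redraw from application-side state instead of dying.
    Returns the number of reconnects that were needed."""
    reconnects = 0
    while True:
        conn = connect()                # (re)establish the connection
        try:
            work(conn)                  # draw windows from app-side state
            return reconnects
        except DisplayLost:
            reconnects += 1
            if reconnects > retries:
                raise                   # server is gone for good; give up
            time.sleep(delay)           # brief back-off before retrying
```

A toolkit built in this shape would let a restarted X server come back with the user’s windows intact, at the cost of every widget keeping enough state to redraw itself.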
For once Microsoft did something right, and he is pointing it out in comparison to X. It is a valid criticism IMO, and is something that needs to be fixed for X to remain competitive in the long run. I don’t particularly care for the overall tone of the article – some will take it as nothing but X bashing – but if you look beyond that, he has a point…
Well, according to another user, the behavior with video playback Thom describes is caused by using beta Radeon drivers. Maybe I missed this detail in Thom’s rant, or he left it out for the sake of ranting.
The pre-9.7 comment wasn’t talking about beta 9.7 drivers, but the released 9.6 and 9.5 drivers. AMD fixed the bug in their July release.
Anyway, I don’t think anyone is going to seriously argue that X is leading Windows 7 in this area, but it’s not all unicorns and candy there, either. I’m getting crashes (in Vista) whenever I watch video with BitTorrent running in the background. The fact that it requires heavy network usage and video playback at the same time makes me think this is an issue with the kernel running out of some resource, although I suppose it could just be a driver issue of some sort. It hard-restarts my machine with no warning, which I don’t get nearly as often when booting into Linux.
I guess the moral of the story is that different hardware can give you very different results.
Your problem actually sounds like it’s more likely hardware related. If your system simply restarts without warning (no blue screen, no error recovery dialog on resume asking you to send problem data to Microsoft), then most likely your power supply is insufficient for your GPU when under load.
Nice article. I’ve lately been thinking about a Windows 7 box for all my non-development tasks – I develop apps and drivers specifically targeted at Linux, so I can’t move off it as a work platform. But for the non-development work, I’m seriously leaning towards selling out to Windows 7 for my day-to-day needs after years of using only Linux (I think Linux and/or BSD has been my primary OS for 15 years now).
Good luck with it. Windows file locking always kills every effort to move my development towards Windows. I am currently on a Mac, where I get both worlds: no file locking and a stable UI.
Unlocker. It’s small-f free, and lets you remove file locks on the fly.
Well, for Windows file locking it does not really work out that well: every time you run into a lock situation you have to manually unlock with this tool, and believe me, if you trigger build processes from the command line, then go into a deployment dir to start a server, etc., this gets so annoying that you instantly want to format the hard disk and put a proper Unix onto it, file unlocker or not.
Add to that: if you are unlucky, the virus scanner enforced on your machine scans into your IDE, and file locking suddenly makes your IDE crawl, or crashes it while you are in the middle of an important change.
I really hate Windows for development, no matter whether there is a file-unlocker tool or not. Good thing I can normally work under Unix; only on customer machines do I have to use Windows!
Well said, Thom, I completely agree with you. Unfortunately, X is not the only problem with the Linux desktop. I cannot allow myself to entrust all of my valuable data to Linux as a desktop OS.
I have been using Windows 7 for the last month and man, I love this thing! This thing flies… It’s fast, it’s sleek, it’s polished, it’s rock solid, and it just works! Something the Linux desktop can only dream of (in its current state). I was on XP all the way before, so I completely skipped Vista.
Wow, that’s exactly how I feel about Windows.
Excuse me, Holwerda, but Vista has been out barely two years and sucked rocks the whole time, and Windows 7 isn’t even out yet, not to mention both of them being totally proprietary code.
So exactly how and where does X “learn” from Microsoft?
Does anybody else see how stupid this rant is? Holwerda is running as the typical user, windows and documents open all over his desktop, unsaved and no automatic saving enabled, then he opens some huge frickin’ 1080i video which grinds his system to a halt in a video player I wouldn’t touch with a ten-foot pole because I know better, then he installs a NEW video player which SOMEHOW proceeds to trigger a crash SOMEWHERE in the system which takes down his desktop.
For this, he goes on a rant about X when he doesn’t even know if it had anything to DO with X per se.
I’ve had KWin crash a number of times on openSUSE 10.3 and 11.0 and take everything down. SO WHAT?
This is an idiotic rant from somebody who experienced a crash and got pissed off about the state of software in the wide world.
News flash, Holwerda! Let me ‘splain it to you in plain English.
SOFTWARE IS CRAP! WINDOWS IS CRAP! LINUX IS CRAP! IT’S ALL CRAP! BUT – Linux is FREE CRAP!
As Woody Allen explained the human condition in five words, with special application to IT, making him the greatest philosopher of the 20th century: “Nothing works and nobody cares.”
So suck it up. Curse all you want, but going off on a rant about X and praising Windows just makes you look like a lame end user who doesn’t understand what the hell is going on.
If X has problems, X will be fixed sooner or later. Until then, STFU.
Huh? A crashing window manager should take nothing down — it just leaves the windows without title bars. Just rerun the window manager or use another one.
When I said “take everything down”, I mean the window layouts and open apps. The kernel is fine. I don’t bother to play with it, I just CTRL-ALT-Backspace and relog in. Rarely happens, although in fact it just happened a couple minutes ago.
It always happens when I’m doing something in Firefox. It’s almost certainly Firefox’s fault, but I have no clue why. Although I have Konqueror, Opera, Kaffeine, and Thunderbird open, plus a couple of Twitter apps, nothing else is doing anything at the time. So I’m pretty sure it’s Firefox’s fault, which is no surprise given the bloat in that browser and its various bugs. This is especially likely since I just installed 3.5.2 yesterday.
If it weren’t for the very useful extensions I use, I’d dump Firefox for Opera. Of course, it’s possible the extensions are to blame for this behavior as well.
In any event, while I rant about Firefox, I’m not ranting about X or Linux or whatever like Thom is, nor am I recommending that Firefox “learn” from IE8. That’s my point.
If kwin crashes, your apps should still be open, you just can’t move them around anymore.
Or did you mean that kwin crashed and brought X down with it?
I don’t bother to check. If something goes wrong with the window manager, I clobber everything and get back to normal. I feel that’s the safest approach. At least I don’t have to reboot.
“Until then STFU”
Sorry to harp on only one point at the end of a long post, but if he STFUs how is anyone developing the X stack going to know that there’s a problem which potential linux users care about? Maybe because of his public rant which clearly delineates the behavior he wants to see, some of the folks behind X will consider changes to allow for session reconnects.
I’m sure the people behind X know all about session reconnects and probably have them on their roadmap.
If Thom wants sessions reconnects, just say so. Ranting about how Linux could “learn” from a Windows OS that is widely regarded as crap, or a Windows version that isn’t even out yet is not going to help.
I installed Firefox 3.5.2 just yesterday. Today I notice Firefox has a lot of trouble reloading certain Web pages – they even call it embarrassing in the dialog! – and guess what? Yet another KWin crash!
So if Thom wants to rant, let him rant about the bug ridden Mozilla browser. Just about every time I’ve had a desktop crash, it’s been when Firefox was doing something, either manipulating pages or doing some sort of media thing (downloading video or something).
X is the least of my worries day to day.
So you fit into the “calling it his fault” category.
And, telling people to STFU about a problem with X until it’s fixed… how are X.org devs going to fix that problem if everybody’s STFU’d about it?
Again, the X people undoubtedly know all about this sort of thing. Eventually they’ll either fix it or somebody will fix it for them or somebody will do what Thom wants – make a new server.
Ranting about how Linux could “learn” from a Windows OS that is bloated crap or one that is a re-made version of the bloated crap one and isn’t even out yet is not going to impress the X people.
In this rant, Thom is just doing what I said in my post above – he’s complaining about crap software. DUH! IT’S ALL CRAP! That’s the nature of the software industry at this time. Deal with it!
I’ve been using X in various forms almost exclusively for 10 years now. Some with Solaris, some with FBSD, a bit with OBSD and mostly with Linux.
The funny thing is, despite my having run games through Wine, native games, and a plethora of applications – and I watch movies or series EVERY day – when X has crashed it has mainly been due to some pissy application. Is that what this is about? The fact that a single application shouldn’t be able to bring X down?
While that in itself is a shame, I can’t even remember the last time X crashed on me – I don’t think it was during 2009 – but that’s because I get rid of pissy applications. You sort of make it sound as if a pissy application can’t bring down the graphics system on Windows.
I may be wrong – but I certainly don’t share your view of the instability of X. On the contrary, I dispute it vehemently.
Just for the record, I’d like to add that a few years ago, say, 5, I was using NVidia with my Linux machine.
It was great, I had no problems whatsoever. But, 2 years ago, once AMD bought ATI, I decided to go with them instead. Why? Because they were open.
The ride has been long, painful and full of small glitches. But now, I see that nvidia is dropping support for cards, and they’re behind in supporting their latest cards etc. While ATI’s drivers are open, free and they’re starting to be quite good.
So, what you’re describing in your article here.. I’d think it’s either a driver or application issue, not an X issue in itself. But, you’d naturally want to change it so that a driver, or an application through a driver, can’t f–k up X.
That’s great, and I fully support that. Nice effort, but if so, I think it could have been worded way better (and this I write after 2 bottles of wine).
If an application running in userspace can cause any part of the window management system in Windows to crash, it’s considered a bug in the Window Manager, and efforts are made to fix it (especially since the Window Manager is on the other side of the user-kernel boundary).
hunting -> haunting
Interesting article, btw.
It’s not so much design, it’s mentality. Personally, when I write code I assume my code contains bugs, even when I have tested it and could not detect any. Therefore I include fallback scenarios and test those as well. Many developers find this a waste of time, saying the code should just work; however, I think fallback scenarios are not there for the developer – they are there for the user, in case the developer makes a mistake.
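That mentality can be made concrete with a generic fallback chain. This is an illustrative sketch only; `first_working`, `hw_decode`, and `sw_decode` are invented names, not any real API:

```python
def first_working(strategies, *args):
    """Try each strategy in order; return the first successful result.
    The fallbacks are not for the developer who believes the fast path
    just works; they are for the user, on the day it does not."""
    errors = []
    for strategy in strategies:
        try:
            return strategy(*args)
        except Exception as exc:        # assume any strategy may be buggy
            errors.append(exc)
    raise RuntimeError(f"all {len(strategies)} strategies failed: {errors}")

# Illustrative strategies: prefer hardware decoding, fall back to software.
def hw_decode(frame):
    raise RuntimeError("driver bug")    # the path that "should just work"

def sw_decode(frame):
    return f"decoded {frame} in software"
```

The fallback path has to be tested too, of course; an untested fallback is just a second bug waiting for the first one.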
Wow Thom, you succeeded in raising up those pointing fingers! Well, I have to say that I agree with everyone else’s comments, but I also agree with Thom. Ultimately this should not be a situation where people start blaming one group or another. True, there may be some flaws in the way Xorg handles things. Maybe the toolkits (GTK, Qt) aren’t programmed well. Maybe the drivers aren’t written well or tested thoroughly enough. But ultimately it doesn’t matter. The main issue is that Unix and Linux are the greatest examples of piece-mealing ever. People love to try to make it sound more glamorous than it really is (“it’s modular”), but it’s not really as pretty as all that. I use Linux as my primary desktop system, have a Mac that my family uses, and maintain Windows systems for a living, so it’s not that I’m trying to flame anyone here. I’m just calling it as I see it. The only reason Apple and MS can put out a more polished final product is that they control every aspect of the development process. Linux is developed by thousands. It’s true that having so many people with their hands in the mix can produce greater strides and fix things faster than a closed-source model, but it also results in a system where not everyone agrees on how things should be done. Having different ideas is great, unless it means that not all of your system’s parts work well together and the end result is a user having their system crash while performing a routine task.
http://bugs.kde.org/show_bug.cgi?id=157354
Resolved WONTFIX.
So it seems like maybe it isn’t just the toolkits that need to implement it, and X is missing stuff? Or is the dev here just wrong?
X may be missing something. I had been under the impression that GTK already supported display migration (though nothing seems to exist to perform it), which is essentially what is needed. In the case of a driver crash, I suspect X would need something added to catch it and perform a display migration somehow.
I have used Windows from 3.11 on, and I have used Linux since Ubuntu 7.10, and I have encountered issues on all OSes. Sometimes hardware, sometimes driver or software related. I have even encountered the resize crash the parent poster of this article describes. I can say that this is just what can happen when using a GUI OS on a PC. Shit happens!! I have had some really shitty cases with XP. I have seen crashes in Vista that should not have hung up my whole system, when I was just browsing the web, which should not crash my whole system! So this article seems pointless, and to discount Linux because of a crash on a window resize sounds like a child crying because his brother won’t share his toy. Wow, what a waste of a web page! Write about something worthwhile, or since you have so much time on your hands, why don’t you give some technical assistance to the people who volunteer their time on the X.org team instead of writing this trash! Thanks for your open eye!
It’s pretty incredible to me that in the OSS world, where there’s supposedly so much focus on choice, there isn’t an alternative to X.org anywhere. The last one I remember seeing – Fresco? – just died out of lack of interest.
I guess the reality is that we, as a community, are basically incapable of creating or innovating anything. We rely on proprietary companies and copy their work, while decrying them. The users just complain and don’t care enough to contribute.
We can’t even design and implement a functional windowing system, despite having total source-level access to the kernels and the gui toolkits that are used in 99% of apps for our platforms.
We have unprecedented freedom to change things, and yet we are totally unable to.
Oh well, I guess just wait until Keith Packard quits and then copy how Apple or Microsoft does things?
It’s pretty incredible to me that in the OSS world, where there’s supposedly so much focus on choice, there isn’t an alternative to X.org anywhere. The last one I remember seeing – Fresco? – just died out of lack of interest.
Why do you need an alternative? To split the community?
I guess the reality is that we, as a community, are basically incapable of creating or innovating anything. We rely on proprietary companies and copy their work, while decrying them. The users just complain and don’t care enough to contribute.
so… one project has no alternative and this is the end of the world?
We can’t even design and implement a functional windowing system, despite having total source-level access to the kernels and the gui toolkits that are used in 99% of apps for our platforms.
Windowing systems? There are a whole lot of them. X.Org is a graphics system, not a window manager.
We have unprecedented freedom to change things, and yet we are totally unable to.
Obviously you have way too much freedom to spew bull.
…the software, the base display software that all the other components layer on, should be able to fail somewhat gracefully.
X is not crap; it is complex, and it really does work well as long as what it is built on is reliable. But throw a wrench (a bad driver) in the works and it can become painful. But, as Thom tries to point out, should it crash and take all your other applications with it (in a default configuration), forcing you to log in again?
I don’t know. I don’t THINK it should, but I have some issues on Mac OS X as well… games crashing and not letting you get back to the desktop, so you have to ssh in, or just kill your machine if the ssh option is not available to you.
Thom: I couldn’t DISAGREE more with you… The problem is obviously with your setup / card driver. I have had many different machines, from desktops to laptops, and when you install a good graphics driver and configure it correctly… I really haven’t had X crash more than once or twice in about 5 years!!… which is considerably less than Windows crashes. So, besides the fact that X needs polishing in so many places, I think X is way ahead of the Windows system and offers far more functionality, especially with easy remote and local window management over shell and CLI. Also, part of its instability, I guess, comes from its portability: the Windows graphics subsystem only ever has to worry about ONE OS, while X is good for 90+ platforms, so I guess it isn’t fair to compare two (for a reason) obviously different systems. I think that your review, with all the respect I have for you and your work of course, is not based on the facts that matter. Just get “the damn” driver fixed, and you will be surprised how well it works… Also, yep, I agree GNOME is crap and Evolution too!!… Wish you all the best!
You utterly missed the point. And you threw in some lies just to make your comment even more of a troll (you don’t use Windows, but throw in unfounded claims about Windows crashing more, which just aren’t true).
The point was that there shouldn’t be a single point of failure, or at least that any single points of failure (i.e. kernel) should be minimal in size. Losing all your apps because of a graphics driver crash is something Windows solved 3+ years ago, but is a problem that still plagues Linux with X.org.
If you can get a non-techie to do the bolded part above correctly, your post will have a valid point.
Maybe, but not even nearly as much as this article. Based on your wording… I’d say: not good hardware for Linux, or at least very poorly supported hardware.
As for restoring state? Bull… I couldn’t care less. THE DRIVER SHOULD NOT CRASH AT ALL. And it doesn’t for me. I doubt I have seen one X or kernel crash in 5 years.
BTW, I’m really curious what your so-called advanced use is… My usual needs are Anjuta, MonoDevelop, lots of terminals, Evolution, OpenOffice, and at least 30-40 browser tabs open across different desktops. Sometimes I go more than a month without closing a single window while working. Yes, I’m a developer.
Then again, I buy my machines with linux in mind and follow HCL when buying machines.
Most of the posters here seem to be jumping in to say “Yeah! X sucks! Down with X!” I know hating X irrationally is a popular sport, but come on here!
You know what Linux fanboys will *ACTUALLY* say? I can tell you, I’m a Linux fanboy.
I agree.
That’s what. The problem actually cited here is a problem and should be fixed. There is no way in hell that the fix is throwing out X. Does this suck? Is it an actual problem with X (as opposed to, you know, all of the imagined problems most of you usually cite as the reason why you think we should kill X)? Yes, yes it is. Who could deny it? The easiest fix for this problem is this: “Make X crashing not kill open windows.”
It’s more complicated than that, I know, but we’re talking loosely here. Something to fix it is already in the works and will be done sooner or later (SIYH). Once that’s fixed, recovery from a crash improves.
Recovering gracefully, as you see in Windows, would be a small step from there.
So, DOWN boys! I know you smell blood in the water, but that doesn’t mean you actually have the ability to sink your tiny teeth in and make the kill.
I think (but am not certain) that was supposed to be a central point in the (not overly coherent) editorial. But it’s mixed in with so much irrelevant, over the top, and frankly, rather bizarre and vitriolic stuff like claims of not being able to trust Linux with his data, a completely unrelated sub-diatribe about Evolution, and defensive sounding preemptive attacks upon people who use Linux before they even get a chance to say anything, that it’s hard to tell.
I quite agree that throwing out X is one of the more inane ideas which floats around at OSNews. And the misinformation which floats around “in support” of doing such a ridiculous thing is pretty tired.
However, mixed in with all the irrelevant ranting and raving, this piece might actually have hit upon something new. Applications die with the X server. That’s actually just a more specific part of the more general idea of making X more robust in the face of adversity. But still. I suppose we ought to give the author credit for it.
For my part, I administer about 80 X servers in 3 cities, on a variety of hardware, and have done so over the last several years. And have not found X reliability to be a problem. (Most run Evolution, BTW.) But still, I’m all for improving reliability further.
Other posters seem to have put the lie to the author’s claim that Windows 7’s video is crash proof.
So I will just finish by mentioning that there is one thing which any new X replacement will absolutely have to do to ever gain any traction whatsoever. And that is to… implement X. If it can’t do that and do it well, you can forget its chances of getting anywhere… unless its devs are also willing and able to port and maintain The World to their “wonderful” new IncompatibliX display server.
I believe you’re right and I think it is a problem. Network transparency may be significant in the UNIX desktop world, but it’s an infinitesimally small factor in the overall desktop market. Unwavering adherence to the X architecture means that X will always be, fundamentally, a networked pixmap engine with an abomination of a modern, hardware accelerated graphics stack bolted on the side.
I like Linux– the kernel, the GNU userland, even the toolkits and some of the apps that sit on top of them. It was the only OS on my desktop 14 years ago. I’d like to be able to use it today. But things have changed significantly since then. A modern graphics stack is important to me, and as long as X wants to play gatekeeper to my graphics hardware, Linux won’t have a place on my desktop.
And I think we’ll see this become more and more of an issue as time progresses and X falls further behind.
Is network transparency even relevant in X today with Direct Rendering? Do the compositing Window Managers work properly from a remote connection?
This doesn’t make sense. The compositing window manager runs on the machine that has the keyboard, mouse and monitor; not on the program server that is actually running the program. So the window manager is communicating with its local X server, not the remote machine.
Network transparency is the popular whipping-boy for X critics. So what if there is “window tearing” when you move windows? Everything else runs fast, and there’s no tearing when you use a compositing window manager.
X is very stable these days and, LIKE on Windows Vista and 7, a bad graphics card driver can bring it down. It would be cool if it didn’t, but at least we’re not behind the competition.
I beg to differ on the video tearing….
Because X is asynchronous, you have to set vsync when using a compositing manager. This essentially means that you limit the frame rate to 60, which in turn adds overhead and slowness to compositing managers – certainly to KWin (I don’t know about Compiz, but last time I checked this was the case there too).
Without vsync, compositing managers are smooth as silk; however, you get hideous video tearing in OpenGL apps and videos. I find this situation pretty much lose-lose and certainly not up to par with OS X and Windows. There is no solution for this due to the inherent architecture of X, which simply was not designed for desktop usage.
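The frame-rate cap the comment describes can be illustrated with a toy calculation: a frame that finishes rendering just after a vertical blank has to sit and wait almost a full refresh interval before it is shown. This is only a numeric sketch of the timing argument, not real compositor code:

```python
# Toy model of vsync latency: a finished frame is only displayed at the
# next vertical blank, so it can wait up to a full refresh interval.
REFRESH_HZ = 60
INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms between vertical blanks

def vsync_delay(ready_time):
    """Seconds a frame finished at ready_time waits for the next vblank."""
    next_vblank = (int(ready_time / INTERVAL) + 1) * INTERVAL
    return next_vblank - ready_time
```

A frame ready 1 ms after a vblank waits roughly 15.7 ms before appearing; without vsync it would be shown immediately, at the cost of tearing.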
What about the Direct Rendering stuff? Can the rendering of OpenGL apps be sent over the network? Aren’t rendering systems like Clutter and Cairo moving towards using OpenGL for their drawing?
It’s possible to send OpenGL over the network in X, since “lol I can’t remember how long ago”.
That’s what GLX is about.
But do the compositing window managers work in remote scenarios like the DWM does on Windows? I’m really curious about that (I would guess that they do, but I’m not familiar enough to know).
As for your last paragraph, the main point of this article is that on Windows, a bad graphics driver most of the time cannot bring down the system, whereas on X it is guaranteed to. If the graphics driver is hung or crashes on Windows, the kernel will restart it. Similarly, the hardware can be restarted if a hardware fault is encountered.
It seems perfectly reasonable to suggest that X.org / Linux should adopt a similar model.
It seems reasonable to request that X be given the ability to gracefully recover in a similar manner. To suggest that it should adopt a similar graphics model to Windows in order to achieve this one thing is nothing short of crazy.
Oh, not AGAIN with this network transparency crap. First of all, it costs you nothing. Nothing! Secondly, we weren’t talking about needing to keep network transparency; what he mentioned was that any successor would have to implement X, if for no other reason than that we don’t want to rewrite 20 years’ worth of applications overnight.
If you want to talk about doing an X12, or something, to add support for non-pixmap rendering (I’m assuming you want something like NeWS here, vector based), then go ahead! A lot of people would be interested, me included. This doesn’t have anything to do with not adhering to the X architecture. What’s wrong with it? Outside of your imagination, I mean.
And what is it about X that is not modern or being added? What is so fundamental about X that precludes it growing the ability to do whatever it is you think it can’t?
I am really at a loss. Unless you can cite something that is actually wrong with the way X is put together I’m calling shenanigans.
I’d give more credit but it’s nowhere close to an original idea. I’ve been asking for a way to move X apps between servers for years, primarily so I can move them to a dummy server and back (thus allowing them to survive an X restart).
As long as X is used in any form close to what it is now, Linux and the *BSDs are doomed to the current niche market!
OSX has shown that a UNIX can be used successfully on a consumer machine with a solid GUI. So it can be done!
I’m an FBSD geek and wish to hell that there were something else out there (besides OSX) that I could run my stuff on reliably while running UNIX! But I fear that will never happen!
Until it does, I’ll continue to use OSX! The only true option for a UNIX geek who is not an X geek!
Are there any other GUIs that can be used besides X? (I know it’s not OSS, but how about BeOS’s GUI, or an OSS project to replicate it? …just a thought)
KRR
As a FreeBSD geek I should think that you would be happy that The Standard GUI for all free and open source operating systems is under a BSD-style license. Do you realize that any replacement would probably be under the GPL?
Yeah I notice random hard-locks with the FOSS Intel drivers and the performance is generally slow. The performance of proprietary drivers is better but the stability is worse. So there is no happy medium when it comes to video drivers on *nix.
X needs to handle driver crashes far more gracefully than simply hard-locking the system or, in the good cases, causing X to restart. There should be some sort of background daemon that monitors drivers; if they crash, they should automatically be restarted or reverted to a least-common-denominator driver (without disrupting the user). The daemon should log crash events as well. If something like this exists, I’d be very interested in knowing about it.
Another issue is that applications often terminate while in use without any warning. Launching the apps from the terminal will usually leave an error message in the output, which is typically obscure, such as a segmentation fault.
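The watchdog idea in the comment above – a daemon that restarts a crashed driver and logs every event – could be sketched roughly like this. Everything here is hypothetical: the supervised command, the restart limit, and the fallback step are illustrative placeholders, and no real X.org tool works exactly this way:

```python
# Hypothetical sketch of a driver/server watchdog: supervise a process,
# restart it when it crashes, and log every crash event.
import subprocess
import time

def supervise(cmd, max_restarts=3, log=print, backoff=0.1):
    """Run cmd; on a non-zero exit (a "crash"), log the event and restart.

    Returns the number of crashes seen, so a caller could decide to fall
    back to a least-common-denominator driver after too many of them.
    """
    crashes = 0
    while True:
        result = subprocess.run(cmd)
        if result.returncode == 0:
            log(f"{cmd[0]} exited cleanly")
            return crashes
        crashes += 1
        log(f"{cmd[0]} crashed with code {result.returncode} (crash #{crashes})")
        if crashes >= max_restarts:
            log(f"{cmd[0]}: too many crashes, reverting to fallback driver")
            return crashes
        time.sleep(backoff)  # brief pause before restarting
```

The hard part a real implementation would face is restarting the driver *without* disrupting the clients that were using it, which is exactly what the thread is arguing about.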
http://www.x.org/wiki/ModularizationProposal
Hopefully this will lead to something that meets Thom’s desired goals.
I think Thom experienced the equivalent of a bluescreen, just like I have had with Vista Media Center.
I think the only valid complaint is in the experience he had leading up to the crash.
Problems with resizing video windows etc, should be a thing of the past.
Uhh, that was done years ago and it has nothing to do with the structure of a running X stack, only the structure of the source code and the build system. X is already very modular and extensible. It lacks manpower and time to debug all of the relevant issues.
That work was mostly completed by 2005, I believe, with the release of 6.9/7.0. However, it *was* a time-consuming prerequisite to X moving forward. So in a sense, what John says is sort of relevant, though not, I think, in the way he originally intended.
David Dawes and his reign of tyranny have turned out to be fairly expensive. (And it is a really good argument for forking sooner rather than later when dealing with a not-so-benevolent DFL who is, in fact, a poisonous personality. See the current cdrtools controversy for a variation on that theme.) This *was* all years ago. But we are still, to some extent, recovering from its effects. Much of what is happening in X now (e.g. GEM) really should have gone in years ago. Though I should hasten to add that one should be careful blaming *too* much upon XFree86 at this late date.
I feel like this article and even the comments are talking about some alternate universe… I’ve thrown Ubuntu on a few dozen friend and family computers in the past year or two and they all work flawlessly; they are constantly thanking me for the change.
90% of these are also running Compiz out of the box and everything works great (in fact, I can’t seem to get things working right WITHOUT Compiz anymore). Many of these are older computers barely capable of XP, or that could not run XP because of blue screens of death after installation.
So anyway I feel like I’m in some fantasy world where everything just works great for me while everyone talks about these “issues”.
weird
I have had mixed results with GUIs on both Windows systems and Linux. For the past 3 to 5 years I had been an Nvidia fanboy, but after a spate of constant beta-release drivers and an 8800GT that hard-locked the system on both OSes when it went down, I decided to give AMD/ATI a go.
Since then I haven’t looked back. Sure their drivers haven’t been a magic bullet of computing but they release regularly and fix things. On both Linux and Windows 7 I have gone from Hard locked crashes to no crashes. With earlier AMD/ATI drivers if I did get a crash it would have been in a game and the OS/Driver would nicely dump me to Desktop. Something that never happened with my older Nvidia hardware which required the 1 finger salute.
Personally I use Win 7 x64 as my main desktop for work, rest and play, and that includes graphic design, audio work, media playback to a FullHD plasma, and bouts of Killing Floor, Crysis, GRID and COD 4/5. My current AMD/ATI 4870 1GB and its drivers have given me a great and stable experience on Win 7 that continues to get better with each driver release.
Now for X.Org: it is clunky, but again, coupled with the right hardware it can be rock solid. Still, its advantages are in networking, not personal desktops, and even multiple desktops are no advantage nowadays, as AMD/ATI drivers provide that functionality for Windows users.
I wonder how Haiku will evolve on the desktop with modern hardware and its capabilities utilising said hardware. BeOS destroyed similar OSes in this light back in the late 90s. Neither Linux nor Windows 2000 could hold a candle to it, but at the end of the day – and why I’m still on Windows – the apps count, and no alternative OS yet gives me the choice and complete computing package that Windows does.
Even when I try utilising open-standards, cross-platform software where possible, there is stuff that just isn’t covered by it all.
Strangely, I have problems with every single OS I ever use, including random crashes, things that worked yesterday failing today, bizarre unkillable respawning processes, etc, etc, etc. Frustrating to say the least, especially when facing a deadline or dealing with an outage.
That said, I never seem to have the insanely vexing issues I see reported here. I have never had an xorg/video driver issue take my entire desktop offline, for instance. I do not use any compositing at all though, and maybe that has something to do with it.
I’m also never quite sure what people mean by “real work” or “serious work,” but it sounds bad. Anything that can turn a person into such a saber-rattling sourpuss can’t be good for the soul.
Oh, and CTRL+S FTW.
“Totem loaded up, but playback was unbearable. My entire quad-core 4GB powerhouse was brought to a screeching halt, and the video playback was choppy, audio lagging – it was terrible.”
The fix:
# yum remove pulseaudio
# totem-backend xine
Why are you trying to convince us that Xorg sucks? If it does (for you), just switch back to Win!
Personally, I never encountered such problems, and my Xorg-based OS is perfectly stable.
Never heard people complaining about the Win 95-98-ME-XP-7 blue screen of death?
Just one thing: Linux is a matter of participation. If you find bugs, report them and help fix them. If not, use Win.
When you report bugs to Windows, they actually consider the input and fix them. When you do it to Linux, hardly anything happens : you’ll get “not a bug, do not submit again !” or “we cannot reproduce (understand : it works here on my own computer, dude, so get lost)” or in the best case “we’ll take care of it later (if ever)”. The “you have access to the code, fix it yourself” motto ain’t the best answer ever : not everybody is a hardcore coder with years of experience in the kernel, some are dumbass users that just want to give Linux a try, and they get pointless and unresponsive support from the so-called community when they run into trouble.
Fed up with these noobs and their input ? Then just abandon any hope of seeing Linux accepted by the masses. Microsoft – the evil king – is at least listening to its own crowd ! The Linux land is so puffed-up sometimes, it’s scary…
Kochise
I had quite the opposite experience (though that was some time ago), even with a ‘support contract’ for which good company money was spent…
Good and bad support can be found both in the proprietary and the FLOSS world. FLOSS has the advantage that because everyone has access to everything, you’re less dependent on the original vendor – but otherwise, it really depends on the situation.
FLOSS, FLOSS, FLOSS, you still depend on : the quality of the code, the quality of the comments, the quality of the documentation, the quality of the community that would check in your changes (if EVER you were able to make any), etc… OK, you have access to everything, now tell me *WHY* Linux is not 10 years ahead of Windows when it is open-source software and everyone can get his/her hands into it ? I bet there are more coders working on the Linux kernel (and related stuff) than on Windows, *YET* Linux ain’t far more advanced than Windows ; it even lags behind, always copying Windows’ new features and themes…
Kochise
Because, despite arguments to the contrary, $$$ (and fear of losing house and home) is a far greater motivator of men than having their name tied to some open/free code or rep++. The amazing thing isn’t that linux/bsd/whatever isn’t ahead of windows in whichever metric you’d choose to discuss, the amazing thing is that it’s so close to parity with so little actual monetary investment.
So little what ? Gimme JUST a raw average of the man-hours spent by the community on all the Linux distros, then multiply by an average coder’s $$$/hour to get the “actual monetary investment” spent on it (the time spent on Linux ain’t spent on something else, other coding or otherwise). Plus, Linux is 15 years old and should have grown better, brighter, …lighter also !
Kochise
Sorry, I can’t parse that.
OK, I’ll make it easier to understand : SWITCH-TO-WINDOWS ! Or… WIPE-THE-PAIN ! Got it ?
Kochise
I understood the “switch to windows” part, the rest of it, I’m afraid not. I take it that English isn’t your mother tongue, and that’s probably confusing matters. FWIW, in most English speaking countries it’s not customary to place a space in front of punctuation.
It took me some time to get down my tree and go out of the jungle.
I am an Ubuntu fanboy. The 9.04 version is riddled with problems with graphics cards. As the pace of evolving versions is breathtaking, they don’t even care to repair the bugs. So what you have today is a stable unstable situation with ongoing unrepaired bugs. Intel graphics card owners are of course well served by this (the failure to produce reliable drivers), but I know of other card owners that experience serious trouble too.
I think you attack the wrong culprit. X is not perfect, but I can quietly watch videos on my old Pentium M for quite a long time. Even Totem, for a long time a pain in the ass, works decently now. I still dislike Evolution like you do, though (so many crashes? Surprising), and I am happy enough with Thunderbird and web mail.
For me the real culprit is Ubuntu 9.04. Jaunty is buggy and will go down in history as one of the worst failures of the otherwise happy Ubuntu family.
Canonical doesn’t test anything. Period.
A few releases ago they had an installer that would not let you leave the mount points screen. What does that say? That the installer WAS NOT TESTED AT ALL! No way that bug could have slipped by with ANY testing.
Use OpenSUSE. Rock solid compared to the xBuntus. Only problem I have with it is occasionally a screwup in the updater applet (NO distro gets updating RIGHT!), and Firefox can crash KWin periodically (which does NOT lead me to start ranting about how Linux can learn from Windows.)
Seriously!
I’m not saying Vista/Win7 or X is better, just that I don’t have any problems with X running my desktop. But then, I don’t use Gnome or KDE – Fluxbox for me.
Still, when I was running Ubuntu and Gnome I did experience some X-related problems, but I put them down to Gnome, not X, because after removing Gnome and just running Fluxbox, the problems went away – so go figure. Also, Gnome in Debian proper is much more stable than in Ubuntu; I would think that’s your problem. Ubuntu has constantly gotten more bloated and unstable with each release, IMHO!
But anyway, any improvements to Xorg that come from this piece are always welcome.
thom wrote: “I’m sure the usual suspects are already busy churning out comments about how this is the fault of the driver, VLC, myself, the 30 Rock video, planetary alignment, anal probes, whatever”
An anal probe, done at the exact moment when Mars is aligned with Uranus, will muck up X.org.
I have a Toshiba laptop, which is a fairly good machine, with a Turion X2 CPU and 3 gigs of RAM.
The laptop came with Vista preinstalled, and since I think Vista is a big pile of crap comparable only to Win ME, the first thing I did was remove it and replace it with a better (read: proper) operating system.
Because the smart boys at Toshiba decided to cripple the BIOS and remove the AHCI option, I can’t install OS X and make it a proper hackintosh, so the only option was to install some Linux distro.
I went for Ubuntu, since it’s based on Debian, which I like very much. I use my laptop in my office, so it’s mainly text editing and browsing. Everything was working, including Compiz, so I was happy. For about an hour or so.
This is where I have to agree with Thom. X.org is indeed a pain in the ass to use. It’s slow and unresponsive. Even window resizing looks like it’s done on a PII. I don’t really think that disabling full window resizing and replacing it with a frame, just like it was done in Win 95, is the answer. I also don’t think my ATI video card is crap. If the driver is bad, that does not mean the hardware is crap.
After a period of using Ubuntu and cursing X.org, the Win 7 free evaluation version came out, so I decided to try it.
Well, I have to say, as much as I dislike Microsoft for its business model and some crappy products (IE comes to mind), Win 7 works great. It does not crash, it offers a pleasant visual experience, and everything works.
Again, I use the machine at my office, so I don’t have time to fiddle with it. It needs to assist me in my work, and not crash when I have all my documents open. The fact remains that on the same machine, Ubuntu sucked (I also tried Kubuntu with KDE 4, and I wanted to throw the laptop through a window), and Win 7 flies. They got rid of many problems that Vista was plagued with; it’s much more responsive now.
Of course, the biggest problem in Windows IMO is the pests that are crawling the net, and I’ll always need an antivirus and some adware removal tool. Using Linux really feels like freedom, not having to worry about viruses and stuff like that. I wish one day I could use a Linux distro as an everyday OS, but I think that’s a long way off. I also wish I could find a way to install OS X on my damn laptop.
I agree that Windows 7 is definitely going the right way and that X can be a pain in the butt at times.
I’d consider Windows 7 to be more stable than what X currently is. Part of the blame goes to the people responsible for the distro configs and part to the people behind X, but it’s largely due to the drivers (blame ATI, Nvidia, etc.).
Some distributions work a LOT better with X, and I’ve found Ubuntu to generally be the worst distribution of them all; the default configurations suck, hardly ever work, and you have to just open up the terminal and fix it yourself (which you don’t have to do in Fedora, for example).
Also if the problem with X applications not properly disconnecting and reconnecting during a server crash is with the toolkits, someone should DEFINITELY fix this ASAP. If I was the project manager for those toolkits, I would make it priority #1. But in the FOSS world, it seems more leet if it crashes horribly every now and then, and then you also get to do some driver and hardware manufacturer bashing.
But still, Windows 7 is far from perfect (and all previous versions are considerably worse), and the drivers aren’t that isolated either. If you really think that Windows 7 with Windows Update is so perfect, here’s a small story for you:
I reinstalled Windows 7 on my trusty little box with Nvidia graphics. Windows 7 nicely found drivers for it and downloaded them from Windows update.
After a while I noticed that my computer kept on getting blue screens every 3-4 hrs or so, with a weird message that didn’t tell me anything (DRIVER_POWER_STATE_FAILURE). The automatic diagnosis also failed completely.
After googling for some time, I found that some people have had this problem with some USB devices, so I disconnected everything except my keyboard and mouse. Still got bluescreens.
More googling, and found out that this might be fixed by updating your graphics drivers. Checked Windows Update, no updates available. I then went to nvidia.com, downloaded and installed the latest drivers.
No more bluescreens after that.
I gotta say, most of the time when they did “crash”, it seems to have been handled gracefully. For example, at one point they seemed to crash every time I resized my Windows Media Player window (sound familiar?), which usually only resulted in some flicker and the theme switching back and forth.
But also, I sure didn’t think it was OK to revert to Aero Basic every 2 minutes, especially since, if it happened enough times within some period of time, it completely refused to start the compositing manager or whatever.
So, as a conclusion:
No, X is not perfect, but it’s not like these problems don’t exist with Windows either. You sound like a typical fanboy that sees all the problems in other systems than the one they like and ignores all the faults in the one they like.
I really never had anything like those Xorg crashes, ever. This surprises me as well, because I am using the third-party proprietary Nvidia drivers.
I also never heard of that kind of problem from the ZevenOS user base.
Thom, perhaps you could try another user-friendly desktop that you know well (BeOS-like), based on a very stable Debian system.
ZevenOS Neptune. (http://www.zevenos.com)
Xorg is a bottleneck for Linux on the desktop atm. But there are lots of people working on a rewrite of Xorg, fixing all the annoying things in it.
A replacement would bring much more problems and difficulties. A better and improved Xorg is the answer of most of the problems I think.
Windows 7 has an improved graphical system. They have developed it since Windows NT, and it’s very stable right now. But I wouldn’t use Windows, because I don’t want to have Microsoft spying on my computer.
And I suppose the author will also agree that putting the whole Windows graphics stack into kernel space is a good design decision? He doesn’t mention his graphics card, which I suspect is ATI, since they have the most terrible Linux support imaginable. And ATI (or whatever graphics card he has) not supporting Linux is X.org’s problem, because…?
Being a student and all, he sure fails to diagnose the problem. I agree that X has some failings, like running as root (which can now change) and that X clients disconnect if the server crashes. But the server should not crash in the first place.
So the article is bullshit, yet (see quote) you agree with me completely.
Eh…?
I agree that X.org, like most software, isn’t bug-free. But I think you exaggerate with X.org being a total failure. For the most part, badly written, closed-source drivers are to blame. X.org has been around for a while and the code is fairly stable. But apparently some idiots at AMD/Nvidia think it’s best not to release driver source code and to write bad software. IMO, X.org has a fairly solid design.
I don’t care where the bug lies. The issue I’m raising is not about the bug – it’s about the consequences of a bug. X.org has NO isolation. A simple bug in a distant corner of the stack will take down EVERYTHING. That is BAD. That means your design is inherently flawed, as the results of a bug should be contained, not given a free pass to wreak havoc.
Well then your title is misleading. Saying “X could learn from Win7” sounds like X is directly responsible for your problems and it isn’t because your problems are elsewhere. If you read the comments you will see that most people don’t have such problems. And saying you don’t care where the problem is is just ignorant. Being a Linux user you should know better. And here starts the trolling. You don’t know where the problem actually is but you immediately start attacking X.org.
Hmm… reminds me of the Nvidia drivers on Windows Vista. They brought down the graphical system as well.
Most of the time the only result is this:
http://i.zdnet.com/blogs/wddm_timeout.gif
If you are prepared to not use the closed-source drivers, then:
http://www.phoronix.com/scan.php?page=news_item&px=NzM2MA
A Root-less X Server Nears Reality
This will, of course, only work for cards with open source graphics drivers. AFAIK one can’t get to this functionality with binary blob drivers, due to licensing issues of having to put parts of the driver in the kernel.
Once X can run rootless, it can then be “isolated”.
The design of X isn’t inherently flawed (as it can be made to work very well indeed with open source drivers) … rather trying to run badly written closed-source binary blob drivers (which cannot be debugged or even properly tested) as part of the open source kernel … that is the problem. Fortunately, a solution is close.
You misunderstood the role of rootless/kernel mode.
Kernel/user mode is a separation at the hardware “ring” protection levels that a CPU has, so that a crash in one ring is isolated from the other rings. So a crash of your application will not break the kernel (which is in another ring in most OSes), drivers, etc. The x86 CPUs have 4 rings; 0 is reserved for the kernel, and 3 is for user space.
A rootless X server will not solve those kinds of problems, because it only provides better separation, making it possible to run two X servers corresponding to their respective users.
When the video driver crashes, your kernel is still running. Thom suggests that in the case of a video driver crash, the X server should restart the corresponding driver. This cannot happen, as the current X.Org architecture tries to move more power to clients (see the Google Tech Talk about X.Org).
The solution lies in toolkits becoming aware of X server restarts, or of an X-server crash-and-restore cycle. Or, of course, a completely new architecture is needed, either at the X server level or at the widget toolkit level. Neither of those will happen fast, and certainly not overnight.
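The toolkit-level idea in the comment above can be illustrated abstractly. A plain TCP socket stands in for the display connection here; this is not real Xlib or toolkit API, just a sketch of the retry-instead-of-die pattern a restart-aware toolkit would need:

```python
# Sketch of a reconnect-aware client: instead of exiting when the display
# connection drops, keep retrying until the server comes back up.
import socket
import time

def connect_with_retry(host, port, attempts=20, delay=0.05):
    """(Re)connect to the display server, retrying while it restarts."""
    last_error = None
    for _ in range(attempts):
        try:
            return socket.create_connection((host, port), timeout=1)
        except OSError as err:
            last_error = err
            time.sleep(delay)  # server may be restarting; try again shortly
    raise ConnectionError(f"display server never came back: {last_error}")
```

A real toolkit would additionally have to recreate its windows and re-upload its state after reconnecting, which is the genuinely hard part.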
I wonder if movie studios that use Linux on workstations also experience entire X crashes.
The movie gets delayed
I think (and no one cares) an interview with an X developer and a technical explanation of the issue would have been 100x more valuable than this rant.
Hopefully we will see more interviews and interesting technical articles in OSNews in the future.
Yes, but interviews require setting up a time and place and then transcription. It also requires cooperation and a modicum of mutual respect. At least it does in cases where no publicity is needed/wanted, and there is no $$$ to be made by either party as a result of said interview.
Rants, OTOH, simply require a scowl, a keyboard to punish, a feeling of self-importance, and a forum. When faced with this decision, especially in the face of “wanting to put idiot [insert OS here] users in their place”, which avenue do you think the majority would choose?
The purpose of a rant/editorial is not to make the author popular or whatever other bullshit – it’s to make sure a problem gets debated. Seeing there are over 200 comments here, from both sides of the fence, this editorial succeeded in its goal.
People are now discussing this subject, and that’s exactly the goal. A rant need not be eloquent or extremely detailed – it needs to make people discuss and think.
And that’s exactly what happened.
Yes, there’s lots of comments but it seems that most of them are just “works for me” / “X.org sucks” non-technical commenting.
What would be nice is to get technical explanations of why things are the way they are, what we can do to make things better, etc. – comments from core X.org developers. I doubt that users ranting and chitchatting makes things any better, no matter how many comments there will be.
I don’t think you meant to respond to my post. Either that or you’re rebutting points I didn’t make.
To address you point about post count, I’m certain you know that post++ doesn’t mean a thing in and of itself. Launching an incendiary into a crowd of people is extremely likely to have a dramatic effect, but seldom makes for well structured or well reasoned discourse. Discussions of these types, with very few exceptions, result primarily in an assembly of keyboard warriors keybanging fiercely back and forth.
Or do you disagree?
X.org in its current state is an abomination. It is holding back any potential Linux has to make a dent in the desktop market. It must be fixed immediately.
On my friend’s laptop, Win 7 would freeze every time we tried to play video in fullscreen. It did not matter if it was Flash video or Windows Media Player. Rebooting the computer was the only solution.
So it sounds like Linux has already learned from Win 7.
While I agree on the fact that X is a disaster, here is what happens when the graphics driver crashes on Windows:
– you get a blue screen
– the only way to restart, is to restart the whole computer
– of course you lose all your data
XP? Hello, NT 6. It throws you a command line and seconds later you’re back into the GUI. I think it may even try to use the default graphics driver if that fails.
If you’re going to comment, please use information from this decade. As the article clearly describes, that is not true. In the vast majority of cases, a crashed video driver on Windows will result in a momentary flicker as the graphics subsystem is restarted. Applications are entirely unaffected. To them it is the same behavior as if you switched graphics drivers, connected remotely to the session, etc.
My response to the e-mail sent to us with no return address.
—
Hello Sir,
Whilst I do not speak for OSnews, nor Thom, nor any other member of staff; I think I can say that Thom’s point was not X itself. In fact the OS is irrelevant; Thom is basically saying that in the year 2009 software shouldn’t be doing this. Who precisely is to blame—X, drivers or otherwise—is besides the point. A user who sees a crash is going to blame the system, regardless whose fault it actually is. The graphics stack in Windows Vista and up is robust enough to mitigate bugs caused by others and keep the user’s apps running. Data loss is a sin of any OS, and any self-respecting OS should do the best it can to deal smoothly with other people’s bugs!
Essentially, I believe, Thom is saying that there is no excuse in this day and age. *Any crash* – X or not – that loses the user’s effective data (because they don’t know how to recover the system) is unacceptable.
Don’t think technical, because if you start looking at the technicalities of X and the Linux stack in general, then you end up unable to see the forest for the trees.
Kind regards,
Kroc Camen.
Software crashed, crashes, and will crash.
As we are all humans and make mistakes, there is no way that software never ever will crash ever again.
We need to talk about the technical issues, because who is going to fix the problem when we don’t know the technical background?
If you are asking for a better experience in the future, you need to address the right people. In my opinion X.org is on the right track.
If third-party driver vendors wrote drivers that ran in user space, not the kernel, the problem might be easier to solve.
If they released drivers as open source, it would be much, much easier for all of us.
The problem is with the article being misleading to anyone who doesn’t understand the issue. It’s true that it doesn’t matter what caused the crash from a normal user’s view. But the article should have at least /some/ explanation. In its current state it’s just spreading FUD. If the author’s point is the X architecture, then he should clearly state that. And since architecture is deeply technical, statements such as “for the user it doesn’t matter” become pointless.
Any OS or piece of software has bugs and crashes.
When Chrome starts up with its multiprocess architecture, Thom will need to write a rant about how anyone can survive losing data when a Flash plugin crashes.
The robustness of a process and other applications' relation to it is not necessarily the right concern. What if Outlook or Word crashes and the database/file you are working on becomes corrupt? Being a bit more robust at restarting their windowing system means nothing in real life. Buying a tested Linux hardware configuration to keep your experience smooth (as offered by Dell) will keep you from worrying about restarting your driver configuration.
Also, this "robustness" comes at a price. Linux can run smoothly on a nettop, while the equivalent counterpart, WinXP, will break in the same way Linux does.
Windows also has a huge pile of garbage software that makes your system sluggish, unstable and sometimes unusable, mostly if you download small tools from the internet.
I would like to see a counter-rant about how Microsoft does not have one nice interface for updating your entire software stack, as Linux has had for more than 5 years in almost any distribution with a GUI. That is really not acceptable for the year 2010 we're in! Or a way that, when you click on a site that needs a codec, shows you which codec you need after an internet lookup. And I don't mean those popups that say your computer is in danger and you need to get a lot of scamware.
You mean like this?
http://www.osnews.com/story/19711/The_Utopia_of_Program_Management
http://www.osnews.com/story/21135/Blind_or_Deaf_Program_Management_…
http://www.osnews.com/story/21714/Blind_or_Deaf_Program_Management_…
Mea culpa, you are right, those were written. But the point was a different one: you used your bad computer experience with Linux to argue that a specific piece of code is bad.
The point was this: you showed a use case that fails on your machine. It is annoying, for sure. In most normal cases the video driver crash should not happen.
For me, going to Windows, I get popups in the tray announcing that there is a software update for Adobe Reader or Java; some other updates go through Windows Update, etc. When an exploit for Windows goes into the wild, it mostly happens because users do not upgrade their software stack (mostly Flash). I mostly use Fedora on my netbook and laptop and Ubuntu on my desktop (you are right: I picked this combination for the best video driver stability), and I know that updates are only a matter of going to one application, with no annoying popups.
Showing a use case that annoys a user (like losing data because of data corruption or video driver problems) on any OS should be done in a way that doesn't come across as anti-X/anti-Linux FUD. I am not sure if you intended it, but that is how it looks by far.
I worked with RHEL and the Linux driver stack for air traffic control software, and the drivers are guaranteed not to crash for at least 2 years (we used nVidia Quadro configurations). The entire software stack must not take the system down for more than 2-3 minutes per year across the entire control system (meaning 10 machines), so they really do work (almost) flawlessly. And those 2-3 minutes don't cover only video driver freezes; they can mean network problems, memory leaks, etc.
And for them, Windows (I am talking about client software, not server) was out of the question, be it 2K, XP or Vista. The only alternative OS could be Solaris.
I always stop reading when someone compares a freshly installed Windows box with Linux and screams hallelujah about the speed; wait a few months and watch them pull out their hair in frustration.
If I had all the troubles with Linux on my desktop like you described in this article, I wouldn’t touch it with a ten foot pole.
But I have used Linux since 1998, and since 2002 exclusively at home on all my machines, and in all my years of usage I have never had as many crashes as you described in such a short period. But then I'm one of those people who longs for their own Linux desktop after using Windows XP or OS X for a long period at work. I program a lot on different machines, being one of those who sees an OS as a tool and uses the best tool for the job. Because of the weird behaviour of Windows and OS X, I find multitasking on Linux a breeze compared to those two; on them there is always something hogging the system for long periods, and everything comes to a screeching halt.
Edited 2009-08-16 11:44 UTC
There’s your problem. They fixed that in NT 6 with Superfetch.
My Windows 7 install has been operational day in, day out, only restarting for updates, since 25-04-2009, 12:13:26. So it's not a fresh install.
The Ubuntu install, however, was only a few days old.
OK.
Now install OpenSUSE and reproduce your problem.
Bet not.
People need to stop thinking “Ubuntu” when they think Linux. Canonical software tries to be cutting edge AND user-friendly at the same time and fails at both. THIS is why you have bugs that drag down the whole system.
OpenSUSE can crash its desktop, too. I blame Firefox for that since it’s always Firefox doing something when a crash occurs. And occasionally Kaffeine will crash itself over some video issue, with no effect on the desktop.
The rest of the time OpenSUSE is rock solid.
So try it instead of Ubuntu. I bet your problems go away.
There IS a difference in distros in terms of reliability.
As if the threads under this “story” were not already troll-infested enough. Here comes one from *yet another* camp and direction. Please stop.
Edited 2009-08-17 01:26 UTC
My response is perfectly relevant. Thom is blaming X for his problems, when in fact he doesn’t know WHAT his problems are.
I’m saying Ubuntu may well be one of his problems. Especially since it’s a new install for him. So try another distro.
Why are half of the comments here from people stating how their experience differs from Thom's, and therefore Thom's experience must be a 'lesser' one? You folks are completely missing the point he is trying to bring across with his fiery incantation.
Strange how prose meant to draw a knee jerk reaction leads to that. Appeal To Emotion FTW if you want to stir the masses rather than effect change. Rabble rousing is a fantastic way to rally troops if your intent is to go to war. It is epic fail, OTOH, if you actually expect those being lambasted to listen to your argument or you expect reasonable discourse to follow.
I have been using an ATI Radeon 9200 (RV280) for a while on my desktop and an ATI Radeon 9100 on my laptop. I've always used the open source drivers and never had any crashes at all. So I guess all this shows is that the "latest greatest" graphics cards are not really well supported at this point in time.
The bottom line is: if you can't live without getting a new PC every year or two, then Linux will never work perfectly for you. So better go use Windows 7 and spend all your money on hardware so it doesn't perform like shit.
You are missing the point. He wasn’t complaining about the crash. He was complaining about the fact that all applications crashed too and he lost all the data in them. I agree with him, there is no reason why application data and graphical display can’t be separated allowing the system to gracefully reinit itself and still retain the application windows and data.
The "article", though, is nothing but a poorly-written rant.
I understand that, but why would one need a way to mitigate crashes when there are no crashes? Well it could be nice. But that depends on how much additional RAM/CPU cycles it would require to have this.
I understand that, but why would one need a way to mitigate crashes when there are no crashes?
There are crashes. Just look at the article.
But that depends on how much additional RAM/CPU cycles it would require to have this.
Why would it take any more than it already does? One solution would be to just split X server into 2 processes; one which retains all the information regarding windows, their contents, sizes etc, and another one which does all the drawing. When the drawing-one crashes the other one will still keep running and can restart it, submit the window data and POOF, problem solved. It would not have much of a difference — if any at all — performance-wise, but when a crash happens the result would be very different from what it is now.
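Purely as a hypothetical sketch of that split (every name here is invented; none of this is real X code), a small Python model: a "state keeper" process owns the window bookkeeping and restarts a separate "renderer" process whenever it dies, resubmitting the saved state:

```python
# Hypothetical sketch, not real X: a state-keeping parent that survives
# crashes of a separate drawing process and restarts it transparently.
import multiprocessing as mp

def renderer(conn):
    """Drawing process: receives window state over a pipe and 'draws' it."""
    while True:
        msg = conn.recv()
        if msg[0] == "draw":
            # A real renderer would rasterize via the driver here; we just
            # acknowledge which windows we painted.
            conn.send(("drawn", sorted(msg[1])))
        elif msg[0] == "crash":
            raise RuntimeError("simulated driver bug")  # kills only this process

class StateKeeper:
    """Owns window state; the renderer can die without losing any of it."""
    def __init__(self):
        self.windows = {}  # window id -> geometry; survives renderer crashes
        self._spawn()

    def _spawn(self):
        self.conn, child_end = mp.Pipe()
        self.proc = mp.Process(target=renderer, args=(child_end,), daemon=True)
        self.proc.start()

    def draw(self):
        if not self.proc.is_alive():  # renderer crashed: restart, resubmit state
            self._spawn()
        self.conn.send(("draw", self.windows))
        return self.conn.recv()
```

Applied to X, the `windows` dict stands in for all the server-side bookkeeping; whether such a split could be done without hurting performance is exactly the open question, but it shows why a renderer crash need not take the state (or the clients) with it.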
Isn’t that what session management is supposed to do? Save the state of a running apps and restore them?
It would indeed be session management, but it would be built into X, not the toolkit. Toolkit-based session management has a few issues: usually when X crashes your toolkit is also taken down, you need to manually restart the apps, etc. But why restart the apps at all when they don't even need to be killed in the first place?
I suppose all these issues could be fixed at the toolkit level too, but not every toolkit would implement it, and it would just mean reinventing the wheel for every new toolkit. Just implement it once, properly, in X, and everyone benefits.
Sounds good then.
Maybe it could be at the Xlib level so that all toolkits benefit. It's probably a bit tough to design, though, because when a new X server starts up it could be difficult to enumerate all the apps which need to reconnect and tell them who to reconnect to (basically, how do you know which apps belong to which session?).
I don’t really understand why the toolkits need to be particularly involved.
Maybe, but in the long term it will be better to move away from Xlib as the base for toolkits to something more appropriate, e.g. XCB, which is intended to be used by toolkits, not as a direct application-level API.
I don't think this is a problem, because the restarting process is the server, and the clients can still connect to the same address.
Depending on the type of connection, the clients either have to try to reconnect (remote connection) or wait for the local socket to become available again (or similar for whatever transport they happened to use).
It mostly depends on what resources the application used in its UI. All drawing of text and graphics primitives can just be repeated, just like when repainting the window due to being unhidden, etc.
More problematic are things like images, which have to be “uploaded” again (in quotes since when using shared memory they might not move at all).
Of course the process of "uploading" is also repeatable; the question is whether the application still has access to the image data. It might have deleted it so as not to hold it in memory twice (in the case of a real transfer), etc.
Actually, while thinking about it, the application developer might want to know when the disconnect and reconnect happen. Maybe for suspending certain operations of the software, similar to how networked applications stop attempting network operations when notified that the network went down.
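As a toy illustration of this discussion (purely hypothetical names, no real X or Xlib APIs): a client could keep its own copies of everything it "uploaded" and replay them after a server restart, with the disconnect/reconnect notifications mentioned above exposed as callbacks:

```python
# Hypothetical sketch: a client that can replay its uploads after the
# server restarts, notifying the application around the outage.
import time

class ReconnectingClient:
    def __init__(self, server, on_disconnect=None, on_reconnect=None):
        self.server = server                 # must offer .alive and .store()
        self.images = {}                     # client-side copies kept for replay
        self.on_disconnect = on_disconnect or (lambda: None)
        self.on_reconnect = on_reconnect or (lambda: None)

    def upload(self, name, data):
        # Keep our own copy: the server's copy vanishes if the server dies.
        self.images[name] = data
        self.server.store(name, data)

    def handle_server_restart(self, retries=50, delay=0.01):
        """Tell the app we're down, wait for the server, replay every upload."""
        self.on_disconnect()
        for _ in range(retries):
            if self.server.alive:
                for name, data in self.images.items():
                    self.server.store(name, data)  # the "upload", repeated
                self.on_reconnect()
                return True
            time.sleep(delay)                      # simple retry with a delay
        return False
```

The cost is exactly the one raised above: the client has to hold its image data twice (or be able to regenerate it), which is the trade-off a real design would have to settle.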
Everyone has a different experience. I haven’t had an Xorg crash in… Well, I can’t remember how long.
Vista, on the other hand… After installing the latest drivers from AMD for my x1900, Vista would BSOD at least every other time I started it. So much for graphics drivers not crashing the windowing system… They brought down the entire OS.
Adam
I had similar experiences with X and ArchLinux some days ago. I installed it from the newly released CDs and installed X, but when I started it I had no keyboard or mouse (and no ssh). I had no way to go back to the console, except a reboot. Really nice. It turns out you have to explicitly install the packages that do automatic detection. If you don't, and also don't configure your xorg.conf, X kills your keyboard access. I really don't get it, especially since no other system seems to have the same problems as X. BeOS just worked, OS X just works, even XP (and Haiku and ReactOS, too!). Please, replace X with something that actually works.
That’s the point of ArchLinux. If you want a desktop distribution then you should go with Ubuntu. Besides, Arch has a great wiki and if you follow it you should have no problems.
The problem was that I followed the wiki. It tells you that you don't need to do manual configuration, but not that you need to install the evdev driver (or whatever it is) for automatic configuration to work. It only tells you that on the evdev page itself. I don't really blame this on ArchLinux; it is a geek's distro. But I still don't understand why X simply kills all keyboard access when it does not know what to do.
A bug happened on Linux and you don't care where the fault lies.
If you cared, you would probably know it's not a problem in Xorg but in the underlying driver. I think you didn't have 3D enabled; that's why playing movies sucked. If you wrote to launchpad.com, you would find there are some nice guys willing to help. In special cases this happens, especially on newer hardware. But don't blame the Linux community, blame the hardware manufacturers. Some don't even know Linux exists, and those who do don't care.
All right. That was your experience.
I had some problems with Linux, especially driver problems, but all that is nothing compared with any Windows.
If I wanted to pretend to be working, I would install Windows and show it to my boss: hey, I am working, just look at it: it's loading, please wait, it's unresponsive, I'm running an antivirus scan, I'm defragmenting, I'm reformatting the drive, I'm installing Windows. Again.
My experience with Windows is bad! Reinstalling the OS every 3 months is not for me.
Linux lets me work much faster.
A modern OS supports UTF-8 (or something universal) as the default. I did some web pages, and on Windows it plainly sucked. UTF? Does Microsoft even know what that means???
Conclusion:
What sucks on Windows:
Microsoft
Proprietary
stability
productivity
efficiency
usability
can't be customised (GUI, kernel)
PowerShell is not The Unix Shell (Cygwin doesn't have apt-get)
no package manager!!! A modern system has a package manager, not next, next, next, next, next, accept, finish!
price (why should I pay for pain?)
should I continue???
What sucks on Linux:
newer drivers, or special hardware without support from the hardware vendors
doesn't support some Windows programs (games)
poor graphics performance
power consumption(can be improved)
Sometimes OS evangelism is as bad as politics. There are plenty of people who would blame all of our problems on one party or another when they both are responsible.
Your point about what 'sucks on Windows', specifically stability, is just ridiculous. Maybe for Windows 98 and some early versions; any OS, even early releases of Linux and Mac, has had stability issues out in the real world.
What sucks about republicans:
They are responsible for every problem in America
They want to give school children guns
They want to make all non-evangelical religions illegal, punishable by death.
They want to restrict public office holders to white, evangelical Christian males only.
Can’t be customized (GUI, Kernel)
What sucks about democrats:
They like to tax people with money.
They want to take money from working people and give it to people without money.
No hummers from interns for the rest of the country.
Poor graphics performance.
DISCLAIMER: Chill, the author is apolitical – voting democrat, republican, democrat, Perot (sorry), democrat, republican in the past 6 presidential elections. Obviously the above are stereotypes propagated by people with agendas. As technology professionals and enthusiasts, it is our responsibility to be balanced and honest about technology – not to be the Rush Limbaughs and Al Frankens of the tech world. Yes, there are things about Windows that suck, as there are in Linux, but there are some old ideas, like instability, that may have been true in the past but are just not major issues anymore (i.e. hummers from interns).
I had a student write in a paper about advantages of Windows and Unix (for a Unix class that I was teaching) that “Windows has no security”. I guess he thought since I was such an enthusiast that I would just pat him on the back and give him an A. He was the top student in that class – a real sharp Linux kid – shame on him, he should know better than to make that kind of generalization!
Windows uses Unicode everywhere, and has for a long, long time. If you're having problems with UTF-8 character sets, you're doing something wrong in your application / web page.
I was working with a user the other day, and he was really upset that he was having so many connectivity problems with a laptop on which I had installed Linux the month before. He was telling me how disappointed with Linux he was and how he had read online that it didn't properly support the hardware. I told him to ignore most of the negative stuff on the Internet; it mostly comes from ignorant and usually quite insincere users who just can't get over the fact that their favorite OS or company might not survive. On checking his computer I suggested he get a new network cable, and that in the future he should consider the hardware first. (In my experience, after viruses, hardware is the most common cause of end users' computer problems, Windows (95-7), Macintosh, Linux, etc. included.)
I've passed this on to a couple of customers over the years, mostly because it's a little funny. I first read it while supporting Windows users years ago: an article, I believe from someone at NASA, explaining one of a number of possible reasons for random computer failures. Ionizing radiation can knock out or flip a bit in the memory of some running program and cause a critical error. A one-off, I know, but when they are going crazy because they've lost a day's work, sometimes it helped.
Found a wiki:
From http://en.wikipedia.org/wiki/Radiation_hardening
“Digital damage: SEE
Single-event effects (SEE), mostly affecting only digital devices, were not studied extensively until relatively recently. When a high-energy particle travels through a semiconductor, it leaves an ionized track behind. This ionization may cause a highly localized effect similar to the transient dose one – a benign glitch in output, a less benign bit flip in memory or a register, or, especially in high-power transistors, a destructive latchup and burnout. Single event effects have importance for electronics in satellites, aircraft, and other both civilian and military aerospace applications. Sometimes in circuits not involving latches it is helpful to introduce RC time constant circuits, slowing down the circuit’s reaction time beyond the duration of an SEE.
* Single-event upsets (SEU), or transient radiation effects in electronics, are state changes of memory or register bits caused by a single ion interacting with the chip. They do not cause lasting damage to the device, but may cause lasting problems to a system which cannot recover from such an error. In very sensitive devices, a single ion can cause a multiple-bit upset (MBU) in several adjacent memory cells. SEUs can become Single-event Functional Interrupts (SEFI) when they upset control circuits, such as state machines, placing the device into an undefined state, a test mode, or a halt, which would then need a reset or a power cycle to recover.”
What I dislike about your article is that you piss and moan and generally sound like a reasonably over-educated but poorly thinking Windows fan. I don't want to hear about your high-powered quad-core blah! blah! blah! And thumping your chest about your thesis doesn't impress me at all!
I want to hear: there is an 8800 Nvidia graphics card from Asus (which one?). You are using the xxx Nvidia proprietary driver. The motherboard is a GA-965P-DQ6 (rev. 1.0). The power supply is a 350W icute. Etc., etc.
My guesses as to why you may not have done this: 1. Simple ignorance. 2. You were angry, not thinking, and jotted off a quick complaint. 3. You know that if you give accurate, journalistic-quality information, someone will write back and say that you are a complete idiot: you need to do this, this, and this, it's very minor, and it will work perfectly.
My favorite is number 3.
Windows users don't want results, just to complain.
Thom,
I understand that you're frustrated and that you don't really care whose fault your bad experience is; you just care that you lost your data.
I understand that, because that's one of the reasons why I decided to stop using any Microsoft products in 1996. My experience using Windows 3.11 was terrible: flaky window behavior, and the inability to view a web page using Netscape and print something out without Windows 3.11 crashing on me was one failure too many.
Having experienced SunOS back in 1992-1993, a stable operating system with a stable window system in X, I knew that it was a sick joke for a large corporation to push an operating system as unstable as Windows 3.11 to the masses and call it the "best" operating system ever.
That’s when I discovered Linux at an electronics store and I knew I found the solution to my problems. With Linux I would be in control of what software was installed on my computer. And yes, I might have had to learn a thing or 2 to get the system to work with my hardware, but it was worth it, because it taught me things about how an operating system works that I would never have found out otherwise using a closed system which discourages me from asking questions.
My experience taught me that proprietary software was at best suspect, and that OSS software was always preferable for a task if I needed an application and I had a choice of tools.
So I'm asking you again, sir: did you or did you not use a proprietary driver which replaces the functionality of Xorg with its own implementation?
If you used a proprietary driver, a driver which Xorg had no responsibility for, how can you blame the Xorg people?
I don’t think it’s reasonable to say “I don’t care who is at fault, I lost my data, so I’m going to arbitrarily blame Xorg”.
To help you see what I’m trying to say, here are some more examples to consider:
"I used a nonstandard compiler which provides its own implementation of the runtime libraries to compile my code, and when I run the program it leaks 1 TB of memory, taking out all of my programs and losing all of my data. I don't care who is at fault; I'm going to blame the standard compiler for this mess."
I’m sure you’ve experienced this one:
"I love my operating system. It's the best one yet. I got a new hardware toy that comes with a proprietary kernel driver provided by the manufacturer. It memory-leaks so violently, however, that it takes out my OS within 3 seconds of use, not only wiping out my current session but also causing any unwritten buffers to be lost, hosing supposedly saved data on the hard drive as well.
I don’t care who is at fault, I lost my data. So I’m going to blame the Operating System, and call it the shittiest OS ever.”
Does that help you to see the point I’m trying to make?
Your point is flawed.
It is not relevant whether or not the driver code was proprietary. As has been said a million times already, this article is NOT about the bug. It’s about the results of the bug.
A modern and robust graphics stack can handle a driver crash; Vista and 7 can. They gracefully recover from a driver crash without applications crashing and without data loss. THAT is robust.
Compare this to X: driver crashes, X.org crashes, applications crash, data lost. This is NOT robust.
The fact that the driver code is proprietary (which it indeed was) is not relevant in this story in ANY way. X drivers should be contained properly, so that any bugs in there do NOT affect users’ data.
Fair enough Thom,
But I want to emphasize this part of your response:
“The fact that the driver code is proprietary (which it indeed was) is not relevant in this story in ANY way. X drivers should be contained properly, so that any bugs in there do NOT affect users’ data.”.
The proprietary drivers that I know of, Nvidia and ATI, come in 2 parts: kernel space and user space. What if the bug is in the kernel space? Do you still think it’s reasonable for X which runs in user space to somehow catch a kernel space bug and recover from it gracefully? I just don’t see how that’s reasonable.
You say Windows 7 somehow manages to catch bugs in their graphic drivers and recover gracefully. Well, kudos to the Windows 7 developers for a job well done.
What I would like to know, and maybe you or a guest could do this, is write an editorial explaining the architecture Windows 7 uses for that kind of fault isolation. I'm very confident that Xorg developers would be interested in knowing how that is done so they could implement it as well.
I have lost so much data with Vista crashes, I find your posts on the matter laughable.
I’m writing this on my FreeBSD laptop since my Vista workstation told me it needed to run a chkdsk on my C drive. It’s been running for over 30 minutes now and throwing “Recovering orphaned file” messages left and right.
Even in those few times that I have had X crash (and I can't remember the last time it happened), I never actually lost data or had filesystem corruption.
Adam
Edited 2009-08-16 20:03 UTC
I think the interesting point is that your applications crashed.
AFAIK most X client implementations make the application exit, so maybe the application crashes were actually bugs in the exit handlers, and the application developers would appreciate the backtraces that got dumped.
People have been trying to replace X for a long time. It hasn't worked, for the simple reason that so much code (twenty years' worth or so) depends on it. Since X and the parts that depend on it are developed by so many people, you can't simply be MS and say "Here is where we begin anew". Thus X goes on and on.
I use the best tool for the job and right now that tool is Windows Vista, the first Windows I have used since 3.11.
Thought I would start with what you always reply to every posting with.
First, if Vista is so great, then why did almost no one adopt it, except those who bought new computers or were hardcore gamers wanting DirectX 10?
Second, I currently run Windows 7 and it seems to crash to a blue screen of death every so often for no apparent reason; since I upgraded my video driver this hasn't happened in a while. I run RC1, so hopefully whatever it is will be fixed, as I don't have this problem with WinXP or Linux.
I run Mint Linux (a version of Ubuntu) as my main OS for all my development work and don't run into the issues you talk about at all. X is as stable as can be.
It really seems to me that you got pissed off and don’t really want to be helped. You just want to rant about how bad Linux is.
I believe the core problem on your Linux system is Compiz. That program has caused me numerous problems with having a stable OS. Also, I find that the video driver that comes with Ubuntu isn't the greatest, so I install the latest from Nvidia. I do the same thing on Windows. So turn off Compiz and upgrade your video driver. How hard is that??
Oh well. I think you should just stay with Microsoft as that seems to work so well and stop writing articles about Linux.
Because Vista was crap at RTM, and then people who had NEVER used Vista told others how bad it was, even after SP1.
I run Windows 7 as my main OS for doing all my development work and don't run into the issues you talk about at all. 7 is as stable as can be.
It really seems to me that you got pissed off and don’t really want to be helped. You just want to rant about how bad Windows is.
See, it can go both ways. Everyone has different experiences, and saying "it works for ME, therefore you rant" doesn't help anybody.
X is only 25 years old, not 30. Plus, there have been several better alternatives created over the years but many of them have not been open sourced (e.g. Wayland, Photon MicroGUI for QNX, etc.). So until someone funds a complete rewrite (good programmers don’t grow on trees) or open sources a current project then it’s unlikely we’ll ever see wide adoption — especially by the more persnickety distributions like Debian that zealously defend their licensing model (q.v. the recent brouhaha over Mono).
I won't go into guessing what the rant owner's problems might have been, and I don't even care. I could compile such a rant for every OS I ever used. Instead, I'll tell you what: whatever machine I had, I ended up using the OS that was the most stable and/or most usable on it. Yes, some X problems and some display driver problems can cause an X crash, all true. And very unfortunate. Yet it's not hard to find a configuration that works and will never cause such a crash; if it is, then change hardware, or change OS. Complaining won't solve the problem. Circumventing it won't solve it either, but it will result in a usable system, which is what I need, much more than wobbly windows and unnecessary funny stuff.
I will still trust writing anything longer than a cookie recipe to my Linux any time, over any Windows version you can throw at me.
I absolutely love Linux, and can say that Thom absolutely got it right. X.Org is *CRAP*. A hell of a *CRAP* thing. It's time for Linux to move on to another graphics stack, for the sake of humanity, by all means. I am so frigging tired of X.Org; I have actually been watching 10 years of nonsense in this graphics stack.
CONGRATS THOM
BEST ARTICLE THIS YEAR
If it were for Linus Torvalds to comment... what do you think he would say? He's known for cutting the crap and talking straight, hurting whoever it needs to hurt.
If Linux were to give up on X.Org *TODAY*, what would be the next move? What are the alternatives available? What kind of project could be forked from X Windows? And what do you think Linus Torvalds would recommend doing (and get followed by millions, as his opinions tend to be)?
He's already hammered on GNOME, KDE and many kernel discussions. Do you think he would ever FORGIVE X? Something tells me *hmm*... no. What do you think he would advise...
200+ comments on the subject says something. Who else has had a crash from X just like Thom did? I can count at least 10 myself, using Linux over 10 years. It's about getting the bad root out of the core.
Hi Thom,
a lot of people have issues with Xorg. I also have the "feeling" that it sometimes has its share in the missing "crispness" of my Linux desktop.
Your article, however, would be much more helpful if you explained some of the design issues you mentioned.
“This is simply bad design through-and-through, and it has been haunting the Linux desktop for a long time now. In the X world’s rush latch onto the “me-too” bandwagon of GPU acceleration, they completely forgot to actually fix their bloody design and move it from the 1990s into the 21st century.”
Do you know Gustavo Duarte's blog (http://duartes.org/gustavo/blog/)? He wrote some very useful articles which explain some things very well and in detail.
Articles like this in the context of Xorg could help attract new contributors to fix the problem. And you at least seem to have some knowledge worth sharing!
Cheers,
David
We’re doomed…
with Apple being a pretty expensive alternative, and not that much more technologically advanced... with the Amiga dead, BeOS even more dead, and a useless piece of crap called Linux that you can't even install on a laptop without a heart attack... we're left with only the Borg Assimilation Empire of MS...
There is no hope at all, resistance is futile…
Wow. This is the 242nd post under this “story”. After sleeping on it… and speaking as an admin who depends upon X working well and reliably every working day on a wide variety of hardware ranging from dirt cheap and ancient, to pretty nice and new, on local consoles, over LANs, and over WAN connections to whole workgroups in other cities using NX… and noting that much of the hardware was not chosen by me… I must say that this whole affair has got to be one of the most notable teapot tempests in OSNews history.
Some people have *way* too much time on their hands, and a penchant for obsessing upon minutia.
Edited 2009-08-16 20:46 UTC
Either you’re intentionally twisting the entire article, or you did not read it.
I did not claim, in any way, that X is unstable. I said something else entirely, and you’re choosing to just ignore it. My point is simple: resizing a window should not cause the entire graphical stack to cock up, killing all applications and losing data. Or are you denying the fact that when X goes down – everything goes down?
It doesn’t matter how stable X is – bugs will ALWAYS occur, and because of that inescapable fact, software needs to be written with isolation in mind, to prevent simple crashes from getting way out of control.
The simple FACT of the matter is that Windows Vista and Windows 7 are FAR better at mitigating possible issues in the graphics stack than X is. That’s all my rant brought forward, and I have yet to see anyone provide proper argumentation refuting that fact.
I think the main problem here is that I dared to point out that *shock gasp horror* Windows is better at something than Linux/X. That’s the real sore spot here.
Edited 2009-08-16 23:11 UTC
You keep mindlessly repeating the isolation mantra as if it will magically solve the problem you experienced. It won’t. X11 fundamentally has a lot of isolation seeing as each of the parts communicates over network sockets. Of course, I’m pretty sure you’ve ranted before about how that also makes X suck.
The real problem is that X clients can’t survive the server going away. There are a number of good reasons why this is the case. The most important being that –for performance reasons– the server maintains data structures on behalf of the clients (such as pixmaps). There a number of ways this could be rectified, but it would undoubtedly cause API semantics to change in non-trivial ways.
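That server-side ownership is the crux, and it can be sketched with a toy model (plain Python, not real X code; the `Server` class and XID scheme here are invented purely for illustration): clients hold only opaque integer handles, while the actual resources live in the server’s memory, so a server restart leaves every client handle dangling.

```python
# Toy model (not real X code): why classic X clients can't survive the server.
# The server owns the actual pixmap data; clients only hold opaque XIDs.

class Server:
    def __init__(self):
        self._pixmaps = {}      # resources live in the server's memory
        self._next_xid = 1

    def create_pixmap(self, data):
        xid = self._next_xid
        self._next_xid += 1
        self._pixmaps[xid] = data
        return xid              # the client gets only an integer handle

    def get_pixmap(self, xid):
        return self._pixmaps[xid]

server = Server()
xid = server.create_pixmap(b"rendered window contents")

# The server crashes: all server-side state is gone.
server = Server()               # a fresh server after restart

try:
    server.get_pixmap(xid)      # the client's handle now points at nothing
except KeyError:
    print("client handle is dangling; the client must rebuild everything")
```

This is why letting clients reconnect is not just a plumbing change: either clients must be able to re-create all their server-side resources, or the protocol semantics must change, which matches the point about non-trivial API changes above.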
Rajj is right, Holwerda. You are mindlessly repeating your isolation mantra.
What I’m saying is that based upon my *much* more extensive professional experience using X in real business with real users on a *wide* variety of hardware, the point which you keep insisting (over and over) that everyone is missing, has just not proven to be significant. With *one* exception. (See below.)
Certainly, Thom, I understand the claim that you are making. And it just does not matter that much. That said… I would not be averse to X clients being able to reconnect to a display, as long as the costs in other areas did not outweigh what advantage there is. As an admin, I certainly prize refinements to robustness. (And this would be a good time to say that my very practical responsibilities and requirements as an admin don’t leave me a whole lot of room for having emotional reactions to the idea of Windows doing something “better” than Linux. So although I am certainly pleased when Linux is clearly doing something better than Windows, your assertion that this is “all about” that is just plain wrong.)
As mentioned above, there is an exception, where a persistent session is of practical, tangible benefit, which I and likely others encounter in the real world. And that is with the sessions running over the WAN connections, as we do for our workgroups in remote cities. This is not because X is unstable, but because WANs entail a certain (though not large) degree of unreliability. However, for those I use NX for performance reasons. And one benefit afforded by NX is… persistent sessions. You can kill the remote X server (or even pull the power plug and reboot), log back in, and voilà! Your session is still there exactly as you left it. This is, indeed, a useful, though not essential, feature when running over a WAN. And I would venture to say that if it starts seeming too important, it’s probably time to look for a new ISP.
So, Thom, you might want to stop chanting, with your fingers in your ears, about how no one understands your point, and actually start *listening* to what other people are saying about the actual, real world *value* of your currently well understood point.
And I really shouldn’t be wasting so many words on this bizarre and largely meaningless thread.
Edited 2009-08-17 00:25 UTC
Except you didn’t state a fact, at least not one that everyone agrees on. You stated a personal experience you had. Other people, such as myself, have had different personal experiences, where graphics stack crashes in Vista brought down the entire OS and caused actual data loss on the filesystem. At least when X crashes on linux/FreeBSD, I can restart X. When the graphics system on Vista crashes, I end up with about 2 megs of data in Found.000. Now why can’t Vista learn from Linux/Xorg?
Adam
Edited 2009-08-17 00:17 UTC
Sorry, Thom, I don’t think the point behind your article came across at all well in the way you presented it.
The way you phrased it made it sound like you were saying “X is so unstable it can even be crashed by a simple window resize”. What I believe you intended to convey was that, whilst crashing due to a simple window resize is very unusual, a) X should be tolerant of driver bugs and that b) if X does have to go down it shouldn’t cause the apps to go down too. But that’s not how your article came across to most people.
It’s very understandable that it was frustrating to lose your apps, I hope that you didn’t lose any important data. But your anger and frustration came across far more strongly than the rather technical point you were trying to make.
Yes! That is a good point to be making.
No, I don’t think that is the source of the sore spot. Don’t get me wrong, there would have been trolls, aggressive Linux supporters, misguided individuals however you phrased your article. However, many people have been complaining about the content and delivery of the editorial as well as stating their agreement / disagreement with your underlying point.
It’s not that you don’t write well, your stuff is generally very readable. But in this case what came across was a lot of emotion and very little technical content, inspiring a discussion generating more heat than light, as they say. I think perhaps your article reads differently to you than to somebody who doesn’t know in advance what the “take home” point is supposed to be.
It’s a shame because there are a lot of useful / interesting things we could talk about here, some of which posters have already touched on. Such as:
* How is the Windows driver isolation architected? How good is the fault isolation, some people here seem to think it’s more a voluntary restart facility than complete isolation. This is important as full isolation is going to be a harder problem.
* How can X users mitigate this now? (e.g. run apps under Xpra or VNC to isolate them from driver bugs in the real X Server? Use an xmove proxy? Are there any X extensions that can deal with this?)
* How should this be solved? Could X implement an architecture similar to Windows? Is there a better way to do it – just make something like xmove but better, then restart the entire X server if stuff breaks? Or maybe just isolating drivers in some way would be better – but how to do that? Can the toolkits handle it in principle, should they?
* Could any of the fault isolation / tolerance solutions from other areas of OS design be leveraged? (Run certain X code in a separate process / careful setting of memory permissions / proxy server / use a safe bytecode for actual driver code)
* There are obvious links between migratable applications and restartable X servers, since a new connection is being made anyhow. Are there app migration systems that are relevant? Does anyone do this already? Keith Packard has talked about app migration in the past, maybe he has expanded on this further somewhere.
* What if we were using Wayland instead of X? Would handling it there be easier? Have they thought about this?
Isolating bugs in drivers is Hard, that’s why all mainstream OS kernels are still effectively monolithic with your “USB feather duster” driver being able to bring down the entire system. It’s good if Windows has attempted to tackle this in the graphics driver case – but now I (we?) am hungry for technical details, a comparison of techniques, debate over the appropriate course of action.
I come to OSNews looking for this sort of stuff and I would love to see an in-depth discussion here about the technical aspects of this. Maybe it would be a good candidate for an original article submission by someone, or for an interview with some relevant developers.
It is impossible if one doesn’t have the source code …
I don’t believe that the NT kernel actually is monolithic. I thought it was a microkernel architecture?
If one wants a similar approach in FOSS, I know of only one candidate … which is the GNU Hurd kernel.
http://en.wikipedia.org/wiki/GNU_Hurd
It would seem that there is not much hope to be placed in this however, compared with the effort expended on the Linux kernel.
The source code to the driver itself? Whilst it’s pretty much impossible / impractical to fix bugs / failures in a driver if you don’t have the source (and the right to use it!), you can certainly mitigate them. By “isolate” here I’m talking about containing the faults and making it possible to recover from them somehow – there’s not always necessarily a general-purpose way of doing this but with the right architecture you can sometimes get surprisingly far.
For instance, I’ve had Adobe’s Flash Player crash firefox before. But by using nspluginwrapper to run it in a separate process you can actually contain its crashes so they can’t bring down the browser (as easily … 😉
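The same trick generalizes well beyond browser plugins. Here is a minimal sketch of the idea in Python (the “plugin” is just a deliberately crashing child process; the variable names are mine):

```python
# Run crash-prone code in a separate process so its failure cannot take
# down the host -- the same idea nspluginwrapper applies to Flash.
import subprocess
import sys

buggy_plugin = "import os; os.abort()"   # stands in for a crashing plugin
result = subprocess.run([sys.executable, "-c", buggy_plugin])

# The child died (from SIGABRT on Unix), but this process is still running.
print("plugin exited abnormally, and the host survived")
```

The cost is the usual one: the host and the isolated component now need an explicit communication channel (pipes, sockets, shared memory) instead of plain function calls, which is exactly the kind of performance trade-off the X server avoids by keeping client resources in-process.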
It was designed as a microkernel architecture but I think it’s effectively monolithic these days – by which I mean that most drivers, filesystems and other OS kernel components all run in the same protection domain. They can all take each other down, if they’re buggy – in this respect I don’t think the NT kernel is much better than any other mainstream kernel. As-of Vista they’re moving more stuff out of kernel but there’s still AFAIK a heck of a lot of stuff in there.
Usually with contemporary OSes it’s a bit scary when you start to think about how much random weird stuff gets to run at kernel level.
The MacOS/Darwin XNU kernel is another funny one – everyone knows it includes Mach, a microkernel. But XNU is actually a monolithic kernel too, even though Mach is a component in it – the developers docs state this explicitly, I believe. I’ve read about Mach being able to share an address space with the OS servers running on it, for performance reasons. I presume they’re doing something like that, in which case it’s all in the same protection domain and gets a helping of traditional monolithic kernel downsides (and benefits, of course).
Yeah. Hurd is a really interesting system. I was really hopeful on a number of occasions that we were about to see something really interesting from them but they never seem to quite get there. Last I heard they were looking at writing a new microkernel that would fit their needs better, having decided that L4 didn’t suit for some reason.
K42 is pretty cool-looking. It’s based on Linux and L4 but it splits the OS into multiple independent servers running atop the microkernel – like a proper microkernel OS. http://domino.research.ibm.com/comm/research_projects.nsf/pages/k42…
For isolating device drivers there have been various other projects including:
* Nooks http://nooks.cs.washington.edu/
* L4 Unmodified Driver Reuse and Improved System Dependability http://www.usenix.org/events/osdi04/tech/full_papers/levasseur/leva…
* Xen’s isolated driver domains Safe Hardware Interface work http://www.cl.cam.ac.uk/research/srg/netos/papers/2004-oasis-ngio.p…
Nooks is a modification to the Linux kernel that puts drivers into (partially) separate protection domains from the main kernel. The L4 and Xen approaches basically run the device driver *plus* a Linux kernel for it to live in (!) in an isolated virtual machine on top of a hypervisor. You do your work in a different VM. If the device driver barfs it only hurts the driver VM, which can be restarted to recover.
All of these solutions have some relatively significant overheads or downsides, sadly – it does illustrate that this sort of thing is doable, though.
Declaration of interest: I worked on the Xen project I mentioned above. Also I don’t have direct experience with the NT and XNU kernels, I’ve just read stuff about them. I mostly do Linux work.
No, it is NOT all your rant brought forward.
You went out of your way to cuss out one of the mainstays of the Linux OS. You went out of your way to brag about Windows Vista and Windows 7 over the X system, when the two are apples and oranges.
You got pissed because you lost your work and now you’re backing off from what you said because you realize it sounds bad, like a lame luser.
You could have simply said it would be nice if X isolated the effects of driver and other bugs, and allowed easy session reconnects. Period. End of story.
And guess what? I’m quite sure the people who maintain X would agree with that and may well be working on it in the roadmap. Did you bother to ask them?
No, you went off the deep end.
Nobody is impressed.
You’ve hit on something the X11/Linux community has been attempting to resolve for almost a decade – to create an operating system where problems are handled in a graceful manner. When the wrong settings are put in – the GUI can still recover. When the driver fails to install properly, the GUI can recover to a workable state and allow the end user to uninstall the defective driver.
The problem is that there are still far too many who suffer from the disease of hating Microsoft – the failure on their part to move forward, realising when there is a good idea and then adopting it, then building upon it. Part of recognising a great idea is also recognising when something you have sucks – like the kid who can’t sing but turns up on American Idol, it’s sometimes necessary to be honest and say, “what we have sucks” and then do something about it. Admitting something sucks is the first step to addressing the problem – if you don’t address the suckage factor and ask why it sucked in the first place, you’re simply going to re-create a piece of software and replicate the same design flaws.
Edited 2009-08-17 10:56 UTC
As someone who administers multiple Sun Ray Servers, I know that X works just fine. That doesn’t mean that applications don’t have issues, but the framework is sound.
This is the second “article” in as many weeks where Thom (or some other user, whoever wrote the mindless OpenSolaris piece) has a problem, and instead of taking the time to troubleshoot the issue and either fix it or provide the developers some feedback, they just bitch and say it sucks. I see these kinds of problems as a challenge; any “real” geek would as well.
This is as dumb (and yes I mean dumb) as segedunum saying that because he can’t get Zope to work under Solaris as “fast” as Linux, Sun hardware and Solaris are bad. It’s like the nonsense I hear from the developers of an application I have to host: every time there is a problem, it is the hardware and the operating system’s fault.
I thought this was supposed to be a tech site. Well, I guess it is, as long as the “tech” isn’t too hard and the number of site hits is up.
Is this is what the readers of OSNews can expect in the way of articles in the future? I am tired of reading childish, troll filled nonsense instead of well researched and written technical content. Close my account because I do not intend to post any more articles or comments to this site ever again.
The distro is at fault, the end. The buck stops at the distro.
You can blame X or drivers or Thom’s choice of using new hardware with Linux. What low-level component you blame is irrelevant, because the buck stops with the distro.
Only the distro maker interfaces directly with the customer and can guarantee a certain level of software service.
Edited 2009-08-16 22:18 UTC
I haven’t had X crash on me but one or two times. I’m not running Ubuntu, and I’m using XFCE. I will say, though, that the lags in it are a bit off-putting when I’m running a 2GHz Core 2 Duo (but I’ll admit I haven’t tried to find the source of the problem).
However, I went to Linux as my primary OS after XP hosed itself less than 6 months after I cleaned up its last self-petard-hoisting by wiping and reloading. I had had enough. The Vista installations I used had too many problems to coax me to pay for that change. Hopefully Windows 7 lives up to the promises that Vista didn’t deliver (well, if I crank up the UAC slider). I’m unwilling to pay the $200 to find out, but if I get a better deal, or get an opportunity to use it for a while first, I may end up with it.
I will say this: every problem I’ve encountered in Linux, and wanted to fix, I have been able to, using either grep or Google. I’ve also never had to wipe and reload any of my Unix installations. I cannot say that about any version of Windows I’ve used. Windows may do some things “right”, but it needs to be simplified. I fear Linux may be getting too complex for its own good as well, but it’s not there quite yet (and it still mostly uses text files for its configuration).
Yes. Still reading.
Would you happen to want to fix the problem of VTMB not running on linux? I’d be very appreciative and would hurl praise in your general direction.
Silly, lame and unnecessary article. ‘Some guy had troubles with graphics on Linux and got angry’. Come on.
The idea that whatever was broken shouldn’t have affected whatever was underneath is beautiful, but doesn’t necessarily apply here. It’s obviously a critical part of your system that is broken, probably the driver. You should just do what Linux users have always done: move to hardware that is already known to work well. Yes, this will probably mean older stuff. It has always been this way; what made you think it has ended? Somebody told you 2009 was the year of Linux on the desktop?
It is completely unfair to compare your experience with Linux with some Windows gedankenexperiment. What was the experimental driver you saw running on Windows that broke down gently? What software were you running when you saw the problem to which Windows was so robust?
Windows ‘works’ because closed developers only (or mostly) release proven stuff. There is no way you can compare the tested stuff you see on Windows with the ‘always a bit experimental’ free software.
This all just shows how strategically important it is for Micros~1 that the video card companies do not help to support their crucial devices on Linux…
Edited 2009-08-17 04:50 UTC
This is what I have been literally SCREAMING at the X and OSS community since the WDDM of Vista was previewed over 5 years ago.
No matter how hard many of us screamed, the ‘bad reputation’ of Vista has let the good technology get ignored by the OSS world.
And just as I screamed and predicted, as Microsoft was being ignored, the OSS world would once again not realize that Microsoft was doing something right, and it is important that the OSS world PAYS ATTENTION instead of just dismissing Windows with ignorant or clever retorts.
Some additional facts…
1) The WDDM in Vista and Win7 is virtually the same, Win7 adding more WDDM 1.1 features as GPU technology permits. (Mainly this gives the OS more scheduler control of the GPU.)
2) Kernel/User Mode Drivers. Despite what many of the posts say, Microsoft IS NOT copying anyone with ‘user level’ drivers.
NT Originally had strict User Level Drivers up until 4.0.
What Vista’s WDDM does is take the best of both worlds as it is a kernel and user mode driver technology. This is what gives it the performance of kernel level drivers, and yet is not vulnerable to OS level crashes as most of the driver does run in user mode.
3) The WDDM and ‘driver’ recovery is doing more than just using user mode/upper level drivers.
And this is a LONG list of features that allow it to do more than just recover from a ‘driver’ or application failure involving DirectX, etc.
Vista/Win7 can dynamically flip video drivers on the fly (note the hybrid technologies that only work seamlessly on Vista/Win7, where other OSes, like OS X, need to reboot the GUI to flip between GPUs).
Since it is not ‘locked’ to a video driver at any time, you can have a full video card failure and replace the card ‘hot’ and Vista/Win7 will recover and pick up the new video card and drivers.
4) The OS takes full control of the Video and GPU.
This means that when failures or driver crashes do happen, the OS recovers and the applications don’t know any different and don’t HAVE to.
The WDDM also includes VRAM virtualization, which allows the GPU to never starve for memory no matter how many 3D applications your system has running within system RAM limits.
The WDDM also includes a full PRE-EMPTIVE GPU scheduler, so that applications can’t ‘lock’ the GPU. This is what allows Vista and Win7 to EASILY multi-task 3D applications on screen independent of each other with minimum FPS drop and without the applications holding the GPU hostage, so the UI is always responsive.
This is important as it is as big as the move from cooperative to pre-emptive CPU scheduling.
As the GPU is used more for calculations and for multiple 3D applications running at the same time, and as CUDA and other GPU technologies see wider use, non-Windows OSes depending on ‘cooperative’ GPU multi-tasking will choke and become more unresponsive.
This is just a quick off the top of my head noting of things Windows is doing with the WDDM technology that NO OTHER OS is even CLOSE to being able to do.
And this is why it is sad, as everyone sat back and laughed at Vista for several years now, Microsoft has been paving the future and leaving the OSS and even Apple and OS X in the dust when it comes to pure technology capabilities.
WAKE UP, there are things to PAY ATTENTION to and LEARN from Windows Vista/Win7 technologies, especially in the WDDM, and it is just one area that Microsoft is running way ahead of the pack.
Another reason the WDDM is getting much attention now is as ‘new’ hardware comes out and people see that only Vista/Win7’s WDDM can properly handle it.
(Hybrid chipsets that flip active GPUs on the fly Nvidia 9400 for example all the way to multi-GPU technologies that abandon SLI and Crossfire and turning the multi-GPU management over to the OS, and Vista/Win7’s WDDM can do all of this easily, and is ready for many things that still are not common hardware technologies.)
There is NO REASON the OSS world has been so ARROGANT and STUPID about what Microsoft GOT RIGHT with the core technologies in Windows. They are either too young or forget that NT was a well designed OS kernel architecture and Microsoft does do some things well.
Shill much? This could be a legitimate post, but it seems your user account was registered today, probably just for this evangelic message.
Or, they may just have the luxury of not caring what Microsoft does anymore. You can get a perfectly usable system with older technology. It seems Windows XP got quite popular even without Vista’s “mind blowing” features.
LWN.net
http://lwn.net/Articles/95956/
Posted Jul 31, 2004 20:58 UTC (Sat) by sbergman27 (subscriber, #10767)
Parent article: X at OLS
What an excellent article, Jon!
And what wonderful news! Finally, X development is exciting again.
Every single shortcoming of X11 which has ever bothered me was covered in this article. Even X11’s problem with high latency connections. It’s amazing how many people think that X’s problems over broadband stem from its being a bandwidth hog, when in fact, it is not. But insert a little latency, say 50ms, and many applications become quite unusable.
The xorg fork is something that should have happened a long time ago, except that many of us did not realize it.
A big “thanks for everything” to the current xorg developers. And a big “thanks for nothing” to David Dawes. A perfect example of how sometimes a person’s “contributions” can really screw up a project.
I predict that a year or two from now, we will not be seeing all the “X11 Sucks!/No it doesn’t!” flamewars that are so common today. It’ll certainly be nice to see X as an open source crown jewel and not as an embarrassment. I can hardly wait! 😉 The fun starts on Aug 25th, folks!
I don’t know what pile of poo you were running, but Suse 11.1 with KDE 4.2 is rock solid here. The only time X has been down is when I reboot. The only time I reboot is after a kernel or hardware update. I watch videos all the time and resize on the fly. If your system is that unstable, you’ve installed something flaky or messed it up.
I just read this on a forum.
“X Setup is easy even with Gentoo nowadays, no xorg.conf needed, input works with sane defaults and even the xml configs are actually quite straightforward, while Gnome or KDE users wouldn’t have to touch those anyway.”
Well, I’m using nvidia and I’m on latest archlinux, I have deleted /etc/xorg.conf and then tried to start X with ‘startx’.
It started X but with the VESA driver at 800×600. Ok, I tried to switch modes with xrandr but it told me “not found in available modes”.
I stopped X and went to remove the VESA driver, I also removed nv and fbdev. Verified that the ‘nvidia’ driver was loaded, etc. And just prayed that X would get smart and start X with the nvidia driver.
Did startx and it failed, it complained that nv, fbdev or VESA wasn’t available, at that point I got disappointed, I thought that X already auto-detects this kind of stuff.
So at that point if I really wanted to start X with the nvidia driver I had to do a `X -configure’ and that would find the nvidia driver and I had to put the new generated file to /etc/X11/xorg.conf.
I really wished that startx would find the nvidia driver without an xorg.conf as the quote says, how come `X -configure’ can find the nvidia driver and write it to the new generated file but startx can’t find it and start the frigging X?
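For readers hitting the same wall: the piece that `X -configure` generates and `startx` was missing is only a few lines. A hand-written /etc/X11/xorg.conf containing just a Device section like the following sketch (the Identifier string is arbitrary; this assumes the proprietary nvidia package is installed) is usually enough to force the driver:

```
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"
EndSection
```

Everything else (modes, input) can typically be left to X’s auto-detection; only the driver selection needs to be pinned down by hand.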
I really wish these guys that are responsible with X would do a better job at stuff like that. Or someone else will come with a better alternative.
I agree with this article, and I really love Linux. I think X needs more constructive criticism, so that developers would try to improve the things that bother people, and be more serious about their work.
Edited 2009-08-17 09:14 UTC
No matter what, the Linux platform is not the top priority for hardware manufacturers to publish drivers for, nor to release the specs to. No matter how many new tech acronyms you support with any release of X, if the drivers don’t work, it’s crap.
Ordinary users (your granny, your mom, a school English teacher) won’t give a crap if your apps keep running in the background and can be brought back if your desktop crashes, ‘cos they don’t have a clue what’s happening underneath. It’s the application’s responsibility to abstract those things and do them automatically for them.
Bottom line is, Linux is not a desktop OS suited for people who don’t know how to get down and dirty with the CLI. You can never run many office/productivity applications without having that dreaded feeling that something’s going to break at any time.
It’s a good OS, but not for everyone.
People nowadays use their computers for regular tasks, like using the internet, watching videos, etc. We live in a modern era, and we need a modern graphics system that meets today’s standards.
X is getting new features, and they are rewriting most of the implementation, but we also need to get rid of the 1980s components. I really like the vision that Wayland has, and I hope they end up replacing the crap that X.Org is with it.
We seriously need new people that think in new and today ways to make a new graphics system that doesn’t suck, and that can take Linux forward to have better and modern/advanced graphics.
Edited 2009-08-17 09:54 UTC
I’m quite excited to see Wayland too. I think it’s worth noting that really big chunks of the code that makes Wayland possible come from the Xorg project – it uses kernel mode setting, device driver code and probably the graphics memory manager code – so it isn’t the case that X is bad and Wayland is simply superseding it.
Why should modern distros still ship with crap like this?
http://upload.wikimedia.org/wikipedia/en/2/2e/Xeyes.png
http://upload.wikimedia.org/wikipedia/en/c/cc/Xcalc_suse.png
http://upload.wikimedia.org/wikipedia/commons/d/d4/X-Window-System….
It really stresses the shit out of me…
You X.Org developers should go back to 1980 and stay there.
That would be pretty cool going back to 1980. BTW, I don’t think it is necessary for X developers, most of them never left the 80s (sorry, i’ll get my coat)
I don’t see a single argument why is X to blame. X is just a single layer in UNIXish graphics software. There are quite a few layers above it if one uses something like Gnome or KDE. Even plain X applications are often linked with “Gnome support” or “KDE support”, although they don’t need that to work, but some people like to have everything “integrated”.
I tried both Gnome and KDE and never managed to get them working satisfactorily, so I’ve given up on them. I preferred KDE, but it used to crash at least once a day.
I must admit that I’ve compiled some of the X applications myself (like Xine), and I used a lot of “--disable-this” and “--disable-that” switches while running configure. That means that I have drastically reduced the number of dependencies which are enabled by default.
Anyway, I don’t have crashes on my machine, and have no problems watching DVD movies (encrypted too), Windows Media files, FLV files, etc.
So, I don’t think that all the problems are in X layer.
In fact, in the world of Linux, it is not easy to pick a good application among a couple of alternatives. The majority of desktop software doesn’t bring any serious income to its authors, so bad software, or bad packaging of good software, is not causing anyone to go bankrupt or anything. Good software may coexist with bad software forever. The only way is to test everything and find out what works.
But here’s that refutation you were looking for:
http://adam.npark.com/vista-crash.avi
Exactly how is that better than what happened to you under Xorg?
Adam
Edited 2009-08-17 11:10 UTC
Quite literally has been hanging around *nix’s neck since the dawn of the GUI. Ever wonder why Apple does not run X11 as its primary, and instead only offers it as a module that sits ATOP their graphics?
X11 was NEVER meant for use as a single user desktop interface where the applications (clients) are run on the same machine as the server (X11) – it was from the start meant for the client/server construct over a network for two different machines, for a type of data at significantly lower bandwidths than what we expect today.
EVERYTHING that’s been done to improve that and implement what’s needed for things like realtime video have been little more than hacks to bypass/circumvent how X11 even works in the first place.
I’ve often said over the past three to four years that in most ways as a desktop OS linux and most of your *nix operating systems have barely caught up to windows 3.1 in functionality – a statement I for the most part still stand by.
Even the most mundane of things… Like a buddy of mine online bragging about his twinview setup as new and revolutionary about a year ago. I almost laughed my ass right through the floor given that’s been what I’ve considered basic functionality on PC since Windows 98, basic functionality on the Mac since System 5, and something I can remember doing all the way back in 1989 under Windows 3.0 with a ****ing Targa board!
Just to compare, today I’m running four displays (I’ve not run less than two displays since 1999). The two on the right are 17″ Viewsonic 171B’s running in portrait at 1024×1280. My center display is a 24″ Samsung at 1920×1200, and on the left is a el-cheapo 24″ Envision also running at 1920×1200. The outer two displays are connected to a 640 meg 8800GTS, the inner two are connected to a 896 meg 260GTX.
I just re-installed both Ubuntu 9.04 and Win7 7600 MSDN last week (new hard drive, figured go clean) – here’s how it went.
For Ubuntu, it boots up sideways on the far right display, which isn’t even what’s set up as the primary video card, and of course it is not recognizing the portrait signal from the display (it actually sends the pivot signal). Since I’m on nVidia cards, the first thing I need is the reference drivers, since the ‘open’ ones tend to perform oral sex upon the Equus africanus asinus… these of course are not the least bit compatible with XRandR, and since I’m going to more than two displays I can pretty much completely kiss off any chance of desktop compositing working before the next decade is out, much less this one.
So I start up the nVidia-Settings program and… I can’t save. Oh, even though it puts itself in the menu, you cannot run it from the menu with the proper permissions so instead you have to SUDO it from the command line… Which immediately crashes and goes into an endless loop of starting/crashing GDM until you kill it from a REAL terminal. Ok, guess I get to do this the ‘hard way’ and manually write my own xorg.conf to even get the primary display proper. I get the main display working, sudo the settings program, still only sees one display adapter, but I set up the one remaining display and… what’s this, not even an OPTION for rotating the display? Ok, so edit the xorg.conf to manually insert the second display adapter, and to add the ‘rotate CW’ line to the appropriate displays. Since I have one of each configured just copy them over and manually set the positioning… and boom, graphics corruption on both the outer displays. Ok, fiddle with some options, finally forcing the outer displays to use the ‘open’ driver and the inner ones to run on the restricted ones does the job – kind of.
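For anyone fighting the same battle, the kind of hand-written xorg.conf stanza involved looks roughly like this. This is a sketch only – the Identifier names, the BusID, and the mode line are illustrative placeholders, not copied from my actual config (check `lspci` for your real BusIDs, and note that which driver honours the `Rotate` option varies):

```
Section "Device"
    Identifier  "Card-Right"
    Driver      "nvidia"          # or "nv" for the open driver
    BusID       "PCI:1:0:0"       # illustrative – find yours with lspci
    Option      "Rotate" "CW"     # rotate clockwise for a portrait panel
EndSection

Section "Monitor"
    Identifier  "Viewsonic-Portrait"
EndSection

Section "Screen"
    Identifier  "Screen-Right"
    Device      "Card-Right"
    Monitor     "Viewsonic-Portrait"
    SubSection "Display"
        Modes   "1280x1024"       # panel's native mode before rotation
    EndSubSection
EndSection
```

Multiply that by one Device/Monitor/Screen triple per display, plus a ServerLayout section to position them, and you have some idea of the fiddling involved.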
Good luck running openGL for things like blender.
Meanwhile, I installed Windows 7: it recognized all four displays, configured them, detected which ones were rotated, properly detected the primary display, and even automatically applied the rotated subpixel hinting instead of the standard. The ONLY thing I had to change was the display order, which was a simple matter of drag and drop on the display settings page.
Though really I think the problem runs even deeper than simple driver issues – the X11 ‘API’ (assuming you could call it that) is so impossible to program for that almost every book on the subject says don’t bother and use a toolkit instead – old school that meant Motif; today it means Qt or GTK… all of which probably wouldn’t even EXIST if X11 weren’t so absurdly back-assward and convoluted in the first place. It’s not a good sign when NOBODY apart from the people writing toolkits is willing to use it directly. Even the old Windows GDI isn’t THAT BAD! (oh, but for a Cocoa)
Edited 2009-08-17 11:28 UTC
Oh, small addendum – Creative labs needs to go **** itself – their latest Win7 drivers under the RTM do nothing but BSOD – something I NEVER saw under build 7100… Felt it was fair to come back and mention that.
Thankfully, going back to Daniel_k’s hacked 6.0.1.1370 driver from his “Audigy Support Pack” works like a charm (under build 7100 it enabled a bunch of ‘missing’ features like soundfonts). Under 7100 the stock Creative drivers were ‘stable’ but made it so you may as well have used integrated audio – Daniel_k’s drivers brought it back up to full functionality. Under the RTM (well, MSDN copy) it’s the difference between running problem-free and having a 30% chance of a BSOD every time something tries to play audio.
My point: bad drivers can screw you REGARDLESS of the OS platform… so don’t be too quick to judge the host OS when drivers could in fact be the issue. It’s why Linux stability and full functionality is often a matter of having that magical mix of the right hardware.
Though at the same time (over on the *nix side) I’m having a laugh at the nForce drivers, where disk access is so slow as to be a joke, since full SATA functionality on an nForce 680i SLI chipset is apparently non-existent.
But that’s like a lot of ‘supported’ hardware. Yes, it’s supported – but is it supported FULLY?!? Under Linux, can you rely on your wireless adapter coming back up after wake from sleep? Can your ‘supported’ webcam run its full 640×480 mode, or are you mysteriously ‘stuck’ at 320×240? Are you able to actually use your $300 high-end video card for OpenGL, or is that right out of the equation just because you want to run more than one display? Can you get 5.1 audio out of your soundcard, or is the Dolby DTS decoder being ‘restricted’, with the drivers for a reasonably well documented decade-old chipset STILL not actually allowing it?
Even stupid little **** like a handful of keyboard controllers not allowing you to run USB interface peripherals at the same time as PS/2 ones (oh, that can be a joy to figure out when it goes wrong – funny thing is, OSX on hackintoshes is likewise afflicted!)
Oh yeah, but Linux has the most supported hardware out of the box… Too bad there’s a difference between ‘supported’ and ‘fully functional’.
Of course, if Linus would quit dicking with the driver API on every damned build, and if certain members of the community would pull their heads out of their asses on this ‘restricted driver’ FLOSS zealotry, MAYBE hardware vendors would be a bit more enthusiastic about making drivers for it.
But then, seeing how Creative Labs has dragged their heels on releasing new drivers even for their NEW hardware for any OS newer than XP – maybe not.
Edited 2009-08-19 03:05 UTC
You got that right, 100%. I recently tried to install 5 different distros on an old nForce 2 box that I have lying around, just to see what all the KDE4 noise was about.
Imagine my disappointment when NOT ONE of those five distros would support high-res graphics. Kubuntu 9.04 tried to install an updated nVidia driver and left me at a command prompt, Mandrake One would freeze in the middle of booting, and OpenSUSE 11.1 and Fedora 11 both gave me the finger when I tried installing older nVidia driver versions. The only one that even tried to help me out with different driver versions was PC-BSD 7.1.1. But, alas, it too failed to get me any higher than 1024×768.
So, from my perspective, Linux has a LONG way to go before it will ever be accepted as a viable desktop OS. Especially when I can pop in ANY Windows CD from ’95 to Vista and have ZERO issues with my display driver.
GREAT example – I could probably have gotten it working, since I’ve become damned near an xorg.conf guru just from trying to get my own setup working – but it would involve spending several hours dicking with display timing lines depending on your display… and depending on the video card, even if it supports 24-bit or even 32-bit color on Windows, unless it’s a recent nVidia or ATI you may find yourself restricted to 16-bit color (one of the LEADING causes of missing resolutions)… and why do you have to dick with this **** which no other OS has EVER needed? Because the drivers program the CRTC directly, since few if any *nix distros include the hooks to thunk to 16-bit mode to call the VESA BIOS to set the modes with ‘standard’ timings.
… or they lack the ability to actually READ the damned monitor strings – which is what put a stop to that nonsense on Windows way the **** back on Windows 98. (95 had the capability but didn’t recognize enough display models)
Edited 2009-08-20 00:12 UTC
The reason you will have zero issues with your display driver under ’95 is that it won’t even recognize your new hard disk. For that reason you won’t even be able to install it in the first place.
FWIW, just thought I’d note:
Quite a few people have been talking about what X was originally meant for (e.g. clusters of workstations / servers, not single user desktops, not multimedia use).
I think it’s worth pointing out that the environment X was designed for was a Unix environment. Unix itself was coded up for some model of PDP before workstations and LANs were common, but it successfully evolved to networked workstations.
Most people here who use X are using it on a Unix (or Unix-like) system. The OS kernel has evolved to support modern circumstances pretty well – Apple even uses a Unix kernel for MacOS. Apple could have used X for MacOS, but aside from whatever other strategic reasons they may have had, it wouldn’t really be surprising if they wanted to avoid XFree86 when they were developing OS X – by most accounts it really was a disaster area at that time.
Although the challenges involved are perhaps somewhat different I don’t see why we should accept that Unix is able to evolve over decades of use but that X is not. The Xorg developers have been doing tremendous amounts of work to modernise their stack over the past few years and it really seems like they’ve been doing a fairly good job. I’m excited to see what happens in the next few years.
>>… immediately I was reminded of why I do not do any serious work on Linux …
>>… I had a whole bunch of applications running, but I decided to have some fun and watch an episode of 30 Rock
…
>>… if X.org crashes – so do all of your applications. Evolution. Chrome with a number of tabs open. Pidgin with a number of IM windows open. Twitux. Evince with an insanely cool study open (‘Mathematical Modelling Of An Outbreak of Zombie Infection’). OpenOffice.org Writer displaying a friend’s thesis which I was proofreading…
>> …They were all gone.
You were IM’ing, reading a study, proof-reading a thesis, emailing, browsing the Internet and watching 30 Rock – it’s no wonder you never get any ‘serious work’ done.
Both of them will crash with an interestingly colored screen when running any 3D application – even Aero, although that’s rare. This does not happen all of the time, and all of my research points to the driver resetting, which both OSes are supposed to handle and don’t. I’ve tried both the Windows Update driver and the direct-from-ATi driver. This never happened in XP, as it is due to some “intelligent” new driver handling in Vista and 7. So their stack is nowhere near perfect either. Sometimes this will happen half a dozen times in the course of an hour. And for such an advanced OS, why does Windows 7 REQUIRE free space at the front of the hard disk on the first SATA channel for its boot loader, even though modern BIOSes allow you to pick the disk to boot from? Vista didn’t have this requirement. At least 7 actually sees all 3 of my hard disks during installation; Vista SP1 does not.
Because it isn’t installing a bootloader, it is installing a repair environment.
Technically this is NOT true. If you have some really ‘wild’, off-the-wall case of ‘my machine crashed on XXX beta’, leave it at that – but if you don’t understand the architecture model, you probably should be quiet.
3D applications in Windows Vista and Win7 fully recover even if you rip out the Video Card. The ONLY exception are some OpenGL games that don’t properly recover.
DirectX games, WPF, and any other GPU based operations recover without a crash.
Go back and read the developer blogs from the Vista days, or even watch the videos from the developers and people from Microsoft Research who designed the WDDM of Vista, explaining WHY IT CAN and DOES seamlessly recover from 99.9% of all video/GPU-related crashes without any application being affected.
Also, actually read the technical information on the WDDM and WDM control of the GPU that is happening. It is something the OSS world needs to LEARN from in its approach.
Heck, even the ‘composer’ (DWM/Aero) of Vista/Win7 does things in a quite unique way because of the WDDM and the OS’s ability to virtualize system RAM. This means it can use system RAM for window ‘textures’ and directly write them to the display context without having to swap them out to VRAM first. This is why the composer in Windows is very efficient: there is neither a dual buffer nor a write delay waiting for the RAM copy, as you will find in OSS composers and even OS X.
It is just another ‘side’ benefit of the WDDM.
PS: The concepts and design of the WDDM and WDM video technologies in Vista/Win7 come from the XBox 360 development engineers, as the XBox 360 also uses VRAM virtualization, GPU scheduling, and a semi-hybrid video driver model for stability as well.
So if you STILL want to hate Windows out of ego, fine – but this technology started with the XBox 360 group, so we should see how it works and how we should be using these concepts in OSS OSes.
What you’ve said about Windows is not right.
When a graphics driver crashes in Windows, it will either
1. suddenly restart the system, or
2. greet you with a BSOD and then restart.
That holds true on all versions of Windows, including Windows 7, because it happened to me on my laptop with a GF 8700GT Go GPU.
In both cases it is horrible.
But you are right: Linux is still buggy, and it is given to customers with no warranty or obligations – and you know that, so don’t do productivity work on it.
Thom: why are you using Linux?
The reason I ask is that what you consider to be typical usage — word processing, multimedia, IM, graphical email — seems to have been done and done better in Microsoft OSes since the late 90s. So why bother switching to (or even trying) Linux if Windows is working so well for you for this purpose? Why not use the best tool for the job at hand? Besides, I think the rest of the world might have slightly differing opinions about what constitutes “real work” if watching “30 Rock” is part of your requirements.
If you weren’t so concerned about preemptively heading off criticism with attacks on the very Linux “fanboys” who could help you sort this out, you might find that you’d learn a thing or two in the process. Like, for example, how Linux has had the ability to reload hardware drivers of any sort without reboots for a decade (please, run `man modprobe` and be enlightened).
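To spell out what that `man modprobe` hint actually looks like in practice, here is a rough sketch. The module name `ath9k` is a hypothetical example (a common wireless driver) – substitute whatever `lsmod` shows for your hardware, and note these commands need root and will briefly take the device offline:

```shell
# Find the module name of the driver you want to reload
lsmod | head

# Unload the driver module, then load it again – no reboot required
# ("ath9k" is an illustrative placeholder, not a universal answer)
sudo modprobe -r ath9k
sudo modprobe ath9k
```

This works for most drivers built as loadable modules; it does not help with code compiled directly into the kernel, and of course an X display driver being reloadable is little comfort when X itself has already taken your session down.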
If you don’t want to concern yourself with how your OS works, then the Fisher-Price world of point-and-click no-questions-asked Microsoft computing is where you belong. Don’t forget that Linux itself was started as someone’s hobby project — a sandbox to explore and experiment within. If you don’t share that same attitude, you’re probably just wasting everyone’s time complaining about it here.
Then again, I bet there hasn’t been an OSNews article in quite some time that has driven as much traffic to the site as this one has.
Comments != traffic. This article isn’t that popular at all, hit-wise.
Attention Linux Devotees:
Keep counting the number of angels you can squeeze on a pinhead, instead of admitting all you have is a pinhead graphics system – a legacy that belongs in a crypt under stained glass windows…
Speaking of Windows: I’ve never had a Windows PC crash from simply resizing a window – no matter what the content…
Who owns 93% of the PC market?
FWIW: Windows doesn’t rule because people are too stupid to learn Linux; on the contrary, people have learned that Linux doesn’t work for them…
The Mac can win converts from Windows… anybody on the Linux side gonna argue that point??
Why? The Mac works.
I can’t believe these guys here don’t post anything about the future of X or Wayland.
http://planet.freedesktop.org/
All I see is their douchebag faces, which I don’t want to see. Show me something about Wayland or the future of X or don’t post stuff at all.
Edited 2009-08-17 20:07 UTC
A few years ago I installed Ubuntu Feisty Fawn on my ThinkPad Z60m. It worked quite well and I was happy with it. I even got the S-Video output working to some extent, so I could watch videos on my TV.
Now after a few upgrades, it is totally broken. The simplest things will cause X to totally crash (usually video related.) Sometimes I can get to a console to restart GDM, but many times I cannot. Generally at that point I reboot into Windows for a while.
You guys can blame “bad drivers” all day (and you are probably right), but it is REALLY ANNOYING to have something that worked fine before get totally broken.
Besides X breaking I have some other complaints:
– PulseAudio is complete crap and really screwed up my system too. I disabled it so now only one program at a time can send output to the sound card. This is totally stupid. As others have said in previous comments here, Linux audio is really shitty too.
– Multiple monitor support is non-existent or completely horrible in X. It is ridiculous that in 2009 this requires extensive and complicated editing of xorg.conf instead of a nice GUI. Also what worked for me before stopped working.
– Adjusting screen size in a GUI is still something that only sometimes works. Again this is crazy in 2009.
– Several times updating Ubuntu has broken my X config. This is mainly an Ubuntu problem I would think, but it still sucks. Ubuntu has jumped the shark in my opinion.
I learned to hate Windows because of the crashes and instability. Now my XP partition is where I go for stability after my Ubuntu install crashes. I wish it weren’t this way; I think that really sucks.
Of course, I think all these problems are pretty specific to that ThinkPad, and I might be just fine on newer or better supported hardware. I will say that overall my Dell desktop that came with Ubuntu has been fine. I need to start using it more, but laptops are handy.
I’ll repeat what I said elsewhere:
UBUNTU IS NOT LINUX!
Canonical is known for crappy bugs!
Try openSUSE, it’s much more solid. Or even Mint, I’ve heard, which is based on Ubuntu, but is more solid and easier to use.
People have to stop thinking that a crappy OS like Ubuntu is all there is to Linux!
Seeing all the rage people are defending X.org, I wished to share some technical aspects.
– Xlib, the library used to communicate with X, is not thread-safe and not coder-friendly.
Coders have to deal with unnecessary complexity in order to create stable, non-freezing UIs.
– Features like video acceleration or 3D are added by modules (plugins).
A graphics stack needs to be simple to use/configure/code for. With X.org, a lot of code has to be written when a ‘now standard’ feature like compositing is not available.
– X’s design is based on X11, released in 1987!
It was never designed for today’s graphics cards, never designed for 3D, effects, and video. All these common features were added with a lot of effort, pain, bugs, and hacks, and that introduced bugs and performance losses.
So you will ask: why didn’t they recode it from scratch as a unified, modern stack that would reduce the code size by a factor of 10 and solve all the known issues?
The reason is simple: every Linux app with a UI just RELIES on X behaviours/bugs/hacks (through Xlib/Qt/GTK), and nobody has enough money to spend on redeveloping all the GUIs of all the apps.
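To make the thread-safety point concrete, here is a minimal sketch of the ceremony Xlib demands before any multithreaded use. This is a sketch against the standard Xlib API (it needs libX11 and a running X server to actually do anything), not a complete program:

```c
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    /* XInitThreads() must be called before ANY other Xlib function;
     * forget it, and concurrent Xlib calls from different threads
     * silently corrupt the library's internal state. */
    if (!XInitThreads()) {
        fprintf(stderr, "Xlib: thread support unavailable\n");
        return 1;
    }

    Display *dpy = XOpenDisplay(NULL);  /* uses the DISPLAY env var */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    /* Even then, each thread must bracket its Xlib usage with
     * XLockDisplay()/XUnlockDisplay() to serialise access. */
    XLockDisplay(dpy);
    /* ... Xlib calls here ... */
    XUnlockDisplay(dpy);

    XCloseDisplay(dpy);
    return 0;
}
```

Toolkits like Qt and GTK exist in large part precisely to hide this kind of boilerplate and the locking discipline behind it, which is the point the comment above is making.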
Sorry, but this time Thom is right
You are blurring many lines in your post, not the least of which is the fact that the architecture of a graphics system has no inherent relation to potential crashes – no architecture, by itself, produces any particular kind of crash.
Assuming that the basic principle is sound (i.e. that you don’t write software that is designed to crash, to put it anthropomorphically), crashes should not occur either way, and the basic setup of X.org *is* sound. There might be room for improvement of the architecture per se, but even as it is now, it’s not exactly “constructed to crash”. The problem merely lies in bugs in the server code and, mostly, in the drivers.
The fact that on Windows 6.x (is that the line of Vista and 7? Well those are what I mean) the graphics subsystem can recover even in the case of a graphics driver crash (it sounds doubtful to me that exactly *this* is actually the case.. I’d rather expect a BSOD from a crashed graphics driver, but maybe it’s actually true) is neat, but has nothing to do with the fact that crashes still can occur. So can they with X.org. Or not. Or not on Windows 7 either.
So my grief with your post is that you’re either implying a fix in the wrong place (i.e. making the “inner hull” of X.org “nukeproof”, so that when something blows up inside, it stays inside – which is not what needs to be done and is just a workaround, maybe a neat one, but systematically still the wrong one), or you’re simply being incoherent within your post, in that you first say crashes need to be fixed, but towards the second half you move to the argument that crashes should be contained.
Also, this post, instead of being about X.org, really runs down to the old argument of “the theory of perfection of F/OSS” versus “the real-world adherence of closed-source software”. So all I read from your post is that real-world-adhering closed-source software is better than F/OSS, which always tries to be perfect but never achieves it. A worthy debate on its own, but not worth packaging and disguising within a post supposedly about the instability of X.org.
Edited 2009-08-18 01:04 UTC