Today we feature a very interesting interview with Havoc Pennington. Havoc works for Red Hat, where he heads the desktop team, and is also well known for his major contributions to GNOME, his GTK+ programming book, and the freedesktop.org initiative, which aims to standardize the X11 desktop environments. In the following interview we discuss the changes inside Red Hat, Xouvert, freedesktop.org, GNOME’s future, and how Linux in general is doing in the desktop market.
1. Looking at Red Hat’s recent press releases and web site lately, there is a new, stronger effort to shift focus further into the Enterprise and leave Red Hat Linux in the hands of the community for the home/desktop market. This seems to leave a “hole” in Red Hat’s previous target, the corporate desktop market. The new Red Hat Linux might sound like “power to the people”, but to me it sounds like a move that will have consequences (good and bad) for the quality, testing, and development of what we have come to know as your corporate/desktop product. Given that Red Hat is the No. 1 Linux distribution on the planet, do you think this new direction will slow down Linux penetration into the desktop market?
Havoc Pennington: In my view it’s a mistake to create an “Enterprise vs. Desktop” contrast; these are largely separate dimensions. There are enterprise desktops, enterprise servers, consumer desktops, and consumer servers. Quite possibly small business desktops and servers are another category in between.
I don’t think we’ll see a slowdown in Linux penetration into the desktop market. In fact I hope to see it speed up. Today there are many large software companies making investments in the Linux desktop.
2. How have things changed internally after the [further] focus shift to Enterprise? Is your desktop team still fully working on Gnome/GTK+/X/etc or have developers been pulled into other projects that are more in line with this new focus at Red Hat?
Havoc Pennington: We’re still working on the desktop, more so than ever. (Including applications such as Mozilla, OpenOffice, and Evolution, not just the base environment.)
3. In the past (pre-SCO), Red Hat admitted that it was growing wary of patent issues that might arise in the future. Do you believe that desktop open source software written by many different individuals around the globe might, in some cases, be infringing on patents without the developers’ knowledge? After all, some patents have been issued so shortsightedly that many say writing software is almost impossible nowadays. What kind of solution might OSS developers find for this issue, to ensure a future that is not stricken by lawsuits left and right?
Havoc Pennington: As you know we’ve been more aggressive than other Linux vendors about removing potentially patented software from our distribution, specifically we took a lot of criticism for removing mp3 support.
One strategy for helping defend the open source community is to create defensive patents, as described here.
Another strategy is the one taken by Lawrence Rosen in the Academic Free License and Open Software License.
These licenses contain a “Termination for Patent Action” clause that’s an interesting approach.
Political lobbying and education can’t hurt either. These efforts become stronger as more people rely upon open source software.
4. What major new features are scheduled for GTK+ 2.4/2.6 and for the future in general? You once started a C++ wrapper for GTK+, but the project later went dormant. Do you believe that GNOME needs a C++ option, and if so, do you believe that Gtkmm is a good one? Are there plans to sync GTK+ and Gtkmm more often and include it by default in GNOME releases?
Havoc Pennington: GTK+ 2.4 and 2.6 plans are pretty well described here.
One theme of these releases is to make GTK+ cover all the GUI functionality historically provided by libgnomeui. So there will be a single clear GUI API, rather than “plain GTK+” and “GNOME libs” – at that point being a “GNOME application” is really just a matter of whether you follow the GNOME user interface guidelines, rather than an issue of which libs you link to. This cuts down on bloat and developer confusion.
The main user-visible change in 2.4 is of course the new file selector.
The other user-visible effects of 2.4 and 2.6 will mostly be small tweaks and improved consistency between applications as they use the new standard widgets.
At some point we’ll support Cairo which should allow for some nice themes. Cairo also covers printing.
Regarding C++, honestly I’m not qualified to comment on the current state of gtkmm, because I haven’t evaluated it in some time. I do think a C++ option is important. There are two huge wins I’d consider even more important for your average one-off in-house simple GUI app though. 1) to use a language such as Python, Java, C#, Visual Basic, or whatever with automatic memory management, high-level library functions, and so forth; 2) use a user interface builder such as Glade. Both of those will save you more time than the difference between a C and a C++ UI toolkit.
5. What do you think of the XFree86 fork, Xouvert? Do you support the fork, and if so, what exactly do you want to see changed with Xouvert (feature-wise and architecture-wise for X)?
Havoc Pennington: The huge architectural effort I want to see in the X server is to move to saving all the window contents and using the 3D engine of the graphics cards, allowing transparency, faster redraws, nice visual effects, and thumbnailing/magnification, for example.
The trick is that there are *very* few people in the world with the qualifications to architect this change. I don’t know if the Xouvert guys have the necessary knowledge, but if they do that would be interesting. It may well be that no single person understands how to do this right; we may need a collaboration between toolkit people, X protocol people, and 3D hardware experts.
Aside from that, most of the changes to X I’d like to see aren’t really to the window system. Instead, I’d like us to think of the problem as building a base desktop platform. This platform would include a lot of things currently in the X tarball, a lot of things currently on freedesktop.org, and a lot of things that GNOME and KDE and GTK+ and Qt are doing independently. You can think of it as implementing the common backend or framework that GUI toolkits and applications are ported to when they’re ported to Linux.
This may be of interest. If we can negotiate the scary political waters, I’d like to see the various X projects, freedesktop.org, and the desktop environments and applications work together on a single base desktop platform project. With the new freedesktop.org server I’m trying to encourage such a thing.
6. How are things with freedesktop.org; what is its status? Do these standards get implemented in KDE and GNOME, or do they meet resistance from hardcore devs on either project? When do you think KDE and GNOME will reach a good level of interoperability as defined by freedesktop.org? What work has been done so far?
Havoc Pennington: freedesktop.org is going pretty well; I recently posted about the status of the hosting move (see here). I also had a lot of fun at the KDE conference in Nove Hrady and really enjoyed meeting a lot of quality developers I hadn’t met before.
I find that hardcore devs understand the importance of what we’re trying to do, though they also understand the difficulty of changing huge codebases such as Mozilla, OpenOffice, GNOME, or KDE so are understandably careful.
There are people who think of things in “GNOME vs. KDE” terms but in general the people who’ve invested the most time are interested in the bigger picture of open source vs. proprietary, Linux vs. Microsoft, and democratizing access to software.
Of course everyone has their favorite technologies – I think GNOME is great and have a lot of investment in it, and I also like Emacs and Amazon.com and Red Hat Linux. These preferences change over time. When it comes down to it the reason I’m here is larger than any particular technology.
As to when freedesktop.org will achieve interoperability, keep in mind that currently any app will run with any desktop. The issue is more sustaining that fact as the desktop platforms add new bells and whistles; and factoring new features down into the base desktop platform so that apps are properly integrated into any desktop. So it’s a process that I don’t think will ever end. There are always new features and those will tend to be tried out in several apps or desktops before they get spec’d out and documented on the freedesktop.org level.
7. Gnome 2.4 was released last week. Are you satisfied with the development progress of Gnome? What major features/changes do you want to see in Gnome in the next couple of years?
Havoc Pennington: I’m extremely satisfied with GNOME’s progress. Time-based releases (see here for the long definition) are the smartest thing a free software project can do.
This mail has some of my thoughts on what we need to add.
Honestly though the major missing bits of the Linux desktop are not on the GNOME/KDE level anymore. The desktop environments can be endlessly tweaked but they are pretty usable already.
We need to be looking at issues that span and integrate the large desktop projects – WINE, Mozilla, OpenOffice, Evolution on top of the desktops, X below them. And integrate all of them with the operating system.
Some of the other major problems, as explained here, have “slipped through the cracks” in that they don’t clearly fall under the charter of any of the existing large projects.
And of course manageability, administration, security, and application features.
8. Your fellow Red Hat engineer Mike Harris said recently that “There will be a time and a place for Linux on the home desktop. When and where it will be, and whether it will be something that can turn a profit remains to be seen. When Red Hat believes it may be a viable market to enter, then I’m sure we will. Personally, in my own opinion, I don’t think it will be viable for at least 1.5 – 2 years minimum.” Do you agree with this time frame, and if so, what parts exactly need to be “fixed/changed” in the whole Linux universe (technical or not) before Linux becomes viable for the home/desktop market?
Havoc Pennington: I wouldn’t try to guess the timeframe exactly. My guess would be something like “0 to 7 years” 😉
On the technology side, we need some improvements to robustness, to hardware handling, to usability.
However the consumer barriers have a lot to do with consumer ISV and IHV support. And you aren’t going to get that until you can point to some desktop marketshare. That’s why you can’t bootstrap the Linux desktop by targeting consumers. You need to get some initial marketshare elsewhere.
There’s also the business issue that targeting consumers involves very expensive mass market advertising.
9. Have you had a look at the Mac OS X 10.3 Panther previews? Apple is introducing some new widgets, like the new tabs that look like buttons instead of tabs, and there is of course Exposé, which, by utilizing the GL-based Quartz Extreme, offers new usability enhancements plus cool and modern eye-candy. Do you think that X with GTK+/GNOME will be able to deliver such innovations in a timely manner, or will it take some years before we see them on a common Linux desktop?
Havoc Pennington: I haven’t tried Panther, though I saw some screenshots and articles.
As I mentioned earlier, the big X server feature I think we need is to move to this kind of 3D-based architecture. If we got the right 2 or 3 people working on it today, we could have demoware in a few months and something usable in a couple of years. I’m just making up those numbers of course.
However, nobody can predict when the right 2 or 3 people will start to work on it. As always in free software, the answer to “when will this be done?” is “faster if you help.”
One stepping stone is to create a robust base desktop platform project where these people could do their work, and some of us are working hard on that task.
10. How do you see the Linux and Unix landscape today? Do you feel that Linux is replacing Unix slowly but steadily, or do they follow parallel and different directions in your opinion?
Havoc Pennington: I would say that the nails are firmly in the UNIX coffin, and it’s just a matter of time.
What I find good is that he has a vision, which people may or may not agree with. I wholly agree with time-based releases. I think it is a very good thing for a software project to be able to make promises and keep them, like “we will have a desktop ready in 6 months.” Right now, Red Hat was able to delay its release a little because they knew GNOME 2.4 would be there on time. It makes things like release planning much easier for software aggregators like Red Hat and other distros.
3D in Linux, or X more specifically, is sorely needed. We have OpenGL, and this should be leveraged much like Apple does with OS X. I think all the building blocks are there.
I also agree somewhat that if you can’t nail the enterprise desktop, which is what Red Hat is after, the home market will never come. I feel sorry for the others who will have to tend the desktop market until companies like Red Hat come back to it later. Mandrake probably realises this right now too. I want to see Linux in the workplace, and then hardware manufacturers will sit up and take notice.
“3D in Linux, or X more specifically, is sorely needed. We have OpenGL, and this should be leveraged much like Apple does with OS X. I think all the building blocks are there.”
I couldn’t agree more. This would be a great thing for Linux and the BSDs to have, and it pains me that there is not more emphasis on it.
This isn’t really a matter of the Linux/X developers having a choice. This all depends on the amount of cooperation OSS/FS developers get from video hardware manufacturers, since 3d acceleration is rather pitiful at this point. *Nothing* can be done without this cooperation.
That is true for drivers for specific cards, but what we have now can be modified to provide a better framework on which to build.
I realize that my next example is a terrible hack, but TransluXent is a pretty snazzy starting point.
2D rendering support is still not optimal on Linux, and here we are talking about 3D support and technologies. What is even more astonishing is that over 95% of the graphics we use are rendered in 2D. Has the Unix environment reached optimal performance in the 2D arena? Let’s learn to crawl first, before walking.
-Mystilleef
What’s wrong with 2d rendering in Unix? X11 is much better than anything Windows has to offer.
“Let’s learn to crawl first, before walking.”
Is it not possible that by attacking the 3D problem that we might at the same time be improving the 2D situation? I realize that both have separate issues, but I see no reason not to parallelize our communities’ development strategy…
>X11 is much better than anything Windows has to offer.
I am sorry but this is not really true. Not all cards support overlay on X, for example.
I was talking about the general technology, not specific driver issues.
In addition to what Eugenia said, not all cards support 2D acceleration under Linux. In fact, the card I’m using at the moment doesn’t. Ah, I’ve given up on 3D acceleration on this card. In the Linux sphere, your best bet for having anything relatively optimal with regard to 2D or 3D acceleration is investing in an Nvidia or ATI card. And even then complete support for all features available on the card is not guaranteed. You’d be lucky if the drivers don’t screw up your whole system too. No, I’m not exaggerating, it’s really pitiful.
-Mystilleef
3D acceleration can come before 2D. As Windows has shown people use the 3D routines to draw 2D, and they tend to be faster because the 3D is more optimised.
“Investing” in an ATI or nVidia card is done by most users simply because that’s what their machine is sold with. It will not screw up your entire system – whatever that’s supposed to mean.
Is it not possible that by attacking the 3D problem that we might at the same time be improving the 2D situation? I realize that both have separate issues, but I see no reason not to parallelize our communities’ development strategy…
It is possible. But it is equally important to get our priorities straight. More than 95% of the activities we do with our graphics cards involve 2D rendering. Much too often unnecessary attention is paid to *nix’s substandard 3D infrastructure, while its 2D shortcomings are overlooked. The fact is that a lot more people need a complete 2D rendering and support infrastructure for a multitude of video cards than need 3D rendering.
The proportion of users that need or rely on 3D rendering is a niche group, and they can most often afford to sponsor, write, or pay developers to write customized 3D drivers. I’d love to see as much effort put into 2D and 3D infrastructures under Unix that support a wide variety of cards. But we need to emphasize that 2D is more important than 3D for the majority of us – in fact all of us – unless you are a gamer, an artist, an architect, or an engineer who uses CAD apps.
-Mystilleef
It’s something I’ll think about anyway, as is Another Matthew’s comment.
“Investing” in an ATI or nVidia card is done by most users simply because that’s what their machine is sold with. It will not screw up your entire system – whatever that’s supposed to mean.
There have been instances where Nvidia drivers were locking up users’ desktops and crashing X, not to mention other peculiar activities. I can’t comment on your 2D/3D statement, because I really didn’t understand it. I have never heard of people developing 3D infrastructures before 2D, but again, I’m not a graphics expert.
-Mystilleef
Seriously, the movement to 3D has already started. Linux does not have to go through the motions that every other OS has gone through. Some things can be skipped, or priorities reorganised. When Windows releases Longhorn, or whatever it will be called, it will probably have a 3D compositing engine to draw the desktop like OS X. And will Linux still be trying to get the 5% 2D stuff right before we move on? I think people need to start to ‘care less’ for people with 386s as a development target for desktop products. A GeForce4 MX can now be had for what, $40? And the prices are only coming down, or the hardware is getting better.
The other big problem is that there is no ‘one way to do it’. The reason 2D and 3D support is iffy is because there are no nice standardised and fairly high level interfaces for it. As long as each driver manufacturer tries to implement their own features, then stagnation is all that will happen. Look at DirectX on MS. Every card worth speaking about provides drivers that work with it.
We need to start looking forward. The number of people with old computers gets smaller by the day, and the number buying new higher-end systems gets bigger. I think it is now pretty much impossible to buy a processor slower than 1.2 GHz, as an example. Soon, you will only be able to buy a 2 GHz processor, and we will still be providing Pentium 400 MHz-era graphics.
> Seriously, the movement to 3D has already started. Linux does not have to go through the motions that every other OS has gone through. Some things can be skipped, or priorities reorganised. When Windows releases Longhorn, or whatever it will be called, it will probably have a 3D compositing engine to draw the desktop like OS X.
We don’t follow Windows’ footsteps, we make our own. These new 3D technologies are redundant, resource-hungry features that are not useful on the desktop. The desktop is going to be predominantly 2D for a good while.
> And will Linux still be trying to get the 5% 2D stuff right before we move on? I think people need to start to ‘care less’ for people with 386s as a development target for desktop products.
Why should we care less about other people to satisfy your desire? Other people are humans too, you know. I suppose the other people you are talking about are developing nations who can’t afford to purchase 1Ghz machines, or individuals and other entities who see beyond IT/media hype.
Not everyone can afford to upgrade every year to play the upcoming version of DOOM or Half-Life. Not everyone thinks it is necessary to upgrade to browse the internet, email, play chess or chat on IRC. When did Linux become the operating system for the rich Mac user? When did Linux begin to force users to upgrade their hardware to use it?
> A GeForce4 MX can now be had for what, $40? And the prices are only coming down, or the hardware is getting better.
Some individuals make less than that in a month.
> The other big problem is that there is no ‘one way to do it’. The reason 2D and 3D support is iffy is because there are no nice standardised and fairly high level interfaces for it. As long as each driver manufacturer tries to implement their own features, then stagnation is all that will happen. Look at DirectX on MS. Every card worth speaking about provides drivers that work with it.
I agree with you here. That and the fact Linux is yet to be acknowledged by a handful of hardware vendors.
> We need to start looking forward. The number of people with old computers gets smaller by the day, and the number buying new higher-end systems gets bigger. I think it is now pretty much impossible to buy a processor slower than 1.2 GHz, as an example. Soon, you will only be able to buy a 2 GHz processor, and we will still be providing Pentium 400 MHz-era graphics.
I don’t know where you get your statistics from, but old computers are still used widely around the globe. Only those falling for media hype, those who really need the extra CPU power, or enthusiasts like gamers are upgrading. Go to any developing nation and have a look at the computers they are using. If you see more than 20 2 GHz computers in a building, I’ll buy you a beer. Heck, even NASA uses 486-based Intel CPUs, let alone your average poor student.
3D desktops don’t need 3d hardware. Remember folks, we played Doom on 486 DX2/33s…
>The other big problem is that there is no ‘one way to do it’. The reason 2D and 3D support is iffy is because there are no nice standardised and fairly high level interfaces for it. As long as each driver manufacturer tries to implement their own features, then stagnation is all that will happen. Look at DirectX on MS. Every card worth speaking about provides drivers that work with it.
There is one. It’s called OpenGL. It’s extremely powerful, etc. etc. Does that make our lives any easier? (I think you meant ‘low-level’, not high-level. You still need a low-level kernel layer, whatever nifty features you have).
As for driver manufacturers ‘implementing their own features’, that’s what the difference between chipsets consists of. It’s why a GeForce FX is not functionally equivalent to an S3 Virge. There is a standardized low-level interface for video cards — VESA (and VESA 2). However, these don’t provide for 3d acceleration. VESA 3 is a better version of this standard, but it hasn’t made much headway.
I repeat, the ONLY way to get respectable video performance on Free OSes is IHV support.
are you daft? you do not understand what this tech is doing.
it utilizes the 95% of the GFX card that is not used during normal computing.
so by using the GPU for drawing the desktop, you are offloading a ton of crap from the CPU. And the power of GFX cards nowadays, with their memories (128 MB!), makes them perfect for this job.
“3D desktops don’t need 3d hardware. Remember folks, we played Doom on 486 DX2/33s…”
Yeah, and Doom was 2D. Doom used sprites.
I agree that Linux should be a serious option for developing nations and their citizens, but at the same time it needs to keep up with the competition. You can’t run OpenOffice, Mozilla or GNOME all that comfortably on a machine under 600 MHz anyway. Yet IceWM, Siag Office and Dillo or Links-Graphic will always be there for below that mark.
That’s strange. I run OOo, KDE, and GNOME on a Celeron433 quite comfortably.
Mystilleef, please use the > symbol and italics instead of bold when replying. Thanks.
and by that, I don’t give a damn about 3D. If I was building a decent machine tomorrow, which 2D card would give me the best driver support?
It is most probably Nvidia. I picked them because the Linux drivers for their cards support all the cards’ 2D features. I also think that for Linux they have just one driver that supports all their cards. And lastly, it is updated fairly regularly. I’m referring to their closed-source/proprietary drivers. I understand the open source version of the driver is lacking.
<sarcasm>Think you could add a few more pictures of him?</sarcasm>
lol yea. i was like wth?
As I was reading the interview I was thinking the same thing
Havoc you should give some modelling pictures to Eugenia next time where you do Zoolander things.
Honestly I think Eugenia did it for those people who scream, give us the screenshots. Only this time it was portraits.
About the file selector mentioned on page one: I think it should say it will be included in GTK+ 2.4. At first I thought GNOME 2.4, but I think I heard it would be introduced in GNOME 2.6.
Also, I could tell from reading the article that the comments would be centered around X 3D stuff, and you want to know why? Because it needs to be improved, and the only 2-3 people that will be able to do it are the people already working on the XFree86 project. Last I heard they were knowledgeable in all things X.
Another thought: what about game developers? I hope they would be interested in having a look into it.
About the last remark in this interview:
It’s very strange that Linux developers see UNIX as dead. I don’t think Linux will ever have the scalability of Solaris or the stability of the BSDs. I remember that several years ago, the Linux community was thinking the same about MS products. But probably this remark is just more hype. And God knows we’ve gotten used to this coming from the Linux (especially Red Hat) community. Too bad Linux hasn’t clearly defined its share of the market (maybe except the ‘I look cool if I don’t use Windows’ guys) and is thinking about the death of other OSes to settle into the market.
I agree that Linux doesn’t provide the stability of Solaris – yet. (I’m not qualified to comment on the BSD statement, since I’ve yet to see a crash on Linux that wasn’t hardware- or nvidia-driver-related.)
As for the “Too bad Linux hasn’t clearly defined its share of the market…” comment, it seems that you don’t ‘get’ Linux. Linux is whatever a vendor or end user wants it to be, provided they, or someone else (paid or otherwise), is willing to code it.
That’s it. There’s no magic ‘market’ for Linux. It’s everything from embedded devices to desktops to servers to routers to clusters. I use Linux because it does what I need and want in an OS. I don’t use Windows because it fails to provide what I need and want.
Doom ran in about 320×240 with 8bit colour.
It wasn’t until we got SIMD (MMX) that we could move past 8-bit colour, because it took too much time to move colour blocks around on the x86. Most of this moving was relegated to the video card if it could do it. This is why back then it really paid to have a good driver for your video card, and video acceleration was important. Also, with the ISA bus, things moved really slowly. You wouldn’t want a 1024×768 screen on ISA.
Today your screen is at least 1024×768 with 24- or 32-bit colour, and it requires a lot of processing to move all of that data around, especially if you use special effects such as alpha blending. The x86 series processors are not designed for such a job, but video card processors are. Now you could write a good 2D card driver for every video card, but it is hard because there is no standard.
OpenGL is a standard and you can use it to do 2D. So just write all your stuff for OpenGL and use the supplied OpenGL driver that comes with the video card. This way any card that has OpenGL support can do complex 2D graphics without having to send some of it back to the x86 processor because of lack of support.
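To make that concrete, here is a minimal sketch (my own illustration, not from the comment above) of doing 2D through OpenGL: set up a pixel-aligned orthographic projection and let the card blend a translucent rectangle in hardware. The function name and the rectangle coordinates are made up; the calls themselves are standard fixed-function OpenGL.

#include <GL/gl.h>

/* Draw a translucent 2D rectangle using the 3D pipeline.
   Assumes a GL context is already current for the window. */
void draw_2d_overlay(int win_w, int win_h)
{
    /* Map GL coordinates 1:1 onto window pixels, with y growing
       downward like a typical 2D API. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, (double) win_w, (double) win_h, 0.0, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    /* Alpha blending is done by the GPU instead of the CPU. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glColor4f(0.0f, 0.0f, 1.0f, 0.5f);   /* 50% translucent blue */
    glBegin(GL_QUADS);
    glVertex2f(10.0f, 10.0f);
    glVertex2f(200.0f, 10.0f);
    glVertex2f(200.0f, 120.0f);
    glVertex2f(10.0f, 120.0f);
    glEnd();
}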
I have been reading this Interview and partially agree with Havoc here. The 1st part of the Interview contains this paragraph:
“One theme of these releases is to make GTK+ cover all the GUI functionality historically provided by libgnomeui. So there will be a single clear GUI API, rather than “plain GTK+” and “GNOME libs” – at that point being a “GNOME application” is really just a matter of whether you follow the GNOME user interface guidelines, rather than an issue of which libs you link to. This cuts down on bloat and developer confusion.”
He is absolutely right about that, but sadly only he knows what to use without getting confused. I see a lot of people showing up in the channels not knowing exactly whether they should use a GTK-App, GNOME-App or Bonobo-App window for their stuff; they all differ slightly, and the last-named ones inherit BonoboUI components. But they do differ in the technical details of how things are done. This can really be a complicated thing, and even experienced programmers trap themselves because they don’t know exactly what happens next. They once started their project and decided to use the BonoboUI component for writing their applications, and now they see new toolbar and menu code (and more stuff) showing up in GTK+, and right now it’s unclear whether the developer needs to switch back to using a GTK-App rather than the other two. These are technically huge changes in the code. I was once told that libgnomeui and libbonoboui will use wrappers to point to the new GTK+ architecture, but again, wrappers are no long-term solution.
It would be pretty nice of Havoc Pennington (or anyone else who claims responsibility for GNOME’s roadmap, ignoring the fact that it’s community work, but that’s a different topic) to write a summary of good practices and bad practices, what to use and what to avoid. Then we would get the chance NOW to switch our applications onto the right track, rather than waiting until we hit the wall and then needing weeks or months to change the apps to fit the new roadmap (I’m not referring to API consistency here).
—————————
Another thing is the X11 issue, on which I agree. I was playing with DirectFB (framebuffer) for quite a few weeks myself and managed to get other people interested in that stuff as well (even from GNOME), and I think the best way for desktop users would be to make use of DirectFB. I know that on one side the X11 network layer will disappear, but on the other hand you benefit from a lot of things.
a) you get rid of X11 – ~150 MB of complicated and ancient stuff (as HP himself more or less said)
b) using DirectFB (GTK already supports it in its GDK layer) you more or less get rid of the console/X11 split and your system becomes one: you boot your Linux kernel and find yourself on your GNOME desktop in NO TIME. You can use your framebuffer, you don’t waste much memory loading X11, and switching from one framebuffer to another is easy. Maybe you know the old Amiga system, where there was no console; you operated on one concept where you had your desktop and windows.
The benefits are many:
– No loading of X
– getting rid of ~150 MB in favor of a ~200 KB framebuffer
– DirectFB supports AA fonts (because GTK uses Pango)
– you save a lot of memory, because you boot Linux straight into the framebuffer and still have all your memory left for applications, rather than having it sucked up by the X server
– speed through direct hardware communication.
– native transparency and shadow support and all the nice effects that Havoc has dreamt of.
And many more. You can read my comments about that on the DirectFB Mailinglist.
http://directfb.org/mailinglists/directfb-dev/2003/06-2003/msg00000…
Please refer to the 2 links inside it as well. I know DirectFB is in its infancy, but it’s heavily developed, and I believe this approach is a better one for the long term than hacking around in old X11 code that ‘no one really understands’.
I am referring here to DirectFB on the framebuffer layer, not the X11 drivers. And consider that many other systems already offer framebuffer support, some of them for longer than Linux.
I have Redhat 9, and framebuffer set to vga=791.
I have the above; can I load GNOME in the framebuffer out of the box, or do I have to re-compile GNOME to use the framebuffer?
Also what would the command be from the console to launch GNOME in the framebuffer instead of X?
I whole-heartedly agree that DirectFB is a great solution for moving past X11. I have been thinking about an OpenGL desktop on top of DFB, a la Quartz on Mac OS X, where every window is an OGL window. Damn Small Linux ( http://damnsmalllinux.org ), a live Linux distro, has it out of the box, and it’s fairly snappy even under VirtualPC on my slow Mac Cube.
A whole widget API written over OGL + DFB would be really damn sweet, IMO.
Why do you expect people to keep buying new hardware? What for? “Only $40” is cheap for people in countries in South America? Just because there are legions of immature and selfish geeks who upgrade every 6 months and want to have flashy screen fx and boast about their newfangled Radeon card and bang-bang game? What about the hundreds of millions of computers dumped into landfills? The needless pollution? Why should you need a 1 GHz computer just to type and browse the web? I’m very tired of this attitude towards computing. What about in 6 years’ time, when you can’t understand why anyone would still be using a 5 GHz computer? What happens to your 1 GHz computer then? Landfill? Wastefulness.
I’m sorry, m8, but he was right about the 3D desktop stuff being a waste of resources.
Using the GPU to composite is reasonable, that is true, but things start going downhill once you realise that you have to shunt a framebuffer down the AGP bus for every window you want to composite. Moreover, you have to keep a framebuffer-sized chunk of memory for each window (roughly 5 MB for a 1280×1024 window at 32-bit colour), which can very rapidly cripple a system with a limited amount of memory. It is for this reason that OS X does not favour serious users who like to keep many windows open at the same time.
So yes… Using the GPU to composite is a nice idea, but as you can see, that is counter-acted by a massive increase in the workload.
It looks even less ideal once you realise that OSX really only uses it for transparency and making the windows shrink and fade in and out.
If something worthwhile is done with Quartz, I might change my mind, but until then, it’s a nice technology being used for completely the wrong thing and in the process, making performance worse than it was before.
“I have the above; can I load GNOME in the framebuffer out of the box, or do I have to re-compile GNOME to use the framebuffer?”
Actually NO – or better, NOT yet. Most distributions ship their GNOME desktop compiled against the X11 libraries.
GTK, the widget set used by GNOME, has the advantage of supporting different GDK backends such as X11, DirectFB (which recently replaced the old framebuffer backend) and Win32. So what you need to do is:
a) enable FrameBuffer in the Kernel,
b) enable Fusion in the Kernel,
c) compile DirectFB,
d) compile GTK with the DirectFB backend.
… after this point you can compile nearly all native GTK+ applications and play with them, e.g. gtk-demo or the DirectFB tools themselves … simply boot Linux, log in, and then enter ‘./gtk-demo’ in the console.
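For illustration, a plain GTK+ 2 program like the minimal sketch below (my example, not from the post) needs no source changes to run on the DirectFB backend; the backend is chosen when GTK itself is compiled, not by the application:

#include <gtk/gtk.h>

int main(int argc, char *argv[])
{
    GtkWidget *window;

    /* Initialises whichever GDK backend GTK was built with
       (X11, DirectFB, Win32, ...). */
    gtk_init(&argc, &argv);

    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_window_set_title(GTK_WINDOW(window), "Hello, X11 or DirectFB");
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}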
I have tried to compile GNOME a bunch of times using DirectFB, but the developers don’t seem to handle the GDK backend selection correctly, and thus they often:
#include <gdk/gdkx11.h> rather than
#include <gdk/gdkfb.h>
I hacked around in the GNOME code for a while and replaced a bunch of these includes to proceed with compiling. This usually works perfectly, and I was able to compile 45 out of the 80 tarballs (which make up an entire GNOME desktop) without any problems. But there are a few libraries, such as startup-notification and libbonoboui, that make direct X11 calls (libbonoboui, for example, socket calls), which are hard to bypass. I wrote empty functions to wrap them, only to continue compiling GNOME.
So the status of GNOME is
a) developers do not take care to include the GDK backend correctly, e.g. through proper checks in the configure scripts.
b) they use a few direct X11 calls to achieve some easy goals.
What needs to be done is to fix a) and b). Point a) looks rather easy: it’s just an #ifdef in the code and a correct configure.in (.ac) check to determine which backend is used. After that, the few X11 library calls that are made need to be wrapped, or replacement libraries written, to use the DirectFB engine. It sounds a bit complicated and theoretical now, and may scare some people away, but once this is done you don’t notice anything complicated anymore, because compiling GNOME then becomes trivial – about the same effort as it takes right now.
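As a rough sketch of what fixing point a) could look like (HAVE_GDK_DIRECTFB is a made-up macro name here; a real configure test would define something similar), the hardcoded include would become:

/* Backend chosen by a configure check instead of being hardcoded: */
#ifdef HAVE_GDK_DIRECTFB
#include <gdk/gdkfb.h>      /* DirectFB backend */
#else
#include <gdk/gdkx11.h>     /* X11 backend, as today */
#endif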
“Also what would the command be from the console to launch GNOME in the framebuffer instead of X ?”
./gnome-session &
Probably. Dunno yet.
Try installing XDirectFB – basically a DirectFB-based X server.
> Try installing XDirectFB – basically a DirectFB-based X server.
The point is to get rid of X11. If I’m going to use XDirectFB then there are basically NO serious benefits, and it raises the question of whether using the native driver made for the GFX card wouldn’t be better in this case. But you don’t seem to have read the DirectFB points I raised here very carefully, or you would have figured out from my writing that I was strictly NOT referring to XDirectFB.
Doing exactly this would be overwhelming. It would add needless bloat to your system.
a) X11 as is
b) DirectFB + its drivers
c) xDirectFB as driver in X11
d) compiling GNOME to support it.
a) and c) are not wanted and shouldn’t even be considered. They’re nice things to play with and test stuff on, e.g. transparency, but that’s all. A true DirectFB solution is one sitting on top of the kernel without X11 at all.
wow he is cute 🙂
And.. what about the children!?!
Seriously. Nothing is stopping you from using it. These people are talking about making advanced features available for people with advanced hardware. There’s nothing that is stopping you from using your current setup. And Linux is known for supporting legacy hardware, so there is no problem. Why do you feel the need to upgrade anyway? If you want newer features, you have to buy newer hardware. If you want to use older hardware, you simply need to maintain your software.
>> A GeForce4 MX can now be had for what, $40? And the prices are only coming down, or the hardware is getting better.
>Some individuals make less than that in a month.
So perhaps Red Hat should release a “Third World edition” that is optimized for 386s and Trident 256k cards running in RGB mode.
>Havoc Pennington: As you know we’ve been more aggressive than other Linux vendors about removing potentially patented software from our distribution, specifically we took a lot of criticism for removing mp3 support.
Like I need a software company making moral decisions for me. You think I am going to re-rip my CDs so as to put some kind of funky Red Hat copy protection scheme on them? Red Hat must be high!
>Havoc Pennington: I would say that the nails are firmly in the UNIX coffin, and it’s just a matter of time.
Does he seriously believe this? Linux is enterprise quality only in the minds of the likes of Red Hat, SuSE and Microsoft-hating companies. UNIX will always be there; remember the UNIX killer called NT?
Yes, they will call it Debian Linux.
No, Redhat is protecting themselves, not you. There are legal issues with using the mp3 standard.
You can read more here:
http://www.mp3licensing.com/royalty/software.html
While other distributions have taken a more cavalier approach, Redhat is actually doing the right thing since they are not paying.
> Redhat is actually doing the right thing since they are not paying.
… but they’d like others to pay for their distribution and/or support.
“The point is to get rid of X11. If I’m going to use XDirectFB then there are basically NO serious benefits, and it raises the question of whether using the native driver made for the GFX card wouldn’t be better in this case. But you don’t seem to have read the DirectFB points I raised here very carefully, or you would have figured out from my writing that I was strictly NOT referring to XDirectFB.”
Speak for yourself. There are more than a few people who _need_ XF86’s network transparency, myself being one of them, and find the idea of getting rid of X11 absolutely idiotic. Just because your needs don’t include network transparency doesn’t mean that you have some sort of G-d-given right to tell everyone to axe it.
As usual, this is a case where people attack XF86 because someone told them it was bad. Sure, network transparency has a touch of overhead – but how much is too much? Do you really know where the bottlenecks are happening? What are _your_ credentials to tell me so?
To address your benefits:
“- No loading of X”
So what? Loading up X takes a hot 10 seconds on my P166MMX laptop. I find that loading GNOME takes a lot longer than loading X – let’s just go back to TWM so we get instant load times, right?
“- getting rid of ~150 MB in favor of a ~200 KB framebuffer”
I run X fine on my 96mb laptop. Stop confusing the issue with blatantly false claims of memory bloat.
“- DirectFB supports AA fonts (because GTK uses Pango)”
XF86 does, too. Ever tried RedHat 8 or 9?
“- you save a lot of memory, because you boot Linux straight into the framebuffer and still have all your memory left for applications, rather than having it sucked up by the X server”
And the magical framebuffer doesn’t take up memory, right? Have you ever considered that the only way I _can_ run some apps is via X? Let’s see your 32mb laptop run SAS on its own – the only way you’re going to do it is to use X’s network transparency to run it from _another computer_.
“- speed through direct hardware communication.”
XF86 can do this. It’s called RENDER and DRI.
“- native transparency and shadow support and all the nice effects that Havoc has dreamt of. ”
These can be coded into XF86. In fact, I’ve heard general talk indicating this will be true in the (possibly near) future.
XF86 is a mature, tested architecture. DirectFB is not. You make slightly more sense than your inane “KDE is the greatest” rants, but that’s not saying much.
-Erwos
>Using the GPU to composite is reasonable, that is true, but things
> start going downhill once you realise that you have to shunt a
> framebuffer down the AGP bus for every window you want to
> composite. Moreover, you have to keep a framebuffer-sized chunk
> of memory for each window, which very rapidly can cripple a system
> with a limited amount of memory. It is for this reason that OSX does
> not favour serious users who like to keep many windows open at the
> same time.
How many windows do I need to have open before I can feel serious about my computing? I’ve often suspected there was something wrong with my demeanor as I sat in front of the machine.
I accept the argument that graphics acceleration of the desktop requires better and more expensive hardware, and will not necessarily be desired by most of the world. I wouldn’t worry, though. Even if the U.S. market favored such acceleration, if the rest of the world didn’t want to follow they would just pursue another Linux distribution.
Even Mac OS X does not require acceleration. iBooks don’t have a supported graphics chip, so they fall back to software rendering.
But the argument that using hardware acceleration will hurt system performance is just plain wrong. You can run CPU monitor on a Mac and watch the load just drop away from the processors when Quartz Extreme is enabled. Are you that concerned that the AGP bus will have traffic on it? That’s what it’s there for. How will your system be faster with a bottlenecked processor and a quiet AGP?
Are you saying there is a memory penalty somewhere because without hardware acceleration your windows don’t need frame buffers?
oGALAXYo wrote:
> He is absolutely right about that, but sadly only he knows
> what to use without getting confused.
Dude, that’s the whole point. There will be 1 Menu/Toolbar API and all the others will be deprecated eventually. That’s good.
> now they see new toolbar and menu code (and more stuff)
> showing up in GTK+ and right now it’s unclear whether the
> developer needs to switch back to using a GTK-App
When the new API is available, you should use it.
Erwos, I agree completely. X is not that much of a problem. Armchair developers are too quick to cry that, boo hoo, OS X has transparent windows and we don’t, without looking at things more closely.
150 is the size of the SOURCE. Are you confusing source size with binary size?
Also, the funny thing is, DirectFB screws the people with nVidia cards–you either have DirectFB, or hardware acceleration, which is fine if you’re just trying DirectFB out, but gets MIGHTY annoying if you use it for your main system.
…. is a distinction that I’m pretty sure is very vague at the levels needed to catch up with MacOS.
Basically, video cards have on-board chips which can be used to accelerate common operations like scaling, rotating, filling polygons, transforming vertex data and so on.
Now, these operations are hardware specific, and take the form of opcodes to the chips. The purpose of the driver, then, is to take this low-level interface and wrap it into a higher-level one. Fortunately we have a standard for this in the 3D world; it’s called OpenGL, and it provides a very nice API and set of tools for doing 3D graphics.
I see some people here saying “We should use OpenGL for all our graphics!”. I think that’s the wrong approach.
Ultimately, what we want is an API that is designed for 2D graphics and makes that easy (because let me tell you OpenGL was not designed for rendering widgets and UIs, I’ve written an OpenGL based widget toolkit and it was Not Fun), but that still eventually is translated into hardware acceleration opcodes.
Does such an API exist? Yes, in fact it does. It’s called XRender, and was designed to map very well to modern hardware (both 2D and 3D acceleration engines). XRender is a new API, so it doesn’t have the wide driver or hardware support that OpenGL has yet, but that will start to come faster once KeithP has finished his test cases so driver authors know what they are doing is correct.
Theoretically, no matter what you use, whether it be XRender or OpenGL or Direct3D, the instructions to the card are basically similar. There’s no reason why a desktop drawn and composited using XAA and XRender should not be as fast as ones done via Quartz Extreme – in fact, they should even be faster as Quartz itself is not hardware accelerated, only Quartz Compositing is. With the necessary changes to the innards of XFree, there’s no reason why it couldn’t all be accelerated.
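For a feel of what XRender looks like from the client side, here is a small sketch of my own (error handling omitted) that fills a translucent rectangle on a window; PictOpOver is the same over-compositing operator being discussed here:

#include <X11/Xlib.h>
#include <X11/extensions/Xrender.h>

void fill_translucent_rect(Display *dpy, Window win, int x, int y,
                           unsigned int w, unsigned int h)
{
    /* Wrap the window in a Picture so RENDER can draw to it. */
    XRenderPictFormat *fmt = XRenderFindVisualFormat(dpy,
        DefaultVisual(dpy, DefaultScreen(dpy)));
    Picture dst = XRenderCreatePicture(dpy, win, fmt, 0, NULL);

    /* 50% translucent red; components are 16-bit, premultiplied. */
    XRenderColor red = { 0x8000, 0x0000, 0x0000, 0x8000 };

    /* Blend the source over whatever is already in the destination. */
    XRenderFillRectangle(dpy, PictOpOver, dst, &red, x, y, w, h);

    XRenderFreePicture(dpy, dst);
}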
So, in other words, don’t get distracted by 3D vs 2D, OpenGL vs X. They are all different sides of the same coin.
@Erwos
“myself being one of them, and find the idea of getting rid of X11 absolutely idiotic.”
I shouldn’t feed trolls, but this only reflects how much you DON’T KNOW. Thanks for enlightening me with your half-information.
I didn’t write that X11 should be removed from GNOME. I wrote that GNOME should support the GDK backends better, e.g. additionally support DirectFB. This also shows me that you didn’t spend time reading the link I supplied and the two sub-links that I suggested people read. I clearly described there that X11 support should NOT be removed.
GTK and GNOME should continue using X11, but I’d like to be able to use the DirectFB backend for GDK (GTK) as well, and to be able to compile GNOME for DirectFB.
A bit more uniform GDK backend usage would be nice so everyone can benefit from it. E.g. a good start would be to check with configure which GDK backend is used and add an #ifdef in the source files so either <gdk/gdkx11.h> or <gdk/gdkfb.h> is used. Doing this will NOT change anything for you. Writing wrapper libraries to replace the X11 stuff, for those who like to use DirectFB rather than X11, is no problem either, because YOU are not affected by it. For you X11 users nothing changes.
————————————————————
@Murray Cumming
“Dude, that’s the whole point. There will be 1 Menu/Toolbar API and all the others will be deprecated eventually. That’s good.”
**Dude**, I didn’t say anything different, and I’m already aware of this situation. I only want to stay on track with the new changes and how they interact with the changes in bonoboui and gnomeui. I know they all wrap back to GTK+, and it’s nice for you to know about it – what about others?
————————————————————
@Greg
“150 is the size of the SOURCE. Are you confusing source size with binary size ?”
cd /usr/X11R6/ && du -sh == ~148 mb X11R6 + Fonts
cd /mp/xc/ && du -sh == ~291 mb X11R6 sources
————————————————————
And to all three of you now: I think you make a lot of noise over nothing, as if someone were cutting the ground from under your feet. That was not my intention, and the flamage could easily have been avoided had my posts simply been read. It’s not as if you three are hearing for the first time what I’m trying to explain, because I’ve raised this point a couple of times in the GNOME channel and on the mailing lists. If you don’t understand what I write, or if it comes across differently from what I meant, then it would have been easy to speak up and bring it up normally, so that misunderstandings could be avoided in the first place.
Now you all come out in force defending X11 as if it were the ultimate solution for doing graphical things, while on the other hand you sign a petition saying X11 should change, for whatever reasons (see the petition). What does this tell us? It tells us that X11 is far from optimal, as the GNOME people themselves would have to admit. X11 is a big project; the code is nearly 300 megabytes. It supports many different architectures and different graphics cards, it still uses the ancient Imake build system, and much, much more. Even HP is right when he says that the Xouvert team (what a strange name) may have a hard time fixing all that stuff and implementing the things required in freedesktop.org’s requirements map. Now who will do the changes? I only brought up a far better solution worth thinking about. X11 will never change the way you would like to see it, not without powerful users: users who know the architecture and internals of X11 better than anything else, people who have developed around it for many years with deep knowledge. It’s not a simple hacker’s toy. We can have different opinions here (as we usually do), but I think that DirectFB (written by the same people who are behind GIMP and GTK+) makes good sense. While you make huge changes in GNOME, you really look scared and whine when it comes to changing the entire concept of displaying things.
I am not forcing anyone to change, but I can at least raise a point that is worth thinking about. DirectFB is a good concept and probably easier to handle than X11.
The technical issue which needs to be resolved is enabling full use of pbuffers from OpenGL or other DRI applications.
Work is being done on this, see http://www.mail-archive.com/[email protected]/msg1264…
By using pbuffers in the windowing server, be it XFree86’s X11 server, XDirectFB, or just the multi-app core in DirectFB, one can enable hardware-accelerated drawing (2D or 3D) to onscreen and offscreen textures (pbuffers). These can then be combined on screen using translucency. Combined with storing the window contents either in graphics card memory, or by saving them to main memory over the AGP bus, one can have a windowing system as responsive as OS X’s, without flickering due to applications having to respond to expose events on opaque window moves.
This is the approach taken by the TransluXent server as well, but the author didn’t have hardware support for pbuffers on his development machine, so he used software rendering to textures instead. Yet he was able to get translucent rendering going fairly well. See http://www.stud.uni-karlsruhe.de/~unk6/transluxent/
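To illustrate the compositing half of that idea, here is a rough sketch of my own (assuming each window’s contents have already been rendered offscreen into an OpenGL texture, e.g. via a pbuffer): the server draws each window as a translucent textured quad, so blending or moving windows never forces the application to repaint.

#include <GL/gl.h>

/* Composite one saved window onto the screen at the given position
   and opacity. Assumes an orthographic, pixel-aligned projection. */
void composite_window(GLuint window_tex, int x, int y, int w, int h,
                      float opacity)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, window_tex);

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(1.0f, 1.0f, 1.0f, opacity);   /* modulate texture by opacity */

    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2i(x,     y);
    glTexCoord2f(1.0f, 0.0f); glVertex2i(x + w, y);
    glTexCoord2f(1.0f, 1.0f); glVertex2i(x + w, y + h);
    glTexCoord2f(0.0f, 1.0f); glVertex2i(x,     y + h);
    glEnd();
}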
I think there are a large number of people who are trying to make a mountain out of a molehill, making out that X11 needs to be replaced with something.
Sure, I would love X11 to be replaced with something super-ultra-modern; heck, I am sure that almost EVERYONE would love that. However, the reality is that X11 isn’t as bad as it is made out to be.
A number of people have pointed out where things can easily be corrected: tweaking aspects of synchronisation so that tearing of windows does not occur, better support from companies in producing drivers that FULLY take advantage of the capabilities of the card, and extensions that push more work off onto the GPU. And as more people jump in and catch the “open source bug”, the mindshare of contributors will increase.
I stress again, X11 isn’t the problem. There are numerous examples out there, such as SGI’s X11 and X-Accelerate, which prove that the cause of the slowdown isn’t X11 itself but the actual implementation.
As for widget sets, both Qt and GTK support OpenGL extensions, meaning there is no reason to say the “stuff” isn’t there to make an “accelerated GUI”.
“In the past (pre-SCO), Red Hat has admitted that was growing awry of patent issues that they might arise in the future.”
Huh?
is this a new thing on osnews? having so many pictures of the interviewee in an article?
cmon, one picture is good enough, i almost wanted to puke having to look at another picture of this guy on every damn page…
anyway, i agree with having a unified, standardized baseline for creating desktop environments. i hope freedesktop.org succeeds in creating these standards.
“Linux has its Nails on UNIX’s Coffin”
If you read the article, you’ll notice that Havoc never said that. It is Eugenia who assigned the nails to Linux. I guess SCO people would like to use that statement (in Eugenia’s rendition) if it actually came from a distinguished GNU developer.
Sorry, E., maybe some other time.
Mystilleef wrote:
“We don’t follow Windows’ footsteps, we make our own. These new 3D technologies are redundant, resource-hungry features that are not useful on the desktop. The desktop is going to be predominantly 2D for a good while.”
“Why should we care less about other people to satisfy your desire? Other people are humans too, you know. I suppose the other people you are talking about are developing nations who can’t afford to purchase 1Ghz machines, or individuals and other entities who see beyond IT/media hype.
Not everyone can afford to upgrade every year to play the upcoming version of DOOM or Half-Life. Not everyone thinks it is necessary to upgrade to browse the internet, email, play chess or chat on IRC. When did Linux become the operating system for the rich Mac user? When did Linux begin to force users to upgrade their hardware to use it?”
“I don’t know where you get your statistics from, but old computers are still used widely around the globe. Only those falling for media hype, those who really need the extra CPU power, or enthusiasts like gamers are upgrading. Go to any developing nation and have a look at the computers they are using. If you see more than 20 2 GHz computers in a building, I’ll buy you a beer. Heck, even NASA uses 486-based Intel CPUs, let alone your average poor student.”
<end of cogent argument>
************************************************************
Bravo! You hit the nail right on the head.
Most Americans (those living in the northern hemisphere) live in Disney World. Their perception of world affairs is distorted, and they have little knowledge of the situation of Americans in the southern hemisphere (read: Latin America), much less Africa, Asia, Eastern Europe, and the rest of the Third World.
In my neighborhood, I am one of the fortunate ones who owns a PII 266 with a CD writer. To me, a Realtek NIC card (made in China) is perfect; a 16 MB Riva (Nvidia) is more than enough, thank you very much; IceWM is the non plus ultra: light and fast. Debian/Sid is my deluxe distro, and I give away Knoppix, Morphix and DSL to anybody who wants to try Linux for the first time. Most people down here use Pentium 100s and PII 200s. We don’t waste our time with silly games. In these times, when people are eating out of garbage cans thanks to globalization, games, Aqua, and eye-candy mean diddly to us.
Regards,
Mario
With UNIX dying, it will (hopefully) mean it is replaced with something better. Havoc did not say BSD will die, but UNIX, which refers to something like SCO UNIX or the other commercial Unices.
Linux has the momentum to take it right past UNIX right now. The other very good thing is that it is quite vendor neutral. You have adoption by IBM, Sun, Oracle and other heavy hitters. Commercial UNIX will die because it will no longer deliver value.
Also, projects like GNOME do not, and IMO should not, cater (or even try to cater) to the entire user base. People who need lightweight GUIs should just use a lightweight system. I am not rich myself, and I do understand that computers should not be a preserve of the rich, but we cannot expect people with 3 GHz computers to run the same software as someone with a P133. People buy computers to be able to use their power. That is the good thing about Linux: you get the same OS, but you can adjust your software to suit your hardware. So GNOME/KDE for people with mid- to high-end computers, and Window Maker/XFCE et al. for those with slower hardware.
The way I see it, if Linux displaces proprietary Unix, it will be by Linux being phased in as proprietary Unix gets phased out. For now, the proprietary Unices scale better than Linux, and have a few other nifty features as well, so they still have their place. Linux, though, does not stand still, and as it acquires the high-end features of the high-end Unices, there will be less and less reason to deploy them. In the meantime, apps can be written for both Linux and proprietary Unix, and many older apps written for the proprietary Unices can be ported to Linux with relative ease. That means that the transition from proprietary Unix to Linux should be relatively smooth. It will be less of a revolution and more of an evolution.
Ditching X11 is a really dumb idea for the general case. Everyone dings X11 for its efficiency, generally by looking at top and crying in horror at how much RAM X is taking up, not realizing that a lot of the reported RAM usage isn’t real, but rather is memory mapped from the video card’s on-board video RAM. To the extent that X11 does use a lot of memory, it’s because the programs running against X11 are storing a lot of graphics resources in the X server. If those processes had to do their own graphics rendering in-process, all of them would be taking up more RAM.
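As a rough illustration of that point, here is a small Python sketch (assumptions: a Linux /proc filesystem, that you know the X server’s PID, and that device mappings such as the video aperture show up in /proc/<pid>/maps with a /dev pathname). It separates the device-backed part of the server’s address space, which top happily counts, from everything else:

def mapping_sizes(pid):
    # Sum device-backed vs. other mappings in the X server's address
    # space; the device-backed part is largely video memory mapped into
    # the process, not real system RAM.
    dev_bytes = other_bytes = 0
    for line in open("/proc/%d/maps" % pid):
        fields = line.split()
        start, end = [int(x, 16) for x in fields[0].split("-")]
        path = fields[5] if len(fields) > 5 else ""
        if path.startswith("/dev/"):
            dev_bytes += end - start
        else:
            other_bytes += end - start
    return dev_bytes, other_bytes

dev, other = mapping_sizes(1234)  # 1234 is a hypothetical X server PID
print("device-backed: %d MB, other: %d MB" % (dev >> 20, other >> 20))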
There’s a lot that could be done to improve X11, but I’d like to see it done in the form of enhancements to the XFree86 product. Ditching the hundreds of megabytes of /usr/X11R6 means ditching the thousands of historical X applications, for no defensible reason.
Note that whining about Imake is not a defensible reason.
IMHO.
I wish people would stop complaining about the network layer of X11 causing the slowdown. The slowdown is due to Xlib and the protocol used to communicate between server and client. The network transparency code (which desktop users just aren’t using anyway) is not slowing things down.
My advice:
Don’t rewrite your desktop to suddenly become 3d capable, help rewrite XFree86 so that it is more optimal.
>There’s a lot that could be done to improve X11, but I’d like to see it done in the form of enhancements to the XFree86 product.
Well, just to add properly hardware-accelerated alpha blending to XFree86 (and not just a lame hack) you probably need to have an internal window manager in the server (yes, just like Windows and DirectFB, etc.). That probably means getting rid of the current window managers, so would current apps even run after doing this?
People were also saying 3D isn’t useful for desktop stuff. Well, take a look at Windows: they’re already using it for things like video rendering. VMR9 uses pixel shaders to add video effects, and I guess that pretty soon this kind of thing will find its way into more common purposes, like using the 3D card’s anti-aliasing and filtering for font rendering, for example.
>>
The slowdown is due to xlib and the protocol used to communicate between server and client. The network transparency code (Which desktop users just aren’t using anyway) is not slowing things down.
<<
Isn’t the protocol used to communicate between server and client part of the network transparency design?
Also, don’t forget that GTK+ and Qt can also be optimized. Right now, neither of them is.
>@Greg
>”150 is the size of the SOURCE. Are you confusing source size with binary size?”
>cd /usr/X11R6/ && du -sh == ~148 mb X11R6 + Fonts
>cd /mp/xc/ && du -sh == ~291 mb X11R6 sources
cd /usr/X11R6/ && du -sh == 16M
cd /etc/X11R6/ && du -sh == 12M
When you install XFree86 from source, there is in fact a lot of cruft that gets installed. You can safely throw it out. I’ve been running this configuration for six months, and I can safely say nothing will break.
To be fair, the above numbers don’t include Oriental/Asian fonts or the DRI drivers (I have a TNT2 card). But you would need those fonts with DirectFB too (if you need them). And as for DRI: since when is a lack of drivers an advantage?
Currently DirectFB only supports Matrox cards and lowly Trident chips.
“Isn’t the protocol used to communicate between server and client part of the network transparency design?”
No, actually. The client and server can communicate over a real network through TCP/IP or the ancient DECnet protocol, or client and server on the same machine can communicate through Unix sockets and shared memory, which is quite fast.
Get it through your head: the network transparency is not the bottleneck.
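To make that concrete: a local client never opens a TCP connection at all. A minimal Python sketch (assuming a local X server on display :0, whose socket by convention lives at /tmp/.X11-unix/X0):

import socket

# DISPLAY=:0 means "local transport": Xlib connects to this Unix domain
# socket. Only DISPLAY=somehost:0 would go over TCP (port 6000 + display).
s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
s.connect("/tmp/.X11-unix/X0")
print("reached the local X server without touching the network stack")
s.close()

(This only opens the connection; it doesn’t speak the X protocol, but it shows where the bytes actually go for a local desktop session.)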
Luckily my SiS 6326 from 1998 has finally got hardware 2D acceleration in XFree86 4.3.0… it took five years, but then you get something!
It’s even better than in Windows 2000, where a driver upgrade for my video card breaks it so that it falls back to standard VGA, and of course it doesn’t seem that it can be undone.
Now indeed alpha transparency in X would really be a great thing. With a bit of luck, this makes screenshot applications much faster, and it makes shadows and transparent terminals possible.
I doubt 3D is really necessary except for eye candy. I mean, aren’t most programs designed to be used at a certain size? If you then make stamps of them, you can’t easily recognise one browser window out of five.
The best way, I think, is to move to Display PostScript or some such; with that, the toolkit problem is solved (every program has its own image and text links for its functions), you can freely resize windows either to see others or to be better able to read text, no separate code for displaying/printing is needed anymore, and I could go on. However, it is very unlikely this will ever be implemented, so I’ll go for alpha transparency.
>How many windows do I need to have open before I can feel serious about my computing? I’ve often suspected there was something wrong with my demeanor as I sat in front of the machine.
Argumentative 😛 Honestly, though, there are people who typically operate one or two applications at a time (the classic web browsing, e-mail, word processing user stereotype) and those who use 5+ applications at a time. Modern operating systems in general attempt to provide scalability to suit both sets of users, and even the power users with 10/15/20+ windows open. FYI, I’m running Photoshop, Dreamweaver, one file manager window, an FTP client, a control panel, a web browser to admin a server, a web browser aimed at Ace’s Hardware, one here and an ICQ clone right at this moment. Quartz favours users with fewer windows and makes it hard for the power users to operate, since they lose so much memory on maintaining window framebuffers that it cripples other power-user resources. IMHO, single-app users don’t benefit much from their windows spinning and shrinking, and neither do the power users. So my question is what benefit these features actually bring to 99% of users.
>I accept the argument that graphics acceleration of the desktop requires better and more expensive hardware, and will not necessarily be desired by most of the world. I wouldn’t worry though. Even if the U.S. market favored such acceleration, if the rest of the world didn’t want to follow they would just pursue another Linux distribution.
Indeed. Honestly, I don’t mind new features and tools that require more resources, if they bring benefits. That’s the progression of computing and it is a good thing. I only take issue in that 3D desktop additions don’t seem to bring ease-of-use or workflow improvements. To my mind, Adobe adding layers to Photoshop was one of the most profound benefits to modern graphics/design/marketing/etc.; it brought massive reductions in the time people need to perform tasks and spawned new forms of artwork and artistic endeavour. 3D desktops don’t do anything like this.
>Even Mac OS X does not require acceleration. iBooks don’t have a supported graphics chip, so fall back to software rendering.
Fair enough, but Quartz then adds more load to the system than conventional 2D rendering systems. Simply put, if the user performs just as well with a 2D system as with Quartz, then what is the advantage of adding the overhead? For the iBooks, the system has been slowed down to no benefit.
>But the argument that using hardware acceleration will hurt system performance is just plain wrong. You can run CPU monitor on a Mac and watch the load just drop away from the processors when Quartz Extreme is enabled. Are you that concerned that the AGP bus will have traffic on it? That’s what it’s there for. How will your system be faster with a bottlenecked processor and a quiet AGP?
It’s not that hardware acceleration will damage system performance. Indeed, I can appreciate that it will help it. The trouble is, that with Quartz, you are adding more overall work to the system than systems that don’t use the acceleration. And yes, dumping all the windows down the AGP bus is none too bright if you have other graphics-card heavy applications operating.
>Are you saying there is a memory penalty somewhere because without hardware acceleration your windows don’t need frame buffers?
Yep
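For anyone wondering what that penalty looks like, here is a back-of-the-envelope sketch (the window size and depth are assumptions for illustration, not measured Quartz figures):

# Each composited window keeps its own off-screen buffer around,
# whether or not it is currently visible.
width, height = 1280, 1024        # one large window
bytes_per_pixel = 4               # 32-bit RGBA
per_window = width * height * bytes_per_pixel
print("%.1f MB per window" % (per_window / 2.0 ** 20))          # ~5.0 MB
print("%.1f MB for 20 windows" % (20 * per_window / 2.0 ** 20))  # ~100 MB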
A good interview with an intelligent person. He is bang on in many cases, especially about X. I am also pleased about his Linux-to-the-desktop timeline. However, I must take issue with something:
Havoc Pennington: I would say that the nails are firmly in the UNIX coffin, and it’s just a matter of time.
Mmmmm, I am not so sure about this. I don’t think AIX, Solaris or the other Unix-like OSes are quite in the coffin yet.
I have used UNIX since BSD 2.8 on a PDP 11/70 circa 1980 at MIT. Linux is UNIX (POSIX, X11, etc.), albeit a free one (like BSD for some value of “free”). “Linux vs UNIX” is classic marketing claptrap from the Linux vendors.
He’s so obviously spewing RedHat’s latest “model” where they charge extra for Enterprise-level tools, it’s sickening. Linux makes me happy because it gives me a usable system for user desktops and also a powerful server system, all in the same hardware/software platform.
RedHat’s latest move to re-separate the server from the client is a horrible move in the wrong direction again, if you ask me. But I understand they’re trying to survive. They’ve given away their work for too long and they need to hang big pricetags on software to make PHB’s at huge companies think it’s “worth” buying.
Sad…
He’s so obviously spewing RedHat’s latest “model” where they charge extra for Enterprise-level tools, it’s sickening. Linux makes me happy because it gives me a usable system for user desktops and also a powerful server system, all in the same hardware/software platform.
Well, it seems kind of strange that HP took that approach, because on another newswire Red Hat said that their enterprise product will be based on the “community” version and thus allow them to get “in touch” with the “community”. Yep, that sounds like the president on the road “getting in touch with the ordinary folk”.
RedHat’s latest move to re-separate the server from the client is a horrible move in the wrong direction again, if you ask me. But I understand they’re trying to survive. They’ve given away their work for too long and they need to hang big pricetags on software to make PHB’s at huge companies think it’s “worth” buying.
If they bundled more with it, then I could justify spending the amount on it. I have no problem paying the amount for the basic enterprise workstation version; however, when it contains no more than what the “community” one would, where is the value? Why not bundle CrossOver Office 2.0.1, or StarOffice 6.1 (when released)?
Let’s look at Solaris: you can have your basic support contract, which gives email support and access to all patches, and this goes all the way up to the ultra-uber support contract, which even includes the ability to request DRIVERS for unsupported devices! I mean, if that ain’t value for money, I don’t know what is!
Where is the value that Red Hat is supposed to bring? Just because they’re Red Hat doesn’t mean that people are going to go all gushy. Just look at SuSE Enterprise Desktop: around $99 per desktop, INCLUDING CrossOver Office and, IIRC, StarOffice as well. That is what I call value for money.
Now sure, their marketing strategy doesn’t have the typical US razzle-dazzle hype-lie-hype marketing programme, but if you are making the decisions, what do you base them on: what the product can provide, or how good the marketing is?
There are two threads that caught my attention, first the one about DirectFB vs. X11 and second, that about abandoning old computers. I would like to address both.
1) I got the impression that the accusation of being a memory hog was directed at RAM usage, not disk space. Still, it is flawed: X’s RAM usage as reported by top and friends includes the VGA card’s RAM, buffered pages in swap, etc. See the FAQ you should find at /usr/share/docs/xfree86/FAQ.gz.
2) Could the people complaining that the poor computer user has no access to mid- to high-end computers please stop? Even today, older machines are better off running Linux 2.2 with XFree86 3.3.x and NOT XFree86 4.3. In this light, I guess, you can see the fallacy of your argument.
Best regards
I think that if one is in the business of ditching X (leaving aside whether one should or not), Fresco/GGI is the way to go. It’s fast and is network transparent. Sadly, it’s still very far away from being usable.
Agreed on both Fresco and GGI. GGI is definitely the way to go, as it abstracts all the hard work so the programmer can focus on the task at hand, and it supports a variety of backends, such as X11 (and probably Fresco when it comes of age).
Fresco is definitely a next-generation display server (if that is what it is called), allowing programmers to fully leverage CORBA technology and whatnot. I still think X is good enough for most needs. Oh yeah, and to all of the people who want to get rid of network transparency: 1) Unix sockets are EXTREMELY fast, and 2) I use X11 on a network about 4 times a week; it’s very useful.
Now, my main issue with Fresco and other so-called alternatives is the licenses that I’ve seen. Low-level display servers should try to promote use by the largest audience possible. Using copyleft (as in the LGPL, which is still almost the same as the GPL), this becomes impossible. Maybe a desktop such as GNOME works OK under xGPL, but it just doesn’t make sense to use a copyleft license for low-level libraries.
Regarding X11, I run 3d accelerated graphics under X11 on FreeBSD, and I don’t really see all of the issues that everyone is talking about. Yes, the drivers are unstable, but that is because it is NVIDIA’s first FreeBSD release. To be honest, transparency on the desktop is of trivial importance at best. Sure, I’d love to have better transparency for my pager in FVWM or Sawfish, but it just isn’t necessary. And Quartz is not exactly the holy grail of display design, anyway. As someone mentioned before, it doesn’t work well with loads of applications loaded at once.
Now, X works GREAT for me when I have 25 terminal windows open, 7 web browsers open, maybe a media player, totem, 5 gvims, all at once. (Oh yeah, and these are using something like 3 desktops, 9 viewports each) I’d really like to see Quartz do that. (Of course, I’ve barely used MacOSX, so I’m not really qualified to say much about it) Of course, this all leads back to my main point that 3d really only makes sense in gaming, if you ask me. (And don’t rave about win2k/xp transparency, it was much slower than any transparency that I got in BSD, btw.) It’s cool, but really a novelty on the desktop. Besides, software is enough if all you want is transparent terminals…
For me, X starts up in less than 3 seconds, so if you’re having problems, then maybe your configuration is sub-optimal. I don’t know. Just stop complaining and spreading FUD about XFree.
What’s wrong with 2d rendering in Unix? X11 is much better than anything Windows has to offer.
Context switching, and the overall bloatedness of not only X but the entire X model. Drawing anything in Windows requires only a single context switch; in X it’s somewhere up around 20. Consequently, X’s overall performance suffers.