The GNOME development community has announced the official release of version 2.22 after six months of development. GNOME is an open-source desktop environment that supplies a complete user interface and an assortment of programs for Linux and other Unix-like operating systems. GNOME 2.22 includes some important new architectural features and a handful of significant new programs. Among the most important enhancements in GNOME 2.22 are the GVFS virtual file system framework, which brings improved network transparency to GNOME desktop applications, and the PolicyKit framework, which provides improved support for secure privilege elevation.
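The network-transparency idea behind GVFS - one file API, with per-scheme backends doing the actual I/O - can be sketched roughly as follows. The class and function names below are invented for illustration; the real GIO API is a C library centered on GFile, not this Python.

```python
# Toy sketch of the GVFS/GIO idea: applications ask for a URI and a pluggable
# backend does the I/O, so "network transparency" is just another backend.
# All names here are hypothetical, not the GIO API.
from urllib.parse import urlparse

BACKENDS = {}

def backend(scheme):
    """Register a backend class for a URI scheme (like a GVFS daemon)."""
    def register(cls):
        BACKENDS[scheme] = cls()
        return cls
    return register

@backend("file")
class LocalBackend:
    def read(self, uri):
        with open(urlparse(uri).path, "rb") as f:
            return f.read()

@backend("memory")
class MemoryBackend:
    """Stand-in for a network backend (sftp, smb, ...)."""
    store = {"/hello.txt": b"hello from a remote share"}
    def read(self, uri):
        return self.store[urlparse(uri).path]

def vfs_read(uri):
    """One call that works for any scheme - the application never cares."""
    return BACKENDS[urlparse(uri).scheme].read(uri)
```

An application calls `vfs_read()` with any URI; adding an sftp or smb backend would change no application code at all, which is the point of the design.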
Congrats!
I like GNOME's robustness (although there are unfixed three-year-old bugs: http://bugzilla.gnome.org/show_bug.cgi?id=308632 ..)
And evolution^2 seems to suit GNOME really well. But I think the rate at which this project evolves is exceeded by far by other projects (no name calling in this thread).
But I hope Arch will be quick to package it (it still is my main desktop .. although I really like ***4 SVN too atm)
I’m a big fan of GNOME, having been using it consistently since ~2.14 (before that, KDE). The 2.22 changelog is nice, I guess, but like others I feel like progress is becoming a tad stagnant. I still prefer GNOME to KDE4, but I could see that changing once KDE4 receives a bit more polish (4.2?). It’s not necessarily GNOME’s fault, as I think it has more to do with Qt evolving faster than GTK+.
There is a big difference in development philosophy. GNOME is much more focused on steady incremental changes, or “evolution” if you want, while the KDE team has been more focused on big changes to try to create something new and “revolutionary”. This can be seen in the release schedules. GNOME releases every six months, which leaves only a 4-5 month instability window where people can merge in new things. This has led to 12 releases in 6 years. In the same span of time, KDE has released 7 major versions.
Both approaches have merit but invariably the development of GNOME may seem a bit more dull from the outside, due to the steady evolutionary approach.
The GNOME approach is very practical, pragmatic and businesslike, if you like, while the KDE approach is more enthusiastic and idealistic.
I personally think both teams can learn from each other here. I think there sometimes is a bit too much change for the sake of change within KDE, while GNOME sometimes is too conservative.
For GNOME’s part, I think they should stop doing ALL releases on a six-month schedule, because it makes big platform changes very hard.
Maybe two releases in even years (2008, 2010, etc.) and one release in odd years (2009, 2011, etc.), or something to that effect, would be a good compromise.
That was true, but from my understanding the KDE devs have decided to also go with more incremental updates for the KDE4 series. They are learning their lesson after the whole KDE4 vs 4.0 debacle: if every release is a big one, then people expect way too much. With incremental time-based updates the devs can focus on getting things working properly, and if this release doesn’t have some feature, what’s another six-month wait? I personally think that’s how Gnome gets away with not updating or fixing bugs for such long periods of time: things change so slowly that you just get used to the quirks. I still think incremental updates are a good thing.
I don’t sit on the KDE lists much at all, so these are just my personal observations –
KDE seems to keep larger changes to the bigger release (KDE3, KDE4), which happen every few years. They let that guide the minor releases, which will tend to have some big, not architecturally big, changes (e.g. KDE 3.3 vs. KDE 3.4). This then leaves minor enhancements, and bug fixes, etc. for the point releases (e.g. KDE 3.4.9, KDE 4.0.2). So I think they have been consistent.
Last I read about major changes to KDE was for the 3.4 release, which was quite nice over 3.3 – though I didn’t use 3.3 much. (I transitioned to KDE around that time from GNOME.)
Anyhow… from my perspective KDE has a consistent and good plan – use the major version for major architectural issues, the minor version for functional updates and major fixes within the architecture, and point releases for bug fixes, etc. They also do quite well with getting updates out.
Haven’t paid much attention to GNOME lately, but they don’t seem to be making anywhere near the news, or doing nearly as much to put together a user-friendly desktop, as the KDE folks (one of the reasons I moved to KDE).
There was no debacle, or lesson to be learned. KDE4 required some significant tear-down / rebuild of KDE3. That took some time, and is still partially a work in progress. KDE 4.1 was targeted for release approximately 6 months after 4.0, before 4.0 was even released.
KDE has always had an incremental update policy between major versions; changes get pushed into svn, and on a regular basis they are tagged for a point release.
KDE and Gnome cannot necessarily work on the same type of rigid release structure. KDE is based primarily on Qt, whereas Gnome is based on Gtk and a collection of supporting libraries, all of which have their own roadmaps. Qt4 introduced enough changes to warrant a rebuild of KDE to take advantage of them. Gtk et al. do not evolve on the same cycle, so any decision to rework Gnome will be based on an intersection of different projects reaching a certain point.
That’s not to say one method is better than the other; they both have their advantages and disadvantages, as KDE 4.0 underscored. I’m simply saying that you cannot arbitrarily use the same yardstick to evaluate the two development cycles.
It is always nice to see new GNOME releases. I have been using GNOME since 2.10, and the improvements, although smooth and subtle, have been significant. *Especially in performance*.
GNOME 2.20 / GTK+ 2.10 apps seem just as snappy as Qt4 counterparts to me. I think the improvements in the Cairo library have greatly helped there.
Unfortunately, in my experience, GTK+ is slower than ever, especially with the modern GNOME themes (Clearlooks et al.). It’s only been getting slower and slower since 2002 (the move to Cairo was the biggest disaster for performance, but the recent GTK+ themes have really exploited the slowness potential). It is now – in 2008 – so slow that I can see the individual widgets (spin buttons etc.) being slowly drawn into the dialogs, one by one. I can even easily take screenshots of it. Plus, of course, the Metacity window manager is so unbelievably superslow it’s ridiculous (it can consume the whole CPU just by moving the mouse cursor over the close/maximize buttons, it can only manage to change its title about twice a second, and generally the decoration always appears on the screen with a noticeable delay after the window contents).
Sadly, the GUI performance is a disaster.
You, my friend, have a graphics card with no RENDER support. This has nothing to do with GNOME – it has simply to do with the quality of the driver support for your graphics card. For all users of graphics cards with even minimal RENDER support, GNOME, via its use of Cairo (which is a RENDER-based drawing API), runs (i.e. draws, renders, paints, etc.) far, far faster than ever before.
Now, it is possible that you are just using a graphics driver which is horribly outdated – but in all likelihood you simply have a graphics card which cannot be adequately accelerated with EXA (the new RENDER-based acceleration driver architecture which is currently replacing the old-school XAA drivers). I use GNOME 2.20 on 4 different machines using 3 different graphics cards, and slowness of GNOME/GTK+/Cairo is something I have yet to encounter: 1) Nvidia 6600GT, 2) Intel 815, 3) Radeon Xpress 200M. Of these, both the Intel and the Radeon (fglrx) have very weak RENDER support – yet I do not experience this tremendous slowness of GNOME/GTK+.
Now stop wasting your breath making pointless accusations about GNOME/GTK+. Inform yourself about the version of the driver you are using and see if there are newer drivers available. Check out the EXA status page at http://www.x.org/wiki/ExaStatus to see what kind of EXA work has been done for your card. If it turns out that your card cannot be accelerated by EXA, and your graphics is not built into the motherboard of a laptop or a desktop with no expansion slots, simply fork out 30-40 euro for a newer graphics card and your concerns will be history.
Look, bud. I have, overall, about 90 users on Gnome. I have console users running with the crappy graphics cards that come in servers. And I have lots and lots of XDMCP users running remote X. And I have even more running sessions over WANs using NX. I have my laptop, my UMPC, my desktop, and a few of my users have standalone desktop boxes.
Another poster has guessed that your driver does not support RENDER. I will be more blunt and say that your post is full of it. I have never observed the behaviors you describe (with such apparent glee) in the years in which I have had many users running Gnome in diverse rendering environments.
I *have* observed that the redrawing of Mozilla apps is very noticeably slow. But complaints about that should be directed at Mozilla Corp. (Epiphany, which uses Gecko but is a Gnome app, redraws very snappily.)
Sounds like you’re using some unaccelerated driver like fbdev, or something is really wrong with your xserver.
I run Gnome 2.20 on a PIII 500 with a crappy ATI rage 128 graphic card and I have none of the problems you describe.
While certainly not as bad as the experience the parent poster describes, I have used GNOME on a Celeron 800 machine and a P4 1.5 GHz, both with 256 MB of RAM and an old Nvidia Riva TNT2 with 32 MB of VRAM, and I can tell you that it is really slow – it feels a lot slower than either KDE or XFCE, and yes… sometimes one can indeed see the GTK+ widgets being drawn one at a time! And it didn’t matter whether I used the Nvidia driver or the OSS nv one; the result was the same.
In order to see its slowness in action, one needs to be using a sub-1 GHz machine, as it is a lot harder to spot on faster machines (there are some profiling tests out there that show that accurately, though). Actually, that was one of the main reasons that drove me to and kept me on KDE in the first place: I didn’t see a reason to dump a perfectly good machine and shell out more money buying a brand new one when I could use a full-featured DE comfortably on the current one, the other reason being the excessive dumbing down of the UI.
I don’t think that GNOME should receive all the blame, though, as I have tried running XFCE as well, and although it is snappier than GNOME as far as desktop responsiveness is concerned, the overall tearing when drawing UI widgets, moving windows around or just scrolling a text field can be perceived there as well. And as another poster has already pointed out, Firefox (Gecko) has its share of the blame for UI performance problems, too.
With GTK+ being notoriously slow compared to most modern GUI toolkits, and given the ongoing efforts to address this problem, I wonder how most of you guys can deny this fact with such a straight face? (That’s not aimed at you specifically, nxsty; it is more of a rhetorical question.)
The Nvidia Riva TNT2 does not support RENDER, in either the nv or the NVIDIA proprietary drivers. So with a Riva TNT2 you get basically zero hardware acceleration on a GNOME desktop – sure, you get pixmap blits and fills for free, but you have to remember what GTK+ and Cairo are providing: fully anti-aliased, internationalized text (Pango) and Porter-Duff-based high-quality rendering (Cairo). If GTK+ and Cairo were only delivering what you got with straight Xlib or FLTK/Tcl-Tk/Motif etc., you might have a point in saying they are slow. They are slow on hardware incapable of producing high-quality modern graphics. The rendering tech used by GNOME is superior in terms of rendered graphics quality to that of either Windows XP/Vista or Mac OS X. You don’t judge the performance of a modern GUI toolkit based on the performance of a graphics card which hasn’t been manufactured in almost 10 years.
That still does not explain the terrible GTK+ performance, as that card, while old, is not too shabby either. Windows can do it, KDE can do it, why can’t GNOME? What you’re saying is almost the same as the arguments of people justifying Vista’s performance (which is horrible on anything but brand-new hardware, when pretty much everybody else manages to deliver the same functionality with a lot less hardware).
And again, GTK+’s slow performance is nothing new and has been publicly acknowledged by its developers as a problem hence the ongoing efforts to address it.
This page on GnomeFiles has some interesting discussion about it: http://www.gnomefiles.org/comment.php?soft_id=370
Windows has the advantage that the proprietary drivers for Windows are written to accelerate exactly those primitives which Windows uses. What Windows renders is also far inferior, in terms of quality, to what GTK+/Cairo is rendering. KDE, in drawing the desktop, is simply doing far, far less than what GTK+ and Cairo are doing.
What I said does explain why GTK+ and Cairo are slow on your card. Both of these technologies need supported hardware acceleration of RENDER in order to deliver a performant desktop. Your card predates anti-aliased text on Linux; it predates the RENDER extension of Xorg.
Only those GUI toolkits based on the Xlib primitives of yore are going to be able to deliver passable performance (KDE/Qt3, Enlightenment/Evas, Motif, FLTK, Tcl-Tk). Conceivably you could harp on NVIDIA to make a sizeable investment of time and energy to update their drivers to more adequately support a card that has not been in production in nearly 10 years. I also think you would have a hard sell trying to convince GNOME devs to rearchitect GTK+ to run without RENDER and without Cairo. You could also harp on the Xorg developers of nv to either a) really improve the XAA support and tweak the software fallbacks, or b) spur them to improve the new EXA support in nv. Please do check out the latest nouveau driver – supposedly they have full EXA acceleration, which means full RENDER acceleration – maybe this can help you:
http://nouveau.freedesktop.org/wiki/FrontPage
And on this page they state:
And your card is an NV05
http://lwn.net/Articles/270830/
Or you can switch to a desktop which does not deliver the high quality rendering which I and many others have come to expect from GNOME. Like I stated elsewhere I do not have these problems using GNOME on my 4 computers using a NVIDIA 6600GT, NVIDIA 5200FX, Intel 815, Radeon Express 200M. All of these graphics cards are between 2 and 6 years old.
I do understand your frustration; I encountered similar problems 3 years ago when I ran an LTSP setup with ancient (pre-1998) thin clients. But it also isn’t that hard to find a good AGP graphics card with proper EXA support on eBay – it would probably cost you less than $20.
Sorry, I can’t agree with that. Arthur is the Cairo equivalent for Qt, isn’t it? And I heard it is a tad faster than anything that GTK+ can come up with but provides pretty much the same functionality with the advantage that it will work on different platforms. Zack Rusin ran these famous benchmarks some time ago showing that Qt still has a large advantage over GTK+: http://zrusin.blogspot.com/2006/10/benchmarks.html
Granted, this is nearly 2 years old, and some people, including some prominent GNOME developers, have already contested some of its findings and pointed out some flaws in his methodology, but it still is a good metric to show that there is still a large gap between the two toolkits.
In my case, it is a matter of perception: Qt apps start faster, draw their widgets in the blink of an eye, and perform a little faster than the GTK+ equivalents. Of course, it is harder to tell with my new rig, but after having tried every possible GNOME iteration since 1.4 on slower machines, I don’t think that it is unreasonable to suggest that GTK+ still might be slower than many GUI toolkits, with the exception of SWT/Swing.
However, I do appreciate your responses. Not only were they very polite, they were very insightful about the current state of the X and GTK+ architectures, and you even provided advice with regard to my graphics card. Thank you, sir!
No problem, you are welcome.
Just to make clear: that’s why I said KDE/Qt3. Nowhere in my comments did I refer to KDE/Qt 4.x. Interestingly enough, Qt4 is tying directly into the accelerated EXA primitives for its performance.
A desktop rendered with KDE/Qt3 *is* doing a lot less, with much lower rendering quality, than what GTK+ and Cairo are doing. Arthur does indeed look quite promising, and its rendering quality will certainly be comparable with Cairo – and from some of the benchmarks we have seen, it seems to be quite fast.
Yet Arthur is not directly based on RENDER – and thus not limited by it. A properly supported EXA driver does provide RENDER acceleration (which is an assortment of specific rendering/compositing operations), yet because Qt 4.x is designed directly around EXA primitives, it can take advantage of hardware acceleration even in those drivers which do not fully accelerate RENDER.
Remember, GTK+/Cairo is paying such a high price in performance (necessitating good hardware acceleration of RENDER in the drivers) because of its use of the RENDER extension.
Rasterman has routinely blasted the RENDER extension as horribly inefficient (too much shuffling of pixmaps in and out of video memory, leading to lots of round trips with system memory, which is horribly slow across the PCI/AGP bus). His (Enlightenment’s) work on Evas provides incredibly high-quality rendering on very primitive hardware, with all sorts of compositing bling and awesome performance. Evas does not use RENDER, and it is filled with hand-tuned assembly code to eke out every last drop of performance. Which is why Openmoko is now abandoning GTK+ for Evas – their embedded graphics hardware simply cannot properly accelerate RENDER.
I guess it could be put this way: Evas and Arthur take advantage of hardware-accelerated primitives already supported in the drivers of existing cards (whether XAA (Evas) or EXA (Arthur)). RENDER takes advantage of accelerated primitives which SHOULD be supported by the cards, i.e. what a graphics card should provide in the way of hardware-accelerated operations in order that our desktops can have first-class rendering at their disposal. Also remember this: if it were not for the RENDER extension, with all of its problems, we would not now have the EXA drivers, which are designed to eke hardware acceleration out of our graphics cards for the purpose of accelerating RENDER, thus deprecating XAA and the old drivers, and even putting pressure on ATI/Intel to provide documentation.
RENDER, designed by Keith Packard, one of the principal and longest-serving developers of X, was designed to push the limits of the capabilities of the drivers – pushing our toolkits to abandon XAA, which theoretically could accelerate thousands of primitives while in reality barely accelerating only a handful, and often that handful of primitives was of no benefit in rendering a high-quality desktop.
The trick with EXA was identifying a tiny subset of primitives, which when properly implemented could provide almost complete hardware acceleration for all of the primitives which modern toolkits need to make high quality rendered desktops.
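That "tiny subset of primitives" idea can be illustrated with a toy sketch. The hook names below are invented for illustration; the real EXA driver interface is a set of C hooks (roughly Solid, Copy and Composite), not this Python.

```python
# Sketch of the EXA idea described above: the driver accelerates a tiny set
# of hooks, and every higher-level operation the toolkit needs is expressed
# through them. All names here are hypothetical.
class Driver:
    """The 'tiny subset': just solid fills and composites."""
    def __init__(self, w, h):
        self.fb = [[" "] * w for _ in range(h)]  # toy framebuffer

    def solid(self, x, y, w, h, ch):
        # Fill a rectangle with one value (EXA's Solid hook, roughly).
        for row in self.fb[y:y + h]:
            row[x:x + w] = [ch] * w

    def composite(self, src, x, y):
        # Blend a source image onto the framebuffer (Composite, roughly).
        for dy, line in enumerate(src):
            for dx, ch in enumerate(line):
                if ch != " ":
                    self.fb[y + dy][x + dx] = ch

# A higher-level "toolkit" operation built only from the two hooks above.
def draw_button(drv, x, y, w, h):
    drv.solid(x, y, w, h, "#")          # background fill
    drv.composite(["OK"], x + 1, y)     # "glyph" composited on top
```

Everything the toolkit draws goes through the two hooks, so accelerating just those in a driver accelerates the whole desktop - which is the EXA bet described above.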
I defend RENDER so strongly because it is the current direction of Xorg. Although I really admire Rasterman’s talent, his libraries and methods are not gaining much of any traction inside of Xorg (which is a crying shame! and why all the canvas/animation plans for GTK+ should stop until everyone has read and grasped Evas!) and thus do not reflect the future of Xorg development.
The friendship of Carl Worth, one of the principal architects of Cairo, with Keith Packard, who wrote RENDER, and Carl’s friendship with Owen Taylor, one of the principal architects of GTK+, probably play an extremely significant role in the issues discussed here. Likewise, Zack Rusin invested so much time in EXA primarily for Arthur – after all, Trolltech paid him to do this work – so RENDER acceleration via EXA was probably not the first and foremost goal, but a fortuitous by-product.
The recent Gnome releases show a lot of polishing on the surface, whereas there is a lot of hidden work in the background to make the internals work better. The surface work conveys the appearance of activity much better than the behind-the-scenes work does. I think KDE suffered from this problem while they were working on a lot of their infrastructure, like KIO and KParts, or during the infrastructure overhaul of the last two years. Currently, KDE is doing a lot of surface work through Plasma, and so its activity is a lot more noticeable.
This Gnome release actually has one big change: gio/gvfs, which is a big hunk of new infrastructure that should make things a lot better in Gnome going forward.
Other Gnome infrastructural work being done includes an introspection/IDL compiler; an animation framework (perhaps based on Clutter’s?); a generic HTML widget with XULRunner and WebKit backends; closer collaboration with AWN as a possible replacement for the current panel; closer collaboration with Screenlets; and a D-Bus-based replacement for GConf in GLib.
One could look at Pigment and Clutter as experimental next generation toolkit ideas that GTK+ will steal from.
There is greater and greater interest in using the Vala programming language for Gnome applications, without the need for a VM-based language.
So, there is a lot going on, but it’s a bit under the surface. KDE does a much better job at publicity and keeping its user base informed of what’s going on. For Gnome, Planet Gnome and the desktop-devel list are the best ways to keep abreast, but are not as public as, say, dot.kde.org.
I’ve often wondered why some Gnome apps don’t play well with browsing to Windows shares. Even Windows 95 can do it. Once this gets included in Ubuntu, I’ll switch without a doubt.
Finally Gnome gets real network transparency. Does it mean that all Gnome apps can now make changes directly on a remote source (instead of just being able to browse and to drag & drop in Nautilus)?
Regardless, I think Gnome is very late, considering the fact that KDE has had this for a long, LONG time with kioslaves (a true bliss for developers) and that there’s an architecturally more correct implementation: FUSE. Both kioslaves and GVFS are at the library level, which means that KIO is restricted to KDE apps, and that GVFS doesn’t work with any app that’s not a Gnome app. FUSE, however, is at the filesystem level, which means that ALL apps can access the remote source as if it were a local one. Even nano, vi, NetBeans and your Wine-powered Notepad.
No. While both KIO and GVFS obviously have client libraries for application developers, the actual work is done in helper processes, e.g. kioslaves for KIO. The respective client library is the most convenient way to control such a helper process, but it is not required.
As explained above, support for both backend technologies can be built into “foreign” application as well.
One of the implications of this architecture, i.e. separate processes and communication using a protocol, is that either side of the communication channel can be implemented without any code dependencies on the other side.
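A toy version of that split - thin client wrapper, separate helper process, line-based protocol - might look like this. The protocol and names are invented for illustration; real KIO slaves and GVFS daemons use their own IPC (D-Bus in GVFS's case).

```python
# Sketch of the KIO/GVFS process split described above: the "client library"
# is just a thin wrapper that speaks a line protocol to a helper process.
import subprocess
import sys
import textwrap

HELPER_SOURCE = textwrap.dedent("""
    import sys
    # Helper process: receives "READ <name>" requests, answers with data.
    FAKE_REMOTE = {"motd": "welcome to the fake share"}
    for line in sys.stdin:
        cmd, _, arg = line.strip().partition(" ")
        if cmd == "READ":
            print(FAKE_REMOTE.get(arg, ""), flush=True)
""")

class VfsClient:
    """The 'client library': convenient, but any program that speaks the
    line protocol could talk to the helper directly, with no library."""
    def __init__(self):
        self.proc = subprocess.Popen(
            [sys.executable, "-c", HELPER_SOURCE],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

    def read(self, name):
        self.proc.stdin.write(f"READ {name}\n")
        self.proc.stdin.flush()
        return self.proc.stdout.readline().strip()
```

Because the "wire format" is all that the two sides share, either side can be reimplemented in any language without touching the other - which is the point made above.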
The result is the same: there never will be any app that will integrate these technologies. The only app I can think of where an attempt has been made is OO.org. KIO will NEVER be available in your Wine-powered Notepad, Enemy Territory or nano. With FUSE, client modification isn’t necessary, which is a huge plus.
Their choice, good for any competing application that can offer this functionality.
I can’t see why Enemy Territory (that’s a game, right?) would need that, but Wine could map the VFS as Windows network shares, etc.
I don’t follow FUSE closely, so I don’t know on how many platforms it is available already, or how it allows applications to detect that it is not a local filesystem (i.e. so they do not use the standard IO API, which would block). Extended attributes?
Fuse mounts act as regular filesystem mounts, so they can be used by any application. For example I use SSHFS to mount directories on my servers over FUSE, I can then access them from any KDE, Gnome, CLI or whatever, application.
And with KIO-FUSE, you can mount any KIO slave with FUSE. So actually, what the parent is talking about has been possible for years already. It’s just that nobody takes advantage of it.
My understanding is that GVFS will be exposed via FUSE.
I don’t think it is integrated yet, but it’s on the todo list.
And KIO-FUSE has been doing this for years already. It’s just not used a lot, and I have no idea why – though it feels to me that it is just because KIO is perceived to be KDE-only technology. I know the GVFS developers thought it was KDE-only, while it has in fact been possible for 6 years to create non-C++, non-KDE implementations which work together perfectly. They are re-inventing the wheel (again) instead of taking advantage of what’s there – and I believe that’s mostly for social and political, not technical, reasons. Just like with aRts or DCOP (aRts was independent of KDE and better than anything available at the time; DCOP could’ve been implemented in a compatible way as well – but they chose to write D-Bus, which still isn’t as good as DCOP in some areas).
Sorry for the frustration in the above paragraph; even though it is afaik true, I probably shouldn’t have said it. If only because of the ‘afaik’ in the above sentence.
If you are frustrated, it is better not to respond at all.
During the design phase of GIO, one of the KIO designers was around to give advice. Further, various things were looked at, including KIO and FUSE (plus a good look at what was bad about gnome-vfs).
I don’t know for what reasons KIO wasn’t chosen, but it wasn’t just dismissed.
Hmm, I think D-Bus wants to achieve different things than DCOP. Idk about GVFS vs KIO tho.
So .. you think KDE technology is not considered on purpose? Corporate conspiracy?
Btw: aRts was good at the time .. but they would be stuck with it, like KDE 3.5 ..
I think choice is only NOT bad if the better solutions survive .. and you say they don’t?
Of course they don’t. They never did. Sure, now GStreamer is better than aRts. But 7 years ago aRts was better than all other Linux audio systems combined – yet it was not chosen by Gnome (they chose to stick with ESD, which sucked so hard they used a pre-alpha shitty GStreamer release and were actually happy with it).
The best tech doesn’t win. Come on, why does MS even exist, then? Why is IE the most popular browser? Etc etc etc?
OK, those are a few valid points .. I still don’t agree
The Microsoft examples don’t count .. having a monopoly really helps in getting your stuff out to people, and there was a time when IE was the best browser (people tend to forget coz it has been major suckage for years now)
Arts might not be the best example .. even KDE never seemed to really love it .. idk.
But it seems KDE is not really upset about those decisions. This is the first time I’ve heard something like this about DCOP vs D-Bus.
Although .. I still think that in open source better solutions have a really good chance. Just look how fast distributions integrate good new solutions (like PulseAudio, for example, or AIGLX).
I really love KDE4 .. I am running 4.1 SVN atm and I like it a lot. For me, only Konqueror needs a big plugin DXS thing like addons.mozilla.org (and more stability). Maybe using WebKit in the long run might be good for web developers, because then they just need to test their sites against the 3 major engines.
And you shouldn’t be bitter. KDE4 is a major achievement; I think nobody denies its technical superiority (well, atm only the “pillars of KDE” and Qt rule .. but I am sure the rest will follow .. well, it has to)
What do you think is the cause for the negligence towards KDE technologies? Is it just Red Hat wanting to do their own thing and having the money to do it?
.. well .. maybe it is a marketing problem on KDE’s side. Maybe KDE solutions aren’t really considered because people just think they are too tightly knit to KDE and not really independent .. idk.
There definitely are both technical and social reasons KDE technology isn’t used as much as it deserves. And awareness sure is one of them. We never did marketing right, and we probably still don’t. I also believe there sure are some irrational feelings keeping KDE tech from being used. But I have no idea how to solve those issues. Of course we try to do better in the marketing department, and in networking with other people and companies in particular – that’s an area Gnome is years ahead in.
Our community is simply more technology oriented. Personally I hope the move to windows (and Mac) will bring more people good at other things than developing code to the project, and that might turn things around.
And we’re also working in some areas to make KDE technology less dependent on KDE itself. Strigi, for example, is pure C++ – no KDE dependencies whatsoever. But just as with aRts, the simple fact that it’s used in KDE seems to prevent it from being used by other projects…
It might have something to do with the fact that aRts died a long, long time ago – no active maintenance – which is why KDE 4.x is no longer using aRts and is instead using Phonon. GNOME would have been foolish to adopt aRts; PulseAudio is now replacing ESD, which also died a long time ago. No conspiracy against KDE.
It might also have to do with D-Bus and DCOP actually being quite similar, with D-Bus being co-written by KDE developers, and with the DCOP of KDE 4.x now being a D-Bus-based system – precisely due to that similarity. DCOP was originally written during an intense hack session by a couple of KDE devs at a major KDE conference about 6 years ago. D-Bus has been painstakingly designed to give KDE and GNOME what both projects need. DCOP could never have been used in its current form in the GNOME system, if for no other reason than that it does not tie into the GLib main loop. No conspiracy against KDE here.
It might have something to do with the fact that a VFS is an absolutely essential part of any desktop platform, that GNOME-VFS was just short of a total abomination (slow, buggy, crashy, unreliable, crippled, etc.), and that GNOME, in deciding to abandon GNOME-VFS, found themselves in a position where they needed to implement a replacement for it quickly. Personally, I think the KIO system has been an incredible feature of KDE, providing features and possibilities which GNOME-VFS was never capable of delivering. Yet when one looks at the design of GIO, one is likely to find a number of similarities, primarily due to the fact that the use cases for KIO and GIO are quite similar. Yet again, however, anything which is going to be available to the entire GNOME desktop has to be available inside the main loop of GLib (at least according to my understanding), so GNOME committing themselves to developing GIO is not some conspiracy against KDE.
Where I do agree with you is that there ought to be desktop-neutral solutions to a lot of these things which would maximize code reuse and simplify creating new applications.
Here is a somewhat lengthy look into one area desperately in need of code sharing and simplification:
OSS reached version 4.0 and went open source shortly after the Linux community had officially ditched OSS. OSS 4.x is now being used by Solaris and the *BSDs, while at the same time OSS support for applications is being stripped out in the Linux world.
ALSA is powerful but a usability nightmare (dmix), though it provides a solution to the lack of hardware mixing. Prior to version 4.x, OSS would give applications exclusive access to /dev/dsp*, effectively meaning only one app could have access to sound at a time.
The frustration that this caused is but one of many reasons OSS was banished from Linux land, even though this is no longer the case with 4.x. ALSA has been pursued aggressively to fill the gap left when OSS was abandoned in Linux.
Now, ALSA is not compatible with, nor will it be written for, Solaris and the *BSDs. This means cross-platform applications have to provide access to both ALSA and OSS. Pulseaudio was written for Linux, and due to its use of Glib it is not as immediately suitable for KDE. Pulseaudio can replace all of the existing Linux audio APIs except Jack, and it was written as a replacement for ESD.
This means that under Linux anything using OSS, ALSA, libao, SDL, or OpenAL can simply tie into Pulseaudio and things just work. Right now libsydney is being developed by the author of Pulseaudio and members of the KDE community; libsydney will provide for all desktop environments what Pulseaudio provides, and it will be an FDO standard.
To sum all this up: what does it take to create an audio stack that runs on all desktops, is easy for all apps to connect with, and which runs on all platforms (Linux, *BSD, Solaris, Windows, Mac OS X)? This holy grail is still elusive.
Phonon is simply an abstraction; it ties into a number of different APIs depending on the platform. Yet it provides no core functionality, it doesn't really do anything, whereas Pulseaudio provides good mixing and network transparency for all apps shy of hard-core real-time audio stuff, which is reserved for Jack. Gstreamer also plays a role in this: there are backends for Gstreamer in Phonon and in Pulseaudio, and plugins in Gstreamer for OSS, ALSA, and Jack.
Simplifying this mess is a grand task, and it is not for lack of people working on it that it has not happened yet. Remember the levels at work here: only OSS and ALSA actually tie into the hardware at the driver level.
Pulseaudio brings its own real-time tricks into software mixing; it sits on top of OSS and/or ALSA and provides a fundamental audio API which applications can use without having to worry about the drivers. OSS 4.x, ALSA, and Pulseaudio on Linux all use HAL to automatically find and configure available audio hardware, including hotplugged Bluetooth headsets; none of this is available on the *BSDs, OpenSolaris, Windows, or Mac OS X.
Gstreamer sits on top of the drivers, though it can also sit on top of Pulseaudio, and it provides convenience APIs for applications to manipulate audio and video. Phonon sits on top of existing audio drivers (ALSA, OSS), existing audio servers (Pulseaudio, ESD, NAS, aRts), and even existing media frameworks (Gstreamer); yet it is sufficiently abstract to run on all platforms, utilizing whatever is already there.
Now how can all this be solved? There is so much feature overlap, so many code stacks, so many different APIs; it is truly a labyrinth. What belongs at what level? Does per-app sound control belong at the driver level, like OSS 4.x, or at the audio server level, like Pulseaudio? Why does Gstreamer need to tie into ALSA and OSS directly? And whereas Pulseaudio will run on Windows (older versions), can it run on BSD, Solaris, and Mac OS X? Phonon can run on all of these, but it provides nothing except a wrapper to high-level apps that need simple audio support.
How do we define the levels, and which levels are needed? 1) the audio driver level (OSS, ALSA); 2) the audio server level, i.e. network transparency, per-application audio control, mixing of multiple sound sources and sinks (ESD, aRts, Pulseaudio); 3) the low-level media API level for applications (libao, SDL, Gstreamer); 4) the high-level media API for applications (Phonon)? How can all this be simplified while maintaining backwards compatibility, and extending that compatibility across operating systems?
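For what it's worth, the four levels could be sketched as something like this. This is a toy Python sketch of the layering idea only; every class and method name here is invented for illustration and none of it is a real API from OSS, ALSA, Pulseaudio, or Phonon:

```python
# Toy sketch of the four proposed levels of an audio stack.
# All names are hypothetical; this only illustrates the layering.

class Driver:
    """Level 1: talks to the hardware (think OSS/ALSA)."""
    def write_pcm(self, samples):
        return len(samples)  # pretend the hardware consumed every sample

class AudioServer:
    """Level 2: mixing, per-app volume, network transparency
    (think ESD/aRts/Pulseaudio)."""
    def __init__(self, driver):
        self.driver = driver
        self.volumes = {}  # per-application volume control lives here

    def play(self, app, samples):
        vol = self.volumes.get(app, 1.0)
        scaled = [s * vol for s in samples]  # software mixing/scaling
        return self.driver.write_pcm(scaled)

class LowLevelAPI:
    """Level 3: convenience API for applications (think libao/SDL)."""
    def __init__(self, server):
        self.server = server

    def output(self, app, samples):
        return self.server.play(app, samples)

class HighLevelAPI:
    """Level 4: thin desktop-facing abstraction (think Phonon)."""
    def __init__(self, backend):
        self.backend = backend  # whichever level-3 API is available

    def play_sound(self, app, samples):
        return self.backend.play_sound_via_backend(app, samples) \
            if False else self.backend.output(app, samples)

stack = HighLevelAPI(LowLevelAPI(AudioServer(Driver())))
stack.play_sound("music-player", [0.1, 0.2, 0.3])
```

The point of the sketch is that each level only talks to the one below it, so (in theory) any level could be swapped without the applications above it noticing.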
Imaginary grand solution, ie. Holy Grail:
Create a new project called Advanced Open Desktop Sound System (AODSS). Talk the ALSA folks and the OSS folks into joining AODSS. Port all of the ALSA drivers to the OSS 4.x framework, and port all the cool mixing tech and GUI stuff from OSS 4.x to ALSA. Now merge these two things into one big gigantic FLOSS goulash named AODSS. Kill off ALSA, kill off OSS, and replace both with AODSS. AODSS must be 100% compatible with the entire history of OSS and ALSA. Make it possible to run Jack on top of AODSS at the same time, without any contention or latency, with a new project named Sydney-Pulse (SP).
SP must run on any operating system; it can only speak to AODSS, and it must emulate SDL, libao, ESD, and aRts. Make all heavy-duty audio apps use SP (i.e. Flash, Skype, RealAudio), and allow it to be used for all legacy apps which used libao or SDL. And combine it with NAS and any networked audio stuff left over in Xorg. Adapt Gstreamer to run on Windows and Mac OS X, i.e. gst-plugins-audio-core, gst-plugins-vista-directaudio. Force all applications that need more than trivial audio, but not hard-core audio, to use SP.
Create a new FDO standard called Elecnon to replace Phonon, and mandate that anything not already covered has to use this lib if it wants to ever be included in any distribution. Any application which refuses to play nicely with any or all levels of this new stack will result in a horde of FLOSS activists hunting down its programmers and pelting them with eggs.
The unfortunate part is that the original author of aRts, who now works on libsydney, had to expend extra effort to keep out certain dependencies without any gain, since some people decided they didn't want to use shared efforts for media decoding and would rather all do it again and again themselves.
Obviously at some point even better solutions appeared, but it is still a shame that the best solution of its time never got more widespread use.
superstoned is wrong on the D-Bus topic, but your last sentence here is wrong as well. KDE 4.x is only using D-Bus, not a DCOP + D-Bus combination like in the later 3.x series.
No part of the DCOP protocol conflicts with using a GLib mainloop, and I am pretty sure that libICE, which DCOP used as the transport layer, can be used from a GLib-mainloop-driven application just like it can be used from a Qt-mainloop-driven one.
I don't think that superstoned wants to imply that there has been a conspiracy; however, it is still weird that nobody wrote GLib-based client and ioslave base libraries when it became apparent that GNOME-VFS was not gaining traction as easily and application developers had to wait so long for an easier-to-use solution.
The Pulseaudio daemon can have any dependencies it would like to have; they don't impact any application using it.
Beauty of the service approach, see comments above about DCOP, also applies to KIO or GIO.
This is quite an understatement. GStreamer is the actual media handling facility. It does all the decoding, transformation and encoding, all bits “below” basically just pass on the data, at most perform only tiny transformation like adjusting volume.
Hmm, I’d rather phrase that like: it is implemented through existing media-frameworks and therefore automatically “inherits” the respective framework’s capabilities.
Sure the front-end API looks different than any of the frameworks used to implement it, but since it basically just forwards data to the active backend I don’t think it qualifies as a layer in the same sense as the other components, e.g. sound daemons.
It is more like bindings, e.g. GStreamer C++ bindings, just not binding to a specific framework.
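The "bindings, not a layer" point can be illustrated with a facade that merely picks a backend and forwards calls to it, adding no processing of its own. This is a toy Python sketch with invented names; it is not the actual Phonon API, just the shape of the idea:

```python
# Toy illustration of a front-end that only forwards to a backend,
# in the spirit of Phonon. All names here are invented for this sketch.

class GStreamerBackend:
    """Stand-in for a real media framework: does the actual work."""
    def play(self, url):
        return f"gstreamer decoding {url}"

class XineBackend:
    """A second stand-in backend with the same interface."""
    def play(self, url):
        return f"xine decoding {url}"

class MediaFacade:
    """Uniform front-end API; all real work happens in the backend."""
    def __init__(self, backends):
        # Pick the first available backend, as a plugin loader might.
        self.backend = backends[0]

    def play(self, url):
        # No decoding, no mixing, no buffering: just forward the call,
        # so the facade "inherits" whatever the backend can do.
        return self.backend.play(url)

media = MediaFacade([GStreamerBackend(), XineBackend()])
media.play("file:///tmp/song.ogg")
```

Because the facade holds no media logic of its own, swapping the backend changes capabilities without changing a single line of application code, which is why it reads more like bindings than like another layer in the stack.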
Why shouldn’t it? It has processed the data until it is suitable for consumption of the IO system, so why shouldn’t it interact with the IO system right away?
Especially since its design makes this optional.
Thanks for correcting me. It turns out that DCOP has been completely replaced by DBUS in KDE 4.x.
My comments concerning the Glib main loop are based on my reading of various blog posts and mailing lists over the past 2 years. I do not currently program in either KDE or GNOME, so I *could* be quite wrong. Qt 4.2 has Glib main loop integration, so KDE libs, which depend on Qt, can tie into Glib.
Certainly on the protocol level (libICE) there would have been no difficulties, yet I don't think the protocol is the salient point here. To be able to effectively use KDE stuff in GNOME, it is necessary to have both running in the same loop and the same process.
DCOP itself would not be impacted either way regarding Glib main loop integration. My bad. If DCOP had been pursued rather than DBUS, it would have required Glib integration in Qt, and the Qt developers would have had to approve Glib as a dependency (all of this is moot now: that of course did happen, three years later).
#1:
http://lists.kde.org/?l=kde-core-devel&m=110429020127854&w=2
The politics involved made it more difficult to achieve than simply writing DBUS, which has now completely replaced DCOP. DBus has been in development for about 2-3 years now; at the time, before DBus was written, Qt did not have Glib integration, and KDE libs, dependent on Qt, would have been unworkable in GNOME apps short of hacking in such Glib integration. Although some hacks were done to make this possible, only with Qt 4.2 did it become part of mainstream Qt.
#2:
http://developer.kde.org/documentation/other/dcop.html
To sum it up: if GNOME had been willing to abandon GNOME-VFS, and if KDE had been willing to accept a Glib dependency, this all could have happened.
My bad. Glib integration is provided in Pulseaudio, but it does not depend on it.
True.
Thanks for clarifying this. I guess if it is sufficiently transparent my concerns about another level are moot.
I guess it is debatable whether or not we need an audio server. Gstreamer talking directly to the hardware means that there may be conflicts if we are doing networked audio, and it makes per-app volume control more difficult, which is probably best done at the audio server level.
Eventloop integration is not needed for this since the application and the service are in different processes.
Or to give a probably more obvious example: if and which mainloop is used by a webserver does not have any impact on any web browser, nor does one web browser's choice of mainloop affect any other, since they interoperate through TCP/IP as the data transport and HTTP as the protocol.
No, see above.
Implementation details, such as which event loop is being used or whether one is used at all (a program could be using threads), of any of the involved programs (e.g. two DCOP clients communicating with each other through the DCOP server) do not limit the implementation choices of any of the others.
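This independence can be demonstrated with two endpoints that share only a socket and a trivial line protocol: one side happens to run a selectors-based event loop internally, the other uses plain blocking calls, and neither needs to know. A minimal Python sketch follows; the "protocol" (echo back in upper case) is made up and has nothing to do with the actual DCOP or D-Bus wire formats:

```python
import selectors
import socket
import threading

# Two endpoints that agree only on a transport (the socket pair)
# and a toy protocol (echo each message back in upper case).
server_sock, client_sock = socket.socketpair()

def event_loop_server(sock):
    """This endpoint uses a selectors-based event loop internally."""
    sel = selectors.DefaultSelector()
    sel.register(sock, selectors.EVENT_READ)
    while True:
        for key, _ in sel.select():
            data = key.fileobj.recv(1024)
            if not data:  # peer closed: shut the loop down
                sel.unregister(sock)
                return
            key.fileobj.sendall(data.upper())

t = threading.Thread(target=event_loop_server, args=(server_sock,), daemon=True)
t.start()

# This endpoint uses plain blocking calls: no event loop at all.
client_sock.sendall(b"hello dcop\n")
reply = client_sock.recv(1024)
client_sock.close()
t.join(timeout=5)
print(reply)  # the server's internal loop choice never crossed the wire
```

Only the bytes on the wire are shared; either side could be rewritten around GLib, Qt, or raw threads without the other noticing, which is the whole argument for putting the sharing at the protocol level.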
Politics, like the example you quoted, only rear their ugly head when people push the wrong level of sharing, e.g. trying to introduce a shared client library instead of encouraging the implementation of a native one.
True, but using KDE libs just to use DCOP would have been an artificial requirement, a GNOME application would more likely have used a GLib based DCOP library.
This is probably based on a misunderstanding of the KIO architecture, quite like the misunderstanding of DCOP.
Implementation choices of one side of the KIO communication channel, e.g. the KIO slave, does not introduce library dependencies on the other side of the channel, e.g. the KIO client.
You don’t need GLib in KDE just to be able to use KIO from a GNOME application.
anda_skoa,
I did get the point you were making, and did grasp the distinction between things requiring main-loop integration (for GNOME, KDE, and Qt apps) and things which, relative to whatever main loop (either Glib or Qt), are "out-of-process": KIO slaves and DCOP. I appreciate you beating this through my rather thick skull. Yet as I was googling around for information about KIO and DCOP, I saw no discussion of these things in regard to GNOME except where the issue of main-loop integration was discussed. Although they *are* separate issues, the discussion of these things took place in a context which coupled the two together.
I suspect that the GNOME devs suffered from a degree of NIH and saw the adoption of DCOP and KIO as something they could not stomach. Maybe it is just coincidence that by the time these things were being actively discussed, the issue of Glib main loop integration in Qt was already on the table and DBUS was already being written.
I guess if I speculate a little deeper, I would draw the conclusion that Miguel's love for ORBit and Bonobo (the two primary components of GNOME IPC) led the GNOME guys to look down on the simple "hack" which was DCOP. And although I am not completely sure, I think Bonobo was developed by Eazel, who wrote Nautilus, which was by far the most "promising" application for GNOME back then (Evolution, which also used ORBit and Bonobo, being the other "killer" app). IIRC GNOME-VFS also debuted in the 1.x series of GNOME and predates the KIO slaves by 2-3 years.
For what it's worth, ORBit and Bonobo were really nice technologies, at least (if not only) in theory, because in reality they were bug-ridden nightmares, much like GNOME-VFS. Maybe it was loyalty to these "killer" apps? To this day DBUS usage in GNOME does not offer a fraction of the IPC stuff which Bonobo did (albeit DBUS is now used for whole new levels of IPC and integration which Bonobo never enabled), and I really miss being able to dynamically pull in separate views in Nautilus. Though I suspect this has far more to do with design choices (and the HIG) than with the capabilities of DBUS, which due to its similarity to DCOP could provide something similar to KParts for GNOME.
I part ways with part of what you said regarding politics. I don't think that either KDE or GNOME makes any truly neutral design decisions; politics is there from the get-go. Take any work going on at FDO in the past years and you will see that politics is almost always at play. Maybe the rule can be summarized thusly: wherever there is a vested interest, there is politics.
I can’t help the feeling that these issues are neither here nor there, and that we are second guessing what perhaps ought to have been, but which never was.
Thanks for helping to clarify the distinctions and weed out my misconceptions. Sometimes I am a bit slow ;)
I don’t like this version. I can’t turn off cursor blinking in gnome-terminal.
Quote from bugs.launchpad.net:
I guess that’s what you’re experiencing …
i had to turn off blinking in xfce keyboard settings (i use xfce with gnome-terminal)
GNOME treats artwork like it's not there; this is why they just slapped that green leaf on at the last minute.
KDE devs take artwork and just use it if they like it; with GNOME, on the other hand, it's like trying to get blood from a stone. You would have seen more improvements in this release, but because of the poor communication and unwillingness to use original ideas, you're stuck with the same old stuff.
Artwork may be a small part of the DE, but it's the first thing people see. They have a poor attitude towards this, and that's why it's been so dull in the past; it's not much better now.
What are your complaints aside from artwork (which honestly is usually modified by the distro anyways)?
That's not the point. If you go by what you say, then all the other DEs don't need to do artwork either. The default look is what GNOME is marketing, and if you look at review sites they DO use the default look.
It’s not just my opinion, I’ve experienced this first hand, it’s just what they are like.
Leaf slapped on last minute? It’s there since 2.20 and wallpaper selection process was well documented on Gnome blogs.
Last minute of 2.20. Not many actually liked it, but because so little attention was paid to the wallpaper (as was consistently said), that's what was used.
BTW, the guy who did the clearlooks theme didn’t even like it and practically begged for it to not be default, it just didn’t go well with the theme.
Edited 2008-03-13 15:37 UTC
If people did not like it, it would’ve changed during 2.21. There is a plan to refresh the images though.
IMO I found the image pretty similar to one background I saw for another desktop environment.
How do I get modded down for something truthful, yet people can go on about how KDE does things better in a GNOME thread and get modded up?
It's not truthful, it's subjective. In other words, it's just your opinion. I happen to think the default Gnome looks pretty good, especially with compositing turned on to add a couple of drop shadows here and there.
I agree. I really like the GNOME theme; it has been the most original theme on Linux, maybe until Oxygen came out. Especially the Ubuntu version, I think, is original, though many people don't like the orange. I must say it gets rather annoying if you work with it for a while, but at first it looks refreshing compared to all the blue from other themes.
I'm talking about the default wallpaper, not the theme (which I like personally). Some people will like the green leaf wallpaper, but the point is that it was picked by a few members who happened to like it, not by the people who were supposed to be making one. On top of that, the guy who picked it had the old Clearlooks colour (which he was comparing to the green leaf) and even posted that in the final release shots of 2.20 when he shouldn't have.
Edited 2008-03-13 20:49 UTC
Never attribute to malice that which is adequately explained by stupidity – Hanlon’s razor
Maybe he just forgot to change the theme color etc. Anyway, pointless discussions about favorite colors make my brain hurt
Agree with that. I’ve made various screenshots where I didn’t realize that I had an old Metacity theme (due to stuff in $HOME vs the system installed theme).