This article is about KDE’s memory usage, but it includes comparisons with other desktop solutions. Lubos Lunak concludes there is still work to do and some low-hanging fruit to pick, but KDE already does a nice job of keeping its memory usage low.
RAM’s cheap. I think the time to care about 50-70MB of memory for your desktop was 5 years ago, but these days machines ship with 512MB of RAM by default, and even older machines are usually upgraded.
I’ve yet to have trouble running any DE, KDE/Gnome/WMaker, on my laptop with 184MB of System RAM (8MB shared to video).
I usually don’t throw Firefox on the poor thing though; that’s just the straw that breaks the donkey’s back (it’s not good enough to be called a camel).
50-70MB of RAM is rather small. My irssi session is currently using 40MB, but it’s actually doing something useful.
I still don’t understand what KDE and Gnome do that requires my RAM. For that amount of RAM I’d expect them to offer me something to improve my ability to produce.
* Firefox offers me rather good web browsing
* MPD offers me music to simulate my brain
* IDLE offers me python programming
* ratpoison offers me window management
* udev offers me device detection
* mplayer offers me movies
* xbindkeys offers me physical buttons to launch all my applications
What does KDE offer?
* software buttons to launch applications
* borders around my applications
* ???
– Jesse McNelis
> What does KDE offer?
> * software buttons to launch applications
> * borders around my applications
> * ???
> – Jesse McNelis
What about all the libraries (kdelibs)? Did you forget about those? Plus, taskbar buttons have long since stopped being simple buttons to launch applications. There are loads of other things involved, like the desktop pager.
Well, part of it is X, which is probably using a bunch of shared memory to keep things going well.
Then you’ve got a score of libraries which are required for the environment and the applications meant to work well in it. But that’s not that huge.
There’s also the fact that while a process may use 50MB of memory, the OS will probably give it 70MB, or more. There’s always extra free memory in there, sort of a reality of malloc/free.
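To make the malloc/free point concrete, here is a minimal sketch (glibc-specific; mallinfo() is a glibc extension) of how memory a process has free()d can stay charged to it – the heap gets fragmented and the allocator can only give pages back to the OS from the top:

#include <cstdio>
#include <cstdlib>
#include <malloc.h> // glibc-specific: mallinfo()

int main() {
    const int N = 1000;
    void* blocks[N];
    // Allocate 1000 x 64KB on the heap (below glibc's mmap threshold).
    for (int i = 0; i < N; ++i)
        blocks[i] = std::malloc(64 * 1024);
    // Free every other block: the heap is now fragmented, so the
    // allocator cannot shrink it even though half the memory is "free".
    for (int i = 0; i < N; i += 2)
        std::free(blocks[i]);
    struct mallinfo mi = mallinfo();
    // uordblks = bytes the program still uses; fordblks = free bytes the
    // allocator holds on to, which the OS still counts against the process.
    std::printf("in use: %d KB, held free by malloc: %d KB\n",
                mi.uordblks / 1024, mi.fordblks / 1024);
    return 0; // the kernel reclaims everything at exit
}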
To load those icons you end up, probably, loading decompression algorithms for every single image type, including code to render SVGs…
KDE offers a wrapper for Qt, plus Qt itself. Then it’s got communication stuff, and other helper libraries. It ends up wrapping pretty much everything, I believe, to the point where it can get applications to work “transparently” over something like sftp://.
Then there’s kdeinit, which I believe does some prelinking for KDE libs so that your apps start up faster.
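For the curious, kdeinit’s main trick (as I understand it) is less prelinking than avoiding repeated dynamic-linking cost: a master process that already has Qt and kdelibs mapped fork()s for each new application and dlopen()s it as a module, jumping to its entry point – KDE’s convention for that entry point is kdemain(). A stripped-down sketch of the idea; the module name and error handling are illustrative, not KDE’s actual code:

#include <dlfcn.h>
#include <unistd.h>
#include <cstdio>

typedef int (*entry_fn)(int, char**);

// Launch an app from a process that already has the big libraries
// mapped: the child inherits them for free via fork().
int launch(const char* module, int argc, char** argv) {
    pid_t pid = fork();
    if (pid != 0)
        return pid; // parent keeps serving launch requests
    // Child: load the application as a shared object...
    void* handle = dlopen(module, RTLD_LAZY); // e.g. "libkdeinit_konsole.so" (hypothetical name)
    if (!handle) { std::fprintf(stderr, "%s\n", dlerror()); _exit(1); }
    // ...and call its main-like entry point.
    entry_fn entry = (entry_fn)dlsym(handle, "kdemain");
    if (!entry) { std::fprintf(stderr, "%s\n", dlerror()); _exit(1); }
    _exit(entry(argc, argv));
}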
As for Gnome though, I really have no clue. But, just for fun, I blame Metacity.
“KDE offers a wrapper for Qt, plus Qt itself”
No. KDE offers classes built with Qt. There’s no wrapper. Helpers, perhaps.
If they could swap Qt out for something different, because client applications never make a call to any Qt classes, then it’s a wrapper.
I’ve never programmed KDE, but from the way the applications function it seems to me that many things go beyond what Qt does. For example, toolbars always have KDE’s toolbar customization interface. Menus are always detachable and can be applied to KDE’s desktop toolbar (à la Mac OS X).
ma_d wrote: “If they could swap Qt out for something different, because client applications never make a call to any Qt classes, then it’s a wrapper.”
You obviously don’t understand C++. KDE classes are subclassed from the Qt classes in most cases. So even though you don’t see something like a QWindow in the code, a KWindow is using the members and methods of that Qt class.
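For readers who haven’t seen the distinction in code: a KDE-style class extends a Qt class through inheritance rather than hiding it behind a wrapper. A made-up illustration in Qt 3 style (KFancyButton is invented for this example):

#include <qpushbutton.h> // Qt 3 header

// A "KDE" button adds behaviour on top of QPushButton
// instead of wrapping it in an opaque shell.
class KFancyButton : public QPushButton {
public:
    KFancyButton(const QString& text, QWidget* parent = 0)
        : QPushButton(text, parent) {}
    void readSettings() {} // e.g. load a user-configured icon (invented)
    // Every QPushButton/QWidget member (setText(), show(), ...) is
    // inherited as-is, and callers may use the Qt API directly -
    // which is exactly why Qt could not be swapped out underneath.
};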
> You obviously don’t understand C++. KDE classes are subclassed from the Qt classes in most cases. So even though you don’t see something like a QWindow in the code, a KWindow is using the members and methods of that Qt class.
Just to clarify: While many classes in kdelibs are indeed augmented subclasses of Qt classes, that doesn’t mean applications always need to use them or that applications don’t use many other Qt classes directly (you didn’t contradict that, of course, just wanted to make it clear for the discussion).
In fact, as Qt has taken on more of what kdelibs used to do in version 4, quite a few kdelibs classes are being retired in favor of new Qt4 classes for KDE 4 – shifting the focus of kdelibs a little bit more towards high-level frameworks, perhaps.
> What does KDE offer?
> * software buttons to launch applications
> * borders around my applications
> * ???
Just to add to your list:
* Global spell checking (konqueror, kmail, etc)
* Global password management (konqueror, kmail, kopete, etc)
* Transparent network access to your data in all KDE applications (sftp, ftp, etc)
* Reusable libraries. If you run separate apps, they all load their own libraries for GUI and such. Firefox has XUL, IDLE has Tkinter, mplayer has… something, maybe just their own custom thing, openoffice has their own (big) toolkit. If you replace those with KDE apps, they all use the same libraries. So the overhead for one is large, but as you run more apps, it gets less and less.
* Features. Tons of them. I guess the apps you listed are good enough for you, but I like my software to have features. Amarok instead of MPD, Eric3 instead of IDLE, KWin instead of ratpoison, and Kaffeine or Codeine instead of mplayer.
I used to be a big fan of Fluxbox, because it was so fast and small, but eventually I made the decision to trade some resources for an actually user-friendly system. And as you see from the article, it doesn’t even cost me much at all.
KDE has a pretty steep memory footprint IMHO. A typical commit charge with a few programs running is about half of my installed physical memory.
At least it doesn’t invoke the swap partition often, thanks to Linux’s decent low-level memory management. This is in contrast to Windows, where once-allocated memory is not always freed properly and needs to be deallocated and defragmented with 3rd party utilities.
The situation was even worse in Windows prior to 2000 and XP.
> KDE has a pretty steep memory footprint IMHO.
Did you read the article? Every desktop environment suffers from that problem.
> A typical commit charge with a few programs running is about half of my installed physical memory.
Commit charge != memory used.
> At least it doesnt invoke the swap partition often thanks to Linux’s decent low-level memory management.
Nod, that’s true.
> This is in contrast to Windows where once allocated memory is not always freed properly
Sorry, that’s bull either way you look at it. If you’re referring to leaks in the Windows kernel – no, I’ve rarely even had reason to suspect something like that on NT-based kernels, and it usually turned out to be my application’s fault (it’s not Windows’ mistake if my app VirtualAllocs a huge amount of memory but forgets to clean it up, it’s my mistake) – and, like under any halfway decent OS, any memory allocated by a process is automatically (or rather by definition) freed by the kernel on process termination.
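To make that concrete, here is a tiny Win32 sketch of the kind of application bug being described; the API calls are real, the leak is deliberate:

#include <windows.h>

int main() {
    // Commit 100 MB and never release it: an application bug, not a Windows one.
    void* p = VirtualAlloc(NULL, 100 * 1024 * 1024,
                           MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
    (void)p; // ... and we "forget" to call VirtualFree(p, 0, MEM_RELEASE)
    // While the process lives, those 100 MB stay committed against it.
    // The moment it exits (even by crashing), the kernel reclaims them
    // by definition - no third-party utility required.
    return 0;
}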
> and needs to be deallocated and defragmented with 3rd party utilities.
No, no, no. Third party “memory defragmenters” are snake oil. See my previous comment. In the best possible case “utilities” like this do nothing and take up some memory for themselves (by the simple presence of the process), and in the typical case they start trimming the working sets of other processes causing the kernel to swap needlessly thereby reducing performance greatly.
> The situation was even worse in Windows prior to 2000 and XP.
No argument from me there.
Did you read the article? Every desktop environment suffers from that problem.
No, I didn’t read the article, because I already knew that. KDE has a notoriously large memory requirement. It can easily demand 500+ MB with no extra programs running.
Commit charge != memory used.
Correct. Commit charge is the physical and virtual memory available to the OS. I used it in the context of physical memory.
Sorry, that’s bull either way you look at it. If you’re referring to leaks in the Windows kernel – no, I’ve rarely even had reason to suspect somthing like that on NT-based kernels, and it usually turned out to be my application’s fault (it’s not Windows’ mistake if my app VirtualAllocs a huge amount of memory but forgets to clean it up, it’s my mistake) – and, like under any halfway decent OS, any memory allocated by a process is automatically (or rather by definition) freed by the kernel on process termination.
Yes, it’s both. An application is responsible for freeing unused memory blocks during execution and everything at termination. This can be done using the free() function in C, or dynamically by garbage collection or reference counting.
The NT kernel also frees memory once an application process is terminated, but it is not very efficient or effective in certain scenarios, such as when a crash occurs or when running poorly written programs. The kernel, however, should still be capable of handling this in theory.
My point is memory management is unequal and varies between OSes. This is why Windows often becomes sluggish after a period of use without a restart.
No, no, no. Third party “memory defragmenters” are snake oil. See my previous comment. In the best possible case “utilities” like this do nothing and take up some memory for themselves (by the simple presence of the process), and in the typical case they start trimming the working sets of other processes causing the kernel to swap needlessly thereby reducing performance greatly.
It may be true, but eventually the swapping ceases and overall performance is restored without resorting to a system restart. At least this has been my experience.
KDE has a notoriously large memory requirement. It can easily demand 500+ MB with no extra programs running.
Oh, come on now. A base KDE system with nothing running only takes 60MB according to the article, and I know I’ve seen it at 65 myself with a Konsole open in Gentoo. Unless by “no extra programs running” you meant that you were running every single KDE app you could find at once.
No I didnt read the article because I already knew that.
I suggest you make the effort then.
KDE has a notoriously large memory requirement.
Notorious? Says who? And notoriously large compared to what, exactly? A command line environment and Vi?
We’re talking about a desktop environment with the applications demanded by people who use desktops, and the infrastructure to support the quality of those applications.
It can easily demand 500+ MB with no extra programs running.
And we’re going to believe you and not the article – for what reason, exactly? The 500 MB figure is absolute bull. You’re going to have to itemise where it came from and give us some output from what you’re measuring it with for it to mean anything to anyone – as the article has done.
Single anecdotes like “KDE consumes all my memory! OMG!” simply don’t wash, and that’s partly why Lubos wrote the article in the first place, I would imagine – to allay some of these questions and misconceptions. The Gnome performance initiative, on the other hand, came about because far too many people (bug reporters, reviewers, users and customers) were experiencing the same things, and there has been some subsequent performance investigation to back that up. I see no evidence of that here with KDE.
I suggest you make the effort then.
I’m too lazy to read. 🙂
Notorious? Says who? And notoriously large compared to what, exactly? A command line environment and Vi?
This is of course distro-dependent, because SUSE’s KDE is heavier than, say, Slackware’s.
And we’re going to believe you and not the article – for what reason, exactly? The 500 MB figure is absolute bull. You’re going to have to itemise where it came from and give us some output from what you’re measuring it with for it to mean anything to anyone – as the article has done.
Well, just because SUSE’s KDE uses a great deal of RAM doesn’t mean I dislike it. KDE happens to be my desktop of choice. Your hostility is unfounded, and you are quick to make assumptions.
But my proof is in KDE System Guard under the Process Table tab. It currently says 82 processes are running, using 581,644 KB of memory. That’s what a typical session uses with one Firefox instance running.
You’ve probably got:
All of Gnome’s libraries loaded on top of KDE’s.
100MB for Firefox.
Beagle – is this still a huge memory hog?
Other mono stuff – ZMD?
Tons of extra stuff running in the background that distros like Suse start up.
None of these have anything to do with KDE being bloated. Also, does System Guard show the actual used memory, or does it add in caches and buffers as well? If it does, then whatever you run, Linux will try to fill up the entire amount of RAM with cached files for performance reasons.
If 41 out of those 82 processes are KDE applications, then all code pages and all shared data pages of Qt and kdelibs have been counted 41 times towards that total. KSysGuardD does exactly the same silly thing top does (i.e. using /proc/<pid>/status data without explaining that it can be very misleading). And this is just an example – any code or data page shared by more than one process is counted misleadingly by /proc-based utilities. If you had read the article you would have taken that into account.
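You can see the double-counting for yourself: /proc/<pid>/statm reports each process’s total, resident and shared page counts, and naively summing the resident column across 41 KDE processes counts the single shared copy of Qt and kdelibs 41 times. A rough, Linux-specific sketch:

#include <cstdio>
#include <fstream>
#include <string>
#include <unistd.h>

// Sum the "resident" and "shared" columns of /proc/<pid>/statm for the
// PIDs given on the command line. The naive resident total counts every
// shared page once per process that maps it.
int main(int argc, char** argv) {
    long pageKB = sysconf(_SC_PAGESIZE) / 1024;
    long totalResident = 0, totalShared = 0;
    for (int i = 1; i < argc; ++i) {
        std::ifstream f(("/proc/" + std::string(argv[i]) + "/statm").c_str());
        long size, resident, shared;
        if (f >> size >> resident >> shared) {
            totalResident += resident;
            totalShared += shared;
        }
    }
    std::printf("naive RSS sum: %ld KB, of which %ld KB are shared pages\n"
                "counted once per process\n",
                totalResident * pageKB, totalShared * pageKB);
    return 0;
}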
“My irssi session is currently using 40MB, but it’s actually doing something useful. ”
How can your irssi session use that much RAM?
Exaile, my music player, is written in Python and so is very memory-hungry. It uses 35 MB of RAM with a big library loaded.
There is something very wrong when a simple IRC client uses more RAM than a music player written in goddamned Python.
More respect for irssi and Python.
Python isn’t the kind of thing I’d like to have on my DESKTOP.
Mono/C# is a *lot* faster than CPython and is just as great as a programming platform. Exaile is, for my taste, the best Gnome music player. I wish it were in C or C#.
“What does KDE offer?”
Mmmm… a framework to build applications? Wait, no, what’s the relevancy of that? Mmmm… K3B, KOffice, Konqueror, Amarok, Kopete? Well, f–k, KDE, who needs K3B if there is cdrecord & company? KOffice? Pfff… there’s vim, sc, SQL. Who will need anything else? Amarok, you said? Hell, no, that monster eats all my memory. Oggdec is more than enough. It’s a no-brainer. Anyway, what’s that Kopete thing?
clearly, you’re one of the ppl Lubos was talking about when he said: “Their users are happy with what they have and don’t care about KDE, or, in the worse case, they badmouth KDE but would never switch anyway. The same way our users are unlikely to switch because Window Maker is not a desktop from our point of view.”
“clearly, you’re one of the ppl Lubos was talking about when he said: “Their users are happy with what they have and don’t care about KDE…”
You probably didn’t note the sarcasm.
lol, sure didn’t 😉
* MPD offers me music to simulate my brain
Good point
“I’ve yet to have trouble running any DE, KDE/Gnome/WMaker, on my laptop with 184MB of System RAM (8MB shared to video).”
Hey, I’ve heard such claims a lot, so I’ve got to ask: what are your usage patterns? Do you not have a lot of panel applets and such? 3 of my machines have 256MB and two of them share 16MB to their video cards, and they can be alright if I do not do too much, but when I try to do everything I want they are regularly swapping, every 10 minutes or at each application switch.
I use GNOME, the big disappointment in the article, and what I usually run are the following:
* Firefox or Epiphany (Epiphany ends up saving me around 5MB according to GNOME’s System Monitor).
* GAIM
* Mail Notifier
* applets including clock, notification area, volume, menu, taskbar, network monitor, system monitor, logout, and virtual desktop switcher
* quicklaunchers for 8 applications
* 3 or so terminals
* a couple of Nautilus windows
I cannot dream of keeping a mail client (the fat evolution or even thunderbird) open or using any word processor other than Abiword without having to go for a jog around the block before my system sorts itself out. I am sure it would help if I had faster RAM, but I think for what I run, 256MB should be enough. It seems to be for KDE 😐
So yah, how do you use your desktop?
“RAM’s cheap. I think the time to care about 50-70MB of memory for your desktop was 5 years ago, but these days machines ship with 512MB of RAM by default, and even older machines are usually upgraded.”
You are very narrow-minded. There’s a use for this in the embedded market, btw.
You don’t want to use something like the monster Gecko (mozilla) on embedded. You want to use KHTML or Opera.
I think it’s great to keep memory requirement low because it’s easier to port software to the embedded market.
So let me get this straight, in your embedded solution, you are running X with a DE? Hmmmm. Not sure what to say here.
Google for ‘maemo’ and ‘olpc’ to see examples of GNOME and X being used in memory restricted systems.
———-RAM’s cheap. I think the time to care about 50-70MB of memory for your desktop was 5 years ago, but these days machines ship with 512MB of RAM by default, and even older machines are usually upgraded.————
Kind of a weak argument, no?
I know I’m not the only one who feels this way… For as much as I’d *like* to buy more RAM because it’s so cheap, I’ve got plenty of other things I could spend that money on.
Bloat is as much of a problem as insecure or buggy code is.
There’s no reason I should be forced to upgrade every 6 months because my computer can’t handle my operating system.
What Vista is going to do to computers is just amazing to me. I mean, granted, with new features you’re going to have bigger code, but “RAM is cheap” is no excuse for lazy programming or bad management.
They need to clean that stuff up!
Yes, RAM is cheap, but how do you think that 50-70MB of data got there? From the hard disk, and that means longer startup times. If the files are spread out, that could be 10 seconds to load 50-70MB with all the seeking.
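(A rough back-of-the-envelope, assuming a typical disk of the time: 70MB read sequentially at ~40MB/s takes under 2 seconds, but if that data is scattered across a thousand small files, a thousand ~8ms seeks add another 8 seconds – which is roughly where a 10-second figure comes from.)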
KDE has done a very good job of keeping memory requirements low thanks to Qt and things like kdeinit. The gnome developers are really working on improving the memory usage, but it should have been like that from the start. Even so, the gnome hackers are still hacking things like cairo and pango/fontconfig which ALSO improve KDE. The new fontconfig is much faster:
http://fontconfig.org/wiki/2_2e4_20release_20notes
Good job to the KDE developers in making a good desktop and keeping some variety other than gnome in the OSS world.
Agreed. The KDE figures look quite impressive there, and hopefully will serve as an incentive to Gnome developers to further improve. So far, I have been quite impressed with the amount of effort Gnome devs have been putting into reducing memory usage and improving optimization—I’d like to think they would take cues from KDE where appropriate.
One thing I think is worth noting is that, while the linked benchmarks seem to have been conducted very fairly, they weren’t really conducted according to typical usage. For example, who really starts up KDE or Gnome from an xterm? I realize that the point was to be as specific as possible about what was being measured, but for typical usage I think it would be worthwhile to run a similar test on, say, Kubuntu versus Ubuntu (and Xubuntu). Since I’m running Edgy at home, I may install kubuntu-desktop and run some similar tests tonight, to compare KDE and Gnome in a more typical, commonly-used environment (incidentally, I still expect KDE to win, although I personally prefer Gnome).
Maybe I’ll report those results tonight (:
KDE does not use Cairo or Pango, it’s enough that Gtk/GNOME is blamed for being slow because of them. Improving them will only bring Gtk/GNOME back to where they had been before they started using Cairo and Pango.
And while Patrick Lam’s work on fontconfig seems to be the result of a bounty funded by Novell (which I actually don’t think really qualifies as “gnome hackers working on fontconfig”), it should also be noted that KDE developers had pointed out these performance problems, and this solution, to the fontconfig developers a long time before that. And you can find the names of KDE developers in the fontconfig changelog as a result of their patches improving fontconfig performance.
The gnome developers are really working on improving the memory usage, but it should have been like that from the start.
I’m afraid it’s just a rather good example of the fact that Gnome didn’t have a well worked out architecture before it was started, nor a clear idea of what development tools the developers were actually going to use to build it.
I know they say optimising should usually be done after software has been completed, but in software as large as a desktop, a well worked out architecture with separate components (Qt, kdelibs etc.) that have themselves been the subject of optimisation work at various milestones helps absolutely no end, I’m sure.
Trying to optimise a desktop after the event, and trying to work out what is actually happening (a key problem in trying to optimise software) is nigh on impossible in a complex piece of software. A good example is Windows, and over the years Microsoft have had the resources to throw at it to try and organise and optimise it effectively, and the programmers at Microsoft certainly deserve praise. It’s no mean feat. However, Vista tells everyone that they have failed in the long run. It’s looking less and less likely that the Windows codebase can be changed effectively to add new features in a trouble free manner. Regular improvements and commits to SVN, and then building a new version regularly is the way to go here rather than rushing to build a huge new version with ill-defined features and improvements.
In the open source world, KDE and Gnome just do not have the resources to throw at dedicated optimisation work after the horse has bolted from the stable. Efficiency is the key.
Even so, the gnome hackers are still hacking things like cairo and pango/fontconfig which ALSO improve KDE. The new fontconfig is much faster
The Fontconfig work will help everyone I think, and I can remember KDE people like Lubos being involved as well in all that. I don’t believe KDE has a plan to use Cairo at any point (Arthur is a part of Qt) but they could certainly use it at a later date.
I completely agree with you, segedunum. The problem really is GTK. The codepath for doing simple operations is too long, and this makes it difficult for new developers to get into. On the flip side, KDE uses Qt, which is C++ and doesn’t force you to learn things like GObject.
Kudos to Trolltech for doing such a great job on Qt, but if they had released it under a better license to begin with, Gnome would never have been developed.
btw: The biggest problem with Vista is backwards compatibility. That’s what is killing MS.
Well, you can use C++ for Gnome/Gtk apps too, if you want. Inkscape is a good example. I think the ability to use pretty much any language you like is actually one of Gtk’s strengths.
Anyway, the Xfce stats in TFA seem to suggest that Gtk itself is actually pretty good, with regard to memory usage at least. It seems though that the Gnome libs are rather less efficient.
Also, both Abiword and Gnumeric can be compiled to either use Gnome libraries, or just a “plain” Gtk version. I’m not entirely sure what the difference is, but both versions are available in the Ubuntu repositories. I’d guess the latter version is good if you’re using Xfce, but it seems the author wasn’t aware of this.
I completely agree with you, segedunum. The problem really is GTK. The codepath for doing simple operations is too long, and this makes it difficult for new developers to get into.
The problem is not GTK+, as shown by the results of XFCE. You did realise that XFCE then loses because of non-GTK apps like OOo, right?
As for the codepath, I guess that’s a matter of taste. I don’t find it too long at all.
On the flip side, KDE uses Qt, which is C++ and doesn’t force you to learn things like GObject
Then use gtkmm and stop being a C++ zealot, please.
Kudos to Trolltech for doing such a great job on Qt, but if they had released it under a better license to begin with, Gnome would never have been developed
This I agree with. But at the same time, KDE would not have advanced so much. The competition between KDE and Gnome is a very good thing.
The biggest problem with Vista is all the hacks they put into it, defying every basic maintainability engineering concept.
I’m afraid it’s just a rather good example of the fact that Gnome didn’t have a well worked out architecture before it was started, nor a clear idea of what development tools the developers were actually going to use to build it.
Uh, the GNOME platform was planned ahead. The development tools of choice are obvious. The basic ideas of the current GNOME platform (CORBA, Bonobo) were thought out even before the 1.0 release. For such a well-planned platform, Qt sure has had too many major releases so far.
There were mistakes, of course, but not just on the GNOME side. Your little comparison to Windows is completely irrelevant to both GNOME and KDE.
GNOME matured, and with that the platform is changing. I personally really like where it’s going with Project Ridley, which simplifies it a great deal. It is kind of the opposite of what KDE is doing, which is building a platform on top of a toolkit, with a lot of duplication; GNOME is building the toolkit to be the platform.
> It is kind of the opposite of what KDE is doing, which is building a platform on top of a toolkit, with a lot of duplication; GNOME is building the toolkit to be the platform.
May I ask you to elaborate on that? Because I’m not seeing it. kdelibs duplicates rather little of what Qt is doing; it extends and augments it, and adds several frameworks entirely of its own. In the KDE 4 development cycle, a lot of kdelibs code is being retired in favor of new Qt 4 classes that have been added to the Qt toolkit, so that kdelibs can concentrate on other tasks.
Heavy code reuse is one of the core tenets of the KDE programming model – have a look at its strong component model, which was designed to facilitate this, for example – and duplication is generally avoided unless there is a practical necessity.
Uh, the GNOME platform was planned ahead. The development tools of choice are obvious. The basic ideas of the current GNOME platform (CORBA, Bonobo) were thought out even before the 1.0 release.
And just how successful have the choices of CORBA or Bonobo actually been? I rest my case there. That has all culminated in the perceived need for higher level languages like C# (wrong actually) and the Mono framework.
Your little comparison to Windows is completely irrelevant to both GNOME and KDE.
In what way?
GNOME matured, and with that the platform is changing.
Gnome still has largely the same architecture, with a lot of different libraries and no common programming framework, that it has always had.
I personally really like where it’s going with Project Ridley, which simplifies it a great deal.
Project Ridley at the moment is a wiki page, and is merely an initiative to consolidate functionality into GTK. There’s certainly been no programming done, nor a list of definites finalised, on anything beyond that. What would eventually be Gnome 3 is a huge amount of work, and basically it’s a rewrite and a complete reorganisation of the architecture.
It is kind of the opposite of what KDE is doing, which is building a platform on top of a toolkit, with a lot of duplication;
There’s no duplication there. You need an adequate toolkit and programming framework at the core, and then build your desktop infrastructure on there. You need to build your desktop with the right programming tools tested, optimised and debugged first – you can’t just do it ad-hoc.
I would argue, from what I’ve seen, that Gnome is looking to go the same way as KDE by consolidating as much functionality into GTK as possible.
GNOME is building the toolkit to be the platform.
while(cart)
{
horse();
}
I would argue, from what I’ve seen, that Gnome is looking to go the same way as KDE by consolidating as much functionality into GTK as possible.
I was going to add that in consolidating functionality into different layers, or a core layer of a whole system, that you run into the same issues of coupling and cohesion that every programmer or analyst does ;-). It’s not quite as simple as libgnomeMustDie.
And just how successful have the choices of CORBA or Bonobo actually been? I rest my case there.
Your case was that the GNOME developers didn’t plan ahead, and I showed that your case was bullshit.
The success of CORBA and Bonobo is relative. You could consider it a success if you consider the fact that applications like Evolution have been developed on top of it, and are deployed on enterprises by companies like Sun, Red Hat and Novell. Or you could consider it a failure if you point out that the GNOME developers have developed a better system to replace it.
That has all culminated in the perceived need for higher level languages like C# (wrong actually) and the Mono framework.
So why is Trolltech starting to support Java now? Is there a “perceived” need of a better language than C++, and is that wrong too?
In what way?
Because it has nothing to do with GNOME at all? The Vista project is a failure mainly because Microsoft decided to rewrite major parts of the system, kind of like KDE 4.
Project Ridley at the moment is a wiki page, and is merely an initiative to consolidate functionality into GTK. There’s certainly been no programming done, nor a list of definites finalised, on anything beyond that.
You’re not really keeping up with GNOME then, because you don’t know what Project Ridley is. It’s an ongoing effort, and you can see the results in each new release. GTK+ 2.10 replaces libgnomeprint* for example, which becomes obsolete and will be removed in the next GNOME release.
What would eventually be Gnome 3 is a huge amount of work, and basically it’s a rewrite and a complete reorganisation of the architecture.
Not really, it will simply remove the libraries that are being deprecated in each new release. There won’t be a major rewrite of the GNOME platform; the only parts that need big changes are gnome-vfs and, to a smaller extent, GConf. The rest is basically a clean up of old stuff (removing ORBit and bonobo for example).
So why is Trolltech starting to support Java now? Is there a “perceived” need of a better language than C++
No, I would imagine that there is a “perceived” need among Java users for a better toolkit 😉
Your case was that the GNOME developers didn’t plan ahead, and I showed that your case was bullshit.
The success of CORBA and Bonobo is relative.
That’s called planning ahead. Just how many applications throughout Gnome used, and merely inherited, Bonobo and CORBA functionality when compared to something like DCOP in KDE? Obviously no one planned ahead as to just how poor a fit CORBA was going to be.
Or you could consider it a failure if you point out that the GNOME developers have developed a better system to replace it.
Well no. I consider it a failure because CORBA in the wider world has been a failure, and Bonobo just wasn’t used and embedded through the desktop infrastructure. Oh, and it’s actually KDE that is now using that new system called D-Bus, rather than just using it in select areas.
So why is Trolltech starting to support Java now?
Because there’s a market for Java development tools, not because something like KDE needs it. It doesn’t mean that KDE, being the size of project and natively compiled that it is, is looking for a main language away from C++.
Is there a “perceived” need of a better language than C++, and is that wrong too?
At a purely desktop level there is a need for a higher level language in KDE, which is why Ruby has been given so much attention.
However, in the case of Gnome the problem is that they’re developing an object oriented piece of software in a language that isn’t object oriented and they just don’t have the toolkit to develop the base of their desktop in, such as the Mono framework, Java classes or Qt toolkit. They need a higher level language, as well as a higher level development framework, to develop their desktop in.
Because it has nothing to do with GNOME at all? The Vista project is a failure mainly because Microsoft decided to rewrite major parts of the system
And just why is Microsoft having to rewrite major parts of the system?
You’re going off on a tangent here. My original argument was that in large software projects it is incredibly hard to do performance optimisation and organisation of it after it has been completely developed. Over the years this has caught Microsoft out, but they have had the resources to panel beat the massive Windows codebase into shape and cracks are now beginning to show. In the case of Gnome, they just don’t have those kind of resources in the way that Federico Mena-Quintero is now bravely looking at many problem areas. A good 95% or more of his problems are actually finding out what’s going on, and what goes where. In the case of KDE they have a fighting chance because of the structure of the system.
I’ve no idea in what way you’re trying to argue otherwise.
…kind of like KDE 4.
Well no actually. The same code still exists, but it is being ported to a new toolkit. Although changes will occur, there isn’t a massive change of structure in the project. kdelibs, kdebase etc. will still exist.
You’re not really keeping up with GNOME then, because you don’t know what Project Ridley is. It’s an ongoing effort, and you can see the results in each new release. GTK+ 2.10 replaces libgnomeprint* for example…
Consolidating existing libraries into GTK doesn’t by itself make up a new major release.
Not really, it will simply remove the libraries that are being deprecated in each new release. There won’t be a major rewrite of the GNOME platform
Well no, I would hope there wouldn’t be. However, it’s pretty clear that if they are going to be able to achieve the things on the Project Topaz page then they’re going to need to restructure many things. It will be interesting just how far they can get before parts do get rewritten, or what parts get written in new technology like Mono eventually.
You’re trying to argue that KDE 4 is a rewrite rather than a port…….
…the only parts that need big changes are gnome-vfs and, to a smaller extent, GConf. The rest is basically a clean up of old stuff (removing ORBit and bonobo for example).
Removing redundant stuff doesn’t add up to a major new version.
Obviously no one planned ahead as to just how poor a fit CORBA was going to be.
Uh, it was good enough to get this far. Admittedly it’s not the best solution, but you can’t hold on to your original claim (that you’re now trying to twist to something else).
Oh, and it’s actually KDE that is now using that new system called D-Bus, rather than just using it in select areas.
GNOME is not using D-Bus in selected areas, it’s gradually moving everything to D-Bus. GNOME started taking advantage of D-Bus with the 2.8 release, 2 years ago. For GNOME 2.18 this will continue; for example, a new libpanel-applet API has already been developed. Eventually everything will have moved over.
And KDE doesn’t use D-Bus anymore than GNOME.
Because there’s a market for Java development tools, not because something like KDE needs it. It doesn’t mean that KDE, being the size of project and natively compiled that it is, is looking for a main language away from C++.
GNOME is not looking to replace its main language either.
I won’t bother to reply to the old C vs C++ debate you try to bring up. Suffice to say that I’m very happy with how well GNOME supports several languages, especially C, C++, Python, Perl, C#, and Java.
My original argument was that in large software projects it is incredibly hard to do performance optimisation and organisation of it after it has been completely developed.
And it’s much harder to optimize something you haven’t written yet. Cairo is the perfect example of this.
Again, the huge failure that is Vista simply doesn’t compare with the incremental time-based releases of the GNOME project.
A good 95% or more of his problems are actually finding out what’s going on, and what goes where.
I take it you haven’t optimized much code before. KDE suffers from the exact same problems.
Well no actually. The same code still exists, but it is being ported to a new toolkit. Although changes will occur, there isn’t a massive change of structure in the project. kdelibs, kdebase etc. will still exist.
Of course kdelibs and kdebase will still exist, these are nothing more than the package names used to place the platform libraries and the basic desktop programs. That doesn’t mean the code won’t change.
In any case, if it was really just a port it sure seems like an extremely hard one to do. KDE 3.5 was released almost one year ago, and KDE 4 is not even alpha yet; it doesn’t even have a schedule yet. I think this proves my point.
Consolidating existing libraries into GTK doesn’t by itself make up a new major release.
Other than consolidation and improvements to gnome-vfs and GConf, there’s not much more you can expect from a new major release of the GNOME platform. This work is very important: ISVs will have a much better defined platform to work with.
However, it’s pretty clear that if they are going to be able to achieve the things on the Project Topaz page then they’re going to need to restructure many things. It will be interesting just how far they can get before parts do get rewritten, or what parts get written in new technology like Mono eventually.
The Topaz page mostly consists of braindumps at this point (my favorite being rewriting GNOME on top of XUL). Major features are being integrated into GNOME every 6 months, whether you call it Topaz or not. Similar to Apple: they don’t need a Mac OS XI to deliver exciting new features.
The KDE 4 buzzwords: Plasma, Solid, Phonon. GNOME 3 won’t have its own “Solid”, because the Novell-developed gnome-volume-manager was already integrated with GNOME 2.8. GNOME 3 won’t have its own “Phonon”, because GStreamer was adopted with GNOME 2.0 (!). GNOME 3 probably won’t have its own Plasma, because frankly I have no idea what that’s supposed to mean (I was looking forward to the weekly updates aseigo was going to provide on it, but it seems he hasn’t had time to do that).
KDE 4 will also have Akonadi. GNOME may even adopt it someday, but for the moment evolution-data-server is already integrated into several desktop components. It will have a new icon theme, and GNOME didn’t hold until Topaz to have one (see the 2.16 release for the first release of the new icon theme).
GNOME 2.18 will have NetworkManager integrated, to make sure networks “Just Work”. Better than Windows, in fact.
This is getting long, so I’ll stop: GNOME Topaz will simply be a cleanup and consolidation of the GNOME 2 platform, once 2.x is ready for it.
Mono will continue to be an exciting platform that can be used to build GNOME applications, just like Python.
Obviously no one planned ahead as to just how poor a fit CORBA was going to be.
Uh, it was good enough to get this far. Admittedly it’s not the best solution
No, it was not, as clearly shown by the fact that nearly no applications actually bother to use it. It’s been considered obsolete, and except for a Gnome port of KParts (you know, the stuff KDE developers made when they realized CORBA was not fit for the task), no technology has been put forth to replace it.
GNOME is not using D-Bus in selected areas, it’s gradually moving everything to D-Bus. GNOME started taking advantage of D-Bus with the 2.8 release, 2 years ago. For GNOME 2.18 this will continue; for example, a new libpanel-applet API has already been developed. Eventually everything will have moved over.
And KDE doesn’t use D-Bus anymore than GNOME.
Gnome is currently using D-Bus only in selected areas, mainly interfacing with HAL to do automounting and device discovery – essentially the system-bus part of D-Bus. It’s true that current KDE releases use it just as much, and KDE has had an option to use it for the same stuff since 3.4.
As you say, Gnome has started moving to using it in other places, like the libpanel-applet API. But that’s all new functionality, since Gnome never had an IPC infrastructure, unlike KDE, which has had one in DCOP since the start of the 2.0 series and has now already moved it full-scale to D-Bus. All DCOP-enabled applications (all of them) in the KDE 3 series will use D-Bus in KDE 4; compared to that, the number of D-Bus-using applications in Gnome is minuscule.
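As a taste of what the D-Bus move looks like from code, here is a minimal QtDBus sketch (QtDBus ships with Qt 4.2; KDE 4’s own wrappers may look different) that calls the bus daemon’s standard ListNames method – the service, path and interface here come from the D-Bus spec, nothing KDE-specific:

#include <QtCore/QCoreApplication>
#include <QtCore/QStringList>
#include <QtCore/QDebug>
#include <QtDBus/QDBusConnection>
#include <QtDBus/QDBusMessage>
#include <QtDBus/QDBusReply>

int main(int argc, char** argv) {
    QCoreApplication app(argc, argv);
    // Ask the session bus daemon which services are registered; a ported
    // KDE 4 application would expose and call its services the same way.
    QDBusReply<QStringList> reply =
        QDBusConnection::sessionBus().call(
            QDBusMessage::createMethodCall(
                "org.freedesktop.DBus", "/org/freedesktop/DBus",
                "org.freedesktop.DBus", "ListNames"));
    foreach (const QString& name, reply.value())
        qDebug() << name;
    return 0;
}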
And your errors regarding the KDE 4 buzzwords, as you label them, are telling.
The gnome-volume-manager is not equal to Solid, since volume management is only part of what Solid will do. It will also handle all kinds of hardware devices, including Bluetooth, USB, etc.
And Gnome’s GStreamer adoption has always been lacking; after all, ESD was still required and handled the desktop sound as late as 2.8 (maybe it still does, I haven’t bothered to check).
No, it was not, as clearly shown by the fact that nearly no applications actually bother to use it.
The three most important applications on the enterprise desktop are the web browser, e-mail + calendar, and the office suite. All used Bonobo (there was a mozilla-bonobo package, Abiword and Gnumeric used Bonobo, and I don’t have to mention Evolution again).
Anyway, we don’t have to go through the reasons KParts and DCOP are better technologies, because that was not the point. Someone claimed that GNOME just adopted CORBA without planning at all, when having something that worked like Bonobo + CORBA was Miguel’s idea even before GNOME was formally created. The implementation of that idea didn’t work optimally and is not exactly easy to use, which is why GNOME decided to replace it.
There are other factors involved. While it’s certainly useful to have something like Bonobo to integrate office documents, for example, viewing a document inside a file manager interface is bad from a usability point of view, and I’m glad Nautilus moved away from that. I remember a KDE developer with similar thoughts, Matthias Ettrich. It’s an interesting read:
http://blogs.qtdeveloper.net/archives/2005/08/03/some-basic-thought…
Gnome is currently using D-Bus only in selected areas, mainly interfacing with HAL to do automounting and device discovery – essentially the system-bus part of D-Bus.
Several GNOME applications use D-Bus, and several are being ported from ORBit to D-Bus.
> Even so, the gnome hackers are still hacking things like cairo and pango/fontconfig which ALSO improve KDE.
You might want to check out the Fontconfig changelog some time — you’ll find a number of performance and optimization patches by the very same Lubos Lunak who wrote the article discussed in this thread, as well as several other SuSE-employed KDE developers.
KDE isn’t being developed in isolation. The KDE developers spend quite a bit of time on Freedesktop.org and other shared projects to improve the Linux desktop platform stack from top to bottom.
Given that xfce was lower on most tests, including the word processors and the browsers, the conclusion seems a little biased.
The most interesting thing is just how much memory Linux uses. I wanted to run it on an old laptop (300MHz, 64MB RAM) and it was just impossibly slow even with XFCE if I wanted to use things like OpenOffice or Firefox. The computer ran just great with Win98 and MS Office or Explorer – even if it was a fairly unpleasant experience. I guess these old computers just aren’t in the game anymore, but it surprised me and it was a pity. These figures show exactly why.
I wanted to run it on an old laptop (300MHz, 64MB RAM) and it was just impossibly slow even with XFCE if I wanted to use things like OpenOffice or Firefox. The computer ran just great with Win98 and MS Office or Explorer – even if it was a fairly unpleasant experience. I guess these old computers just aren’t in the game anymore, but it surprised me and it was a pity. These figures show exactly why.
I used Linux 2.4 and Gnome 2 on a 200 MHz Pentium 1 with 80 MB RAM for about a year and a half before upgrading to a 2GHz Athlon XP with 512 MB RAM. Admittedly it was an aggressively optimized Gentoo box (-O3 -funroll-loops -finline-functions -frerun-cse-after-loop -frerun-loop-opt -falign-functions=16 -march=pentium-mmx) because any other distro was too slow.
As I recall, the startup times were as follows: gnome-terminal: 8 seconds, Mozilla: 15 seconds, OO.o v1: about 90 seconds. But after starting, the apps worked well enough – provided I only ran 1-2 at a time.
The thing to note is I had 2 GB virtual memory ;-).
The Windows 9x series is highly optimized, parts of it are written in assembly for speed. Then again I heard Windows 98 won’t work on CPUs faster than 2 GHz ;-).
lol, those are the worst optimizations for a low-memory system I can imagine… You should just compile everything with -Os; all the ‘optimizations’ you use increase code size, making the apps slower to start up and increasing their memory usage. -Os disables all the size-increasing stuff while keeping the other -O2 options: small and fast. Some people argue you should even use -Os on large-memory systems, because most size-increasing optimizations aren’t that good anyway and the smaller size is more important (processor cache limits performance as well!).
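A hedged illustration of the point, assuming GCC (the file is made up; measure object sizes with the ‘size’ tool on your own code):

// sum.cpp - a loop that -funroll-loops will duplicate several times
// over; every unrolled copy makes the object code bigger, which costs
// memory and instruction cache on a low-RAM machine.
int sum(const int* v, int n) {
    int s = 0;
    for (int i = 0; i < n; ++i)
        s += v[i];
    return s;
}
// Compare for yourself (hypothetical invocations):
//   g++ -O3 -funroll-loops -finline-functions -c sum.cpp && size sum.o
//   g++ -Os -c sum.cpp && size sum.o
// -Os is essentially -O2 with the size-increasing passes disabled.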
“The Windows 9x series is highly optimized, parts of it are written in assembly for speed. Then again I heard Windows 98 won’t work on CPUs faster than 2 GHz ;-).”
Writing stuff in assembly is irrelevant when the damn architecture is awful. Ever tried to run Regmon or Filemon? The damn thing continuously accesses some random crap whenever you do anything.
I was never impressed by the performance of windows or any microsoft application I used, or use.
Check out Distrowatch.com. You can search for distros designed to run on old equipment with small amounts of memory. You might want to try Puppy Linux.
Given that xfce was lower on most tests, including the word processors and the browsers, the conclusion seems a little biased
If you had read till the end, you would have understood why this is so.
The most interesting thing is just how much memory Linux uses
That doesn’t mean anything. Linux is loaded at start and that’s it, your kernel stack won’t grow so much.
I wanted to run it on an old laptop (300MHz, 64MB RAM) and it was just impossibly slow even with XFCE if I wanted to use things like OpenOffice or Firefox
Which is exactly what the test says. OOo and Firefox are the culprit.
The computer ran just great with Win98 and MS Office or Explorer – even if it was a fairly unpleasant experience
Which is a BS comparison. You should have run Win98 with MSOffice and Firefox too, and come back tell us how well it works.
If you want to compare with MSOffice (so I guess you didn’t run any app actually, as I don’t know of an app called MSOffice, I know of Word or Excel) and Explorer, then just launch Nautilus (and Abiword for Word, or Gnumeric for Excel).
I did read to the end, but I have a different opinion, or perhaps draw a different conclusion.
The comparison is not BS, because it is a comparison that would be valid for most users who need to use an ‘office’ package that is Microsoft-compatible. Although there are other great programs, the only office package I felt confident would do a good job of converting Word documents was OpenOffice 2, which meant that the minimum requirement was OpenOffice plus XFCE, KDE, etc. All I was reflecting on was the fact that I couldn’t get this running on my machine fast enough for it to replace Win98 + Word, which was a pity. I think for lots of average users that is the comparison that actually counts.
I’m a loyal Fedora/KDE user, but I must disagree with the OP.
While KDE runs just fine on all my different workstations and desktops (all with 512MB – 8GB memory), it’ll swap like crazy on my PII 366/256MB laptop, rendering the machine unusable.
On the other hand, the same laptop runs Gnome 2.14 just fine.
– Gilboa
It’ll swap like crazy on my PII 366/256MB laptop, rendering the machine unusable.
On the other hand, the same laptop runs Gnome 2.14 just fine.
Sorry, but I and a lot of others just won’t be able to believe that. You’re going to have to itemise and benchmark that observation on that machine, and work out what’s going on, for that statement to mean absolutely anything to anyone.
And given the fact that you say you’re using Fedora……..
“Sorry, but I and a lot of others…”
Rather OT, but I don’t like it when people refer to themselves as “we”. Unless these “others” want to join this thread and comment, let’s leave it as “I”, OK?
“You’re going to have to itemise and benchmark that observation on that machine, and work out what’s going on, for that statement to mean absolutely anything to anyone.”
Last time I tried running KDE on my laptop, the machine used >128MB of swap.
Even simple tasks such as clicking on kicker caused the machine to I/O itself to death. Currently, I’m running GNOME (+2 x Terminals and gvim * 2) and my swap usage is rather low (~40MB).
This is not a valid benchmark, I’ll agree. I never claimed that my own experience was anything other than, well, my own experience.
“And given the fact that you say you’re using Fedora……..”
Yes, Fedora is GNOME-oriented. But I’m using the tailored KDE-RedHat project’s RPMs.
– Gilboa
“Rather OT, I don’t like it when people refer to themselves as “we”. Unless these “others” want to join this thread and comment, lets leave it as “I”, OK?”
I join the comment. So that makes a we.
“Last time I tried running KDE on my laptop, the machine used >128MB of swap.
Even simple tasks such as clicking on kicker caused the machine to I/O itself to death. Currently, I’m running GNOME (+2 x Terminals and gvim * 2) and my swap usage is rather low (~40MB).”
If you are using a Gnome optimized distro, I can understand.
… Fedora is a bit bloated, it’s a known fact, but I’m not using Fedora’s own RPMs, but KDE-RedHat’s RPMs [1].
As I said in my OP, KDE runs just fine on machines with 512MB memory (or above). You’d be amazed at how fast KDE runs on the machine I’m typing this on. (2 x Opteron 275, 8GB, GF6800 ;-))
This is -not- an attack against KDE, far from it. I’m just saying that in my -own- experience (using both Fedora -and- Slackware), KDE was too slow to be productive on a PII 366 with 256MB of RAM (and DRI).
– Gilboa
1. http://kde-redhat.sourceforge.net/
Oh, go and get slackware…
😉
Had been using Slackware on that machine (with Dropline GNOME) up until a couple of months ago when I got tired of waiting for Slackware 11…
While KDE did run better with Slackware, it was still unusable due to excessive swapping.
– Gilboa
OK. I seem to require massive surgery to remove the foot from my mouth.
In an attempt to produce solid numbers to back up my claim, I reinstalled KDE and did some memory measurements… and found that:
A. KDE starts just fine -without- going into an I/O frenzy.
B. KDE leaves around ~25/256MB free. (GNOME leaves around 34MB, but I took the KDE configuration from a running machine, with no laptop optimizations whatsoever.)
C. As in GNOME, as long as I don’t start massive memory-hogging applications (OOo, anyone?), my swap remains unused.
As I said before, I tried using KDE on Fedora (FC4/3.4, FC5/3.5.1) and Slackware (10.1/3.3, 10.2/3.4.3) a couple of times before, and I never could get it to work without abusing my swap. I even ran out of swap (~512MB) once.
I sifted through the different changelogs (3.5.1 … 3.5.4) and couldn’t find anything about fixes for massive memory leaks or Qt optimizations. Nevertheless, KDE now works just fine on 256MB (or at least as well as GNOME).
I stand corrected.
I stand corrected.
I stand corrected.
I stand corrected.
I stand corrected.
Kudos to the KDE developers and Trolltech.
– Gilboa “off to surgically remove foot out of his mouth” Davara.
KDE 3 can eat more or less memory, but just switching to Qt 4 in KDE 4 will improve things a lot, as seen here: http://dot.kde.org/1061880936/
“To back up these statements Matthias gave some numbers about Qt Designer which was ported to Qt 4 with only the necessary changes to make it compile: The libqt size decreased by 5%, Designer num relocs went down by 30%, mallocs use by 51%, and memory use by 15%. The measured Designer startup time went down by 18%.”
I got this old laptop off the shelf: a Gateway, 366 MHz, 8 GB disk, 128 MB memory.
I successfully installed XP Pro and PCLOS on two partitions. XP Pro with minimal fancy GUI effects and themes runs perfectly, with pagefile + 128MB combined usage = 300MB. It takes more time, 30 seconds, to start an application, but once it goes to virtual memory it flies thereafter. No crashing, no panic, no slowdown. I can easily spare an extra 30 seconds (who can’t?) to start an application. I can run wireless, Windows Media Player, Firefox and a simple Word file simultaneously.
Come to KDE. Ubuntu refused to install as it detected 128MB of memory. The only distros that worked on it were PCLOS and Knoppix. If I start KDE with Firefox, Abiword and Amarok, it really becomes as slow as molasses. Total memory usage is still around 300MB with KDE, but there is something else in KDE or Linux that is slowing down multiple applications. I don’t know what. Don’t tell me to try another distro like Puppy or DSL, or to tweak the kernel. You have to compare it with the functionality of XP on the same system.
It is a cheap mentality to think that buying more memory is the answer just because memory is cheap nowadays.
————————–
Linux is Free only if your time is Worthless.
rakamaka, you want to say KDE is slow when you load multiple applications, yet 2 of the 3 apps you chose are huge non-KDE apps.
Ubuntu’s graphical installer needs more memory, but there’s still a text-based installer.
You don’t have to tweak a Linux system to perform well on your laptop (although you can), you just have to be a bit more realistic with your choices. In my case, the only OS I ever had to tweak to perform on my laptop is Windows Vista. The point is, if you want to run a current GNU/Linux distribution with modern features you should get modern hardware. If you can’t, there are other choices still available and supported, like CentOS 3.x or 4.x.
Weird.
I got PCLOS .93 to install on a laptop with a P2 266, 160MB, a 6GB HD and 2MB video, whereas WinXP won’t even get to the install screen. And it runs pretty much like you described Windows on your machine.
Hmmm..
I suppose the adage YMMV holds true once again.
“If I start KDE with Firefox, AbiWord and Amarok, it really becomes as slow as molasses. Total memory usage is still around 300MB with KDE.”
Wrong. Total memory usage is still around 300MB with KDE + Gnome.
You didn’t even notice that AbiWord is not a KDE app; and yet a lot of zealots assert that people have problems with mixing Gnome and KDE apps.
“But there is something else in KDE, or in Linux, that is slowing down multiple applications.”
Perhaps it’s the simple fact that AbiWord has to pull in from virtual memory all the Gnome libs it needs; hence the slowdown has nothing to do with KDE, Linux or Gnome, but just with your ignorance.
————————–
Windows is expensive even if your time is worthless.
I don’t like it when people refer to themselves as “we”. Unless these “others” want to join this thread and comment, let’s leave it as “I”, OK?
Well, you’re not just going to have to convince me but everyone else. There are no benchmarks, no timings, nothing at all to back that up.
“Last time I tried running KDE on my laptop, the machine used >128MB of swap.
Even simple tasks such as clicking on Kicker caused the machine to I/O itself to death.”
I’m assuming you have read the article about how difficult it is for anyone to measure actual memory usage? How on Earth did you measure swap usage and how did you know it was KDE causing it?
And the fact that you may have had it ‘I/O itself to death’ means absolutely nothing to anyone. I, and probably many others, have had no such trouble whatsoever, and Lubos has created some good benchmarks to back this up. Certainly, there isn’t a deluge of people saying ‘KDE swaps my machine to death!’ as there is of people saying ‘Gnome performs poorly’, which is what prompted the Gnome performance initiative that is going on.
“Yes, Fedora is GNOME oriented. But I’m using the tailored KDE-RedHat project’s RPMs.”
It really rather depends on whether anyone else has had the same problems with the Red Hat RPMs, or whether it’s some other problem on your machine causing it. The odds are it isn’t KDE because I don’t have that problem, and as far as I know, no one else has reported a similar problem.
“Well, you’re not just going to have to convince me but everyone else. There are no benchmarks, no timings, nothing at all to back that up.”
You’re taking my comment way (and I do mean *way*) too seriously.
This is not a PhD thesis. It’s a mere observation that has a *huge* YMMV on it.
“I’m assuming you have read the article about how difficult it is for anyone to measure actual memory usage? How on Earth did you measure swap usage and how did you know it was KDE causing it?”
$ cat /proc/meminfo is fine by me.
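For the record, the figure I mean is SwapTotal minus SwapFree; a one-liner sketch (assuming a 2.6-era kernel, which reports both fields in kB):

$ grep -E '^Swap(Total|Free)' /proc/meminfo
$ awk '/^SwapTotal/ {t=$2} /^SwapFree/ {f=$2} END {print t-f " kB of swap in use"}' /proc/meminfo

It won’t tell you -which- process got swapped out, admittedly, only how deep into swap the machine is.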
“And the fact that you may have had it ‘I/O itself to death’ means absolutely nothing to anyone. I, and probably many others, have had no such trouble whatsoever, and Lubos has created some good benchmarks to back this up. Certainly, there isn’t a deluge of people saying ‘KDE swaps my machine to death!’ as there is of people saying ‘Gnome performs poorly’, which is what prompted the Gnome performance initiative that is going on.”
YMMV?
By ‘I/O itself to death’ I mean: I open the KDE menu and the machine swaps pages for 10 seconds before the menu is displayed, -or-, the machine needs around ~2 minutes to go from GDM login to a fully loaded KDE (~30-40 seconds in GNOME) while generating constant I/O. (Disk activity LED, anyone?)
“It really rather depends on whether anyone else has had the same problems with the Red Hat RPMs, or whether it’s some other problem on your machine causing it. The odds are it isn’t KDE because I don’t have that problem, and as far as I know, no one else has reported a similar problem.”
As I mentioned a couple of posts later, the machine used to run Slackware/Dropline (9/10/10.1). Slackware/KDE, while running faster than under Fedora, was still unusable.
– Gilboa
BTW, why must every comment turn into WWIII?
“This is not a PhD thesis. It’s a mere observation that has a *huge* YMMV on it.”
It doesn’t take a PhD thesis to do some troubleshooting.
“It’s a mere observation that has a *huge* YMMV on it.”
Then it really doesn’t mean a thing.
“cat /proc/meminfo is fine by me.”
It’s not fine by everyone else.
“By ‘I/O itself to death’ I mean: I open the KDE menu and the machine swaps pages for 10 seconds before the menu is displayed, -or-, the machine needs around ~2 minutes…”
Yer, and? It’s not happening to me on a Celeron 500 with 256MB of RAM, running either Kubuntu or Slackware.
How do you know it is swapping pages? How do you know it is even using swap excessively? How do you even know it is memory related? How do you know it is even KDE that is doing this? You could have something on your machine that is simply incurring a lot of disk activity.
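If you actually wanted to find out, vmstat would tell you; a sketch (assuming the standard procps vmstat):

$ vmstat 1
# the 'si'/'so' columns are pages swapped in/out per second, while
# 'bi'/'bo' is general block I/O; a flashing disk LED with si/so at
# zero means the activity isn't swap at all

Until somebody posts numbers like that, it’s all guesswork.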
“As I mentioned a couple of posts later, the machine used to run Slackware/Dropline (9/10/10.1). Slackware/KDE, while running faster than under Fedora, was still unusable.”
Sorry, but it hasn’t happened to me or a lot of other people.
“BTW, why must every comment turn into WWIII?”
I’m just wondering why you’re bothering to post. If you had some readings to back that up, or lots of other people had reported the symptoms you describe, then there would be something in it and we would have heard about it by now.
As it is, that isn’t the case, and your observations are worthless. You’re also assuming an awful lot in concluding that your machine is using swap excessively, or that it is even related to KDE, when all you’re seeing is an LED flashing on your machine.
http://www.osnews.com/permalink.php?news_id=15820&comment_id=162299
“By ‘I/O itself to death’ I mean: I open the KDE menu and the machine swaps pages for 10 seconds before the menu is displayed, -or-, the machine needs around ~2 minutes to go from GDM login to a fully loaded KDE (~30-40 seconds in GNOME) while generating constant I/O. (Disk activity LED, anyone?)”
As you use GDM, you basically have GNOME running already, and you then load KDE on top of it. Of course it will consume more memory that way. It’s like starting GNOME from KDM.
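That is easy enough to check, by the way; a sketch (assuming the procps ps shipped by most distros):

$ ps -eo rss,comm --sort=-rss | head
# RSS is each process's resident set in kB; any GNOME bits that GDM
# left running will show up here alongside the KDE processes

Bear in mind that RSS double-counts shared libraries, so treat it as a rough per-process upper bound.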
I’d like to see KDE’s memory requirements and footprint compared to MS Windows (2000 and XP) with different browsers and office suites…
I know that Windows 2000 can start and run well in 32MB of RAM (without an office suite started).
http://nexle.dk/daniel/win2000-32mb/
Cheers,
Daniel
I run Win2k on a 64MB box at home, and it’s actually not bad running Firefox as long as I don’t open more than 5-6 tabs. After that, it gets grumpy. 🙂
I’d also like to see KDE compared to eCS.
“I know that Windows 2000 can start and run well in 32MB of RAM (without an office suite started).”
You could try it in a distribution like Red Hat Linux 6.1 or 6.2, which was released around the same time as Windows 2000.
Memory would be worth the expense; compared to KDE, Gnome is horrible.
KDE is light years ahead of Gnome any day.
Memory is cheap, and technology gets cheaper every day, more quickly now that the leaps come at shorter intervals.
KDE is more stable than Gnome; the restarts in Gnome grow very old.
2.6.17-1.2174_FC5 #1 Tue Aug 8 15:30:55 EDT 2006 i686 athlon i386 GNU/Linux
KDE all the way.
KDE has more options plus a cleaner look; it is less cluttered and easier to administer.
I do not have any memory usage problems with my system; it runs like a charm.
KDE defeats Gnome on all fronts.
I must begin by thanking the author for the work done and for sharing it with us. It is this sort of thing that spurs on progress.
The one irrefutable conclusion we may reach from all of this is that applications take up huge amounts of memory. There have been comments here that “memory is cheap”; but why abuse it? What, precisely, is causing all of this bloat?
Well, interestingly enough, I have recently had the dubious privilege of investigating ways to reduce executable size, and found a few startling facts. There is a strange, almost contradictory way applications are produced:
– Procedural programming languages provide mechanisms to group similar code into procedures.
– Procedures are then called from other procedures, in turn called from other procedures ad infinitum in a mainly hierarchical call graph.
– We are told that procedures should be short (the shorter the better), as long as a procedure implements a well-defined function.
– The compiler compiles these procedures into one or several object files.
– The linker maps procedure calls between objects and external libraries, and links an executable with one or more threads.
– The optimiser [and here’s the rub] goes and undoes all the programmer’s hard work.
So what, precisely, does the optimiser do that is wrong?
Well, one example would be inline expansion. It is a fact that procedure calls are, collectively, an expensive operation, and eliminating as many of them as possible has a significant impact on speed. The optimiser can eliminate calls by effectively copying and pasting the body of a procedure wherever that procedure would otherwise be called.
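You can watch this trade-off for yourself; a hedged sketch (demo.c here stands for any C source file of your choosing, and the exact figures vary by compiler version):

$ gcc -Os -o demo-small demo.c   # optimise for size: inlining restrained
$ gcc -O3 -o demo-fast demo.c    # optimise for speed: aggressive inlining
$ size demo-small demo-fast      # compare the 'text' (code) columns

Same source, compiled twice; the speed-optimised binary will usually carry a noticeably larger text segment.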
Bloat vs. speed? Well, one thing the stats collected by the author do not tell us is how much memory is consumed by application code, how much by paragraph alignment of code or data, and how much by pure data vs. code, etc.
It’s a weird world we live in, master Jack.
> But my proof is in KDE System Guard under the
> Process Table tab. It currently says 82 Processes
> are running using 581,644 KB memory. That’s what a
> typical session uses with one Firefox instance
> running.
Ah, OK. So we’ve just discovered where the problem lies.
The numbers shown by the Process Table tab refer to:
memory effectively used by apps + buffers + cache.
In my system with 1GB of RAM and LOTS of applications open (Konqueror with 10 tabs and sites opened at once, Kopete, KMail, lots of applets, Konsole, …), it says that almost 1GB of RAM is used. That’s because I have 41MB of buffers and 744MB of cache. So these numbers are quite useless, as Lubos Lunak pointed out…
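free(1) makes the same distinction at a glance; a sketch (assuming the procps ‘free’ found on typical distros of this era):

$ free -m
# the first 'Mem:' line counts buffers and cache as used;
# the '-/+ buffers/cache:' line shows what applications actually
# occupy once those reclaimable pages are set aside

The page cache is memory the kernel gives back the moment an application asks for it, so it shouldn’t be held against the desktop.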
As usual, in the long run a good programming approach such as KDE’s, which aims at maximizing code reuse and cleanliness, shows its results.