Novell’s Ben Maurer has been reducing memory usage for Mono 1.1.x recently, and he now asks for the same extent of optimization to be done for GTK+ itself. Read more on his blog and the wiki. Also, Gnome seems to require PCs with at least 128 MBs when used with lightweight distros (e.g. on Arch, Gentoo or Slackware you will need a minimum of 110 MBs for a “clean” Gnome desktop, and after loading Firefox or Evolution heavy swapping starts) and at least 192 MBs on more popular distros like Fedora, Mandrake or SuSE.
Windows XP runs better than Gnome with 128 MBs of RAM (it requires about 85 MBs for its “clean” desktop), and this means that a lot of cheap PCs, or even business/school PCs everywhere in the world, can not run a Gnome desktop effectively or satisfactorily. KDE’s memory usage is smaller than Gnome’s too (on my Arch Linux it’s about 95 MBs).
Some Pango/GTK+ speed optimizations would be nice too.
GTK+ definitely suffers from poor redraw speed and other performance problems, as well as over-complexity, such as in its tree view.
However, what annoys me more than anything is that you can’t depend on GTK+ if you want to build cross-platform software.
When will the Windows version of GTK+ 2.6 finally be released? After all this time since the official GTK+ 2.6 release, there is still no accessible Windows port of it. GIMP, GAIM etc. are all still using 2.4.
If they call it a cross-platform toolkit, why is there no Windows version of the latest release after 2 months!!? They should not even have released 2.6.1 and 2.6.2 before straightening out the Windows version and releasing it. That way both developers and users would be pleased.
I at least hope the Windows release won’t be buggy and half-baked (like most of the 2.4 releases) when it is released. After all this time, they’d better get it right even if it is quite late. (((((((((.
It’s still a hog; have you tried Fluxbox? BOTH KDE and GNOME ARE HOGS, but it seems we can’t have it any other way: if you need the extra features, you will have to use a hog of a DE.
Isn’t XFCE GTK? Aren’t Gnome and GTK different? XFCE works fine on a laptop of mine that chokes on Gnome.
>BOTH KDE and GNOME ARE HOGS
Yes, but thing is, these hogs can be optimized every now and then. And from what I have witnessed of KDE over the last 2 years, its developers HAVE optimized it any way they could. Kudos to the KDE devs! There was a time when Konqueror would need up to 7 seconds to load clean; now it needs 2 or 3 without prelinking (on the same machine). Other aspects of KDE have been optimized as well. I mean, look at the Wal-Mart $199 PCs with 128 MBs of RAM, they ALL run KDE (Xandros, Lindows, Linare, Lycoris etc).
Gnome requires more memory, and its GUI feels slower. That’s my observation on a number of distros and machines (and not only my observation).
However, it CAN be salvaged. Things like evolution-data-server, the gconfd-2 server, the gnome-settings-daemon, Nautilus, Metacity and ESPECIALLY that darn thing called gnome-panel (with the terrible architecture where even the SIMPLEST possible applet requires 7 MBs of RAM), all these things CAN be optimized and in some cases, optimized easily.
If the gnome devs sit down for 2-3 weeks and optimize speed and memory consumption on the above, I would not be surprised to see Gnome come down from 110 MBs of RAM to 85-90 MBs.
The reason GNOME chews 100+MB of memory is that you have a lot of stuff running with GNOME. You have daemons/services for the various GNOME frameworks for session management, gconf, volume management, etc. You have the panel, nautilus for desktop drawing and files, and any applets.
GNOME’s memory usage relates only coincidentally to GTK+’s memory usage as a framework.
>You have daemons/services
I know. Read my comment above. But thing is, GTK and Gnome are developed by the same people pretty much. If we can get them to optimize one thing, they could optimize the other too.
> XFCE works fine on a laptop of mine that chokes on Gnome.
XFce does not use the gnome daemons, or nautilus or metacity (which are real demons in memory consumption).
I’m all for the concept, but the numbers in the blog don’t seem like they’d really add up to much. I’ve noticed that Windows XP can run decently on 500MHz as long as there’s 192 or 256 MB of RAM, but KDE on the same machine will be much slower, preferring more megahertz to more RAM.
Really though, RAM is cheap and it’s getting cheaper, and processing power is getting cheaper too. Plus I think there’s a lot more to be gained by settling on and implementing the future of X.
But it’s all good work.
Yes, it’s cheaper for us here in the US and Europe. But many people in third-world countries, or poorer countries in general, run on PII/PIII/Athlon/Durons with 128 MBs of RAM. Linux’s strength IS the third world, because there are no patent worries and because it’s cheaper to get than XP (piracy aside). That’s why Linux with Gnome/KDE *must* be able to run on such older machines.
I still have an old laptop with 128 MB of RAM and a PIII CPU with 650 MHz and KDE3.3 runs just fine on it, even if I open Konqueror, KMail and XEmacs. I haven’t tried GNOME since 1.4, so I don’t know how it runs on this machine.
But I fully agree, one of the most important goals of the Linux UIs should be to make them run fast on older computers. A 650 MHz PIII machine with 128 MB RAM is still a pretty fast machine. I have an old HP Gecko with a 60 MHz PA-RISC CPU and 64 MB of RAM. NextStep runs just fine on this machine. Imagine KDE or GNOME on a machine like this. There is still a lot of room for improvements.
GTK is only growing over time, and so will memory consumption. It’s no big deal if you stay away from Gnome; that’s the main problem, not GTK itself. I just saw that Gnome 2.10b is in Arch’s Testing repo, maybe it’s time to test it, just to see if it sucks like Eugenia says.
You are right. Making desktop linux an option on older hardware in third world countries is huge in a lot of ways. It’s the ethical thing to do, and on top of that it increases “marketshare” if you can call it that, and brings in more developers, which helps the whole thing along.
I _never_ said that Gnome 2.10b sucks, ok? I just say that it requires lots of memory, more than it should. That does not mean that “it sucks”.
In my experience, gnome 2.10beta runs faster than the old one, which quite surprised me. I am running Arch too, but with self-compiled gnome 2.9.91 with the same flags.
Ok, I installed Gnome 2.10b and it’s much snappier than I remember it being a year ago. It takes 59mb of memory, just running “start gnome-session” from the console, so there have already been some optimizations in this version. There is a nasty bug in the menu though, I can’t see the text when hovering on it.
No, it does not take only 59 MBs. Or do you mean it takes 59 MBs alone, apart from the memory Linux/services require for themselves? Eugenia was describing numbers for the whole system, I think, not just for Gnome.
I think the “Finally” part in the title is misleading. Makes it sound like it never got memory optimizations. The GTK developers aren’t as careless as this news article makes them look.
I would have chosen “The future in necessary GTK memory optimizations” or something like that.
Anyway, as you can see in the blog of the very same author referenced here, GTK already got some memory optimizations:
http://codeblogs.ximian.com/blogs/benm/archives/000448.html
Thanks Eugie. Some of us like Gnome (got to do something with this P4, 512ram when most of my time is web browsing).
59mb is for the whole box, and even Metacity feels pretty fast.
Well, I have my gf using a P2 400 with 128 megs of RAM on Gnome 2.8, and it’s ok; the interface is pretty smooth. I think it’s kind of laggy, but neither she nor her grandma notices (partially, I think, because of how slow 98 was running on it, filled to the brim with spyware).
One thing I’ve wondered: I’ve always thought it would make sense to adopt some mobile-device software for slower computers. Look at GPE and Matchbox; they run on PocketPCs. Couldn’t something like that be adapted and optionally installed? There might be a slight reduction in functionality, but it would allow someone to select between a high or low memory footprint. My only example of this (probably not the best) would be how Windows lets you turn off different visual effects to improve performance. Reasonable idea?
That’s pretty impossible. Can you load gnome-system-monitor, click on the second tab and report from there how much RAM it’s really using?
Why are we always so self-centered? The picture of ourselves dressed in white as sisters of mercy, delivering bowls of hot soup to unending queues of grateful third worlders is patronising as it is depressing. Charity will indeed deliver some old computers to poorer countries, but their statistical penetration will amount to almost nothing.
What will in the end make Linux blow Windows out of the water, what *really* will show as its biggest advantage, is its ability to run on non-PC devices.
Gnome and KDE must slim down so that they run *fine* on a sub-100$ device made by packing a small TFT with a simple motherboard powered by a highly integrated, 300 MHz RISC, a 1Gbit RAM chip and a 1Gbyte flash chip.
Linux enables these devices with a great OS and hundreds of applications at near-zero cost. These things will *sell* by the hundreds of millions in developing countries, once someone makes them, and will give widespread access to IT to huge numbers of people, even outside the missions!
Here you go, it’s down to 57.7mb now. As I said before, this is running after “startx” from the console.
[xerxes2@UFU ~]$ free
total used free shared buffers cached
Mem: 254816 226768 28048 0 19068 149980
-/+ buffers/cache: 57720 197096
Swap: 511676 0 511676
You didn’t do what I asked. I asked you to report the memory amount gnome-system-monitor is reporting on the second tab.
BTW, I don’t understand why you said that your machine uses 57 MBs of RAM. From what I see there, it’s using 226 MBs.
57.7mb is what’s used; the rest is memory cached by the kernel. I don’t know what this system-monitor is, and I don’t have time to look for it either; free shows it right.
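For anyone puzzled by the two numbers: the “-/+ buffers/cache” row is plain arithmetic on the first row of `free`. A tiny C sketch using the exact figures pasted above (the helper name app_used_kb is made up here):

```c
/* Figures (in kB) taken from the `free` output pasted above. */
enum {
    USED_KB    = 226768, /* "used" column: everything the kernel handed out */
    BUFFERS_KB = 19068,  /* block-device buffers, reclaimable               */
    CACHED_KB  = 149980  /* page cache, reclaimable                         */
};

/* The "-/+ buffers/cache" row: what applications really occupy,
   i.e. used minus the parts the kernel can reclaim at any time. */
long app_used_kb(void)
{
    return USED_KB - BUFFERS_KB - CACHED_KB; /* 57720 kB, ~57.7 MB */
}
```

So both readings are “right”; they just answer different questions.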
> Also, Gnome requires at least 128 MBs on lightweight distros (e.g. on Arch, Gentoo, Slackware you will need a minimum of 110 MBs for a “clean” Gnome desktop, and after loading Firefox or Evolution heavy swapping starts) and at least 192 MBs on more popular distros like Fedora, Mandrake or SuSE.
Blahblah Gentoo, arch, etc.
Gnome on Gentoo requires the *same* quantity of memory as on Fedora or Mandrake or SuSE. Period.
Stop blahblah and give some proof.
btw, no benchmark has ever shown that Gentoo/arch/slack/… runs faster than Mandrake or Fedora.
You misunderstood, please read more carefully before you reply in such a manner.
I am talking about the OVERALL memory of the “average lightweight distro”, NOT Gnome’s. On Arch Linux, getting to Gnome’s desktop would require 110 MBs (and that includes all the services I run, X etc). On my Fedora, it’s close to 160 MBs, as Fedora uses more services.
FYI, the services I run on Arch are:
Pcmcia_cs, network, dbus and hal. That’s it.
> FYI, the services I run on Arch are:
Fine. It’s easy to do the same with Fedora/Mandrake/Suse.
For Fedora, launch system-config-services.
No, sorry, it’s not that easy to do. A LOT of the functionality of Fedora NEEDS these services, so no matter how much you “lean” it down, it can never be Arch or Gentoo, because it was not meant to be as such.
> On Arch Linux, getting to Gnome’s desktop, would require 110 MBs
How do you measure this?
> On my Fedora, is close to 160 MBs, as Fedora uses more services.
What services?
All servers (like mysql, apache, ntpd, …) are disabled by default (except sendmail, which is needed for crond/atd).
It takes 10 seconds to remove unneeded services.
Linux uses *swap* for unused programs.
I think some people here who complain about the amount of memory should read up on how GNOME works. You’ll find that the amount of memory reported vs. what is actually used are worlds apart.
With that being said, however, there is always room for improvement. Let’s hope that the coders stop dicking around and making up excuses and actually fix the Pango problem which is slowing down the whole damn thing.
xerxes2, FYI, cached memory is placed in resident memory too. And if your resident memory is full, it will swap out application data and replace it with the cache. Therefore, even cached memory counts.
Try unrar’ing a 4.3GB iso, or md5sum’ing it. All your application data will get swapped out, and the ‘cached’ memory will grow to 90% or so. And it really hurts, because starting or switching apps feels like you’re doing it for the first time. No, even worse than that.
I think a large part of Linux’s “memory issues” can be blamed on how it’s cache works. It might be good for database systems, but on a desktop I really don’t need to have 90% of my RAM filled with an ISO or DVD movie.
I’m running GNOME (via GDM): with Firefox, XChat and GNOME System Monitor, 89/512 of physical memory is used (and I don’t report other non-GNOME/GTK programs running as part of the base system). I’m running a non-customized GNOME: I mean everything in the desktop is the default GNOME configuration (a panel at the top, one at the bottom, GNOME menu at the top left, …) except that I have added a weather applet to the top panel and some launchers. There is quite a difference between 89MB and 110MB. Perhaps it is because of the use of “-O2 -mcpu=i686 -march=i486” optimizations, which didn’t unroll loops and do other things that make executables bigger?
on my gentoo system (p3 800, 384 megs of ram), a freshly booted gnome running nothing but system monitor used 86.6 MB. I opened up a terminal (which obviously bumped that number up a little) and ran free. The values reported by free were very close to the numbers system monitor was reporting..
> A LOT of the functionality of Fedora NEED these services
Which one ?
I have :
atd (you can remove this one if you want)
crond (ditto)
sendmail (atd and crond need sendmail (or alternative) ; same for all distribution)
httpd (disabled by default)
named (disabled by default)
ntpd (disabled by default)
postgresql (disabled by default)
proftpd (disabled by default)
squid (disabled by default)
xfs
messagebus
haldaemon
syslog
The only required daemons are:
xfs, messagebus, haldaemon, syslog.
You are playing with numbers. Point is, I have two laptops, both with 128 MBs of RAM (AMD K6-2 300 Mhz, PII-mobile 333 Mhz). I tried Mandrake, Ubuntu, Debian and Arch recently on them. Only Arch (with most services removed) managed to work KINDA ok.
Oh, and I also had this AMD Duron 1.2 GHz with 128 MBs RAM (Linare PC, Linux-compatible). Fedora DIES ON IT. I mean, it was SO SLOW because of memory, I can’t even begin to describe it! I had to use XFce just to make it a bit more bearable (still very slow though — no, that was not CPU bound, it was the memory — the problems went away after I upgraded the Duron to 384 MBs of RAM).
It doesn’t matter if it needs 90 or 110 MBs of RAM. What matters is that I could NOT use these desktops with 128 MBs of RAM and Gnome; it was freaking unusable, while WinXP was just fine.
Point is, user satisfaction on a 128 MBs machine should be better with Gnome. That’s the bottom line.
There was also a gtk-devel meeting today, discussing this very issue.
http://primates.ximian.com/~bmaurer/GIMPNet-%23gtk-devel.log
should eventually be here, but not yet:
http://www.gtk.org/plan/meetings/20050221.txt
Note, for all those talking about drawing speeds, it’s my impression that none of these optimizations will affect drawing speed, only memory usage.
> Only Arch (with most services removed) managed to work KINDA ok.
There are very few things that can explain this:
– utf8 (enabled by default on FC (since RH8.0)).
– pam ?
– xorg configuration (16 bits or 32 bits)
– selinux (very little)
Everything else comes from the same source and probably uses the same compiler.
btw, i used FC2 on a 92 MB system.
> Point is, user satisfaction on a 128 MBs machine should be better with Gnome.
Yes. But when you say so, some “trolls” start arguing.
Gnome applets memory usage is somewhat frightening if system monitor numbers are even anywhere close to reality.
Let’s see.
clock-applet: 17.9 meg
trash-applet: 26.5 meg – which is only a half meg less than xchat
They actually discuss applets in the irc log given above.
So we’ve all heard “premature optimization is the root of all evil”, or something to that effect. Well, the KDE developers don’t exactly subscribe to that and it has worked out pretty well for them. I remember reading an interview with a KDE developer from about 4 years ago where he said that they “always” think about optimization. Gnome 2.x has been out for how long now? It’s time to start optimizing.
I’ve got a gig of ram, but if I was sitting even on a 512 meg box I would be worried about this stuff. And I can only imagine that it’s going to get worse as more and more apps get coded in something like Mono or Python.
<snip>
When will the Windows version of GTK+ 2.6 finally be released? After all this time since the official GTK+ 2.6 release, there is still no accesible Windows port of it. GIMP, GAIM etc. are all still using 2.4.
</snip>
Look around!
* gtk+ is now 2.6.2
* glib is now 2.6.2
Download here:
http://gladewin32.sourceforge.net/
From the above given chat log:
Feb 21 16:36:45 <owen_> Making GNOME pleasant on 256M of memory would benefit quite a few people even in the US. (its’ probably not that bad with smaller sets of mail folders than I have)
A lot of this is firefox, evolution, and OO. Not to say that Gnome/Gtk+ couldn’t use some optimization.
> I think a large part of Linux’s “memory issues”
> can be blamed on how it’s cache works.
Errm. That’s the way the cache works in other OSs too. Whenever I finish playing Far Cry on w2k, reopening minimized programs is sloooooow and swappy.
Creating an efficient cache policy with no corner cases is hard. You’d need psychic powers to do it perfectly.
I thought I’d do some measurements of mem usage. This is all on my Arch laptop, 192MB RAM. All numbers are from `free`, not including buffers/cache.
Bare system at GDM: 25MB
Bare logged in GNOME session (clock+battery appl, no background image): 53MB
GNOME Sessions with one nautilus window and fresh firefox displaying osnews.com: 70MB
Bare logged in XFCE3 session: 27MB
Bare logged in XFCE4 session: 32MB
So it looks like GNOME uses around 30-ish MB on background services, panels etc. Or about 2 Firefoxes.
From Eugenia:
> It doesn’t matter if it needs 90 or 110 MBs of RAM. What matters is, that I could NOT use these desktops with 128 MBs of RAM and Gnome, it was freaking unusable, while WinXP was just fine.
I don’t agree with this. I HAD a Pentium III with 128MB, and WinXP (Professional Edition) swapped like hell. Even then, only starting it, the desktop took something like 1 minute to load. Switching between applications was completely slow because my machine swapped all the time. It was possible to use it, yes. It was possible to start Word, or even to start Diablo, if you agreed to wait 2 minutes.
It was so slow I couldn’t stand it any longer. Fortunately later I got a Duron 800Mhz and 256MB of RAM, and it felt so fast under XP, even if the processor was much slower!
Now I’m using kde, and I got 146MB of memory used, with preloaded konqueror. I’m using gentoo linux, everything compiled with -O2 and samba support.
WinXP is only usable with 192 MB of RAM or more! Though I don’t know if there is a big difference between Home Edition and Professional Edition…
So perhaps the problem not only comes from the desktop, but also from the distribution? One month ago I used Fedora, and it was sooooo slow. It was also slow on every computer I have put it on. I believe they changed some libraries for font rendering, and their Mozilla uses Pango. It’s sad that these distributions start so many services we don’t even know about. It’s almost like under Windows then… And even if you turn these services off, they still use a lot of RAM, because their libraries are compiled with everything: Samba, NFS, support for KDE/Gnome, CUPS, Java, LDAP, PAM, Apache, Xinerama, some strange module for notebooks you don’t own.
Why should someone need to start an Apache, NFS, sendmail, or Samba server if he doesn’t need them? No wonder your desktop feels slow then!
A lot of people report that Gentoo is so fast because of optimisations. This is not really a fact; optimisations make almost no difference, as gcc is sadly still not very good at them. I tried to recompile my distribution with different flags; the only difference was that sometimes, when using too aggressive flags, some programs failed to compile 🙂
btw, using a distribution like Gentoo, Arch, or LFS just once makes some things clear in your head, and you understand that there is no need for all these services which commercial distributions start. And even after turning these services off, they are still much slower; I already saw that with Fedora, Mandrake and SuSE.
So: if you want a distribution that uses less RAM, take one of these lightweight distributions. Distributions like Fedora are made for all the people in the world, and their libraries need to be bigger because they want to meet the needs of everyone.
Howdy all
Just thought I’d throw my two cents in. I recently put Windows XP (Service Pack 2) on my AMD 1800+ 256 MB RAM box and damn, it is slow compared to KDE 3.2.3, 3.3, 3.4 beta and Gnome 2.4.
Gnome has always been laggy on Gentoo, but not like XP seems to be. Gentoo/Redhat/Mandrake never burst into swap storms like XP does, and all can certainly run multiple applications a lot more smoothly.
IMHO both KDE and Gnome really need to stop adding features and concentrate on bug fixes and a little optimization for a short while; this would clear up quite a few problems and give us back our rock-stable/snappy desktops of yore.
I really don’t know… but talking about optimizations, do GTK+ and GNOME use liboil?
i’ve seen win2k and winxp being fast AND slow, all on the SAME machine. and for me it all came down to this: NO antivirus + NO firewall + NO startup progs + NO extra services for winxp = fast, really fast.
In fact, antivirus software seems to be the biggest culprit in performance problems (extremely long boot and login times, etc).
From the log:
“what kind of experience is GNOME today on a system with 128 or 256 MB of ram”
“i’ve never tried myself”
LOL
My guess is most dev’s are using 1GB+ RAM machines and don’t have a clue about reality for many people…
Anyway, nice to know something is being done, so keep it up!
Cheers
I doubt that GTK or GNOME do much number-crunching or similar things that would benefit from optimised implementations like liboil appears to have.
The only thing I can think of would be image manipulation, which is AFAIK done using imlib, already a highly optimised library.
A good example of the kind of optimisation desktop apps benefit from is something I saw on the kde-optimize list recently. When KMail checks emails, it does something like this:
for (every mail in folder)
{
KURL url=get_URL(folder);
…. do some processing…
}
as folder is constant, the URL call can be extracted from the loop, saving many cycles:
KURL url=get_URL(folder);
for (every mail in folder)
{
…. do some processing…
}
There are probably hundreds of places where that kind of optimisation could help in KDE, GNOME and GTK; it’s just a matter of taking the time to find them all. The KDE optimise list seems a very worthwhile idea.
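The hoist above can be sketched as compilable C; get_url and process_folder here are made-up stand-ins for the KMail code, with a counter added just so the effect is observable:

```c
int url_calls = 0; /* counts how often the "expensive" call runs */

const char *get_url(const char *folder)
{
    url_calls++;
    return folder; /* stand-in for real URL construction */
}

int process_folder(const char *folder, int n_mails)
{
    /* folder is loop-invariant, so the lookup is hoisted out of the loop: */
    const char *url = get_url(folder);
    int processed = 0;

    for (int i = 0; i < n_mails; i++) {
        (void)url; /* ... do some processing with url ... */
        processed++;
    }
    return processed;
}
```

With 1000 mails in the folder, the unhoisted version would call get_url 1000 times; this one calls it once.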
i would also add that i’m surprised at the differences in performance (and mem usage) between various distros. the empirical evidence doesn’t match the theoretical defense that “it’s all from the same sources”.
now somewhat off-topic but perhaps related, does prelinking make much of a difference for openoffice? i ask because that’s the only difference i can tell in how it’s offered in MEPIS vs other distros and man, oo.o starts up astonishingly fast in mepis! on a PIII 700mhz laptop w/ a 4200 rpm hd, it takes about 4-5 secs tops to launch oo.o writer.
back to gtk/gnome, here’s something i’ve noticed: some gtk apps, like oo.o and firefox/mozilla, seem to be snappier (i.e. menu drawing) when run under gnome than kde. but some others like epiphany show no difference. NOTE that this is all on older, slower machines e.g. PIII 500, 600, 700 with 256 mb ram. it would be nice if someone could measure this (i would if i knew what tools to use and what exactly to trace, and if i were at all familiar with gui dev).
.. which brings me to my last point: i think ALL developers should be REQUIRED to use an older/slower/less capable machine as their main workstation, while the latest and greatest machines, or even older but relatively modern ones, should be relegated to testing ONLY
Gentoo has the fastest GNOME environment I’ve used. I don’t suffer from much of the *slowness* people attribute to GTK+/GNOME. And I have anywhere from 5 to 8 apps open across several workspaces at any time. In fact, I have several Mono applications running and one Java application (Eclipse) running, and the only unresponsive application is Eclipse. There are occasional 2-3 second swappings here and there, but this doesn’t disrupt the overall user experience. I’m running the 2.10 beta series of GNOME, and it is damn responsive.
PS. I don’t have Evolution installed. Last time I tried it, it was bloody slow. I also use Epiphany instead of Firefox, if that makes any difference.
1.4GHz Athlon with 256MB RAM and 512MB Swap
My blank Gnome has always taken about 80MB. And i find that Gnome 2.8 runs better on a machine with 192MB of RAM than Gnome 2.2 did (I’ve tried both within the last month).
> I really don’t know… but talking about optimizations, do GTK+ and GNOME use liboil?
May I suggest libwonderoil?
“i think ALL developers shoud be REQUIRED to use an older/slower/less capable machine as their main workstation, while the latest and greatest machines, or even older but relatively modern ones should be relegated to testing ONLY “
So that means you’re willing to pay me compensation for doing that kind of work in my free time, right?
When I run KDE, Gnome, and Xfce on my Thinkpad 600 (300MHz cpu, 228 meg RAM), I get very satisfying performance with all three. Of the three, Xfce is the fastest, and uses the least amount of memory. KDE is next, then Gnome. This is with Mandrake 10 PowerPack. I’ve turned off all unnecessary services (I think?).
For memory usage upon basic, clean start up, it’s about 80meg for Xfce, 125 for KDE, and 160 for Gnome.
In each case, I go to the system monitor and look at what service/app is using what memory. With Gnome, there are the huge hogs metacity, pango, and nautilus, plus both the GTK+ and Gnome libs. With Xfce, there is very little (except the GTK+ libs), and with KDE it’s DCOP and the QT libs.
Conclusion: Since GTK+ based Xfce is super fast and uses little memory, and while GTK+ itself certainly could use some optimization (QT is more efficient), it’s the Gnome daemons and framework that are the big hogs. Gnome devs should concentrate on either replacing those daemons or completely optimizing them.
Also, as has already been mentioned, Gnome applets are a joke. When running Ubuntu (a very optimized Gnome implementation), I added the weather applet, until I looked at its memory usage (something like 30 megs – what a joke). Then it was “bye bye” weather applet. Pity, since it’s a cool applet. So applets are another area in Gnome that very badly needs optimization.
Finally, my eMachines PC, which has 1.6GHz cpu and 256 megs RAM, can be rather sluggish with WinXP (even Mandrake on the slower Thinkpad running KDE is snappier), and when running Mepis or Mandrake on the same machine (I’ve set up a tri-boot), both run faster than XP. So I disagree with those that say WinXP is faster and uses less memory than Gnome or KDE on Linux.
You mean you don’t get satisfaction out of optimizing your programs? That’s weird; I must be old-fashioned then. Personally, I find the optimization part the most rewarding. It makes me feel proud. Writing applications isn’t hard these days. Optimizing is. It requires you to use your brains, rather than clicking together a glade app and connecting the signals and stuff.
But I’m probably the exception (most developers are laaaaaaaazy after all, and don’t feel proud anyway… or shouldn’t, anyway…).
> Conclusion: Since GTK+ based Xfce is super fast and uses little memory, and while GTK+ itself certainly could use some optimization (QT is more efficient), it’s the Gnome daemons and framework that are the big hogs. Gnome devs should concentrate on either replacing those daemons or completely optimizing them.
I (almost) totally agree…
but i do find that starting from below (GTK+, fontconfig, pango…) is always a very positive thing, because it affects not just gnome but a lot more (a huge amount of apps, Mono, and of course GNOME itself).
Anyway, those applets are real suckers!!! :[
does the memory usage of applets include the shared libraries for each applet?
in other words, if all the applets use the same libraries, is that memory shown as used by each applet?
Ulrich Drepper’s dsohowto document mentions (in chapter 3.9) that shared libraries should be linked using the --as-needed ld flag in order to avoid unneeded dependencies on other shared objects.
This is not done by default.
To check the number of unused shared libraries dependencies in binaries and .so-s just do
ldd -u -r /usr/bin/* /usr/lib/* /usr/libexec/*
Each unused library adds some startup overhead…
It’s hard to say what the total overhead for the whole of GNOME is; somebody would need to hack one of the build systems that build from scratch to add --as-needed to the linker command line (or -Wl,--as-needed to gcc).
If you do that, please post the results here…
i’ve noticed something and i hope that someone can clear this up – this is gnome 2.10b running on ubuntu (hoary)/ppc
system monitor shows:
total used mem.: 153Mb (0b swap)
(Evolution, Licq, Sysmon, Firefox, xterm,battery applet, network applet, cpufreq applet, firestarter running)
evolution 2.2 mem.: 79.8Mb
firefox: 58.9
licq: 55.8
nautilus: 42.9
evo.data-server: 41.1
evo.exch.storage: 37.8
evo.alarm notify: 37.1
gnome-panel: 30.7
sys. mon: 28.1
10+ more apps@ ~20Mb
somehow that doesn’t add up to the 150MBs of used memory!
now free says that 760MBs of RAM are used.. is the rest of the RAM kinda sleeping?
total | used | free | shared | buffers | cached
Mem: 1034272 | 760032 | 274240 | 0 | 18528 | 587380
-/+ buffers/cache: 154124 | 880148
Swap: 1048568 | 0 | 1048568
AFAIK, the numbers reported for individual applets aren’t that accurate, they include some memory that’s actually shared between the applets and the panel. If you just tried to take all the individual numbers reported and add them up you’d find they don’t match the actual total of used memory at all. They’re definitely bigger than they could be, though.
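For what it’s worth, newer kernels expose a Pss (“proportional set size”) field in /proc/&lt;pid&gt;/smaps that charges each process a proportional slice of the pages it shares, so per-process numbers do add up to the real total. A rough sketch, assuming a kernel whose smaps includes Pss (the helper name pss_kb is made up):

```c
#include <stdio.h>

/* Sum the Pss lines of /proc/<pid>/smaps.  Shared pages are divided
   among the processes mapping them, so summing Pss across the panel
   and all applets would give an honest per-applet accounting.
   Returns total in kB, or -1 if smaps can't be read. */
long pss_kb(const char *pid)
{
    char path[64], line[256];
    long total = 0, kb;

    snprintf(path, sizeof path, "/proc/%s/smaps", pid);
    FILE *f = fopen(path, "r");
    if (!f)
        return -1;
    while (fgets(line, sizeof line, f))
        if (sscanf(line, "Pss: %ld kB", &kb) == 1)
            total += kb;
    fclose(f);
    return total;
}
```

By that measure an applet that “uses 26 meg” of raw RSS would likely show only a few MB of memory that is truly its own.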
Are you sure they don’t just run the OO.o quickstarter by default? That uses the same trick as MS Office uses – preloads the entire thing into memory on startup, so when you run it, it appears to load extremely fast. Most distros package it but don’t install or enable it by default.
clock applet claims having 7.8 out of 25.7Mbs shared..RSS is 9.7Mb, but i don’t know what that is…i guess what makes the clock-applet so heavy is the connection to the todo-list and the calendar
To check the number of unused shared libraries dependencies in binaries and .so-s just do
ldd -u -r /usr/bin/* /usr/lib/* /usr/libexec/*
This sounds very interesting!
But the command doesn’t work for me:
>> ldd -u -r /usr/bin/* /usr/lib/* /usr/libexec/*
ldd: unrecognized option `-u’
Try `ldd --help’ for more information.
Can you give me the right command? Thanks!
Those options are only available in newer versions of ldd, which ships with glibc (-u/--unused is a relatively recent addition).
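If your ldd is too old for -u/--unused, readelf can at least list which libraries a binary declares as dependencies, though it cannot tell which ones go unused (/bin/ls here is just an example target):

```shell
# DT_NEEDED entries = shared libraries recorded at link time.
readelf -d /bin/ls | grep NEEDED
```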
“…which brings me to my last point: I think ALL developers should be REQUIRED to use an older/slower/less capable machine as their main workstation, while the latest and greatest machines, or even older but relatively modern ones, should be relegated to testing ONLY.”
This would solve the problem in a big hurry. The Devs would not put up with slow performance in their development efforts, so they would optimize their own cooking.
Gnome needs more memory
You can’t force devs to use older machines, that’s just ridiculous.
In the blog, there was a boot option for booting Linux with less RAM: mem=256M, I think.
I think having the ‘framework’ and such optimized is very good, but in general, Linux desktop software needs a little more optimization. KDE has been optimizing for a little while now, so this is a good thing, and maybe along the way evolution, firefox, OO.o and such will optimize too.
-Nex6
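The option mentioned is the kernel’s mem= boot parameter; a sketch of a GRUB entry using it (the paths and root device here are made up, adjust them to your system):

```
# /boot/grub/menu.lst entry: limit the kernel to 256 MB of RAM,
# handy for testing desktop memory behaviour on a bigger machine
title  Linux (256 MB RAM test)
kernel /boot/vmlinuz root=/dev/hda1 ro mem=256M
```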
All the numbers from top, ps, etc will *not* add up to what you get for the total ram. Don’t try it, it just won’t work. The problem is that some pages are shared (ie, you don’t need 20 copies of the code for printf).
What is nice is to use the numbers to detect *changes*. “After I added foo.patch the rss went down by 100 kb.” This probably means you cut 100 kb from the system. But other than that, it’s hard to get the numbers to add.
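On kernels new enough to expose /proc/&lt;pid&gt;/smaps you can actually see how a process’s resident memory splits into shared and private pages; a sketch (the Shared_*/Private_* names are the kernel’s smaps field keys):

```shell
# Sum shared vs. private resident memory for one process (here: this shell).
pid=$$
awk '/^Shared_(Clean|Dirty):/  { shared  += $2 }
     /^Private_(Clean|Dirty):/ { private += $2 }
     END { printf "shared: %d kB, private: %d kB\n", shared, private }' \
    "/proc/$pid/smaps"
```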
This is only a start. One of the things that I have learned from my experience on Mono is that a dedicated effort at small optimizations — low hanging fruit — can be just as effective as the big-guns-hackers hacking the guts of the system.
That’s code for “no one can realistically use this thing in the real world and no one can even think about migrating to it internally at Novell”.
One of the things that I have learned from my experience on Mono is that a dedicated effort at small optimizations — low hanging fruit — can be just as effective as the big-guns-hackers hacking the guts of the system.
That’s something that really needs to have been done iteratively at regular intervals – for years. Hacking the guts out of the whole system (and it isn’t just GTK) is probably going to be the only answer.
Every time I’ve tried, even stripping things down as far as I can, Windows XP is painful at 128, and not pretty at 256 (especially if you try to do something, or have Word/another word processor and a web browser open at the same time).
FWIW, I have a similar experience with XP. Not that the author of the article doesn’t have a point, but some here clearly overdo it and don’t state any benchmarks at all. Interestingly, XFce 4.x, which also uses GTK2, runs pretty damn well on old i586 Pentium 1 and i686 Pentium 2 machines with as little as 64 to 128 MB RAM, Firefox included. Benchmarks to back that up aren’t necessary on this site and I don’t have the machines handy either, so I’ll skip those parts.
“Look around!
* gtk+ is now 2.6.2
* glib is now 2.6.2 ”
“Gtk+/Win32 Development Environment (runtime, devel, docs, glade, etc.) Installer 2.6-rc5 (.exe, 9.34M)
Gtk+/Win32 Runtime Environment Installer 2.6.2-rc1 (.exe, 4.79M)”
These are RELEASE CANDIDATES, not final.
Windows XP runs with greater perceived speed on my system. This may be because the XP interface is integrated with the operating system and the two are designed to complement each other. GNOME, or any X developers, will be hard pressed to get kernel scheduler optimizations favoring user interfaces into mainline by default.
GNOME and KDE’s poor performance may also be due to their reliance on GCC, though I am not sure; perhaps whatever compiler Microsoft builds their system with optimizes better?
That’s code for “no one can realistically use this thing in the real world and no one can even think about migrating to it internally at Novell”.
Uhm, people can use this in the real world (cf. Beagle, iFolder, Muine, MonoDevelop; or just look at the article above this one :-). Novell has dedicated a lot of teams to using it (iFolder, Beagle, other internal apps).
Eventually, a managed platform can provide better memory management (because it can be free from fragmentation with a moving GC).
One of the things that I have learned from my experience on Mono is that a dedicated effort at small optimizations — low hanging fruit — can be just as effective as the big-guns-hackers hacking the guts of the system.
I think it is, at the moment, a lot more effective than what the big-guns hackers do.
This is something Trolltech has recognized for a year or so, too. Trolltech saw that a lot of Qt developers still used Qt2 instead of Qt3, because Qt3 was slower. So Trolltech optimized Qt4.
I think that a toolkit has to be best at the thing for which it was created.
So I think that, at some point, it is more important to optimize a toolkit than to integrate more features.
So, Ben Maurer, go on and optimize Mono, Gtk+, … 🙂
Stenley
Uhm, people can use this in the real world (cf. Beagle, iFolder, Muine, MonoDevelop; or just look at the article above this one :-). Novell has dedicated a lot of teams to using it (iFolder, Beagle, other internal apps).
People using this stuff in a company on a daily basis, on machines that are not the latest and greatest, is the real world. The developer’s and general open source user’s world is not that real world. You find that out as more people use your applications. I assume you’re finding that out, which is why I wrote the above.
Eventually, a managed platform can provide better memory management (because it can be free from fragmentation with a moving GC).
Tosh. If you’ve already got an environment and applications that are inefficient, and you then decide to run them inside a virtual machine environment (which in itself takes up more memory and adds more overhead to the system) then you’re just making things worse. No GAC in the world is going to solve that problem. I’m sorry, but you’re just arbitrarily throwing .Net terms and hype around that don’t mean anything.
GAC above should be GC, meaning garbage collection. Sorry, my head is full of too many damn .Net buzzwords.