Novell and OSNews are sponsoring the memory reduction project led by Novell’s Ben Maurer by providing bounties to developers who help clean up bloat in GNOME and related programs. If you are a developer and are interested in earning some extra cash or prizes by making GNOME more usable on machines with 128 MB of RAM (a very common configuration in developing countries or even European businesses), please read here. Related post here.
I’d just like to mention that you do not have to be a developer to help out. As mentioned on the blog, one of the most important things we can get is bug reports. If you are able to reproduce a set of steps that sends the RSS of a process through the roof, please do file a bug.
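A minimal sketch of one way to capture those numbers on Linux while you reproduce the steps: it just polls the VmRSS line from /proc (the PID argument, two-second interval, and output format are only illustrative).

#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    char path[64], line[256];

    if (argc < 2) {
        fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 1;
    }
    snprintf(path, sizeof path, "/proc/%s/status", argv[1]);

    for (;;) {
        FILE *f = fopen(path, "r");
        if (!f)
            return 0;                /* the process has exited */
        while (fgets(line, sizeof line, f))
            if (strncmp(line, "VmRSS:", 6) == 0)
                fputs(line, stdout); /* e.g. "VmRSS:   12345 kB" */
        fclose(f);
        sleep(2);                    /* sample every two seconds */
    }
}

Run it against the suspect process and paste the growing numbers into the bug report.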
> on machines with 128 MB of RAM (a very common configuration in developing countries or even European businesses)
Is this serious? Or is this sarcastic? Anyway, this is utterly out of touch with reality. I live in a 3rd-world country and computers with 1GB of DDR RAM are common. There are some with 512MB, and still some old computers with 256MB of RAM, but 128MB? I haven’t seen that in a while. Oh, and I doubt European companies are still in the stone age either. C’mon, gimme a break…
This is great to see!
Although not always connected, code size and memory usage sometimes go hand in hand. One example is FLTK (and eFLTK), which is used by the EDE desktop from which I’m typing this comment. Startup is a flash, and it consumes ~11MB of memory (eworkpanel, edewm, and eiconman combined).
I enjoy using GNOME but found myself irritated by the long startup times, mostly due to the gaggle of libraries. Once it’s actually up, things are usable (provided you’ve got 320+MB of memory).
Hopefully further reducing memory usage will involve better use of already-duplicated memory regions like those mentioned in the bounties, as opposed to moving memory to disk: “we can tile the background into a grid of mallocs, and reduce memory by writing hidden background areas to disk and freeing the region” (which of course would be a disaster, though it probably would free up some memory).
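On the already-duplicated regions: a minimal sketch of the mmap idea, assuming a hypothetical read-only data file named icon.cache. Mapping it instead of malloc()+read() means every process that maps the same file shares one physical copy of the pages, and the kernel can simply drop clean pages under memory pressure rather than writing them to swap.

#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    struct stat st;
    const char *data;
    int fd = open("icon.cache", O_RDONLY);

    if (fd < 0 || fstat(fd, &st) < 0) {
        perror("icon.cache");
        return 1;
    }
    /* Read-only shared mapping: the pages live in the page cache once,
       no matter how many processes map the same file. */
    data = mmap(NULL, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
    if (data == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    close(fd); /* the mapping remains valid after close() */

    printf("first byte: %d\n", data[0]); /* pages are faulted in lazily */

    munmap((void *) data, st.st_size);
    return 0;
}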
I’m looking forward to a ‘fast and light’ Gnome!
I live in the USA and my linux box has 128 MB of RAM. It got me through all my college CS classes. I know a lot of small businesses that are still running on Pentium II’s – right here in the US. So get your head out of your rear.
My (private, American, college-prep) high school has most of its computers running Windows XP on 128 MB of RAM. They have been considering moving some boxes to Novell Linux Desktop; however, one of the blockers is that it simply can’t run on 128. They are trying to find enough RAM to give a few boxes a boost.
I’d add that this has benefits for users (like myself) who have 1 GB of memory. The memory that’s not used by applications gets used to cache files from the disk. If your applications use less memory, you can cache more files, which means your computer is faster and quieter because it has to touch the disk less often.
I’m glad your 3rd-world country is rich and can afford such things. As for the rest of us: low memory usage is a GOOD thing.
Glad GNOME is finally doing something… perhaps I’ll actually give the next release a go and see if it’s gotten usable. Every release I’ve tried has been horrible.
I must admit, though, it’s sort of sad that they have to be sponsored or whatever just so they actually do something about a well-known problem. I’m sure there are memory problems in all the programs I use (which are NOT GNOME), but I don’t notice them as much as I do when I’m in GNOME.
“I must admit, though, it’s sort of sad that they have to be sponsored or whatever just so they actually do something about a well-known problem.”
Not at all. Up to this point, there has been a huge response from the GNOME community to my posts and bug reports about memory usage. In fact, we have already received a few patches.
These bounties exist for the same reason as the other UI bounties: it is not that we are too lazy to program. We are just looking for new people to get involved. Besides, it is just plain fun :-).
> I live in the USA and my linux box has 128 MB of RAM
Yeah, why not? I remember selling my old IBM PS/1 several years ago; actually, I could have kept it, and today I would run Damn Small Linux on it.
It all depends on what you intend, but GNOME on 128MB of RAM is unrealistic. I have installed it with 256MB and it’s damn slow; the HDD is always busy. It’s a useless waste of time. From my experience, 512MB is the bare minimum for GNOME plus an office suite, for instance. 128MB is fine if you don’t run a full desktop environment.
What the hell?
LOL
Why not? I was looking at recent eBay bids for brand-new Mac Minis; these guys sell their Mac Minis for more than the retail store does.
But then, if I use a Mac Mini, although I could, I will NOT use GNOME, thank you. GNOME is fine when you have no dinero.
Hmmm… I wouldn’t say GNOME isn’t usable on 128 MB as of now, because I used it that way for a couple of months, after which I upgraded to 256 MB.
I applaud the efforts being made to enable GNOME to run at low configurations too.
Umm… they are not talking about new PCs, they are talking about the average person’s PC. I know that I still have PCs running here with 128MB of RAM, and my sister and cousin have PCs with only 64MB. My workplace has only 256MB in most PCs, and 128MB in some older ones.
If we can get the memory usage of GNOME down so that it can run speedily on a 128MB machine (like it could back in the 1.x days), then these machines can be switched to GNOME, and work fine for a couple more years, saving valuable cash. When it comes time to buy another PC, Linux will be the obvious choice.
Supporting older machines is the best path for getting Linux on the desktop.
I’ve run Linux on a computer with 128 MB of RAM for years, and it was fine until about two years ago; Linux can still run nicely on a machine with 128 MB of RAM today if you use a lighter window manager like Xfce or IceWM. KDE and GNOME both use more resources than I would like, but since their goals are to be very modern, they will require more resources.
To say that Windows runs fine on a machine with 128 MB of RAM but Linux doesn’t isn’t a fair comparison, since the real resource consumers are KDE and GNOME.
Supporting older machines is the best path for getting Linux on the desktop.
Getting Linux onto the desktops where people get acquainted with PCs for the first time (schools, etc.) is even better. A lot of people tend to stick with what they know.
Mozilla/Firefox is where a huge amount of your memory goes. 256 megs is more reasonable. It’s time to upgrade unless you’re planning on running Fluxbox and Dillo.
How about the bounty for Pango optimizations? Gtk+ redraw still has issues.
>You should be using Opera 8 anyway. It’s lighter and it uses less RAM.
Yes, but that doesn’t change anything. The free software community needs a lighter open-source browser than Mozilla/Firefox.
Three tabs open in Moz 1.8b, 54 meg in use. A gig will probably be fine.
I am disappointed that whilst the Solaris kernel team does a lot of evangelizing of the new Solaris 10 features such as zones/smf/dtrace/zfs, this doesn’t seem to percolate down to other Sun teams such as Sun Beijing (Mozilla) or Sun Ireland (JDS). IMHO, Solaris 10 has a lot of the observability tools Novell is building, and the JDS team should be taking this opportunity to show off what Solaris 10 (dtrace/libumem/p* tools) can do to find memory leaks etc.
Lighter Gnome helps Sun customers (more Sun Rays hosted on a single server) and if they (Solaris 10 tools) help to get some good fixes in, they can always brag.
End users/developers will think positively of Sun (great goodwill generation).
Good point. Someone needs to do it. I see a lot of Sun Bloggers on OSGalaxy.
Hopefully, optimization and profiling will become a key part of the GNOME development process. I do not think free software lacks good tools for such tasks.
“Is this serious? Or is this sarcastic? Anyway, this is utterly out of touch with reality. I live in a 3rd-world country and computers with 1GB of DDR RAM are common. There are some with 512MB, and still some old computers with 256MB of RAM, but 128MB? I haven’t seen that in a while. Oh, and I doubt European companies are still in the stone age either. C’mon, gimme a break…”
So, Mr. Smartypants, let us people living in countries poorer than yours benefit from all the memory optimizations. I live in Brazil, and there are LOTS of computers being sold with 128 MB of RAM. So even you and the people in your country (where everybody has dual Xeons with 2GB of RAM) can get some benefits, as Mr. Maurer has pointed out.
It seems that this guy doesn’t have a clue at all… or he thinks that Switzerland or Monaco are actually 3rd-world countries.
I have a Pentium II that I just installed FreeBSD on today. I was planning on installing GNOME as the desktop. From reading this post, it doesn’t seem like a good idea. What do you guys think, should I use KDE instead?
It will be great for GNOME to be a bit lighter on memory, but GNOME’s memory usage is exaggerated. For example, I recorded 70MB of memory used (for the whole computer) for an empty GNOME session, although that is with everything compiled with -Os.
Looking at the bounties, so far the rewards seem a little cheap and out of line with the work that would probably be involved. Yes, I am a programmer, so I know what I’m talking about. I also know that doing something properly takes TIME. The only people who would bother implementing these bounties are developers who are already intimately familiar with the GNOME codebase. No developer I know is dumb enough to learn a whole new set of technologies to fix a bug for 100 bucks. Up the ante and then you might see some fresh fish dive into GNOME tech.

This optimization work is long overdue, and I do hope these bounties are redeemed soon. Though unless the fix is immediately obvious, I highly doubt anyone would be eager to put in the work for such a small amount of money. Here’s to hoping some developer still living with his parents knows something about the GNOME libs. Sorry to be cynical, but it’s true.

If you truly want outsiders to capture these bounties, more resources need to be dedicated. That includes more money, but also links and book recommendations for relevant documentation. For example, the bounties which speak about mmap should link to documentation on how to use mmap efficiently. I know how to program in C, but I know nothing about mmap in particular. The problem descriptions need to be better documented and annotated. When the bounty itself gets more serious, then you might see real developers get serious about solving them. Right now the bounties look like easy bonuses for core GNOME developers only. Get real!
KDE uses less RAM, but both are going to be pretty slow with 160MB. If you really like Gnome better, I’d just stick with it.
>> Is this serious? Or is this sarcastic? Anyway, this is utterly out of touch with reality. I live in a 3rd-world country and computers with 1GB of DDR RAM are common.
I live in the United States and I cannot buy 1GB of RAM. LOL. By the way, I come from a third-world country, the D.R. I wonder what your country of origin is.
I agree. That’s chump change for those of us in the West, especially when you add in all the factors like the lack of documentation and all the rest, but it might be some incentive for someone in a less developed country, or kids still in school, or for the glory of GNOME…
Money talks, though, and it won’t hurt for some of these corporations to spread it around a bit more if they want to be serious about the desktop. I think the bounty system is a good thing.
I’m in Canada, and I’ve been running my current system for the last five years with 512 MB of PC100 RAM. It’s an old style of RAM chip, but it has worked just fine all those years. My notebook computer only has 256 MB of RAM, and if I were to buy a new notebook, most of them come with 512 MB of RAM. If you pay top dollar, they come with 1GB of RAM.
The award money is too small, so they should have only one or two tasks and combine the money to make it worthwhile. Although maybe to the guys in Russia or China that is a lot of money, since their dollars go further in those countries, one hundred dollars in Canada will hardly buy you a full tank of fuel for your SUV, a cup of coffee, and a newspaper.
Guess what: you’re absolutely right on this! (I’m also a professional software developer working on large software)
However, there’s this huge proclaimed OSS community that claims software should be free, and then they complain when things aren’t nearly as optimal as they should be, and then they say “fix it!”, expecting that it truly is free. Well, this whole thing proves that no matter how “free” something is, everything has a cost. In this case, it will cost someone with more time than sense on their hands to optimize something that won’t get them anything more (considering the amount of work and resources involved) than some token recognition for their contribution to the world, in a piece of software that will likely be replaced before their current car is too old to keep around. Of course, if they do too much of this sort of “free” software and don’t attend to life otherwise, they will find themselves stuck with that old car.
That’s what the majority of people who claim things should be free don’t seem to comprehend: nothing is free, and somebody has to be their software savior to make up for their own lack of time/money/effort in getting that “free” software. Just because you can make bit copies of something in a matter of seconds doesn’t mean that the creation of the original didn’t take a lot of time/energy/resources/life from the developer. The copiers are just darn fortunate that technology allows perfect replicas of someone’s digital Mona Lisa for casual effort. Actually, comparing the time spent on any meaningfully complex software to the time required for a masterpiece like the Mona Lisa doesn’t put things in the correct perspective at all: the original Mona Lisa was much easier to create, but much harder to duplicate exactly (paint and all, not using modern photographic or digital methods).
I find it uproariously hilarious that for the longest time, people were whining about how bloated Windows was, how it was such a resource hog, and now the closest equivalent they all gather around makes Windows look like a resource miser (which it isn’t).
“It seems that this guy doesn’t have a clue at all… or he thinks that Switzerland or Monaco are actually 3rd-world countries.”
Well, I am from Switzerland and I have no money to buy new hardware; that’s why I have an old but usable computer. I have only 192 MB of PC-100 RAM (a bit of it is even used by the integrated graphics).
I’d like to try out GNOME, so I think this is a good thing.
That comment is just prejudice. If you live in Switzerland, you don’t automatically have a lot of money… what a load of b.s.
“Is this serious? Or is this sarcastic? Anyway, this is utterly out of touch with reality. I live in a 3rd-world country and computers with 1GB of DDR RAM are common. There are some with 512MB, and still some old computers with 256MB of RAM, but 128MB? I haven’t seen that in a while. Oh, and I doubt European companies are still in the stone age either. C’mon, gimme a break…”
That’s complete bull. 1GB of RAM is common in 3rd-world countries? What are you talking about? I come from one too; I worked there at an ISP for a few months, and their servers were on 256MB, let alone the desktops.

Let me tell you how it is in “third-world countries”: people over there care about the CPU more than anything else. They’ll get the best CPU they can afford and buy a measly amount of RAM and a crap motherboard, more for bragging rights than anything else. And not to speak of offices and businesses: they upgrade their machines very rarely. In fact, people who can’t afford new computers buy ex-corporate machines that get retired in Western nations and then get shipped by the container-full to third-world countries for cheap mass sale. And most of those come with 128MB of RAM and are either Pentium IIs or IIIs.
It’s not about the damn 128 megabytes, it’s about fixing memory leaks. And by fixing those, all configurations will benefit.
Improving and checking code is always a good thing.
Why all the trolling about 128-megabyte configurations? You don’t understand what it’s about.
I am running GNOME 2.8 on Gentoo with a 1300MHz Duron and 256MB of RAM. I don’t think it is slow at all; Windows XP is much slower….
Increasing the speed of the GNOME desktop and applications can only do good for the cause, and I expected that this was going to happen.
I’ll try to help with this, but how do I know who’s doing what?
My oh my. So this “bounty” thing is to make GNOME skinnier, or to give xfce4 some competition? The rationale behind this thing is a bit doubtful: for skinny machines, icewm (just one example) is just rockingly good (and you don’t need any more than 128 megs for a very nicely configured and fast icewm). It’s just that I don’t think icewm (or the other small and fast ones) is all that financially sponsored, while GNOME and the like get their income from this and that eventually anyway.
Now don’t get me fully wrong, I don’t see GNOME “hunger” reduction as time uselessly spent; it can do GNOME nothing but good.
I wonder where KDE would be today if it had the same resources, support and hype Gnome receives. Competition is always good, but it seems to me there is far more money invested in making Gnome succeed. It’s a pity since KDE developers are quietly making a technologically superior environment.
Well, I’ve read through the posts, and the 3rd-world thing and the amount of money come up quite often. Let’s say this: I don’t live in a 3rd-world country, yet my monthly income (two degrees and nearing a PhD) is around $500, and there are worse off. Nuff said.
Instead of whining, why don’t you do something about it? There’s nothing stopping anyone from setting up a bounty system for KDE.
One of the bounties:
> Create a test suite for browsing in Nautilus
I always wondered why Nautilus was so buggy, and I guess that explains it. They’ve gotten to version 2.9 without having a test suite for a major component of the desktop! Tests should be among the first things you write, not the last.
Not whining, just pointing out something that catches my attention. BTW, I donate money from time to time, but obviously I don’t have the money Novell has.
KDE is even more bloated. I run FreeBSD 5.3 on a 1000 MHz PIII with 256 MB SDRAM and it runs just fine. I also have it installed on a 400 MHz PII with 128 MB SDRAM; it runs more slowly, but still about the same as Windows XP would on such a system.
KDE and GNOME are huge desktop environments and both are bloated; however, I prefer GNOME over XFCE or Fluxbox [which I use on this laptop].
And although I am not a Linux or KDE fan, if someone looks at Xandros Linux it will be clear KDE can run very well on a PII. 😉
IMHO most of the “lightweight desktops” are *very* ugly. No, I don’t care for backgrounds and no, I don’t care for fancy gadgets. I care for a clean window design and nice small taskbars and icons. 😉
Sure, I wish I had more, and it’s a bit slow at times, but it is usable.
My Dell Inspiron only has 128MB for a Fedora Core 3 setup, and I would wager that at 256MB it would sing.
It feels pokey, but not so much that my wife agrees we need to upgrade.
In fact, the worst part is sometimes the slow disk in the laptop more than the lack of memory.
The only part that is unusable is OpenOffice, whose launch takes an extreme amount of time.
But then again, I like Abiword and Gnumeric better anyway. Thank god I don’t get many ppt presentations on a daily basis.
🙂
Why Fedora? Take a look at:
http://www.xandros.com/products/home/desktopdlx/dsk_dlx_systemreq.h…
It’s a pity since KDE developers are quietly making a technologically superior environment.
Hopefully one day they can make it usable and elegant too. That will be the real challenge for the elitist, clutter-loving KDE developers.
That Joe User bragging about his third-world country, where everyone has enough RAM to recompile the DNA of a hippo while playing Quake 3, appears to be in Brazil from his IP address, for what it’s worth.
He’s also pretty clueless – most business desktops in the UK have 128MB, because it’s all you need to run Win2000, Office, and an antivirus.
When I first bought my Pentium 4 Dell desktop machine 3 years ago, it came with 128 MB and Windows 2000. Shortly afterwards I upgraded to Win XP. At first my computer was quite responsive, but after a couple of months of installing and uninstalling software, the machine became slower and slower. At this point I dropped in 256 MB of extra memory, and the effect was amazing: the machine became usable again. Up to that point I had always been a multi-booter, usually having various versions of Windows (Win98 for games, Win XP for the latest software) and several versions of Linux.
It was at that time that I discovered Gentoo and switched completely. Although I still have a Windows XP partition, I boot it perhaps 3 times a year now, primarily to help others with their Windows problems. At the time I switched to Gentoo, GNOME was preparing to go 2.x; eagerly awaiting it, I installed each and every GTK+2 program as it was released. Finally, a nice-looking desktop. But it was unbelievably slow, slow because my disk was thrashing like a madman. In the fall of 2003 I decided to bite the bullet and dropped an additional 512 MB into the machine, bringing me to 768 MB total (I had to remove the original 128MB). Finally, no more (i.e., very little) disk swapping.
Six months later I upgraded the processor from a 1.3 GHz Pentium 4 Willamette to a 2.4 GHz Celeron (with a larger cache!) and I was stunned: sure, it made a large difference in compilation time, Gentoo being compilation-intensive, but I saw no effective change in the speed of GNOME programs, which have been my main desktop for the last couple of years. Thankfully, with each iteration of the GNOME desktop and the successive versions of OO and Mozilla/Firefox, the desktop has gotten faster and faster. Coupled with very nice performance improvements in the 2.6.x kernel series and my transition to reiser4, I can honestly say I am running a very stable and very fast desktop. Now with xcompmgr running, my desktop is completely fluid with no visible redraws; app startup time is negligible, and the computer remains responsive regardless of what I have running.
At work I have 3 desktop machines in the room from where I administer our LTSP-based media room. These machines are all about 6-8 years old, Pentium II/III era, 400-600 MHz. We had hired a new admin who was a Debian freak, so we decided to install a Debian-based distro on one of these machines for him; we chose Ubuntu (Warty). I was really amazed: we installed it on a Pentium II running at 400 MHz with ancient IDE drives (ATAPI-33) and 512MB, and the GNOME desktop was lightning fast.
I was interested in finding out why Ubuntu on that machine was so fast, as the other two machines were arguably faster: one had a 600 MHz Pentium III and the other had a 650 MHz Duron. I had installed Gentoo on both of these machines (distcc on our LTSP server, a dual Athlon MP w/ 1GB of memory; man, what a dream!). But both of these machines were much slower than the old PII @ 400MHz. The difference, aside from distro choice? Memory. The other two machines each had 128 MB of memory. (That and the wonderful LDFLAGS hacks that the Ubuntu guys used…)
So I have come to these conclusions:
Windows XP runs fine with 128 MB for the first couple of months. After repeated software installs/uninstalls, the OS becomes progressively more unresponsive. An upgrade to 256 MB makes a massive difference on Windows XP, far more so than a processor upgrade.
GNOME runs excellently on 400 MHz machines, provided that one has a large amount of memory (>= 256 MB). Sure, Windows XP will run on slower CPUs, but in my experience Windows is just not usable without at least a 700 MHz processor, regardless of how much memory.
Honestly, I am quite surprised by how much memory GNOME really needs to run well. It hasn’t been a problem for me, but most of this older generation of PCs don’t have 256 MB or more of memory, and it is getting harder and harder to find the older types of memory. In two years it will no longer be possible to upgrade the memory of a lot of these older machines. And there are probably somewhere close to 200 million of these older machines still in use around the world.
I doubt it will be possible to get GNOME to run well with 128MB of memory; I say this because 256 is already only barely enough. I should qualify this: sure, GNOME runs with less than 256, but it ‘feels’ slow because there is so much swapping going on. Of course this depends on application usage, but most GNOME users will have Evolution, Epiphany/Firefox/Mozilla or Galeon, and OO running, over and above whatever applets are running on their panels. If I have 10 tabs open in one of the Gecko-based browsers, 3 documents open in Ximian-OO, and 5 gnome-terminal tabs open while checking my email on 3 different accounts, that is one hell of a lot of memory.
That said, I think 128 MB usability is a really good goal to shoot for; it will probably result in wonderful 256 MB performance….
And even though I have far more memory on my main machine than what’s being talked about here, I know that these improvements will make my desktop faster and smoother.
On a further note I should mention something:
In 1986 I was running OS-9 Level II (Microware) on a 512 KB (!) TRS-80 Color Computer II running at a whopping 2.34 MHz on a Motorola 8/16-bit 6809E. This OS was Linux/Unix-like: it had a shell similar to bash and had 7 virtual terminals (with the same Ctrl-Alt-Fx keys as in Linux). I ran Starwriter (a WordStar clone) in one virtual terminal, did Pascal programming in a second, and kept track of finances using Starcalc in a third. The system was graphical (with windows and a mouse, 640x192x256 colors). And I never had problems with disk swapping, because I had no hard drive!
Of course, that old machine didn’t support one one-thousandth of the functionality that modern machines take for granted. CD-ROMs were not yet in use in PCs, DVDs were still on the drawing board in engineering backrooms, and digitally stored music was but a dream, as was digital video. And of course there was no WWW, no browsers and such. Microware coded the whole damn thing in assembler; it turns out that it was the same OS used in the original space shuttle.
I understand that the data we now have is much richer and uses so much more memory, what with all the graphics, sound, and video in modern systems, but I cannot for the life of me understand how inefficient programming styles have become. GNOME is written in C with a C implementation of object orientation; it baffles me how a pure C implementation can be orders of magnitude less memory efficient than something like the MFC, which encompasses *so* much more.
I don’t really program anymore; occasionally I code something up in PyGTK, and nowadays I do a lot of bash scripting. At some point along the way, the growing complexity involved just killed programming for me. I remember being angry about the memory usage required by Pascal and how slow the process was: editing, linking, compiling… I wrote 2k assembler programs which would now require somewhere close to 3 MB for similar functionality.
When will the chronic wasting of RAM and CPU cycles end?
I doubt it ever will… Humans are lazy and tend to forget the past faster than a speeding bullet. So I fear effectiveness will not increase.
Will people never learn that optimized code runs faster on the fastest machines, too?
I’m so sick of that “fast enough” attitude. IT is slowly killing itself…
Why should I use something bloated and put work into getting it unbloated, when I can just take something that was well designed 15 years ago? Namely, the free implementation of the OpenStep specification: GNUstep.
Back in the good ol’ days, programmers enjoyed efficient code and valued optimization as a form of art. Today all the CS students are raised on the nipples of the bloat-pig known as Java, and once they discover OSS, their new peer group tells them to use Python. These folks don’t know the first thing about efficient code. If you are used to programming at such an abstract level, you completely lose track of these things.
Of course, increasing efficiency by 100% is not worth it if it also increases implementation time by 100%. However, most of the bloated and slow OSS code is just the result of incompetence. Using XML for config files is a good example. XML is a format for (transferring) business data; it’s bloated and expensive to parse, and it shouldn’t be used anywhere performance is critical and memory is limited. But of course it’s hyped, and corporations like to see it on a resume, so the OSS code monkeys just couldn’t resist.
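For what it’s worth, GLib itself ships a much lighter option for simple settings: the INI-style GKeyFile parser (available since GLib 2.6). A minimal sketch, assuming a hypothetical app.conf containing a [General] group with a theme key:

#include <glib.h>

int main(void)
{
    GKeyFile *kf = g_key_file_new();
    GError *error = NULL;
    gchar *theme;

    /* app.conf might contain:
       [General]
       theme=Clearlooks */
    if (!g_key_file_load_from_file(kf, "app.conf", G_KEY_FILE_NONE, &error)) {
        g_printerr("failed to load config: %s\n", error->message);
        g_error_free(error);
        g_key_file_free(kf);
        return 1;
    }

    theme = g_key_file_get_string(kf, "General", "theme", NULL);
    g_print("theme = %s\n", theme ? theme : "(unset)");

    g_free(theme);
    g_key_file_free(kf);
    return 0;
}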
Something like GNOME should run comfortably in 64 MB of RAM!
There is a difference between using the power of current PCs to support sensible new features and crappy bloated code. Hopefully the GNOME devs will realize this some day.
I think the issue in OSS is the 80/20 rule: OSS projects seem to be very good at getting the first 80% of a project done, but the remaining 20% (testing, profiling, optimisation, etc.) is the bit that really makes a piece of software shine.
Unfortunately, that 20% is also the dullest part. Hence the large number of similar OSS projects that are all almost good enough; it’s much easier to recruit developers who want to do the interesting bits than developers willing to help out with the dull 20%.
It would be interesting to see what would happen if GNOME kept the feature freeze on for the next release cycle and spent six months purely testing, bug fixing, profiling and optimising.
There are a few Celeron 400s left, staggering on with 160-192MB of memory, but the new ones have 512MB because that’s what I specced them with.
256MB is not enough for XP if you expect to run an office suite and a browser as well, in three months’ time when Windows has begun the process of strangling itself.
Arguably 384MB would be sufficient for a work box, but these PCs use onboard graphics, so 512MB provides enough leg-room for them to still operate comfortably in a year or so’s time.
“It would be interesting to see what would happen if GNOME kept the feature freeze on for the next release cycle and spent six months purely testing, bug fixing, profiling and optimising.”
I’d be all for longer release cycles with feature freezes if it meant more stable, more secure, and more efficient code, and it does. Just look at OpenBSD, which took a similar approach to ridding its base system of bugs and security holes so many years ago.
Guess what: it worked.
“I’d be all for longer release cycles with feature freezes if it meant more stable,…”
Exactly. I would consider stability and lack of bugs the most important feature.
My 15-month-old desktop, a Sony Vaio P4 2.66, came with 256 MB of RAM. Only a few days ago I added an extra 512 MB. Indeed, anybody who says that there is plenty of RAM available is dreaming.
@Dennis: are you sure you had a good look at Xfce 4.2? I still prefer KDE, but I surely don’t find Xfce 4.2 “very ugly”; on the contrary, I could easily live with it. And BTW, RHEL’s GNOME doesn’t know how to open a text file!
@The MESMERIC: you have gone way over the top, but I agree that Evolution 2.0 was a major disappointment.
Arguing over how much RAM is available is really missing the point. However much you have, using less of it to achieve the same aim is always better. The less RAM used by the desktop, the more is available for the rest of the system, whether as disk cache or for application data. People with gigabytes of RAM presumably bought it in order to run some memory-intensive app, not to run GNOME faster.
Removing redundant memory usage also improves processor cache performance, speeding up the desktop further.
Hello Robert (IP: —.sympatico.ca): GNOME and KDE are a little heavy in their basic requirements. Try:
1)Fluxbox
2)IceWM
3)EDE
4)XFce
Those might be a little more in line with your system’s specs.
FYI: http://www.plig.org/xwinman
Most of the window managers listed on the above site are in the ports collection. The site links to their home pages with screenshots and some reviews. It’s a nice link to have bookmarked.
cvsup -g -L 2 ports-supfile   # sync the ports tree (-g: no GUI, -L 2: verbose)
portsdb -Uu                   # rebuild the ports index
make install clean            # run inside the port's directory to build and install
OK, I’m not in front of my BSD box, but you get the hint. 🙂
… I have to disagree with the money prize. First of all, I think it is always good to optimize the memory usage of any program. In the case of a GUI manager, it is good because it will not only run better on older machines, but it can also be good for newer machines, because you’ll have more memory and speed for the programs that run on the GUI; so efficiency is certainly welcome!
On the other hand, offering cash prizes to boost development seems to me like something foreign to the open-source and free software world. I think (but take this as a very personal opinion) that providing free software, and thus giving everybody easier access to computers, is more than sufficient reward.
And how much $$ have you all donated to the Gnome project? The odds are:
1) Zip
2) Zero
3) Nothing
4) Zilch
Now you comment about the sum of money.
This can also be viewed as:
1) Giving people a chance to showcase some of their skills and make some money at it.
2) Helping the project that has given you a DE for several years and asked nothing in return.
Am I a programmer? No. However, I do help the OSS community by answering questions on various mailing lists. I do what I can to help. Not everything is based on money.
There are those who use OSS and contribute back, and then there are those who are just along for the ride. The latter would be known as freeloaders.
PS: I am not part of the Gnome mailing lists or associated with the Gnome project. These are my own views and opinions.
Why is paying contributors to Free-as-in-speech software a bad thing? You do realise that Linus, Havoc, Miguel, Ben Goodger, etc. are all paid to work on Free software? Where do you think all the money that IBM/Sun/Redhat/SuSE spend on Free software goes?
Free-as-in-speech != Free-as-in-beer
> Where do you think all the money that IBM/Sun/Redhat/SuSE
> spend on Free software goes?
Good question. Where does the money go? I’ve been working on GNOME for a couple of years now and till now haven’t seen one buck of it. The money spent has most likely been floating into the pockets of those who already earn enough of it.
by Joe User
“Is this serious? Or is this sarcastic? Anyway, this is utterly out of touch with reality. I live in a 3rd-world country and computers with 1GB of DDR RAM are common. There are some with 512MB, and still some old computers with 256MB of RAM, but 128MB? I haven’t seen that in a while. Oh, and I doubt European companies are still in the stone age either. C’mon, gimme a break…”
USA?
“I’ve been working on GNOME for a couple of years now and till now haven’t seen one buck of it.”
Have you tried applying for a job with the companies, rather than just waiting for them to send you money?
“Arguing over how much RAM is available is really missing the point. However much you have, using less of it to achieve the same aim is always better. The less RAM used by the desktop, the more is available for the rest of the system, whether as disk cache or for application data.”
I agree. What I was trying to say is that people, generally speaking, don’t have RAM to waste.
…but I have to say that running GNOME on 128 megs is not that bad. I have a desktop with 512 megs of RAM and a laptop with 128 megs. Sure, the laptop is slower than the desktop, but not by much. Both run the same version of Ubuntu, and they boot into GNOME within 3 or 4 seconds of each other… Menus open just as quickly, even with several programs open across 3 or 4 desktops. OpenOffice takes a few seconds longer to load, but Firefox is more or less equal. Considering the slower processor in the laptop, I would say that GNOME is pretty good at dealing with low-memory systems.
Something that bugs me:
a) These things should be a matter for the upcoming GNOME 3.
b) While bounties are nice, people seem to forget that GNOME is community-driven and that you should get feedback from the maintainers of the modules first. They are the ones who shape each tool/program and who decide what gets in and what doesn’t.
c) Announcements like these just confuse people and will lead to the assumption that companies or other organizations are leading the GNOME desktop.
Why delay until GNOME 3? There is at least a GNOME 2.12 planned, so that makes it a minimum of 13 months until GNOME 3 could be out. Why delay making these kind of changes, which have no effect on API compatibility until the next big API change?
Maintainers are important, but the existence of all these memory-saving opportunities clearly shows that there is a need for more work to be done.
Again, what is wrong with paying money for good development work on a project? GNOME is developed and helped a lot by people at Sun, Novell and Redhat. It’s still GPL. It’s a Free software project, not some sort of “volunteers only” effort.
A 128MB performance target is not realistic for GNOME; however, maybe 256MB is possible. I’d rather a modern DE have a memory management system to program against than the most efficient code base, because that means GNOME has a friendlier API and the safety of garbage collection. Although I think that with GNOME you can have it either way, depending on whether you use the GObject system; is that not correct?
Making C code more efficient is possible without changing the fundamental design of the library. I think that books like “The Practice of Programming” are full of examples of efficient C code.
I think I might upgrade my system when that “Cell” processor comes out. If you ever get the opportunity to upgrade to 512MB of RAM, it is a good investment; you will notice an increase in performance, especially if you only have 256MB of RAM.
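On the GObject question above: the GObject system gives you reference counting rather than true garbage collection, so object lifetimes are still managed by hand. A minimal sketch of the lifecycle, using a plain GObject to sidestep GTK+’s floating-reference rules:

#include <glib-object.h>

int main(void)
{
    GObject *obj;

    g_type_init(); /* required before using the type system in the GLib 2.x of this era */

    obj = g_object_new(G_TYPE_OBJECT, NULL); /* refcount == 1 */
    g_object_ref(obj);                       /* refcount == 2 */
    g_object_unref(obj);                     /* refcount == 1 */
    g_object_unref(obj);                     /* refcount == 0: finalized here */

    return 0;
}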
Looking at the bounties, so far the rewards seem a little cheap and out of line with the work that would probably be involved. Yes, I am a programmer, so I know what I’m talking about. I also know that doing something properly takes TIME.
That’s probably all they have available. This is an open source project. If you want a larger amount of money for being in the ‘Toolkit Performance Division’, go and apply at Microsoft.
I’ve never been totally convinced about bounties though, as they can detract from what really matters in a project.
This is a really good idea; something like this should be done not only for GNOME but also for other desktop environments such as KDE. Even non-Linux-based systems could use some work when it comes to memory leakage.
This doesn’t only help people with little RAM, it helps everyone. I hope everyone is willing to help.
http://bitsofnews.com
While I don’t find GTK+ as slow as some seem to say it is, I’d like to see some speed optimizations brought up. I’d like to know why gtk_widget_get_type is called so many damn times compared to other functions (GTK+ 2.4.14):
LD_PROFILE=libgtk-x11-2.0.so.0 LD_PROFILE_OUTPUT=. gftp-gtk
[resize and random things]
sprof libgtk-x11-2.0.so.0
% cumulative self self total
time seconds seconds calls us/call us/call name
13.92 0.11 0.11 29263 3.76 gtk_widget_style_get_valist
11.39 0.20 0.09 296387 0.30 gtk_widget_get_type
8.86 0.27 0.07 25745 2.72 gtk_container_propagate_expose
7.59 0.33 0.06 16474 3.64 gtk_widget_size_allocate
5.06 0.37 0.04 77158 0.52 gtk_container_get_type
11.39% of CPU time on finding the widget type? Called 10 times as often as just about any other function? It’s just returning a static GType… If it just returns the same value 296387 times, why bother? And almost 14% on getting a style? Or take libglib, just starting gftp-gtk and exiting as soon as it comes up:
% cumulative self self total
time seconds seconds calls us/call us/call name
36.00 0.09 0.09 79459 1.13 g_hash_table_lookup
8.00 0.11 0.02 32345 0.62 g_free
8.00 0.13 0.02 4852 4.12 g_static_rw_lock_writer_lock
8.00 0.15 0.02 1044 19.16 g_str_hash
4.00 0.16 0.01 39979 0.25 g_malloc
It very well could be gftp… but still, 40K mallocs and 32K frees just to start up? Seems excessive.
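A note on the gtk_widget_get_type count above: cast and type-check macros like GTK_WIDGET(obj) expand to a check against GTK_TYPE_WIDGET, which is itself defined as a call to gtk_widget_get_type(), so the call count roughly tracks how many casts the code performs. After the first call, the function is only a test-and-return on a cached static, as in this simplified sketch of the usual boilerplate (hypothetical MyWidget names, not the actual GTK+ source):

#include <gtk/gtk.h>

/* Hypothetical widget subclass, just to make the sketch self-contained. */
typedef struct { GtkWidget parent; }      MyWidget;
typedef struct { GtkWidgetClass parent; } MyWidgetClass;

GType
my_widget_get_type (void)
{
    static GType type = 0; /* cached after the first call */

    if (type == 0) {
        static const GTypeInfo info = {
            sizeof (MyWidgetClass),
            NULL, NULL,       /* base_init, base_finalize */
            NULL, NULL, NULL, /* class_init, class_finalize, class_data */
            sizeof (MyWidget),
            0,                /* n_preallocs */
            NULL              /* instance_init */
        };
        type = g_type_register_static (GTK_TYPE_WIDGET, "MyWidget", &info, 0);
    }
    return type; /* every later call is just this comparison and return */
}

Cheap per call, but still a real (non-inlinable) function call on every cast, which is why it shows up in the profile.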
I remember when I used Norton Desktop in 1993-94. It had more features than any light OSS DE, and I was running it on a 386SX at 33MHz with 4MB of RAM. Now I’m using KDE on a Celeron 2.4GHz (100 times faster) with 256MB of RAM (64 times larger), and it’s using much more RAM. What happened through those years? I know that we now have more bits per pixel, greater resolutions, and longer file names, but for me that doesn’t explain it.
By Dennis (IP: —.upc-f.chello.nl) – Posted on 2005-03-06 10:15:01
Why Fedora? Take a look at:
http://www.xandros.com/products/home/desktopdlx/dsk_dlx_systemreq.h…..
Xandros, I have heard, is a fine distribution, and I have heard a lot of good things about it.
Why Fedora? Well, between Fedora, Ubuntu, and Dropline Slack, those are the only distros or distro/desktop combos that are really GNOME-focused.
I have always flipped between Red Hat and SuSE since starting with Linux.
When I firmly decided on GNOME, I tried SuSE/XD2. But between the lack of attention to GNOME and SuSEconfig changing crap in the background every time I touched the GUI tools, I went with RH/Fedora.
It has its issues, but the Fedora FAQ at
http://www.fedorafaq.org
really helped with the plugins and multimedia.
It’s only the lack of a restricted-modules RPM in Fedora (for things like madwifi), menu editing being off by default, and the lack of a decent bundled package manager that really bother me.
I have downloaded SMART Package Manager and that really helped.
Now I am trying Ubuntu. It has a better package manager and restricted modules, but it’s not as slick at all. And Ubuntu needs a services-configuration tool like GNOME System Tools’ runlevel admin, and Firestarter installed by default.
I’m in the USA, and my 40-hour-a-week office programming PC is a 400MHz PIII with 128MB of RAM running Gentoo with a 2.6.10 kernel and GNOME 2.8. It’s a bit faster when I run XFCE instead of GNOME, but I prefer the bells and whistles available in GNOME to the bare essentials of XFCE plus gDesklets. I even use it for Mono development. I’d like to see MS XP and Visual Studio 2005 run on a machine with my specs… not going to happen.
Good thing, this. I hope they optimize GNOME for 128MB RAM machines; it’s far from an impossible goal. My Sun Ultra 5 has 128MB of RAM, and it would be really cool to have Solaris/GNOME actually run, instead of walking.
Good thing
By Thom Holwerda (IP: —.cable.quicknet.nl) – Posted on 2005-03-06 16:17:32
Good thing, this. I hope they optimize GNOME for 128MB RAM machines; it’s far from an impossible goal. My Sun Ultra 5 has 128MB of RAM, and it would be really cool to have Solaris/GNOME actually run, instead of walking.
I run GNOME on Solaris with 256MB.
I could not imagine running it on anything less in terms of memory.
But with 256MB it actually feels snappier on the menu redraws and other events than, say, my Fedora Core 3 machine running with 128MB.
blastwave.org really worked for me.
My experience with this is noted at the top of my Slashdot journal:
http://slashdot.org/~ACK!!/journal/
gnome-terminal and Nautilus take so long to start up on my 256MB machine, even when I’ve started them just a few seconds earlier.
I wonder where KDE would be today if it had the same resources, support and hype Gnome receives.
Gnome gets the money simply because it needs it more than anyone else. KDE simply hasn’t needed it. They haven’t needed to invest several million dollars (yes, that’s the running total) of other people’s money in creating a free file manager or a free Personal Information Manager application. In other words, it’s viable.
That’s the problem Gnome has always had. You can’t throw money and developers around at the problem like it’s a proprietary product because the rules of the game are different for open source projects. If you want to do that then sell Gnome as a proprietary product.
Gnome gets the money simply because it needs it more than anyone else. KDE simply hasn’t needed it.
Your statement is not justified, IMHO.
As far as I know, KDE uses Qt, which is more than just a toolkit. This is a large part of the development KDE doesn’t need to do, because Qt’s customers basically ‘pay’ for it.
GNOME and KDE simply have different financial structures – judging one ‘better’ than the other because one uses ‘direct’ bounties from companies while the other uses ‘indirect’ license fees is just a false conclusion.
The top output from a bare GNOME 2.9.92 desktop (Ubuntu Hoary Array 6):
Mem: 516500k total, 346192k used, 170308k free, 46812k buffers
Swap: 498004k total, 0k used, 498004k free, 179352k cached
Isn’t that a bit over the top??
This is an AMD Athlon XP 1600+ with 512 MB RAM and an Ati Radeon 9000 128MB RAM.
This should definitely improve. It’s not okay when a bare OS X desktop requires less memory without giving up performance. Hasn’t BeOS shown us how to properly code an OS/UI?
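One caveat when reading that top output: the “used” figure includes buffers and cache, which the kernel hands back to applications on demand. Subtracting them, 346192k - 46812k - 179352k = 120028k, so applications are actually holding roughly 117 MB, with the rest working as disk cache.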
That’s the problem Gnome has always had. You can’t throw money and developers around at the problem like it’s a proprietary product because the rules of the game are different for open source projects. If you want to do that then sell Gnome as a proprietary product.
Yeah David, we all know how you hate Gnome, Ximian, and Novell buying Suse, but you don’t get to dictate the rules of the game.
I just took a scan through all the bounties listed at the link and noticed a few things:
1) Many of the bounties are already fixed
2) The fixed bounties still haven’t been claimed or paid after more than a year in some cases.
3) The bounty hunters all start early on the bounties, but then there’s no progress for months while people wonder if they’re being worked on.
4) Sometimes a patch is submitted, but it never gets applied by the coordinators.
5) It’s very collaborative: many are just long threads of incomplete fixes that go on for years.
6) The patches often get lost after the original author gives up on getting them included after months or years.
7) Changes often impact other projects, whose maintainers may not want their project patched.
Frankly, I’m amazed that in some cases it takes 2 years to fix and patch a simple bug. Meanwhile, the source tree changes, rendering the original patch unusable. Don’t tell me how I should contribute to improve it; I was thinking about doing just that until I came to the realization of how horribly inefficient the whole process is. If I fix a bug, I don’t want to wait 2 years, if ever, to get paid, thank you.
Reference for this quote: http://slashdot.org/comments.pl?sid=141549&cid=11859340
I would think that the large majority have computers with 256MB and above.
But fixing memory leaks is always a good thing.
I can’t believe all this debate about GNOME/KDE. Does anybody on here actually use their systems for processing… or do they spend all their time staring lovingly at the pretty icons?
I run a Linux/Pentium II/256MB/FVWM system and tend to use it for some pretty heavy-duty scientific analysis. It works fine, and doesn’t seem to mind the lack of the unnecessary “frontend” clutter.
Gnome is just helping along the Intel/Windows drive to make equipment redundant as soon as possible.
I object!
I’ve studied CS in C, not Java.
Last year I converted an old PII with 96MB of RAM running Windows 98 to a Linux box, and the first thing I noticed (besides the headaches getting X and sound to work) was a huge drop-off in speed. Let’s face it, Win98 is lots faster on old machines than any present-day Linux GUI equivalent. Yes, Fluxbox is very zippy, but it doesn’t have the same functionality as a real desktop environment. XFCE4 is the closest to it, but even then…
I’m glad there’s this initiative. It may be the whole X windowing system that’s to blame, but why should Windows run faster on the same equipment? I love GNOME, and it’s come a long way in a short time to catch the Windows interface in slickness, but on the responsiveness front there’s way too much “drag.”
Hi,
personally I believe it should be “easily” possible to squeeze GNOME down to ~64MB without sacrificing functionality.
Just look at the kinds of constraints embedded developers work under; as a desktop developer, I think you should consider carefully monitoring what’s going on there.
After all, I don’t want to live without the bells and whistles a modern DE provides, but I don’t see why a machine should sacrifice more than 128MB for the DE.
Of all the desktop environments I’ve used, GNOME has been the most responsive. On my machine, I’ve used GNOME, KDE and Windows XP. XP is sluggish, freezes when I do anything I/O-related, and has pathetic multitasking. It could never handle the load or abuse I constantly deal out on GNOME/Linux. KDE has become very fast over the years, but on my machine it’s not as fast as GNOME and it consumes a lot more resources, RAM in particular. GNOME screams!
My hardware specification isn’t impressive. A 1.4GHz powered Athlon CPU with 256MB of RAM. I use what I think is the fastest Linux distro on earth, Gentoo, and I have GNOME compiled and optimized for size (-Os). I run a desktop optimized kernel maintained by Gentoo enthusiasts and I also use Reiser4, the fastest file system on earth. All I can say is that sometimes I feel the “GNOME/GTK is slow mantra” is bordering on trolling and unrealistic expectations.
Now, GNOME isn’t perfect. Yes, I believe optimization should be a key element in any software development cycle, not just GNOME’s. But only jobless geeks, like us, bother about RAM usage. Only jobless geeks, like us, spend all day resizing OpenOffice.org so that we can brag on osnews and slashdot about how slow GTK+ is. I’m not saying GNOME doesn’t need optimizations. I think it does. But I also think OS X needs optimizations. Heck, it’s not as responsive as GNOME is on similarly powered hardware, yet I don’t see geeks complaining.
It has occurred to me over time that the general public places phenomenally unrealistic barriers on, and has ridiculous expectations of, free software projects. Linux has to be better than Windows, BeOS, Mac OS X, Amiga, you name it. GNOME has to run faster than BeOS, look better than OS X, and support as many applications as Windows XP. MPlayer and Xine have to play all codecs known to man, but it’s cool if QuickTime and Windows Media Player can’t play half of what the free software alternatives handle without breaking a sweat. And the expectations go on and on and on. So do the complaints. It’s ridiculous.
Free software projects are community oriented. Rather than sit on your fat asses and whine all day, why don’t ya put your money where your mouth is, and learn how to contribute to solving the problem. And also, once in a fucking while, learn to say “Thank you devs, you are doing a bloody wonderful job!” Perhaps then, they’ll be motivated to fix GNOME/GTK’s ”slowness”.
Finally, my fellow geeks, this is the 21st century. The sad truth is that developers don’t want to waste their time fiddling with tedious low-level components and would rather have intelligent and automated units take care of tedious low-level programming. The resultant effect is that programmers can spend more time on design, creativity, usability and problem solving, rather than accounting and micro-management. In brief, your applications are only going to get fatter and more bloated. Save money, and buy more RAM. It stinks, but it’s the truth, and you’d never understand the issue if you aren’t a programmer.
The next generation of GNOME applications is going to be written on platforms like Mono, Java and Python. My advice again: buy more RAM. I dissed XP earlier; however, I need to say XP has some of the best animations and smoothness of any operating system I’ve used. GNOME could learn a lot from them.
Finally, my fellow geeks, this is the 21st century. The sad truth is that developers don’t want to waste their time fiddling with tedious low-level components and would rather have intelligent and automated units take care of tedious low-level programming. The resultant effect is that programmers can spend more time on design, creativity, usability and problem solving, rather than accounting and micro-management. In brief, your applications are only going to get fatter and more bloated. Save money, and buy more RAM. It stinks, but it’s the truth, and you’d never understand the issue if you aren’t a programmer.
I have never ever seen so much arrogance in one paragraph of text. If they cannot get GNOME to be usable on 128 MB of RAM, then sorry my friend, the GNOME programmers simply lack skills. And the “but who cares, RAM is cheap” argument is not a wildcard for programmers to deliver bloated and slow code.
BeOS: now that’s an OS programmed well. It runs fast even on 64 MB machines, something GNOME and KDE can only *dream* of ever achieving. Setting a goal to make GNOME usable on 128MB of RAM by GNOME 3 is a very good goal that I’m sure these people will achieve.
Thom, I’m afraid the unnamed poster above has a good point. One of the things programmers have to keep in mind is reusability. If you can develop a single solution to a problem, but make that solution generic enough to fit multiple situations, you come out ahead. In general, an efficient solution is a specific solution, meaning that you can’t reuse it as easily as the generic solution. So when it comes to software, where money really is the bottom line (and by money I mean time investment, which would be money in a business situation), this reusability comes at the cost of performance and inefficiency in computer resources. The typical solution is to create a well-designed solution FIRST and optimize LATER. That’s not to say you have to be wasteful in your initial design, but only that an extensible, maintainable design should definitely be at the top of the priority list.
I’ve never used BeOS to any great degree, and so I can’t comment on it. However, I’ve heard basically only good things about it, so I have to assume they’re founded. But you have to consider the differences between BeOS and GNOME. GNOME is available across many architectures, and GNOME is being done by volunteers, not paid individuals (I know, some people are probably paid to work on GNOME… let’s ignore that). Just a few things to keep in mind: there are differences at the core between GNOME and BeOS, things that you can’t really fault GNOME for.
Thom Holwerda,
BeOS is old and dead. Compare GNOME to XP or even OS X. How feasible is it to run either of them on 128MB of RAM? When BeOS was alive, Linux had GUIs that could run on 16MB of RAM. It still does today, if that’s your cup of tea.
I believe GNOME needs optimizations, but I believe all the other major desktop environments do too, XP and OS X inclusive. As for application “bloat”, as you guys like to call it: it’s gonna get worse. You have two options: embrace it, or live in your BeOS past.
Take a look at the next generation of killer GNOME applications: they are using platform-oriented tools like Mono and Python. These platforms by design make applications consume more memory. It would be a cold day in hell before I go back to designing GUI apps for GNOME in C/C++. I’d rather use Python, Mono or Java any day. And guess what, my apps are going to consume twice as much RAM as a C application.
Call me an arrogant programmer. But who gives a damn? I write the app, you whine about it. That balances the equation, doesn’t it?
I have never ever seen so much arrogance in one paragraph of text. If they cannot get GNOME to be usable on 128 MB of RAM, then sorry my friend, the GNOME programmers simply lack skills. And the “but who cares, RAM is cheap” argument is not a wildcard for programmers to deliver bloated and slow code.
BeOS: now that’s an OS programmed well. It runs fast even on 64 MB machines, something GNOME and KDE can only *dream* of ever achieving. Setting a goal to make GNOME usable on 128MB of RAM by GNOME 3 is a very good goal that I’m sure these people will achieve.
Ahem. Talk about the pot calling the kettle black. It is arrogant to assume that just because GNOME doesn’t fit in 256 MB of RAM, the developers must lack skills. There are constraints. The original post mentioned them.
High-level languages exist to free the programmer from low-level details. This allows them to develop code faster and cuts down the time it takes to complete an application. The result is you get an application in much less time, which lowers the cost since time == money, but the application isn’t necessarily going to perform well.
There’s a well-known saying among developers: Fast, Good, Cheap. Pick two.
I’m using a 2GHz AMD Athlon XP with 256MB of SDRAM, running Ubuntu Linux Hoary. I noticed only a few slowdowns with Nautilus while using Azureus, but I think that’s because I’m writing to my home directory and uploading at the same time, so when I open nautilus /home/ashley, my system crawls and is unusable… Whose fault that is, I’m not sure.
On another note, GNOME is fast, but much faster when using Hoary over Warty (Ubuntu).
–Ashley
Viro,
Applications written in higher-level languages can perform well if they are properly designed. They are just going to consume more resources, RAM in most cases. Hence, if you do not have enough RAM to run these applications, overall system performance begins to degrade.
No matter how much time you spend optimising an application written in a higher-level language, it will always consume more RAM. This is an intentional design compromise on the part of programmers, computer scientists and engineers.
It’s about time users started understanding the compromise as well as the solution to the problem. Bloat is not always a bad thing; at least you get an application that is relatively more secure and, in many cases, better designed and more robust.
But geeks like to get a hard-on over the most ridiculous aspects of computing, like getting Linux to run on a lightbulb just because they can. I am not encouraging sloppy programming practices; however, I’m equally not living in the past.
Well, I would not say that GNOME is unusable on a machine with 128MB of RAM.
Here at school we have machines with an 850MHz CPU and 128MB of RAM which dual-boot Win2000 and Gentoo with GNOME.
I don’t notice any performance difference between GNOME and W2K. It could be all the antivirus software and crap that bogs down W2K, but in GNOME I run Firefox with 5 tabs, XMMS, gvim, and rdesktop, and the machine is still responsive (well, not as responsive as my dual MP 2400 at home, but that is not to be expected). The only time the machine gets really slow is when running large compile jobs.
Your statement is not justified, IMHO.
It is, and what follows confirms it:
As far as I know, KDE uses Qt, which is more than just a toolkit. This is a large part of the development KDE doesn’t need to do, because Qt’s customers basically ‘pay’ for it.
Yep – you’ve got to work with what is practical for the task. That has nothing to do with KDE itself though because Qt pays for itself independently, and the customers of it get something in return – a tool they can use.
GNOME and KDE simply have different financial structures – judging one ‘better’ than the other because one uses ‘direct’ bounties from companies while the other uses ‘indirect’ license fees is just a false conclusion.
No, I’m afraid you just don’t understand this, as a lot of people don’t. Qt is independent of KDE, and is a commercially viable toolkit in its own right and pays for itself. None of the money goes directly into KDE but into the improvement of Qt, which KDE obviously indirectly benefits from. However, Qt pays for itself on its own, with KDE contributing much-needed publicity and testing resources – not hard cash, but very important. There’s an exchange going on there.
Again, looking at individual applications such as the file managers and PIM suites, the investment that has gone into them over the years, and the returns companies like Eazel and Ximian haven’t made, the cost-effectiveness comparison is crystal clear.
Why is this relevant? Because nothing comes for free, as Microsoft is so very fond of telling us (but in a different way), and certainly (depending on the software) you can’t expect everything to be free for everyone. So, GNOME and its libraries have some memory and speed issues? Considering what individual people and developers have got and what they can do with it, it’s actually pretty good. Optimisation is very, very hard, especially when you have a complex piece of software in front of you, and it goes way beyond just measuring memory usage.
As an aside, can anyone tell me what ROI Novell gets out of all this? I just wish the community could deal with this by itself, really.
…but you don’t get to dictate the rules of the game.
I don’t dictate the rules of the game – I’m explaining to you what they are.
I don’t agree with you. A DE is not an application that has to be written for yesterday; it has to be responsive and stable.
Qt is independent of KDE, and is a commercially viable toolkit in its own right and pays for itself.
I didn’t say it wasn’t. Although one might question whether Qt would be commercially viable without the KDE promotion effect. Think of Kylix. Viable on its own?
Because nothing comes for free, as Microsoft is so very fond of telling us […]
I partially agree, but some things need to be commonly available to everybody, otherwise you pay monopoly rents due to network effects. This is the part of the story Microsoft doesn’t tell.
How the effort for the common good is financed is something completely different. So yes, in a certain way, GNOME ‘needs’ the bounty money just like KDE ‘needs’ Qt, to deliver software that the project would otherwise have to develop and maintain itself.
Re: Re: Gnome gets the money, as usual
By clausi (IP: —.arcor-ip.net) – Posted on 2005-03-06 23:36:55
Qt is independent of KDE, and is a commercially viable toolkit in its own right and pays for itself.
……
How the effort for the common good is financed is something completely different. So yes, in a certain way, GNOME ‘needs’ the bounty money just like KDE ‘needs’ Qt, to deliver software that the project would otherwise have to develop and maintain itself.
Plus, didn’t a lot of KDE folks’ and corporate personnel’s time go into KDE development for years, just like RH does with GNOME?
I mean, it’s not like corporations completely ignore KDE. They just have one less champion, and only if Novell starts ignoring them for GNOME.
I thought a lot of the KDE-focused distro makers had chipped in contributions to the KDE folks, which have helped?
Am I missing something?
I agree with you. The public needs to be educated about the limitations developers face. Do you want software that runs really fast, has fewer features (though you might argue this is a good thing), has the smallest possible footprint, and costs a fortune since it took far longer to develop, or software that was quick to develop and usable, but consumed more resources and didn’t perform as well?