Consider these memory requirements for Fedora Core 2, as specified by Red Hat: minimum for graphical use, 192MB; recommended for graphical use, 256MB. Does that ring any alarm bells with you? 192MB minimum? I’ve been running Linux for five years (and am a huge supporter), and have plenty of experience with Windows, Mac OS X and others. And those numbers are shocking, severely so. No other general-purpose OS in existence has such high requirements. Linux is getting very fat.
I appreciate that there are other distros; however, this is symptomatic of what’s happening to Linux in general. The other mainstream desktop distros are equally demanding (even if not quite as much as Fedora; Arch Linux or Slackware, for example, will run GNOME on 128MB, but not very comfortably once you load two or three apps at the same time), desktops and apps are bloating beyond control, and it’s starting to put Linux in a troublesome situation. Allow me to elaborate.
A worrying tale
Recently, a friend of mine expressed an interest in running Linux on his machine. Sick and tired of endless spyware and viruses, he wanted a way out — so I gave him a copy of Mandrake 10.0 Official. A couple of days later, he got back to me with the sad news I was prepared for: it’s just too slow. His box, a 600 MHz system with 128MB of RAM, ran Windows XP happily, but with Mandrake it was considerably slower. Not only did it take longer to boot up, it crawled when running several major apps (Mozilla, OpenOffice.org and Evolution on top of KDE) and suffered more desktop glitches and bugs.
Sigh. What could I do? I knew from my own experience that XP with Office and IE is snappier and lighter on memory than GNOME/KDE with OOo and Moz/Firefox, so I couldn’t deny the problem. I couldn’t tell him to switch to Fluxbox, Dillo and AbiWord, as those apps wouldn’t provide him with what he needs. And I couldn’t tell him to grudgingly install Slackware, Debian or Gentoo; they may run a bit faster, but they’re not really suitable for newcomers.
Now, I’m not saying that modern desktop distros should work on a 286 with 1MB of RAM, or anything like that. I’m just being realistic — they should still run decently on hardware that’s a mere three years old, like my friend’s machine. If he has to buy more RAM, upgrade his CPU or even buy a whole new PC just to run desktop Linux adequately, how are we any better than Microsoft?
Gone are the days when we could advocate Linux as a fast and light OS that gives old machines a new boost. BeOS on an ancient box is still faster than Linux on the latest kit. And to me, this is very sad. We need REAL reasons to suggest Linux over Windows, and they’re slowly being eroded — bit by bit. Linux used to be massively more stable than Windows, but XP was a great improvement and meanwhile we have highly bug-ridden Mandrake and Fedora releases. XP also shortened boot time considerably, whereas with Linux it’s just getting longer and longer and longer…
Computers getting faster?
At this rate, Linux could soon face major challenges from the upcoming hobby/community OSes. There’s Syllable, OpenBeOS, SkyOS, ReactOS and MenuetOS — all of which are orders of magnitude lighter and faster than modern Linux distros, and make a fast machine actually feel FAST. Sure, they’re still in early stages of development, but they’re already putting emphasis on performance and elegant design. More speed means more productivity.
To some people running 3 GHz boxes with 1GB of RAM, this argument may not seem like an issue at present; however, things will change. A 200 MHz box used to be more than adequate for a spiffy Linux desktop, and now it’s almost unusable (unless you’re willing to dump most apps and spend hours tweaking and hacking). Back then, we Linux users were drooling over the prospect of multi-GHz chips, expecting lightning-fast app startup and super-smooth running. But no, instead, we’re still waiting as the disk thrashes and windows stutter to redraw and boot times grow.
So when people talk about 10 GHz CPUs with so much hope and optimism, I cringe. We WON’T have the lightning-fast apps. We won’t have near-instant startup. We thought this would happen when chips hit 100 MHz, and 500 MHz, and 1 GHz, and 3 GHz, and Linux just bloats itself out to fill the extra power. You see, computers aren’t getting any faster. CPUs, hard drives and RAM may be improving, but the experience of actually using the machine stays pretty much static. Why should a 1 GHz box with Fedora be so much slower than a 7 MHz Amiga? Sure, the PC does more – a lot more – but not over 1000 times more (taking into account RAM and HD power too). It doesn’t make you 1000 times more productive.
It’s a very sad state of affairs. Linux was supposed to be the liberating OS, disruptive technology that would change the playing field for computing. It was supposed to breathe new life into PCs and give third-world countries new opportunities. It was supposed to avoid the Microsoftian upgrade treadmill; instead, it’s rushing after Moore’s Law. Such a shame.
Denying ourselves a chance
But let’s think about some of the real-world implications of Linux’s bloat. Around the world, in thousands of companies, are millions upon millions of Win98 and WinNT4 systems. These boxes are being prepared for retirement as Microsoft ends support for those OSes, and this should be a wonderful opportunity for Linux. Imagine if Linux vendors and advocates could go into businesses and say: “Don’t throw out those Win98 and NT4 boxes, and don’t spend vast amounts of money on Win2k/XP. Put Linux on instead and save time and money!”
But that opportunity has been destroyed. The average Win98 and NT4 box has 32 or 64MB of RAM and a CPU in the range of 300 – 500 MHz — in other words, entirely unsuitable for modern desktop Linux distros. This gigantic market, so full of potential to spread Linux adoption and curb the Microsoft monopoly, has been eliminated by the massive bloat.
This should really get people thinking: a huge market we can’t enter.
The possibility of stressing Linux’s price benefits, stability and security, all gone. Instead, businesses are now forced to buy new boxes if they are even considering Linux, and if you’re splashing out that much you may as well stick with what you know OS-wise. Companies would LOVE to maintain their current hardware investment with a secure, supported OS, but that possibility has been ruined.
Impractical solutions
Now, at this point many of you will be saying “but there are alternatives”. And yes, you’re right to say that, and yes, there are. But two difficulties remain: firstly, why should we have to hack init scripts, change WMs to something minimal, and throw out our most featureful apps? Why should newcomers have to go through this trouble just to get an OS that gives them some real performance boost over Windows?
Sure, you can just about get by with IceWM, Dillo, AbiWord, Sylpheed et al. But let’s face it, they don’t rival Windows software in the same way as GNOME/KDE, Moz/Konq, OpenOffice.org and Evolution. It’s hard to get newcomers using Linux with those limited and basic tools; new Linux converts need the powerful software that matches up to Windows. Linux novices will get the idea that serious apps which rival Windows software are far too bloated to use effectively.
Secondly, why should users have to install Slackware, Debian or Gentoo just to get adequate speed? Those distros are primarily targeted at experienced users — the kind of people who know how to tweak for performance anyway. The distros geared towards newcomers don’t pay any attention to speed, and it’s giving a lot of people a very bad impression. Spend an hour or two browsing first-timer Linux forums on the Net; you’ll be dismayed by the number of posts asking why it takes so long to boot, why it’s slower to run, why it’s always swapping. Especially when they’ve been told that Linux is better than Windows.
So telling newcomers to ditch their powerful apps, move to spartan desktops, install tougher distros and hack startup scripts isn’t the cure. In fact, it proves just how bad the problem is getting.
Conclusion
So what can be done? We need to put a serious emphasis on elegant design, careful coding and making the most of RAM, not throwing in hurried features just because we can. Open source coders need to appreciate that not everyone has 3 GHz boxes with 1G RAM — and that the few who do want to get their money’s worth from their hardware investment. Typically, open source hackers, being interested in tech, have very powerful boxes; as a result, they never experience their apps running on moderate systems.
This has been particularly noticeable in GNOME development. On my box, extracting a long tar file under GNOME-Terminal is a disaster — and reaffirms the problem. When extracting, GNOME-Terminal uses around 70% of the CPU just to draw the text, leaving only 30% for the extraction itself. That’s pitifully poor. Metacity is hellishly slow over networked X, and, curiously, these two offending apps were both written by the same guy (Havoc Pennington). He may have talent in writing a lot of code quickly, but it’s not good code. We need programmers who appreciate performance, elegant design and low overheads.
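If you want to check the terminal overhead on your own box, a rough test is simply to time an extraction with the filenames printed to the terminal versus discarded; the tarball name here is only an example, any large archive will do:

    # extract verbosely, so the terminal has to render every filename
    time tar xvf linux-2.6.7.tar

    # same extraction, but throw the listing away so nothing gets drawn
    time tar xvf linux-2.6.7.tar > /dev/null

If the two wall-clock times differ wildly, the cycles are going into drawing text, not into the extraction itself.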
We need to understand that there are millions and millions of PCs out there which could (and should) be running Linux, but can’t because of the obscene memory requirements. We need to admit that many home users are being turned away because Linux offers no performance boost over XP and its apps, and in most cases it’s even worse.
We’re digging a big hole here — a hole from which there may be no easy escape. Linux needs as many tangible benefits over Windows as possible, and we’re losing them.
Losing performance, losing stability, losing things to advocate.
I look forward to reading your comments.
About the author
Bob Marr is a sysadmin and tech writer, and has used Linux for five years. Currently, his favorite distribution is Arch Linux.
If you would like to see your thoughts or experiences with technology published, please consider writing an article for OSNews.
Funnily, I did try Fedora Core 2 on a 128 MB Linux-certified machine (that was before I upgraded it to 384 MBs recently). I knew that FC2 required 192 MB minimum for graphical, but I didn’t want to nuke my FC1 on my other machine that has 512 MB of RAM, so I decided to give it a quick TEST shot on that Duron machine with 128 MB. The result:
FC2 *was unusable*. And I mean, *unusable* with either KDE or Gnome. Things would take ages to load, or wouldn’t load at all. I could only *kinda* use FC2 at 128 MB when I switched to XFce.
On the same machine, with 128 MB memory, I also tried Xandros, Mandrake 10, Arch Linux and Linare Linux. Of the bunch, Mandrake was the one that was “a bit” heavy, but all in all the machine remained usable (NOT comfortable by any means, but usable if you wanted to do a quick job with it). FC2, however, was really not usable at 128MB, and because that gave me a glimpse of what’s coming soon, I actually decided to upgrade that machine (I didn’t have any incentive to upgrade it before; it is not my primary machine, I just use it for some tests).
Now that X is being developed again, things might improve; the weakest part of Linux has always been the GUI and X. KDE and Gnome have added a lot of polish over the years and now look really slick, but this comes at a performance cost, which improvements to X might fix.
Still, RAM isn’t exactly expensive anymore, and running WinXP on less than 256MB of RAM is pretty bad too.
It is not all that bad, it is just about choice. For example, I installed Libranet (which is quite user-friendly) on 128MB and 64MB 400MHz machines. With IceWM and Opera that works quite well, and it isn’t really more difficult to use than e.g. Win9x.
Yep, I agree that KDE and Gnome are bloated these days.
BeOS didn’t earn its cool reputation for nothing. BeOS can make 6-year-old hardware feel fast.
What is the solution for Linux? Copy everything that BeOS does. Run the legacy kernel on top of the L4 micro kernel. Have all the desktop features use the micro kernel and multi-threading directly.
Just a thought. Oh, and before you say it, micro kernels will make a difference in this case. Why? Look at BeOS driver management. Drag and drop.
With every progression, the requirements for any OS go up. Gnome, however, is known to be slow right now compared to the latest KDE, so I’d expect that. I wonder if the author has swap on, as it works really well (a quick way to check, and to add some, is sketched below). My experience is that the new 2.6 kernel isn’t friendly with older hardware compared to the 2.4 series, so for things like Mandrake 10 I’d install 2.4, which is in fact what I did on a relative’s PC, and it ran pretty well. It was a PII at 400MHz. The killer here is the RAM; as long as you have something like 192MB you’re good, hell, even 128 will do. It’s like my friend who had a Celeron 500MHz: it ran Windows 2000 slower than the PII, and my guess is that’s exactly because it only had 64MB of RAM!
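For anyone who wants to check, something like this works from a root shell; the swap file path and size are just examples:

    # see whether any swap is active, and how memory is currently used
    swapon -s
    free -m

    # if there is none, create and enable a 256MB swap file
    dd if=/dev/zero of=/swapfile bs=1M count=256
    mkswap /swapfile
    swapon /swapfile
    # add "/swapfile none swap sw 0 0" to /etc/fstab to keep it across reboots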
Anyway, the people installing OSes generally aren’t Joe Six-pack; it’s usually their geek friend or the office tech, so installing Vector Linux or any low-end thing should be fine. There is used RAM out there these days on eBay, since the market artificially inflates SDRAM prices when they technically should be worth dirt by now. Actually, if one has a PII I’d tell them to buy a new computer for 500-600 dollars. They can get an Athlon XP cheap, as well as a graphics card for a good price.
Running XP with less than 256MB of RAM (if you’re going to do more than play solitaire) is a disk-thrashing nightmare. 512MB is comfortable.
Microsoft recommends 128MB minimum and claims that it’ll run, albeit badly, with a mere 64. I’d rather use an abacus than try that. The point is, 256MB is really not so much considering XP was released in 2001 and FC2 was released in the middle of 2004, YEARS LATER. Why should an OS that has been evolving over the last three years (since XP was released) be expected to conform to the system requirements of an OS that’s been out for years? Following that logic, the XP system requirements (which dwarfed 98’s requirements, and came out roughly as far apart as XP and FC2) were just as horrible and alarming as this person seems to think Fedora’s are.
I’m forced to wonder if this guy was clutching his Pentium 200 to his chest and screaming, “It’s not fair! It’s just not FAIR!” when XP came out … I mean, that P200 Classic would run 98 just FINE, how DARE Microsoft make 300MHz the recommended spec for XP!
Or maybe I just don’t get it.
Linux is not getting fat. Fedora, or any other distro with those requirements, is. Keep the word Linux in context with the kernel and we are a lot less troubled. If you choose to run KDE/GNOME2 and then add GDM and all the bells and whistles (gdesklets for example)… expect to use some RAM up.
Secondly, why should users have to install Slackware, Debian or Gentoo just to get adequate speed? Those distros are primarily targeted at experienced users — the kind of people who know how to tweak for performance anyway. The distros geared towards newcomers don’t pay any attention to speed, and it’s giving a lot of people a very bad impression.
I still don’t understand what the giant obsession with pleasing everyone is. I think there is a distinction between making a distro for ‘newcomers’ and just making a user-friendly distribution. Just because Mandrake is easy to install doesn’t mean it is made for newcomers. Windows isn’t the Microsoft OS for newcomers; it’s just damn easy to use, period. I think the same goes for easy-to-run Linux.
>Running XP with less than 256MB of RAM is a disk-thrashing nightmare
I am running XP with 256 MB of RAM daily. That’s my PRIMARY machine, a dual Celeron 533 and 256 MB RAM running XP PRO.
I run IE, OE, Winamp, Notepad and Trillian at the same time with no problems at all. Things only get a bit tighter when I need to use an IDE or PaintShopPro, but overall, 98% of the time I only use the 5 apps mentioned above, and 256 MBs are more than enough to run those. At least for my needs, it runs great at 256 MB.
Something that has needed to be said for a while. Look, I love GNOME, but on my eMac with Debian, it’s slower and takes more memory than Mac OS X while doing less. That is pretty lame, and I definitely hear you on GNOME Terminal, it is VERY slow. I see the GNU/Linux niche being in performing well on low-end systems, with benefit to third world countries and budget conscious companies, but at the current rate it just isn’t happening. Windows is *so* much more responsive than GNU/Linux at the moment and I don’t care whether the blame lies with X or GTK or the language, all I care about is that it is, and so does every other consumer. Look at an old NeXT system, or look at the Contiki OS and how much they do with so little, it’s embarrassing. Yes, times have changed and the level of complexity has increased, but people shouldn’t need a 1.5GHz system with 512MB RAM to decently browse the web, e-mail, type up letters and listen to music.
I did try to run RedHat 8.0 on my Toshiba 64MB 433MHz Celeron laptop, and it was totally unusable. I was a n00b back then (well, I’m still a newbie, but at least I know the basics); after reading comments on how good Linux was compared to Windows, I thought RH would run on my laptop. That experiment brought me back to reality, so to speak.
I didn’t switch back to Windows though. I installed Debian with Xfce, and it runs like a charm. My 3GHz, 1GB RAM P4 box is running Mandrake 10.0. My friend, new to Linux, needed an OS for his old AMD K6-2 box, which had Win98 before until it got totally trashed, so I offered him old RedHat 6.2, which runs quite well. At least if our hardware can’t run the latest Linux, we can get old versions and still use our computers.
But Linux apps could use better coding; I’ve got apps that get totally slow even on a P4!
Fedora has some steep requirements, and suddenly the “Linux platform is getting fat?”
Guess I’m hallucinating the memory footprint of my Gentoo installation.
…which is almost the spec of my dual-boot system, and that’s where you start noticing how much slower Gnome 2.6 is than XP Pro.
When I first got this system back in January I was sitting in Linux most of the time, for the mere fact that getting everything up and running on Gentoo, including wireless, with both desktops set up just the way you want them, is a part-time week in itself. I thought Gnome looked good. The fonts were alright after I spent some time tweaking.
Well, for the past month and a half or so I’ve been in my XP Pro partition, mostly working in Eclipse. Once I enabled ClearType things looked about 1000% better in Windows, and Eclipse just tends to run better in Windows, plus with Firebird, what the hell.
The other night I decided to play this very old Blade Runner DVD that Windows Media Player just doesn’t handle for whatever reason, but I knew that a program I had on Gentoo would handle it.
Well, I hadn’t been in Linux for quite a while and man, I just didn’t like what I saw. The fonts just look like crap compared to what I’m used to with ClearType, and Gnome 2.6 (even on a P4 3.2 GHz, 1 gig of RAM, and an ATI 9600 Pro card) just seemed sluggish compared to Windows.
Yeah, yeah, I know I can run Fluxbox or whatever, but why should I? With a firewall, a router and Firefox I’m not getting viruses. I know how to keep my system clean, so why should I even mess with Linux?
and it costs less than $40.
So, your point is FC2 won’t run on your old Duron box. OK, that’s a point. It won’t, and I think the Fedora people intended it that way.
Then you claim “Linux is getting fat” …
Linux is a kernel. It runs on machines with 2MB of RAM quite well, depending on kernel version.
GNU/Linux is an OS. It runs on anything from an embedded system with 2MB of RAM to an IBM 390 with GBs of core.
This post is a troll…
I agree 256MB is realistic
Have a look on any major PC manufacturer’s website. Until recently (I haven’t looked in a while but maybe even still now) most laptops and PCs were coming with 256MB RAM as standard. Though in my experience, having a standard set of apps (office suite, mail program and a browser) on that setup will begin to swap like hell on Windows after a couple of weeks of daily use.
Linux, it seems, will at least stay at the same level of performance without degrading over time, but XP on 256 is fine to begin with.
@Mark
you can’t keep narrowing things down to the kernel: normal users don’t know what a kernel is! They point at their screen and say “This is my Linux.” or “This is my Windows.”
They don’t care what makes it slow.
And normal users don’t like to hack to speed up any OS. And I agree that Linux needs better coding and usability. Some examples:
When I copy text in OpenOffice via the context menu, why can’t I paste it into Mozilla Composer via the context menu?
Why don’t I have a universal installer service? I don’t want to have to bother about 24 missing libraries.
I use Mandrake 8.2 for my web server at home and I tried to upgrade to Mandrake 9.2… but KDE 3 was so buggy that I returned to Mandrake 8.2 with KDE 2…
This seems like a good place to ask — my company might be distributing our new show control app as part of a custom Linux install CD. We’d like to have a Linux distro that (a) can be installed by someone who doesn’t know a thing about Linux, other than “put the CD in the drawer, reboot, click Next until it’s done”, (b) auto-recognizes all reasonably recent (<5 years old) hardware and auto-configures it (including networking), and (c) runs as snappily as possible — an ugly, fast GUI would be preferable to a pretty, sluggish one. (Our customers previously ran our app under BeOS, and they put a premium on responsiveness.) It would also be nice (but not strictly required) if it had the capability to run directly from the CD, and if it didn’t install a bunch of esoteric extra stuff that won’t be needed. Any recommendations regarding distros to try for this?
Dude, Gentoo Linux running KDE 3.2.2, OpenOffice.org and Mozilla is slow as fsck too on 128MB RAM. Do you really dare to deny that?
I know, because I use Gentoo on a 366MHz box with 160MB ‘o RAM. I would not DARE to run KDE on it. I use IceWM instead, but it’s still painstakingly slow to run FireFox, OpenOffice.org and aMSN together.
However, the blame isn’t just Linux. It’s the apps. KDE, OpenOffice.org, Firefox…just to name a few. They are terribly huge. But no one likes to optimize for free, so you won’t see that changing.
You can say a lot about Microsoft, but Office and MSIE start pretty damn fast and use less RAM than their open source counterparts. Unless you like to compare lynx to MSIE and Abiword to MS Office, of course.
Just for the record, I have just booted into KDE, and started only aMSN, Firefox, konsole and kdict. Memory footprint:
774680 TOTAL
263724 USED
17096 BUFFERED
124584 CACHED
That’s 260MB used already.
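To be fair, that figure includes buffers and cache, which the kernel gives back as soon as applications ask for memory; reworking the numbers above gives a rough idea of what the programs themselves are actually holding:

    263724 used - 17096 buffered - 124584 cached = 122044 KB, or roughly 119 MB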
At that time I ran Mandrake 7.1 on a PIII 450MHz with 64 MB. I switched to Slackware (8.0 or 8.1). It was another world. Try Slackware. It’s another way. I haven’t yet tried Gentoo or Debian. I’m still on Slackware. (Though I changed to a P4 1700MHz with 256 MB.)
I also tried BeOS R5. I was impressed: it’s fast, fast, fast (but W98 is fast too on a P4). It’s great. But it’s dead, more or less, and Zeta is something strange. I think they don’t have the BeOS source code, so they hack and hack here and there without the possibility of really improving the kernel code, etc.
Another problem with BeOS. It’s not multiuser. Sadly.
BTW. On the office front TextMaker and Planmaker are good light (commercial) alternatives for OpenOffice.
>Running XP with less than 256MB of RAM is a disk-thrashing nightmare
what are you talking about…
I had a Celeron 400 with 128 MB RAM,
and Win XP PRO and Office XP were usable on that machine. Not fast, true, but not unbearably slow.
Now I’ve upgraded the machine to a Celeron 533 + 256MB.
It’s actually pretty fast (not fast enough to play games, but fast enough for IE + Winamp + a chat program).
Things will probably continue to get worse as far as desktop bloat goes when you consider that, say, you’re primarily a Gnome user but like to use that one KDE app. Well, by using that one KDE app you’re probably bringing in 3/4 of the KDE desktop libraries as well.
I guess that’s the price you pay for the “freedom to choose”.
…You can’t eat the cake and have it.
If one expects the system to have all the bells and whistles: a beautiful interface with lots of themes and decorations, everything plug and play, support for all types of multimedia formats, etc., one should expect higher requirements.
I agree. However, XP and Windows 2003 Server do that better than Fedora/SuSE/Mandrake in terms of memory requirements and CPU needed. So, there will always be some comparison going on.
This article is right, apps for Linux are getting slower and slower. Let’s take my notebook for example – a Sony Vaio 450MHz PIII with 320 MB of RAM. I had Mandrake installed before – it was cool but sloooow – so I’ve switched to Gentoo – fast and light, very kewl.
I use it for mail, web browsing, and working (I code web apps). And I can’t find a decent editor which does 3 things: code highlighting, tabs for multiple documents, and customizable shortcuts the way I want.
I’ve tried them all, to name a few: gedit, screem, quanta, bluefish, eclipse, kate, anjuta… and you know what? Except for Kate, which I use now, they were all unusably slow when editing a file with about 1000-2000 lines of code – after pressing enter I had to wait something like 10-15 seconds for the editor to become usable again – and when you code, you press enter quite often.
To spice things up I can say that running EditPlus with Wine was faster than using the apps I mentioned above, and this is very pathetic…
Why do applications natively written for Linux run slower than a program running in an emulator?
Bad code? Bad design? I don’t know, but this indeed is alarming…
I use Slack 9.1 on my Celeron 434 MHz with 256MB RAM. It was installed with Slack 9.1’s default Gnome 2.4 and X(still)Free 4.3. It used to use around 150MB of RAM right after the whole system loaded up with X and Gnome.
Surprisingly, after upgrading to Dropline Gnome 2.6 (with X.Org bundled), the RAM used decreased to about 90MB. I noticed that both the previous XFree 4.3 and X.Org take about 11MB of RAM. Everything feels fast, faster than previous versions. Bloat? Not happening to me…
Linux is a kernel. It runs on machines with 2MB of RAM quite well, depending on kernel version.
GNU/Linux is an OS. It runs on anything from an embedded system with 2MB of RAM to an IBM 390 with GBs of core.
This post is a troll…
Uh, no, Andrew/Mark. You’re barking up the wrong tree. If you exclude the Linux kernel from what’s “fat”, then you also have to exclude the Windows XP kernel. It’s pretty small, too.
The things that make an OS fat are the things that users interact with most commonly: shells, apps, etc. Not the kernel. But, regardless of how you want to characterize “Linux”, it is judged by what’s included by default when you set up a distribution. People don’t install Linux and say, “Wow, I really like the speed of the kernel — but KDE really blows chunks perf-wise.” They blame the entire stack because (a) most don’t know what a kernel is, (b) even if they did, they don’t have visibility into the kernel to differentiate between bloat there and in the apps.
Computers are getting pretty fat too!
>Linux is a kernel. It runs on machines with 2MB
>of RAM quite well, depending on kernel version.
The title of the article is about the “Linux platform”, meaning the desktop and surrounded apps, NOT just the kernel.
I wish Linux supporters would stop using the same argument over and over when someone says something negative about the *platform* and happens to use the word “Linux” simply because it is generic enough and convenient. We all know what the author meant, so there was no reason for the trivia.
Why do applications natively written for Linux run slower than a program running in an emulator?
Bad code? Bad design? I don’t know, but this indeed is alarming…
There’s an old rule-of-thumb that, given a set of resources (CPU, GPU, memory, FPU, I/O devices, etc), applications will grow to consume all possible resources. This is so incredibly true. It is our nature (as human beings) to never be satisfied with what we have — and add more features. Over time, Linux is going to continue to bloat and leave old hardware behind. This isn’t a bad thing, in itself. There’s a price to be paid for progress. But none of us should have the unrealistic expectation of being able to load Linux upgrade-after-upgrade on the same hardware year-after-year and expect that the perf will be the same or better. Just doesn’t happen. Software developers get used to setting new hardware baselines, just as politicians get used to setting new tax baselines. It’s inherent.
I am running XP with 256 MB of RAM daily. That’s my PRIMARY machine, a dual Celeron 533 and 256 MB RAM running XP PRO.
I run IE, OE, Winamp, Notepad and Trillian at the same time with no problems at all. Things only get a bit tighter when I need to use an IDE or PaintShopPro, but overall, 98% of the time I only use the 5 apps mentioned above, and 256 MBs are more than enough to run those. At least for my needs, it runs great at 256 MB.
I don’t believe that for a moment. I run XP Pro on an Opteron with 512M. If I’m running more than one program, it can take as much as a minute just to flip windows between programs. From my experience, XP needs at least 1G of RAM to run comfortably with multiple programs.
I’m not talking monster programs either. I’m talking about FireFox, Total Commander, and maybe something like Azureus. The disk thrashing on 512M is HORRENDOUS in XP Pro. By comparison, FC2 on the same machine is many times faster and more responsive.
The article is just FUD, and so are some of the responses. Let’s hear a little truth for a change instead of blind astroturfing.
I agree in general with this article, but it’s a bit overblown. Why not just say “no real desktop OS runs the latest apps without at least 256 mb RAM” and be done with it? Some perspectives:
– I recently watched the latest Knoppix fail to even start KDE on a new 2.4 gHz Dell Dimension with 128 mb RAM (since with Knoppix there’s no swap). That’s annoying.
– XP or OS X *will* run on 128 RAM … but they churn horribly because they’re dependent on swap/virtual memory in that case too. For our clients (nearly all Windows shops) we insist on 512 mb minimum for all new XP desktops. We’ve been doing this for at least the last year. The increased productivity is more than worth the measly $50 in RAM!
– I run the latest Suse 9.1 quite comfortably on a Pentium II 400 mHz with 384 mb RAM. The same machine ran Win2K quite snappily as well. But take away the RAM and it probably wouldn’t boot KDE either.
– And we run dev servers on the latest Mandrake and Trustix distributions … running LAMP + Samba + Postfix + a few others only requires about 90-100 megs of RAM, leaving plenty of room on a 128 mb machine for multiple httpd processes and a PHP bytecode shared memory cache like mmcache or php accelerator. These are Pentium-class machines and they respond nicely … let’s see Windows Server 2003 even try to boot on one of those!
Are Gnome and KDE bloated? Well, haven’t they always been? And consider that most of the big distros like Suse are basically running them both all the time, because most people want both KDE and GTK apps … and they’re separate from the window server, which is separate from the kernel and so on. All of which are cross-platform code. Compare that to XP or a hobby OS like BeOS or SkyOS which vertically integrates all those components and is written for a specific chip architecture (x86 only) and of course a Linux desktop is going to be more bloated.
But if I can run the latest desktop distro on a 6-year-old Pentium II with 384 mb RAM, who cares?
> I run XP Pro on an Opteron with 512M. If I’m running more
>than one program, it can take as much as a minute just to
>flip windows between programs. From my experience, XP needs
>at least 1G of RAM to run comfortably with multiple programs.
I don’t believe a word you say, not one bit. I use XP on a 256 MB Athlon-XP 1.3 GHz machine, as well as on many others, and it runs great. Either your installation is hosed, or you blatantly lie.
Recently, having to give up my big boxes, I was forced to recover my old K6 200MHz box with 60MB RAM (and 4MB integrated graphics, yuck!) from the garden shed. Now, I only have a W2K license, so I installed that. Then, as I was beginning to install Mandrake, the CD drive died. The Mandrake install CD must still be in there… Maybe it was a hardware detection probe, maybe it just died through neglect.
So W2K with 60MB RAM and 200MHz – quite dog slow! I think I have to consider myself lucky that I didn’t install Mandrake before I lost my CD!
Give VIM a try, it’s likely to suit your needs.
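For the three things you listed, something along these lines is a reasonable starting point; this is only a rough sketch, the mappings are arbitrary examples, and Vim uses buffers rather than tabs to keep several files open:

    # append a few settings to ~/.vimrc
    cat >> ~/.vimrc <<'EOF'
    " code highlighting
    syntax on
    " switch between open files without saving each time
    set hidden
    " nicer completion when opening files with :e
    set wildmenu
    " example shortcuts for cycling through the open files
    map <F5> :bprevious<CR>
    map <F6> :bnext<CR>
    EOF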
If one expects the system to have all the bells and whistles: a beautiful interface with lots of themes and decorations, everything plug and play, support for all types of multimedia formats, etc., one should expect higher requirements.
I’d have to disagree. The author mentioned Syllable, among others, so I’ll pick up on it now. Syllable can boot from power-on to login window in around 16 seconds, even on machines as slow as, e.g., an AMD K6 233. It is usable in 64MB (which is a lot, but we’re hoping to actually bring that number down in future). A typical Syllable system is running the appserver, the Media server, the Registrar and the Dock. With a setup like this, you can play media from WAVs to XviD MPEG-4 video.
Syllable has low overhead because we’ve tried to make it that way. We’re mindful of increasing memory usage or anything that might slow the computer down. We don’t introduce large dependency trees which require tens of additional libraries or applications to be loaded to support another application (which is quite possibly Linux’s biggest problem). I fail to see why modern Linux distributions can’t do the same things.
Let me guess: is it an LG CD-ROM drive? The Mandrake hardware probe in 9.2 kills LG drives due to some LG firmware bug. This can be fixed, just search with Google; afaik LG released a firmware update that solves this…
There are a number of small, fast distros that boot right off a CD with great hardware detection and install easily, some of them surprisingly small and light! Search on distrowatch.com or Google and you’ll find them. I should warn you though, you likely won’t find a perfect fit, and may have to roll your own based on an existing one.
I know I’m going to get martyred for this, but the article is right; sure, it’s not Linux itself but the included apps that are the problem. As much as I like Linux, I will not deny that ever since I started using it I kept wondering where I was supposed to find all that extra speed everyone was talking about.
In my case I’ve found KDE 3 to start apps faster than Windows XP Home, but in KDE I use Konqueror and KMail, and in Windows I use the Mozilla suite, which is heavier (more features) than Konqueror and KMail, and therefore an exception can be made for the extra few seconds it takes to load.
I can’t say anything for Microsoft Office because it’s been a long time since I’ve used it, but Corel WordPerfect Office in Windows has a lot more features than OpenOffice.org and starts up faster (albeit OO.o is also available on Windows, Linux and MacOS).
I know how miserable it is to try and install any Linux distribution (except for the antiquated Debian woody) on old hardware, never mind run it, because the minimum requirements have been increasing so fast. Using an old distribution isn’t always an option, because those no longer meet the software requirements for running new apps, so the only two options now are either to use source-based distributions or to buy a new computer every two years.
IMO Linux will survive for a long time to come because of the $0 price tag on most distributions, the free developer tools, and KDE of course, but something does need to be done to resolve the minimum-requirements issue, or the next free OS with free developer tools that comes around is going to outperform Linux and get all its users.
I get the impression that a lot of the people who commented either didn’t bother to read the entire article or didn’t bother to read it at all. There is mention of source-based Linux distributions being a possible solution, but as the article said, how is a newbie supposed to manage installing a distribution like Gentoo (yes, newbies do end up having to do their own installs; they don’t all have a seasoned Linux veteran to turn to)?
It’s getting to the point now where it would be more worth people’s time to buy a used copy of Windows 95/98 off eBay and use that with free tools like ZoneAlarm, Grisoft AVG and Spybot Search & Destroy rather than use one of the latest Linux distributions, even if a lot of them are free.
quote:
I’m not talking monster programs either. I’m talking about FireFox, Total Commander, and maybe something like Azureus. The disk thrashing on 512M is HORRENDOUS in XP Pro. By comparison, FC2 on the same machine is many times faster and more responsive.
endquote:
My XP box is just fine with the apps you mentioned, except Azureus. Azureus is my favourite BT app but bogs down on either platform due to Java.
But this article hits what I’ve found right on the head.
My box is a 1.6 GHz but only 128 megs of RAM, and most things run like shit, to put it bluntly. I’m running Deb unstable and using XFCE4. XFCE4 is light but things seem to pile up quickly. I switched to XFCE4 from IceWM as I wanted to try and have a uniform desktop environment using mainly GTK2 apps.
Before, I used to just go for speed exclusively, but ended up with a mishmash of apps with all these different toolkits, looking ugly as sin, and with interoperability issues with copy/paste etc. So I went GTK2 and things aren’t much better. Mozilla and a few other apps open and it’s not responsive at all. And as one other pointed out, you can’t seem to get it all with one toolkit, so I use k3b instead of the GTK2 offerings and that brings with it a lot of KDE’s bloat.
My gf says to me every time she uses Linux that “LINUX IS SLOOOOWWWW”. I respond with “I’ll tweak it” but can never seem to get good performance from this box. I compile my own kernel with just the bare minimum things I need for this hardware platform. I’m on the latest 2.4 series; I’ve tried the 2.6 series several times and always run into swap issues with 2.6. I don’t think 2.6 handles minimal RAM too well at all. I’ve exchanged several emails with Andrew Morton on the swap issue but nothing is resolved so far.
I would be happy if most developers went into a feature freeze for 6 months and just optimized the shit out of their apps. Maybe not the most exciting thing for a programmer, and it might make Linux look a little dated on some fronts, but I think it would be worth the effort. Besides, the next Windows has been delayed for a while yet, so there is a good window of opportunity.
Think about it: 3 big selling points for Linux (ignoring open source, of course) were speed, stability and security.
XP gives people the speed, and now the stability, that previous versions of Windows didn’t have, and Microsoft is working heavily on security with the next version. And we can laugh off Windows and security, but they don’t stop on something until they have it. They might be slow as hell getting there, but they will approach a much higher level of security than they have today, all the while still providing their EASE OF USE that is sorely lacking in some areas of Linux. And don’t give me the crap that it’s just what people are initially used to. Because with millions of people out there the desktop is what counts, and if they have to go to an xterm once, you’ve failed.
Look, I’m not a Windows fanboy. Far from it. I actually would like to see them wither away, but I gotta call it like I see it.
I don’t know what exactly your definition of “comfortable” is. I have a PIII 700MHz laptop with 128MB RAM, and XP Pro runs with acceptable speed. This laptop also has Slackware 9.1 installed (with a 2.6.6 kernel) and it’s quite a bit slower. It is still usable, just noticeably slower. Have a couple of Firefox windows open along with a konsole (or gnome-terminal) and I’ll see lots of disk swapping.
KDE does deserve some credit because the upgrade to 3.2 makes things a lot faster, although it still eats up a lot of RAM. But Firefox is getting annoying. The Windows version is acceptable, but it’s pretty slow on Linux.
Maybe you should try running XP and Linux on a low-end machine first before saying people are spreading FUD just because they have a different experience.
(Btw, if your Opteron with 512MB RAM takes a minute just to flip between windows, maybe you should check whether there’s something wrong with your hardware or your XP installation. Even my XP Pro on my laptop can do better than that.)
I use Mepis on a PIII 700, 384MB RAM, 32MB Viper AGP card, dual-booting Mepis and Win2000. 256MB of memory is the MINIMUM in today’s world; if you don’t want to add a little memory, stay with what you have. On this machine Mepis runs at 99% of the speed of 2000 in loading programs, stability, etc. When I had a Permedia2 8MB AGP video card, 2000 ran great and Mepis ran pretty slow. ALL of the distros I have tried in the last 3 years have run much better with a better video card. Built-in sucks, PCI was much better, but the AGP slot sped everything up in both 2000 and Mepis. As for bloat: 2000 + WordPerfect + MediaPlayer9 + dbPoweramp + ZoneAlarm + AVG AntiVirus + AdAware = 3.74 gig on my HDD. Mepis, which includes everything I need: 1.93 gig. To me Win2000 is bloated, and a security mess to boot. I’ll take my bloated Mepis any day. If people want to switch they will just have to learn, just like they did when they started using Windows.
Actually, I would have to disagree with this article. I guess Bob Marr has never tried VectorLinux. It is VERY lightweight. It doesn’t require 128MB+ of memory. People have used Vector on machines with less than 128 and run Gnome/KDE on it just fine. It may seem weird that it’s based on Slackware and still is EASY to use. Check it out sometime: http://www.vectorlinux.com It even boots far faster than any distro that I have tried, and I have tried more than my share.
Sorry, saying that XP Pro needs at least 1G is simply not true. I was forced to use it on an old notebook (Thinkpad TP600) with 128MB and a 233MHz PII processor. It was not super fast, but definitely usable – with such applications as MS Office, Outlook etc. On my current machine (a PIII 900 notebook with 256 MB) XP performance is better (the difference is not astounding but noticeable) than Fedora Core 2 (for the same applications – for example Firefox) – while running KDE; Gnome is much slower. And XP Pro is way faster at booting – 3-4 times faster in fact.
I must agree with the article’s author – but I also think that speed is not the only thing becoming a problem; the general quality of applications is getting worse (perhaps because the apps are getting more complex – gone are the days of simple, text-mode-only apps not depending on complex libraries). The OS kernel is probably still more stable than – say – the XP kernel, but I would say that the entire GNU/Linux OS (as perceived by a user – including desktop environment, applications etc.) is much less stable (speaking about “standard” distributions such as Fedora, Mandrake, Suse etc.) than Windows XP. And this is *very* frightening…
Blah blah blah, lots of anecdotal evidence which doesn’t amount to anything.
Programs do more than they did X years ago. They require resources to do so. So if you want to run some program today at the same speed the same program ran X years ago, you need more resources (this obviously only holds for mature programs).
No, you can’t run the latest and greatest with all the fancy stuff on your ten year old Pentium 90MHz with 24 MB RAM. The latest and greatest will always require more resources than what some people have.
Try interpolating between the requirements for Windows XP and Longhorn, and then plot the requirements listed for FC2 on the same graph. I don’t think it even looks unreasonable.
…to be Slashdotted and OSNewsed (200 posts by tomorrow morning), and Google Newsed
This is exactly the problem with the Linux desktop: developers mean well, but I don’t feel they have considered how many CPU cycles their programs take. Maybe it is also a synchronization problem in their programs too.
Bob Marr wrote: “Why should a 1 GHz box with Fedora be so much slower than a 7 MHz Amiga? Sure, the PC does more – a lot more – but not over 1000 times more (taking into account RAM and HD power too). It doesn’t make you 1000 times more productive.”
Couldn’t agree more. Plus, you could buy three books for the Amiga platform and know everything about it. The free software movement needs to organize; this is getting nowhere.
Any tips people have for optimizing Linux (and I mean the platform and all that entails, not the kernel)?
I don’t think saying “go the Gentoo way” counts; I was excited about that, but I’ve read many comparisons where a Gentoo install wasn’t any faster than many other popular distros.
So any tweaks anyone knows of, post ’em.
I don’t believe a word you say, not one bit. I use XP on a 256 MB Athlon-XP 1.3 GHz machine, as well as on many others, and it runs great. Either your installation is hosed, or you blatantly lie.
Typical response of the astroturfer. “You must be doing something wrong.” It’s not Windows, it’s the user. Your other choice is just downright insulting. If you can’t defend the product, attack the consumer. The system is as described, it is properly installed, and I don’t blatantly lie.
Anticipating another attack, the drive is a 7200RPM 120G ATA133 drive with 8M buffer. It’s been recently defragmented. I didn’t say it ALWAYS takes a minute to flip windows, but that it CAN take that long. I’ve found that rebooting the computer every other day clears that up pretty well. The longer the computer is run without rebooting, the worse the thrashing gets until it takes more than a minute to just pull up menus. XP has bad memory fragmentation issues that get worse as the system is used, particularly if you use multiple programs.
It is an unarguable fact that hardware continues to improve at a torrid pace, and that the minimum hardware shipped on systems, RAM for example, continues to increase. To argue that this should create an acceptance for inefficiency in software, that people should just expect the requirements for running software to increase drastically, is completely illogical. Good programmers like Steve Gibson, Robert Szeleney, the people at .theprodukkt, as well as others, prove that good programming practices result not just in excellent functionality and pleasing appearance, but do so without prohibitive performance hits.
It is also true that today’s computer users expect more from their computing experience than when the “P200” was the standard. More features and abilities added to a program, done well, and given the ability of today’s computers to number-crunch, should justifiably increase the footprint of software, but just barely compared to what we’re seeing. ESPECIALLY should this be true of an OS, which is to be the middle-man between a user and the hardware. It is pure marketing hype, and an attempt to keep technology sales up, to suggest otherwise.
If everyone applied the same standard to software as they did to hardware there would be far more accountability for poorly written software, and security failures. If a machine, or a part on a machine breaks down people take it back on warranty, they not only complain, but expect something to be done about it. If software fails, or causes serious problems, unless it affects the hardware, there is much complaining but less action demanded, because we are indoctrinated to expect problems or bloat and inefficiency.
This article addresses a gradual trend that IS a problem, and not just for Linux, but for software in general.
My main machine is a 1.1GHz Athlon with 512 MB RAM running IceWM on Slackware 9.1. I like it this way.
I also have a 1.2GHz Celeron with 256 MB RAM running Windows XP.
I find most of the ‘I have (tiny box) and it runs GREAT’ and ‘I have (monster box) and it SUCKS’ to be a little hard to believe. I have a mediocre box and XP runs in a mediocre way.
Before the drive died, I had a second install of Slack on the Celeron and it easily outperformed XP.
But almost everybody seems to be missing the point: the author specifically states that Joe User probably *isn’t* going to want Slack and Ice. He wants a GUI distro and Gnome and/or KDE. In other words, MS makes one system. Ipso facto, it’s their best system (allowing for differences in ‘home’ and ‘pro’ and ‘server edition’ and blah blah that Joe User doesn’t care about). So Joe User also wants the quote-unquote best Linux system, which he takes to mean the latest and greatest most user-friendly distro with the IDEs.
No kidding, storage and core are cheap. To many citizens of industrialized nations, that is. But if you’re a dude in a third world country who can’t afford to *feed himself*, upgrading hardware is *not* cheap.
The author’s point was that Linux is blowing an opportunity to put first class systems on second class boxes in third world countries (or on poor Americans’ boxes or whatever).
If the reaction is defensive and making excuses, Linux is truly screwed. If ‘bloat’ isn’t a problem, why are so many Linux users so dismissive of Mozilla and hyped about Firefox? (I use Mozilla, thank you – have to pick your battles and Mozilla is just too cool to mess around with any LightningPanda.) We know bloat is a problem but when somebody else points it out and for far better reasons than ‘My FPS in CS sucks’ he gets insulted for it? Weird.
Even if there were no other reason than pride in clean, efficient code, that should be enough to want to keep things as slim as possible.
Hi there!
I have a Celery 900MHz Desknote with 256MB RAM 10 GB 5400 IDE HD, and a dual PIII 450/384MB 36GB SCSI LVD Matrox G400.
Galeon, Evolution and Gnome-terminal (GNOME 2.6, Debian Sid) load on login on separate virtual desktops, and both machines are quite perky. All the apps are GNOME and use GNOME shared libraries, thus reducing RAM use. OpenOffice takes a while to load, but once there is pretty fast. Totem plays back DVDs flawlessly on the dual PIII without even a skip, and is so easy to use! Couldn’t do that under Windows on that hardware!
Nautilus in 2.6 is FAST, and I like the new browsing modes. Much like Mac OS 9 which I have played with and like.
Just my 2c – Mandrake is definitely slower due to a whole lot of plug and play smarts it seems.
RAM, as pointed out, is cheap. I run Fedora Core 2 on the slowest computer in the house, a K6-2 350, BUT it does have 448 MB of RAM. I noticed quite a performance boost between Core 1 and 2, and that’s due to good programming, i.e. GNOME 2 is now very fast, KDE 3 remains as fast as ever, and both are only getting better. Comparatively, I used to run Windows 2000 on this machine and it was sluggish. I finally get the speed I used to get from Windows 98, in Linux. All you need is more RAM; you can keep your old computer.
Well, with all this discussion about rewriting GNOME in Java or C# to make it a more developer-friendly platform, it’s definitely going to take a performance hit.
Linux is getting bloated, just like everybody else.
I remember clearly the days of Red Hat 4.2, the very first Linux I tried. It ran comfortably at 32MB, and was serving something like 70 simultaneous FTP users (MP3 downloaders!) over an E1 (2.048Mbps) line. I remember upgrading to 64MB and things were still fast even if there were 200 simultaneous FTP users on board. And it was a cacheless Pentium 133 MHz. To think that Slackware users then were telling me that RH4.2 was “fat.”
I just noticed that given around 128MB of RAM or less, Windows (Win2K; haven’t tried XP with this little RAM) performs better with its GUI and Office than Linux. Of course, once the RAM reaches 512MB or so, Linux performs better than Windows, even on machines with slower CPUs.
Nothing new. In 1995 Niklaus Wirth wrote an article that explains a lot:
http://cr.yp.to/bib/1995/wirth.pdf
br
Marko
Everyone here is moaning about not being able to run Gnome or KDE on really old, memory-constrained systems…
Installing Linux on an AMD 233 with 64MB RAM, but KDE runs too slowly?
Emm, did the PC not have Win9x installed before?
Why do you not want to use IceWM? It is basically a Linux version of the Win9x interface, and it will run faster than the Win9x interface. Oh, and OO will run on it too.
No one with even a bit of sense would try to install Win2000 or XP on that machine, so why would anyone try a DE?
I totally agree with the writer. The huge potential of spreading Linux onto the Win98/WinNT machines is there; we need to grab it.
There is no excuse for creating fancy apps which consume a lot of RAM.
Developers, please treat efficiency as a goal as important as functionality for your apps.
Up until about a year ago, I had a friend set up on a PII 300Mhz with 192MB of RAM. Unfortunately the harddrive died, but when the machine was still alive it ran SUSE 8.2 and subsequently Fedora Core 1 very well. This was with KDE in SUSE 8.2 and GNOME in Fedora Core 1. The machine ran very well. It was left on for a whole semester basically. My friend wrote papers, chatted, browsed the web – even did some basic GIMPing – very comfortably. I don’t think that 300Mhz with 192MB of RAM is outrageous for a distribution made in 2003.

On my computer right now (which is a very nice computer, AMD 2600+, 512MB of RAM, 7200RPM harddrive), I have FC2 with GNOME 2.6. I have eight virtual desktops filled to the gills with applications – including the GIMP, OpenOffice, Inkscape (several windows with 1.5MB SVGs in them), Scribus, Epiphany, Gaim, gedit, Evolution, Muine, shiny Crystal icons, Straw, and about a dozen Nautilus windows. I can flip through the virtual desktops as quickly as I want and not feel a bit of slowdown. When in XP, however, clicking the start menu typically results in a 3-4 second wait, subsequently hovering over “All Programs” causes another long wait, and opening more than 4-5 programs brings the system to its knees and an inevitable crash. Linux makes me far more productive.
I think memory footprint and CPU requirements in a lot of FOSS are a problem at the moment. But I think it is starting to get better. Optimisation is hard, takes time and you shouldn’t do it during the main part of development – it’s something you do afterwards.
For example – KDE now has a kde-optimize mailing list dedicated to speeding up KDE. A large amount of work is going into profiling and optimising the environment. That’s why 3.2 is so much faster than 3.1. With the next release of Qt, things will get better still.
Decius raises a few interesting points.
First of all, I agree completely with accountability and poor quality of software. I never understood why a software company can’t be held responsible if their product causes damage.
Yes, programs were written to be more efficient previously. There is a reason for this: computer time was more expensive than human time. This meant that it made sense to spend lots of manhours making something run faster or use less memory.
But that is no longer the case. Computers are dirt cheap compared to human resources. Today it makes good business sense to increase the productivity of the programmer by letting him write in higher level (and slower) languages at the expense of requiring more computer time.
I for one am not going to sit around and handoptimize assembly code to make it run faster. A clever programmer can always do this, but it takes a lot of time, it will likely introduce some new bugs and it hurts portability.
Programs can be written much faster today than they could only a few years back. They may also require more resources and that is exactly the tradeoff we’re seeing.
Really? Your 512MB Opteron cannot run Firefox on WXP Pro fast enough?
Either you’ve got the wrong/unoptimized drivers, or maybe your system is loaded down with spyware and viruses.
I tried using a Pentium III 500 (the one with 512K of L2 cache that runs at half the CPU speed) with 256MB with WXP Pro, and it was fast and very, very usable.
There is no discussion about rewriting GNOME in C# or Java. There is discussion about allowing core components to be written in C# or Java. Big difference.
M$ is evil and does bad business practice, but they surely know how to make a polished user experience. I can’t help it, but XP feels somehow faster than Linux + KDE. It’s more responsive, and it’s easier on XP to turn off the goddamn eye candy.
And I noticed that Linux’s software developers tend not to give a shit about backward compatibility, a thing that always seemed to be a primary focus for M$ developers.
There are known places to cut out bloat. The way stock icons are handled is inefficient, causing apps using libgnomeui to load the icons into memory several times. Metacity has some “low-hanging fruit” optimizations that Havoc Pennington has recently published for those interested in optimizing the WM (and he is a very talented coder – that was a lame troll thrown in by the author; the reason that GNOME Terminal is slow on some machines is Pango, the text renderer, which does not receive very good acceleration from X at the moment). The biggest GUI speedups are going to come from the new X technologies – most of which are already incorporated into X.org CVS. For true legacy machines, however, the author paints too bleak a picture. There are usable options that use far less RAM – the best being XFCE. OpenOffice, in my opinion, is the biggest bloat problem at the moment. AbiWord and Gnumeric are great alternatives for most tasks, but there is a need for a lightweight presentation application. Perhaps OO.org 2.0 will help to solve this problem.
XP, in my experience, only appears to boot faster. I have a very reasonable boot time with FC2. And more importantly, when it looks like the boot process is done, it actually is. Once I see those panels in GNOME, I know it’s ready to use. In XP, sure, I see the desktop pretty damned quickly, but then the system tray is (often invisibly) loading who knows what for another minute or so before I can actually use my machine. While this way of doing things may seem better at first, it often confuses the user and compels them to open an app too early, making the boot process even longer as the system struggles with all the demands placed upon it.
Well, I’m running two boxes at home at the moment:
Main: XP, Athlon 2.8, 1GB DDR 400, 200GB HDD (ATA133 2x 7200rpm disks), Geforce 4 4800.
Secondary: Mandrake 10, Duron 1.2, 256MB ram, Geforce 3, 1 x 80GB 7200rpm disk.
Mandrake 10 is dramatically faster than the 9.2 I was running on there before, and in general it is faster than WinXP for normal applications. Internet Explorer is an exception, because it is pre-loaded, of course, but a pre-loaded Konqy is not too far off. One thing I do find is that Mandrake is faster to respond to activity than XP in many situations, because XP is always doing stuff in the background, even while idle. This can lead to several-second pauses between a click and its registration. Sometimes the thing just gets plain busy and takes a while to redraw or sort itself out after a large memory-grabbing application like Photoshop closes.
Honestly, it is faster to me, but these things do vary between systems. My Athlon 2100+ laptop feels a lot slower under XP for some reason (still 512MB DDR and dedicated graphics), which I assume is the hard drive spindle speed coming into play.
Oh, it isn’t fair to suggest that KDE is actually becoming slower or more bloated with each release; it is actually becoming faster and lighter on memory, thanks to a lot of optimisation work being done – ultimately thanks to the joys of valgrind. KDE 3.2 will substantially out-perform 2.2 on the same system, which is a good thing.
Both WinXP and Linux+GNOME/Mozilla are really bloated for low-end machines.
With Linux you can choose a lighter window manager. If you go deeper, you can turn off unnecessary daemons and trim down your kernel as well. In the extreme case you can ditch the GUI altogether.
With WinXP you are stuck with the bloat all the time, and there is not much you can do about it except turning off some services.
I use Fedora Core 2 on a P166 with 128MB RAM, and I am pretty happy with it. WinXP on such a box is pretty much useless – even too slow to play Solitaire.
1. A couple of years ago we heard all this about GNU Linux being secure…
Now we all know that ain’t true; that’s what the BSDs are for (and in particular OpenBSD).
2. GNU Linux is user-friendly
This one is actually heard every now and then; however, we all know that it simply isn’t AS user-friendly as XP, BeOS, SkyOS, etc…
3. GNU Linux is so efficient.
Well, it’s getting fat; speed is the domain of BeOS and the like – slim desktop systems…
4. Linux is Free
LOL, yah right!
I hate to bring this back up again, but it has to be done. I can’t remember where I read this, perhaps in Tanenbaum’s Operating Systems book, or perhaps online, but what he preaches seems to be right on the money. I think a lot of things in Linux suffer from this problem: “Perfection is reached not when there is no longer anything to add, but when there is no longer anything to take away.” I think KDE 3.2 has taken a step in the right direction and has admitted that bloatedness had been a problem in the past. I think a lot of the problems with Linux today have occurred because of the addition of new features, and I question how useful they truly are. I remember using Linux five years ago on the same machine I have today, and I remember being able to do everything I can do today, while my machine didn’t lag behind trying to keep up. As more and more code gets written, I believe, more and more code has to be audited, optimized and reviewed. The easiest way to speed things up isn’t faster hardware, it’s code review. The level of complexity Linux is reaching is on par with Windows systems, which is extremely hard to deal with.
Well I don’t believe you. You said,
I’m not talking monster programs either. I’m talking about Firefox, Total Commander, and maybe something like Azureus. The disk thrashing on 512MB is HORRENDOUS in XP Pro. By comparison, FC2 on the same machine is many times faster and more responsive.
I get the exact opposite behaviour on my AMD K6-2 450MHz with 384MB RAM.
Try the following registry tweak, although I should say that it didn’t thrash before the tweak either. The first value keeps more stuff in RAM, so there is less paging to the paging file.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
“DisablePagingExecutive”=dword:00000001
“LargeSystemCache”=dword:00000001
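To check whether the tweak actually took effect after a reboot, here is a minimal sketch that reads the two values back using Python’s standard winreg module (this assumes Python is installed on the Windows box; the key path and value names are taken from the tweak above):

# Minimal sketch: read back the two memory-management values set by the tweak
# above. winreg is part of the Python standard library on Windows.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    for name in ("DisablePagingExecutive", "LargeSystemCache"):
        try:
            value, _ = winreg.QueryValueEx(key, name)
            print(f"{name} = {value}")
        except FileNotFoundError:
            print(f"{name} is not set")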
I’ve noticed that XP has a lot higher swappiness than Linux. It starts using its page file immediately on my machine (512MB RAM), whereas it is really a struggle to get Linux with kernel 2.6 to actually start using its swap partition.
That probably partly explains why XP is faster on low-memory machines.
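For anyone curious what their own box is doing, here is a quick sketch (assuming a Linux kernel recent enough to expose /proc/sys/vm/swappiness, i.e. 2.6 or later) that prints the swappiness setting and how much swap is currently in use:

# Quick look at how eagerly this Linux box swaps: print vm.swappiness and
# current swap usage. Assumes a 2.6-or-later kernel with these /proc files.
def read_swappiness():
    with open("/proc/sys/vm/swappiness") as f:
        return int(f.read().strip())

def swap_usage_kb():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.strip().split()[0])  # values are in kB
    return info["SwapTotal"], info["SwapFree"]

if __name__ == "__main__":
    total, free = swap_usage_kb()
    print(f"vm.swappiness = {read_swappiness()}")
    print(f"swap in use: {total - free} kB of {total} kB")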
I don’t know if many of you are aware of the recent trend Red Hat has taken in compiling their distribution. The reason the memory requirements are up, yet the reviews say FC2 is really snappy (which I agree with), is that the main and biggest software packages such as OO and Moz are compiled with prelinking support; this means that the libraries are loaded when the machine starts up, and so the requirements are quite steep – especially seeing as many unneeded services are also fired up on a default install. I might recommend doing some homework before whining about RAM requirements, because as end users you just see the requirements go up and don’t understand what is happening in the backend. At least that’s what I understand the situation to be. Anyone feel like correcting me?
As far as interface design goes, I think GNOME has your minimalist attitude pegged. Seriously, though, we’re always looking for optimizations. I can think of several proposals for major optimizations floating around the GTK community right now. The community is definitely looking to speed up performance. One of your other statements, though – that you could do everything you can now five years ago – is ridiculous. There is now near-100% MS Office compatibility for all major applications. There is real desktop integration. There are high-quality raster and vector graphics. There are advanced music-playing applications like Muine and Rhythmbox. There are photo management apps like F-Spot and gThumb. The GNOME project was started only seven years ago and wasn’t usable for a year or two after that. Things have progressed a lot. In fact, I’ve found the rate of progress stunning.
The author is right. The problem is that, for most developers, software performance is not a focus of their efforts; so long as that is the case the problems described in the article are only going to continue.
But they don’t have to. When developers put their efforts into optimizing for performance real results are possible: for example, Apple’s efforts have made Panther (10.3) noticeably faster than Jaguar (10.2) on the same hardware. This article talks about how Apple did it:
http://www.kernelthread.com/mac/apme/optimizations/
I’m afraid the open source model may have more trouble getting people focused on software performance than, say, Apple or another corporate developer. I’m not trying to knock open source here — I’m all for it — but it isn’t clear to me that anyone (or any group) has the incentives to take responsibility for optimizing overall performance in many open source development projects. I think it’s probably fair to say that about bigger projects like KDE or GNOME. Anyone disagree?
The biggest cause of bloat in Linux seems to stem simply from sloppy programming.
* Memory leaks. Because Linux (the kernel) reclaims a process’s memory when it exits, you can get away with these to a limited extent in desktop applications that are opened and closed frequently.
* Scripting Languages. I’m getting sick of these. Starting a single gdesklets application takes up 36MB of RAM (taking X server memory into account) to display a simple weather status application. Rather than being ‘glue’ to hold various applications together, these languages are being used for entire applications. Do it properly, or don’t do it at all. (A toy sketch of the long-running-applet problem follows below.)
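To make the long-running-applet point concrete, here is a toy sketch (purely illustrative – it is not how gdesklets is actually written) of a polling applet that keeps every reading it has ever taken. Because the process never exits, the kernel never gets a chance to reclaim anything, so memory use grows for as long as the applet runs:

# Toy illustration only: a long-running "applet" that appends every poll
# result to an unbounded list. Memory grows steadily because nothing is ever
# pruned and the process never exits.
import time

history = []  # grows forever

def poll_sensor():
    # Stand-in for fetching weather data, CPU load, etc.
    return {"timestamp": time.time(), "reading": 42}

def update():
    history.append(poll_sensor())  # the fix would be to keep only the last N entries

def main(iterations=10):
    for _ in range(iterations):  # a real applet would loop indefinitely
        update()
        time.sleep(0.1)
    print(len(history), "readings held in memory")

if __name__ == "__main__":
    main()

Capping the history (for example, keeping only the last hundred readings) or simply exiting between updates would avoid the growth; the point is only that an applet that never exits never gets bailed out by the kernel’s cleanup at process exit.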
You could’ve told him to switch to AbiWord, Firefox and XFCE4 if he wanted speed.
… Or Gentoo.
I disagree. Firstly, the core developers aren’t all volunteers anymore, so if the premise of your argument was that volunteers don’t want to hack on something as boring as optimization, you’re off the mark. In fact, many developers are paid and full-time. I know of several instances where performance has been the primary focus of application development. The 2.6 kernel saw preemption added, the CFQ scheduler, etc. These are all improvements to speed up desktop performance. I remember recently seeing that Miguel de Icaza was pushing Larry Ewing to find out why the icon view in F-Spot was slower than in some other app. And there are the other recent optimization discussions I’ve referred to. Secondly, volunteers in many cases care more about optimization than employees. A lot of open source hackers take great pride in writing efficient code. That will never change.
What the author is referring to are the distributions that are bringing Linux to the world.
The world doesn’t care if some hippie down the street can run OpenOffice with just 32MB RAM (exaggerating here). The world knows Linux as what the enterprises – IBM, Red Hat, Novell, etc. – portray it to be.
PastyHermit: You mean preloading, not prelinking. Prelinking is the process of pre-resolving all the symbols in shared libraries, which can improve performance for programs using lots of shared libraries.
Preloading seems a pretty poor way to improve subjective loading time to me. Instead of waiting longer the first time a program loads, the user is forced to wait longer on boot, even if he never wants to use the app.
At Edward:
Oh, so if people can’t code applications so they conform to your high standards, they can’t make applications at all?
I am sure lots of people enjoy gdesklets regardless of whether they are written in Python. I don’t use them myself, as I don’t look at my desktop background much. Please don’t assume that just because you don’t like something, nobody will.
It’s not like you’re losing something because someone codes something you don’t want. If people code free software, you can only gain from it.
I don’t think so. Whilst fixing problems with my 2.6.6 kernel, I had the opportunity to boot back and forth between 2.4 and 2.6, and kernel 2.6 is really faster! The next thing I will do is use the Con Kolivas patches to autoregulate swappiness and use the staircase scheduler.
KDE 3.2 also brought speed improvements, as did OO.org 1.0 and 1.1. Whilst taking part in Cooker testing, I immediately noticed that Mandrake 10 was faster than 9.2.
It is true that the priority for Linux apps over the last few years was a feature catch-up, but performance has definitely improved, and now that KDE and GNOME are quite feature-rich, I am sure these guys will continue optimising.
And lastly, it’s always been my experience that Linux remains usable whatever the load (kernel compile, CD burning and so on), whereas Windows (XP included) privileges the heavier apps, making the system hard to use whilst ripping CDs, for example. I’d rather have applications take six seconds more to start but then be usable with heavy background activity, rather than the other way round.
All these full-time developers at Novell, Red Hat, etc. have top-of-the-line machines and don’t notice the slowdown.
It’s not like you’re losing something because someone codes something you don’t want. If people code free software, you can only gain from it.
True. However, I’m very worried at the way Perl/Python is becoming a general programming language for applications that are meant to run all the time. I’ve always felt that scripting languages are for binding two separate programs together, or for simple programming of uncommon tasks (say, gTweakui), rather than for writing a text editor.
—
Interesting note: I just fiddled with the themes in my GNOME install. Switching from a Pixmap theme to an XFCE theme made a huge difference in the ‘usability’ speed of the system*, and dropped RAM usage from 289MB by around 100MB for a fairly small app load (Epiphany, System Monitor, Nautilus and Beep Media Player). I wonder how much of Fedora’s speed issues are caused by an over-abundance of eye candy.
—
(* For reference, the system is a K7 XP 2000 with 768MB DDR on GNOME 2.6/Debian)
Brad Griffith made a good point. The reason that GNOME Terminal is so slow at drawing characters is Pango, and Pango in turn uses the RENDER extension. Read http://www.osnews.com/story.php?news_id=5453 . An interesting quote:
A big bottleneck right now in GTK+ performance is the poor performance of the RENDER extension drawing anti-aliased text. Even without hardware acceleration, it could be tens of times faster than it is now. I’m hopeful that the X server work currently ongoing on freedesktop.org will result in that being fixed.
So they are aware of it. And it is not Pango’s fault. Pango may look bloated if you don’t care about internationalization, but it does a much better job of placing anti-aliased East Asian text than (say) Qt. And that is very important to me.
And the miracle-working Apple optimizers mentioned just a little bit ago are using Lisas, right?
What the employees use doesn’t affect a corporation’s need (or lack thereof) for optimizations. And, to emphasize that volunteers are most certainly interested in optimization, I know of many independent GNOME developers who run very modest machines. Ross Burton (developer of Sound Juicer for GNOME) needed a hard drive donated to him so that he would have enough space to do a build of CVS GNOME, for instance.
Unlike with Windows XP, you can run something like Fluxbox and GTK+ 1.x apps and still have a snappy desktop on a P166 with 80MB of RAM.
Really, the only missing component for the minimal-hardware desktop is for Dillo to be just a little bit more complete.
As others have pointed out, it’s not Linux that is the problem, but the desktops and their associated monster libraries, along with X, which is quite speedy but still takes a sizeable chunk of RAM.
I could see some of these newer hobbyist OSes that have a stricter development path taking a chunk out of the low-end Linux market, which would be pretty ironic.
I totally agree with this article. I too have a story about the speed of applications on the Linux desktop.
My story is like this:
I have a Celeron 433MHz with 32MB RAM at my house. I run Mandrake Linux 9.2 and Windows 98 on it. I am happy using Linux on it, but when it comes to speed I am not as happy as I am with Windows 98.
I was using Blackbox as the window manager before, and was quite happy when I saw XFCE and started using it. But all the GNOME/KDE applications are still very slow.
Mozilla and OpenOffice are unusable, whereas I use IE and MS Office very happily on Windows 98. On XFCE I have only two desktops; it becomes very slow if I go beyond two.
It’s very painful using GNOME Terminal, so I still stick to rxvt. I even tried the 2.6 kernel, but found that it performed very poorly compared to 2.4.22.
At my workplace I have a 1.4GHz, 256MB machine running Debian (Sarge). All applications work slightly better here, but I still have problems with application speed. When I log in to my machine in the morning, it takes about two minutes for all the windows to redraw. If I keep running Mozilla continuously for a week or two, the machine slows down. It takes about half a minute to see the splash screen of OpenOffice after starting it from the command prompt.
Conclusion:
I hope the application developers give importance to the performance of the applications rather than keep on adding features. I agree that Linux means the kernel, not the desktop environment – but a normal computer user does not know anything about the kernel or the desktop environment. He or she just wants to use it to get their work done quickly. I hope importance will be given to the speed and efficiency of the applications in the near future.
I think the author is trying to say what Linux distros need to strive for so that they could take over some Windows installs in the business arena. I think he just fails to realize that it isn’t possible. He says how Linux used to be so fast; the reason for this was that it didn’t have nearly the features it does now. Before, your only options were the very slim apps that he even mentions, but now there are better things out there. It’s called progress, so get over it. If he wants blazing-fast speeds like those of the old Linux days, then just stick to the command line and no GUI at all. Everything you need can be done from a console, from IM to web browsing to document editing.
Also, there is no reason why a user can’t upgrade his hardware anymore. Just think about it: a user could buy Windows XP Professional for around $250 and have it work decently on his older PC, or he could buy 1GB of RAM for his computer for about $100 (search pricewatch.com) and then use Fedora Core 2 at a cost of $0. You will have $150 more in your pocket, plus 1GB more RAM in your PC and a far better OS on your hard drive. Also, if you were going to use XP you would have to buy a hardware firewall plus antivirus software just to have a usable system. And believe me, you do need a hardware firewall. A software-based solution is as weak as the OS it runs on. Case closed.
I’m curious how smart the Linux linker is – as in, how smart it is when bringing in code from libraries. I remember an article some time back about how certain linkers were very aggressive about pulling in only the exact code that was needed from a library for an application.
No, he meant prelinking.
Beginning with Fedora Core 1, Red Hat introduced prelinking in their distro. This is the reason you may sometimes see CPU usage go high while your machine is idle.
Although from what I’ve read, C++ apps like KDE actually benefit from this, whereas GNOME does not.
I am just going by stuff I’ve read here at OSNews. I am not an expert in the matter.
e.g. http://freshmeat.net/projects/objprelink/
Don’t be silly. Havoc writes great code. It is just that he is a little ahead of his time. 🙂 Pango and Metacity use X extensions that are either under-optimized or not very well supported over the network. And X will improve.
Amen to that! Remember the C64 and the astounding things developers could achieve with so few resources by today’s standards; the limitations of the platform forced developers to be careful about what they wrote. Ignoring memory and CPU limitations usually meant your creation would simply crawl.
By contrast, today’s bigger, faster machines impose no such limitations. Developers are free to write any old slop and get away with it, and it really shows! If it runs too slow, you just have to get a faster machine.
As an ex-BeOS user myself, I am all too aware of what a well-designed OS should feel like.
IMO the biggest problem with Linux is Linux itself. Its monolithic design is coming back to haunt us, and as time goes on it’s only going to get worse. You can only speed-hack it so much, not to mention the usability issues that seriously impact its widespread acceptance. (Can you say drivers?)
When it comes to the perception of speed, a better X would certainly make a world of difference, but that will do nothing to address the deeper underlying problem: a lack of developer concern for tight, light code.
Your 433 Celery isn’t a speed demon, but it isn’t exactly chopped liver either. That 32MB of RAM is killing you, though. Since you mentioned that you have a job, I would advise you to spend $10 and pop in another 64MB stick at the very minimum.
IMO the biggest problem with Linux is Linux itself. Its monolithic design is coming back to haunt us, and as time goes on it’s only going to get worse. You can only speed-hack it so much, not to mention the usability issues that seriously impact its widespread acceptance. (Can you say drivers?)
You are clearly talking out your ass. The structure of the kernel has sod all to do with the way the applications are structured.
(If having a monolithic kernel was slower, then I wouldn’t get better fps in Enemy Territory in Linux than in Windows would I?)
I’m running FreeBSD 4.10 loaded with GNOME 2.6.1 (with some 2.6.2 components) and I could never understand why the terminal was so crappy. This is on a K6 550MHz, SiS 540 graphics chip, 256MB RAM and a 60GB 7200rpm hard disk, and yet the responsiveness of GNOME Terminal is as though I have 1,000 users logged onto my machine leeching resources left, right and centre.
Please, someone fix it up. I don’t see this in KDE on the same machine, so why does it exist in GNOME Terminal? Please, my kingdom for a responsive GNOME Terminal!
I have a machine with 64MB RAM, a 233MHz CPU and a 2GB HD running Windows XP – below the requirements – and it runs fine, just as slow as Windows 98 or 95. A GNOME desktop on Red Hat 8 couldn’t keep up with it: incredibly slow, and it kept freezing.
Funny, I haven’t heard many complaints about the speed of the Linux kernel. In fact, it seems the server market is drooling over it. And the Apache web server is not really known for bloat or loose, heavy code. The platform is developing really well. A faster X will make a huge difference. Other improvements have helped desktop performance a lot – the CFQ scheduler, kernel preemption. Linux adoption is increasing exponentially. And the platform is improving faster than any other as far as I can tell.
(If having a monolithic kernel was slower, then I wouldn’t get better fps in Enemy Territory in Linux than in Windows would I?)
99% of the dual-boot rigs out there get better fps in Windows than in Linux. Linux has closed the gap, but Windows invariably runs games better.
Just curious – what graphics driver are you using?
KDE 3.2 is nice and snappy on my 266MHz box with 160MB RAM.
Of course, it’s running Gentoo…