The center of gravity is shifting away from the traditional, massive operating systems of the past, as even the major OSes are slimming their footprint to make code bases easier to manage and secure, and to increase the variety of devices on which they can run, InfoWorld reports. Microsoft, for one, is cutting down the number of services that run at boot to ensure Windows 7 will run across a spectrum of hardware. Linux distros such as Ubuntu are stripping out functionality, including MySQL, CUPS, and LDAP, to cut footprints in half. And Apple appears headed for a slimmed-down OS X that will enable future iPhones or tablet devices to run the same OS as the Mac. Though these developments don’t necessarily mean that the browser will supplant the OS, they do show that OS vendors realize they must adapt as virtualization, cloud computing, netbooks, and power concerns drive business users toward smaller, less costly, more efficient operating environments.
I’m probably missing something, but if Ubuntu strips out CUPS what printing system will they use? I’d guess the average desktop user still wishes to be able to print every once in a while, right? Or do you simply mean the service will be disabled by default?
I don’t know how Ubuntu solves it, but I have often looked at all the processes running in the background of a modern OS and wondered whether they are all really necessary.
On a modern CPU, starting most services on demand shouldn’t take much time. Why shouldn’t the OS intercept calls to the printer service and only start it when required?
I think the plan is to use Upstart to start those services on demand only when they are needed and stop them afterwards.
Makes sense if you ask me.
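For illustration, an on-demand Upstart job might look roughly like this. This is only a sketch: the event names (`print-job-queued`, `print-queue-empty`) are hypothetical placeholders, not events Ubuntu actually emits.

```
# /etc/init/cups.conf -- hypothetical on-demand job sketch
# Start the printing service only when something actually needs it,
# and stop it again when the need goes away.
start on print-job-queued
stop on print-queue-empty

# Restart the daemon if it dies while still needed.
respawn

# Run cupsd in the foreground so Upstart can supervise it.
exec /usr/sbin/cupsd -F
```

The point is that the daemon costs nothing (beyond disk space) until the triggering event fires.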
“Linux distros such as Ubuntu are stripping out functionality, including MySQL, CUPS, and LDAP, to cut footprints in half.”
I don’t know if they’re talking about memory footprint, but I doubt that removing such software would reduce the footprint in half — and I’m not sure that would save much RAM either.
I’m no big fan of software bloat and I’ll concede that on netbooks and other constrained devices every byte counts, but with RAM, hard disks, CPUs, etc. getting cheaper and cheaper, I don’t see any real benefit for the typical desktop user.
Of course trimmed-down systems have their benefits too (think about virtual appliances, for instance), but it seems to me that this is yet another case of “one size DOES NOT fit all”!
There is huge room for improvement without needing to increase hardware specifications. Just because RAM and processors are cheap doesn’t mean a system can be badly designed, wasting lots of RAM to provide features of dubious utility.
The bottlenecks of desktop computers now are the dismal performance of secondary storage and the lack of new ways to interact with the machine (a rectangular monitor without a touchscreen, plus keyboard and mouse). Without solving those, there’s no room for large improvements visible to normal end users, beyond some flourishes in GUI and usability (3D effects, animations…).
All this “trimming” amounts to modularized, optional components for everything non-critical, combined with a “lazy programming” model in which all processing is delayed until the moment it is absolutely required.
This means that if you start your computer and don’t go on the network, networking isn’t started. If you don’t need to print, you can either not install the print capabilities, or you can install them and not have any overhead other than the on-disk overhead.
The end result is kinda like BeOS. And that is one AWESOME system, performance-wise (well, perceived, anyway). BeOS, however, loads all of its servers at boot, but does so asynchronously, only waiting here and there in the boot script to ensure proper function.
The trick with BeOS is that you can strip so much from the system that it can boot in just a couple of seconds on a 333MHz PII with a decent hard drive. This trick has many benefits and is being put to better use in more OSes (OS design generally allows it anyway, even in Windows, but it may not be executed as well as in BeOS).
–The loon
RAM is cheap, but not that cheap. Hard disks are slow, so any RAM you’re not wasting can be used to cache files from disk. CPU cache is also still very expensive, and cache misses are slow; the more of your system and applications you can fit in that cache, the better.
I have 1GB of RAM; usually Firefox is using 300MB and my other applications another 200MB, and the rest is just used to cache files from disk.
It is not just hard drive or RAM space.
I have seen Windows machines that were *NOT* connected to the network wait for a time-out before continuing when you logged in or started certain programs. If you disabled networking, they would start almost instantly.
In other words, the computer is slowed down by a service you never requested, one that should never have been active in the first place.
I am sure this is not directly Windows’ fault, as other programs would start right away, but why is the service even running if the computer has no use for it?
I think that’s an Active Directory thing.
Windows doesn’t really seem to be getting smaller, considering they plan to eliminate the Starter and Home Basic editions from the Western market.
My favourite bit about this is that a mere three “Page 2” posts back is the article “Fedora 11 Alpha Comes With Huge Feature Set”. So which is it? Tiny, “shrinking” Linux distros or huge future ones? It’s make-your-mind-up time…
A Windows XP Pro install is around 8–9GB of HDD space.
Windows 7 (from the reviews I read) is around 10GB of HDD space.
Ubuntu 7.10 (what I currently use) is around 1.2GB of HDD space.
Mac OS X: I have no idea of its install size, but I’m pretty sure it’s bigger than Ubuntu.
Also note that Ubuntu comes with an office suite (OpenOffice) whereas Windows doesn’t. Add MS Office to the Windows tally and I think it grows by another 1GB.
Either way, I think it’s great that they cut down on the default running services. I would rather enable them as I need them.
Since when? Windows XP Pro is about 2GB default install. I believe the minimum requirement for an XP Pro installation is 1.5GB hard drive space available.
The minimum requirement for a default Ubuntu installation is 4GB disk space for full install and swap.
You can use the alternate Ubuntu install CD to trim this size down, but you can also use nLite to trim the default Windows XP Pro install down considerably.
If you’re going to bash Windows, please do it accurately. Making up statistics does not help anyone.
Sources:
http://www.microsoft.com/windowsxp/sysreqs/pro.mspx
https://help.ubuntu.com/community/Installation/SystemRequirements/Gu…
https://help.ubuntu.com/community/Installation/SystemRequirements
Windows XP’s disk usage depends on how much memory you have, because of the pagefile. The pagefile starts off as 1.5 times the physical memory in the machine, capped at 2GB for 32-bit versions of the OS. You also have the hibernation file, which is a bit larger than the amount of physical memory in the machine.
Installing the 32-bit version on a machine with 2GB of RAM would get you a 2GB hibernation file and a 2GB pagefile. That bumps the disk space requirement up to 6GB.
Installing the 64-bit version on a machine with 4GB of RAM would get you a 4GB hibernation file and a 6GB pagefile, for a total footprint of around 12GB.
Granted, Ubuntu has a swap partition as well, but it doesn’t show up anywhere except the partitioner. And it’s not as large as Windows’s default page file.
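The arithmetic above can be sketched as a quick calculation. The 2GB base-install figure is an assumption for illustration (it matches the ballpark default install size mentioned elsewhere in the thread), as are the default pagefile rules.

```python
def xp_disk_footprint_gb(ram_gb, is_64bit, base_install_gb=2.0):
    """Rough XP disk footprint: base install + hibernation file + pagefile."""
    hiberfil = ram_gb                  # hibernation file is ~= physical RAM
    pagefile = 1.5 * ram_gb            # default initial pagefile size
    if not is_64bit:
        pagefile = min(pagefile, 2.0)  # capped at 2GB on 32-bit
    return base_install_gb + hiberfil + pagefile

print(xp_disk_footprint_gb(2, is_64bit=False))  # 32-bit, 2GB RAM -> 6.0
print(xp_disk_footprint_gb(4, is_64bit=True))   # 64-bit, 4GB RAM -> 12.0
```

The actual numbers vary per machine (the pagefile grows on demand, and hibernation may be disabled), but the shape of the calculation is the same.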
Hibernation is not enabled by default in XP.
That doesn’t really matter. It takes disk space.
The difference isn’t that large, and you can always reduce the Windows swap file size easily. The same can be done with Ubuntu, but it isn’t as easy.
Maybe not by Microsoft, but I have seen machines from the manufacturer where it is. Of course, these machines came with tons of non-Microsoft bloatware that the user did not know how to remove either.
Before 80+GB hard drives became standard in laptops, those two files alone accounted for a big percentage of the hard drive space in use.
They have low overhead, and if software companies supported them and hardware companies wrote drivers for them, they could beat Windows, Mac OS X, or even Linux as a lower-memory, faster-loading operating system.
Windows has always had bloat, only worsened by third-party apps and crapware on new PCs. But the Linux side is not above reproach; I was looking at a utility to switch into a chroot to run 32-bit apps on my Arch 64-bit system, and it pulled in almost 120MB of libraries on the download, for a 300+ MB installed footprint. For one utility!
That said, I can do a “ps ax” on my Linux install, and pretty much know what everything is (and it fits in ~1.5 25-line terminal screens pre-GUI, and maybe 3 or 4 while in Xfce). I use Autoruns on Windows to see what all is starting up, and I’m lucky to know what 50% of it is — and that program has to have a tabbed interface for different startup “categories”. There are a couple hundred entries at least on a clean XP install.
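The `ps ax` comparison above is easy to reproduce. A generic sketch (not tied to any particular distro; the exact count will obviously differ per machine):

```shell
# Count running processes, subtracting the "ps" header line.
count=$(($(ps ax | wc -l) - 1))
echo "running processes: $count"
```

On a trimmed-down system that number stays small enough to audit by eye, which is exactly the point.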
My fav in the minimalist department though is OpenBSD — that has to be the simplest and quickest to install, most no-nonsense, no-bloat multitasking OS I’ve ever used. Not flashy, but it gets the job done.