Along with the release of the Windows 7 release candidate came new system requirements for Microsoft’s next operating system. This updated set of requirements has been declared final, making it official for the finished product. Given Microsoft’s rather… dubious past dealings with minimum system requirements, let’s take a look at Windows 7’s.
The updated requirements have changed little since the Windows 7 beta, and they are almost identical to those of Windows Vista – but when Windows 7 is released, we will be three years down the road. Without further ado:
- 1 GHz processor (32- or 64-bit)
- 1 GB of RAM (32-bit); 2 GB of RAM (64-bit)
- 16 GB of available disk space (32-bit); 20 GB of available disk space (64-bit)
- DirectX 9 graphics device with WDDM 1.0 or higher driver
Sadly, I don’t own a machine close to these minimum specifications, so I can’t give you guys an idea of what running Windows 7 on one would be like. My AMD Phenom X4 quad-core (4×2.2 GHz), with on-board Radeon HD3200 graphics and 4 GB of Geil RAM, obviously runs Windows 7 RC (32-bit) blazingly fast. My much older, but still relatively powerful, Pentium 4 2.8 GHz HT machine with a GeForce 6200 and 2 GB of RAM also has no problems whatsoever with the Windows 7 RC.
The only machine where it could’ve gotten interesting is my Aspire One netbook, which has the well-known Atom 1.6 GHz processor with an on-board Intel-something video chip and 1.5 GB of RAM. Sorry to disappoint some of you, but this netbook has no problems running the Windows 7 RC either.
Then again, none of these three machines had any trouble running Windows Vista anyway (I never ran Vista on the quad-core, though), so to me, there’s really no surprise there. I have already accepted that my apartment is a magical Windows-problem-free zone inhabited by fairies who keep my machines tidy and clean at night while taking sips out of my Martini bottles.
Anyway, looking back upon the increases in system requirements for Windows, here’s a nice little table I made of all the official minimum specifications as listed by Microsoft.
Minimum system requirements for Windows home-oriented releases as listed by Microsoft.
As you can see, this is the first time that Microsoft has actually managed to keep its minimum requirements flat. However, knowing that Windows 7 actually runs a lot better than Windows Vista on the same hardware, you could technically say that the minimum requirements have decreased, which is fairly rare in the software world (Apple managed to do the same with Mac OS X a few times).
If I ever have any time to waste I’ll try to install it on my old IBM server with dual P3-S 1.32 GHz processors to see how it runs. The trouble will be drivers, I think!
The XP minimum requirements are just as ridiculous as Vista’s… unless all you do is play Minesweeper.
To be fair to Microsoft – not something I say very often – XP originally ran pretty well on that specification. The problems only started when you added anti-virus software; and remember that when XP launched, many home PCs weren’t connected to the internet, so antivirus wasn’t a necessity.
Of course, after several years of service packs, XP is now barely usable with 512MB of RAM. But a fully patched copy of XP is hugely different from the 2003 version.
As far as I can find, MS never updated the minimum/recommended specs when they released service packs, which was a huge disservice to their XP users.
Aye. I have a fully stripped-down XP SP3 install on my netbook that boots up using 137 MB of RAM. There is no way you could run XP SP3 out of the box in 128 MB of RAM; it’s just not possible. 512 MB at least once you add apps.
Not out of the box anyway, but I’ve done an XP SP3 installation with all but the necessary components stripped out, and its memory footprint at boot-up was 49 MB. So technically, it’s still possible… but how many people are going to bother going through that? I only did it to see how small I could get it, but I’m a geek, so I find that stuff fun.
I’d like to know how you managed to get it under 100 MB. I haven’t removed any services yet, but I nLite’d the disc and then emptied every process I could out of the startup. I disabled Windows Update and every process I could. (It boots with only 17 processes.)
But that’s the important thing when slimming down Windows.
My minimum XP SP3 configuration used 40 MB out of the 192 MB available.
You can even go below the 40 MB mark, but at that point you really lose functionality (like copy & paste and other essential things).
In my opinion XP SP3 is no problem with 128 MB, even without special treatment,
but it will probably need some time after booting to swap out unused stuff.
To me, the worst minimum listed was the 4MB for Windows 95. It technically ran, but it might take 30 seconds to click a button (even after a fresh install)! The real minimum was 8MB.
I’m happy to see that Xubuntu runs with less than 256 MB…
http://www.youtube.com/watch?v=z39n5Tleo0A
I’m not entirely sure whether to laugh or cry at this grossly uninformed statement. Apparently, some (young?) people actually think that before the internet, there were no viruses. Am I that old, I wonder?
JAL
No, they’re quite realistic. My wife ran XP Pro SP2 on her P3 450 MHz laptop with 256 MB of RAM for several years. She never complained, even when running Mozilla Firefox, WordPerfect, and Trillian at the same time. We even used it for some basic picture editing tasks.
Yes, XP runs best with at least 512 MB of RAM, with 1 GB being the sweet spot. But it is definitely usable with less.
Errm, XP’s minimum requirements were a 233 MHz processor with 64 MB of RAM. Your wife’s PC was above even the recommended requirements of a 333 MHz processor with 128 MB of RAM (coincidentally also the specs of the oldest PC I’ve ever run XP on…).
When I was using XP it required 87 MB to boot, with a completely usable and very fast system.
The rest is up to you, obviously; if you want to play Doom or work with heavy graphics then you need much more than the basic system (a better video card, more RAM, a faster processor), but that has nothing to do with Microsoft.
Doom? You mean the game that runs quite happily on an 80486 with 8 MB of RAM…?
…are set a little high… kind of like how in (at least) the ’50s, ’60s and ’70s (pre-unleaded fuel) car manufacturers played DOWN horsepower ratings.
Anyway, I have found Windows 7 (I am now running build 7100, having started with the first beta release) to be pretty snappy in all respects on my somewhat beefy (mid-range) PC. With my Q6600, GeForce 8600GT and 4 GB of RAM I can pretty much do anything I want, and in a much more friendly-on-the-eyes way than XP… although OC4J frequently dies because I have been too lazy to up the permgen…
I think (as I’ve said before) this version of Windows will be here to stay for a while. Decent performance, good usability… This will be the XP SP2 of 2010 (but obviously prettier).
I know: Off Topic, but…
Anybody remember the game Forsaken? The box said it needed a 16 MB video card, 64 MB of RAM, and a 266 MHz Pentium as a minimum.
That game was blazing fast on my Pentium 75 MHz with 32 MB of RAM and 8 MB ATI Rage Pro graphics. I’m talking >24 fps at 1024×768.
Of course, Quake 2 said it needed a Pentium 90 MHz, and it was unplayable on my 75 MHz chip. Oh well.
As for XP requirements, I know Microsoft demanded 128 MB of RAM for XP, but were XP’s requirements actually higher than Win2k’s? I ran 2000 on my P200 MMX with 64 MB of RAM and was blown away by the performance compared to Win98. That was around the time DivX came out (the codec, not the lame DVD hack/half-rental scheme of the same name), and the movies that would skip on anything less than a P266 MMX in ’98 ran fine on my P200 MMX under Win2k.
Oh, the good ‘ol days.
UO, anybody?
Oh yes! Forsaken… I’ve lost so many nights of sleep playing that awesome game. Those were indeed the good times.
Vista eats 40 GB?! It eats about 20 GB in my case…
The article said it requires a 40 GB HDD but only 15 GB of disk space (so presumably Vista refuses to install on a partition smaller than 40 GB even though it doesn’t require that much space for OS files).
15 GB for OS files still seems excessive to me, though.
One of my co-workers has been able to trim Vista down to 8 GB without sacrificing stability, but it took several months of work.
I agree. I don’t understand what could take up so much space, even with System Restore and other crap turned off.
Excessive is an understatement. It’s ludicrous! That’s just for the OS; you still need to install your actual applications: anti-virus, office suite, general tools, development software, games, etc.
Ubuntu Linux installs the OS with loads of applications, including OpenOffice, and it’s still only 4 GB of hard drive space.
Where are the days of OSes like Win95 taking up only 50 MB of hard drive space?
And 20 GB is OK? What I can’t understand is how an OS with so little out-of-the-box functionality compared to some other OSes can take up ten times more space out of the box than they do.
It’s Microsoft, after all. They are the masters of this kind of thing.
+1
It’s totally crazy!
In the end, that’s just what Microsoft says its OS runs like with its default apps and set-up. Yes, those minimum specifications would be passable, but once you install your third-party everyday apps, it’s going to struggle.
It also depends on your interpretation of ‘it runs fine’; many people would find what you call “it works fine” not very usable.
Always the same propaganda that the next OS will be the best of the best.
We saw the same thing with Vista, XP, Me… and 95.
You make it sound like that’s a bad thing. Every version of Ubuntu is billed as the next best thing. Every version of Mac OS X is billed as the greatest thing since sliced bread.
Every vendor does that with their products.
Well, with the Linux distros it is usually like this:
The versions of Red Hat/Debian/SUSE/*buntu from last year were very good; one of them was probably ahead of the others.
And EACH of this year’s versions trumps ALL of the previous year’s versions by far.
I’d like to hear that sort of thing about Windows, but every time they release a new system, some things are regressive, not progressive (like security policies).
The Linux distros have had decent separation of admin and users since day one, and are now erecting more and more internal walls (like with SELinux) to tighten down the system even more.
In Windows they are doing a freaky dance around the problem that, for compatibility’s sake, they cannot outright forbid ordinary users from messing around in the operating system’s folders.
These are the minimums for any edition above Starter or Home Basic.
I have Home Basic running on a P3 with Intel Extreme Graphics 2 (cringe), 256 MB of Rambus RAM, and a 20 GB HDD.
Here’s the kicker: it runs fast! Not as fast as 98SE ran on it, but fairly close (a slight amount of tweaking was required to make it run smoothly to the point that it is comfortably usable).
For anything that needs the Aero interface and such, well, the posted specs are about right.
I tried to install Windows 7 (beta build 7000) on a machine with only 512 MB of RAM. It installs and runs without complaining.
Just like Vista.
I think the table of Windows version requirements should branch and contain NT4/W2k requirements prior to XP, since those products were essentially being provided to “professionals” at the same time as 95/98/ME, before XP came around.
I stopped using Win9x almost entirely when W2k appeared, only dual-booting to 98SE occasionally for those crappy games/etc. that required it.
Edit: I see the little teeny note about “home-oriented” versions… but I still think it is more interesting to branch the 9x and the NT-based stuff into separate comparisons.
I don’t think the minimum requirements are that bad; it would just be nice to see that 512 MB is sufficient. Other modern OSes run very nicely with that amount, so I think it should also be possible for Microsoft. Maybe somebody can test?
In that case it would just take much more disk space; maybe something for a next Windows version?
Anyway, I’ll stay with Ubuntu, but my next machine might be an Apple.
Edit: already tested…
Ubuntu on 512MB? Or worse yet, Mac OS X on 512MB?
You must have pretty low standards. You can get a functional Linux desktop out of 512 MB with some careful software choices and no Firefox/OO.o, but Mac OS X? Good luck. Even on 1 GB of RAM, OS X always feels sluggish. It doesn’t become as snappy as W7 or Ubuntu until you hit 1.5–2 GB.
So, to say “other modern OS’s run very nice” on 512 MB is simply not true. It requires lots of work (Linux) or it’s impossible (Mac OS X).
Arch Linux with KDE: memory used after boot is 90 MB (it can go down to ~78 MB).
The default install of Debian 4.0 with GNOME 2 needed less than 70 megs without extra apps running. I myself have used Debian with Compiz on a 500 MHz Athlon K6 and GeForce 2 MX and it ran fairly well, though not with maximum FPS, of course.
Solution: don’t install the graphics drivers. My installation of XP SP3 with 17 processes ate about 85 megs too, until I installed the nVidia display drivers. That added 20 megs right away.
You’ve just described my Eee PC 701. And it does just dandy on 512 MB and a 530 MHz–900 MHz CPU. It came with 256 MB and locked at (I think) 630 MHz with Xandros, and I don’t recall that being too terribly bad.
Besides, the OP was talking about it being nice if the minimum requirement were 512MB.
No work required – I’ve been using Ubuntu (and Mint) for years, and it runs perfectly with my 512MB of RAM. Sure, double that would be nice, but it’s running a hell of a lot better for me than XP did on exactly this machine; and XP’s minimum was what, 128?
You can get a functional Linux desktop out of 512MB on a Linux machine with some careful software choices and no Firefox/OO.o
Don’t lie. I myself have two computers running complete GNOME desktops; one has 256 MB of RAM and the other has 512 MB, and hell, I have Apache, Firefox and web-development utilities running all the time on the latter one. There was absolutely no reason to carefully select software… I just installed the freaking default GNOME desktop on Mandriva.
So, to say “other modern OS’s run very nice”on 512MB is simply not true. It requires lots of work (Linux)
What you are saying simply is not true. I have several computers proving you incorrect.
Well, so far, whenever I’ve run any modern Linux distribution on 512 MB it hasn’t been an optimal experience. Firefox and OpenOffice especially are notorious memory hogs, and launching a few tabs in FF or a few docs in OOo would bring it all to a screeching halt. Bring in something like Flash…
I personally wouldn’t recommend a default Ubuntu or similar distribution installation on a <1 GB system.
However, as always, mileage may vary.
I was going to bitch and complain about this, as my full-blown Ubuntu running several apps right now with Compiz only uses around 400 MB. Then I realized: I don’t use Firefox. Or OpenOffice.org.
Oops.
I think part of the problem is also whether you have dedicated or shared graphics. I used to run Ubuntu on a laptop with 512 MB with Compiz on, and yes, there were times it slowed down quite a bit. But on a very similarly spec’d desktop, with the only difference being a dedicated card, I have never had any speed issues.
I’ve already covered the single user case in a previous post, which is admittedly more relevant to this discussion.
But I’ll also go ahead and mention that I find that 200 MB per user on a Gnome XDMCP/NX server running x86_64 gets me pretty good results. 60 users => 12 GB. Running 32-bit, 150 MB/user is more than adequate. The workload is the typical Epiphany/Evolution/OpenOffice/Evince/etc. affair one would expect in an office, plus about a hundred sessions of an ncurses/Cobol-based accounting package, and some other odd-lot stuff.
NX sessions do use significantly more memory than regular XDMCP sessions, and typically about 70% of my sessions are NX. Running straight XDMCP on a LAN the requirements would be significantly less.
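The sizing arithmetic above is straightforward multiplication; here is a quick sketch (the per-session figures are the poster’s observed averages for this particular workload, not universal constants):

```python
# Back-of-the-envelope RAM sizing for a multi-user desktop server.
# The MB-per-user figures are observed averages, not universal constants.

def total_ram_gb(users, mb_per_user):
    """Total RAM needed for N sessions at a given average footprint (decimal GB)."""
    return users * mb_per_user / 1000

print(total_ram_gb(60, 200))  # 64-bit sessions at 200 MB/user: 12.0 GB
print(total_ram_gb(60, 150))  # 32-bit sessions at 150 MB/user: 9.0 GB
```

The 64-bit case reproduces the “60 users => 12GB” figure quoted above.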
That’s about the same memory scaling ratio for WS08 on Terminal Server. I’m not sure how much better it’s gotten on WS08 R2 (Win7 Server).
Well, if we’re going to make this a direct comparison of Linux vs Terminal Server, I (we) should be more detailed and rigorous about the numbers and where they come from.
CPUs: 2 Irwindale 3.2 GHz Pentium Xeons.
Firstly, what is the server doing:
1. Gnome Desktop server, with typical office workload running Fedora 8 and Gnome for about 60 simultaneous desktops. 16 XDMCP, 44 NX.
2. About 100 simultaneous instances of a business accounting and service tracking package.
3. Samba file server for legacy Windows workstations.
4. NFS server for Cobol C/ISAM <=> SQL gateway
5. Print server
6. Time server
7. Font server for the XDMCP sessions.
Currently, running 64 bit Fedora, it runs with 12GB of memory, which includes about a 50% safety factor. Until approximately 3 months ago, we were running well with 8GB, but there were occasions in which I felt performance could be better. We could have comfortably stayed at 8GB, but I didn’t care to.
So for the 8GB scenario, it worked out to ~136MB/user. (Always keeping in mind that it is not *just* an XDMCP/NX server.)
Now, before our last OS upgrade from Centos 4 to Fedora 8, we were running 32 bit. Slightly fewer users. Say 55. And we then had 4GB. It was about the same situation as 3 months ago. Performance was OK, but I didn’t feel we had a lot of safety margin.
This scenario worked out to ~74MB/user. (Always keeping in mind that it is not *just* an XDMCP/NX server.)
The move from 32 bit CentOS 4 to 64 bit Fedora 8 increased the memory requirements. It still ran acceptably on 4GB, but it was pretty clear I needed to upgrade the memory. (Which we did, to 8GB)
Thus far, everything I have said is backed by actual experience. At this point I’m going to guess a little bit. If I were running only XDMCP sessions, without the NX, how much memory would I have needed for the above scenarios? (And BTW, I’ll put a link at the bottom to a very useful tool for helping to get a mental grasp on these sorts of things.[1]) Keeping in mind that these are just guesses based upon my estimate of the NX savings, I would expect maybe:
3 GB for the 55-user 32-bit CentOS 4 scenario: ~56 MB/user.
6 GB for the 60-user 64-bit Fedora 8 scenario: ~102 MB/user.
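For the record, the per-user figures in these scenarios are just total RAM divided by session count (in binary GB); a quick sketch reproduces them to within rounding:

```python
# Per-user share of total server RAM for the scenarios discussed above.
# Uses binary GB (GiB), which is how the ~136/~74 MB/user figures work out.

def mb_per_user(total_gb, users):
    """Average memory share per session, in MB."""
    return total_gb * 1024 / users

print(round(mb_per_user(8, 60)))   # 64-bit Fedora 8, 60 users
print(round(mb_per_user(4, 55)))   # 32-bit CentOS 4, 55 users
print(round(mb_per_user(3, 55)))   # estimated XDMCP-only, 32-bit
print(round(mb_per_user(6, 60)))   # estimated XDMCP-only, 64-bit
```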
Interestingly, the processor has never been an issue on this box, which is why I always smile a little when I see posts where people are drooling over multicore for their single-user desktop. Load average on this machine averages 2.5 over the course of the business day, which equates to about 1.25 for a single-processor machine.
FWIW, local network bandwidth is a total nonfactor. Except for the file serving (obviously), I could run this on 10baseT and I doubt anyone would notice a difference.
Why did I bother to post all this? Because it annoyed me slightly that someone would waltz in and say, essentially, “Windows can do that! And maybe even better with the next release!” without presenting even a shred of experiential evidence in support of the claim.
=========
[1] smem is nice for getting an idea of how much memory is *really* being used by a process, taking into account all the tricky issues surrounding shared memory and swap:
http://www.selenic.com/smem/
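As a rough illustration of what smem measures, here is a minimal, Linux-only sketch that reads the kernel’s PSS (proportional set size) accounting straight from /proc, the same underlying data smem reports:

```python
# Sum the PSS (proportional set size) of a process from /proc/<pid>/smaps.
# PSS divides each shared page among the processes sharing it, which is
# why it gives a more honest per-process figure than plain RSS.
# Linux only; requires read access to the target process's smaps file.

def pss_kb(pid="self"):
    """Total PSS of the given process in kB."""
    total = 0
    with open(f"/proc/{pid}/smaps") as f:
        for line in f:
            if line.startswith("Pss:"):  # colon excludes Pss_Anon:, Pss_File:, etc.
                total += int(line.split()[1])  # field is already in kB
    return total

print(pss_kb(), "kB")  # PSS of this Python interpreter itself
```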
I didn’t mean to impugn your machine or your OS, but a lot of work has also been done on Windows to get hundreds of users on a single machine. There is in fact a team which tests this scenario and works to get multiplicative per-user costs down. They ensure that the OS works properly with hundreds of sessions of IE, Office, and a few other applications at once. There’s no waltzing at all–those guys work pretty hard and are pretty demanding.
I can’t say about Ubuntu; I haven’t touched it for years now. But I am a heavy Firefox user. I always have a minimum of 4 tabs open, usually way more when I’m doing web development. And I do it all on the 512 MB box. I haven’t bothered tweaking the thing any more than disabling Beagle, because I don’t use it; everything else is on default settings.
Maybe you should try some other distro.
Kubuntu 8.10, with KDE 4.2.2 installed, runs just fine on an Asus eeePC 701 (900 MHz Celeron, 512 MB RAM, no swap, 4 GB disk). Including OpenOffice.org and Firefox 3. This is my media jukebox (Amarok 2) and school work computer.
I’ve also run Xubuntu 8.04 with Xfce 4.x on a P3 450 MHz laptop with 256 MB of RAM. It didn’t have OpenOffice installed, as it was our media centre (video out to the TV), but it did run Firefox 2.x just fine, along with Kaffeine for watching video streamed over a wifi connection using smb4k.
You must have extremely high standards.
And that’s a bald-faced lie, to say the least.
Nothing more, nothing less. We had to wait for Vista for a long time… We’d been wondering about new features like Aero and WinFS (remember that?), and suddenly… Vista was a huge disappointment. I think it was a disappointment even for Microsoft. So, they had to do something. It would be silly to say: “Come on, this Vista was not a good idea. We promised a lot, delivered not too much, you had to wait for so long and received nothing special; now let’s repair Vista with a kind of SP that replaces a huge part of it, almost the same way as before, back in the Windows XP and SP2 days.” Oh, no. It’s better to deliver something new again. Something that will help to cover up this Vista thing.
Anyway, I’m using Slackware on my notebook at home. It just works. I’ve been happy with it for many years.
…why Windows 7 doesn’t need more system resources than Vista is that Moore’s law recently came to a technical end. Microsoft bloats its OS on purpose; that’s why hardware manufacturers bundle it by default. It forces people to buy new hardware.
It has more to do with the fact that Vista’s reputation was so bad. Windows 7 is their chance to keep their market share, and they don’t want to blow it.
I run the beta on a P3 800E with 512 MB and a 40 GB hard drive; the only downside is the integrated graphics. It would be nice to have a DX10 video card in it, but I can’t imagine 1 GB of RAM being necessary.
It hardly ever jumps past 600 MB of dynamic swap file usage. And that is running Firefox, OOo, and all sorts of heavy-handed trialware like the Office 2007 trial. On top of that, the machine cracks OGR-NG packets in the background full-time. It runs absolutely no worse than it did as an XP, SuSE, or Ubuntu machine.
Maybe the 1 GB RAM requirement was a response to Vista’s system specifications being set too low, which left a lot of people with a bitter taste afterwards.
Personally, I think Microsoft and Apple are realising that they can’t keep pushing and pushing, hoping that the CPU and component companies will spontaneously create more performance from nowhere. Both Microsoft and Apple need to realise that performance is a two-way street: it isn’t all hardware and it isn’t all software.
So it might mean it runs well on netbooks, but it also means more expensive and more power-hungry hardware than the ARM-based Linux-running devices which will be popping up soon.
I pretty much stuck with XP and passed on Vista. One thing that struck me is how much more disk space Vista and 7 need over XP. A typical XP install for me takes up about 2–3 GB (before installing any programs).
What the hell did they add in Vista and 7 that needs so much more space?
The main culprit is the side-by-side assembly store, which is supposed to fix DLL hell (it works much the same as running different library versions side by side on Linux), plus shadow copies and System Restore.
After testing the beta, I would say it’s better than Vista (not that hard), but still not convincing enough to move away from XP.
Aren’t all versions of Windows 7 meant to be installable on netbooks? The requirements do not sound very netbook-friendly (mainly storage).
I thought Windows 7 was supposed to use fewer resources than Vista?
Vista Home Basic required an 800 MHz CPU; Seven requires 1 GHz – same as Home Premium.
Vista Home Premium required 1 GB of RAM; 64-bit Seven requires a whopping 2 gigabytes. That’s 2 gigs MINIMUM!
Comparison: openSUSE is pretty memory-hungry, but it “only” requires 512 MB of RAM, and I bet you’d have Compiz with that too.
How on earth is Windows 7 more resource-friendly and more appropriate for netbooks if it has bigger system requirements than Vista?!
…because Vista’s requirements really shouldn’t have been as low as they were. You could “run” Vista on those minimum requirements, but your definition of “running software” would have to be a pretty loose one. I haven’t compared Vista and Windows 7 extensively, but Vista on my desktop, with a 2.2 GHz quad-core, 2 GB of RAM and a 512 MB graphics card, can be sluggish at random times, especially during startup. I wouldn’t dream of running it on my netbook. I installed Windows 7 on my netbook, and it ran beautifully – almost as well as Windows 7 runs on that same desktop.
Anyway, I think Windows 7 requirements are nearer the truth than Windows Vista’s were, and I think that they’re less than Vista’s minimal requirements should have been.
Edit: I don’t know whether the 64-bit version is less resource-hungry than Vista’s was (or should have been)… I never used 64-bit Vista.
I have a laptop with an Intel Core 2 Duo T6400 (2 GHz), an Nvidia 9600M GT graphics card (512 MB) and 4 GB of RAM.
In Vista, the minimum requirement is 1 GB of RAM for both the 32-bit and 64-bit versions. In that case, I think Windows Vista 64-bit is better for me.
But in Windows 7, the minimum is 1 GB for 32-bit and 2 GB for 64-bit. I have “only” 4 GB of RAM. Which version is better for me?
The 32-bit version can only manage 3 GB of RAM.
The 64-bit version can manage all 4 GB but needs 2 GB, so I think it will be heavier than the 32-bit version.
Help please…
And I’m so sorry for my poor English.