“Does performance suck on Vista when compared to XP? That’s what I was set out to find out. I was worried at first, since the performance in Beta 2 was quite bad. While there is indeed a performance decrease, it’s quite minimal as you’ll find out.”
So it’s basically more of the same plus some incompatibilities. I wish they would just extend XP support for another 8 years and continue to make it stable, then add DX10.
-Bounty
Well, not exactly. The reviewer noted that with 1GB of RAM, Vista would be much slower. Vista seems to be much more of a memory hog, though no more so, relatively speaking, than XP was when it was released. Once that bottleneck is overcome, then yes, performance seems roughly similar.
I think when XP came out, the typical PC had 128MB of RAM, and XP uses that much RAM on boot; it doesn’t really open up until 256 or 512MB, IMHO. Vista seems similar in that the typical new PC today has about 1GB of RAM but doesn’t really open up until 2GB, and 4GB seems more than ample (today).
I agree with your subject. Big deal. Given a hefty amount of memory (2GB is still pretty hefty by today’s standards), Vista runs the same single-threaded code slightly slower than XP. How much slower was anyone expecting Vista to be, especially on a dual-core system?
Vista is a memory hog. Given enough memory, it should do just as poorly as previous versions of Windows. It has more system services going on in the background, but if you’ve got two processors, they shouldn’t be impacting your active single-threaded process very much.
Sadly, while Vista pretty much requires a dual-core system to let the applications get some processor time, it doesn’t really scale much further than that. They still haven’t designed their process management code with SMP in mind. Still a single system runqueue, no CPU affinity, no cache warmth considerations, no priority boost for threads holding locks, and load balancing means dispatch to the next available CPU. For Microsoft, designing for SMP begins and ends with serialization. They can’t even get the basics right, yet they think giving Media Player 80% of each timeslice is a brilliant innovation.
That’s my rant. Hey, if all you care about is Media Player and DirectX, then Vista should satisfy your addictions quite nicely.
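[Editor’s note: for anyone who wants to poke at this behaviour themselves, whatever one makes of the scheduler’s internals, Win32 does expose per-thread affinity masks from user space, so thread placement can at least be controlled and observed by hand. A minimal illustrative sketch in Python using ctypes (Windows-only; pinning to CPU 0 is just an arbitrary example choice):

import ctypes

kernel32 = ctypes.windll.kernel32  # Windows-only

# GetCurrentThread() returns a pseudo-handle for the calling thread.
thread = kernel32.GetCurrentThread()

# A mask with only bit 0 set restricts this thread to CPU 0.
previous = kernel32.SetThreadAffinityMask(thread, 1)
if previous == 0:
    raise ctypes.WinError()

print("thread pinned to CPU 0 (previous mask: %#x)" % previous)

Pin a CPU-bound loop this way on each CPU in turn and you can watch how the dispatcher balances the rest of the load.]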
>> Sadly, while Vista pretty much requires a dual-core system to let the applications get some processor time, it doesn’t really scale much further than that. They still haven’t designed their process management code with SMP in mind. Still a single system runqueue, no CPU affinity, no cache warmth considerations, no priority boost for threads holding locks, and load balancing means dispatch to the next available CPU. For Microsoft, designing for SMP begins and ends with serialization. They can’t even get the basics right, yet they think giving Media Player 80% of each timeslice is a brilliant innovation.
Butters, aren’t you being a bit harsh?
It’s YOU that’s the problem. You sound like you’ve got like a Comp Sci degree. YOU are the one on the wrong platform, if this kind of thing bothers you.
And there is an explanation for why this sort of lazy design and implementation is allowed to pass: the unsuspecting public doesn’t know, or care to know, about this type of stuff. If all this is indeed true (I haven’t tried Vista, so I can’t confirm), then MS hasn’t really done much to advance the OS other than adding some paint. Really sad… And even sadder is that the average Joe does not even care.
I found the network performance (1G bandwidth, SMB file copy) under Vista is about 40%~50% lower than that of XP. I am not sure whether it’s a NIC driver problem or a problem with the new TCP/IP stack. Maybe both?
I called up Microsoft, and they said this is a feature designed to lower media piracy by about 40-50%. They also mentioned that TCP/IP is an acronym for Trusted Consumer Platform for Intellectual Property.
Hey, yesterday I played the Microsoft apologist, so today I get to have fun.
Trusted Consumer Platform for Intellectual Property.
Heehee … sorry I can’t mod you up. There ought to be a button “+1 for being fun and disregard +5 limit”
What I’d like to see: a performance comparison between Vista and some of the top-rated Linux distros.
Yeah, a comparison of Suse 10.2 would be good.
I do not have time to do it myself, and although I am a Linux devotee, I would not favour Suse; as I have said before, I cannot stand all the Mono it has in the menus.
+1 for Vista
-1 for Suse
All those delta percentages seem negligible, but what do you expect with mostly I/O- or CPU-limited benchmarks?
These are all benchmarks I’d put my system through if I were testing hardware against hardware, not a new version of an OS against its predecessor.
For example, besides scheduling, what possible measurable effect would you *expect* the OS to have on a Nero Recode encode? Besides a bit of memory management and task scheduling, the task is mostly limited by the number-crunching capabilities of the CPU.
For a lot of people, UI responsiveness and effectiveness (how quickly you can do things and how simple they are to do) are much, much better benchmarks for how performant the system is. These things are almost immeasurable, so failing that, why not some low-level benchmarks of memory management/allocation, task scheduling, threading, etcetera? (A rough sketch of one such benchmark follows below.)
The only vaguely interesting figure here seems to be the filesystem performance regression. But that’d require more testing.
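[Editor’s note: a low-level benchmark of the kind suggested above is easy to sketch. The following illustrative Python script (iteration counts are arbitrary assumptions, not a calibrated methodology) times allocator churn and thread creation/teardown; running the same script under XP and Vista would give a crude memory-management/scheduling comparison:

import threading
import time

def measure(func):
    # Wall-clock a single run of func().
    start = time.time()
    func()
    return time.time() - start

def best_of(func, repeats=5):
    # Best-of-N to reduce noise from background services.
    return min(measure(func) for _ in range(repeats))

def alloc_churn(n=200000):
    # Exercise the memory allocator with many small, short-lived objects.
    for _ in range(n):
        chunk = [0] * 16

def thread_churn(n=500):
    # Exercise thread creation and teardown overhead.
    threads = [threading.Thread(target=lambda: None) for _ in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

print("alloc churn : %.4f s" % best_of(alloc_churn))
print("thread churn: %.4f s" % best_of(thread_churn))
]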
Somewhat agree with your point, but I do see where the reviewer is coming from. One thing I would add to those tests: Vista with and without Aero effects.
“I found the network performance (1G bandwidth, SMB file copy) under Vista is about 40%~50% lower than that of XP. I am not sure whether it’s a NIC driver problem or a problem with the new TCP/IP stack. Maybe both?”
That’s something I am working on… I wasn’t sure if it was just me or not, though, since my router and network cards seem sketchy half the time. I’m glad I am not the only one with the problem. Although I found Vista-to-Vista transfers were less than ideal, transferring from my Gentoo rig to the Vista machine proved even -worse-, with speeds closer to 100KB/s. I experimented with various configurations, including the latest version of Samba, but it doesn’t seem to help.
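[Editor’s note: one way to isolate the layer at fault is to measure raw TCP throughput on the same link and compare it with the SMB copy rate. If raw TCP runs near wire speed but SMB crawls, the NIC driver and TCP/IP stack are probably fine and the SMB/redirector layer is the suspect. A minimal illustrative sketch (the port number and transfer size are arbitrary assumptions); run it with “recv” on one machine and point the sender at that machine’s address:

import socket
import sys
import time

PORT = 5001                  # arbitrary test port (assumption)
CHUNK = 64 * 1024            # 64 KB per send
TOTAL = 256 * 1024 * 1024    # move 256 MB in total

def receive():
    # Passive side: accept one connection and time the incoming stream.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("", PORT))
    srv.listen(1)
    conn, addr = srv.accept()
    got = 0
    start = time.time()
    while got < TOTAL:
        data = conn.recv(CHUNK)
        if not data:
            break
        got += len(data)
    secs = time.time() - start
    mb = got / float(2 ** 20)
    print("received %.0f MB in %.1f s (%.1f MB/s)" % (mb, secs, mb / secs))

def send(host):
    # Active side: blast zero-filled buffers at the receiver.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((host, PORT))
    buf = b"\0" * CHUNK
    sent = 0
    while sent < TOTAL:
        sock.sendall(buf)
        sent += len(buf)
    sock.close()

if __name__ == "__main__":
    if sys.argv[1] == "recv":
        receive()
    else:
        send(sys.argv[1])
]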
“These are all benchmarks I’d put my system through if I were testing hardware against hardware, not a new version of an OS against its predecessor.”
The goal was to show people immediate differences between common applications used on both versions of Windows. The usage of the OS itself is hard to benchmark, but I didn’t have any real complaints. The entire OS is rather fluid. The biggest “slowdown” I’ve seen has been when I received a security prompt and agreed to allow it to change the configuration. For whatever reason, *that* seems to be slow (i.e. after giving permission to access another user’s account folder). You do make some good points though…
Thanks for the comments guys.
I replied on the author’s original forum. I’d like to see a Pentium 4-era computer with 512MB-1GB of RAM. That qualifies as a good ‘average’ computer these days. A C2D will make any OS look good. I bet you the numbers slip further apart. The faster the PC, the less visible any variation will be.
I agree.
The test system used should have been a system with 512 MB to 1 GB of RAM. Isn’t that what most computer users have in their systems?
Also, I would have preferred it was tested with an Athlon XP 2400 or a P4 2.4 GHz processor.
With less RAM & a slower CPU, you’d notice the difference somewhat more.
Microsoft got it right with Windows XP. I’ll stick with XP for a few years, and afterwards, when it’s time to move on, I may go over to Linux OR Zeta/Haiku.
One thing is for sure: I’m not impressed by Vista right now (it doesn’t have anything I really need), and it will be another 3 to 5 years before I consider changing OSes.
I’m using the Ultimate RTM on an Acer TMC302XMiG tablet: 1.6 Centrino, 1GB DDR400, ATA100 60GB 5200rpm drive, Intel 855GME onboard graphics w/64MB shared RAM.
It’s an old tablet, but Vista runs pretty nicely; graphics performance is better than with XP, and the same goes for the network (Broadcom NetXtreme GBit / Intel 2200BG).
Overall it’s faster than XP and has much more functionality. What bothers me are Outlook 2k7 and supposedly small IM apps like Windows Live Messenger: Outlook takes a lot longer to open an email (Word preview, pff), and WLM has some performance issues too; minimizing it to the tray and restoring it can sometimes take a few seconds, and it’s just a (40MB) IM app…
Gee, those numbers are pretty similar. Almost like they’re running the same code. But that couldn’t be, since they rewrote the whole thing from the ground up, right?
In the OSS world people would be up in arms over slowdowns in a major release of a program, let alone the whole OS.
We’re always trying to use less RAM, run faster, and still add more functionality to the application.
I just don’t get the Windows world…
95% of the world doesn’t get you either.
“In the OSS world people would be up in arms over slowdowns in a major release of a program, let alone the whole OS.
We’re always trying to use less RAM, run faster, and still add more functionality to the application.”
Well that’s goody for you.
But it’s not unusual for initial releases of new Windows versions to be slower than their predecessors.
NT 3.1 was slower than Win3.x on same hardware.
Win95 was slower than Win3.x on same hardware.
Same for Macs.
OSX 10.0 was slower (much slower) than OS9.
As for your “we’re always trying to use less ram, run faster, …”, Linux distros of today are bloated and use much more RAM than did distros of the 90’s. I remember Linux advocates back then crowing about how they could run Linux on a 286. Try that with any distro today.
OSes and software in general use more and more hardware power as time goes on, and that’s as it should be really. Hardware power increases, so why not use it? Or are you still hand-coding in assembler?
You say, “I just don’t get the Windows world”, well I don’t get those that complain about RAM usage. They go and buy 2GB machines, then complain if the RAM is actually used. They’d rather have all that RAM sit idle.
Well, Gnome 2.16 runs faster on my hardware than Gnome 2.12; Firefox 1.5.0.9 runs faster than Firefox 1.0.x (especially on Linux); and Gnome 2.16 uses no more RAM than early Gnome 2 did (and has all the fanciness one could ask for, if they want it).
RAM is not left unused with a real operating system, but there is a difference between using it as cache (as with Gnome) and simply swallowing it (as Vista does).
The system requirements for Vista are enough to stamp it as inferior, despite several architectural goodies (I applaud the use of .eml for mails).
Little remarks…
RAM is not left unused with a real operating system, but there is a difference between using it as cache (as with Gnome) and simply swallowing it (as Vista does).
Gnome is not an OS. Vista uses all the free RAM as cache as well.
The system requirements for Vista are enough to stamp it as inferior, despite several architectural goodies (I applaud the use of .eml for mails).
OS architecture != email
Little remarks…
And not particularly thought through.
Gnome is not an OS. Vista uses all the free RAM as cache as well.
Gnome + GNU tools + Linux do equal an OS. Vista uses all free RAM as cache; there is just much less free RAM, since Vista uses 12 times more RAM just in order to run. And that with the same level of functionality and fanciness as in *BSD/Linux + Gnome.
OS architecture != email
Welcome to 2007, makc. – or 1995 for that matter.
OS architecture does encompass email applications as well.
OS architecture is no longer just a question of kernel design or filesystems. A mail client and its behaviour are tightly integrated (functionality-wise, and maybe also code-wise) into an OS today. That is, if we’re talking mainstream OSes. Of course it differs if we’re talking about OSes for special purposes, like a control system for a nuclear plant. But I thought you could figure that one out on your own.
OS X 10.0 was slower than OS 9
but speed has increased significantly with every Mac OS X release after that (10.1, 10.2, 10.3, 10.4).
Don’t forget OS X is indeed a 100% different operating system from OS 9, so a new Mac era started at 10.0.
OSX 10.0 was slower (much slower) than OS9.
Yes, it was. But then 10.1 was faster (much faster) again. We all know about how bad 10.0 was.
As for your “we’re always trying to use less ram, run faster, …”, Linux distros of today are bloated and use much more RAM than did distros of the 90’s. I remember Linux advocates back then crowing about how they could run Linux on a 286. Try that with any distro today.
I don’t think anyone ever talked about running Linux on a 286; Linus originally wrote the kernel on a 386, indeed describing it as using “every conceivable feature of the 386 [he] could find”. The thing is, I’m willing to bet you can still run the latest Linux kernel on a 386, provided you don’t select any compiler options meant for newer processors.
OSes and software in general use more and more hardware power as time goes on, and that’s as it should be really. Hardware power increases, so why not use it? Or are you still hand-coding in assembler?
This is the argument often put forth by proponents of Java and .NET/Mono. The thing is, as hardware power increases, I want my computer to get faster. If an operation took 30 seconds on my old PC, I want it to take 15 seconds on my new one with double the horsepower.
There was an interesting article a while back showing that computer boot time has remained pretty much constant since the days of Windows 95. That’s not how it should be.
Of course memory requirements are going to go up over time — it’s only natural that a new feature will use some extra memory. But that doesn’t mean that programmers should be able to just shrug and say “well they’re going to have 1GB anyway, what does it matter how much I use?”. Programmers should always strive to use the fewest resources that are necessary to achieve what they want to do.
You say, “I just don’t get the Windows world”, well I don’t get those that complain about RAM usage. They go and buy 2GB machines, then complain if the RAM is actually used. They’d rather have all that RAM sit idle.
No, I’d complain if my RAM went *unused*. That’s why I’m pleased to report that of the 1GB in my machine, currently just 16MB is free. The applications I have open are using about 300MB (of which over 100MB is Firefox, but that’s another story…). The rest? Well, that’s being used by the Linux kernel as an enormous cache, thus making my system faster and more responsive whenever I want to load anything (a quick way to check this for yourself is sketched below).
The new Qt4, which will be used in KDE4, reputedly uses about 30% less memory than Qt3. Gnome also went for a bit of a blitz on memory usage recently, and Ubuntu Edgy now uses less than 90MB on first login on my system (on x86, more on AMD64 obviously). I just can’t understand why Vista needs so much.
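[Editor’s note: on Linux, the free-versus-cache split mentioned above can be read directly from /proc/meminfo. A quick illustrative parse (field names as found in mainline kernels; “available” here is just the naive MemFree + Buffers + Cached sum, not a precise reclaimable figure):

def meminfo():
    # Parse /proc/meminfo into a dict of kB values.
    info = {}
    for line in open("/proc/meminfo"):
        key, rest = line.split(":", 1)
        info[key.strip()] = int(rest.split()[0])  # values are in kB
    return info

m = meminfo()
free = m["MemFree"]
cache = m.get("Buffers", 0) + m.get("Cached", 0)
print("truly free: %5d MB" % (free // 1024))
print("page cache: %5d MB" % (cache // 1024))
print("available : %5d MB" % ((free + cache) // 1024))
]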
“Yes, it [OSX 10.0] was [slower than OS9]. But then 10.1 was faster (much faster) again. We all know about how bad 10.0 was. “
I know that OSX increased its speed with releases 10.1, then 10.2, then 10.3 (though many experience slower performance with 10.4 due to Spotlight overhead). But as I said, it’s not unusual for the *initial* release of a new OS to be slower than that of its predecessor, and that’s because it’s doing more stuff. The performance penalty lessens after that with incremental updates and more powerful hardware.
“This is the argument often put forth by proponents of Java and .NET/Mono. The thing is, as hardware power increases, I want my computer to get faster. If an operation took 30 seconds on my old PC, I want it to take 15 seconds on my new one with double the horsepower.
There was an interesting article a while back showing that computer boot time has remained pretty much constant since the days of Windows 95. That’s not how it should be. “
Fine, you and many geeks have that Spartan mindset. But most people don’t care about whether software is “faster”; they want it to do more stuff (or do the same stuff with a prettier presentation and better user experience) in the same amount of time. I still preferred OSX 10.0 over OS9 despite the slower performance. Same with Win95 vis-à-vis Win3.x. I mean, we could all be running 1988 versions of DOS and Unix on today’s hardware and have instantaneous boot times, but would we want to?
More stuff like virtual desktops? Windows still can’t do that because it would then be really slow!
XP has been able to since forever. That there’s no decent GUI for it, I admit. Anyway, get a clue.
No, WinXP cannot do that. You can however install an extension (Power Pack) from Microsoft or from a third party. But the MS extension doesn’t work very well, and slows down XP (or Win2K3) a lot. Third party solutions tend to work better (if you choose the right ones).
The same goes for Win2K.
I tested at least 10 different virtual desktop programs on XP and all of them failed miserably.
Microsoft’s own VD (the power-pack) is slow, bloated, and crashes my desktop (on two different machines) at least once a day.
No thanks.
– Gilboa
I’ve been using virtual desktops on Windows since 1999 or so. You just had to ditch Explorer (it’s not as nice as e or KDE, but oh well). There are other alternatives, now, of course, keeping Explorer. It has nothing to do with speed.
OS X doesn’t have them as it comes, either. I’d be willing to bet you that most people get confused easily with them.
MollyC wrote:
“well I don’t get those that complain about RAM usage. They go and buy 2GB machines, then complain if the RAM is actually used. They’d rather have all that RAM sit idle.”
Likely those who complain are those who, like me, use that RAM for applications. That said, you are probably representative of most computer users today. If you use your computer primarily to surf the web, email, chat, listen to music, and play games (unless you are a hardcore gamer), then you probably have a lot of RAM and CPU power to spare. In that case, why not spend it on eye candy such as that in Vista? I totally agree.
However, if you use most of your computer’s resources on a regular basis (I do a lot of 3D modeling/rendering and encoding/compression), then you want the operating system to use as little as possible of your system resources so that you have as much as possible available for your applications. Since I usually run a number of resource-hungry programs simultaneously, I’ve never really come across the ‘problem’ of having too much unused RAM. The less resources the operating system uses while still performing its duties, the more is available for what I actually want the computer to do.
Due to this, Vista just isn’t an attractive upgrade path for me. Again, though, most people don’t use even half of their computer’s resources, and for them, Vista is basically a better-looking Windows.
For me, a Win2003 workstation version would be a lot more attractive as an upgrade, assuming it was priced accordingly.
Likely those who complain are those who, like me, use that RAM for applications.
However, if you use most of your computer’s resources on a regular basis (I do a lot of 3D modeling/rendering and encoding/compression), then you want the operating system to use as little as possible of your system resources so that you have as much as possible available for your applications. Since I usually run a number of resource-hungry programs simultaneously, I’ve never really come across the ‘problem’ of having too much unused RAM. The less resources the operating system uses while still performing its duties, the more is available for what I actually want the computer to do.
Well said – you’re a frakkin’ mind reader
For me, a Win2003 workstation version would be a lot more attractive as an upgrade, assuming it was priced accordingly.
That would be a nice solution, but one needs an MSDN AA deal in order to run Win2K3 Server (or an obscenely large amount of money floating around with no obvious use). But if you can get that, Win2K3 (with tweaks) works as a great desktop OS.
I remember Linux advocates back then crowing about how they could run Linux on a 286. Try that with any distro today.
No, you don’t. You could never run Linux on a 286. Linux has always been a 32-bit protected-mode OS. While the 286 had protected mode, it was 16-bit only, so a 386 is the minimum requirement for Linux. (That’s also why you still see Linux packages/applications marked i386.)
Minix will run on any PC from XT-class up (at least Minix v1 and v2; I don’t know about v3). ELKS is based on a partial port (“subset”) of Linux to XT- and 286-class CPUs, but it is not “Linux”.
As always, people don’t let the facts get in the way of a good (?) argument.
First, Linux never officially supported the 80286. Linux -does- run on 80386s. Last time I tried (~2 years ago), Slackware worked just fine on an 8MB i386DX machine. Same goes for DSL.
Second, the main problem with Vista is not the 2GB base requirement. As you said, people can buy ~1.5GB of RAM and get on with it. The problem is what Vista (or should I say Microsoft) chose to do with the extra 1.5GB of RAM.
Vista (and I’m playing with the RTM @work) is just more-of-the-same. (+DRM +very annoying UAC.)
Yes, you get cool effects, but at least 50% of these effects can be generated by WindowBlinds/ObjectDesktop.
Yes, Vista works much better on NUMA-enabled 64-bit systems, but it suffers from a lack of drivers/software and still scales badly on 4+ cores.
Why on earth would I upgrade my aging (?) Sempron64/2800/1GB/XP machine to Vista, if the only thing I get in return is UAC and DRM?
– Gilboa
In the OSS world people would be up in arms over slowdowns in a major release of a program, let alone the whole OS.
We’re always trying to use less RAM, run faster, and still add more functionality to the application.
I just don’t get the Windows world…
Well, that depends on what you’re running, doesn’t it? For example, the first time I got KDE up and running was KDE version 1 on a Pentium 1/233 w/64MB of RAM.
What kind of hardware would you need to run KDE 3.5 with all the visual bells & whistles turned on, with a 3D desktop? I’m not saying it would be heavier than Vista, but I reject the notion that OSS desktop environments get faster and use less RAM with every release.
In the case of KDE, maybe they’ve sped it up some since v3.0 (or so I’m told), but those are only incremental releases, not major milestones. You would really have to compare KDE v3 vs. v4 (when it is released) for a fair comparison.
Well, that depends on what you’re running, doesn’t it? For example, the first time I got KDE up and running was KDE version 1 on a Pentium 1/233 w/64MB of RAM.
What kind of hardware would you need to run KDE 3.5 with all the visual bells & whistles turned on, with a 3D desktop?
PCLinuxOS will do all that and more on a PII with 256MB.
I suggest you actually try an up-to-date Linux instead of trying to guess what we are using :p
Vista runs OK on a Core 2 Duo with 2GB of RAM… what would you expect from that?
I mean, come on, why don’t they try Vista on something like a P4 2.0 or an Athlon XP with 512MB?
On those computers XP runs fine for most tasks, but what about Vista? Would it be usable?
An Athlon XP 2200+ with 512MB runs Vista smoothly.
But just the OS; you need more RAM if you want apps to run too.
And usable? It doesn’t matter whether you have a crap machine or a mainframe; only you can decide what you think is usable.
My Vista test notebook is an old AMD XP 2500+ with 512MB RAM (of which 32 are assigned to VRAM, so it’s 480MB) and a 40GB HD. I cannot run Aero, of course, but the system runs smoothly, and faster than XP Pro.
Also notice that OS developers build systems mostly for the future, not the past. You wouldn’t expect to run Vista on an old 386 or 486, right? So don’t expect Vista to be at its best on old PCs.
2GB is not that uncommon now, and it will be very common in a year (partly because of Vista, but memory hunger is common behaviour).
The fact that Vista CAN run (in a degraded form, but with full system security and protection) on a 4-year-old PC is fine already; Microsoft did more than I expected. Of course, people who cannot run Vista can decide to stick with their present OS or buy a new PC. Easy.
My Vista test notebook is an old AMD XP 2500+ with 512MB RAM (of which 32 are assigned to VRAM, so it’s 480MB) and a 40GB HD. I cannot run Aero, of course, but the system runs smoothly, and faster than XP Pro.
I’m a bit amazed you call that configuration “old”.
I figure that laptop is about two to three years… old?
Computers… sad how they get outdated so quickly. Maybe it’s what saves the economy?
I have a Kenwood hifi system here which is 15 years old. I think I paid about €900 for all of it then. I highly doubt that there is anything on the market right now that really beats its quality. Actually, I see a lot of crap too, in the same price league. For some funny reason the audio world didn’t evolve to the extent that I feel I have to get a new one anytime soon. Nothing has ever needed any repairing either.
Computers though… league of its own.
I’m a bit amazed you call that configuration “old”.
I figure that laptop is about two to three years… old?
It’s at least 4 y/o 😉 And yes, that’s ages in the computer industry.
Computers… sad how they get outdated so quickly. Maybe it’s what saves the economy? I have a Kenwood hifi system here which is 15 years old. […]
Computers though… league of its own.
I agree. Fact is that computers, unlike other electronics, change fast, and in my opinion the usual PC lifetime is about 1.5 years. Don’t get me wrong: I can appreciate things that just work!
Talking about hi-fi systems, I bought mine in 1987-88, paid like $1,500, and it’s still working, waking me up every morning and playing my CDs and stuff. Will I change it soon? Of course not! At least not while it’s still working. As you said, computers are different beasts 😉
>>I’m a bit amazed you call that configuration “old”.
I figure that laptop is about two to three years… old?
>It’s at least 4 y/o 😉 And yes, that’s ages in the computer industry.
I guess you’re right… I have an old Athlon XP 2600+ 1.9 GHz in a desktop, which I found out is only a tiny bit slower than a P4 2.5 GHz. Rushing to the store now…
No, I found out that adding 512MB and a slightly faster HDD was more than enough to increase performance. That’s the nice thing about white boxes. The bad thing is they look hideous.
(…)
>Fact is that computers, unlike other electronics, change fast, and in my opinion the usual PC lifetime is about 1.5 years. Don’t get me wrong: I can appreciate things that just work!
Me too. Although I do think that the speed-development curve may be bending down a bit. Sure, GBs become TBs, more cores and still more GHz, and 1GB of memory is sort of a standard. But things do change.
First of all, in a few decades energy (read: oil) will be incredibly more expensive than even now (Chinese buying lots of cars, Peak Oil, you know the “drill” ~ was that pun intended?).
Even many North-Americans, once famous for their great car sizes and beastly engines, are now consciously driving efficient Toyotas instead of Chevys. The production itself of semiconductors, chips, etc. gets more expensive when energy gets more expensive.
So I figure the effort will go, more than ever, into getting the same performance at tiny amounts of watts, and into walking the fewer-nanometers production alley. Laptop HDDs may become more common, or maybe even flash memory for reading the OS, with the HDD only for storage and /tmp files. Cooling may increasingly be done passively, and things like the “Centrino platform” will be used on desktops, everywhere.
Even video cards might “improve”. All this will become a necessity if people discover their computer use will cost them hundreds of €/$/.. a year just to run them.
>Talking about hi-fi systems, I bought mine in 1987-88, paid like $1,500, and it’s still working, waking me up every morning and playing my CDs and stuff. Will I change it soon? Of course not! At least not while it’s still working. As you said, computers are different beasts 😉
And if you calculate it over the years, it cost you almost nothing too.
The big difference is also quality. It’s amazing to see the bad quality of the housing and even the assembly, and the traditional total lack of creativity, given the incredible high tech that computers are actually made of inside. Just look at the beauty of even low-end motherboards.
With the exception of some nice barebones and some of the stuff Apple makes (but even then, lots of plastic).
Sure, why care about making computers look nice and assembling them carefully with good materials such as aluminum and titanium, so these machines last for ages, if you want people to ditch their PC/laptop after a few years because of all the faster hardware?
People spend, what, €500 on a new laptop (some people call that an “el cheapo” here in Holland), not realising the total crap they buy. The keyboard sucks very badly, the exterior will look like total crap within months, and if there’s a very hot CPU inside, which the plastic can’t cool, the fan will whoosh your brains out.
That’s the real freedom that our societies offer: the freedom to consume crap.
On the other hand, the hi-fi audio stuff we may both be looking at will still look great after a decade.
My lifetime for PCs is 5 years. I refuse to upgrade constantly just because of sloppy “commercial quality” coders.
The problem is that companies are “forced” to maximise profit constantly, leading to fast development of application solutions rather than to development of fast application solutions. (See the small, yet huge difference?)
Hi-fi systems are different, since people tend to know better, somehow. Probably the lack of blinking GUI things. That said, a real hi-fi system is a DIY project.
5 years? I *just* ‘upgraded’ from a Pentium III 600MHz Coppermine Slot 1 tower to a 2.8GHz Pentium 4, and I only did that because I had enough spare parts from junk machines people gave me to build it without buying any hardware.
I’d still be happily using my Pentium III if I hadn’t gotten that one last dead computer…
Now my grandmother is using that tower and loves it.
I might reply to myself that “commercial quality” code can also be found in FLOSS: OpenOffice, Eclipse, MonoDevelop and Azureus are some examples.
Vista is just like other versions of Windows. It usually isn’t stable or production ready until service pack 2. I wouldn’t use it to do work until that comes out.
The major sources of added latency that I found using the RTM versions were in the MS-bundled utilities.
Third-party software was not really noticeably slower, but, for example, browsing with Explorer takes ages; all the on-the-fly thumbnail/metatag/file-detail parsing really slows things down. Also, any file copy or zip extraction process was ridiculously slow (around 250 bytes/second!!). 7-Zip on Vista, however, is just as fast as on XP.
System: Athlon XP ~2GHz, 1GB DDR400 RAM, Vista RTM (currently XP Pro), HDD throughput scored 4 on the Experience Index, ATI 9600XT (AGP) graphics.
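[Editor’s note: the Explorer-versus-7-Zip gap above can be quantified with a neutral baseline that keeps both tools out of the loop. A small illustrative sketch using Python’s zipfile module (“test.zip” and “out” are placeholder names; use any large-ish archive you have); timing the same archive under XP and Vista separates the OS/filesystem contribution from the shell’s own overhead:

import time
import zipfile

def time_extract(archive, dest):
    # Time extraction of an existing archive to a destination directory.
    start = time.time()
    z = zipfile.ZipFile(archive)
    z.extractall(dest)  # extractall() is available since Python 2.6
    z.close()
    return time.time() - start

print("extract: %.1f s" % time_extract("test.zip", "out"))
]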
Why the hell do they use synthetic benchmarks to measure OS speed? Do they really expect these numbers to be different? They actually benchmark their test rigs, not the OSes. OS performance is in I/O speed, boot speed, multitasking performance, relative memory usage (relative, not absolute; absolute usage is no longer a meaningful measure) and UI “snappiness”. In my experience spanning multiple modern machines, Vista performs better than XP in these areas (though of course not in synthetic speeds).
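[Editor’s note: for the I/O part of that list, even timing a burst of small-file creates and deletes says more about OS overhead than a CPU-bound encode does. An illustrative sketch (file count and payload size are arbitrary assumptions); run it on the same disk under each OS:

import os
import time

N = 2000                 # number of small files (arbitrary)
PAYLOAD = b"x" * 4096    # 4 KB each (arbitrary)

def small_file_churn(directory="."):
    # Create, write and delete N small files; return elapsed seconds.
    start = time.time()
    for i in range(N):
        path = os.path.join(directory, "bench_%d.tmp" % i)
        f = open(path, "wb")
        f.write(PAYLOAD)
        f.close()
    for i in range(N):
        os.remove(os.path.join(directory, "bench_%d.tmp" % i))
    return time.time() - start

print("small-file churn: %.2f s" % small_file_churn())
]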
I partly agree. Anyway, there were rumors about Vista slowing down your software’s performance by a significant delta. That especially scared gamers, who were listening to people predicting a quite large impact of new Vista features on games.
Synthetic benchmarks are partial and will not reflect real-world behaviour. However, this test shows that Vista has essentially no impact on application performance.
(Notice that when people say “It performs the same, so why bother?” they forget Vista is performing the same while implementing much more than XP under the hood. That actually is a performance improvement.)
Really, shouldn’t you be testing Vista against 64-bit Win XP?
Isn’t Vista a 64-bit OS?
Try Vista vs. XP on a 2.2GHz P4 with 512MB RAM and a 20GB HD with a 32MB ATI AGP Rage Pro 128.
XP wins by miles.
Maybe a better lead for the coming Vista media blitz should be, “Vista, only a little slower than XP!” Or “Vista, looks fast even when it’s standing still or locked up, as the case may be!”
On my old desktop PC (Athlon 2500+, 1GB RAM, 80GB HD, AGP Nvidia 6900GS card for gaming) it is quite fast. Booting and hibernation especially are fast. Hibernation works really well, certainly much better than under XP.
On my notebook (2GHz Core Duo, 1GB) it is very fast, even though it has only an Intel graphics card. A very nice touch is that the glass effects are disabled when you disconnect the power, to save battery life. Battery life is almost identical to XP, maybe slightly better.
The free memory is always quite low, since there are all kinds of prefetching and caching mechanisms in the background. But that is the same on Linux. I have not tried running it with 512MB, but with 1GB it works just fine, even when running large applications such as multiple instances of Visual Studio.
I am running Slackware 11.0 here on a Pentium III 450 MHz with 512 MB of RAM and it is very fast, even with the “nv” open-source driver. The only thing that’s faster is probably LFS/Gentoo/Arch or some other source-based distribution.
On this same computer I had installed Solaris Express Build 54, and while it wasn’t slow, it was still not as fast. Before that it ran SUSE 10.1, and that was a lot slower.
On my laptop I multiboot OpenBSD 4.0, SUSE 10.2 and Slackware 11.0; OpenBSD is pretty fast, but Slackware is still faster. I could make OpenBSD faster by recompiling everything for the processor, though.
This is what I am talking about when I mention speed, but it looks like Vista doesn’t come close by a long shot.
What it comes down to is that Linux/BSD/Solaris become faster with every release (on the same old hardware) while Windows does the opposite.
I am running Slackware 11.0 here on a Pentium III 450 MHz with 512 MB of RAM and it is very fast, even with the “nv” open-source driver. The only thing that’s faster is probably LFS/Gentoo/Arch or some other source-based distribution.
Theoretically, you might turn Gentoo into a slightly faster system than Slackware (though I doubt that a little), but in practice Gentoo isn’t much fun on older hardware because of compiling times.
I didn’t know Arch was source-based (which you seem to imply); I thought it was binary but still fast because of all the optimization for i686 systems?
I used to be able to play DVDs (barely) with Win2K on a 500MHz machine with 256MB; now my 1.6GHz machine with 768MB struggles with them on XP if I don’t turn off and optimize every little thing. I can’t wait to see Vista struggle with Blu-ray. I wish my Linux install could play most DVDs.
I wish my Linux install could play most DVDs.
FYI, there has never been a single DVD I could not play well with Linux, on any distribution I tried. I use MPlayer, Totem or Xine (not the crippled ones from Fedora but the “originals” with all the plugins and codecs, which are in additional repositories such as Livna). Perfectly legal; not entirely free software, I suppose, but it always works.
I tried setting up the codecs for DVDs a few times. Almost all worked, but I have a few that didn’t. However, your understanding of the legalities involved is flawed; those are legal grey areas. It is not technically legal, but no court will convict you as long as it’s fair use.
I just reread your post: you didn’t hand-install the codecs, you got them with the player. I’ll have to try that.
I tried setting up the codecs for DVDs a few times. Almost all worked, but I have a few that didn’t.
I don’t have enough DVDs to safely say “all DVDs work” on my system, but I haven’t had problems.
your understanding of the legalities involved is flawed; those are legal grey areas. It is not technically legal, but no court will convict you as long as it’s fair use.
I guess you’re right. I should have used a different word than “legal”. Where I live, smoking marijuana is “illegal”, but you can say hi to the police in the streets while smoking a huge joint.
I just reread your post: you didn’t hand-install the codecs, you got them with the player. I’ll have to try that.
I do think I installed the well-known DVD codec separately (it’s been a while, though), but even then it seems that some players are better at playing certain discs than others. I had a Truman Show DVD that MPlayer handled well and Totem/Xine did not.
It’s all a bit fuzzy with all these players and codecs, but what it comes down to with many popular distributions is to uninstall the crippled version that is included, and to install the “real thing” straight from the authors.
I only mentioned Arch Linux as an example of something that might be a little bit faster, but there is a little bit of truth to its source-based nature; quoting the home page:
“Arch Linux uses the Pacman package manager, which couples a simple binary package format with an easy-to-use build system, allowing the users to easily manage and customize their packages, whether they be official Arch packages or the user’s own homegrown ones.”
Eventually everything is source-based; it depends on how easy it is to recompile everything. Slackware also makes it very easy through its lack of hard dependencies, e.g. see Slackintosh, Armedslack, Slack390, Splack and Alphaslack.
And I don’t think Gentoo is fun at all, because of the many ways to break it. That’s why I recompiled Slackware for Athlon: to have a very stable base with package management and still the performance of LFS/Gentoo, the way BSD does it.
Gentoo might be fun with quad-core 2-3 GHz CPUs and background compilation. That’s probably when I’ll try it again, if only to try to get a working system.