Just weeks after releasing Windows Vista Beta 1, Microsoft has shifted our paradigms again, unveiling a preview of beta 2 at the TechEd 2005 developer conference. Also, “hardware vendors are going to love the news that Windows Vista is going to need very beefy hardware to run well. At Microsoft’s TechEd conference, Dan Warne finally managed to squeeze blood from a stone – or rather, answers about Longhorn’s hardware requirements from Microsoft.”
It may be a while before Vista's new DRM joy becomes common enough for the media producers to safely ignore older OSes.
Yay for backwards compatibility!
1. Users don't change their computers because of a new OS, not in 2005. Most corporate users will keep Windows 2000 till hell freezes over; it's just "enough". Same for Office. So where's Microsoft's revenue?
2. Hurt DVD decrypters and help Hollywood? Well, they should make better films instead, but hey! Microsoft makes an OS that fucks its users so they cannot make copies of DVDs they rent? Alright, what's next, a Windows Vista that can't be pirated? I like Linux and FreeBSD, so that sounds like great news to me. Way to go.
Next year will be very interesting, with Mactel and Vista; maybe we'll see unexpected things in the OS area.
As I’ve said before, it isn’t MS that is the source of the DRM problem. That is the domain of the content producers/marketers. You want DRM to go away, stop consuming the crippled crap that gets shoveled at you.
> As I've said before, it isn't MS that is the source of the DRM problem. That is the domain of the content producers/marketers. You want DRM to go away, stop consuming the crippled crap that gets shoveled at you.
While I generally agree, Microsoft is a partner of the 'content producers/marketers' in that MS benefits from implementing DRM, as it allows them to differentiate Windows Vista from previous versions of Windows and non-Microsoft operating systems. Apple sees this, and is also jumping on board.
If both dropped DRM plans, things would be different.
The thing is, people have a great tolerance for poor quality, especially if high quality is much more expensive. Widescreen flat-panel TVs and surround sound have been on the market for years, but the majority of people still buy 36-inch (or smaller) TVs and $100-$200 stereos with ordinary stereo sound.
Look at it this way. If "disabled mode" is good enough, people will live with it, and DRM hardware won't get any more traction than IBM's MCA "standard" did - it'll lose out to cheaper hardware. If "disabled mode" isn't good enough, people will balk the same way they did with region coding. Content producers with poor "disabled mode" content will lose out to ones with good "disabled mode" content, and DRM will fail like region coding failed.
MS dropping integrated DRM would make little impact in the long run. The media execs would still have the ****ed up notion that treating their paying customers like thieves is a good method to raise sales.
If nobody does it, maybe Microsoft will buy a clue and stop releasing overly bloated OSs that do nothing but bother you while you’re trying to work.
It's 6am here, and I can't get Windows to stop BEEPING at me. I don't even have sound drivers installed! I have the volume control turned all the way down. And it's still BEEPING at me!
> If nobody does it, maybe Microsoft will buy a clue and stop releasing overly bloated OSs that do nothing but bother you while you're trying to work.
You must still be dreaming.
> It's 6am here, and I can't get Windows to stop BEEPING at me. I don't even have sound drivers installed! I have the volume control turned all the way down. And it's still BEEPING at me!
Better just go back to bed for a while.
You can cut the pc speaker wires inside your computer if you want to stop the beeping.
My sound card came with instructions on how to cut the wires to the speaker and solder them to the sound card for volume control (roughly 4 years ago). I could just imagine some unknowing consumer buying the card and thinking a computer upgrade required a soldering iron…
Lol, turn it off in your BIOS, newbie.
Or just disable it in the Device Manager. I believe you must show hidden devices to make it visible.
> Lol, turn it off in your BIOS, newbie.
Yeah right, it's the speaker's fault!
Dumbass.
Someone suggested cutting the speaker wires… I say either simply unplug it, or if you're a DIY person, put it on a switch.
I just can't help but smile all the way… 2GB RAM, 256MB video cards, dual cores, SATA2… omg, all that to check my email!
Just what I was saying to a friend: I'm not going to buy a supercomputer just for a desktop environment to run on. I use FreeBSD 6 with KDE now, and that's bloated enough for my taste. I don't need it to suck up more of my hardware, because the hardware is meant for the occasional gameplay, not a flashier icon for my trashcan. The difference between MS and BSD was very clear when I had both Windows 2000 and FreeBSD 5 running in dual boot: BSD was quicker in use.
And all so you can get, mostly, the same neato animations you can get on a tenth the hardware with e17 or maybe (if it turns out good) KDE 4….
I'm actually a little bit annoyed to see the direction the GTK people have gone with Cairo. If they can make it quick enough that you can barely tell it's doing more work, that's fine, but I don't want to need a gfx card to have good 2D graphics. I'm happy to buy a cheap gfx card for all my desktops, but my laptop is sort of frozen where it is!
Yeah, it's ridiculous… Just build yourself a render farm to play simple games. Like the bozo said, RAM's cheap anyways, it's no issue.
What an exciting feature list. Unless you use Linux, in which case you already have most of that stuff. Nice colours, but…
"The hardware vendors all know about it but aren't yet making monitors with it built in, so now it's up to you [the users] to say, 'where's my HDCP?'"
LOL. Yes!!! Give us DRM. Please!!!
I don’t know who this feature is for. Is it for the content providers? Do they want a system where they can offer HD content that only 1% of the target audience can view? Hooray, the world of HD media is upon us! Quick, everyone go out and buy monitors that don’t exist, 2GB of something called DDR3, a $300+ video card and a brand new hard drive, so that when Windows Vista comes out, you can pay an exorbitant amount of money to download a movie that you can’t do anything with but play it on your computer monitor!!
No one has enough money for this. This is fascist, elitist, and anticompetitive. I posted a week or two ago about how DRM isn’t evil, it’s abusive DRM policies that are evil. Well, this is an abusive DRM policy, and it is pure evil.
There is nothing technically preventing someone with a 1920×1080 widescreen LCD from downloading an HD video in H.264 and playing it in full HD with mplayer, but it is not to be. Although, I did have a good time reading the code here:
http://mplayerhq.hu/cgi-bin/cvsweb.cgi/~checkout~ffmpeg/libavcodec/…
It’s amazing that in spite of the effort the free software community puts into developing open implementations of open standards, commercial interests want to somehow render them useless. We can control how people use proprietary software, but not open source software, they say. When the commercial interests lose control of their target market’s usage patterns, they get angry. Why doesn’t MS listen to us when we say that we get angry when we lose control of the media we buy? The media consumers buy way more MS products than do the media producers. Why is MS listening to them?
DRM isn’t evil but abusing it is? Who has used it without abusing it? TMK, every single music store reserves the right to change your DRM rights…
If DRM is OK but abusing it is evil, consider: corrupted people abuse things, DRM is a form of power, and power corrupts. So how can you have DRM without evil forming behind it to abuse it?
What company could resist the temptation to break the knees out from under a new technology? They’ve been trying to do it for 100 years or more! FM radio, CD writables, cassette recording, VHS recording, etc etc etc…
I don’t see how DRM could be used without treating your customers like thieves…
Maybe it could be used for free time-expiring samples?
Does anyone here know why exactly the new Windows will need so much memory? Is it because of .NET, or the DRM, or something else?
Aero or Avalon or whatever the new graphics system is called accounts for increased demands for video memory. Instead of painting directly to the screen buffer when asked to do so, applications will now paint their windows into off-screen buffers, with the graphics system combining them on screen. At 1280×1024 in true color a buffer for only one full-screen window eats up 4MB.
But even so, 256MB of graphics memory seems a bit over the top. And God knows what those 2GB of main memory are for.
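To put rough numbers on that backing-store cost (back-of-the-envelope arithmetic only, nothing from Microsoft; the 4MB figure above assumes 24-bit color, i.e. 3 bytes per pixel - with 32-bit buffers it's closer to 5MB):

#include <stdio.h>

/* Back-of-the-envelope backing-store cost for composited windows.
 * Assumes one off-screen buffer per window; real compositors may
 * double-buffer, compress, or page buffers out of VRAM. */
int main(void)
{
    const double mib = 1024.0 * 1024.0;
    double w = 1280, h = 1024;

    printf("24-bit full-screen buffer: %.2f MiB\n", w * h * 3 / mib);
    printf("32-bit full-screen buffer: %.2f MiB\n", w * h * 4 / mib);
    printf("Ten 32-bit full-screen windows: %.1f MiB\n", 10 * w * h * 4 / mib);
    return 0;
}

So a handful of full-screen windows can eat tens of megabytes of VRAM before textures or glyph caches are even counted, which is presumably where the fatter video-card recommendations come from.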
> But even so, 256MB of graphics memory seems a bit over the top. And God knows what those 2GB of main memory are for.
It's shorthand; they don't know what will be optimal.
* 256MB should do the trick for flashy video; 128MB is common, 128 * 2 = 256.
* 2GB main memory; If you need 512MB now, and Vista will have heavier requirements (2x), and 64bit is 2x32bit, you end up with (2 * 512) * 2 = 2GB.
I think they are right in the long run, though I would hope that they are doing some serious optimizations. The Linux kernel folks love scraping out another 4K here and there.
No, it's pretty much because MS keeps piling features on the OS without addressing low-level problems, particularly (in this case) with regards to managing heap memory. I had an assignment a couple of years ago where I needed to implement my own malloc package that runs at least 150% as fast as Linux's implementation for allocating memory. The challenge is to efficiently find not only a chunk of unallocated memory that is at least as big as the requested allocation size, but the chunk that leaves the heap as defragmented as possible. If you just allocate the first available chunk, the heap quickly becomes full of free memory chunks that are too small for the requested allocations. It seems as though a process is taking up a whole slew of memory, but really a smaller amount of data is spread sparsely over a whole slew of memory.
The application layer is also full of culprits, for example the desktop search feature which indexes your data files. It’s either storing the cache in main memory, or it’s paging it out onto the disk. Windows doesn’t really have a swap filesystem, so paging is inefficient on Windows. It would probably prefer to keep the large cache in memory if at all possible.
One thing is for sure, it's not .NET, because I don't think there are really any .NET applications shipping with Vista. I could be wrong, but at least there is no focus on ".NET" in Vista. I don't think that's been the case since back when Longhorn was supposed to feature subscription-based application services running on remote servers.
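On the fragmentation point above, the policy trade-off looks roughly like this (a toy sketch of the classic first-fit vs. best-fit free-list policies; it says nothing about what Windows' heap actually does):

#include <stdio.h>
#include <stddef.h>

/* Toy free list; a real allocator keeps these headers inside the heap
 * itself. Purely illustrative -- not how Windows' heap works. */
struct chunk { size_t size; struct chunk *next; };

/* First fit: take the first chunk that's big enough. Fast, but it
 * splinters the front of the list into tiny unusable fragments. */
struct chunk *first_fit(struct chunk *c, size_t want)
{
    for (; c; c = c->next)
        if (c->size >= want) return c;
    return NULL;
}

/* Best fit: scan everything for the smallest chunk that still fits,
 * preserving big chunks for big future requests -- slower per call,
 * but the heap stays less fragmented. */
struct chunk *best_fit(struct chunk *c, size_t want)
{
    struct chunk *best = NULL;
    for (; c; c = c->next)
        if (c->size >= want && (!best || c->size < best->size)) best = c;
    return best;
}

int main(void)
{
    struct chunk c3 = { 4096, NULL }, c2 = { 96, &c3 }, c1 = { 512, &c2 };
    /* For a 64-byte request, first fit grabs the 512-byte chunk and
     * leaves a 448-byte remainder; best fit picks the 96-byte chunk. */
    printf("first fit: %zu bytes\n", first_fit(&c1, 64)->size);
    printf("best  fit: %zu bytes\n", best_fit(&c1, 64)->size);
    return 0;
}

Best fit is exactly the "leave the heap defragmented" property the assignment was after, at the cost of scanning the whole list.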
Good lord… is Windows memory allocation really that far behind? I can't imagine they're still using the same old algorithm they were back in the early multiprocessing days. They've got some pretty intelligent designers; I have trouble believing they're using the 'first available chunk' approach.
I know one thing I can't stand is how Windows paging works. It always takes over the system. Obviously I know very little about what they're doing under the hood, but it seems like page faults are a killer on an otherwise fast machine in the Windows world.
It's definitely not because of .NET.
You can run .NET 2.0 on 1.0GHz computers and have no problems.
I would say it's all because of Windows "services" and the new GUI it has.
> I would say it's all because of Windows "services" and the new GUI it has.
More services and features means more complexity.
More complexity means a nearly exponential increase in the number of possible interactions.
All of that needs to be secured.
To secure any system or to run it in a stable manner, it is necessary to reduce complexity before checking for security holes…otherwise you drive yourself nuts.
XP is exceedingly difficult to secure. It can be done, but it is not at all easy to do. Vista looks like it will require an even higher level of commitment from concerned admins to handle it properly.
Expect those concerned and capable admins to be even fewer in number.
Expect that there will be substantial security issues that will take about 2 years post Vista release to resolve if — and only if — Microsoft takes security seriously this time around.
Microsoft is at risk of being crushed by the features they promote; they are not big enough or well organized enough to handle all the secondary issues. No single company or group is.
So basically… what they are saying is… to watch hi-def content you'll need to use THEIR operating system, with equipment from THEIR partners. Sounds very unfair to me. 🙂
“If you move from 32 to 64 bit, you basically need to at least double your memory. 2 gigs in 64 bit is the equivalent of a gig of RAM on a 32bit machine. That’s because you’re dealing with chunks that are twice the size… if you try to make do with what you’ve got you’ll see less performance. But RAM is now so cheap, it’s hardly an issue.”
Sounds like a "strategist" at Microsoft is simply someone who's got no idea how things actually work.
While it’s true that pointers and registers double in size, most data stays the same size and code only grows by about 10% with x86-64.
Are physical pages twice as big in x86-64? This would mean that there are half as many pages in x86-64 as in x86 for the same amount of physical memory. Therefore, this guy’s statement *could* be true, but only in the absolute worst case where each process requests one page of memory and allocates half of it, on average.
64-bit memory paging requires a totally different strategy to handle.
A bare-minimum 32-bit table with 4KB per page is gonna run you about 4 megs of memory.
A 64-bit table using the same strategy is gonna be over 30 million gigabytes.
I can't remember the math offhand, but I remember it being covered in Andrew Tanenbaum's Modern Operating Systems.
You're not simply doubling the address space by going from 32 bits to 64 bits, you're increasing it by orders of magnitude.
2^32 = 4000000000000
2^64 = 10000000000000000000000000
Doh… I'm retarded. I had my calculator in hex when I did the 2^32 and 2^64… thought those numbers were coming out a bit too even.
Need more coffee…
I’m not up on the specifics but amd64 has at least two more levels of page tables than ia32.
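The arithmetic behind those figures, for anyone who wants to check it (a sketch of the hypothetical single-level "same strategy" table only; real amd64 chips decode 48 bits of virtual address and use four-level tables, versus two levels on ia32, consistent with the "two more levels" point above):

#include <stdio.h>

/* Size of a hypothetical flat (single-level) page table with 4KB
 * pages -- the "same strategy" comparison above, and the reason
 * 64-bit CPUs use multi-level tables instead. */
int main(void)
{
    /* 32-bit: 2^32 / 2^12 = 2^20 pages, 4-byte entries -> 4 MB. */
    unsigned long long mb32 = ((1ULL << 20) * 4) >> 20;
    printf("32-bit flat table: %llu MB\n", mb32);

    /* 64-bit: 2^64 / 2^12 = 2^52 pages, 8-byte entries -> 2^55 bytes,
     * i.e. 2^25 GB: about 33.5 million gigabytes. */
    unsigned long long gb64 = 1ULL << (52 + 3 - 30);
    printf("64-bit flat table: %llu GB (~%.1f million GB)\n",
           gb64, gb64 / 1e6);
    return 0;
}

That lands right on the "over 30 million gigabytes" figure quoted above.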
> Sounds like a "strategist" at Microsoft is simply someone who's got no idea how things actually work.
He doesn’t have to. If it is simple math for the uninformed (see my post a couple up with 2 * 512 * 2), it is less confusing. The point is that eventually people will have that kind of hardware. It is not at all unreasonable that by the time Vista settles down in 3 years, 2GB RAM and 256MB video will be common.
> While it's true that pointers and registers double in size, most data stays the same size and code only grows by about 10% with x86-64.
Thanks for the details. I did a similar calculation back in the transition from 16-32bits. At that time, it was not as efficient, though we ended up changing some habits and corrected the problems. Now, compilers handle most of the dirty details.
> While it's true that pointers and registers double in size, most data stays the same size and code only grows by about 10% with x86-64.
Are there any results on this?
In favor of more memory:
– All fixed address operations will require 32 more bits.
– 8-byte memory alignments eat some memory.
In favor of less memory:
– More registers will reduce the need for register spilling.
– 64-bit calculations require fewer instructions.
So: does anyone have any data?
In regards to whether 64-bit code is about 10% larger than 32-bit code… well, let's look at a major package for both architectures. This is for Fedora Core 4 Core RPMs.
gimp-2.2.7-4.i386.rpm
RPM size = 9885K
content size = 25M
gimp-2.2 binary = 3244464
gimp-2.2.7-4.x86_64.rpm
RPM size = 9998K
content size = 25.9M
gimp-2.2 binary = 3295472
difference in RPMs = 1.1%
difference in contents = 3.6%
difference in executables = 1.6%
I would say that 10% is being generous. Most programs will probably see very little difference. A glance at many different RPMs for 386 and x86_64 showed no major differences in size. I’m not saying 64bit will NEVER be considerably bigger, just that MOST programs won’t be.
So that RAM prices will come down. Otherwise no one will be able to afford the upgrade!
The idea is that we don’t _NEED_ the upgrade.
Hey, we were all saying that Apple was a memory eater with Mac OS X needing 512MB minimum, and now MS asks us to have 2GB of memory??? And a 256MB graphics card??? What the hell are they thinking!
My next computer will be a Linux or Mac OS X one (more probably Mac OS X, can't help it) – probably with the 256MB graphics card and the 2GB of RAM, but for the apps :p
Ok, I’ll tell you what to do if you want to record this HD content.
1: get yourself a dual processor PPC PowerMac now, before the Intel DRM chips arrive.
2: get a few EyeTV 500's (before the broadcast flag gets enacted) which will process/record the ClearQAM/ATSC signals/content
3: search online for the *cough* mimicking *cough* device in *cough* germany *cough* cough COUGH
4: don’t let the cable guy into your house
5: watch HD content on your 2560 x 1600 30″ Apple Display, or simply record the screen with an HD cam if all else fails.
A lot of the stuff mentioned in the article Apple has already deployed; the big question is whether Apple has monitor DRM already in the new metal monitors. I assume they do and are just waiting for the DRM Intel chips to complete the chain.
Maybe this is why their monitors are more expensive, beyond just the name and the pretty looks/quality.
I have unencrypted HD on my high-res display, and I'm going to tell you right now you're going to want this.
Take a look at this guy's screenshot; if you can't see it, you've got a dumpy monitor
http://homepage.mac.com/hogfish/.Pictures/HDscreenshot22.jpg
> Take a look at this guy's screenshot; if you can't see it, you've got a dumpy monitor
Wow that looks… terrible. Is the image blown up?
I think it is; after all, the 30″ Apple Display has a 2560 x 1600 resolution and HDTV maxes out at 1920 x 1080 (1080i)
The EyeTV can show multiple sizes including actual size.
But how is it that Mac OS X.4 can accelerate most of its display (using Quartz Extreme) on a puny GeForceFX with 64MB of memory, when Vista needs 256MB of memory?
Also, I wonder about his statement that 64-bit Windows needs twice the memory as it "uses chunks that are twice the size" [paraphrased]. Last time I checked, the x86 architecture utilised variable-length instructions, and most apps (short of video processing, specialist math and enterprise DBs) were happy to work with 32-bit. 2GB of DDR3 memory seems pretty high-end for an OS whose feature set is not significantly better than Mac OS X.4 or the upcoming Linux systems with X11R7 and Beagle (don't forget that Reiser4 supports a lot of what WinFS does as well).
Updated hardware is obviously going to be necessary as computing (and particularly games) get more advanced in the near future. But this seems pretty ridiculous for what Vista offers.
Quartz 2D Extreme is still disabled by default in Tiger 10.4.2, even if Ars Technica anticipated that 10.4.1 should have already made use of it.
You have to consider that the backing stores of the windows are stored directly in the VRAM, so 256MB is not that much.
Let’s hope that Quartz 2D Extreme will be enabled by default in Mac OS X 10.4.3: more than 400 bugs are said to have been fixed.
Sorry, you’re absolutely right. Couldn’t screen images be compressed within the backing store, though? NVidia have had lossless compression at ratios of up to 4:1 since the 5900FX. As it is, an open Firefox window (984×713), pointed at OSNews.com, takes up 5.2MB. Assume that could be halved by texture compression, that allows you to have twelve windows in 32MB of VRAM.
I’ve got about 20 open now, and most of them are far less than 984×713.
It seems a little excessive. That said, I use Windows XP using the “Classic” theme, so I’m hardly MS’s target audience. For those Windows users who want Exposé now, there’s always TopDesk (http://www.otakusoftware.com/topdesk). With a bit of tweaking it works pretty well.
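A quick sanity check on the 5.2MB Firefox figure above (my own arithmetic, assuming 32-bit color; 984×713 at 4 bytes per pixel is only about 2.7MiB, so 5.2MB suggests roughly two surfaces per window, e.g. double buffering):

#include <stdio.h>

/* Per-window VRAM for the Firefox example above at 32 bits/pixel.
 * The 2x buffer count is an assumption (e.g. double buffering) made
 * to match the quoted ~5.2MB; the last line applies the suggested
 * "halved by compression" idea across twelve such windows. */
int main(void)
{
    const double mib = 1024.0 * 1024.0;
    double one = 984.0 * 713.0 * 4;

    printf("single buffer:   %.2f MiB\n", one / mib);
    printf("double buffered: %.2f MiB\n", 2 * one / mib);
    printf("12 windows, halved by compression: %.1f MiB\n",
           12 * 2 * one / 2 / mib);
    return 0;
}

Which does land right around the 32MB budget suggested above.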
According to Ars Technica’s technical article on Q2DE
(http://arstechnica.com/reviews/os/macosx-10.4.ars/14)
you should consider not only the backing stores of the windows: the rasterized bitmaps of the font glyphs and of the many GUI interface elements (which Mac OS X doesn’t render in real time vectorially) are to be stored in the VRAM, too.
I know, that’s why I capped my example at 32MB, I was still assuming the 64MB card I’d mentioned in my initial post.
Another thing, which I’d forgotten but which your link reminded me of, is the fact that Quartz Extreme uses a virtual memory system between VRAM and main memory for when VRAM fills up.
It looks like Microsoft has taken a more simplistic approach here. To a certain extent, I'd imagine that's because they're hobbled by backwards compatibility (neither Linux nor Mac OS X still tries to support software written in the mid-nineties), but I still think they could have done a better job. Frankly, from a technical standpoint, they appear to be falling behind Mac OS X and – to a lesser extent – Linux/GNU/X.Org with this latest release.
This is hardly a unique situation for Microsoft; in the past they've been able to rely on their monopoly to carry them through. However, this time around the hype around things like Linux and Firefox is alerting people to the fact that they have a choice in these matters, and Apple in particular has finally got a line-up that caters for budget users. It'll be interesting to see how the next five years shape up.
“But how is it that MacOS X.4 can accelerate most of it’s display (using Quartz Extreme) on a puny GeforceFX with 64MB of memory, when Vista needs 256MB of memory?”
Probably because you are used to crappy performance; 256MB would help Quartz Extreme as well.
> Probably because you are used to crappy performance; 256MB would help Quartz Extreme as well.
256MB video would help just about anything.
aliquis, you obviously don't know anything about Quartz Extreme, so STFU.
The reason you need more video memory is to avoid the bottleneck caused by swapping video memory out to main system RAM. If you have 10 OpenGL apps each running simultaneously in their own window, then all their textures, shaders, buffers and other bits and bobs need to occupy video RAM. Vista basically makes all your windows behave as though they are OpenGL/Direct3D apps.
It only looks like a lot for now, till it's actually released. Same way XP looked back then. Build it and they will come. People are still going to upgrade or have Vista on their spanking new machines.
The GPU needs a very high speed bi-directional bus to communicate with main memory. That has not been the case in the past, and what it means is that AGP will not be optimal.
Does this mean they’re adding PCI Express to the requirements list?
> Does this mean they're adding PCI Express to the requirements list?
Likely as a recommendation… as in, some eye candy will not work or will look poky without it. In 3 years, this will not be a big issue.
Ummm… for a release date of the end of 2006, i.e. over a year away, I'm quite sure that this hardware will be fairly common. Especially when buying a new PC, which will come with Vista pre-installed. Hardly anyone buys a boxed version of Windows; most users just get it with their new computers.
Currently you can get a new PC with close to these specs for under AUS$2000, and this will definitely come down by the end of 2006. I reckon Microsoft is right on the mark with this one.
You actually sound quite stupid.
It's incredible reading this article.
I've got the impression that WE ALL have to thank Microsoft for making this new OS, and also thank Microsoft for telling us to upgrade our recent 2005 machines that are just plain crap – a new AMD64 4000 with crap 1.5GB DDR, a crap 250GB S-ATA drive, and a crap €150 video card.
Also, I must thank Microsoft for making this possible.
Truly, I have this impression; it's something like saying, "What would we all be doing without Microsoft and its new products?"
The truth is that I don't have this new 2005 machine, just a plain AthlonXP with less than 1GB of memory, and I don't see why people in general (not everybody, just the average user) should need a better machine nowadays, if Windows XP is not even able to use all the power of my machine…
Well, if in the year 2007 you'll need that machine just to read e-mail, listen to music and do your work in Word, WHAT THE HELL WILL A SERVER MACHINE NEED!!?? I'd have to see that!!
Quit whining. Your machine is plenty beefy to run Vista. It won’t be as snappy as XP, but for checking your e-mail you’ll think it’s snappier.
The next beta will probably run very well on your machine!
You really should have held out for an NCQ drive though.
Hi,
just a few notes about that "double your RAM" issue:
1. Pages are still 4KB on x86-64 or PPC64 (unless you want to use large pages, which the OSes I know don't use by default).
2. Sizes of C datatypes: most of these datatypes still have the same size on 64-bit architectures, especially the ones most frequently used (ints, chars). The sizes of pointers and longs are now 64-bit.
I don’t know about statistics, but generally, your application shouldn’t use a lot more RAM on 64bit than it uses on 32bit (someone said something about 10% larger binaries; should be about the same then for a running program).
However, it depends on the application – those who really need 64bit boxes (which are pretty few people for now in the desktop area) actually make use of larger datatypes, so size requirements may vary. For a desktop OS like Windows, I can’t see a reason that justifies 2GB and 64bit boxes.
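For the record, here's roughly what those datatype sizes look like in practice. This assumes the LP64 model used by 64-bit Linux; 64-bit Windows actually uses LLP64, where long stays 32-bit and only pointers and long long are 64-bit:

#include <stdio.h>

/* Compile on a 32-bit and a 64-bit box and compare: under LP64 only
 * long and pointers grow; char/short/int/double are unchanged. */
int main(void)
{
    printf("char:      %zu\n", sizeof(char));      /* 1 everywhere      */
    printf("short:     %zu\n", sizeof(short));     /* 2 on both         */
    printf("int:       %zu\n", sizeof(int));       /* 4 on both         */
    printf("long:      %zu\n", sizeof(long));      /* 4 -> 8 under LP64 */
    printf("long long: %zu\n", sizeof(long long)); /* 8 on both         */
    printf("void *:    %zu\n", sizeof(void *));    /* 4 -> 8 on 64-bit  */
    printf("double:    %zu\n", sizeof(double));    /* 8 on both         */
    return 0;
}

Since most heap data is ints, chars and doubles rather than pointers, this is why overall memory use grows far less than 2x.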
I happen to work on AIX, which has some interesting characteristics relative to Windows/Linux/Mac. For example, AIX 5.3 supports variable page sizes on a per-process basis. Some applications perform better with small pages and some with large pages. Also, the AIX kernel is pageable. Parts of the kernel can be swapped out when they're not in use, and kernel pages can be distributed amongst multiple LPARs. Very neat stuff.
On a very nerdy level the AIX stuff sounds pretty damn slick.
No kidding. I mean, when you say “yea but what if you’re a big image editor” well then sure you may need more memory. But just for Windows? Maybe IE with 50 web pages plus WMP with 8 movies playing; then maybe you’ll need 2GB of RAM! But how many people have the screen real estate, and mental stability, to do all that at once?!
I wonder if the insane hardware specs will be in step with the release of Vista. All that power just for some butt-ugly translucency?
So we have waited 5 years to hear you have to buy a race-horse to do some heavy tractor-pulling?
So we have waited 5 years to see some UNIX login features from the sixties?
Hmm, I think I will get very busy migrating people's boxen to (Gentoo) Linux.
> Why is MS listening to them?
They need each other, one feeding the other's product. Both are needed to milk the cow. Now you, the end user, only have to pay and say boo, uhmm, thank you.
Everyone roasted Mac OS X when it came out for how slow it was – and 10.0 was slow. Partly, Apple hadn't had the time to optimize its code, but really a 3rd-generation display model like the one in Mac OS X or Windows Vista takes a ton of power. Today, Mac OS X runs very well on the hardware that Apple's currently shipping, which is quite a feat considering that Microsoft is recommending twice the graphics RAM of Apple's high-end systems and 8 times the graphics RAM of Apple's low-end.
I’m guessing that Microsoft, being more consumerist than Apple, will give people a way of disabling the new display engine and going back to the current GDI to make their system snappier. Apple did no such thing and just waited through the bad times – which might have been the best thing in the long run because they don’t have to support two display engines now. But Microsoft has a less loyal consumer base and a much larger one.
Eh, I’m just rambling.
http://en.wikipedia.org/wiki/Quartz_Extreme
http://en.wikipedia.org/wiki/Windows_Presentation_Foundation
Ok, go read both of those. And please read carefully and understand:
Quartz Extreme is a texture compositor.
Avalon is a vector graphics system.
Avalon is significantly slower because it’s significantly more useful…
They’re comparable only at an end user level. And then it’s a question of design ideas. Do you: Limit what applications can do for widget sizes or do you just let them do anything and make sure it still looks perfect on a low res LCD?
I *think* the next version of OS X is supposed to have much more similar functionality in Quartz. But IIRC they aren’t going to do vector graphics, they’re doing a large number of pre-rendered graphics.
That's why the Mac's is so much faster. It's not doing nearly as much.
The article is complete nonsense.
Either the microsoft strategist did not know what he was talking about or the writer did not understand.
The article: “Super-phat video cards mandatory”
Vista Beta 1 runs nicely on a prehistoric three-year-old card in my PC.
Memory: Vista Beta 1 in my experience needs about the memory of XP, plus memory for the desktop search engine.
These hardware specs in the article are not requirements nor are they necessary.
I take these “requirements” just as hints what a new pc should look like.
Don’t forget that Vista Beta 1 comes without Aero. We’ll see it only in Beta 2, so at the moment you just can’t know how slow (or fast) Aero will be
Aero will be ready when Cairo is ready, because MS is copying a lot from it… really
> Don't forget that Vista Beta 1 comes without Aero. We'll see it only in Beta 2, so at the moment you just can't know how slow (or fast) Aero will be
But then we are talking about what machine you need to run aero fast – and here the “hefty requirement” from the article is: you need a 128MB graphics card.
From my experience with WGF/Avalon: you cannot use it when the graphics hardware does not help you – just too much CPU usage.
That’s the reason why the graphics in Vista has that XP compatible interface for older hardware or people who prefer the XP/W2K look and feel.
> That's the reason why the graphics in Vista has that XP compatible interface for older hardware or people who prefer the XP/W2K look and feel.
The fact is – Why would anyone buy Vista when they can only get Windows XP in return? They may as well stick to Win2k or XP then.
Aero isn’t the only feature:
Search capabilities + corrected user environment with respect to running as a proper limited user.
You know all that sudo stuff in Ubuntu and the admin password stuff in OS X? Microsoft’s doing it in Vista….
Also, proper shutdown algorithms that do NOT allow program overrides. Supposedly, better hibernate support in other ways (not sure what that all means).
It's not so amazing for 5 years' work, but it's worth $200. Not worth $200 to me… but it's worth $200 to most people.
> The fact is – why would anyone buy Vista when they can only get Windows XP in return?
On existing computers, true. There will be few upgraders beyond the initial surge of Windows advocates and people who think they must have it.
That said, for new systems, the situation is simple;
‘Why would anyone buy XP when they can get Windows 2000?’
While there are plenty of fans of Windows 2000 that can’t stand XP, none of them can buy a new computer with Windows 2000 on it.
The same will eventually happen with XP when Vista is out there. I give it 6 months; old stock gets purged, new stock is dual-loaded (pick XP or Vista on boot), and 6 months out XP is no longer an option unless you buy a separate full copy of XP from the available stock.
So I’ll buy a super computer, give my freedom away and pay them money, and what will I get in return?
NOTHING
I wonder what the Windows advocates will say next
OK, maybe those reqs are not exactly "required" to run Windows Vista. They're sort of a "suggested config" to run it pretty well. It's nonsense nevertheless. I do have some really heavy configs in my own PCs, but judging from what I've read, in order to keep performance at my current level, I would have to almost double everything.
So I'll have to go from an A64 4000 to a 4000+ dual-corish thingo. Memory should go from my current 2 gigs to 4 and maybe more. And graphics cards? I do 3D for work; I need the card to focus on my work, not on anything else. So I would rather revert to a "classic UI" even with my 256MB card, or buy a 1GB (!!!!!!!) card in order to have everything nice and shiny. But is it worth it? I doubt it. And I'm even more shocked about the DRM thing. I really don't know where the majors are going with this. If they really want the market to collapse, they're straight on the road. No way I'm gonna change monitors/players in order to view something I should be entitled to in the first place. For what again? For the horrible movies currently out these days? Movies not even worth the pirating? Insane again.
I will wait till those super PCs have been at least 5 years on the market and have the price of a bottom-shelf Walmart box
Keep using XP or switch to Linux or BSD. I’ve been using slackware for years now because it’s MY computer, not Microsoft’s.
It's getting harder to find unbloated hardware too. I don't do games, so I figure the graphics memory to be 2K by 2K by 3, or 12MB max for my workstation. For 1600×1200, 4MB is enough at 16 bits. Any more than that I don't care about or won't pay for.
PC architecture and Windows (and Linux too) architecture are all wrong, bloating faster than the hardware can keep up.
With the technology available today we could have built a full-spec workstation in a matchbox that would be far more useful than the junk we have now.
Anyone still remember that QNX was able to fit a functional OS on a floppy?
I think a lot of the memory crunch is now due to thorough incompetence; they obviously don't teach CS the way they used to 20 years ago, when substantial usable programs could be <64K.
transputer guy
> It's getting harder to find unbloated hardware too. I don't do games, so I figure the graphics memory to be 2K by 2K by 3, or 12MB max for my workstation. For 1600×1200, 4MB is enough at 16 bits. Any more than that I don't care about or won't pay for.
Just use older PCI cards with 1-4 megs on them, then. You can have those for free these days, and as old as they are, they just don't die, and they're well supported by OSes because they've been around for so long. Or go for the cheapest/most humble integrated graphics you can find.
> PC architecture and Windows (and Linux too) architecture are all wrong, bloating faster than the hardware can keep up.
I agree with you. As the saying goes: “software expands to fill up the available hardware capacity”. So true. I think it really depends on one’s definition of an OS. To me, that’s something like:
bare metal << low-level drivers << kernel << more or less hardware independent kernel extensions (virtual filesystem, low-level input/output API’s) << libraries << middleware (high-level API’s, GUI etc.) << user-visible applications << user
To me an OS consists of the “low-level drivers << kernel << kernel extensions” part. That should be small, and become even smaller as it’s improved. Support for additional hardware/functions should follow a plug-in approach. To MSFT, I suppose an OS means ‘everything between user and bare metal’.
See for instance how OS’es are installed on harddrive: in many cases, the user goes through configuration like language, date/time, hardware options, etc., then can sometimes edit a default set of installed components. Then a whole bunch of stuff is copied, a lot of auto-hardware detection is done, a lot of auto-configuring takes place, and one or more restarts follow to complete the process.
Modular approach: find hard drive partitions, have the user pick one, copy something minimal that supports very basic safe defaults (just VGA/VESA, keyboard, USB mouse for example) onto it, configure the bootloader if needed, and restart. Installing that way can be done in an instant, and if something goes wrong, there are few possible causes to investigate. Then, after a successful restart, the user can add/configure additional components and hardware support (audio, 3D video, game controllers, printers, power management, etc.). One by one if necessary, so that in case of trouble it's immediately clear what component introduced a problem.
But hey, I guess this is ultimately a matter of ‘simple but limited, works everywhere, guaranteed’ vs. ‘lots of things automagically supported out of the box, and a decent chance that it works on common/your hardware’. A matter of taste and convenience. Just pick what you like most.
I dunno, in my computer science classes all my programs (even ‘hello world’) came out to be 64kb (exactly, for some reason), on ‘release’ mode since we weren’t taught how to use the debugging tools (just add printf statements to find your error). Of course, that was with Microsoft Visual C++ V6. Using the Bloodshed Dev-C++ program (which uses Mingw), I think they ended up more around 20kb.
And then there’s Visopsys and MenuetOS, both of which I found on this website, that can STILL fit on a floppy.
Maybe it’s the accelerated pace of hardware being produced that means people no longer make extreme optimizations like they used to do on Amigas and Atari STs. I mean, look at MenuetOS. It’s coded in assembler, which isn’t (apparently) used much any more. The hardware improves to compensate, so while Vista probably could be run at full features on 2003-era hardware, I doubt it will.
Yea, some crazy stuff. Like they’ve decided that managed languages have advantages that are worth the small speed cost. They decided that higher level languages that add in functionality are worth the cost in disk space. Crazy I know. We only have like 10 orders of magnitude more power and we’re wasting almost a whole order of magnitude to make development more productive!
It’s obvious to me that you’re an expert in the software development field. Software truly is bloating at an unheard of rate; it’s so bad in fact that PC turnaround times are as long as they’ve ever been! It’s so bad that people who bought a computer to run Office 2000 on are only able to run the latest Office on it!
It’s so bad that people are creating very high level managed languages which allow developers to focus on the logic more and their toolset less! And it’s showing. Uptimes are up, changes in formats are down, and people are getting more done on their PC’s!
Truly truly we should go back to the days when real men used assembly because they liked rewriting the same program for every system!
Dual-core 64-bit CPUs, 2GB DDR3 RAM, SATA-2 drives, 256MB graphics card(s), along with no certainty that present flat-screen monitors will support High Definition content – it's just plain ridiculous. I don't even think Linux would require that much even 10 years from now. Is Microsoft planning to skyrocket into the future? Too bad many people on this planet of ours won't be able to afford such a PC in the next 5 years at least.
Well, of course you realize that with this amount of RAM Windows will seem much faster, because it puts everything into RAM. Prefetching, I mean – what more does Windows need to claim speed, since that's not its natural speed by a long shot. Just let Linux pave the way for doing things properly; it may take more time, but you can be sure it's done properly.
Being a Linux user, I just cannot see Microsoft getting this wrong; it would pave the way for Linux big time if they did.
It must be a joke: “Vista would work best on a video card with more than 256MB RAM, 2GB of DDR3 memory and a S-ATA 2 hard drive.”
Someone should remind Microsoft that the role of an Operating System is to optimise the use of the available resources, not to consume them all.
You realize that optimizing resources is using them, right? You use them and then get out of the way when something important needs to happen…
This is why you’ll see many Linux distributions with about 8 weekly cron jobs, including prelink. All of them take a whole bunch of time, but they all help speed something up majorly.
Did they really mean you should have your main RAM as DDR3? Is that even going to be available within the next two years?
Predict what's going to happen in 5 to 10 years' time and get people to upgrade ASAP…
Everyone complained when Win95 came out because their hard drives were too small and their processors were too slow…
It's their way of keeping you from noticing that the software is not using the resources correctly… it takes too much time for them to write good solid software, so let the hardware take the flak…
Watching films on DVD is good enough for me…
This one could really turn round and bite MS in the butt. Recent events, not least sharply rising energy costs, suggest that we may be in for a bout of small is beautiful again – and that includes modest power requirements and running costs. Yet here we are with a corporation betting the farm on some 1990s fantasy that not merely bigger but hugely, vastly bigger is better. Can’t see it personally. If they want to sell this pig in a poke – as distinct from trying to blackmail users into an upgrade through the drm elements – they’ll have to try a lot harder than they’re managing to so far.
Yeah, that's exactly like someone said about Apple's future – that they see the future as gadget-centered, not PC-centered, with the PC only acting as a hub. Microsoft is really behaving childishly and thoughtlessly here. Not to mention the monitors thing – they're just on crack, I guess. Besides, if there's a pure unencrypted video stream somewhere, anywhere, it CAN be decrypted.
Take a powerbook now..(with a 64mb ati radeon 9600). Your windows get shadows, alpha transparency, smooth zoom (exposé), deformations (minimize), you see your whole desktop mapped on a cube when changing users, lovely wave effects when adding items to your dashboard. Or just play with Quartz Composer to see what you can do directly inside applications.
What exactly is Aero going to do that will require a 256MB graphics card? Will every window be mapped on a 1000-face torus with 20 spheres calculating raytracing results on your desktop? Not sure this is going to make Notepad better.
That said, I acknowledge that using the GPU inside apps for _processing_ will be a big thing; just look at Apple's Core Image and what you can do with it on a Radeon 9600 today. But requiring 256MB and saying "AGP is not gonna make it"… whatever shitty effects you can feature, it is a bit overkill.
Not to mention that, knowing the graphic skills of the designers at Microsoft, you can only expect to see what you get when you mix enormous power with enormously bad taste.
Instead of playing with Exposé or Quartz Composer, try to work professionally with Motion 2: 64MB of VRAM is simply not enough. The top models of PowerBooks nowadays carry 128MB VRAM already, and the Intel-based ones will very probably come with 256MB VRAM at a minimum.
It’s going to have vector widgets. Something Mac, IIRC, swore it’d never do. Mac still uses a bitmap display. Quartz, TMK, is a really kickass compositor.
This is why it works on a 32MB card that’s not very fast.
You know that Cairo thing in GTK2.8? Microsoft’s doing that in the graphics card. Picture this. Round buttons that look the same at 500,000 pixels^2 and 200pixels^2…
Also, they may want to do neat things to do with monitoring everything on your screen and things not on your screen. I believe Quartz only does that sort of thing when expose is activated?
IIRC the next version of OS X is supposed to have widgets done in Quartz using pre-rendered images. Which is a more efficient way to do it! It just makes a bit of a restriction on your widget sizes.
I was trying out Vista Beta 1 on my Pentium 4 360 with 2 GB and it was slow as hell and used up most of the memory without even running any large applications!
For users who have any machines below that performance or memory capacity Vista is so painfully slow that it is totally unusable. Oh, and forget using Vista on a notebook.
Pentium 4 630 I meant.
You know why there is so much bloat? DRM on the hardware level starts at the dual core. Intel and AMD have both implemented it at the dual core.
Microsoft should read Aesop’s fable, “The Dog and His Bone”.
It’s about a dog carrying a bone in his mouth. As he was crossing a footbridge over a stream and happened to glance into the water, he saw his own reflection. Thinking it was another dog with a bigger bone, the greedy dog growled and said to himself, “I’ll get that bone, too.” When he opened his mouth to take the bone, his own bone fell into the water, never to be seen again.
Hollywood's "true trusted DRM/HDCP" is a bone that looks real but isn't. The only way it can work is if you turn a general-purpose PC into a special-purpose PC, and you ignore all backwards compatibility, and everyone agrees to follow the new "standard", and everyone is eager to upgrade (as they were around Y2K), and it is seamless.
Back in the early 90s, IBM was king. Whatever hardware standard IBM decided was the standard. Period. They tried to introduce the MCA "standard". The industry balked and created EISA. Neither standard was successful, but since IBM wasn't able to impose its standard, it became clear to consumers and hardware manufacturers that IBM was no longer king.
Microsoft is playing a dangerous game. Office 2000 and Windows 2000 and IE6 have convinced many people that there’s little innovation in Microsoft these days. The jury is still out on Microsoft’s “Bet the company” .NET, but it doesn’t seem to be the panacea that was hyped. The whole “Trusted computing at any cost” strategy may be final thing that convinces the average joe and the average manager that Microsoft is a company that’s lived beyond its usefulness.
True, TPM isn’t exactly appealing to anyone buying the software. They’re thinking “but I don’t want to pay to be restricted.” And the response is “well if you don’t you won’t be able to do this and that.” And then they say “but I can now.” And the reply is “times change.”
Then they think to themselves “man this sounds like I’m dealing with the mob or something…”
Once upon a time it was "maybe I should try Linux out." It sounds like when Vista is out for a few years and they kill XP… it's going to be… time to switch… Thanks MS; they might be Linux's best friend.
Vista brings my Pentium 4 to its knees. 1 GB is not enough and the smallest application makes it start swapping out memory on harddrive. This is ridiculous.
I’ll stick to Win2K for the moment.. and ReactOS in a few years from now…
Anyway… GNU/Linux is here, and Haiku is coming pretty well along … bye MS :p
dylansmrjones
kristian AT herkild DOT dk
And I thought Bill Gates once said, who needs more than 4 gigs of RAM except maybe a Photoshop user?
How much more evidence do you need? Microsloth can't release an OS that does half of what the current Mac OS does without four times the hardware requirements. Plus it will be buggy as hell, DRM'ed to death, and loaded with security issues due to Microsloth's special undocumented hooks.
Until an Intel-based Mac is released we have no idea if it will use DRM. If it does, it will likely only be so that you can't run Mac OS on a PC. Except for the Apple Music Store, Apple has always shunned DRM, and I think they will continue to do so. The only reason it's in the AAC-encoded songs sold on the Apple Music Store is that it was a requirement from the record companies. It's also easy to strip out.
I remember a decade ago, people used to joke “How do you turn your Pentium into a 386? Install Windows 95.” This time it will be, “How do you turn an AthlonXP into a Pentium2? Install Vista!”
Ya know, I don’t think that’s fair. Here’s why:
1.) Win95 was utterly unnecessary in its day. You installed it so that you could have this "pretty" shell that was supposedly easy to use. You still used most of the same programs, because nobody switches the programs they use immediately. And slowly, over the next 2-4 years, you started using the multi-tasking capabilities in the OS, but by the time you really needed them Win98 was out and so were fast Pentium IIs.
2.) Win95 was actually pretty revolutionary in that it was a horrible set of speed hacks to get something that wasn’t reasonable to happen on crap hardware. The thing was snappy, but horribly unreliable. I mean, put Win98 on a computer, then put Win95 on. You’d be amazed at the speed difference you’ll see in the Windows shell.
Vista on the other hand isn’t so revolutionary. It’s really very evolutionary in that things don’t so much change for the end user. Most of the cool stuff is additions or under the hood so to speak. Search is neat, but it doesn’t mean you can’t do everything the same way you used to. The new graphics will add usability, but the old usability is still going to be the same.
As for the hardware. I don’t think they’re saying you’ll need a killer box (not for next year). They’re saying you’ll need a modern setup. And they’re defining “modern setup” as:
A dual core/dual chip. (dual core being cheaper of course)
A graphics card with memory.
RAM – A bit of it, cause it’s cheap people.
SATA 2 – Cause you might as well.
There’s no reason that you’d need NCQ for Vista. And if there is, it’s stupid and they shouldn’t be using it as an excuse. Never yet have I heard of slower drive accesses meaning something was unusable.
More RAM? Well, it’s typical to get 512 these days. So what’s say 1-2GB next year? Pretty evolutionary.
Dual core? That’s probably going to be almost all you can buy soon. Even Intel has admitted, implicitly, that netburst isn’t viable today.
Graphics? Well it's about time! Bitmap screens are like an attempt to apply a character map to colored pixels: not the best way to do it by far. Even graphics storage is, to some extent, heading toward using vectors. Obviously not for art and photos, but for simple graphics. Well, widgets are pretty stinkin' simple graphics!
I loathe Microsoft. But I really like a lot of the stuff I’m seeing in Vista. They seem to have left behind their days of doing things wrong to get a bit more features onto underpowered crap hardware. Now they’re saying: Spend the extra $200 and we’ll give you something useful!
I’m more concerned about two things:
1.) Driver stability from less-than-wonderful vendors. We've all had a bad graphics driver make a game play badly, or kick us out in the middle of a game. Then we used a recent driver and threw that stupid driver CD away. Well, now your whole desktop will be dependent on it. But I think Vista has a fully able bitmap screen to fall back to..?
2.) Abuse of search. Search is great, until you change that document and it no longer contains the word "work". You've still gotta organize your stuff. But marketing people will try and say you don't, to sell more copies…
But again, this isn't that beefy of hardware, especially considering it's for PEAK performance. The only part of that which you don't need for PEAK XP performance is the graphics card.
Hardware requirement isn’t really the most important factor. What’s important is the performance. It’s going to be very interesting when you can do a benchmark comparison of Adobe Photoshop as well as bunch of other prominent applications running on Mac OS X for x86 and Windows Vista with identical hardware. That’s when we will know which emperor has no clothes on.
People here are saying why not stick with 2k/xp. I’ll tell you why.
Support for those products will end and new software will not run. Like Doom 3 for example will not run on 98. It forces you to upgrade or miss out on many things.
Sure you’ll be fine for about a year, but then?
The fact is we don't buy Vista, we won't buy Vista, but we will upgrade our systems eventually, 5 years from now, which is going to be 2010-2011, when hardware prices will have decreased by 3/4 or maybe even 1/2 of the original price.
I used Windows 98 back in 2000/01
I used Windows 2000 back in 2002/2004
I used XP in 2005 (just quite recently, around march)
And I’m going to use this XP until 2007 (most likely)
Of course MS knows this trend quite well, that’s why they worth 40 billion USD.
By the way, the cost of the cheapest RAM out there is $50 CAD (yep, Canadian, not USD) for 512 MB (Samsung brand), the cost of the cheapest HDD currently in my area is $70 CAD for 80 GB.
ASUS V9520-X/TD GEFORCE FX 5200 128MB AGP8X VGA DVI-D TV-OUT VIDEO CARD <— this one cost $55 CAD, it might not be super ultra fast video card, but go figure what the prices will be 5 years from now.
Ever switched the CPU of your newly bought non dual core laptop?
I'm glad I assemble my own boxen. A dual core CPU will not be as "expensive" as they are now.
But the question is, why would I run Vista? My Gentoo Linux system runs just fine on a simple Athlon64 3000+, and it's hard to use all of the 1024MB of DDR RAM.
The question is, why should I upgrade something that runs smoothly, has everything I need, and isn't broken?
I quite liked this bit of the article:
“Among the breakthrough new features shown to the 2,000 developers paying $2,000 each to attend TechEd: Solitaire with new background images, a scrolling Alt+Tab bar and Microsoft’s version of Mac OS X’s Expose function, which allows all the open Windows to be viewed at once. (Microsoft has done a 3D thing that shows the windows stacked side by side, rather than spreading them out in miniature across the desktop as Apple has done.)”
I had been wondering what Microsoft were planning to do about Exposé, since it's such a handy feature, but this doesn't seem all that great to me. Firstly, what makes this 3D? As far as I can see from the picture, the windows are just laid out side by side! And, more importantly, this seems to be not so much Microsoft's Exposé, but more an extension of the Command-Tab thing already present in both Windows & Mac OS X.
At present with OS X, you press the key combo and you get a nice line of application icons. You can let go to select, or click on one. Vista seems to be taking this further by displaying scaled-down windows in their place. This wouldn't work under OS X since it uses a nice clear application/document/window system, rather than Windows' jumble of windows and apps.
However, this doesn't seem anywhere near as good as Exposé, for one reason:
The windows are tiny! By placing them side by side, they have to be pretty small to all fit on your screen. Exposé takes full advantage of the whole screen and only scales stuff down as necessary.
Oh well, it’ll be good to have one more thing that I can point out I’ve had a better version of for years when someone tells me how great this ‘new’ feature is.
Nowhere does the MS speaker say those are REQUIREMENTS. But don't let that stop you, MS haters.
“He told APC today that Vista would work best on a video card with more than 256MB RAM, 2GB of DDR3 memory and a S-ATA 2 hard drive.”
Not only that, the guy clearly doesn’t know exactly what he’s talking about, illustrated by the 32-bit to 64-bit doubling mem usage comment.
This speculation is absolutely retarded. Just wait for either an official MS press release, or for Vista to come out.
This is more about Vista scaling to whatever hardware you can give it — not a base requirements list.
Vista doesn’t require any of this for good performance. The requirements largely mirror XP’s recommended specs except for the GPU if you want to run LDDM. For that, a 64MB GPU will likely be the minimum (IIRC, I’ve seen some running Beta 1 LDDM w/ 32MB GPUs on laptops). 128MB GPUs will likely be what most people run Vista on. These will likely be the baseline GPUs the IHVs have in a year anyway. What the MS rep was getting at is that GPU vendors already have plans for 512MB GPUs (not sure whether NVIDIA has released it yet). While Vista will run well on less than that, your performance will also scale with more capable hardware.
The same applies to the CPU, RAM, SATA-2, and PCI-E. RAM requirements for Vista are basically the same as XP. Neither SATA-2 nor PCI-E are requirements. Vista runs just fine on IDE and AGP. It will, however, scale (just as XP) and use SATA-2/NCQ and PCI-E’s greater bandwidth and efficiency to give you better performance. Dual-core is also not a requirement. Such CPUs will be in wide availability when Vista ships, but they are not required to run Vista optimally.
The ability to scale to better hardware doesn’t mean it will suck on lesser hardware. It’s like adding another CPU. Your computer may perform well with one, but if you add another, it will take advantage of it as well whether or not you actually need the performance.
This is just describing the hardware that Windows Vista will come preinstalled on. It's not about you, the consumer, choosing to upgrade your machine; it's about all those people who are looking to buy a new machine justifying why they need a supercomputer on their desktop.
People have been justifying a “supercomputer” on their desktop for years. Just look over on “/.” at what happens when someone questions the utility of the latest geek toy.
Maybe when this is all over we can thank MS for giving us a desktop “supercomputer” running Linux and our kick-ass 3D interface.
Because we're certainly not going to get it waiting on everyone else's hardware to stop running Linux.
I buy the faster computer to get better load up times, better compile times, snappier interfaces, and so that I can actually do things like video and image editing.
I'd prefer that my OS isn't the reason I need the faster computer…
And Windows PCs have notoriously been built on vastly inferior hardware… so what are you blithering about, anyway?
Right now I've just switched to Windows XP on my "media centre" PC and I'm still running Windows 2000 on my server. Windows XP is running on a huge number of computers at home and in business, and a lot of people will still be running it years from now. I doubt that software availability and support will force me to upgrade for another few years, maybe not even until 2010. By then, PCs that'll run Vista well will be dirt cheap, and there'll probably be hacks to get around most of the DRM nonsense. They'll also probably have fixed the major security issues and bugs that are almost inevitable in a new product.
Then there's the possibility that Linux will develop greatly in the next few years. At the moment it isn't ready for my desktop and I'd much rather use Windows 2k/XP, but by 2010, who knows? If Vista drives some more normal users from Windows to Linux, then maybe it'll become an easier and friendlier OS with more commercial software support. Then there's always Apple and the possibility that its OS will eventually run on standard PCs…
Vista may suck, but overall I don’t think there’s too much to worry about when it comes to OSes for the PC.
Many people are getting something wrong. The recommended video card ISN'T 256MB. If you RTFA, it says "a video card with more than 256MB RAM." Note the MORE THAN. That means 512MB, not 256MB.
Second, a few people are rightly pointing out that these are not quoted as BASE specs. However, they ARE forgetting that anytime MS says something runs “best with…” they are effectively telling you a base spec. Their “best” generally means “barely adequate” much of the time.
Not really… You're thinking of game distributors. I've run Windows XP on systems close to the base spec and it runs usably (barely).
I think the 2GB of RAM is one of the surprising parts. I'm not sure what the OS is doing to hog RAM. The OS itself should hog as little RAM as possible…
Well, if you're silly enough to buy new software for a machine that is more than a couple of years old, then you deserve to have a problem.
Stop complaining, save your money and buy a new PC; you can get one for around $500 nowadays.
You don't realise that many people on this planet won't have 500 USD to spare even in the next 10 years.
"Stop complaining, save your money and buy a new PC; you can get one for around $500 nowadays"
And that PC will be absolutely useless for running Vista.
Look at those specs again: 256MB graphics card, dual-CPU/dual-core CPU (so we're talking Athlon MP or the Intel equivalent, or better), 2GB of RAM. You show me a system with all that in it for £500, let alone $500.
Any PC that isn't bleeding edge now will run this system dog slow. And whilst those prices will indeed go down, if you run a company, can you justify the running costs of a dual-core AMD64 system, with all the trimmings, just so that your new system can run the programs your old system used (or equivalents thereof) only as fast as, or slower than, the previous system, whilst using maybe as much as four times the power?
Vista is an OS whose design is five years out of touch. If you hadn't noticed, CPUs aren't getting that much faster any longer; they're getting more features to compensate, such as dual cores, but even that will only get you so far. No, if you want to see real performance boosts on today's hardware, they're going to have to come through software, at least for the time being. And you're not going to get those boosts if your base OS has just absorbed your spangly new Alienware as though it were the $500 PC you mentioned earlier.
M$ could really have innovated: started off with a whole new design from the ground up, and copied OS X with its sandbox for older software. I just see more of the same, but "prettier". I don't think consumers are going to be all that pleased at having their top-end systems (by today's specs) running like a P3 450 with 128MB of RAM on XP. Especially once they start seeing their electricity bills go up because of their dual-core/CPU SLI rigs' power requirements.
No. The point is that by the time Vista is released, a system capable of running it smoothly will be that cheap.
What I find funny is how willing people are to abuse hardware companies, when a good deal of the time, any problems that do occur happen because of flaws in the basic architecture of the operating system.
Yes, there are dodgy hardware vendors out there who try to offload poor-quality hardware, but the question I keep raising is how on earth FreeBSD, Linux, Solaris and all the other operating systems available on x86 provide pretty good stability, exceeding that of Windows, on the same hardware – the very hardware that Windows fanboys claim is the cause of Windows' problems.
I ran a Dell XPS PIII 550MHz for around 3 years with Windows 2000: constant crashes from Explorer.exe, constant strange bugginess I couldn't put my finger on. I even did a full memory scan, which took some time, and everything was OK; I checked the controllers and the connectors too.
Once I found that my Matrox G550 was supported by XFree86 on FreeBSD, I instantly moved over and compiled KDE from ports – a fresh breeze of stability returned at once. I should never have left.
As for a solution for Microsoft: Solaris, the most scalable, stable operating system, has now been open-sourced. Why not embrace that, build a GUI and tools on top of it, and port all the drivers they own across?
We can't run every cheap, junky USB device on the shelves. We usually don't use the generally badly written commercial drivers (there are a lot more bad devices out there than good ones).
You probably had a bad driver for your Matrox card.
How on earth does a ‘bad driver’ cause explorer.exe to crash?
The issue is with Windows, NOT the driver; the driver itself is rock solid, just like the hardware. What causes the grief is the bugginess of Windows: the crap design and the unwillingness of Microsoft to accept that they got it wrong and need to go back to the drawing board. If that means, shock horror, they have to do a Mac OS X and base their next operating system on *NIX (and thus go against 20 years of anti-UNIX rhetoric), then so be it.
Corrupting memory, for one thing. A display driver on Windows 2000 runs in kernel mode, so a stray write from it can corrupt state that any process, explorer.exe included, depends on.
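As a user-mode analogy only (the names are made up, and the overrun is deliberately kept inside one 16-byte allocation so the snippet is safe to run): one buggy component writes past the end of its buffer and silently clobbers its neighbour's data, and the neighbour is the one that misbehaves later.

import ctypes

raw = ctypes.create_string_buffer(16)                     # one shared arena
driver_area = (ctypes.c_char * 8).from_buffer(raw, 0)     # the "driver's" 8 bytes
explorer_area = (ctypes.c_char * 8).from_buffer(raw, 8)   # "explorer's" 8 bytes

explorer_area[:] = b"explorer"
ctypes.memmove(driver_area, b"A" * 12, 12)  # bug: 12 bytes into an 8-byte area

print(explorer_area[:])  # b'AAAAorer' -- the neighbour's data is now garbage

A kernel-mode driver can do the same thing to memory that any other part of the system depends on.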
How on earth does a ‘bad driver’ cause explorer.exe to crash?
Well, not to be a dick, but if you have no idea how a bad driver can cause explorer.exe to crash on a Windows system, you really have no authority to tell anyone that it's the *bugginess* of Windows that caused the problem.
Besides, there were a few things you missed in your troubleshooting that might have homed in on the problem.
No point in me telling you what you missed or how a bad driver might crash Explorer, because, as you stated, you already moved to another OS. Problem solved.
Cut the defensiveness; this weird behaviour occurred on day one and continued right up until SP4 was installed – WHQL-certified drivers, the lot.
If there was an issue with the driver, it would at least have been fixed within those three years; all it tells me is that Microsoft would rather spend time souping up the GUI, adding unnecessary and confusing changes in the name of so-called "progress".
The best example of this is the Windows XP Control Panel and its change to categories – sorry, I worked at a help desk, and believe me, the first question I would get asked was, "How do I turn it back to my old Windows? I don't know where anything is!"
You sounded sane till this part…
M$ could really have innovated
Please drop the "M$" and similar childishness. It just makes you look like an idiot.
The SATA thing… I can kinda see. The video thing… I can kinda see. But 2 gigs of DDR3 JUST to run the OS… OH MY GRAAWWWD (<– Family Guy ref)
But as I think of it (and I think someone already pointed this out), OS X looks a hell of a lot nicer and it doesn't need such OUTRAGEOUS specs!
Though I keep Windows around for gaming and some other stuff that isn't on Linux yet, I think it's about time for me to give MS the middle finger, run a pure Linux system, and then see what Apple is offering once their x86 boxen come out.
………ddr3…..phhfft
Hmm… with this, I think Windows vs. Linux is no contest when it comes to TCO. I think I spent a total of $30 on my Kubuntu machine, which runs smoothly with all the KDE Kandy turned on.
1.1GHz Athlon, 256MB of RAM, 8.4GB WD HD, CD/DVD reader/CD burner.
The $20 was for the pair of 128MB PC133 DIMMs off eBay (shipping included in my cost) and a cooling fan. I got the rest from my bins of old stuff, and the motherboard/chip from someone upgrading.
I have said it before, and I’ll say it again…I WILL NEVER USE VISTA.
I have said it before, and I’ll say it again…I WILL NEVER USE VISTA.
Why bother posting then ?
I highly doubt the specs put forth will be what's needed, but seriously, if it's an issue then run another OS, people!
MS can't win with a good chunk of people. If they came out tomorrow and said they were scrapping the entire OS and starting over, people would say, "See, they have a POS, so they need to rewrite it."
If they build new software layers on what is there, then people complain that they didn't start over.
If the specs were low, people would claim there was nothing new and switch the debate to “it still costs X amount of money! I want it all free!”
They can't win for trying, so if Ubuntu works for some of you (it's probably the best Linux distro, IMHO) then run it and be happy.
The short story is stop complaining. If the state of operating systems is so bad that you are that pissed off about it then get off your @ss and write a new OS.
http://download.microsoft.com/download/c/3/9/c39e98c3-03b7-4fa1-959…
Insane hardware requirements, advanced DVD playback “features” that cripple the end-user’s choices, barely any new features….
Combine this with Apple moving to x86, Linux, things like SkyOS and other choices, and it all adds up to some interesting questions about Microsoft's continued ability to dictate standards to the industry that perpetuate its "natural" monopoly.
That's not it. Microsoft has (for all intents and purposes) lost the search engine war, and has its interests scattered elsewhere (Xbox, etc.). Microsoft has lost its focus, and its market value is simply coasting on 20 years' worth of precedent.
I’m not going to necessarily predict anything, but I think the next 5-10 years are going to be very interesting, regarding shifts in control and power in the technology industry.
“NCQ means drive tasks can be reordered in the most efficient path for the heads to move”
I don't understand why it can't be done in the disk driver.
I don't understand why it can't be done in the disk driver.
It could… but that requires tying up the CPU to do the reordering. If the drive will do it anyway, you get that chunk of processing power — and no impact on other apps — for 'free'.
NCQ has been added to most current SATA drives these days. Look for it.
"I don't understand why it can't be done in the disk driver."
In the old days it could have been done, because the disk geometry was easy enough to figure out.
With modern drives it's difficult (even if possible) to find the real physical geometry, and the drive screws everything up by silently mapping in replacement sectors from reserved tracks when it detects a bad sector (all drives ship with some bad sectors). That combination destroys any chance the driver writer might have of ordering operations more efficiently based on geometry.
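For the curious, the reordering a host-side driver can still do is the classic elevator sweep over logical block addresses. Here is a toy sketch (the request numbers are made up):

def elevator_order(requests, head_pos):
    # Serve everything above the head in one upward sweep,
    # then everything below it on the way back down.
    up = sorted(r for r in requests if r >= head_pos)
    down = sorted((r for r in requests if r < head_pos), reverse=True)
    return up + down

pending = [7200, 95, 18100, 640, 7210, 3000]   # pending LBAs
print(elevator_order(pending, head_pos=5000))
# -> [7200, 7210, 18100, 3000, 640, 95]

The catch, per the comment above, is that LBA order only approximates physical order once the drive has silently remapped sectors; only the drive knows the real layout, which is exactly the argument for letting it do the reordering itself via NCQ.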
Well, well, that's what I call innovation, MS! Until now you probably invested in an update of your OS when you bought a new PC; now you have to invest in a new high-end PC just to update your OS.
Congratulations, MS, very customer-friendly. No, really.
Yes, you can rip DVDs with Microsoft Windows Vista. It's built right into Windows Media Player xx? I saw it myself; it keeps menus intact and everything. It has DRM built into it, but that's cool, because all you have to do is sign into .NET Passport to get access. Personally I don't care if I have to agree to the EULA, which states they rate DVDs by how many people are watching them.
Sorry, I am really stoned. In a perfect world that would be the case. It took a glass of wine to sober me up.
I’ll be damned if I upgrade to this OS…
Those memory requirements are just outrageous… and the memory doubling for x86_64 is just bulls**t (as has been said before).
Now why do you need a SATA-2 hard drive? If you have that much memory, you'd think Windows would cache a lot of things in it… When will we get an (easily accessible) /dev/shm in Windows?
256MB graphics card?! Um… my two-year-old Radeon 8500 LE with 64MB of RAM runs all the games I play (UT2004, once every two months) just fine…
Microsoft is really going to be at a loss with this release of Vista if they keep those hardware requirements…
“Ok, go read both of those. And please read carefully and understand:
Quartz Extreme is a texture compositor.
Avalon is a vector graphics system.
Avalon is significantly slower because it’s significantly more useful…
They're comparable only at an end-user level. And then it's a question of design ideas. Do you limit what applications can do for widget sizes, or do you just let them do anything and make sure it still looks perfect on a low-res LCD?
I *think* the next version of OS X is supposed to have much more similar functionality in Quartz. But IIRC they aren’t going to do vector graphics, they’re doing a large number of pre-rendered graphics.
That's why the Mac is so much faster. It's not doing nearly as much."
I would advise you to properly read the Apple docs, or the Ars Technica article about Quartz.
You're just confusing everything. Quartz Extreme is the technology used for GPU rendering of the Quartz composition. The Quartz Compositor, which composes the final scene to be displayed on screen, runs on the GPU. What this means is that all the shadow, transparency and visual-effect work is GPU-rendered. Each window is in fact a texture.
But you forget one thing: Quartz is not only used for composition of the user interface; it is first and foremost used for all types of 2D graphics. Quartz is based on PDF 1.5, which means Quartz does vector graphics: it provides real-time vector graphics, advanced drawing, antialiasing, advanced color management, etc.
Every developer can use the Quartz API to draw very high-quality 2D graphics in their application. Some OS X apps use this very well:
http://www.delicious-monster.com/
http://www.steelskies.com/coverflow/HomePage.html
Avalon provides exactly the same thing as Quartz; it really does not invent anything new. Avalon provides an easier way to draw advanced graphics for apps that take advantage of it. Look at the demo that someone gave in this forum; everything they showed can be done with Quartz.
I don't know why you say that Avalon is more useful than Quartz; that's not true. You don't know what Quartz is!
Moreover, Quartz 2D Extreme allows Quartz to perform all 2D drawing on the GPU. Quartz 2D will then run on the GPU, and the drawing results will be stored in VRAM instead of RAM as they used to be. With Quartz 2D Extreme, the CPU will only take care of issuing the drawing commands.
It is still disabled in Tiger; hopefully it will be enabled in 10.4.3.
Please don't pretend that you know better than me; I am a developer and I know Quartz very well. As far as I can see of Avalon, there is nothing so very different.
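To make the "each window is a texture" point concrete, here is a deliberately tiny CPU stand-in for what any compositor (the Quartz Compositor included) offloads to the GPU. Every name, size and colour in it is made up for illustration; a real compositor does this per pixel, massively in parallel, on the graphics card.

def over(dst, src):
    # Porter-Duff "over": blend one RGBA pixel on top of another
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    a = sa + da * (1 - sa)
    if a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    mix = lambda s, d: (s * sa + d * da * (1 - sa)) / a
    return (mix(sr, dr), mix(sg, dg), mix(sb, db), a)

W, H = 8, 4
frame = [[(0.0, 0.0, 0.0, 1.0)] * W for _ in range(H)]   # opaque black desktop

# Each "window" is just a rectangle of pixels: (x, y, w, h, rgba)
windows = [(0, 0, 5, 3, (1.0, 0.0, 0.0, 1.0)),   # opaque red window
           (3, 1, 5, 3, (0.0, 0.0, 1.0, 0.5))]   # half-transparent blue window

for x0, y0, w, h, color in windows:               # composite back to front
    for y in range(y0, min(y0 + h, H)):
        for x in range(x0, min(x0 + w, W)):
            frame[y][x] = over(frame[y][x], color)

print(frame[1][3])   # overlap pixel: half blue over red comes out purple

Swap the flat colours for each window's actual rendered contents and that is the gist of composition; the shadows and transparency effects are just more blending.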
To you b1tch1ing crybabies …
Do you think MS even cares if you don't buy Vista?
I don’t … so I’m sure they don’t.
Have fun computing in 1999 with KDE 4, whilst the rest of us who matter are exploring new frontiers with Vista.