Asus will begin shipping a dedicated physics processing board based on Ageia’s PhysX PPU in May, the company said today. The card contains 256MB of memory dedicated to environment calculations designed to make virtual worlds feel more real to game players. Ageia announced PhysX last week. It claims that 60 developers – including UbiSoft, Cryptic Studios, NCSoft, Epic Games and Sega – are working on 100 games with support for the company’s physics calculation API.
The question is, will games developers support and gamers buy Ageia’s single-purpose boards, or will they prefer to devote resources and cash to graphics cards they’re going to have to buy anyway?
Eh, the dedicated physics processing board is “just” a graphics card with an extra chip? So if it’s a better solution (and it works), and there’s enough game developer support, then I don’t see why the mentioned resources and cash wouldn’t be devoted. Interesting from a technological point of view.
To be honest, I hope this is a short-term product; a dedicated physics engine belongs on a graphics card. Considering integration is the general trend in computer technology, I’m sure this will be short-term.
If ATI and Nvidia add their own PPUs to their video cards, this board and company are going to go belly up, if not be bought up.
That thing looks expensive, and power-hungry (it’s got a fan for cooling?!). Really curious as to how this thing is going to work out.
Personally, I’d rather see their tech integrated on a video card, which might be the natural evolution (sort of like how 3D used to be add-in cards and then migrated to be an integral part of the video card).
What’s this thing going to cost?
– chrish
The 128MB GDDR3 PPU that Alienware and Dell sell adds $275 and $249 to the cost of a machine, respectively. You might expect something in that price range depending on the production volume.
A physics coprocessor will no doubt allow for some fantastic effects in games. The trouble is, the full power isn’t going to be used until (or unless) everyone has got one.
For example, the manufacturers claim that this chip allows fully modifiable landscapes, destroyable walls, that sort of thing. I have no doubt that this is true. The trouble is, things like that fundamentally affect the nature of the game, and are not something that can just be tacked on as an extra for people who happen to have this high-performance card. So game designers just won’t include these effects until everyone in the target audience has such a chip, which will be about three years after ATI and Nvidia start including them on their cards.
So while these chips might cause a revolution in gaming, it’s not going to happen tomorrow. All that you will get from buying one of these tomorrow are ragdoll effects that look more realistic, and an FPS figure that’s a fraction higher as the CPU’s got a slightly lighter load.
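To make that CPU-load point concrete, here’s a minimal sketch (plain Python, with purely illustrative numbers, not anything from Ageia’s SDK) of the per-frame integration work that even simple debris physics imposes; it’s exactly this kind of loop a PPU would take off the CPU:

```python
# Minimal rigid-body update: the kind of per-frame work a physics
# coprocessor would offload from the CPU. Numbers are illustrative only.

GRAVITY = -9.81  # m/s^2
DT = 1.0 / 60.0  # one 60 Hz frame

def step(bodies, dt=DT):
    """Semi-implicit Euler: integrate velocity, then position,
    with a crude ground plane at y = 0."""
    for b in bodies:
        b["vy"] += GRAVITY * dt      # accumulate gravity
        b["y"] += b["vy"] * dt       # integrate position
        if b["y"] < 0.0:             # naive ground "collision"
            b["y"] = 0.0
            b["vy"] = -b["vy"] * 0.5  # lose half the speed on bounce

# A few thousand debris chunks: cheap here, but a real engine also runs
# collision detection and constraint solving over them, every frame.
bodies = [{"y": 10.0, "vy": 0.0} for _ in range(4000)]
for _ in range(60):                  # simulate one second of game time
    step(bodies)
```

Multiply this by collision detection and constraint solving and the per-frame cost adds up, which is why offloading it only buys “an FPS figure that’s a fraction higher” rather than new gameplay.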
Now if Sony were to include one in the PS3 on the other hand…
“The 128MB GDDR3 PPU that Alienware and Dell sell adds $275 and $249 to the cost of a machine, respectively. You might expect something in that price range depending on the production volume.”
Gamers are not gonna pick this up then if it costs that much for a card with half the RAM. As a serious gamer I have a hard enough time keeping up with my CPU, RAM, and video card upgrades. No way am I picking up an add-on card that costs that much. I’d rather wait until it’s integrated into my video card instead. By then the price will be nearly invisible to me.
After increases in production volume, and without Dell’s and Alienware’s markup, you can expect the Asus card not to cost considerably more despite having twice the local memory. Whether that will be realized isn’t really obvious yet.

As for transparency: maybe, if you spend twice as much for two GPUs and are willing to sacrifice one for offloading this work rather than rendering, which seems to be NVIDIA’s and ATI’s goal. Of course, the more general-purpose GPUs become, the more design tradeoffs will need to be made, reducing efficacy or increasing complexity.

“Gamers,” in the sense of people with more money than sense and the need for “the best” in a computer, will spend a measly $300 on a PPU if it provides greater performance in the few titles they’re especially interested in (Epic has been pretty open about its interest in supporting Ageia’s PPU). Compared to spending $800+ for SLI, it’s not really a big expenditure.

If a dedicated general-purpose vector coprocessor catches on, it might be in the form of ClearSpeed’s or Ageia’s products. Or not. We’ll see.
I say that tongue-in-cheek. This card is going to be great for the automotive industry, the aerospace industry, anyone who builds any type of vehicle or something structural that needs physics to be simulated…which is a lot of stuff. $250 more per workstation for accelerated physics is nothing for big design centers.
This is a great idea (in theory), but I doubt the players of many video games are too concerned about the scientifically accurate aspect of blowing stuff up. I work for ‘the Man’ at a computational physics lab, and we use huge clusters to do physics, so this would actually be better targeted at specialized tasks like ours, where we use brute force and big iron to do what this card may be able to do much cheaper. I’ve read their specs (which are painfully vague), but as far as video games go, I don’t see this catching on.
Good places for this kind of card would be:
* Architectural models – one could measure stresses, wind interference, and vibrations
* Fluid physics & navigation – maybe actually include one or two of these in a boat or plane for optimum maneuvering through complex flows and eddies
* Automotive – ahhhh, a car that could not only drive itself, but also make those decisions you wish you didn’t have to (is an oncoming car more dangerous than say, a tree? – or building?)
* Satellites – that could avoid solar radiation damage by altering orbit and not obliterate each other
Bad places for this kind of card:
* PC games – video cards already have a place in PCs; it’s better to integrate this technology directly into a card versus making an additional peripheral that the CPU and OS have to treat specially
* The general market – even advanced programmers don’t have much of a use for this; it’s just too specialized for most people’s needs
But, that’s just my opinion….
…either way, I hope they succeed in making some progress and profit (somehow).