“We are all aware of AGEIA and its desire to bring a PhysX processor for physics processing in order to make games look and feel more realistic. We all want the characters to move and act realistically, and that’s exactly what AGEIA’s PhysX will accomplish once it debuts. However, in order for the technology to work, it requires an expansion card for the motherboard. ASUS will debut the card in February 2006 when more games become available to take advantage of AGEIA’s innovation.”
If the drivers are open source, the hardware will sell.
90% of the population has no clue what open source is!
The only people likely to care about whether or not the drivers are open source are a few alternative OS users and associated developers. If something being open source were a key motivator for selling to people at large, the multitude of Linux, BSD and other alternative OSS operating systems and their distributions would have a much higher penetration rate.
But the fact of the matter is that price and political/religious issues are not the biggest forces determining which products succeed: it's perceived value. That perceived value is greatly enhanced when the product enables something that solves one or more problems for the customer. What the vendor needs to do is convince customers that it has a good enough solution (it doesn't have to be the best, just good enough: open source promises nothing in this regard to most people) to a problem they either knew or didn't know they had, and convince them that there's greater value in paying a small price for that solution (it takes money to pay for reality enhancements) than in continuing with their current "problem." If someone doesn't do anything that would be significantly enhanced by a new or existing solution, it likely won't be worth their bother. It means absolutely nothing to most people how good a server OS something is if it doesn't work for their other needs. Likewise, it doesn't matter how good this hardware is at solving such math problems if all they do is browse the web, write email, and other things that have no use for it.
If the hardware/software combination doesn't solve a "problem" (real or perceived) at a price users are willing to pay because they see the value in the "solution," then the technical question (does it work?) and the religious one (is it "free as in beer," or proprietary and restrictive in the eyes of OSS adherents?) mean absolutely nothing in practice.
Jonathan Thompson
So the zillions of quality and modern games that run on *BSD and Linux can show improved physics?
Seriously, this is a physics accelerator for games.
– chrish
I hear you; you're making the tactical, pragmatic argument.
The perceived value argument continues to favor F/OSS in the strategic picture, as the growth of GNU/Linux, Apache, and Firefox, to drop a few names, shows.
Christopher Smith
For roughly 90% of the potential market (i.e. Windows/OS X users), the strategic picture only matters if they intend, in the long term, to switch away from being among the Monopoly Masses; until that inertial bump is gotten over, the small percentage of systems that are F/OSS have zero value proposition to them, as they don't see it and don't think about it. It matters nothing that something works on many systems other than the one they're using: what's in it for them, now? It's the classic chicken/egg paradox.
Something being F/OSS only matters to those that bother/wish to work with alternatives. As the installed base and marketshare demonstrate, that's a small percentage of the market, and outside of the server room it is likely to remain that way for a very long time.
The F/OSS alternative OSes stand to gain more in perceived value (making them more saleable to more users) from the addition of F/OSS hardware/software support than a hardware manufacturer stands to gain in sales from the users of those systems, for the amount of resources expended towards supporting them. This is the bane of hardware/software support for many devices/applications, because having multiple targets to support costs a lot more than a single target; where the other targets have potential and current markets too small to justify the resources for the anticipated revenue, the biggest/most-likely-profitable target will be addressed, and the others are more likely to be ignored, or at least given very little emphasis.
At this time, most alternatives are more prevalent (by percentage of machines and by raw numbers) in the server closet than on typical users' desktops, though please don't take that as me saying that alternative OSes are used only for servers; alternatives such as Zeta/Haiku/BeOS, Syllable, SkyOS, RISC OS, etc. are not. How many people use server machines running server OSes for things outside of their main purpose most of the time? The computational ecosystem needs to exist for something to be successful; in this case, that ecosystem is not going to be the typical server machine running an alternative OS (alternative here meaning an OS running on a small percentage of machines compared to the total market) unless you somehow create a compelling reason. Perhaps I'm wrong, but I sincerely doubt there will be many Physics Servers any time soon, since the whole point of having a PPU is to keep it local, with minimal latency. Until servers are serving out pre-computed game data, or applications appear that use the PPU for things like nuclear weapons simulation or weather prediction (statistically uncommon end-user applications), expending resources on the alternatives has insufficient real value to the manufacturer, because there isn't enough perceived value to enough users to recoup the investment.
It really is too bad that no single OS exists that performs optimally at every sort of task; that would make the whole question of supporting alternatives moot. As much as I'd like maximum freedom to do everything and anything I want in any OS I choose, that doesn't exist now, and likely never will, until the unrealistic-to-expect event of an optimal-at-everything OS appearing and being accepted. Until that happens (yeah, right!) I'll continue to use my mix of "Standard" and "Alternative" OSes and platforms of my choosing (yes, I do use more than one OS, from more than one vendor!).
Jonathan Thompson
the small percentage of systems that are F/OSS have zero value proposition to them,
Disagree
Something being F/OSS only matters to those that bother/wish to work with alternatives. As the installed base and marketshare demonstrate, that’s a small percentage of the market
Disagree. Your point has some merit in the switching cost discussion.
because having multiple targets to support costs a lot more than a single target
There is cost involved in compile time, and in verifying that fixing one thing didn't break two others. However, once the pocket-picking evolution of maintaining a software baseline is obviated by F/OSS licensing, life is improved and cost is avoided.
The computational ecosystem needs to exist for something to be successful
Vendor lock-in at the software level is the requirement to ensure actual attempts at competition remain unsuccessful. Bogus software patents are the corresponding legal mechanism. To paraphrase Mel Brooks: “Shall we continue to build vast binary palaces for the rich, or build decent applications for the Poor? Fsck the Poor!”
It really is too bad that no single OS exists that performs optimally at every sort of task
Seriously, have you tried Linux From Scratch, to get the feel of things, and then Gentoo, for managing the actual system? Great for the individual, and not to take anything away from Novell and Red Hat; they make compelling offerings for the enterprise.
But, hey: as long as we keep an eye out for the corporate predators, there is plenty of room for all.
I'm sure the gamers will love it, but this kind of extra hardware goes unused most of the time in most kinds of computers. Its functionality (like hardware OpenGL), which could have been reusable source had the hardware not filled that slot, gets locked into a product. Which may be great. There's nothing immoral about it, but it's not good for open source.
I think a scalable, homogeneous multiprocessor architecture with as few alien processors as possible would be better for Open Source. No processing power is hidden behind an opaque hardware interface; it is available all the time, and the collective source stays algorithmic in nature rather than device-specific.
Instead of CPU, GPU, Physics processor, …, I’d like CPU1, CPU2, CPU3, …, probably NUMA-ish memory, and only the most trivial of IO devices (think framebuffer).
Replace hardware with software and kiss those NDAs and missing hardware specifications goodbye.
The problem is, you will never get the same performance out of a general CPU that you can get out of a specialized processor. Try emulating a good GPU on the second core of a dual-core CPU: you will not get anywhere near the same number of textured triangles per second as even an aged GeForce2.
Specialized hardware makes sense for some areas, even more so if the problem can be heavily parallelized, and I guess physics is such a case.
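As an aside, here is a minimal sketch of why that parallelism claim holds for physics: within a timestep, every particle (or rigid body) can be integrated independently, so the work splits cleanly across however many cores, or small vector units, you have. This is generic illustrative C++, not anything to do with AGEIA's actual hardware interface.

```c++
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Advance a contiguous range of particles by one timestep.
void integrate_range(std::vector<Particle>& ps, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        ps[i].vy -= 9.8f * dt;       // gravity
        ps[i].px += ps[i].vx * dt;   // advance position
        ps[i].py += ps[i].vy * dt;
        ps[i].pz += ps[i].vz * dt;
    }
}

// Split the particle array into chunks and integrate them concurrently.
void integrate_parallel(std::vector<Particle>& ps, float dt, unsigned nthreads) {
    std::vector<std::thread> workers;
    std::size_t chunk = ps.size() / nthreads + 1;
    for (unsigned t = 0; t < nthreads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(ps.size(), begin + chunk);
        if (begin < end)
            workers.emplace_back(integrate_range, std::ref(ps), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<Particle> ps(100000, Particle{0, 0, 0, 1, 5, 0});
    integrate_parallel(ps, 1.0f / 60.0f, 4);   // one 60 Hz frame on four threads
}
```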
I choose hardware that works with the OSes I want to run, and I don't expect device drivers to be made that let me tap the 3D power of today's graphics cards in BeOS/Haiku and other niche OSes, so I'd choose a set of extra general-purpose CPUs over more powerful yet completely unsupported single-purpose chips, any day.
But it's all hypothetical. It would be cool, though: a little supercomputer where the network card adds another same-architecture processor, the graphics card adds another, and so on, each device adding CPU power instead of taking it away.
The chips wouldn’t even have to be the most powerful/expensive ones, as long as the system could be extended.
“Replace hardware with software and kiss those NDAs and missing hardware specifications goodbye.”
I am on your side, pal.
What they really need is some eagerly anticipated game based on the card, or one that can take advantage of it.
For example, if Oblivion used the card to provide cloth physics in an expansion pack.
The next Unreal Tournament is said to support it.
So do you think people are willing to shell out $300 for a physics solution?
While I might be willing to shell out an extra $50, $300 is pretty stiff.
For that reason I don't think this is an interesting product. For example, consider a game in which you jump 20 m. In a realistic physics engine you would need a big initial impulse, but I think that in that case a less realistic behaviour, like a smoother jump, would be more fun (and easier). A platform arcade game like Mario Bros. with realistic physics would suck.
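As a back-of-the-envelope check of that claim (my own arithmetic, assuming a simple ballistic model with no air resistance), a 20 m jump really does demand an enormous launch speed:

```latex
% Launch speed v needed to reach height h under gravity g, from v^2 = 2gh
\[
  v = \sqrt{2 g h} = \sqrt{2 \times 9.8\,\mathrm{m/s^2} \times 20\,\mathrm{m}} \approx 19.8\,\mathrm{m/s}
\]
```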
Even when it's possible, most games you see nowadays on the PS2/Xbox/GC choose not to use realistic physics. Why? It doesn't really help or change the premise of the game, or make it more or less fun. Final Fantasy, Zelda, and RPGs don't take much from it. Many games like Mario, Psychonauts, and Katamari Damacy have very precise and limited physics so that control of the character is comfortable. Racing games often have modified physics, because a realistic simulation would send your car rolling into buildings every time you misjudge a curve. Flight simulators rely on specialized models and per-airplane balancing, which are harder to tune (finding the actual parameters: mass, air friction, rigid bodies, etc.) than to implement. I think Half-Life 2 is also proof that physics doesn't really help much: despite using the Havok engine, most of the game didn't use realistic physics. So I think the card is very limited and doesn't do much to help gaming progress overall.
The card can be used to accelerate any kind of physics, or even math that is merely similar to physics calculations. It gives developers the ability to make physics in games detailed, but does not require them to make it realistic. Racing games, etc. will certainly benefit from this. As for games in which smooth control depends on simplicity (Mario), the actual large-scale physics of the player's character (like the jumps involved in Mario's platforming) can be kept simple, and the card can be used to do realistic real-time physics that don't affect gameplay but just make the game look better (for example, when Mario pounds a crate, each splinter of wood can be individually modeled in realtime for a realistic explosion effect not possible with an animation). Games don't choose not to do this today; they absolutely cannot, any more than they can do realtime raytracing.
This product certainly deserves to succeed, but I'm worried about its future, because when people read about it, they seem to immediately think of Half-Life 2's gravity gun and say, "That was a nice gimmick, but I certainly don't need it in (insert favorite game here)." They don't realize that this thing is worth it simply for what it can add to the presentation of a game, and if it enables new types of gameplay, that's a bonus.
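To make the crate example above concrete, here is a hypothetical sketch of that split: the gameplay simulation stays simple and hand-tuned, while a purely cosmetic "effects" simulation handles the splinters and never feeds back into gameplay. All names and numbers are illustrative, not taken from any real engine or from AGEIA's SDK.

```c++
#include <vector>

// Gameplay physics: deliberately simple, tuned for feel, deterministic.
struct PlayerState { float x, y, vx, vy; };

void gameplay_step(PlayerState& p, float dt) {
    p.vy -= 20.0f * dt;   // exaggerated "game" gravity chosen for a fun jump arc
    p.x  += p.vx * dt;
    p.y  += p.vy * dt;
}

// Effects physics: detailed, per-fragment, purely cosmetic, so it can be offloaded
// to dedicated hardware (or throttled or dropped) without changing gameplay.
struct Splinter { float x, y, z, vx, vy, vz; };

void effects_step(std::vector<Splinter>& debris, float dt) {
    for (auto& s : debris) {
        s.vy -= 9.8f * dt;   // realistic gravity for the eye candy
        s.x += s.vx * dt;
        s.y += s.vy * dt;
        s.z += s.vz * dt;
    }
}

int main() {
    PlayerState player{0, 0, 3, 10};
    std::vector<Splinter> debris(500, Splinter{0, 1, 0, 2, 4, 1});
    for (int frame = 0; frame < 60; ++frame) {   // one second at 60 Hz
        gameplay_step(player, 1.0f / 60.0f);
        effects_step(debris, 1.0f / 60.0f);
    }
}
```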
Once upon a time, there was a Weitek math coprocessor for CAD/CAM, medical imaging, 3D-type work, etcetera.
While $300 for a physics processor, comparable to an Xbox 360, strikes me as overpriced for mere gaming, could the scientific, graphics, Photoshop, 3D modeling, weather modeling, workstation/supercomputing, simulation, molecular biology, protein folding, and engineering communities benefit from physics acceleration, in effect using it like a second super math coprocessor?
I understand its architecture is SSE-like vector processing units connected by high internal and external bandwidth. Perhaps the vector units could be used for other purposes?
Would it be possible for AMD to create a virtual physics processor by adding a third core, on a 65 nm process, made of a dozen SSE2 units all connected by high internal bandwidth?
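For what it's worth, the kind of inner loop such a "super math coprocessor" would be fed looks a lot like game physics anyway: dense, regular floating-point vector arithmetic. A naive gravitational n-body force accumulation, written here as plain illustrative C++ (no AGEIA or vendor API implied), is a typical example.

```c++
#include <cmath>
#include <cstddef>
#include <vector>

struct Body { double x, y, z, mass, fx, fy, fz; };

// Naive O(n^2) gravitational force accumulation: exactly the sort of dense,
// embarrassingly parallel arithmetic a vector coprocessor is good at.
void accumulate_forces(std::vector<Body>& bodies) {
    const double G = 6.674e-11;   // gravitational constant
    const double soft = 1e-9;     // softening term to avoid division by zero
    for (std::size_t i = 0; i < bodies.size(); ++i) {
        bodies[i].fx = bodies[i].fy = bodies[i].fz = 0.0;
        for (std::size_t j = 0; j < bodies.size(); ++j) {
            if (i == j) continue;
            double dx = bodies[j].x - bodies[i].x;
            double dy = bodies[j].y - bodies[i].y;
            double dz = bodies[j].z - bodies[i].z;
            double r2 = dx * dx + dy * dy + dz * dz + soft;
            double inv_r = 1.0 / std::sqrt(r2);
            double f = G * bodies[i].mass * bodies[j].mass * inv_r * inv_r;
            bodies[i].fx += f * dx * inv_r;   // project the force onto each axis
            bodies[i].fy += f * dy * inv_r;
            bodies[i].fz += f * dz * inv_r;
        }
    }
}

int main() {
    std::vector<Body> bodies(1000, Body{0, 0, 0, 1.0, 0, 0, 0});
    for (std::size_t i = 0; i < bodies.size(); ++i) bodies[i].x = static_cast<double>(i);
    accumulate_forces(bodies);
}
```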
$300 for accelerated graphics, $300 for accelerated physics… $300 for accelerated AI maybe?
And how many games would you find that really make use of all that?
A LOT of FPG’s would. I see this breaking into consoles first, then computers.
I promise to agree with you if you change your sentence to “A lot of FPG’s could“. And I wasn’t wondering if they could use it, but actually how many would really do it.
Most physics I can think of can be described by manipulating 4-Vectors.
So I really don't see what this thing has to offer that general-purpose vector processors like the Cell vector units or the SSE2/SSE3 extensions cannot do.
But note that they also offer a software-only solution for Cell and PPC, so the company is not dependent on the sale of the hardware accelerator for survival.
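A rough sketch of that 4-vector/SSE point (plain SSE intrinsics, nothing vendor-specific and not AGEIA's interface): a four-component position update maps directly onto the 4-wide float operations SSE already provides, so the real question is whether a dedicated chip wins on scale and bandwidth, not on the kind of math.

```c++
#include <cstddef>
#include <xmmintrin.h>   // SSE intrinsics (4-wide single-precision float)

// pos and vel are arrays of count4 four-float vectors (x, y, z, w), 16-byte aligned.
void integrate_sse(float* pos, const float* vel, std::size_t count4, float dt) {
    __m128 dt4 = _mm_set1_ps(dt);                  // broadcast dt into all four lanes
    for (std::size_t i = 0; i < count4; ++i) {
        __m128 p = _mm_load_ps(pos + 4 * i);
        __m128 v = _mm_load_ps(vel + 4 * i);
        p = _mm_add_ps(p, _mm_mul_ps(v, dt4));     // p += v * dt, four lanes at once
        _mm_store_ps(pos + 4 * i, p);
    }
}

int main() {
    alignas(16) float pos[8] = {0, 0, 0, 1, 1, 2, 3, 1};
    alignas(16) float vel[8] = {1, 0, 0, 0, 0, 1, 0, 0};
    integrate_sse(pos, vel, 2, 1.0f / 60.0f);
}
```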
Floating-point units. We already have them.
Why would we need this ?