Not directly operating system news, but nevertheless interesting news for all us geeks. So, Carmack says that GeForce4-MX is not a good buy for Doom3: “Nvidia has really made a mess of the naming conventions here. I always thought it was bad enough that GF2 was just a speed bumped GF1, while GF3 had significant architectural improvements over GF2. I expected GF4 to be the speed bumped GF3, but calling the NV17 GF4-MX really sucks. GF4-MX will still run Doom properly, but it will be using the NV10 codepath with only two texture units and no vertex shaders. A GF3 or 8500 will be much better performers. The GF4-MX may still be the card of choice for many people depending on pricing, especially considering that many games won’t use four textures and vertex programs, but damn, I wish they had named it something else.” As Tom’s Hardware also noted, the GeForce4-MX is nothing but a bumped-up GeForce2-MX, and it bears almost no resemblance, feature-wise or architecture-wise, to the real GeForce4 series of cards (which are more expensive). It should have been called something like GeForce2-MX-Turbo or equivalent, but definitely not a GeForce4-MX.
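To put Carmack’s “NV10 codepath” remark in concrete terms, here is a minimal sketch (not Doom 3’s actual code; the path names and function names are made up for illustration) of how an OpenGL engine can pick a rendering path at startup by querying the texture unit count and the extension string. It assumes an OpenGL context is already current.

/* Illustrative sketch only -- not Doom 3 source. Assumes a current
 * OpenGL context. */
#include <GL/gl.h>
#include <string.h>

/* GL_MAX_TEXTURE_UNITS_ARB from GL_ARB_multitexture, in case the
 * header predates it. */
#ifndef GL_MAX_TEXTURE_UNITS_ARB
#define GL_MAX_TEXTURE_UNITS_ARB 0x84E2
#endif

/* Hypothetical path names for this example. */
typedef enum { PATH_NV10, PATH_NV20, PATH_R200 } renderpath_t;

static int HasExtension(const char *name) {
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && strstr(ext, name) != NULL;
}

static renderpath_t ChooseRenderPath(void) {
    GLint textureUnits = 1;
    int hasVertexPrograms;

    glGetIntegerv(GL_MAX_TEXTURE_UNITS_ARB, &textureUnits);
    hasVertexPrograms = HasExtension("GL_NV_vertex_program") ||
                        HasExtension("GL_ARB_vertex_program");

    if (textureUnits >= 4 && hasVertexPrograms) {
        if (HasExtension("GL_ATI_fragment_shader"))
            return PATH_R200;   /* Radeon 8500 class */
        return PATH_NV20;       /* GeForce3 / GeForce4 Ti class */
    }
    /* Two texture units, no vertex shaders: the fallback path a
     * GeForce4-MX would presumably end up on. */
    return PATH_NV10;
}

The point is that the choice is driven by what the driver actually reports, not by what the box says, which is why a GeForce4-MX lands on the GeForce1/2-class path.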
This naming convention by nVidia will probably leave many customers unhappy, thinking that they bought something truly fast to play the latest games, when in reality this won’t be the case. Also, future games or demos built to rely on GeForce4-specific hardware most probably won’t run on the GeForce4-MX, because it does not carry the features found on a real GeForce4.
The Radeon 8500 will work fine with Doom3, but beware of a possible flaw ATi has to address…
guess what the flaw is…
nVidia is suckering consumers by not naming the GF4-MX what it really is, a GF2. An incredibly cheap and shoddy shot.
Come on nvidia, get with the program here!!!
Why couldn’t you have made the GF3 the “GF4 MX”??
What’s really annoying is that TNT2s, GeForce 2s, and 3s are all still sitting around in the distribution channels as well. What really sucked for me was that the GF3 Ti500s all but disappeared about a month before the 4 was officially announced. So, I was stuck with what customers saw as non-top-of-the-line stuff. Annoying. And now they pull this stunt. Keep the damn product line simple! GeForce 4 Ti, GeForce 4, GeForce 4 MX. None of this numbers-game crap where it only ends up being as good as a GF2.
In their defense though, if the MX series is only as good as a GF2, it still wouldn’t be logical to call it the GF2. That would confuse consumers. I agree though, the lowest of the low GF4 MX series should have been at least as good as a GF3 Ti200. That way you can phase out the old junk, sell this new stuff, and have a slightly higher margin.
Oh well.
Just buy a Ti
>In their defense though, if the MX series is only as good as a GF2, it still wouldn’t be logical to call it the GF2.
The problem here is not just the speed. The GeForce4-MX (and do not forget that the GeForce4-MX has three sub-models: one very slow, one medium, and one pretty fast, as if that weren’t already confusing!!) is quite fast when compared to a GeForce2 or to a slow GeForce3, at least the fast sub-model (check Tom’s Hardware for actual benchmarks on the sub-models). The real problem IMO is that the GeForce4-MX, architecturally, is not a GeForce4. It is like calling the Porsche 944 a “Porsche 911-Value”. Well, it is not a 911. It does not share the same technology, and even if you tweak it and make it a bit faster than the original 944, that still won’t make your 944 technologically a 911. On cars that may not matter much, but when you have games or demos that RELY on specific architectures and you try a demo that says “requires GeForce4” (there are demos that require very specific hardware to do some cool stuff) and you run it on your GeForce4-MX and it does not actually run, JoeUser will think: “WTF? But I do have a GeForce4.” News break, my friend Joe: you don’t.
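That “requires GeForce4” scenario boils down to trusting the name versus checking the capabilities. Here is a quick sketch of the difference; the required-extension list is a hypothetical example rather than anything from a real demo, and GLUT is used only to obtain a context so the queries return real driver data.

/* Hypothetical "requires GeForce4" check, for illustration only. */
#include <GL/glut.h>
#include <stdio.h>
#include <string.h>

static int Supports(const char *name) {
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && strstr(ext, name) != NULL;
}

int main(int argc, char **argv) {
    const char *renderer;

    /* GLUT is used only to get a current OpenGL context. */
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB);
    glutCreateWindow("gpu-check");

    renderer = (const char *)glGetString(GL_RENDERER);

    /* Naive check: trust the marketing name. A "GeForce4 MX" passes
     * this test even though it is architecturally GeForce2-class. */
    if (renderer && strstr(renderer, "GeForce4"))
        printf("Renderer string says GeForce4: \"%s\"\n", renderer);

    /* Capability check: ask for the GeForce3/4-class features the
     * demo actually needs (hypothetical requirement list). */
    if (!Supports("GL_NV_vertex_program") ||
        !Supports("GL_NV_texture_shader")) {
        printf("Required GPU features missing; this demo will not run.\n");
        return 1;
    }
    printf("All required features present.\n");
    return 0;
}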
Mac users…
Ohhh, poor Mac users…
Poor Mac users? Are you assuming that Mac users only have access to the MX?
http://www.prnewswire.com/cgi-bin/stories.pl?ACCT=104&STORY=/www/st…
Seeing as how the GeForce4 MX is what’s shipping with the new G4 PowerMac towers, I wonder if all of Carmack’s complaints are valid for the Mac platform as well? I seem to remember that one of his chief problems with this card is that its reduced feature set makes it “not a DirectX 8 part” (or something like that). Since the Mac uses OpenGL, not DirectX, are the issues of the same severity? Other than the fact that it’s simply a cheaper, slower part vs. the real GeForce4 cards, that is…
Carmack only uses OpenGL, but card manufacturers aim their feature sets at DX. The GF4mx is technically in the same family as the GF2 and hence aimed at the same version of DX, which is an earlier version than the DX8 that the GF4 targets. There will be a lot of confusion; I just wonder what NVidia’s marketing morons were smoking…
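For comparison, on the Direct3D side a DX8 title would typically query the device caps rather than the product name. The following is a rough sketch in C against the DX8 SDK headers; the thresholds are illustrative, and nothing here claims what any particular GeForce4-MX driver actually reports.

/* Rough sketch of a DirectX 8 capability check in C; requires the
 * DX8 SDK headers and linking against d3d8.lib. */
#include <windows.h>
#include <d3d8.h>
#include <stdio.h>

int main(void) {
    IDirect3D8 *d3d = Direct3DCreate8(D3D_SDK_VERSION);
    D3DCAPS8 caps;

    if (!d3d) return 1;
    if (FAILED(IDirect3D8_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL, &caps))) {
        IDirect3D8_Release(d3d);
        return 1;
    }

    /* A "DirectX 8 part" in the sense used above would advertise
     * vertex/pixel shader 1.1 and four simultaneous textures.
     * These thresholds are an assumption for illustration. */
    if (caps.VertexShaderVersion < D3DVS_VERSION(1, 1) ||
        caps.PixelShaderVersion  < D3DPS_VERSION(1, 1) ||
        caps.MaxSimultaneousTextures < 4) {
        printf("Card does not report the full DX8 feature set.\n");
    } else {
        printf("Full DX8-class feature set reported.\n");
    }

    IDirect3D8_Release(d3d);
    return 0;
}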
AFAIK, using OpenGL doesn’t help, because DirectX 8 covers a lot more features than OpenGL anyway.
But still, it’s just an “AFAIK” 🙂