A new player dares to enter the graphics card market that ATi and Nvidia have dominated for so long. XGI (eXtreme Graphics Innovation), based in Taiwan, comes at the market leaders with a line of cards for a whole lot less money. Tom’s Hardware looks at XGI’s product range and offers benchmark results from a beta sample of XGI’s top model, the Volari Duo V8 Ultra. The site also has a benchmark article pitting the latest Nvidia cards against the latest Radeons, but it is interesting to see some new blood in a market that has left S3, SiS, Matrox, Trident and Intel i8xx as secondary players or in ‘survival mode’.
I got an e-mail about the new XGI cards over a month ago; this is old news.
Anyway, I’m waiting for these bad boys to be released. I’m thinking that if the Duo technology works as planned, it will be a VERY nice card to have!
XGI is the remains of SiS and Trident, so they’ve got the facilities and experience to create something competitive. But we’ll have to see if they’ve learned anything from the past mistakes of those companies.
If they try to make something cheap without enough quality to justify buying it, there won’t be room in the market for it. Why bother with XGI if you can get fast, reliable cards from ATI and Nvidia for the same price? I don’t think XGI can beat the value of something like an Nvidia 5600 or a Radeon 9600; they need to go after the fastest cards and sell XGI equivalents for less, or offer something worthwhile, like uh… hostesses that personally deliver the card to you and install it or something. >:^)=)
I’m still waiting for the be-all and end-all of GFX cards from BitBoys
I know the drivers for this card still aren’t really mature, but from the looks of the benchmarks, it seems to perform really well on *common* benchmark programs.
Take UT2003’s Antalus flyby. That’s a benchmark demo built into the game. The XGI performs very well there. But over on the next page, when they use the custom Inferno demo, the XGI is last, even slower than the nVidia FX 5200. Same goes for the Magma timedemo. There shouldn’t be any reason why this card performs well in the flyby but not in the timedemos.
With 3D Mark 2003, it performs pretty well while sacrificing image quality.
From the looks of things, the current drivers seem to be optimized for benchmarks. The only place where the XGI looks competitive is on commonly used benchmark programs. I think I’ll wait till the card is released before saying for certain whether they’re cheating.
I don’t believe they are really cheating, given the Nvidia/3DMark episode. The Volari line probably has its own preferred settings for performance in the Pixel Shader 2.0 arena, and the benchmarking applications don’t have them yet. Of course, this doesn’t really explain the lack of good performance in the UT2003 custom benchmarks. Let’s just put that down to driver quality for now.
Instead of trying to make the best video card out there, they could release all of the specifications and watch as good-quality drivers pop up for all of the alternative platforms.
I could see such a company finding its way into every Linux user’s box in a few years, having its own niche market, and not having to worry about benchmarks as much.
Sigh… keep dreaming.
Just check out this detailed graph for yourself:
http://www20.tomshardware.com/graphic/20031107/images/xgi-comp1.jpg
There is no point in me even looking into this if it does not support Linux
LET SOLUTION = WINDOWS
hehe sorry, I was reading that comical site lol it’s very funny.
Anyway, it’s good to see that we’ve got another choice. I wish them good luck competing against nVidia and ATI.
This is of course good news; more competition leads to better quality overall.
Let’s hope they plan to sell in Europe as well
>and Intel i8xx as secondary players or in ‘survival mode’
That’s quite a funny comment, given that Intel is the biggest in the graphics market.
How so? They sell lots and lots of motherboards with built-in graphics.
“How so? They sell lots and lots of motherboards with built-in graphics.”
The fact that Volkswagen sells the most cars (just an example) doesn’t automatically mean they are the ‘biggest’ in the car-seat market.
“Instead of trying to make the best video card out there, they could release all of the specifications and watch as good-quality drivers pop up for all of the alternative platforms.”
I love it!!! I can finally get a new video card that has drivers available for Windows 3.1!
Not that I do much 3D gaming on Windows 3.1, but still…
Don’t you think that if the Duo starts getting very promising results, Nvidia and ATI will follow? What they need to do is get a single chip that beats their top single chip, or at least pulls out the same power; then they might survive.
I know their drivers are in beta, but I’m sure that with two companies as one they will pull out some good drivers. We will see as more and more hardware sites get sample boards.
Also, on the Linux front, I can’t see why they wouldn’t support it; Nvidia and ATI are both supporting it and are racing to get the best drivers out for the platform. It would be a silly move to ignore it totally, don’t you think?
Apparently Linux drivers are planned for Q1 2004:
http://www.linuxhardware.org/article.pl?sid=03/10/21/1524252
-fooks
Maybe *now* is the time we each send a polite email to them asking about Linux / FreeBSD driver support. They are more likely to listen right now than a year later when they are more established and stop caring.
Type out a quick email to:
[email protected]
And you never know what will happen!
I suspect that the high-end market will shrink as better-quality integrated graphics become available. In a few years, integrated graphics as good as or better than today’s best cards will be available on very cheap motherboards. It will be hard to sell a $300 Radeon or GeForce when a $49 motherboard has a built-in 256MB GPU.
Depends on what you use your computer for. If you use it only for undemanding apps like word processing, web browsing, etc., the current crop of integrated graphics chipsets will work fine. That’s why many of the Centrino notebooks aimed at businesses come with the Intel Extreme Graphics chip, which sucks majorly.
If you’re playing games, you will need a non-integrated GPU. Aside from upgradability, an integrated GPU will be using system memory as its graphics memory, which is much much slower than having dedicated video memory.
So yeah, integrated graphics are cool if you run graphically undemanding apps. For everyone else, a dedicated GPU is a must.
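For a rough sense of the gap (my own back-of-envelope numbers, not from the article): single-channel DDR-333 shared with the CPU peaks at roughly 333 MT/s × 8 bytes ≈ 2.7 GB/s, while even a mid-range card with 128-bit DDR at 300 MHz gets about 600 MT/s × 16 bytes ≈ 9.6 GB/s, all of it dedicated to the GPU.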
Wait, we don’t want Linux drivers; we want OSS Linux drivers. A company having OSS Linux drivers would be a first and would give them an amazing advantage. All Linux distros would carry them (even the all-free ones). Heck, their drivers could even be included in the kernel! It would also reduce a LOT of their maintenance cost, since a card like this would get very popular in the OSS community, and that would mean a lot of people working on it. And maybe a group of enthusiasts would port it to *BSD.
So how about sending them a nice e-mail requesting open-source drivers, and explaining nicely that it would only benefit them?
That is the problem if you are going after the GeForce MX sort of market. If Nvidia takes out a chipset license for the Pentium 4, a lot of computer manufacturers are going to just jump on board and ditch Intel chipsets, because then they can say ‘play the latest games with this computer’ when it has nothing more than built-in graphics. Nvidia could destroy the low-end market for other makers if they wished. These guys should target the high end, because they will be very vulnerable if they go after the low end.
Apparently Linux drivers are planned for Q1 2004…
That is too bad. I was hoping that they would provide non-Linux-specific (open-source XFree86) drivers that all the OSes could run. Isn’t that what DRI (http://dri.sourceforge.net/) provides?
I will have to stick with Nvidia on my FreeBSD system.
Well, the business graphics market is bigger than the gaming market. If it’s less for more, they will have a problem, but less for less is still viable. Of course more for less is nice, but let’s wait and see.
BTW, image quality is more important for a business desktop than graphics speed.
If you read the Tom’s Hardware review, you will see they specifically say Intel is the largest in the graphics market.
Normally you would say the company producing the most units is the largest. Take American Airlines, for example. They have more flights than anyone else and are listed on S&P reports, and everywhere else you look, as the largest, despite making losses.
We don’t want either Linux or BSD drivers; we need an AmigaOS/MorphOS driver 🙂 !!!
I also wonder about this: must you have DirectX to really be able to utilize some of the more advanced features on the Volari?
If so, an open DirectX spec is required. http://www.realtech-vr.com was working on such a thing.
But who wants high DirectX performance and crappy OpenGL?
And I can get better for less with a 3Dlabs Wildcat.
While Intel may not be a high-end graphics card maker, they still sold more graphics processors than ATI or Nvidia last year. Calling them secondary is far from accurate.
I suspect XGI will end up with some low-end parts too; another Parhelia.
I don’t just mean driver support; XGI could make it a point to win over all of the geeks out there by releasing specifications. Virtually every Linux user who buys a video card would buy XGI. That’s a BIG market.
Since they’re supporting Linux so early in their product cycle, you’d hope that they might actually work with the alternative-OS world to give us good support, and perhaps even go the open-source route with their drivers.
If these cards have 2D quality that rivals the Radeon series, I might consider buying one. First and foremost, I MUST have a card that is easy on my eyes for coding, and second, something fast enough for casual gaming. I think my Radeon 8500 is starting to show its age.
Nvidia has, IMHO, ALWAYS failed the 2D quality test.
They say that their cards support DirectX in hardware.
Yes, just like everyone else.
Is that merely hardware that’s DirectX compliant, or is the chip “winmodemed”?
The former.
As DirectX itself isn’t an open specification, could it just be a waste of chip space? … No, seriously – if it’s DX-crippled hardware, you have to look at how it’s crippled, or enhanced.
There is no such thing as “DX-crippled hardware.” DirectX is an interface that sits on top of the software drivers – just like OpenGL.
Hi!
As far as I know, there’s little chance that we will see open specifications for any high-end cards. It costs a lot to develop these cards, and the chip makers won’t give away all their ‘secrets’ just to satisfy the needs of the (still) small Linux/BSD/etc. community. I think having them all provide stable and reliable drivers for our platform too, ASAP, is a more reasonable goal. Actually, I don’t have any problems with my Nvidia. Sure, it would be nice to have it supported by the kernel itself, but as long as it does its job, it’s OK for me this way too.
bye, hirisov
What platform are you using? Because the Linux drivers do have a (relatively small) kernel component.
Why on earth mod someone down who states that this isn’t relevant for their use unless it has Linux drivers? This is an Operating System news site; he’s hardly off-topic. The tone of his post may have been quite terse, but he has an extremely valid point.
I’d like to upgrade my GeForce, but ATI drivers don’t have the compatibility I need, and GeForce cards don’t offer value for money, so I can well understand his statement.
“As far as I know, there’s little chance that we will see open specifications for any high-end cards. It costs a lot to develop these cards, and the chip makers won’t give away all their ‘secrets’ just to satisfy the needs of the (still) small Linux/BSD/etc. community.”
And what “secrets” would those be? (Never mind the circular nature of it all.) It’s kind of like the government saying “we can’t tell you, because it’s a secret.” Second, are these “secrets” more along the lines of “obvious to people in the industry” secrets? Or more “no one else can figure it out” secrets? Or just “call it a secret so they’ll stop pestering us” secrets?
It’s so hard to tell, because it’s a secret.
ThanatosNL wrote:
Instead of trying to make the best video card out there, they could release all of the specifications and watch as good-quality drivers pop up for all of the alternative platforms.
I could see such a company finding its way into every Linux user’s box in a few years, having its own niche market, and not having to worry about benchmarks as much.
I think you’re exactly correct. At some point, some video chipset/card manufacturer will realize that they can instantly summon hordes of geek customers by doing just what you describe. And geek customers are the ones helping their friends/family get their computer systems set up. Large companies have trouble seeing the long view when they’re hyperventilating over next quarter’s returns.
I wonder if XGI is pushing for PCI Express cards for 2004 like Nvidia and ATI. Kind of a waste, though.
3dfx anyone? SLI?
Just because they’re using dual chips, you shouldn’t draw comparisons with 3dfx; ATI has used dual chips before, and look at them. Also, if you’d bothered to do any reading past the headline, you’d see that they aren’t using SLI. There is more than one way to utilise multiple graphics processors.