A shadowy organization called Larrabee Development Group has set a most ambitious goal: unseating AMD/ATI and Nvidia as the largest producers of high-end graphics chips. And it might just succeed. It may seem unfathomable for an unknown such as Larrabee to knock off two of the most powerful processor companies. That is, until you realize that Larrabee is little more than a thin disguise for Intel. The chip giant has, in fact, ramped up its graphics efforts in recent weeks to create a product code-named Larrabee that is theoretically capable of besting AMD’s and Nvidia’s best kit.
Let the graphics wars begin anew!
I thought the next generation of graphics was going to be ray traced.
With ray tracing, wouldn’t you get all of the stuff that is ‘hacked’ into current graphics processors for free?
Ray tracing is computationally very expensive, even if you do most of it in hardware.
I have to agree with tomcat here: it’s not worth it to even try to do ray tracing in real time. There are many tricks you can do (such as bump- and normal-mapping) to achieve better graphics, but ray tracing isn’t one of them.
Ray tracing is parallel-processing friendly. Imagine a card manufacturer embedding two or more cores dedicated to tracing rays into gfx cards.
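To make that concrete, here’s a toy sketch of what “one core (or thread) per pixel” looks like, written against CUDA since it comes up later in this thread. This is purely my own illustration, not anything a vendor ships: one hard-coded sphere, primary rays only, no bounces, shadows, or scene structure. The point is that no pixel depends on any other, so the work divides cleanly across as many cores as you have.

#include <cstdio>
#include <cmath>
#include <cuda_runtime.h>

// One thread per pixel: every primary ray is independent of every
// other, which is exactly why ray tracing divides so cleanly across
// many parallel cores.
__global__ void traceKernel(unsigned char *image, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Camera at the origin looking down -z; map the pixel to a unit
    // ray direction.
    float dx = 2.0f * x / width - 1.0f;
    float dy = 2.0f * y / height - 1.0f;
    float dz = -1.0f;
    float inv = rsqrtf(dx * dx + dy * dy + dz * dz);
    dx *= inv; dy *= inv; dz *= inv;

    // One hard-coded sphere: center (0, 0, -3), radius 1. With the ray
    // origin at (0, 0, 0) and a unit direction, the hit distance t
    // solves t^2 + 2*b*t + c = 0, with b and c as below.
    float ocx = 0.0f, ocy = 0.0f, ocz = 3.0f;   // origin minus center
    float b = ocx * dx + ocy * dy + ocz * dz;
    float c = ocx * ocx + ocy * ocy + ocz * ocz - 1.0f;
    float disc = b * b - c;

    unsigned char shade = 0;                    // miss: black background
    if (disc > 0.0f) {
        float t = -b - sqrtf(disc);             // nearer intersection
        if (t > 0.0f) {
            // Shade by the z component of the unit surface normal,
            // i.e. a directional light shining down the -z axis.
            float nz = t * dz + 3.0f;
            shade = (unsigned char)(255.0f * fmaxf(nz, 0.0f));
        }
    }
    image[y * width + x] = shade;
}

int main()
{
    const int W = 512, H = 512;
    unsigned char *dImg;
    cudaMalloc((void **)&dImg, W * H);

    dim3 block(16, 16);
    dim3 grid((W + 15) / 16, (H + 15) / 16);
    traceKernel<<<grid, block>>>(dImg, W, H);

    static unsigned char img[W * H];
    cudaMemcpy(img, dImg, W * H, cudaMemcpyDeviceToHost);
    cudaFree(dImg);

    printf("P5\n%d %d\n255\n", W, H);           // grayscale PGM on stdout
    fwrite(img, 1, W * H, stdout);
    return 0;
}

Compile with nvcc and redirect stdout to a .pgm file to see the shaded sphere. Of course, the expensive parts of real ray tracing (secondary rays, acceleration structures) are exactly what this toy leaves out, which is tomcat’s point about cost.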
Or, while we’re at it, let’s just imagine a Beowulf cluster of these…
Um, no. Imagine, imagine, imagine… it’s like Hollywood magic!
Ray tracing is expensive; the hardware required to do it in real time at a performance level acceptable to a gamer is even more so. Adding more cores increases complexity and the cost to manufacture the item. The goal of a business is not to produce a product that is the best on the planet but that no one will buy because it costs too much (see SGI workstations for more information).
They want to make the best product they can, as cheaply as they can, so that the average consumer will *buy* it. It’s all about volume these days. Margins are very tight in the hardware business, so I doubt you’ll see anyone making the “real-time ray tracing” leap for a while yet. We still have a few more (a dozen, maybe?) generations of hardware to go through before we approach hardware that’s affordable and capable of that.
The SaarCor project is developing real-time ray-tracing hardware, and doing it pretty well and fast. On the web you can see scenes from Quake ray traced at 15 FPS on one of their first test cards (the card they made in 2004). http://www.saarcor.de
It would be really nice if they came out with something that’s at least competitive against Nvidia/ATI chips. More competition = lower prices.
How long does it take to go from hiring engineers to shipping a product I can go out and buy? If they offer open source drivers like they already do for other products, it can’t happen fast enough for my liking.
Desiring to knock off NVidia and ATI isn’t the same thing as doing it. Intel has thrown lots of dollars at this problem in the past and, so far, it hasn’t been successful. As they say, I’ll believe it when I see it.
Intel has never had a competitor like this before, one that offers both high-end CPUs and high-end graphics cards. In fact, Intel never wanted to make high-end graphics cards before; they were content offering integrated solutions. Lately we’ve seen Intel get more serious about their graphics chipsets, and this project seems to be an extension of that. Intel has a lot of money and a lot of engineers. If they really want to make a high-end graphics card, they will.
“Intel has a lot of money and a lot of engineers. If they really want to make a high-end graphics card, they will.”
Sure, but ATI and NVidia aren’t exactly just going to sit still while it happens. All of them are throwing heavy-duty R&D dollars at the problem of building faster cards. Plus, ATI & NVidia have been working with Microsoft for years on graphics card technology. To some extent, they have been helping to drive the technology forward, so it will take a concerted effort on Intel’s part to match that kind of focus.
Intel has been working with Microsoft for years too, and they have a lot more money, especially for R&D. I’m not saying Intel is going to take on ATI and Nvidia overnight, but they have the resources to do it.
“In fact, Intel never wanted to make high-end graphics cards before…”
umm… remember this garbage?
http://en.wikipedia.org/wiki/Intel740
Definitely not the first time they are trying to squeeze into this market. Sounds like they might be a little more intent on making a decent showing this time around though.
I am all for it.
If Intel is as open-source friendly with their next-gen graphics technology as they are with their current technology, I can’t wait to see Nvidia/ATI put in their place.
I have a GeForce 8800 GTX. It won’t work with my Dell 30″ FP because Nvidia’s Unix drivers don’t support dual-link TMDS on the G80 GPU yet.
If this were open source, I wouldn’t have had a $600 graphics card sitting in its box for over a month.
I like Nvidia better than ATI, but Nvidia is merely the lesser of two evils.
Dude, why did you buy a card like that without doing research? I mean, why would you want to put yourself in a position where your $600 graphics card doesn’t support your platform of choice?
It doesn’t take much to check the Nvidia site to see if they have drivers for *nix…
I would understand if you were bitching about performance being worse under Linux with SLI, or something of that sort, but this was your own fault.
Dear Genius,
Please review the following info from their driver:
Beta Driver – Linux Display Driver – x86
Version: 1.0-9742
Operating System: Linux x86
Release Date: November 8, 2006
BETA Driver
Release Highlights
* Adds support for GeForce 8800 GTX and GeForce 8800 GTS GPUs.
Yet with dual-link TMDS, all you get is a black screen (one of the Nvidia devs later revealed they have this in their bug tracker as a known bug).
This card in combination with a dual-link TMDS monitor is pretty rare, and they never mentioned in their driver documentation that this was an issue.
The Nvidia devs told me this will be fixed in the next release. It’s just too bad that it’s closed source, so I have to wait until it’s financially prudent for Nvidia to fix their damn product.
I’d like to add that if Intel provides open source drivers for THEIR “good” video hardware, it could possibly drive Nvidia and ATI to do the same.
Indeed. I do believe that this is Intel’s plan – after all, it has nothing to lose (and everything to gain) by open-sourcing its drivers in the current context.
Note, however, that there are persistent rumors that AMD will open-source the ATI drivers as well, which would leave only Nvidia with closed-source drivers.
I hope that’s the case, as I have a FreeBSD install that is languishing because of the ATI X1600 that is installed.
Nvidia and ATI (ATI more so) have both punished Linux and FOSS. Now that Intel is open source, it seems it’s payback time. PS: sooner or later ATI and Nvidia will have to open their drivers.
Wow, tech is really hotting up: Microsoft is running scared, Intel is pushing ahead, Oracle is invading. What a year!
I, for one, welcome this.
GPUs used to be very different from CPUs, but recent GPUs are starting to look like multicore processors with a little graphics-specific logic added on. Intel’s decades of experience in processor design can now be applied to graphics.
Actually, GPUs *are* multicore processors. Let’s say they are “RISC”, in a sense.
And I’d add: http://developer.nvidia.com/object/cuda.html
It’s a C extension framework for general-purpose programming on the GPU. It looks quite a bit more comfortable than writing shaders and reinventing addressing schemes for GPGPU.
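In case anyone hasn’t looked at it yet, here is roughly what that style looks like. A minimal sketch, assuming the publicly documented CUDA runtime calls (cudaMalloc, cudaMemcpy, the <<<...>>> launch syntax): a SAXPY kernel (y = a*x + y) written as plain C with an extra keyword, using ordinary array indexing instead of texture-coordinate gymnastics.

#include <cstdio>
#include <cuda_runtime.h>

// SAXPY (y = a*x + y) as a CUDA kernel: plain C plus the __global__
// keyword. Each thread computes its own global index and handles one
// element; arbitrary array indexing replaces shader addressing tricks.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    static float hx[n], hy[n];                  // host-side data
    for (int i = 0; i < n; i++) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Allocate device memory and copy the inputs over.
    float *x, *y;
    cudaMalloc((void **)&x, n * sizeof(float));
    cudaMalloc((void **)&y, n * sizeof(float));
    cudaMemcpy(x, hx, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(y, hy, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);

    cudaMemcpy(hy, y, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expected 5.0)\n", hy[0]);

    cudaFree(x);
    cudaFree(y);
    return 0;
}

Compare that to stuffing the same computation into a fragment shader: you’d be rendering a full-screen quad, encoding the arrays as textures, and decoding results from a framebuffer. The addressing scheme above is just i.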
Intel wants to offer customers the complete solution. Personally, I think they should have partnered with Nvidia, but it’s cheaper for Intel to do it this way. ATI, Nvidia, S3, and Intel: let the games begin…
In order to do this, you need a fanboy base. To generate a fanboy base, you need overclockable hardware and more FPS than anybody else.
That’s it.