“AMD’s plan to integrate a CPU and GPU on a single chip, codenamed Fusion, has been a hot topic since the CPU manufacturer first announced it planned to purchase ATI. AMD hasn’t had much to say about Fusion lately, beyond confirming that the project exists and is still in development, but new rumors have surfaced over what form Fusion might take when it surfaces in 2009.”
Hey, it sounds like they're trying to resurrect the MediaGX
I’m glad they’re opening documentation and thinking outside the box… but they’re sacrificing “choice”.
If this is the future of AMD/ATI graphics controllers, it will mean it won’t be possible to use combinations – an ATI graphics card with an Intel processor, for instance.
Hopefully this will be a second market, embedded perhaps… because, frankly, what if someone wanted to use an AMD card on a non-x86 system? Nothing explicitly disallows that on systems with a PCI/AGP bus.
This also raises an interesting question as to where Apple fits into the equation. Will we eventually see the multicore monster Intel is developing used in future workstations? If Nvidia becomes the only viable option – given the current problems with Nvidia graphics cards – how is Apple going to cope with that fallout, and possible future fallouts?
Although this does open up possibilities, it also limits the potential for future competition in the graphics market – especially for OEM vendors who want to chop and change components to meet customers’ demands.
Right now I am using an ATI card on an Nvidia chipset with onboard video. You can do the same with an Intel chipset with onboard video.
I’m not sure why moving the onboard GPU to the CPU would prevent this sort of thing; I don’t see it limiting choice at all, unless they artificially prevent it.
How is this going to limit choice or change anything? This combo will be aimed at low cost desktops and laptops. Much like their old MediaGX chips.
I was replying to BSDFan and saying the same thing: unless they artificially limit the external GPU options, things will work the same way.
Oh, yeah, I responded to the wrong comment!
“Fusion” chips aren’t going to be graphics chips; they’re going to be processors. A motherboard with a Fusion chip is going to look like a motherboard with an IGP, except the IGP will be integrated with the CPU rather than on the board.
The idea is that having a GPU on die will shorten the distance between the CPU and the integrated GPU, and once a discrete graphics card is installed, the on-die GPU will act either as a co-processor to the CPU or as an additional graphics resource – built-in Crossfire.
Fusion is basically AMD’s next-gen laptop chip. The seeds of this have already been sown with AMD’s nifty new technology that lets laptops run graphics off either a discrete graphics card or the IGP, switching between the two seamlessly.
Oh, wow. I never thought about that scenario. That’d actually be pretty sweet. Even when a graphics chip is too old to properly run a super-intensive game/3D app/whatever, it’ll still trump the general-purpose CPU at a lot of tasks…
I guess that since Intel is going to announce its new awesome x86-capable GPU on August 12th, AMD is just trying to sneak into the headlines. Check this one out: http://www.intel.com/pressroom/archive/releases/20080804fact.htm?ii… .