The future of integrated graphics processors lies somewhere on the dies of future processors, that much is a certainty. However, this creates a big problem for NVIDIA, whose chipset business will be out of, well, business. Beating everybody to the punch, the company announced yesterday that it is ceasing all development on future chipsets, citing unfair business practices from Intel.
It is often rumoured that NVIDIA has a skunkworks project in which it is developing its own x86 processor with an on-die graphics processor. Intel and AMD are working on similar chips, whose arrival would mean the beginning of the end of NVIDIA’s chipset business.
For now we don’t know if this project will ever reach the market, but NVIDIA decided not to wait for the inevitable and to quit the chipset business before making any major new investments in research and development. The company is ceasing all development on chipsets; there will be no Nehalem or Core i5/i7 chipsets coming out of NVIDIA. Questions about chipsets for AMD were carefully avoided, so it is likely they’ll back out of that market too. NVIDIA will continue to make the Ion and 9400M chipsets, but future designs are all off the map now.
The big problem is Intel’s DMI bus. Intel claims that NVIDIA does not have a license to this new bus, effectively making it impossible for NVIDIA to develop chipsets for Nehalem and newer Intel processors. In the pre-Nehalem era, such a license wasn’t necessary, but with the newer processor architectures, NVIDIA does need a license. Without the DMI bus, NVIDIA’s PCH replacement wouldn’t be able to talk to the processor. NVIDIA claims it does have a license, and the matter is currently in court.
“Because of Intel’s improper claims to customers and the market that we aren’t licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs,” NVIDIA writes. “So, until we resolve this matter in court next year, we’ll postpone further chipset investments for Intel DMI CPUs.”
As Jon Stokes over at Ars points out, while it is of course convenient to place the blame on Intel (and it probably does belong there at least partially), a bigger issue is probably the future of IGPs, as mentioned above. IGPs are moving onto the processor die, which means that even if NVIDIA does gain a license, its chipsets would have a lifespan of only a few years because of the arrival of CPU/GPU combo chips. Add to that the uncertainty of NVIDIA gaining a license at all, and it only makes sense to back out now, before any investments are made into research and development.
Apple is in an interesting position here. The Cupertino company has more or less standardised on using Intel processors with NVIDIA chipsets (the 9400M), so it will be interesting to see what Apple will use for future Macs. I’m not familiar with Intel’s roadmaps, but I find it hard to believe that Intel’s IGP offerings will be as competitive performance-wise as NVIDIA’s would be, and on top of that, do the IGPs on Intel’s roadmap say anything about OpenCL support?
I do hope that these tensions between Intel and NVIDIA subside, because I would really welcome NVIDIA entering the x86 market with their own processors with integrated graphics chips. The consumer processor market is dominated by Intel and AMD, and more competition is always welcome in this space. VIA is having trouble getting any serious foothold, so maybe NVIDIA’s brand recognition can do the trick here.
For Apple it’s more convenient to move to AMD CPUs now, so it can continue using nVidia or ATI (AMD) GPUs, which are way more powerful than Intel’s.
This nVidia move actually helps ATI more than it damages Intel.
I dunno, I doubt they would. Apple likes standardization, and switching to AMD would entail at least some variation in which instruction set extensions are available. Also, if I’m not mistaken, current-generation AMD chips run hotter than their Intel counterparts, something that could present a problem for Apple’s ever-shrinking form factors.
I have long suspected Apple of moving to ARM for their desktop and laptop machines (so it is ARM all the way from iPod to desktop). Apple stated compute power per watt as a major reason for changing from PowerPC to x86, so moving once again, this time to ARM, would fit this reasoning. At the time of the change from PowerPC, ARM did not have any processors with sufficient compute power to replace PowerPC, but now they do. And Apple is investing a lot of money in in-house hardware design, which most people believe to be focused on ARM SoCs.
In any case, if Apple does go the ARM way, it would not matter to them if NVidia won’t make Intel-compatible chipsets anymore.
As for NVidia making an x86-compatible processor, they could just make a good JIT for translating x86 to ARM. Digital did this for running x86 on their Alpha, and Transmeta used a similar approach with their Crusoe processor (except that the translation was done on-chip). So it is eminently possible. It might be a bit slower than running native ARM code, but if all the heavy-duty processing (graphics etc.) is done natively, the overall slowdown wouldn’t be great.
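Purely as an illustration of the idea (a toy sketch, not NVIDIA’s or anyone’s actual design): a dynamic binary translator decodes each guest instruction once, caches the translated host code, and reuses the cached result on every later execution, which is how most of the translation overhead gets amortised. Here is a minimal C sketch of that translate-and-cache loop; the two-instruction “guest ISA” is made up solely for the example, standing in for real x86.

/* Toy illustration of the translate-and-cache loop at the heart of a
 * dynamic binary translator. The "guest ISA" is a made-up two-instruction
 * toy, not real x86; a real translator would emit host machine code. */
#include <stdio.h>

enum guest_op { OP_ADD1, OP_DOUBLE, OP_HALT };

/* A tiny guest program: add 1, double, add 1, halt. */
static const enum guest_op guest_code[] = { OP_ADD1, OP_DOUBLE, OP_ADD1, OP_HALT };

/* "Host code": one native function per translated guest instruction. */
typedef long (*host_block)(long reg);
static long host_add1(long r)   { return r + 1; }
static long host_double(long r) { return r * 2; }

/* Translation cache: guest PC -> already-translated host code. */
static host_block tcache[sizeof guest_code / sizeof guest_code[0]];

static host_block translate(enum guest_op op)
{
    switch (op) {
    case OP_ADD1:   return host_add1;
    case OP_DOUBLE: return host_double;
    default:        return NULL;        /* OP_HALT: nothing to run */
    }
}

int main(void)
{
    long reg = 5;                       /* toy guest register */
    for (size_t pc = 0; guest_code[pc] != OP_HALT; pc++) {
        if (!tcache[pc])                /* translate only on first visit... */
            tcache[pc] = translate(guest_code[pc]);
        reg = tcache[pc](reg);          /* ...then reuse the cached result */
    }
    printf("guest register = %ld\n", reg);   /* (5+1)*2+1 = 13 */
    return 0;
}

A real translator emits actual host machine code into an executable buffer and chains translated blocks together, but the translate-once, cache, re-execute structure is the same, which is why the slowdown stays modest once the hot code has been translated.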
nVidia doesn’t have a license to make x86 CPUs, does it? If not, I doubt Intel would be willing to sell them one, and I don’t know if AMD can or would want to.
They could always buy VIA, or sue Intel for antitrust stuff, I guess.
VIA is a viable option with the Nano. The same holds for the new dual-core ARM. Finally, there is plenty of opportunity with the Godson processor (it aims to be dual-core).
VIA chips aren’t suffering due to a lack of name recognition, they are suffering due to poor performance. My parents bought a computer with a Cyrix (precursor to VIA’s) processor; it was cheaper and almost as powerful as the cheapest Intel. I thought VIA would be able to boost performance when they bought Cyrix and develop some really competitive offerings thanks to their experience building some of the best chipsets, but that proved to be wrong. Sure, VIA has a niche in the low-power/embedded x86 market, but it’s not good enough for mainstream consumers.
And thus you have proven why your whole post is wrong – the last time you used a VIA CPU was when it was called Cyrix. Right now VIA suffers because the chipsets that support its CPUs are crap – buggy and unreliable whether running Windows or Linux. The same sort of bugginess one saw years ago when the only chipsets available for AMD were from VIA. What VIA needs is for its low-power cores to be optimised for performance, multi-core and hyperthreading – couple that with a good chipset from Nvidia and there will be a viable alternative.
Nvidia also needs to focus on getting their GPUs to do more work when it comes to audio and video compression, and on improving gaming performance. Nvidia + VIA could work, but I am not a fan of Nvidia simply because they’ve demonstrated that they have no interest in fixing the manufacturing flaws which have caused massive recalls of 8400 GPUs and faulty chipsets.
There is a reason why I’ll be avoiding all Nvidia products in the future – and if that means I can’t upgrade to new Apple products, then it is a choice I’m willing to make. Apple need to realise that Nvidia product quality is crap and they should stick to using Intel CPUs and chipsets with graphics supplied by AMD/ATI. More people are taking my position having seen the fallout of Nvidia’s poor quality control, and we’re happy that we bought our MacBooks when the Intel X3100 was still used.
Well, not madness, but the new Core i7 is pushing more and more into SoC territory. Putting everything on one die is faster and ends up being cheaper for whole systems. NVidia doesn’t have a place in this new world.
It’s getting about time x86 and Wintel had more competition. Right now Intel rules (at the high end), but it’s just a matter of time before they start making P4/Rambus-type mistakes again.
I think that is sad and unhealthy and should be investigated, but it is the truth.
Intel dominates… AMD is struggling to stay alive.
Maybe Apple might help now. An Apple with a powerful AMD/ATI interior would be nice… I don’t think Steve drinks the Larrabee (or whatever it is called) kool-aid.
I don’t know if it’s still true, but AMD processors are usually hotter than Intel’s, and considering that one of the selling points of a Mac is the low fan noise, they wouldn’t fit very well.
So true, so true.
Low fan noise is a Mac selling point? Must be one of those little things I never pay attention to. Dells are pretty quiet as well. Come to think of it, I haven’t run across any loud computers (non-server-grade equipment) in a while. My AMD quad-core beast is whisper quiet with just the stock fan.
Maybe the thermal properties aren’t appropriate for the more compact iMac form factor, but noise shouldn’t be a problem in the Mac Pros or even the Mac Minis.
The AthlonXP ran quite hot, especially at higher clockspeeds. But AMD’s 64-bit CPUs seem to run relatively cool – it probably also helps that the heatsinks & heatsink fans are larger (a larger fan can move the same volume of air while spinning at a lower speed, making it quieter).
AMD also seems to be lacking in the below-35W segment, last I checked. It might have got better, but one of AMD’s problems has often been how power-hungry their chips are. Sure, the CPU is not everything in power consumption, but a 10W jump will be felt hard, and let’s face it, the best-selling Macs are probably the MacBooks.
What do you think Apple will do to Intel? Really…
Apple’s computer business is not a big enough deal for Intel, considering they might actually take the GPU market and convert it into an SoC market – all for themselves.
Then everyone will be standing and watching the “Umbrella” corporation of microprocessors.
Hell, they aren’t even taking their chances with the Wintel duopoly (I’m referring to Moblin).
Nvidia shouldn’t need a license either for using the bus architecture or for building x86 CPUs. Patenting these kinds of protocols and ISAs shouldn’t be possible in the first place. If it really is, Intel should be obliged to license them because of its dominant market position.
ssssssshhhhhh.
hardware patents are good. software patents are bad.
//sorry, couldn’t resist. One of my pet peeves
Intel restricting the x86 instruction set… might as well allow Microsoft to patent C# and .NET so no one can create a compatible compiler.
It certainly comes across as majorly anti-competitive. I mean, when you have to pay a company to be able to become their competitor, there’s definitely a problem.
I don’t know enough to know whether hardware patents are always bad, but this seems like a clear-cut case where it’s causing major problems with competition. All the x86 patents and their ilk are doing is helping Intel become a monopoly. They aren’t an outright monopoly, but they’re close enough to cause problems. However, much as we’d like the patent situation to change, I doubt that we will anytime soon.
You don’t need good graphics to run iTunes.
I think current Intel graphics should be good enough to run the 3 games that work in OSX since all 3 are about 5 years old.
Apple using NVidia was wishful thinking on Apple’s part trying to get developers to use their platform. Fail.
Now with the introduction of OpenCL in Snow Leopard, the power of the GPU will be more important. Not yet, but in a year or two there should be some apps that really take advantage of the GPU via OpenCL. Intel has nothing to offer in this area right now, only promises for the future, which Intel is well known for not delivering on.
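To give an idea of what “taking advantage of the GPU via OpenCL” looks like in practice, here is a minimal, purely illustrative example (OpenCL 1.0-era C host code, error handling omitted): the host compiles a tiny kernel, and the GPU runs one work-item per array element in parallel.

/* Minimal OpenCL sketch: queue a kernel that doubles every element of a
 * buffer in parallel on the GPU. Error handling omitted for brevity. */
#include <stdio.h>
#include <CL/cl.h>              /* <OpenCL/opencl.h> on Mac OS X */

static const char *src =
    "__kernel void scale(__global float *v, const float k) {\n"
    "    size_t i = get_global_id(0);\n"
    "    v[i] *= k;\n"
    "}\n";

int main(void)
{
    enum { N = 1024 };
    float data[N];
    for (int i = 0; i < N; i++) data[i] = (float)i;

    cl_platform_id plat;  cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx     = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel kern  = clCreateKernel(prog, "scale", NULL);

    /* Copy the host array into a device buffer. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof data, data, NULL);
    float factor = 2.0f;
    clSetKernelArg(kern, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(kern, 1, sizeof(float), &factor);

    /* One work-item per element; the GPU runs them in parallel. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, kern, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof data, data, 0, NULL, NULL);

    printf("data[3] = %.1f\n", data[3]);   /* expect 6.0 */

    clReleaseMemObject(buf); clReleaseKernel(kern); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}

Doubling an array is of course trivial; the point is the pattern, which is what would let video filters, encoders and scientific code hand their data-parallel inner loops to the GPU.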
It would be foolish of Apple not to look into offering AMD solutions, now. Even if it’s just for the iMacs and the mini. ‘Cos they like to have options 😉
Using “Apple” and “options” in one sentence is an oxymoron.
I was referring to the sentence “We like to have options” that occurs a few times in Apple’s history, that’s all 😉
Do you really think anyone doing heavy movie editing would agree with you? 3D modelling? Don’t OpenCL and CUDA count as a reason to have a beefy GPU even though you do not play games?
I for one am happy I can continue to play BF2142 in Boot Camp. That wouldn’t have been possible using Intel graphics.
Want to do something really useful for a change? Anyone else think that Intel might be abusing a monopoly here just a little? I used to love the nvidia chipsets; good times indeed. Sad to see them going. Looking forward to ION2 for Intel’s low-power chips and VIA’s Nano, one more month now…
If Intel is integrating almost all the functions the northbridge used to provide… then what exactly is the point of nvidia “chipsets”? The southbridge is just for IO, and there isn’t a lot you can do to improve that at the moment. So, what is the EU going to do? Force Intel to keep two lines going forever – one with everything integrated, and one with an external support chip, just so nvidia can continue making chips?
Making a product better does not count as abuse of dominant position. Otherwise we would have not had even combine harvesters, because the farm workers would have risen against them.
Of course. Don’t you know? When your business is failing it’s never because you make crappy products or make bad management decisions. It’s always someone else’s fault. The legislation, the competition being better… err… unfair, the moon’s gravitational pull, not enough virgins to sacrifice, etc. etc.
Important point: it’s never your fault.
Nvidia should at least consider buying VIA. That way they could still keep chipsets like ION around, competing in the netbook/nettop/embedded market.
After a few years they’ll be able to get into high-end CPUs again.
VIA has crypto acceleration. Can you imagine OpenSolaris on a Nano+9400M?
Wasn’t there some kind of an agreement between Intel and VIA that if someone buys VIA the licenses are not transferred to the buyer?
Well maybe it’s not formulated exactly like that, but I remember reading something like that here on OSNews.
Nvidia has a proven history of leaving open source operating systems out in the cold. It refuses to release documentation for its chips and chipsets, and it doesn’t look like it’s gonna change anytime soon. If one wants open drivers, keep away from nvidia.
What the hell does this have to do with the article?
I merely reacted to the author’s hope that nvidia will (re)enter the chips/chipsets market once again with integrated technology. Every piece of hardware needs a driver. How will you write it if you don’t have documentation? All I said was that nvidia never was, and so far isn’t, open source friendly. Fat lot of good it will do us to have competition if you can’t write drivers for it.
Well, nVidia might not be good at opening up their source code or letting third parties write drivers, but they DO make good drivers themselves. I never had problems with nVidia drivers when I had a dedicated nVidia chip; at work we’re having problems with ATI graphics cards that apparently have no free drivers, and the proprietary ATI drivers (better than they were in terms of installation, etc.) are causing everything to briefly and randomly lock up. THAT took a while to trace back to the ATI drivers…
nVidia seems to have 64-bit drivers, while AMD apparently wants you to install 32-bit drivers with 32-bit libraries in place to use them. And AMD is releasing new drivers about once a month that mostly just keep trying to fix dual-monitor issues and problems with their own control console.
When it comes to Windows AMD/ATI drivers are great. No more am I beholden to constant leaked beta drivers and hard OS locks. AMD release drivers once a month and I have yet to have the system lock on me requiring the obligatory one fingered salute.
The problem is that Nvidia provides a good driver even though it is closed. Install through the Mandriva install wizard, install from the Debian repository, install from the Nvidia binary download; they all work cleanly and provide stable performance. Further still, Nvidia’s own developers also contribute to the community driver project.
As an end user, I’d happily move to the community driver if it shows the same or better stability and function coverage.
By contrast, the ATI binary was horrid when last I tried it (fps in the 20s kind of horrid). The community driver didn’t support all the functions of the graphics card, but I could get a rational FPS rate out of it. I don’t know how the current drivers are doing now that AMD has provided information, though.
The point is that in the case of my ATI a few years back, it was the community driver providing stability and performance, whereas now, with the Nvidia, the community driver has not caught up to the closed binary.
I wouldn’t say Nvidia is leaving anyone out in the cold as long as that binary installer remains available. They choose to go it alone with driver development though, and that’s fine provided they can offer equal or better support than a community driver would. This also assumes that the community inherits the required information should Nvidia decide to stop development of older hardware drivers or *nix drivers entirely.
A better company to look at would be Broadcom and other companies who have truly left *nix platforms out in the cold with absolutely no chance of support without reverse-engineering the interface specs.
I am sure the binary nvidia driver is wonderful. However, I don’t use Linux and/or any of the distros you mentioned. Nvidia does not provide drivers for a lot of systems, an important issue especially on OSNews, the site that is not concerned only with the penguin. Yes, Broadcom is just as bad, and there are more companies like that. I try to stay away from them as much as possible. I don’t care how much fps the nvidia driver gets; if I want to play games, I’ll use a different OS. But I want my hard drive to be usable through the crappy nvidia AHCI chipset long after nvidia’s gone out of business or transcends to a higher dimension or whatever happens to them. Blobs are bad for your health.
Mind telling us what open source OS you use that you expect excellent drivers for?
I expect an excellent driver for any and all OSes :]
I use OpenBSD.
It was hardware that’s kept me from going to Open or FreeBSD. A fully open driver for hardware would be the ideal I think also. In absence of an open driver, one has to go with the driver that works. Unfortunately, being selective about hardware purchases with less mainstream platforms is still a requirement. Get your list of wanted hardware then compare it with your lists of supported hardware. This process sucks but remains as long as the hardware vendors continue to treat non-Windows users like lepers.
Yes.
But increasingly the “four freedoms” are moot. NDAs here and NDAs there.
On the subject, I’d like to know how many non-Intel-employed engineers work on the Intel X drivers.
Our only hope of gaining the 4 freedoms with respect to graphics is the Open Graphics Project.
I would happily fork out EUR 200,- for a not really fast graphics card, as long as “free” is a part of the package.
Frames per second is not just about gaming. Granted, the gaming size queens use the metric most, but I mean with a basic X display. The ATI closed driver couldn’t push X without tweaking, whereas the community driver could kick out enough frames per second to display a smooth desktop. (It must have improved since the AMD buyout.)
hahahahahhahahaha
The nv driver is a pile of crap. Further, it is completely obfuscated so as to prevent other people from picking up the code and improving on it. The nouveau devs are a brave lot to even try to understand it.
That’s kind of my point, the best performing driver happens to be the closed blob from the hardware vendor in this case. The ideal would be open specs and a FOSS driver that could be ported to any platform. At this time though, there isn’t a better pick for performance and driver support than Nvidia is there? AMD is not there yet and Intel isn’t even in the same game.
Yes, the nvidia driver in Linux behaves a lot better than the ATI one.
I switched motherboards from an ASRock A780GM-LE (ATI HD3200) to a comparable (same price) ASRock K10N78FullHD (nvidia 8200), and trying out a new kernel (Ubuntu beta etc.) doesn’t break the graphics anymore.
With ATI you had to wait till they released something new. In the meantime you were stuck with VESA.
…don’t seem to be splitting ways anytime soon. Considering Apple’s push for the Light Peak interconnect, they probably made some other agreements to tie their lines together in other areas as well. I’m sure Apple knew about NVidia’s issues for a while – and was probably asked (by NVidia) to push Intel into allowing the bus license. I can see Apple and Intel having a long discussion over what to do about NVidia (and each other) until the decision was made to go all-Intel. That would match the way Apple does hardware: as few vendors as possible. Get NVidia out of the way and you have a platform with centralized and integrated hardware (the ‘Apple’ way). In exchange for the ‘whole enchilada’ Apple got Light Peak – a distinctly ‘Apple’ piece of… something, which Intel probably didn’t think was such a great idea with USB3 coming so soon.
…at least that’s just how it looks.
Intel doesn’t develop new hardware for nothing, Apple owes them big. And as much as I like AMD, I just don’t see Apple going there.
“I do hope that these tensions between Intel and NVIDIA subside, because I would really welcome NVIDIA entering the x86 market with their own processors with integrated graphics chips.”
Did you mean “subsist” instead of “subside”? Otherwise, it looks like a contradictory statement. It is because of these tensions that Nvidia wants to build its own x86 processors, instead of working closely with Intel on new chipsets, integrated graphics or whatever.
BTW, in my admittedly controversial opinion, the problem is not that Intel is abusing an alleged “monopoly position”; Intel is simply using the outright, legal monopoly it has in a bus design for the purpose of protecting its future integrated graphics from a potential competitor. That’s what patents do; they are the problem.
Eh, no. The tensions between Intel and NVIDIA should subside, so that the latter can start making x86 processors – which they will have to do anyway because the chipset market is going down the drain thanks to CPU/GPU combos.
Ah, sorry. I assumed they could just buy Via or make their own x86-compatible design, even if they have no agreement with Intel. Well, they can always turn to ARM chips for Linux systems :p
In my opinion, the ARM processor offers the only hope for breaking the Intel monopoly. I was going to buy a new computer this year, but I’ve put off any purchase in the hope that the ARM-based machines will be coming out in 2010.
Well, that was my very reasoning when I bought an Archimedes in 1987. Sadly, I’m now replying from a PC compatible machine.
A lot of people are wondering about what Apple is going to do. Maybe since they’ve acquired PA Semi they have other plans.
Otherwise the new technologies in Snow Leopard will have been developed for nothing if Intel plans to make their own SoC with a graphics accelerator. It’s been stated, and I think we can agree, that Apple knew about 1) Intel’s plans for an SoC, 2) Nvidia’s troubles, and 3) AMD’s lesser offerings in the graphics dept.
In any event I believe Apple have a plan in place, and I’m not really worried about what is going to happen to them if Nvidia and/or AMD stop making graphics chips.
I suppose that would also help with the Psystar situation, if they started making their OWN graphics chips. Still, that seems like an awful lot of time and effort when many companies center around making such products.
A long time ago nVidia created many good chipsets – the nForce2 family, the nForce4 family – but starting with nForce5, quality, features and technical advancement stalled; they worked, but not as well as in the old days.
In the meantime Intel created a very nice family of chipsets with the 965P/965G/945P/945G, along with newer versions: Q35/Q45/G35/G45/G33/G31/G43/G41/… Same for AMD here: the 780G was quite a small revolution, such great integrated graphics with 40 shaders and such low power consumption, and the same goes for its successors, like the 790GX and 785G.
That was also nVidia’s problem: their chipsets were much more power-hungry, and because of that they also created more heat.
Of their last chipsets, the one with the integrated GeForce 9400M was nice, and ION also brings something interesting.
Looking at their current REBRANDING policy, where the 8800GS becomes the 9600GSO, which becomes the GT130, is sick; the same goes for many other graphics cards from them.
IMHO it’s VERY GOOD that they are stopping; maybe some very custom solutions (like ION) would be nice, but looking at their recent Core 2 chipsets, nothing changed in the newer versions (just like with the rebranding of graphics cards).
Also, I really did not like the problems they created for SLI support. AMD gave away all the specifications manufacturers needed, so Intel, when creating chipsets, could add CrossFire support without any problems. For nVidia’s SLI, you had to buy special NF200 chips to add SLI support, and you had to buy TWO of those to have quad-SLI support… instead of just implementing that in the main chipset core logic.
And good riddance to bad rubbish. At least one won’t have to worry about what kind of defective crap is in an nvidia-based laptop…
+1000. I would love to stick my HP tx1210 up an HP or nvidia engineer’s ass. The laptop I really want to buy right now is the ASUS G51; unfortunately it has the nvidia chipset (at least it is replaceable), and everyone says it runs hot as hell.
Pissing off large parts of your customer base doesn’t help a company.
Locking out driver docs, generating lots of heat/pulling lots of power, rebranding the same crap several times, attempting to lock out competition with physics processing.
Why should anyone feel sorry for them?
I don’t feel particularly sorry for nVidia, but I do feel sorry for the chip markets. Because the biggest villain is still in town.
Let’s face it, Intel’s hardware graphics sucks, big time! AMD made the smart move to buy ATi, and Intel should make the smart move to buy nVidia. AMD is supposedly working on CPU/GPU combined processors, and Intel will be left with no competing product unless it either joins forces with nVidia or just buys them. It’s the smart move…
Speculating here, but I bet nvidia overvalued themselves.
Intel has Larrabee coming. That is what is supposed to compete with ATI/AMD.
Considering Larrabee is just another x86-based architecture, I suspect it may be quite open-source friendly. It could very well be a serious game changer (if it doesn’t suck).
All I know is that the Intel graphics accelerator in my ASUS 1005HA leaves a lot to be desired! http://bit.ly/44CHFm
If NVIDIA can make a better solution, I will buy that netbook instantly!
I ran some benchmarks and give more detail on the ASUS 1005HA’s graphics abilities in this review: http://bit.ly/44CHFm
What the heck is it with you and advertising that Asus in every single comment of yours?