“ST Microelectronics Inc. has decided to put its graphics operations up for sale, placing the future of the Kyro graphics accelerator in limbo. The announcement comes after the troubled graphics accelerator, designed by PowerVR Technologies, itself a division of the U.K.’s Imagination Technologies PLC, missed several major milestones that company executives outlined in an exclusive interview last June.” Read the rest of the report at ExtremeTech. Our Take: With 3Dfx long gone, Matrox and 3DLabs already out of the 3D gaming market, and SiS, VIA, Intel and Trident unable to produce fast 3D chipsets, the ball is now left with the duopoly of ATi and (mainly) nVidia. I wish ATi good luck, as I just can’t handle yet another monopoly in the tech world. I want choice, I need diversity.
It is a huge shame, since the Kyro I and II were very good gfx chips, and all the information available on the Kyro III hinted at it being a damn good unit as well.
I hope someone picks up this technology and continues with it.
Judging from what certain developers working on graphics card solutions have said, nVidia’s management team greatly favours the monopolization of the OS industry (by Windows). For that reason alone I personally don’t touch any of their solutions and prefer ATI or Matrox, depending on the task. I must add, though, that the small Linux division at nVidia appears to be made up of helpful people as far as development support goes.
The only difference with an nVidia monopoly is that they drove their competitors out of business with better products, not with ubiquitous installations like Microsoft. ATI holds that distinction. There is still lots of competition, especially in the non-performance graphics markets, like those 3000 PCs at xyz company that need no real acceleration. Please point out the graphics cards nVidia markets to businesses; the MX line is still overpowered and overpriced for those markets. There is plenty of competition, you just don’t see it because you’re a savvy user who often sees things from the performance perspective. The Kyros had good technology but were simply underpowered for the market they attempted to break into. That being said, 3dfx was the only company that had open drivers for Linux; nVidia and ATI are both anti-open with regard to drivers. 3dfx had the same thing happen to them: miss a beat and you lose the game. It’s fast paced, it’s competition.
Maybe SiS will buy ST’s gfx operations. SiS gfx cards have always been OK. If you don’t play 3D games and experiment with lots of OSs, the SiS 6326 based cards work with damn near any OS. They’ve needed some improvements in acceleration, though, which they’ve seen on their newer chips. SiS is the ONLY company producing a gfx chip with hardware T&L besides the duopoly. SiS could take the best technologies from both the Kyro and their own line to produce a fair competitor to nVidia and ATI.
Actually, I’m hoping that Sony will release something a bit more powerful than the PS2 with the same design. We’ve got GNU/Linux on the PS2; if it works, a Sony workstation wouldn’t be so bad.
Anyway, it’s a shame we can’t see (and buy) Kyro III based chips.
Anonymous asked who puts low-end graphics in PCs targeted at business?
Intel does: they ship a range of chipsets with integrated graphics. SiS do as well, and even VIA have some chipsets with integrated Trident graphics for the low end.
The P4 changes that game, however, as until recently there were no P4 chipsets with integrated graphics. But Intel will be shipping the 845G soon.
The latest Kyro chip was so good that nVidia actually feared its competition, especially since the more intelligent approach of the design makes the cards cheaper as less expensive memory can be used without making the card slower.
This feature even led better-known firms like Hercules to switch to the Kyro, and now they pull out of the business??
To be honest, nVidia chipsets aren’t really better than anything else; they’re just faster in terms of 3D performance (and even that is achieved through faster clocking and more expensive memory), but they really suck in the 2D department, which should be just as important for anyone who isn’t a full-time gamer! The only problem is that the average person doesn’t even notice how bad the cards are, because almost every off-the-shelf PC contains nVidia.
The nVidia chips haven’t yielded much innovation since the first TNT, which is why they were so frightened by the Kyro, which kicks ass not through simply faster clocking but through a more intelligent approach to rendering.
I really hope that ATI gains some ground in this business, otherwise I might have to bite into the sour apple of the slower 3D performance of Matrox cards in the future, as I simply cannot stand the poor 2D quality of cards with nVidia chipsets…
>>The latest Kyro chip was so good that nVidia actually feared its competition,<<
Noticed, maybe . . . but feared?
>>especially since the more intelligent approach of the design makes the cards cheaper as less expensive memory can be used without making the card slower. <<
Yeah, the Kyro cards have a couple of neat tricks which I’m sure ST or PowerVR have patented, but those same “neat tricks” caused quite a few driver issues (as most new technologies would).
>>This feature even led better-known firms like Hercules to switch to the Kyro, and now they pull out of the business??<<
Hercules/Guillemot did not SWITCH to the Kyro, they simply offered a Kyro-based card in ADDITION to their nVidia ones. Hercules was trying to use its name recognition to “bless” the Kyro for others to use. With no competitor, nVidia would hike its prices sky high (ATI didn’t license its chips to third parties), and Hercules/Guillemot was looking to stop MS-like tactics from nVidia. They didn’t “jump” all over the Kyro because it was sooo amazing; they simply wanted more than one supplier of gfx chips to choose from.
This is not as big an issue as it may appear. STM produced the silicon for the Kyro. But Imagination Technologies (IT) produced the design and the drivers. As such, design work can continue and the drivers go on.
Also… IT has been doing this for some time. Their old partner was NEC, and later STM took their place. So this has all happened before; it just didn’t get as much press, I believe. And before, IT went on and came out with the Kyro. History will possibly repeat itself in this case.
As far as the Kyros being underpowered or their “neat tricks” causing problems… All I have to say is have you used one and compared it with others? I’d say not.
Because as far as the “power” goes… It was originally pitted against the MX and the GeForce 2 GTS, and I can say it EASILY beats the snot out of the MX in all the games I personally compared them in. I can say this… One of my friends has an MX and I have a GeForce 2 Go (32 MB), which is comparable. As for the others in its bracket of capability… I don’t have any other GeForces in its range right now to compare it with, so I honestly cannot say and will not speculate based on the other video cards I own.
And the driver problems… Out of all my video cards… (Matrox, 3dfx, nVidia, SiS, and S3) Thus far… In comparison it has had incredibly few problems.
As for Hercules… I believe they use strictly Kyros for their low-mid-end video cards now and ATI for their high-end. I haven’t heard of any new Geforces coming from them. In fact… I’ve heard that they’ve dropped nVidia entirely. Of course… What I’ve heard could be wrong… But then… Where are the announcements for the new Hercules Geforces?
BTW… I have nothing against nVidia or anyone else… I use lots of different video cards. There just seem to be a lot of people making this present problem a lot bigger than it is currently and may turn out to be. And also a lot of people making a mountain out of the fact that the Kyros are in the low end and that there have been a few minor glitches some people have run into. Which… I might add… Lots of video cards have small glitches.
> Noticed, maybe . . . but feared?
When the Kyro-II came out it wiped the floor with a GeForce 2 Ultra in the Unreal Tournament benchmark, while the average GeForce2 card was about $200 more expensive at that time.
This led to a document from nVidia stating that the Kyro is a bad card and why. This document was only briefly available on the internet; probably it wasn’t even meant to be seen by the public.
I don’t recall the whole document, but I know that most “arguments” against the Kyro were utter rubbish. The best bit was that nVidia claimed the Kyro could never have been developed without the help of Guillemot, who – according to nVidia – had no experience in building graphics cards, which is quite a surprise since Guillemot had built several graphics cards with nVidia and 3Dfx chipsets.
Maybe only a rumour, but it really sounds like nVidia: they are said to have told graphics card firms that they might not get all the nVidia chips they ordered if they intended to use Kyro-II chips.
Did nVidia fear the Kyro? I’m quite sure of that!
I’m not saying the nVidia GeForce smokes the Kyro III. I don’t even like nVidia. I’m just saying the Kyro-based cards aren’t a GeForce killer waiting to happen. A good card? YES. COMPETITIVE with nVidia? YES. Absolutely beats nVidia hands down? NO.
I read the document that nVidia’s sales staff put out about how badly the Kyro sucks. That doesn’t show fear; that shows a sales staff that’s paying attention to its market. Documents like that are VERY common; most just don’t get leaked onto the net.
And as I stated the few problems Kyro had with drivers aren’t uncommon when introducing a new technology into a video card.
Again, hopefully SiS (or Hercules/Guillemot!) buys up the Kyro technology.
BTW… Some people seem to think that the Kyro has “new tech” for a video card. If my memory serves me well, the company that designs the chip has been using similar technology in PC video cards for a “long time”.
Originally… I believe they had their own API, like 3dfx had Glide. The problem was they were not popular enough to get very many games developed for their API. I don’t remember what their first card was that could handle other APIs… Anyway… The problem, if my memory serves me well, was in handling APIs other than their own with their version of tile-based rendering.
As for the Kyro and Kyro 2… Those were simply their latest and best technology, chip- and driver-wise, and by far their most popular.
Of course… As I said… My memory may not be serving me well here. Because I don’t often think about their history. But I believe this is all accurate.
I figured a small amount of history might interest everyone here. I certainly thought it was interesting, because I couldn’t remember hearing of the company that designed the chips for the Kyro before.
Oh… And as I said before… They’ve been through this little event (losing the chip manufacturer) before. So… This is nothing new. And no one will be BUYING the Kyro technology. Because one company designs the chips and another produces them. It’s the one that is producing them that no longer wishes to do it. So another company will be LICENSING the Kyro technology.
Also… The chip manufacturer is still interested in working with the technology. They just don’t want to be doing chips for PCs anymore.
I didn’t run into this myself, but occasionally the Kyro does lose to the GeForce 2 MX according to some reviews on the Internet, and I can certainly believe it. However, it also occasionally beats the GeForce 2 Ultra according to some reviews (or was it previews? Anyway…).
I think the biggest problem performance-wise… Is either the lack of T&L or the fact that they still use SDR memory. However… The company that designs the chips has had T&L technology for some time, as well as some form of dual-chip technology like the old Voodoos. Neither of which has yet been applied to the PC market.
If these and DDR memory were brought into the picture for the Kyro… I dunno. But already the Kyro does very well in comparison to nVidia.
Is the Kyro technology a GeForce killer? Not in its current state… Because it still loses sometimes.
But is it one waiting to happen? I don’t know… But the company that does the chip designs apparently did NOT use their full capability at the time. Possibly because the chip manufacturer didn’t want the extra expense, or possibly because of problems interfacing with the standard APIs? I don’t know…
We might find the answer to that IF and WHEN the companies involved decide to release a “high-end Kyro” card.
But right now… The Kyro is only focused on the “low end”, and does it clean up for a “low-end” card? From my experience, so far it does for the most part. Though it doesn’t fare very well against the “high-end” cards.
The question, though… Is it a GeForce MX killer? Because that’s what it’s targeted against, and it does depend on the game and OS (because, for example, the Kyro doesn’t have Linux drivers yet). And naturally… If a person is buying… Is it right for you?
What the Kyro chip does that no other gfx chip does is something like this.
When a foreground polygon hides part of another polygon in the background, a conventional gfx chip is actually drawing/shading/lighting/mapping the parts of polygons you cannot even ‘see’.
This is a lot of ‘wasted’ power. The Kyro does not draw/map/shade/light those hidden areas, and thus has a lot more power left to do other things.
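The saving described above can be shown with a toy single-pixel model (hypothetical Python for illustration only, counting shading operations rather than modelling real hardware): an immediate-mode renderer shades every fragment that passes the depth test as the triangles arrive, while a deferred renderer resolves visibility first and shades only the front-most fragment.

```python
# Toy model of overdraw: several 'fragments' (depth values) land on one
# pixel. This is an illustrative sketch, not actual Kyro/GeForce behaviour.

def immediate_mode(fragments):
    """Shade as we go: every fragment that passes the depth test is shaded,
    even if a closer fragment later covers it (wasted work = overdraw)."""
    depth = float("inf")
    shaded = 0
    for d in fragments:      # fragments arrive in submission order
        if d < depth:        # passes the depth test...
            depth = d
            shaded += 1      # ...so it gets shaded/lit/textured
    return shaded

def deferred_mode(fragments):
    """Resolve visibility first (a cheap depth-only pass), then shade only
    the single surviving fragment for this pixel."""
    if not fragments:
        return 0
    min(fragments)           # depth-only visibility pass, no shading cost
    return 1                 # exactly one shading operation per covered pixel

# Back-to-front submission is the worst case for immediate mode:
frags = [9.0, 7.0, 5.0, 3.0, 1.0]
print(immediate_mode(frags))  # 5 shading operations, 4 of them wasted
print(deferred_mode(frags))   # 1 shading operation
```

In this worst case the immediate-mode renderer shades five fragments to produce one visible pixel; the deferred approach shades one, which is the “power left over” the post refers to.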
From what I read, the reason this had never been done before the Kyro is that gfx companies claimed it was too difficult to implement.
Apparently ST/PowerVR didn’t think so, because they made it happen with the Kyro.
I also read an interview with a Kyro designer (I think) who said that DDR memory on a Kyro chip wouldn’t make a damn bit of difference with the Kyro’s type of architecture.
I knew about TBR… Which is what tom6789 describes. But I thought the Kyro’s predecessors used TBR as well; it just didn’t work as well with Direct3D and OpenGL? Of course… I could be wrong. Honestly… I didn’t bother to pay much attention to the company until the Kyro, so I know little about the previous products besides what I’ve been told. In any event… TBR is what sets PVR apart from the others. Yes. (However… It is worth noting that 3dfx had a software-based HSR before they went out of business, though it did not work well, and nVidia should now have that technology.)
I think saying DDR literally wouldn’t help would be like saying upping the clock rate wouldn’t help worth a darn, or giving it T&L wouldn’t help worth a darn, or giving it any number of other things wouldn’t be worth a darn.
What it boils down to… Is that it didn’t need it, nor was it designed for it; as such, adding it would be pointless. If it had been designed for it from the beginning… Would it help? I think so. But then I don’t work with their technology. It makes sense to me, anyway.
Well… I decided to spend some time trying to find a feature list for one of the pre-Kyro chips. I didn’t find one, though. Oh well… I’m not going to waste my time worrying about it. The Kyro was definitely the first popular card with TBR, and possibly the first ever. So we might as well consider it the first, even if I did hear otherwise. Without a feature list for the prior chips, it’s certainly difficult to argue that the Kyro wasn’t the first. So I concede on that.
Anyway… My choice of words was not the best for a prior point. When I was talking about the Kyro and DDR, etc… I meant the line of chips from IT/PVR instead of the literal Kyro chip currently out. It was a poor choice of words I suppose. So I apologize.
I was simply attempting to say that, as far as I could tell, the companies that produced the Kyro were not even trying to take the speed crown from nVidia. Therefore, who can say whether they could take it if they used everything they had at their disposal?
Naturally it would have had to have been from beginning to end with the chip design. Tacking on DDR or whatever later without first planning for it would most likely be more of a hassle than it would be worth.
So no, the current Kyro isn’t going to kill the GeForce 4, nor is a tweaked or modified Kyro. But a successor to the Kyro MIGHT. Of course, it might lose too. And of course, if no one licenses their PC video chip technology, there naturally won’t be any successor… In PCs anyway ;> They do arcade machines and consoles too… (I don’t think anyone is presently licensing their technology for consoles, though.)