Intel demoed the world’s first 32-nanometer processor today, showing it off in several test desktop and laptop configurations. There aren’t any hard specifications or benchmarks just yet, but here’s the scoop on the upcoming processors, according to Intel: the quad-core 45nm desktop and laptop processors (the Lynnfield and Clarksfield) will transitionally be replaced by dual-core 32nm alternatives (the Clarkdale and Arrandale) that also include an integrated graphics processor, all in the same form factor as the 45nm chips. Two exciting side notes: first, Intel will be investing over $8 billion in the 32nm era (alright, so not immensely exciting, but definitely interesting, especially in this economy, where money shouldn’t be thrown around without a mighty good cause). Second, according to one of Intel’s charts, there will apparently be a 32nm high-end desktop processor (the Gulftown) with six cores. The good news? Parts of the platform will go into production in 2009 for sure. The bad news? They said “parts.” Be warned: that Core i7 you have your eye on will be a thing of the past once the newer, higher-end 32nm beauties arrive.
A better title would be:
‘First Ever Mainstream Processor with Integrated GPU’
This is a much more important milestone than switching to a smaller process. With AMD’s Fusion ( http://fusion.amd.com ), Intel’s plans to put a GPU inside the Atom Pineview and Clarkdale/Arrandale, and the recent rumors that NVidia is designing an x86 processor – this seems to be the direction future CPUs are going.
Maybe in the near future the separate GPU will go the way of the math coprocessor.
While there are ARM processors with integrated GPUs, and AMD’s Geode also has integrated graphics, this is the first time such a processor will hit the mainstream.
Damn it, give us a chance...
Possibly I am being obtuse here, but why should this be “bad news” about which someone would need to be “warned”? Now, it is true that my main machine is ancient, by computer standards, and that I will eventually need to upgrade. I had briefly considered a Wolfdale but then decided to wait for Nehalem. Now, if this new CPU family is going to be a watershed in the course of CPU history, I would like to know why it should be labeled something about which I need to be warned. An explanation would be greatly appreciated.
Oh, I don’t know... maybe for those who are seriously thinking of buying an i7 (like me)... it’s already practically obsolete.
That’s not really an enlightening reply. Are i7s going to be “obsolete” in the sense that there are apps and OSes they will not run but which will run on the 32nm CPUs? Are the 32nm CPUs going to run at several times the speed of i7s and yet cost but a fraction of the price? Really, the only thing I can foresee is that the retail cost of i7s might well decrease at some point reasonably soon after the 32nm parts come to market. But what “obsolete” means in this context is not clear at all.
I wasn’t trying to enlighten you.
If you took that line of thought then my AMD Athlon64 3200+ wouldn’t be considered obsolete, but at least to me it is.
The reason I called it “bad news” was the very reason flangue mentioned. I’ll expound a bit more. There are those of us who have been psyched about the new Core i7 processors. Having the newest, most powerful hardware is a plus for many people – it’s sort of a pride thing, I guess. So when you’ve bought the most powerful thing on the market only to find out there’s going to be something better soon, it’s not so joyous. Also: if you’ve got your eye on the most powerful thing on the market but find out that something even better will be coming out in only a year or two, you’re stuck with a choice: do I get what I want now, do I wait a year or two and get the better piece of hardware, or do I wait a year or two for the price of the current hardware to go down? So yes – it’s bad news in that sense, though I’m still rooting for new hardware to come out, even though it makes my Phenom 9500, 2 GB DDR2, and ATI HD 2600XT worth less and less and look more and more like a compy a little kid would get from his dad after it’s been used for ten years.
Sadly, as soon as you walk out of the store with the computer it is “obsolete” and the newer models are on their way. Intel has an aggressive tick-tock strategy: roughly every other year a totally new microarchitecture, and in between, a die shrink and improvement of the existing one. I for one am waiting, since the i7 boards don’t have SATA 3.0 and other emerging standards that will be here soon.
The eternal conundrum of the computer geek. No matter what you buy, or when you buy it, it doesn’t stay top of the line for long – a few months, at the very most.
Personally, I’m a bit more discerning; I don’t want the latest and greatest for its own sake on every machine. On, say, an audio editing or DVD encoding machine, then yes, it is very good to have the latest, as those tasks can tax almost any single PC to its limit. Do I want a Core i7 for just a standard desktop, though? Where it’ll mostly be used for browsing, word processing, and maybe playing a movie – your standard desktop, in other words? Well, yes, I want it, but it’s hardly necessary by any stretch, especially if you don’t insist on running a ridiculously huge OS like Vista. Heck, even a dual core is fine for that resource hog of an OS.
Seriously, if you get a Core i7 now, it may not be top of the line soon, but it’ll still be a good number of years until you need to upgrade it. If you insist on waiting for the absolute latest technology, I’m afraid you’d never have a computer at all.
I can’t help but wonder: exactly how much faster or more powerful do we really need to go, as far as the CPU is concerned? For most tasks it’s other devices in the computer that are the bottleneck – the hard disk being a big one, for example. No matter how fast your processor is, it’s going to be limited by how fast you can get data in and out of it, and the same applies to any piece of hardware. SATA 3.0, for instance, isn’t going to matter one bit if your hard disk can’t keep up with the data transfer rate; it will max out at whatever the HD, or array of HDs, can manage.
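To put rough numbers on that (these are illustrative assumptions, not benchmarks): the SATA 3 Gbit/s link works out to roughly 300 MB/s after encoding overhead, the new SATA 6 Gbit/s link to roughly 600 MB/s, while a typical hard disk of this era might sustain something like 100 MB/s. A quick Python sketch of the “slowest link wins” point:

    # Sketch: effective throughput is capped by the slowest component in the
    # chain. The figures below are rough assumptions for illustration only.
    SATA2_LINK_MBS = 300      # SATA 3.0 Gbit/s, ~300 MB/s after 8b/10b encoding
    SATA3_LINK_MBS = 600      # SATA 6.0 Gbit/s, ~600 MB/s after 8b/10b encoding
    HDD_SUSTAINED_MBS = 100   # assumed sustained rate of a typical 7200 rpm disk

    def effective_throughput(link_mbs, disk_mbs):
        """A transfer can go no faster than its slowest link."""
        return min(link_mbs, disk_mbs)

    print(effective_throughput(SATA2_LINK_MBS, HDD_SUSTAINED_MBS))  # 100
    print(effective_throughput(SATA3_LINK_MBS, HDD_SUSTAINED_MBS))  # 100 -- no gain

Under those assumed numbers, doubling the interface speed buys nothing until the disk itself gets faster.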
If you think like that then you will be forever waiting to buy. There is always new and better technology on the horizon. As long as the hardware is good enough to run your workload who really cares anyway? If you just want the most up-to-date technology available then be prepared to build a new computer every 6-12 months.
Oh, I definitely agree with you. That’s why I settled for the Phenom about a year ago, even though I knew the Core 2 Quads were going to get cheaper within the year and new versions of the Phenom were coming, as well as the Phenom II. It’s just… there’s always a sinking feeling when you find the processor you paid $290 for is now selling at $190. Still, there are few people I know who have a system as powerful as mine, so I still feel pretty good about what I’ve bought, in the geeky-pride sense.
Also, agreeing with darknexus, I find that my netbook performs all of my daily tasks beautifully. Email, the internet, desktop publishing, and database management are what I normally do (as I’m sure is true for most other people). I have my desktop for video editing, games (ha – if I had time for those), and other heavier tasks, but I find I’m using it less and less, because my new netbook does my daily tasks almost as efficiently, and with it I can actually escape my dank study, see the sun again, and explore the world.
I wouldn’t see the need to wait for the 32 nm part. I’ve had a Core i7 under my desk for a little while and this thing is pretty amazing compared to pretty much everything else.
Firstly, ….
Secondly, …