New Itanium a Breakthrough for Intel?
The third member of the Itanium chip family is the company’s best shot to date at taking on Sun and IBM in the market for high-end server chips. With Madison, Intel is hoping the third time’s a charm. Update: Red Hat announces support for the latest Intel Itanium 2 processor.
About The Author
Eugenia Loli
Ex-programmer, ex-editor in chief at OSNews.com, now a visual artist/filmmaker.
That is US$, times it by 1.5 and you’ll get what us suckers overseas pay. It is still overpriced. Keep coming down, one day the unwashed coder will be able to afford one.
Why is Intel not listening to its users? Every other chip manufacturer can see that desktop users want 64-bit computing for the same reason we wanted 32-bit computing back in the day. They keep claiming that nobody needs that much power, and I can’t help but remember the famous 640KB-will-always-be-enough speech we got from Bill.
Intel is going to find itself trailing IBM and AMD pretty soon if they keep this up.
It isn’t the extra power; I want something different from the BIOS and IRQ crap one has to put up with today. They’ve talked about EFI replacing the BIOS, so when is that going to happen? Why is it taking Intel so long to get it onto their own motherboards?
I want a better OVERALL solution. Not faster, but a better solution, free of the 640K base memory limitation, IRQs, conflicts between different configuration techniques, and the various other crap that makes life harder for the PC user.
I have to confess that, for many reasons, SPECfp/int seems less and less interesting. I remember from the days when I switched from PA-RISC 1.1 to 2.0 how much having code compiled for the latest architecture meant. It is obviously the same for Intel 386/P4/Itanium. So, what is this rambling really about? Well, the problem is that most of my applications don’t come with source code (Matlab, for instance), and without reliable performance information, I can’t spend money just to find out how much time I can save. Maybe I’m just cheap, but I want to know what I get for my money. If the vendors could complement the SPEC benchmarks with standard application benchmarks and put those results in a database, that could help a lot (assuming the improvement actually was substantial).
/jarek
moocow:
” I can’t help but remember the famous 640KB-will-always-be-enough speech we got from Bill. ”
1. Bill Gates never said that. Anyone who says he did, provide PROOF, when did he say it, where did he say it, to whom, etc. BE SPECIFIC. And provide a link. Tired of people getting their “facts” from people named “dark overlord” or something similar from places like slashdot.
2. 640KB vs 4GB is quite a different situation. Just about everyone was running into the 640KB limit. But probably not even 1/10th of a percent of people will have 4GB in their systems in the next 4 years, let alone need more. It’s not economical when most people just don’t need it.
I think it would be cool to have a 64-bit Intel desktop CPU, but I completely understand their reasons for wanting to hold off a bit. Cool != money. When Intel NEEDS a 64-bit desktop CPU to keep market share, then you’ll see one.
> But probably not even 1/10th of a percent of people will
> have 4GB in their systems in the next 4 years, let alone
> need more.
You underestimate how many memory-hungry desktop users exist. That is beside the point. 64-bit removes the need for software emulation for most large floating point operations. Software developers, CAD specialists, and other areas where high-end visualisation is required will all benefit dramatically from being able to perform calculations at a lesser cost to the software.
Well in excess of 3 million Java developers exist:
http://se.sun.com/nyheter/releaser/2003/030312.html
Java programmers make up only a fraction of all software engineers.
I don’t have any specific figures relating to these other industries but I can assure you; LOTS of professionals with these needs exist. Other manufacturers creating 64bit chips for desktop machines (as I mention below) would not stay in business if there wasn’t sufficient demand.
> When Intel NEEDS a 64-bit desktop CPU to keep market share,
> then you’ll see one.
That time is now. All major *nix vendors have had 64-bit processors almost since the beginning of time. Now Apple has them. I fail to see why you believe that enough demand for these chips doesn’t exist.
The 64-bit hardware market is a multibillion-dollar industry. Admittedly, a large portion of this is sold as server hardware, but a significant portion of it, made up of products like the Sun Blade, Apple G5, and SGI Octane, is 64-bit desktop machines.
BTW: You must be young if you don’t know this well-known golden sentence from your friend. Time to grow up. Google the following string: “640 kb enough Bill Gates” and be happy with your links. :}
Well in excess of 3 million Java developers exist:
http://se.sun.com/nyheter/releaser/2003/030312.html
Java programmers make up only a fraction of all software engineers.
It’s completely true… the memory footprint of the JVM is so large, I need more than 4GB just to keep my system running!
(I’m half joking. It’s sufficiently bad that my medium-sized server app really needs 1GB inside it so as not to cripple everything else.)
Maybe CNN is lying also:
http://www.cnn.com/2000/TECH/computing/02/11/free.software.idg/
Or some historical line:
http://www.thocp.net/timeline/1981.htm
Just in case you would be too lazy. :]
Intel is in a very sweet position. They have the desktop pretty much locked up, and as far as the high end is concerned, they have no way to go but up. Lots of room for growth. Contrast this with, say, Sun’s position in the chip market. Eventually, Intel is going to get it right, no doubt about that. And that can only be good for corporate customers, who will soon enjoy huge drops in price.
64-bit desktops? I think those will be pretty standard in two to three years. Apple has shown there’s no need to wait, and if Opteron works out, Linux users will be following suit pretty quickly. The Intel-AMD and Microsoft-Linux races will ensure that ordinary users aren’t left out.
http://ddj.com/news/stories/2000/06/do200006nw008.htm
For a guy who doesn’t do his research, you run your mouth pretty recklessly. You’re either too young to remember this or have been living under a rock since the ’70s. I usually have to put more effort into embarrassing people online, and I was disappointed by you.
Now, kids, can anybody tell Cletus here the year Bill Gates made this quote (by following the above link that Cletus is too lazy to find himself)? 10 points for anybody who gets it in under 15 minutes of this post.
I find it very hard to believe that Bill Gates ever said that “640KB-will-always-be-enough”. I also think that desktop users in general aren’t demanding 64 bit now, and that it won’t be required in five years either.
I don’t see the type of application that would require huge amounts of memory that would also be required by desktop users. Most desktop users just use the computer for browsing and reading/writing simple documents, and for that you don’t really need more than a P2/300 with 64MB. If you want the latest versions of everything you will need more but desktop users hardly need that.
“You underestimate how many memory-hungry desktop users exist. That is beside the point. 64-bit removes the need for software emulation for most large floating point operations. Software developers, CAD specialists, and other areas where high-end visualisation is required will all benefit dramatically from being able to perform calculations at a lesser cost to the software.”
Does 64-bit remove the need for software emulation for large floating point operations? You know that the 486 could perform 64-bit floating point operations, don’t you? The 64-bit we are talking about now is related to how much memory the processor can address, not to how large the numbers it can process are. If I understand it correctly, the P4 (SSE2) can handle 128-bit (2*64) numbers.
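The distinction this comment draws can be sketched in Python (an illustrative example, not from the original discussion): the width of a floating-point value is fixed by its format, independently of the CPU’s address width, which is what the “64-bit” label in this debate actually refers to.

```python
import struct
import sys

# A C "double" (IEEE 754 binary64) occupies 64 bits on 32-bit and
# 64-bit machines alike -- FPU precision is independent of address width.
assert struct.calcsize('d') * 8 == 64

# What a "64-bit CPU" refers to is the pointer/address width, i.e. how
# much memory the processor can address natively.
address_bits = 64 if sys.maxsize > 2**32 else 32
print(f"double: 64 bits; native address width: {address_bits} bits")
```

This is exactly the poster’s point: the 486’s on-chip FPU already handled 64-bit doubles, long before 64-bit addressing reached the desktop.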
“On 25 October 1988, Bill Gates, Chairman of the Board, Microsoft Corporation addressed members of Melb PC. This is a close to verbatim transcript of that speech and of the question and answer session that followed.
“Well, in fact, it was adequate for about 7 years of work. Both the 640 kB and that 384 kB we find today are extremely overcrowded with the needs of PC configurations. In fact that is probably the toughest problem we face in PCs right now.”
Well that’s one thing I found. So if he said that 640k would always be enough, it was obviously before 1988.
The link you provided just mentions that Gates said that 640kB ought to be enough for everyone in 1981, which is quite different from saying that 640kB will be enough forever.
“I find it very hard to believe that Bill Gates ever said that “640KB-will-always-be-enough”.”
You should. It’s such a stupid statement, like trusted computing.
I think that 4GB ought to be enough for every desktop user today. That doesn’t mean I think that 4GB will be enough forever.
You are a lawyer, aren’t you?
So, do you think that when Gates said this sentence, he meant it only for that one moment?
I don’t think Intel will lower prices of Itanium2 CPUs for *servers*. They’ve spent billions and something like 10 years (?) developing Itanium, it’ll be a long time before they start making back the money they invested in it.
The low-end version of Itanium aimed at workstations and possibly desktops (for the little guy) is Deerfield. So yes, they already have plans for 64-bit computing in the low-end market; Apple and AMD just happen to have beaten them to it.
I think the 640kB-will-be-enough quote has to do with partitioning the 1MB address space into 384kB/640kB. I think he believed that 640kB would be enough for DOS 1.0, and that it was more a matter of 640kB vs. 736kB (or something) than a claim that no program would ever need more than 640kB, because it wasn’t enough for all programs even when he said it.
Those who can’t see the difference between now and forever should see a psychiatrist, or write a letter to Bryan Adams.
So the Itanium is “a potential breakthrough” and the G5 is a manipulation of benchmarks.
So where are all the questions regarding the Itanium’s performance? That is a kind light they’ve shed on a commercial flop (the Itanium).
You prove that everything is explainable after the fact. Bill was stupid and said this lame sentence. Why can’t you accept it? He is a human (or looks something like one ;), and he can make mistakes too. Nobody is perfect.
Everybody knows what he thought when those words left his mouth; you cannot change history with your explanations.
According to Bill Gates himself, it wasn’t said:
http://inquirerinside.com/?article=8742
My understanding is that the 640k limit was due to the segmentation of the 8086 processor used in the original PC. Here are some links that argue the same thing:
http://www.zip.com.au/~guyd/smash_ms/dl/msdos.htm
http://physinfo.ulb.ac.be/cit_courseware/msdos/msdos04.htm
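For illustration (this sketch is mine, not taken from the linked pages), here is the 8086 real-mode segment:offset scheme those links describe, in Python: a 16-bit segment register shifted left by 4 plus a 16-bit offset yields a 20-bit physical address, giving the 1MB space that IBM then split into 640KB of conventional RAM and 384KB of reserved addresses.

```python
def physical_address(segment: int, offset: int) -> int:
    """8086 real-mode translation: physical = segment * 16 + offset."""
    return ((segment & 0xFFFF) << 4) + (offset & 0xFFFF)

# 20 address lines give a 1 MB physical address space...
assert physical_address(0xF000, 0xFFFF) == 0xFFFFF  # last byte of 1 MB

# ...of which the IBM PC reserved everything from 0xA0000 up (video
# memory, ROMs), leaving 640 KB of conventional memory below it.
assert 0xA0000 == 640 * 1024
```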
Mike Magee states that the Itanium 2 workstation from HP, with the 1.5GHz processor and 6 MB cache, will cost $4,900.
HE IS WRONG!
This $4,900 price is for the 900MHz model with 1.5MB cache, the previous generation of HP workstations, in a minimal configuration. It is probably a sale price tied to the launch of the Madison processor, with HP being forced to clear out its excess inventory.
I also noticed that it ships with a Radeon 7000 video card, a totally obsolete chip by present standards (I should know, since I have one).
HP’s website does not provide an exact date for the availability of the new Itanium 2 workstations, nor their prices.
So, then it looks like (from what you are saying) that it would be cheaper to get an Apple with dual G5…compared to that price, Apple sounds damn reasonable…
I doubt Apple will ever be a serious server player. It just isn’t their focus. Apple’s servers are mainly good for small businesses that use Apple on the desktop. This is a solid niche for Apple, but it is unlikely to grow very large.
The PPC970 (aka G5), on the other hand, could become a serious low-to-mid-range player if/when IBM starts selling Linux servers based on this chip. Note that Apple will benefit from this in that they can share hardware designs (chipsets, etc.) with IBM, reducing development costs for their desktops and servers.
http://www.everything2.com/index.pl?node=640K%20ought%20to~…
that ought to clear things up.
http://www.urbanlegends.com/celebrities/bill.gates/gates_memory.htm…
Bill Gates addresses this issue himself in Bloomberg News, 1996.
“You must be young that you don’t remember that.”
HAHAHAHA!!! You must be a fool to be making up things and believing that you ‘remember’ them.
“I’ve said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amount of memory is enough […] But even 32 bits of address space won’t prove adequate as time goes on […] Meanwhile, I keep bumping into that silly quotation attributed to me that says 640K of memory is enough.”
– Bill Gates on the “640K ought to be enough for anyone” quote, as quoted by the Inquirer.
Now find me a witness and provide the time and place that he made the quote if you still believe he did. I don’t want to hear “1981” I want the day, the person(s) he said it to and where he said it, ok smartguy? If you’re really smart, you’ll STFU while you’re ahead, clueless.
Heh, this is the same man who labeled the 286 as braindead (he was right on that one). And claimed that OS/2 was the future…
so take anything he says with a grain of salt.
As soon as a G5 Xserve (Xserve5?), i.e. a 64-bit Apple server, is released, Intel might just start to feel the pressure. Support for enormous files of hundreds of gigabytes, useful particularly in database situations, and the ability to use gobs of RAM (8GB natively, IIRC) will give Intel yet another cause for concern.
Of course, this article is obviously talking about 8+ CPU systems or huge clustering nodes, where Macs don’t yet have a competitor.
To summarise: if Intel wants to keep their big slice of the server pie, they’ll need to be far more competitive on price than what the Itanium, Itanium 2, and presumably Madison have offered.
The sad fact is that you can have an abysmal 64 bit CPU (the Itanium 1 anyway..) and still win the hearts of your customers if you can give it away cheaply.
Checking the prices, I don’t think I’ll be buying an Itanium 2 for my desktop.
I completely agree. The performance of the Itanium isn’t too bad; however, what they should do is allow more small players to enter the market, replace the Xeon with the Itanium, and get more IHVs on board to produce Itanium-compliant motherboards, etc.
As for desktops, wait another 5 years; however, I see no reason why the Itanium can’t be used in a workstation as a replacement for the PA-RISC or MIPS lineups.
“The firm has introduced a $4,900 box which includes Madison Itanium 2s each at 1.5GHz but with the 6MB of level three cache…”
“HP is claiming SPECint of 1318 and SPECfp of 2104 for these babies.”