AMD is showing off at Comdex a prototype 64-bit Windows OS running on its upcoming 64-bit Opteron processors. The demo units are running Internet Information Server (IIS), 64-bit Terminal Services, and 64-bit Microsoft Internet Explorer. They're also running 32-bit Office XP on the ProtoWin-64; the demo shows interoperability between the 32-bit and 64-bit apps, AMD says.
In the world of free software, rewriting software for 64-bit is not difficult at all; it's even easier when one converts between 32-bit x86 and 64-bit x86. So basically AMD has only created a chip which allows compatibility with obsolete closed-source software. But at least the compatibility will increase its market share and make it cheap enough for me to buy one. In any case, Free Software/Linux will show the power of open source on 64-bit AMD.
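To be fair, "just recompile" mostly works, but not always. Here's a minimal C sketch (my own illustration, not from the article) of the classic bug that bites during a 32-to-64-bit port: on 32-bit x86 (ILP32), int and pointers are both 32 bits, so stashing a pointer in an int happens to work; under the x86-64 LP64 model, pointers grow to 64 bits and the cast truncates.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int x = 42;

        /* The 32-bit-era bug: int stashed = (int)&x;
           On ILP32 that works by accident; on LP64 the cast truncates the
           64-bit pointer, and the compiler rightly complains. */

        /* The portable fix: uintptr_t is defined to be wide enough to
           round-trip a pointer on either architecture. */
        uintptr_t stashed = (uintptr_t)&x;
        printf("value via round-tripped pointer: %d\n", *(int *)stashed);

        /* ILP32 prints 4/4, LP64 prints 4/8. */
        printf("sizeof(int)=%zu sizeof(void*)=%zu\n",
               sizeof(int), sizeof(void *));
        return 0;
    }

The nice thing about open source is that the compiler flags this sort of assumption at recompile time and someone fixes it; with a closed-source binary, the compatibility mode AMD is demoing is your only option.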
Not really. I saw Opteron running SuSE and Mandrake, and while they were 64-bit, AMD had DB2 and Opera loaded to show people, and these were 32-bit apps, because they were closed source. So the way AMD has done it, a *smooth transition* from 32-bit to 64-bit, is the best way for the customer, as even in the Linux world legacy 32-bit apps are still around.. used by many… Not just in the Windows world. Legacy apps are not "obsolete"; a lot of people need them. Yes, they will run slower, or might slow other things down (as in the Itanium case), but the bottom line is that compatibility is needed by many.
I'm glad that 64-bit isn't becoming the debacle 32-bit was, where hardware support had existed since the 386 but Microsoft didn't bring a proper 32-bit Windows to the masses until XP. As a Linux user, I don't have much stake in this directly; I get to use whatever chip is fastest at the moment without much more than a recompile. But hopefully, mainstream support like this will make 64-bit hardware more commonly available and thus cheaper.

And before we get off on some tangent about not needing 64-bit, have you checked the price of RAM lately? 4GB is about $1200 in 1GB DIMMs. By the time Opteron comes out, 1GB DIMMs will probably reach production volumes comparable to 512MB DIMMs today, and that 4GB will cost about $1000. A bit pricey, but not nearly enough to stop some would-be digital video maven who needs that kind of horsepower. Hell, with CPU prices today, you can probably put together a $3000 system (the going rate for a good computer 5 years ago) with 4GB of RAM *today*.
x86-64 really seems like the way the commodity market will move. Intel will never succeed in making the Itanium a commodity product with commodity pricing (it isn't attractive even as a server product at this point).
As for support of 32-bit x86 applications, Sun still provides all sorts of 32-bit compatibility in Solaris. This is pretty much a staple of transitioning to a 64-bit architecture.
An AMD processor running a 64-bit OS to run 32- and 16-bit applications made for 32- and 16-bit processors that are extensions of the 8-bit Intel processor (8080), which is an extension of the first 4-bit Intel processor (4040)…
Why pay for a 64-bit processor to run expensive, closed, proprietary software? I'd prefer to use a fully 64-bit Linux or *BSD system on it.
The 4-bit processor you meant was the 4004.
I read the article on http://www.theregister.co.uk, and it seems that they had quite a few 64-bit apps running as well.
They mentioned IIS, Windows Terminal Services (this is rather important for the server market), and Internet Explorer. You can read it for yourself at http://www.theregister.co.uk/content/3/28143.html
Now, why you'd think that demoing the smooth functioning of 32-bit apps (some as complex as Office XP) in a 64-bit operating environment is a joke, I don't know. That's a huge success, and from what I understood, they apparently demonstrated the distributed object model (or whatever the incarnation du jour of the object model is in the Windows world) working between 32-bit and 64-bit apps.
But what do I know, I’m just a lowly Unix guy.
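The article doesn't say how the demo glues the 32-bit and 64-bit sides together, but for the curious, here's a minimal Win32 sketch (my own, using the IsWow64Process API as it later shipped in kernel32, so treat it as an assumption about the final product) of how a 32-bit program can tell it's running on a 64-bit Windows rather than a native 32-bit one:

    #include <windows.h>
    #include <stdio.h>

    /* IsWow64Process doesn't exist on older 32-bit versions of Windows,
       so the usual idiom is to look it up at runtime rather than link
       against it directly. */
    typedef BOOL (WINAPI *IsWow64Process_t)(HANDLE, PBOOL);

    int main(void) {
        BOOL wow64 = FALSE;
        IsWow64Process_t fn = (IsWow64Process_t)GetProcAddress(
            GetModuleHandle(TEXT("kernel32")), "IsWow64Process");

        if (fn != NULL && fn(GetCurrentProcess(), &wow64) && wow64)
            printf("32-bit process on a 64-bit Windows (WOW64)\n");
        else
            printf("native process, or a Windows without WOW64\n");
        return 0;
    }

The deeper point is that a 32-bit process can't load a 64-bit DLL (or vice versa), so any object model spanning the two has to marshal calls across a process boundary, which is presumably what the demo was exercising.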
which begot the 8008 and then the 8080, which begot the 8088 (and the 8086, but that one was somehow forgotten and evicted). Then came the 80286 (and the 80186, which also got evicted, but this one for good reason). And so forth, up to today. Along this glorious saga lie the corpses of the i860 and i960, Intel's feeble attempts toward the land of RISC. Not bad CPUs, but with no software they died without descendants.
And so will die the Itanium, since it's expensive and has no applications written for it, while 32-bit apps run so slowly that a Pentium II 433 MHz daughterboard would be a better idea.
Or so I think. But what do I know…
>>>>>
Along this glorious saga lie the corpses of the i860 and i960, Intel's feeble attempts toward the land of RISC. Not bad CPUs, but with no software they died without descendants.
<<<<<
From what I recall, those were actually scrapped due to serious design flaws.
It's alive and well as an embedded processor. The ix60 architecture was based on the Intel iAPX 432 processor, which was supposed to be the successor to x86 nearly a decade ago. It was an advanced, object-oriented processor designed to fully support Ada, which was up-and-coming at the time. Of course, all those near-sighted people who want backwards compatibility more than anything else killed it, just like they killed the Itanium in the consumer space…
In fact, you can find the i960 in every Compaq SmartArray RAID card in their servers. The i960 can also be found in many laser and color laser printers.
The i960 is not dead, it’s alive and kicking. It may not be the heart of a computer, but it’s the heart of many other devices.
I remember seeing a bunch of those i960s back in the day, when I was swapping RealityEngines and various other I/O boards in and out of SGI servers. Everything from the old-school Crimsons to the InfiniteReality2 Onyx2s. But the original Onyx and Challenge systems, with their motherboard-sized texture and rendering engines spread across a handful of i960s and banks of RAM, now that was fun hardware to dive into. At least someone was buying them.
Doesn't it amaze you that the modern PC people still feel obligated to upgrade to has nearly the power of these old million-dollar servers, yet no one is willing to learn how to use it? What's gonna happen when the desktop PC can address 16TB of RAID storage and RAM? Nothing. They'll be sold for $300, and people will use them to read email and browse CNN and complain because it takes 3 seconds to open their 64-bit apps instead of 2. Yay capitalism.
I was in high school when the 4004 came out, so I remember most of this quite well. The 4004 was a substitute for a multi-chip calculator set that a Japanese client had asked Intel to design. Intel thought it through & figured a general-purpose chip would suit the customer & themselves better; it was designed by Ted Hoff et al. The 8008 followed & became the first widely used "microcomputer", often mentioned on Tomorrow's World, i.e. as an embedded CPU.
The 8080 (& the 6800) were both influenced by the PDP-11 of the time, but both were cut down so much you could barely see the connection. This is really the starting point for useful home computers.
The 8086 was a completely new design; a small amount of effort went into backward compatibility, & ONLY to the 8080, no further.
The i432 was a disaster, designed by a couple of PhDs with no handle on the realities of cost & performance, & something Intel was very glad to forget.
The i960 was very successful & pops up all over the place.
There's room in the world for more than one CPU, especially outside PCs. Luckily, Intel also now believes in many architectures, the right one for the right job. It's the adopted home of StrongARM & XScale & no doubt other CPUs too.
The rest is history.
>>>> The demo units are running Internet Information Server (IIS), 64-bit
I wonder if they went the extra mile and ported Code Red to 64-bit? :)
And the OS is made by a 2-bit company that can't stand 1 bit of competition?
>>>>>
The 8086 was a completely new design; a small amount of effort went into backward compatibility, & ONLY to the 8080, no further.
<<<<<

That's true, but it wasn't very successful, especially compared to the 8088. I only saw the 8086 in some Olivetti PCs. The 8086 had a 16-bit data bus, while the 8088, if I recall correctly, had an 8-bit data bus. I don't remember if there was any other significant difference.
Memories :)
The 8086 met limited success because of its wide data bus, as weird as that sounds. At the time, most device controllers and sub-components were of the 8-bit variety, and anything in a 16-bit version was usually too expensive to use, or didn't exist.
Intel cut the external data bus from 16 to 8 bits and marketed it as the 8088. This 'economy' version is what made the sales numbers.
Nope, the real power was the 68K processor used in the Amigas. That was cool.
A fast and reliable OS that was easy to use and had a tonne of games and applications. Rather sad that Amiga died the way it did.
I thought the Amiga had more going for it, both OS-wise and hardware-wise, than Apple did.
So why did the Amiga perish but the Mac prevail?
You know what, it's too bad Apple didn't buy Amiga and use Amiga's technology, and Amiga's OS, to build its next generation of Apples.
All with funny names
And it was one of the first real-world computers to use multi-processing and distributed processing at a commodity price tag.
That is why there are so many fans of the platform even today, several years after the last Amiga shipped out of the factories… [and it's coming back again, hehehe]
Cheers…
I remember those. They were really weird, as if someone nicked them from a Star Trek re-run :)
Remember when there was a hard crash?
“Please click the left mouse button for guru meditation” (Amiga 500)
All style, even when displaying a crash :)
The chip names were Alice, Denise and others I can't remember (damn).
They were given names so that the designers could talk about them over the phone without fear of being spied on!
i.e. “How’s Alice going today?”
“Not bad, is Denise OK too?”
Or at least that’s what *I* heard.
The 8086 & 8088 were to all intents and purposes identical. The only difference to the user was that the 8086 was put into a package with the full number of pins & was wire-bonded to enable the full bus width. So the cost of the two chips (dice) was identical, BUT the 8088 used a much cheaper package. In those days, every pin added at least 10c to the package cost. Also, of course, the external 8-bit bus allowed a simpler, cheaper board, but IIRC it was only about 30% slower than a similar 8086 system because of this. The smaller package was often a factor against the 68K, since it came in a huge 64(?)-pin long ceramic DIP whereas the 808x came in 40-ish-pin plastic or ceramic packages, a very large difference back then.
There is a huge rumor (yet again) that Apple and AMD are going to make some sort of announcement today. As far as I know, it's just a rumor.
The union of Apple and AMD would be interesting. But I still can't see Apple supporting x86, even at 64-bit: too much work for the developers. Using AMD to fab PowerPCs, though, is very interesting. AMD is getting their butt whipped in x86 land, and Motorola has no interest in the desktop game. Apple would benefit from having both AMD and IBM as partners and getting rid of Motorola once and for all. AMD would benefit from having more volume to absorb the ever-increasing cost of upgrading chip-making equipment. Motorola, though, might not like the fact that AMD could then target the embedded market with lower-power and more powerful chips than theirs.
In the end, the CPU guys might want to join forces with the DSP makers, since the latter have the real volumes.