You may not realise it, but today one of the most important pieces of technology celebrates its 40th birthday. On November 15, 1971, a company called Intel released its Intel 4004 processor – the first single-chip microprocessor, and one of the most important milestones in computer history.
Congratulations are in order, I believe. 4-bit madness at the time, but it turned out pretty well for most people involved, I guess.
Well, until Intel was allowed to bribe and rig the entire ecosystem, using illegal means to cut Via (who could easily have made the first netbook if Intel hadn’t paid off the OEMs) and AMD (who is only surviving thanks to their purchase of ATI and selling their chips for a couple of pence over cost) out of the market, and getting away with it.
I wonder if there is some sort of rule that once a corporation gets to a certain size it HAS to turn evil. It’s like one day they wake up, start practicing their evil laugh, and put their pinky against the edge of their lip. Look at MSFT: it started out as this little software house, gained year after year by being cheaper than the other guy, then one day Gates wakes up and goes “Crush them, that is what we need to do, CRUSH THEM ALL, mua ha ha ha ha!”
Just think how different things could have been if Intel hadn’t rigged the game. Via would probably have come out with netbooks in ’04 and used the money they made from those to build even lower-power chips, whereas AMD would have had roughly half the market thanks to the Athlon64 being so much better than the Netburst arch and would have used that money to come up with even better designs. Meanwhile Intel would not have been able to pawn Netburst off for so many years, and we wouldn’t still have those power hogs by the millions; instead they would have gone back to the drawing board and probably had the Core design out three years earlier than they did.
While I’m glad Intel came up with the CPU, it just shows that, IMHO, power ALWAYS corrupts. When I was a kid we had more than half a dozen CPUs to choose from, as well as four different x86 designs. Now we are down to just three: Intel, AMD, and ARM, and if it weren’t for cell phones ARM would be toast as well.
When were you a kid, btw? When I was younger we had the opportunity to pick between the following in x86: Cyrix, IDT, Intel, AMD, RiSE, NexGen, and the NEC clones.
Today the only manufacturers that are allowed to make x86 CPUs are VIA (who bought IDT and Cyrix), SiS (who bought RiSE from the people at VIA), AMD (who has a sharing agreement with Intel, and bought the designs from NexGen as well as the company itself), and of course Intel themselves. Of course, if someone were able to do a complete reverse engineering of any of those, they would be allowed to sell x86 designs learned from that experience too (in most countries at least; local laws may apply).
Those were the days… I hate what’s happened to the industry. Sure, things are more standard, but there’s less actual choice than there used to be.
Nah. Many of those “choices” were more or less horrible, still universally “too expensive”, and of much worse value than the nice things of the last decade (or so), things which brought fewer limits on what you can do and what you can accomplish.
You have more actual (and typically better, great-value) choices now, instead of the “necessities” of old. It’s just that some x86 designs were so much further ahead that most of the others disappeared.
But there’s so much more than that. Probably any of dozens of random microcontrollers will give much better value than those of the old days (you could hardly build something like http://en.wikipedia.org/wiki/Uzebox two decades ago, for example, not anywhere near so well or so inexpensively; that didn’t stop some people, of course: http://en.wikipedia.org/wiki/Galaksija ), and that embedded space is where x86 was born, BTW.
Sounds like it was close to your childhood; mine was late ’70s to mid-’80s (my first album was Kiss Alive II on 8-track, just to date myself). My first “PC” was an Altair that “fell off the back of a truck” my uncle was driving, followed by the VIC. Atari, GEM, Compaq… I think I played with just about every CPU and OS back then. Man, it was so much nicer when we had choice!
Of the ones you name there really are only two now, Intel and AMD. Trying to buy a Via CPU in anything but an ultra niche like carputers is nearly impossible, and SiS doesn’t make chips. I had hopes that Nvidia would buy Via and get in the game (imagine how sweet a netbook with a combo Via/Nvidia APU would be), but that seems to be a no-go.
But frankly, other than bribing the officials, I can’t understand why Intel wasn’t busted. I mean, you had a major CEO admit that Intel bribes were like cocaine, and Dell admitted that during the price wars there were quarters where their ONLY “profits” were in the form of Intel kickbacks. Then finally you have the fact that in every benchmark out there the Athlon64 stomped Netburst, until Intel came along and rigged the compiler. What more proof do they need? What they did made MSFT under Gates look like choirboys!
Not really. We didn’t have it, in comparison ( http://www.osnews.com/permalink?497962 ), and no, it wasn’t nicer.
We have tons more choice now. If, for whatever reason, you don’t exploit it, at least don’t throw such myths around.
It’s almost insulting to the many millions who were simply excluded back then from the field of computing, which has become immensely more approachable over the years, open to many more people (who typically don’t have the opportunity to, apparently, steal their first expensive computer…).
There are ~2 billion PC users alone, while your “great times” most likely offered that experience (extremely limited, never mind much more expensive) to at least an order of magnitude fewer people.
That’s more nuanced, I think, with many technologies patented… Well, one could ignore that, I suppose, but in the process close off too many markets and become too risky for buyers of chips (manufacturers, really, not end-users*).
OTOH, the i486 is over 20 years old, and in 2013 you should be able to do a P5-level implementation without any legal issues, I guess. Two years later, an i686, essentially bringing compatibility with most present software, as long as it doesn’t require MMX (2016) or SSE (2019…). x64 would probably be at least as big a problem before 2023; I imagine AMD doesn’t want additional competition even more than Intel doesn’t want it (Intel just wants some competition, to avoid antitrust; for AMD, being the competition “sanctioned” by Intel is its lifeline).
*Hm, I originally wrote that as “end-suers” ;p (and almost submitted it like that, auto-correct giving me a false sense of correctness).
That’s one pretty angry sentiment, and mostly wrong.
You seem to have a fixation on VIA, as if they are angels; it’s not as if Taiwan doesn’t have its own huge empires that dominate the world in motherboards and PC production through China. These companies are often set up in a way that would alarm most Westerners: typically each family member has a company to run; VIA is the wife’s, and the husband runs another giant. Nepotism is the word there. At Intel you will find engineers and managers of every race and nationality (I don’t work there). In any Taiwanese empire you will find mostly, well, try guessing.
Yes, Intel did get into legal trouble for aggressive marketing, and that’s why AMD still has some part of the pie, but it has to fight it out in the market for that slice. Netburst was crap; they got over it. Every company makes mistakes and either learns from them or dies.
Besides x86 and ARM, there are many other processors still out there, mostly in the embedded area. In my own humble opinion, the future looks much better for ARM in tablets, phones, and the embedded space than it does for Intel/AMD in PCs; it’s volume vs. margin. With the gradual switch to slimmer PCs/tablets, I see only huge growth for ARM and timid growth for x86. The ARM industry has at least a dozen vendors in it and looks quite healthy.
The term CPU predates Intel by a long time; even in those days IBM, Burroughs, ICL, and many others were building CPUs out of many discrete standard logic chips and custom chips. The 4004 was only halfway to a whole CPU, with very limited performance, and stuffed into a 16-pin DRAM package because that was what was available. Pins were very expensive in those days, but the more you wanted to do good CPU design, the more pins you needed in the package; that had to wait a while for the 40-pin DIP and the switch to NMOS.
Congratulations are really due to the entire industry.
Wow, you are so wrong I don’t even know where to begin on your wrongness. Intel wasn’t “aggressive”, they BRIBED THE OEMS, which several OEMs admitted under oath. One likened Intel kickbacks to “cocaine”, and there were several quarters during the price wars where the ONLY profits Dell had were Intel kickbacks.
Also, you might want to look up “Intel rigs compilers” to see how simply changing the CPUID on ANY Intel chip from “GenuineIntel” to “AuthenticAMD” suddenly slows the CPU down by nearly 40% on ANY program compiled with the Intel compiler. Wow, a single string change suddenly causes the entire code to fall apart; how is that possible? Simple: Intel is RIGGING their compiler so that ALL code compiled with it, whether the coder wants it or not, looks at the CPUID, and if it’s Intel it gets full SSE up to SSE3; if not, it gets x87, which was deprecated back in 1995!
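To make the mechanism concrete, here’s a minimal sketch of that kind of vendor-string dispatch. It’s my own illustration of the technique, not Intel’s actual compiler runtime, and it assumes GCC or Clang on x86 for <cpuid.h>:

```c
/* Minimal sketch of vendor-string dispatch (an illustration of the
 * technique, not Intel's actual runtime). CPUID leaf 0 returns the
 * 12-byte vendor string split across EBX, EDX, ECX. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

static void get_vendor(char vendor[13])
{
    unsigned int eax, ebx, ecx, edx;
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(vendor + 0, &ebx, 4);   /* vendor string order: EBX, EDX, ECX */
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
}

int main(void)
{
    char vendor[13];
    get_vendor(vendor);
    if (strcmp(vendor, "GenuineIntel") == 0)
        puts("dispatch: vectorized SSE path"); /* chosen by vendor string */
    else
        puts("dispatch: legacy x87 path");     /* taken even if the CPU supports SSE */
    return 0;
}
```

The point of contention is exactly that branch: an honest dispatcher would test the SSE feature bits from CPUID leaf 1 instead of comparing the vendor string.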
So this is NOT aggressive marketing, this is complete market rigging. They paid the OEMs massive kickbacks and made it clear those kickbacks were tied to how many AMD chips they carried (which is why you could find Duron/Sempron but not Athlon, as Intel gave them a higher “quota” of allowed Semprons), and when they saw that even the Duron stomped Netburst, because they had made the pipeline too long, instead of fixing their problem they rigged the compiler so any CPU that didn’t have the right flag got a boat anchor tied to it. Frankly, what they did made what MSFT pulled look like a joke, and MSFT got put under sanctions for 12 years.
Well… http://en.wikipedia.org/wiki/The_Corporation_(film)
And, really, it’s nothing new; it’s probably largely a reflection of what we humans are.
Given that you say “IMHO power ALWAYS corrupts”, I suppose you don’t think AMD or Via would have acted any differently in a roughly similar position and circumstances, given some opportunity in which they saw a chance for themselves.
And come on, don’t glorify them. Via had many years for that… Yes, the OEMs most likely were an important factor, but Intel didn’t even necessarily have to intervene (the OEMs by themselves preferred to sell machines to “premium” people).
OTOH, what Via did in some of their initiatives (announced ones, at least; I’m not sure if they were ever really realized) can largely be described as dumping obsolete, uncompetitive, barely useful tech on impoverished people (products of bad value often hurt them the most).
Or consider the “evil” brought by flawed Via chipsets from a decade or more ago. Equally inexpensive but technically competent alternatives were available: the ALi Aladdin in the Super Socket 7 era, SiS chipsets in the K7 era. But somehow Via managed to push its own flawed chips to a generally dominant position (in that segment), and didn’t have any qualms about it.
All their issues and instabilities… Think how much additional (unnecessary) stress that brought, how much human creativity and energy it wasted.
Well, many of them weren’t really that good a deal, and largely died rightfully. OTOH, there are still many embedded architectures (that’s where x86 was born) to choose from.
Before the 4004, CPUs required more than one chip, but the term was in use going back to the 1950s.
Make that the first single-chip commercial CPU.
It’s the first microprocessor; the 4004 was technically part of a chipset that made up the actual CPU.
Although the 4004 chip itself was cheap enough, by the time it was packaged up into a useful system the cost was back in the many thousands. The tiny bus meant lots of external glue logic was needed. The 8008 was a little better, but the switch to NMOS in the 8080 is what allowed lots of rapid improvements to start coming in, plus competitors. The 8080 finally had a big enough 40-pin package that the address and data buses could be driven directly; most of the glue logic was gone.
I used to love watching Tomorrow’s World and Raymond Baxter mentioning “it has a built-in micro computer…” every time he opened up the gubbins box to show these new-fangled things.
If only the Japanese had retained exclusive rights to this chip. The world would have been a different place.
I read it would take 360,000 4004 CPUs to get the same performance as a regular desktop we use today.
We’ve come a long way.
Actually, it’s probably several orders of magnitude higher than that. I’d roughly say 10^9 to 10^12 4004 units to reach the performance of current CPUs.
The thing is that current CPUs do in hardware things that have to be emulated (i.e., slowly) on the 4004:
– The 4004 works on 4 bits at a time, while x86-64 can work directly on 64-bit registers. Hence operations on normal 32- or 64-bit integers take several instructions on the 4004 but only one on x86-64 (see the sketch just after this list).
– The number of cycles needed to execute complex integer operations such as division and multiplication has been dramatically reduced on modern CPUs.
– The 4004 has no hardware support for floating-point math. Compare that with a Core i7 Sandy Bridge, which can execute an operation on 8 floats in one instruction via AVX.
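To see why the first point hurts so much, here’s a rough sketch (hypothetical C, obviously not 4004 code) of what a 4-bit ALU has to do for a single 32-bit addition:

```c
/* A rough illustration of why wide integers are expensive on a 4-bit ALU:
 * one 32-bit add becomes a loop over eight 4-bit nibbles with explicit
 * carry propagation, where one x86-64 ADD does it all in one instruction. */
#include <stdint.h>
#include <stdio.h>

static uint32_t add32_by_nibbles(uint32_t a, uint32_t b)
{
    uint32_t result = 0;
    unsigned carry = 0;
    for (int i = 0; i < 8; i++) {            /* 8 nibbles = 32 bits */
        unsigned na = (a >> (4 * i)) & 0xF;  /* extract nibble of a */
        unsigned nb = (b >> (4 * i)) & 0xF;  /* extract nibble of b */
        unsigned sum = na + nb + carry;
        carry = sum >> 4;                    /* carry into the next nibble */
        result |= (uint32_t)(sum & 0xF) << (4 * i);
    }
    return result;
}

int main(void)
{
    printf("%u\n", add32_by_nibbles(123456789u, 987654321u)); /* 1111111110 */
    return 0;
}
```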
If you throw things like GPUs (CUDA/OpenCL) into the mix, it just gets ridiculous.
Yet it would take the 4004 only a few years to do more computation than civilization had ever done before the advent of electronic computers in the 1940s.
That is certainly one way of looking at it, to maximize the difference up to gazillions.
I prefer to compare the productivity of PCs built with useful processors, even if one has orders of magnitude more devices and clock speed than the other.
An early 68000 had around 46,000 devices running near 8MHz, while the 4004 had 2,300 devices at 0.8MHz. The overall difference is 200 times more power for the first nice CPU you could actually use to work on graphical documents with windows and a mouse. The 486 would be similar.
A modern x86 may have 10,000 times as many devices and 400 times the clock speed, which means a PC today has maybe 4 million times the theoretical performance of the 68K Mac.
Somehow the quad core I am using today does not feel like it has 4 million times the performance or responsiveness of my first Mac. Maybe it feels 100 times faster, but it still has too many odd latencies. It does have 20 times the screen real estate and full color, so it needs at least 1,000 times the power for software-based graphics, but then again the graphics are done by nVidia/ATI, so what the heck is the CPU doing? It’s mostly idle, as was the 68K.
For me, the usefulness of a processor seems to follow more the log of devices × clock speed.
So 4M × 200 is close enough to the 1 billion suggested. Of course these arguments don’t really make much sense; it’s like comparing an amoeba to higher life forms.
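For what it’s worth, here’s a quick back-of-envelope check of those ratios, using only the figures quoted above (assumed round numbers, not measured data):

```c
/* Back-of-envelope check of the ratios quoted in the comment above. */
#include <stdio.h>

int main(void)
{
    double dev_4004 = 2300,  clk_4004 = 0.8e6; /* 4004: ~2,300 devices, 0.8 MHz */
    double dev_68k  = 46000, clk_68k  = 8e6;   /* 68000: ~46,000 devices, 8 MHz */
    double dev_mult = 10000, clk_mult = 400;   /* modern x86 vs 68000 (assumed) */

    double r_68k = (dev_68k / dev_4004) * (clk_68k / clk_4004); /* 20 x 10 = 200x */
    double r_x86 = dev_mult * clk_mult;                         /* 4,000,000x */

    printf("68000 vs 4004:        %.0fx\n", r_68k);
    printf("modern x86 vs 68000:  %.0fx\n", r_x86);
    printf("modern x86 vs 4004:   %.2e\n", r_68k * r_x86);      /* ~8e8, near 1e9 */
    return 0;
}
```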
I was lucky enough to have used one of those chips. Soon after, I moved to the Z80, 8080, 6809, 8086, and then microcontrollers such as the 8051 and AVRs.
Time goes by very fast, and technology advanced fast as well. I was barely able to follow the speed of it, and have by now totally lost the overview of what is happening.
Happy Birthday Mr. Forty O’Four.
I cannot even guess what we will have in the next 40 years, in 2051.
Wagner
I may have to dig into my old parts bin. I may have one or more of those lying around.