It’s been a long, strange trip for the personal computer over 30 years. Ars takes a look back at the comings and goings of players in the PC market, from Altair to Zeta OS, to see how we got where we are today. “When you step back and look at the big picture, the overall dominance of the PC becomes clear. However, this was not always the case, and in fact it wasn’t until 1986 that the PC platform first surpassed 50% market share. This was more than a decade after the first personal computer was sold.”
30 Years of Personal Computer Market Share Figures
61 Comments
Look at the graph on the last page and see how badly Commodore really blew it. Sad to see if you have owned several of their excellent machines as I have.
Also not really spelled out in the article, but there between the lines, is just how much we owe the geeks. You know guys like Ed Roberts, Woz and Jay Miner who built their machines, and an industry, out of pure love of the technology. I would take these guys over the egomaniacal Jobs and Gates with their million dollar houses and cars of this world any day.
-
2005-12-15 4:58 pmBluenoseJake
Steve Jobs and especially Bill Gates are geeks themselves; they just managed to make insane amounts of money from their geekiness.
Edited 2005-12-15 17:03
-
2005-12-15 8:01 pmrcsteiner
Steve and Bill are mainly marketing/business types with a little technical background, and each depended on others (in the past as well as now) to do the heavy techie lifting.
Another anti-Apple article at Ars Technica!
They all must work for MS.
(Does OSNews have “funny” mod points?)
-
2005-12-15 2:51 pmjenniamc
Too bad they make the mistake of referring to market share in terms of the total number of installations in use, but then use sales numbers to make the point.
It’s the oldest trick in the book to give the illusion that there are fewer Mac users out there than there actually are.
I know technology has moved on an incredible amount, but the pocket calculator comparison always bugs me a little.
The old UNIVAC 1107 was a 36-bit machine with a huge amount of offline storage on tape drives, and the processing power was more distributed among the I/O and other components than concentrated in the CPU.
So I wonder whether you could actually run the same kind of programs to the same precision on a modern pocket calculator. Has anyone tried it?
-
2005-12-15 4:46 pmAnonymous
Well, these days you could build a calculator with an ENIAC in it.
Check this out:
http://www.ee.upenn.edu/~jan/eniacproj.html
-
2005-12-15 5:49 pmrcsteiner
I doubt it. I admittedly don’t know much about the 1107 running EXEC I or II (before my time), but the 1108 running EXEC 8 was a multi-CPU box capable of both multiprocessing and multi-threading, and it juggled batch, real-time, transaction processing, and interactive tasks/users all at the same time.
This should give you some idea of the functional complexity of the beast (it wasn’t a RISC box <g>): http://www.fourmilab.ch/documents/univac/instructions.html
-
2005-12-15 6:49 pmrayiner
It depends on how you define “pocket calculator”. If you’re talking about something like an HP49g+, then it’s probably true. That has a 75MHz ARM9 processor, 512KB of main memory, and 2MB of flash storage. That’s a faster CPU than the 1107, roughly twice the memory, and storage equivalent to about 1.3 miles of tape.
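Out of curiosity, here’s a back-of-envelope check of that comparison. The 1107 figure of 65,536 words of 36-bit core and the tape density used below are my own assumptions, not numbers from the article or the post above, and the mileage figure depends heavily on which drive density you assume:

```python
# Assumed spec (not from the article): a maxed-out 1107 with
# 65,536 words of 36-bit core memory.
words = 65_536
word_bits = 36
core_bytes = words * word_bits // 8          # 294,912 bytes (~288 KB)

calc_ram = 512 * 1024                        # HP49g+: 512 KB main memory
print(core_bytes, calc_ram / core_bytes)     # ratio is roughly "twice"

# Tape equivalent of the 2 MB of flash, assuming (again, my assumption)
# a UNISERVO-era density of 200 six-bit characters per inch.
flash_chars = 2 * 1024 * 1024 * 8 / 6        # 6-bit chars in 2 MB
inches = flash_chars / 200
print(inches / (5280 * 12))                  # tape length in miles
```

At that assumed density you get only about a fifth of a mile, so the “1.3 miles” figure presumably assumes a much lower recording density.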
A very good article, with a good mix of facts for its length. It also brings back memories. I grew up in the 1980s, and it was in the so-called 8-bit era that I got interested in computers and later made it into a career. One of the main things that got my attention was the colourfulness of the industry. There were so many different machines and ideas. I sort of doubt I would have chosen the same profession 15 yrs later…
Are these market share numbers US only?
I’m asking because the ZX Spectrum, which was a big success in Europe, doesn’t appear at all. Also, in Europe the Amiga did better than the more expensive Macs.
And the 16-bit era is a misnomer. Amiga, Atari ST and Mac were all based on the 32-bit 68000 architecture from the start in the mid 80s. It was only IBM and MS that stayed in the 16-bit-segmented stone age until the mid 90s.
-
2005-12-15 2:53 pmAnonymous
“Are these market share numbers US only? ”
Probably. I also remember a lot of other funky “home computers” from the 80’s like the Dragon-32, Dragon-64, Oric and MicroBee to name a few.
-
2005-12-15 4:56 pmAnonymous
“Are these market share numbers US only? ”
Probably. I also remember a lot of other funky “home computers” from the 80’s like the Dragon-32, Dragon-64, Oric and MicroBee to name a few.
No, according to the replies in the Ars forums they are worldwide sales numbers.
-
2005-12-15 3:52 pmjaygade
Are these market share numbers US only?
The comments after the article mention that they are worldwide figures. They did leave out a lot of machines that were less popular but still important.
-
2005-12-15 9:20 pmkmarius
If the machines could have been called 16bit, Commodore would have called them that. Larger number of bits = better PR. The CPU itself had a 32bit address space, but that didn’t make the Amiga 32bit ( http://en.wikipedia.org/wiki/68000 )
-
2005-12-15 9:21 pmkmarius
I meant “If the machines could have been called 32bit, Commodore would have called them that”. Bits mattered back then, and a 32bit home computer would sell better.
-
2005-12-15 9:32 pmAnonymous
Wrong. The address space and bus of the 68k were 24-bit. The internal data paths were 32-bit, but the external data bus was 16-bit. Such processors are called 16/32-bit. Other typical specimens of that kind are the National NS32016 or the i80386SX.
The 68020 was the first pure 32-bit 68k family member, and migration toward the 68020 was no big deal.
The AmigaOS was always a 32-bit OS.
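The 24-bit-vs-32-bit distinction had a concrete consequence: the 68000’s top address byte was simply not decoded, so software could (and famously, on early Macs, did) stash tag bits there, which then broke on fully 32-bit-clean machines. A toy model of that aliasing, purely illustrative:

```python
# 68000: 32-bit address registers, but only 24 address bits reach the
# bus, so addresses differing only in the top byte hit the same memory
# location. The 68020 decodes all 32 bits.
ADDR_MASK_68000 = 0x00FFFFFF
ADDR_MASK_68020 = 0xFFFFFFFF

ptr        = 0x00012345
tagged_ptr = 0xFF012345   # top byte "borrowed" for flag bits

same_on_68000 = (ptr & ADDR_MASK_68000) == (tagged_ptr & ADDR_MASK_68000)
same_on_68020 = (ptr & ADDR_MASK_68020) == (tagged_ptr & ADDR_MASK_68020)
print(same_on_68000, same_on_68020)   # True False
```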
>They all must work for MS.
not all of them, but there are several well known people at Ars that ARE MICROSOFT employees… there are also a number of “microsoft partners” and people selected by microsoft to be called “VIPs” who are given special treatment for their abilities to toe the company line in public forums as non-employees.
not all of them, but there are several well known people at Ars that ARE MICROSOFT employees…
Names and evidence please.
-
2005-12-15 5:55 pmrcsteiner
Well, I know that pdampier was an employee of MS at one point in time. Maybe still is, I dunno (I’ve not been active on Ars for a few years now).
Too bad they make the mistake of referring to market share in terms of the total number of installations in use, but then use sales numbers to make the point.
It’s the oldest trick in the book to give the illusion that there are fewer Mac users out there than there actually are.
Silly conspiracy theorist. How is this a trick if the article does exactly what it says in the headline?
Sales numbers are simply the only useful long-term numbers that are available. Or have you got any historical data on installation numbers?
And what would count as an installation anyway? How much dust would a computer have to gather before it falls out of your statistic? And what about things like dual-boot and virtual machines?
-
2005-12-15 3:48 pmalcibiades
The problem some mac users have is, they cannot accept the verdict of history, and so are compelled to try to rewrite it all the time.
What history shows, and this article too incidentally, is that there was one irresistible market model: open hardware. Open hardware as a model was enormously powerful even when the OS was DOS. As soon as it had a “good enough” operating system, which happened with W95, everyone else was left in the dust.
Now, people argue no, it was all down to MS unfair competition. This is nonsense of course. There was unfair competition, but what was it that put MS in a position to be able to compete unfairly? If you and I were to try competing in the same way, people would just laugh at us. What allowed it was the power of the open hardware, good enough OS, model.
It beat closed hardware and a very much better OS in the market. In fact, it beat several of them.
The mac people will immediately think, this is all wrong, the closed system offers huge benefits of plug and play, just works, etc. But what they have to think is, given that no one supplier is ever going to have a monopoly on closed all-in-one systems, what they are really in favour of is a market structured around multiple suppliers of tied hardware and OS bundles. Now, the reason that we do not have such a market today is that the closed systems, in competition with the open-hardware systems, lacked evolutionary fitness just because they were closed. This is why we ended up with a market of open hardware and the OSs we have today.
In the end buyers, particularly corporate ones, wanted neither multiple incompatible closed systems nor to be tied to single-source hardware from one supplier. This meant that the closed-system market was a very small ecological niche. It’s the same reason you do not have an ecological structure with lots of medium-sized slow-moving herbivores on the African plains. They got eaten. The fast-moving ones didn’t. This leads to an ecological structure with lots of different fast-moving herbivores.
None of this says that Apple should necessarily license the OS, or even could make a go of it if they did. But it does say that the tiny share they currently have is not an accident, and probably cannot be materially reversed by current strategies, and that lamenting how the stats are calculated will not change any of this.
Incidentally – have you noticed that the AOpen Mini lookalike, open hardware, is scheduled to sell in the UK for more than a Mac Mini? Could it be, curious minds are starting to wonder, that people will pay a premium NOT to have to run OS X on it, rather than the other way around?
-
2005-12-15 4:11 pmTyr.
In the end buyers, particularly corporate ones, wanted neither multiple incompatible closed systems nor to be tied to single-source hardware from one supplier. This meant that the closed-system market was a very small ecological niche.
Nice theory, except that in the beginning there was only one supplier of PCs: IBM, and they only opened up after a fight (the Phoenix BIOS lawsuit and the failed PS/2 architecture are two examples)
The real reason IMHO is the fact that while Apple, Commodore and Atari were hot with the enthusiast crowd and even educational institutions, IBM already had sales reps in every major company. These were the first to switch to computers (IBM naturally, a brand they knew and trusted). Afterwards employees were given deals on old machines and even new ones to encourage them to learn valuable computer skills.
No high principles and starry-eyed users admiring open hardware. Just plain old “honey, I can get a great deal on this computer from work” and “Let’s just get what I use at work, I already know how _that_ works”
-
2005-12-15 4:29 pmAnonymous
While I agree with the “familiarity-at-work” and “knock-off-price” paradigm, I am really intrigued by your belief that companies are the first to switch computers. I mean, even compared to enthusiasts? I think this is a bit of muddled thinking. Don’t we see so many articles about how most businesses are still using Windows 98 or NT rather than XP, and will not upgrade to Vista because they did not upgrade to XP? Plus they have old hardware which will not be supported, etc.
I don’t know about your company, but our company is never the first to switch computers or OSes. Our IT department just decided a couple of months ago that SP2 is okay, and is still making sure that Tiger will work. Most enthusiasts would not wait this long. I know I did not.
Anyway, isn’t it logical that a company with many computers will be slow in switching, since they need to make sure that all their old software works and the new computers will behave with all the old ones?
-
2005-12-15 4:50 pmTyr.
I am really intrigued with your belief that companies are the first to switch computers. I mean even compared to enthusiasts?
I probably phrased that badly: they were the first to switch in great numbers. Look at the numbers in the article and see how the volumes change around the time of the introduction of the PC, from under 2000 to 10000 and beyond.
Companies buy large orders and renew their inventory on a much more regular basis than consumers, at least they did then. Remember buying that shiny new 286 for $3000 ? Companies then were the only ones that could afford to upgrade regularly.
Anyway, isn’t it logical that a company with many computers will be slow in switching, since they need to make sure that all their old software works and the new computers will behave with all the old ones?
Yes, but then they were switching from no computers at all to a computerised office. Also, compatibility then wasn’t such an issue; mostly the only thing that changed was speed (this is how “IBM compatible” became such magic words), at least until the advent of Windows, which is when upgrading became REALLY problematic.
You can’t really apply today’s rules to the situation then; it’s a different ball game.
-
2005-12-15 8:25 pmAnonymous
It isn’t starry eyed users that won with open hardware, it was business sense and economics that made open hardware successful. Regardless of what lawsuits were filed that forced IBM to go open, they were open.
Reasoning – you as a small guy, we’ll call you Mike from Texas, can go out and slap together a PC by buying bulk orders of sound cards, cases, motherboards, CPUs, RAM, VGA cards, etc. You can pick and choose manufacturers and get the best price. Then you can sell some boxes for a slight profit. You have no need to spend money on R&D, so you can further undercut the all-in-one vendors.
Another guy, we’ll call him Ted from Iowa, can also do the same thing with a different mix of brands at different price points.
You can see which company offers the price/ performance ratio you are looking for and buy that – and buy that in bulk. You aren’t locked into the 3 or 4 tiers of products sold by Apple, Sun, SGI, Commodore, Amiga, etc. MS-DOS/ Windows provided a unified front-end to the crazy hodge-podge of hardware on the back end.
As Bill G’s letter pointed out, a standard to develop hardware for provided immense opportunity and predictability. The simple reason was that the more investment there was, the more investment it attracted.
In addition, it let innovation out of the bag. The early companies were innovative and dreamed big, but a company with defined goals and a focus means that many possibilities can’t be pursued, and those have to come from third parties. If your system isn’t open, that third-party ecosystem is unavailable. Finally, as Amiga, Be and Apple showed, poor business decisions can doom innovation, investment and progress, while an open system offers greater product security – i.e. just because Gateway goes under, no one loses faith in PCs.
Whether Open Software is really the next step in this trend remains to be seen. It needs a standard to base on, and while most people are looking at Linux, Debian, blah, blah, it’s gonna be the web.
-
2005-12-15 9:22 pmTyr.
[i]Reasoning – you as a small guy, we’ll call you Mike from Texas, can go out and slap together a PC by buying bulk orders of sound cards, cases, motherboards, CPUs, RAM, VGA cards, etc. You can pick and choose manufacturers and get the best price. Then you can sell some boxes for a slight profit. You have no need to spend money on R&D, so you can further undercut the all-in-one vendors.[/i]
As I said in an earlier post: don’t make the mistake of applying today’s situation to the past. The PC was already a hit by the time the clones appeared. The hardware market, although reasonably large in comparison to other platforms of the time, was infinitesimally small compared to the one of the mid 90s. Sure, the handful of clone makers like Compaq, Tandy, Wang and Commodore did compete a bit, but they were hardly very differentiated hardware-wise and still came at a hefty price.
See: http://andy.saturn9.ws/Photo%20Albums/1989_tandy_pc_ad.jpg (Tandy 386: $8499 – look at the spec and tell me where the custom video, audio, motherboard, etc. are)
The whole “open” history might have appealed to companies at the time to join the market, but the way it’s told now is just plain revisionism.
-
2005-12-15 10:09 pmalcibiades
Well, if we look at the history, Compaq with the Phoenix bios started in ’82. The real growth, and the competition with Apple for the desktop, lay ahead, and the Mac succeeded or failed in a market environment in which Apple was supplying bundled hardware and OS, and the IBM compatibles were supplying hardware from whoever you wanted, and a good enough OS from MS.
I do agree that IBM’s endorsement made all the difference at the very start. But it’s what happened afterwards that decided it.
My reading of history is that the two business models had a fair shot, head to head, and that what lost was not the Mac OS, but the Apple business model. And you can see why. You are a corporate buyer, buying a few thousand machines. Now, what do you do when Apple tells you, sorry, the price just went up (as it did under Sculley)? Or when they tell you, sorry, shortage right now, we’ll ship you half next quarter?
You get fired is what happens. You have a meeting with your management at which they ask you why you let one supplier have you over a barrel. And you start giving them long technical explanations about why you don’t want to have a mixed operating system environment, so you should just pay.
This is, and always was, a total non starter. What you want to do is call up another vendor and get as many as you want tomorrow. And have them compete on price.
So, that’s how it worked out. And that’s why, right or wrong, Apple will never get out of the niche until and unless it licenses the OS. I’m not saying this is good or bad. Just that it’s a fact now, as it was a fact then.
-
2005-12-15 5:51 pmKris
I’d slightly rephrase what you have said and say:
What Microsoft did was use the market forces in the hardware sector perfectly. They built their monopoly on top of heavy competition, knowing that the competition (which was enabled by the open hardware) could lead to either
a) differentiation or
b) cost-leadership
(at least if you believe in Porter)
on the manufacturers’ side. It was rather obvious that cost-leadership was going to happen. As a result, of course, this leads to lower prices eventually (even though the idea is to drive out opponents via the threat of lowering prices below their production cost and taking their market share). The lower prices led to a large number of people owning the hardware.
What Apple did was compete by differentiation because they simply did not realise that the money was to be made on the software side of things.
-
2005-12-16 1:26 amskingers6894
It all depends how you look at it.
The other “verdict of history” is that Apple is the only manufacturer of personal computers that was there in the seventies and survives today.
IBM themselves sold off their PC biz a little while ago.
The PC clones worked for the “cloners” but it did NOT work for IBM. They lost the platform entirely.
-
2005-12-16 9:09 amalcibiades
Yes, both of those propositions are true.
But maybe the important question is not so much how the market has worked for the companies, but how the market has worked for the customers. In these debates, I read a lot about people’s personal preferences – the Apple people for integration, others against it. I read a lot about how it would be good or bad for a given company, IBM or Apple, if they did or did not bundle, unbundle, allow clones. What I don’t find people thinking seriously about is what market structure is better for us the customers as a whole. That is what counts.
Does anyone really think we would like it better, or be better off, if today, instead of buying our hardware from a number of suppliers, and running the same OS or a different OS on it, we had to choose between incompatible bundles of hardware, software and OS offered by (say) Commodore, Apple, Compaq etc, each with around 20-30% of the market, all doing their best to lock us in to whatever we bought, all incompatible…
We should be profoundly grateful that the market voted against the Apple model as it did, because if the Apple model had come to dominate the industry, the industry would be far less interesting and we would be far worse off. Yes, Apple might be better off. Who cares?
-
2005-12-16 4:18 pmskingers6894
Maybe, maybe not.
If there were genuinely three or four platforms I’m sure the things that count would have become multi-platform. It would also be unlikely that a single browser or media player format could have gained a monopoly.
Imagine, web developers would have to adhere to real standards as there would be at best a browser with 30% of the market – unless it was really outstanding and cross-platform. The IE monopoly thing could never have occurred.
The only thing we know for sure is that a small (though large enough to buy billions worth of kit) group likes the Apple approach, so as long as they are around we have a choice. Open or closed. It’s still valid to choose closed if you wish. Some people do, I guess.
http://www.scripting.com/specials/gatesLetter/text.html
Thanks for that link! Very interesting stuff.
It really is.
But I don’t really get what Billyboy’s interest was in the matter. Where did MS come into the equation except as a facilitator?
DOS was very much incompatible with the Mac hardware and would have been no match for the Mac GUI anyway. Was it about MS application software?
-
2005-12-15 4:35 pmjaygade
But I don’t really get what Billyboy’s interest was in the matter. Where did MS come into the equation except as a facilitator?
DOS was very much incompatible with the Mac hardware and would have been no match for the Mac GUI anyway. Was it about MS application software?
Because MS was and is a Macintosh developer. It would have grown their applications software market even more.
the finest 8-bit computer of all time…..
also didn’t give enough coverage to NeXT – again amazing, amazing computers for their time – and as was alluded to but not really explicitly expressed – OSX is NeXTSTEP, or at least NeXTSTEP 5.0?
Back in the day I wanted to invest in a C64 with a 1541 disk drive so I could play games and other things. For its time the C64 was an awesome machine! The games were as good as NES games. My older brother convinced me to wait and buy a PC, whose games I wasn’t impressed with at the time, as anyone who has played some of the early ones on a CGA monitor can attest. Thinking back, it was strange having choice in computing platforms: you didn’t want to spend a bunch of money on hardware and software only to see it dead-end in 2 years.
So it turns out that Commodore was the only one to give the PC a fight. They failed miserably at getting people to convert to the Amiga. It seems like one big mistake was that the first Amiga was not a low-end machine price-wise, like the C=64. Perhaps if they had introduced the Amiga 500 first, they could have gotten more C=64 owners to switch.
I bought an Amiga 3000 in ’90 or 91 and it served me well for 6 years.
Only one nit to pick. Even MS understood that Win95 did not support preemptive multitasking. They had a better idea. They called theirs “cooperative multitasking”. That is to say, they left it up to the individual programmer to periodically let go of the CPU and let other processes do their things. Suffice it to say, we programmers are a greedy lot. And Win95 users suffered dearly for it. Hence the three-finger salute.
-
2005-12-15 6:49 pmjaygade
Only one nit to pick. Even MS understood that Win95 did not support preemptive multitasking. They had a better idea. They called theirs “cooperative multitasking”. That is to say, they left it up to the individual programmer to periodically let go of the CPU and let other processes do their things. Suffice it to say, we programmers are a greedy lot. And Win95 users suffered dearly for it. Hence the three-finger salute.
I don’t know why people don’t understand that Win9x *did* have preemptive multitasking! Win3.x had cooperative multitasking. So did MacOS until MacOS X.
Win9x crashed more because the memory protection wasn’t as good as NT, especially when running a mix of 32-bit and 16-bit software.
-
2005-12-15 10:07 pmAnonymous
The reason why I say it is because I had several programs that involved rather long processes where I had to entertain the end users to keep them from rebooting while I was updating a database, etc., and the only way I could get the little idiot message to update periodically was to suspend the main process and let windoze have control for a few cycles. Running the idiot message in its own thread did not make a difference. MS claimed they had preemption, but they didn’t do it very well. Just my humble opinion.
-
2005-12-15 7:07 pmAnonymous
You are inadequately educated on Windows 95. It does actually do preemptive multitasking, despite your assertions otherwise.
The problem of applications apparently hanging the system, and thus requiring the three-finger salute, was due to something called the Win16 Mutex, which locked access to certain GDI code (a lot of that was still 16-bit for compatibility; NT never had 16-bit code there) and resources shared amongst all applications. That sharing allowed Windows 3.1 16-bit applications to run with better compatibility than they’d run under Windows NT 3.1 (or later), because they used programming cheats that assumed many different things that were no longer true. As a result, if a GUI application (one that uses those GDI resources; it didn’t apply to other applications) went haywire, then the GUI (but not the kernel or non-GUI applications) would get stuck waiting for that Win16 Mutex to be released. Thus, it appeared as though the entire OS used cooperative multitasking, which simply isn’t true.
All 16-bit applications executed in Win95 (and NT) run within a common Windows on Windows subsystem, and cooperatively multitask within the confines of that subsystem, while that subsystem is preemptively multitasked as a regular 32-bit task alongside everything else.
Windows 95 supports all the required multitasking/multithreading primitives (though things like security descriptors are always ignored), such that (where you aren’t relying on security descriptors and the like) an application written to run with multiple threads, or even to preemptively multitask/multiprocess with another application via shared memory, mutexes/semaphores/etc., works identically between Windows NT (and derivatives) and Windows 95 (and derivatives). What is certainly true (as shown by many tech sites in the past) is that Windows 95/98/ME doesn’t handle those things as efficiently as the NT-derived kernels.
16-bit Windows 3.11 did not preemptively multitask between 16-bit Windows applications; it did, however, preemptively multitask between all virtual DOS boxes (run via the MS-DOS Prompt .pif), even if the DOS applications running within them were unaware of the existence of everything else around them: no cooperative multitasking in that case.
Jonathan Thompson
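The Win16 Mutex effect described above can be mimicked with ordinary threads in any language. Here is a small Python sketch (threads standing in for Win95 processes, a plain Lock for the Win16 Mutex; purely illustrative, no Win32 involved): the scheduler stays preemptive throughout, but anything that needs the shared lock stalls behind a misbehaving holder, so the "GUI" looks cooperatively scheduled while non-GUI work carries on:

```python
import threading, time

win16_mutex = threading.Lock()   # stands in for the shared GDI lock
progress = {"gui_app": 0, "background_app": 0}

def haywire_app():
    with win16_mutex:            # grabs the shared lock...
        time.sleep(0.5)          # ...and sits on it (a "hung" 16-bit app)

def gui_app():                   # needs the lock for every "redraw"
    deadline = time.time() + 0.3
    while time.time() < deadline:
        if win16_mutex.acquire(timeout=0.05):
            progress["gui_app"] += 1
            win16_mutex.release()

def background_app():            # non-GUI work: no shared lock needed
    deadline = time.time() + 0.3
    while time.time() < deadline:
        progress["background_app"] += 1
        time.sleep(0.01)

hay = threading.Thread(target=haywire_app)
hay.start()
time.sleep(0.05)                 # let the haywire app win the lock first
workers = [threading.Thread(target=f) for f in (gui_app, background_app)]
for t in workers:
    t.start()
for t in (hay, *workers):
    t.join()

# The "GUI" task starved behind the lock; the background task was
# preempted in and out normally the whole time.
print(progress)
```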
-
2005-12-15 7:36 pmjaygade
Thanks for that clarification; I knew it had something to do with 16-bit code.
I had forgotten about virtual DOS boxes under Win3.x.
and educational too.
It seemed an honest stab at writing a quick history. The comments here all raise objections to the article but I think they reflect more on the biases of the commenters than the actual article.
If you enjoyed the article, you might consider buying the book Hackers; it’s interesting (even though the focus is a little different (MIT, gaming industry)).
-
2005-12-15 11:20 pmjaygade
If you enjoyed the article, you might consider buying the book Hackers; it’s interesting (even though the focus is a little different (MIT, gaming industry)).
I read Hackers way back in the day. I wish I had a copy now, it is a great insight into the beginnings of things.
I read it back in ’86 or ’87. A lot of stuff has happened since then, but I think that it is still very relevant.
Although interesting, the article only really applies to the US market. In the UK, Atari made almost no impact with the 400/800, the Apple II didn’t sell in any significant volume (way too expensive), and the much more influential (and better selling) machines were the Sinclair ZX81 & Spectrum, the Acorn BBC and the Archimedes (where the most popular CPU today, the ARM, was originally developed). Even the Dragon32 and the Amstrad CPC sold better than the early Apple/Atari machines over here…
Does anyone really think we would like it better, or be better off, if today, instead of buying our hardware from a number of suppliers, and running the same OS or a different OS on it, we had to choose between incompatible bundles of hardware, software and OS offered by (say) Commodore, Apple, Compaq etc, each with around 20-30% of the market, all doing their best to lock us in to whatever we bought, all incompatible…
Yes, of course there are many of us who believe this. The worst possible situation is to have everyone run the same software on the same hardware. That forces people with specific computational needs to use nonoptimal hardware and software. It also opens the door to security problems and viruses, which thrive in such a homogeneous environment.
The “ideal” computing situation would be for there to exist dozens of operating systems and dozens of hardware architectures that all interact through common file formats and common communication standards. These standards could evolve to promote productivity and security. Users with high floating-point requirements (for example) could use optimal hardware and software combinations without having to worry about exchanging MS Word documents (or database files, or whatever) with others who are using completely different combinations of hardware and software tuned to, for example, interactive graphics performance.
Multiple hardware vendors are only one part of this heterogeneous environment; we also need to move toward multiple software vendors and toward information interchange standards. There should be dozens of word processors running on dozens of operating systems that can read your MS Word or MS Excel file, and you should be able to exchange that file in such a way that there is absolutely no danger from viruses or other security problems.
However, as long as Intel owns 99% of the hardware market and MS owns 95% of the software market, we aren’t going to get there. We may have painted ourselves into a corner from which there is no escape.
It’s a hard task to take the entire personal computing history and fit it into a few pages, but I think the author did a good job here.
This is something that kids born in the late 80’s and 90’s should read to get a better understanding of the computing history. Even when I talk to a lot of people 5 years younger than me they have no idea that any of these products existed and how large their market share actually was. To them it has always been an x86/Windows-world with the little niche computer Mac on the side.
It saddens me to read about the failure of some great products once again, though. One of the great mysteries to me is the failure of the Amiga. I mean, sure, their marketing sucked, but pretty much everyone knew about them (at least here in Sweden). Yet people bought expensive computers that were years behind the Amiga.
It was interesting to look at the marketshare curve, and I was rather surprised at some of it. I didn’t think the IBM-PC and clones had such a large share that early in the game actually.
This will be a good reference to hand over to people who don’t have the slightest.
Thanks.
I guess the people who stayed away from the Amiga went for the PC, as the Amiga was seen mostly as a games machine and the PC was for work purposes, even if more expensive.
Yes, it’s nice to read a computer history article that actually mentions the Amiga & Atari ST.
It’s amazing how many articles just skip these machines.
For those who haven’t seen it, the 1985 memo from Gates to Sculley referred to in the article can be found here:
http://www.scripting.com/specials/gatesLetter/text.html
Hah, it seems they did follow Bill’s advice in the end : (on AT&T) “The mac interface should be viewed as a separate application interface that they can put on top of UNIX if they want”