In this, the 60th anniversary year of the computer, it may be interesting to look back at a couple of key events in the evolution of this very important market. The market now amounts to an extraordinary number of machines: in 2010, the last year for which we have numbers, no fewer than 10 million machines were shipped! This growth and penetration is unparalleled in the history of industrial products over the last 100 years, and is an amazing success. However, to get to this stage, the industry had to make its way through some issues and decision points. A few moments are generally agreed to have been key turning points. What would have happened if they had gone differently?
Note: This week, we feature a guest column by alcibiades, as Thom is attending a concert by The Streets in Paradiso, Amsterdam. Next week he will be writing the column again.
This excerpt from Computer World for March 2011 is reproduced by permission of the publisher.
Let’s recapitulate the present state of the market. IBM, as was always expected, has retained its dominance of the corporate sector. The last-minute decision to use its own software on its new line of machines, and the court decision against Compaq and Phoenix, led to the dominance of the end-to-end model. This model now has a 97% share of shipments. As is well known to corporate buyers, this allowed IBM to deliver seamless, integrated solutions to the desktop, using a mix of mainframe servers and intelligent terminals. There has been extraordinary progress in the speed and functionality of these terminals. In 2010, some have a few megabytes of memory, and there are rumours of hard drives with several hundred megabytes appearing shortly. Local computing, previously unheard of, is now possible. Color screens and sophisticated graphics are beginning to find their way from the art departments of advertising agencies into some executive offices, but it will be a long time before these are anything more than expensive toys for most people.
There is a small market for standalone educational and home machines. The Amiga, the Macintosh, and a host of other familiar names fill this segment. It probably accounts for about 10% of the total market. Apple leads at the expensive end of this market, but the Amiga is still strong.
The first time buyer of a computer today faces an interesting and challenging decision. Whatever choice he makes will have strong implications for the rest of his life. Whichever supplier he goes with, he will get a full range of applications, but he will not be able to move these applications from one platform to another, and the vendors have also adopted incompatible file formats, so that he will not even be able to move his documents very easily. Nor will any of his peripherals work with any other supplier’s. Move computer platforms, and you will have to jettison keyboard, mouse, screen, server. Few companies ever do it.
Service levels in the market are usually thought to be very high and fully satisfactory. As one corporate buyer we spoke to for this article said, “Why do I need to choose between thirteen different suppliers of something? What I want is one supplier who delivers. I have that.”
When we look back, we can see now that so many of the ideas trumpeted in the early 80s about the future were no better than the fantasies then current about wrist communicators. The idea that there could ever be a global information network turned out to require huge quantities of standard platforms, which simply do not exist. The idea that out-of-copyright books could be digitised and made freely available would of course require open formats. There are none. The idea that one day we would all have computers in our homes was perhaps the most absurd. What on earth would we do with them? How would we ever afford them? Apart from the mains socket, what would we ever connect them to?
Still, if the development of this market and technology has not realised the wilder dreams of the visionaries of the last century, it has contributed greatly to progress, and some applications that once seemed unlikely are now being realised. We are just now acquiring machines powerful enough to do graphical manipulation of photographic images, and researchers at IBM, while remaining very tight-lipped about how exactly this is to be accomplished, do hint that their computers may have abilities of this sort within the next five years. There is talk of universal messaging services. IBM’s service arm is considering building a massive email switch, a sort of electronic clearing house, to enable companies to send messages to each other even if they do not run the same computer systems. What a remarkable innovation this would be, and testimony to the creative powers of the industry.
Looking back, we can now see how little merit there was in the early proposals that the industry standardize on compatible hardware which anyone could make. It would have been a disaster. Companies would have competed solely on price; quality and reliability would have plummeted; huge numbers of cheap and marginally functional machines would have flooded the market; and there would have been no way to deliver the seamless, end-to-end controlled user experience which is such a wonderful feature of our present environment. It is, we now see, only in a world in which hardware and software are developed, manufactured, marketed and supported by the same organisation, that customers can really get the quality and stability they most deeply desire and need. It is the far-sighted courts in the IBM Phoenix case that we can thank for this. Thanks to them, the industry has been able to adopt a sustainable business model, and put creativity and innovation ahead of the pressures for short-term profits.
And so, we look forward to ever growing sales, and the prospect that by the year 2050, the market may as much as double. Just think, 20 million machines a year. What a remarkable growth record that will be, if it happens!
Huh?
i must have got even more drunk than i thought, on what i thought was last night.
Yeah!
http://en.wikipedia.org/wiki/Alternate_history_%28fiction%2…
I didn’t even drink last night! I played Risk all evening while drinking orange juice! Really, I did!
Ok, seriously, does anyone know how we should interpret this? Is this some kind of future scenario from someone who can’t compute? (2010-1980 != 60). Or is Thom on acid again?
Or is Thom on acid again?
Drugs are for kids. Other than that, read the article again if you want to understand it. It’s not that hard to see what this guest column is trying to prove.
Right idea, wrong decade.
It’s just saying that without directly competitive hardware vendors and open standards, technology wouldn’t move on any front. A very confusing way to put out that information.
OK, Mr. Wizard, now that we have the hardware, why is most of the world stuck with a single software vendor? Where’s the competition to push that technology ahead?
well there is the penguin, the devil, and some others…
Can someone please explain the article?
It’s an Apple commercial:-)
It is, we now see, only in a world in which hardware and software are developed, manufactured, marketed and supported by the same organisation, that customers can really get the quality and stability they most deeply desire and need.
Can someone please explain the article?
Alcibiades is unwell and cannot respond personally to questions. He is heavily sedated, following a nasty attack of spatio-temporal displacement syndrome. It appears to have been triggered by reading the following article.
http://online.wsj.com/public/article/SB114729881894749433-ORYg5V1P3…
Anyone susceptible to this syndrome should exercise caution while reading it. You wouldn’t want to end up like Al…
Huh??? The article linked to compares the product development models of commodity, dedicated-function consumer devices like the iPod to general-purpose computer software (and the hardware that supports it), like MS Office and the like. Definitely not an apples-to-apples (pun?) comparison, and therefore in my view barking up the wrong tree. And as pointed out, MS is using a similar development methodology for similar products like the XBOX, etc. However, I suppose that a regularly contributing columnist who is obligated to write articles on a regular basis may at times find it difficult to come up with meaningful topics.
These kinds of “what if?” scenarios work better if they have some degree of plausibility. The idea that the non-existence of PC clones would mean that the world wide web never existed, or that offices would all be using terminals connected to IBM mainframes, seems very, very silly to me.
Personally I find it very unlikely that IBM would have had so much success in that situation. I think that a lot of the success of the IBM PC platform was due to the cheap clones, rather than the IBM brand itself. After all, the reputation and image of IBM didn’t stop companies buying those cheaper clones rather than “the real thing”. If IBM only offered crippled terminals then surely the many alternatives would have been more attractive to businesses, and IBM would never have come to dominate the market?
The author drastically overstates the problems caused by the lack of a standard platform. After all, Mac and Linux users don’t have much trouble existing in a Windows dominated world. Even in the days of 8-bit computers there were cross platform applications and file converters. If there were more competing platforms then I think it would encourage companies to create more cross platform software and standard file formats.
I don’t really understand the author’s reasoning regarding the lack of a standard platform killing the Internet. When did the Internet ever require a standardised platform? After all, it existed before IBM started work on the PC, and today servers use a variety of hardware and operating systems. Why would things like email not exist in that world when they predate the PC?
Then there’s the world wide web; I don’t see how changes to the PC would have had any effect on its development by Tim Berners-Lee. After all, he developed it on NeXTSTEP running on 68K hardware, and IIRC web browsers appeared for the X Window System and Mac OS before DOS/Windows. Hardly an example of something that would have been killed by the lack of standardised hardware and software…
Particularly silly is the comment about digitised books not existing in that universe due to the lack of an open format. What about plain ASCII text? Most of the documents on Project Gutenberg are available in that format and it dates back to the 1960s.
Another thing I don’t find plausible is the idea that hardware would be far more primitive than it is now, with 100Mb hard drives, limited RAM and basic graphics being the norm in 2010. Even if you take the IBM PC out of the picture, what about competition between other home computer companies like Apple and Commodore?
Look at all the progress that was made in a few short years in the early days of the home computer. For example, compare 8-bit computers from the early 80s, like the ZX Spectrum and C64, with 16 bit computers available by the mid 80s, such as the Amiga and Atari ST. Why would that level of progress have stopped because the IBM PC went in a different direction?
Failing that, what about the games market? I think people underestimate how many advances in graphics, sound and storage were pushed by the demand for more impressive games. This would all have driven the creation of more advanced hardware, and IBM would have had to adopt it or fall behind.
Of course it’s impossible to know for sure what the computer industry would be like today if the history of the PC had been changed, but I can’t imagine it being anything like this world the author has dreamed up.
What about plain ASCII text?
What about PDF?
The basis of this projection is flawed in that it takes no account of the number of other manufacturers of microcomputers before and after the IBM PC was delivered.
If IBM had tried to force buyers into the software/hardware model shown in this article, the S-100 bus machines might have taken over.
While not completely standardized, you could buy hardware from a number of different suppliers that would work in your machine – if you could get the drivers working.
The IBM PC basically won because:
1) It had the IBM name behind it.
2) You only needed one driver model for all early IBM PCs.
3) The full hardware write-ups in Popular Electronics and Byte didn’t hurt it a bit. I don’t remember ever seeing such a complete write-up on the S-100 designs before that, probably because all the different S-100 machines had different, if interchangeable, hardware. The IBM PC had only one model for the first year or so.
And maybe most importantly.
4) The major non S-100-bus microcomputer players dropped the ball:
Apple concentrated on education at the time; they did not add the features to the pre-Macs that businesses wanted. And the Macs were too expensive for low-end business work.
Commodore’s management was greedy, paying themselves instead of funding development. The C64/128 models were never easily expandable enough, and the full range of the Amiga line was never reached in the proper time frames (the A2000 still had a 68000 instead of shipping with a 68020 on the motherboard); it was always behind the CPU state of the art.
Atari had a gaming image that would have been hard to break out of, and they did not help by crippling the Atari ST line with limited data and address buses. If any early machine needed a 32-bit data bus and a 24-bit address bus from day one to get maximum performance out of it, the Atari ST was it.
—————————————————–
Correct me if I am wrong, but most of the other remaining manufacturers at the time would have had a problem maintaining the advertising blitz that we saw from the above companies. But if they had gotten together to agree on driver design and shared advertising, they could have swamped a rigid IBM sales model.
On the other hand, after thinking about the egos and very dumb mistakes made by the S-100 crowd, I have changed my mind – they were doomed because they could never agree on a standard design that all basic business software could run on without modification.
Can I write a column if I snort a line too?
A +1 doesn’t express how hard I laughed…
It’s funny cause it’s true!
People of earth! I am from the future!
Right off, I think the article is a bit extreme and takes a simplistic view of how standards evolve and ignores the fact that not all standards are created equal.
Take the example of the internet. The “internet” as we know it represents a collection of standards that are oblivious to the hardware platform or software application that implements them and, in a manner of speaking, devalues the underlying infrastructure by providing universal communication. Which is why organizations like Microsoft or AOL fought so hard against internet adoption, lost, and were forced to adapt.
Hardware standards aren’t altruistic, they’re driven by business requirements. For manufacturers there are significant gains in economies of scale that can significantly reduce costs and broaden their potential market, with those gains often outweighing the cost/expense of maintaining proprietary technology or the risk of losing a controlled market. It’s ultimately about profit. Apple’s move to Intel is an ideal example of that. The more disparate a group of existing technologies, the more likely a standard will evolve or become established.
Sometimes the standard is mutually agreed upon, often through standards bodies. Sometimes the standard is determined by the market in a case of survival of the fittest (not necessarily survival of the best). Before hardware or software needed to be deemed “PC compatible” to find a successful market, it needed to be “Apple compatible” at a time when the Apple ][ was the leading computer platform. It’s also interesting to note when looking at IBM and Apple that both platforms soared in popularity and acceptance due to the availability of clones or third-party hardware, and both manufacturers failed miserably when they attempted to regain control of their market by implementing proprietary technologies, which the market rejected. Unofficial standards are born.
There was also a time when printers needed to be “Epson-compatible” if they were to be taken seriously. Why? Epson established itself early as a market leader, and software was written specifically for its printers. It was much easier for printer manufacturers to gain acceptance by adopting the Epson control set than by trying to convince software vendors to keep revising their software for different printers. Of course, Windows changed all that by unifying the printer interface and allowing printer manufacturers the freedom to provide their own drivers – a standard of a different kind.
Software is a different kettle of fish. What defines a software standard? The platform it runs on? The document formats it supports? Microsoft Windows and Office are two examples of products that are proprietary and lock customers in to Microsoft’s platform, but at the same time they also give users access to the widest range of hardware and software options that can scale from home users to global conglomerate requirements. So Win/Office is proprietary and anathema to standardization at the same time that it likely encourages standards adoption.
I think the whole point is that the market and the industry would eventually have converged on standards-based computing. It was inevitable. Sure, certain events were key and influenced it, but in the absence of those events others would have stepped in.
And even aside from that, concepts like the internet or open-OSes like linux or BSD were equally inevitable, if not more so, since they reflect the ultimate example of users adopting and embracing standardization even in the face of established, big-business corporate-lockin opposition.
Just my 2c.
How about we just focus on the present, k?
1. Learn math
2. Learn history
3. Use common sense
4. Write about what you know, and don’t try to state your ideology in a “What if?” scenario that doesn’t use the first 3 suggestions/rules
5. Don’t use drugs!
6. Don’t quit your day job to become a fiction writer
I don’t know but from wiki:
http://en.wikipedia.org/wiki/ENIAC
“It was unveiled on February 14, 1946 at the University of Pennsylvania, having cost almost $500,000”
http://en.wikipedia.org/wiki/Computer
“the decimal-based American ENIAC (1946) — which was the first general purpose electronic computer”
So I believe 2006-1946==60. Not every computer is a PC.
P.S. This article is fictional; I don’t see any facts that prove the author’s claims.
Wikipedia isn’t 100% correct on things: that’s why it can be edited by many people, which leads to….wikipedia isn’t 100% correct on things!
There were definitely computers before ENIAC, though I don’t care to check all the gory details, since I’m not writing an article that hinges on that detail. But yes, if there are going to be time references in an article that’s supposedly set in the future, it’s good to get the timeline written down and lock down which year the article exists in.
sobkas, the author is projecting what his ideology says would have happened, without taking off his blinders to how things work in business.
It’s a facetious article demonstrating what might have happened if end-to-end vendors like IBM, SGI, Amiga, Sun, Apple, and the other industry dinosaurs had managed to lock out commodity vendors. If Wintel hadn’t broken their backs, a similar scenario is what we’d have to deal with today.
I think the internet and email scenarios may or may not have worked out, but hardware would definitely be behind where it is now. Look at Apple and graphics cards. Even now, they scrape the bottom of the bucket with video cards. Apple doesn’t push standards – their most innovative hardware accomplishments in the last 10 years were dropping the floppy and antiquated connections. They push cosmetics. They are just typical of the closed vendors.
Look at HDDs and how competition has driven down the cost of upgrades. Seagate vs WD vs Maxtor (oops, Seagate now) vs Hitachi (IBM couldn’t keep up) vs Samsung creates forward progress. nVidia vs 3Dfx vs Matrox, and now nVidia vs ATi, provide progress in graphics. Now that AMD has given Intel a serious run for the last two years, one more block has fallen and we have another open area.
“I think the internet and email scenarios may or may not have worked out, but hardware would definitely be behind where it is now.”
I don’t really see the evidence for that.
Look at the massive progress that was made in the early days of the home computer, when Commodore, Apple, Atari, etc. were competing with each other. Would you really say that there was less progress in hardware between 1980 and 1985, than there was between 2000 and 2005?
Personally I think the jump from 8-bit computers like the C64 and Apple II to computers like the Amiga and Mac is more impressive than the multicore CPUs, faster graphics cards and other refinements we’ve seen over the last few years.
Apple may not have produced many hardware innovations recently, but I don’t think that was always the case. Look at the innovations by other “end-to-end vendors”, for example Commodore who pioneered multimedia, or Acorn who created the first RISC CPU home computer. At the time the PC couldn’t compete when it came to graphics and sound capabilities, and even in the 90s the Wintel world was still copying their innovations.
Or look at games consoles and arcade hardware, they are produced by closed vendors, yet they have generally been close to the cutting edge. The competition between companies like Sega and Nintendo was enough to bring down prices and push the development of new technology.
As long as there’s competition I think progress of computer hardware is assured, regardless of whether it’s created by closed or commodity vendors.
I think the article talks about a two-punch combo, where the software lock-in we see these days with MS Office helps companies build so-called silos around their products.
Therefore the only competition will be about having new customers select your silo over that of a competitor.
OK, there could be a company like Microsoft that produces an office suite or similar that uses the same file format across multiple platforms.
But stuff like that can be controlled by proprietary OS libs that need a kind of “one platform only” contract before you get the docs.
There are all kinds of dirty tricks one can pull. I keep referring back to an example from the industrial revolution, where different screw producers used different thread spacings and so on for their products so as to lock the customer to them.
Still, I think the “article” is a worst-case scenario. And I believe I read a similar one over on AnandTech, or maybe it was ExtremeTech, some time ago…
“Therefore the only competition will be about having new customers select your silo over that of a competitor.”
Maybe I’m being dense, but I still don’t see how that would stop the progress of technology. If you bought a C64 rather than a ZX Spectrum, an Amiga rather than a Mac, or a Sega rather than a Nintendo, you were locked in to a certain extent. If you decided to switch to another platform you had to purchase all new software, deal with file format issues, and probably replace most of your peripheral hardware too.
Yet there was a huge amount of innovation and progress due to the competition between those platforms. Even in the author’s worst case scenario, I don’t see why that would change.
“There are all kinds of dirty tricks one can pull. I keep referring back to an example from the industrial revolution, where different screw producers used different thread spacings and so on for their products so as to lock the customer to them.”
And despite that, the technological progress during that time was so great that today it’s considered revolutionary.
I’m not arguing that “lock in” like that is a good thing, just that it wouldn’t create the kind of technological stagnation that the author of the article describes. The only thing I can see doing that is a lack of competition caused by the industry being dominated by one company.
It wouldn’t be a dead stop, but it would slow down to a crawl compared to how it is today.
The reason I say this is that I live in Norway, and here I see two mobile phone providers that have more or less split the market between them. OK, so there are a lot of virtual providers, but those rent capacity from the two big ones.
Basically, the introduction of new tech is more or less a crawl. They have barely gotten UMTS working in the cities as we now hear talk about HSDPA (or whatever the name is).
“And despite that, the technological progress during that time was so great that today it’s considered revolutionary.”
But if you look into it, the development happened on a case-by-case basis. It was only when Ford was able to create a factory that ate coal and iron ore at one end and spat out finished cars at the other that things got really interesting.
Before that it was one-shot projects that required a strong personality at the helm who could stand up to the company representatives and demand things.
Without that, you were at their mercy.
Sure, it was revolutionary compared to what came before, but compared to today it was a crawl…
The Evolution of Business Models in the PC Market
We are now able to buy and build superfast systems for a fraction of the cost of the systems of yesteryear.
This growth and penetration is unparalled in the history of industrial products in the last 100 years, and is an amazing success. However, to get to this stage, the industry had to make its way through some issues and decision points. There are generally agreed to have been key turning points. What would have happened if they had gone differently?
a mac on every desktop.
wha? That was the most pointless thing I have ever read.
I wrote the piece to entertain, amuse, provoke. But also with a serious thought in mind. I had written another proper analytical piece about the same subject, which Thom may publish at some point if he feels it’s appropriate, and an impish sense of humour got the better of me, so off it went.
The serious thought is this. In several quarters, the argument is made that Dell’s recent problems, and the consolidation represented by HP-Compaq, show that the current industry model is failing. People argue in particular that Apple and Sun succeed in being more consistently profitable, avoiding competing purely on price, and deliver better quality products, because they control hardware and software and deliver a complete solution, and (not really true of Solaris of course) only let their OS run on the hardware of their choice, not yours.
Now it is obvious that in computing, at least in PCs, the so-called ‘end to end model’ has become trivially unimportant in terms of share. What I tried to ask, and provoke thought on, was: what would have had to happen for the reverse situation to hold? How could the ‘end to end model’ have gained and kept a 97% market share? And what would a world in which that had occurred look like?
And would we like it better?
That’s the real question. It is to some extent a live question too, because there are voices – and I would say not voices on the side of intellectual freedom – who genuinely would like to see an ‘end to end model’ enforced, by varieties of DRM and locking.
Anyway, that was the genesis, thank you all for reading and reacting, and sorry you didn’t like it.
Why didn’t you just write it in plain English, like you just did? You know, say what you mean.
I understand why this item was written. I myself have been a big fan of science fiction for many years, with a particular fascination for the alternate universe thing. But if you are going to write articles like this, then they should be more plausible.
Your fictional premise that IBM is able to protect their PC architecture and eliminate clone manufacturers from becoming competitors is interesting. But I respond to this by saying that if this would have been the case, then companies like Atari and Commodore would probably still be around today, and Apple would have a larger market share. So it wouldn’t be an all-IBM (or almost all) world as posited by any means. There might have been some of the issues regarding software compatibility across disparate platforms as you suggest, but there would still have been a lot of advanced software developed for the desktop systems that would be out there. And you also forgot about Bill Gates. The first software he developed and sold was a BASIC interpreter that ran on S-100 bus machines. He would certainly have developed and sold software for any dominant system architecture.
As for the Internet, the birth and development of the technology behind this had nothing to do with the PC. The Internet actually descends from a DARPA project that companies like Honeywell and BBN worked on to develop a world-wide computer network for military command and control. This network originally ran only on mainframes and dedicated communications front-end processors. It was only later with the proliferation of PCs that effort was undertaken to develop a means to interconnect these devices in a similar network, which is really the starting point of the Internet as we know it today. This could have been done with Apple, Atari or Commodore systems just as easily as with IBM PCs or clones thereof. There are no aspects of the protocols used on the Internet today that can be restricted by commercial interests, since they are all based on public RFCs.
Finally, regarding the comment about Dell and “the business model failing”: wrong, big time. Dell’s problems (if you can even call them that, since they are still an incredibly profitable company) are based on arrogance, greed and complacency. They even stated themselves that they had let their price points get too high, thus opening the door to other manufacturers in the desktop and commodity server space to gain market share with lower prices. And where is Dell hurting the most? In the server space. Dell has focused primarily on smaller, general-purpose servers that are used mainly for supporting MS Exchange, Web servers and the like. I have many customers that like to rack and stack Dell PE2850s for just this purpose. But such servers are easy to manufacture, as they use commodity parts, so a lot of other companies are getting into this business, with lower prices than Dell in some cases. But where Dell really falls down is in the large enterprise server space that supports large-volume transaction processing and large databases. This is where Sun, IBM and HP do well, and Dell has a long way to go before they can be a player here. But they are still king of the desktop, and I see no indications of this changing anytime soon. As for the mention of Compaq, they, too, got too complacent and let their cost structure grow to unmanageable levels. They also thought they could move in a big way into services and solutions selling, which did not at all fit their business model. And, of course, their emphasis on using the channel to sell commodity products instead of selling direct like Dell did not help.
In the para starting ‘As for the Internet…’ you are basically agreeing to a large extent.
Here is how SiloWorld differs from today:
1) You can only buy a computer to run an OS from the OS developer.
2) The market is therefore fragmented, there is less hardware competition, which leads to higher prices, lower production runs, higher costs, slower functionality increases.
3) The prevalence of incompatible islands and higher prices lowers consumer demand. The Internet exists as a protocol, but because volumes are lower, the applications hosted on it (shopping, searching and so on) are far fewer and less profitable, if they exist at all. Amazon may not exist. Would the dot-com bubble even have happened?
4) Fewer installed PCs means smaller network markets and broadband that is scarcer and more expensive, if indeed broadband in today’s form exists at all.
5) The network islands that existed before continue; the Internet is not today’s universal, application-rich and content-filled space. You still have CompuServe, the old AOL, eWorld, Prodigy. Because there are five or six incompatible OSs, you don’t have total freedom to choose: not all OSs are compatible with all BBSs. OnTyme probably still exists. Sending email from one BBS to another is iffy.
6) The application software space is fragmented by the incompatible OSs and the smaller PC market, so prices are higher and you cannot always get the combinations of software you want, which in turn diminishes the attractiveness of the PC as a consumer purchase.
And so, looking down the shopping mall, you don’t find stores selling software, or the local PC shop; you find the Apple dealer, the Commodore dealer, and so on.
There is no reason why Gutenberg could not exist. The problem is that too few people have PCs, scanners cost too much, and it’s just not plausible as a method of distributing books. Do digital cameras exist? Yes, as professional tools, and probably tied to the OS + hardware vendor you picked. Digital music? Five different clients for the iTunes store? Five different OSs for the player to interface with? Where do the microprocessors get the volume production runs needed to get their prices down? Think how long it took to bring CDs to market.
I do understand and accept that people found the tone of the article irritating. But on the substantive point, I think SiloWorld would be very different indeed from today in most respects: a lot less fun, and a lot less free. And most people are not thinking specifically enough to see that.
“1) You can only buy a computer to run an OS from the OS developer.
2) The market is therefore fragmented, there is less hardware competition, which leads to higher prices, lower production runs, higher costs, slower functionality increases.”
What I don’t understand at all is why that fragmentation would mean less hardware/software competition and less progress.
The early home computer market was heavily fragmented, with dozens of companies making incompatible computers, each with their own OS, peripheral connections, and unique software. Would you really say that there was less progress then than there is now?
To me the switch from 8bit computers to 16bit systems running multitasking GUIs was a massive leap forward, and it happened well before IBM PC clones came to dominate the industry.
You make a point about the lack of volume production causing higher prices, but that assumes that every company would make their own unique hardware. That certainly wasn’t the case in the 80s, look at how many 8bit computers from different companies used either the Zilog Z80 or MOS 6502, or how many 16/32bit computers used a Motorola 68k. Companies like Acorn who designed their own CPUs were in a small minority, and generally peripherals like 5.25″/3.5″ floppy drives and printers were manufactured by 3rd parties and had a degree of standardisation. Why would that have changed due to the lack of PC clones?
While the fragmentation and incompatibilities caused problems it certainly didn’t kill demand. The C64 is still the biggest selling computer of all time, even though you had to replace all your software and many of your peripherals if you switched to another computer.
Of course, even back then there were cross-platform standards, and adapters that allowed you to use hardware designed for different systems. I had no trouble using a minority platform (RISC OS) in the early 90s, despite most of the people I worked with using Mac, Amiga, or Windows systems. There were utilities to read differently formatted disks and converters for popular file formats. I see no reason why the development of that kind of product would have stopped in the future, so I find your “SiloWorld” a rather unrealistic worst case scenario.
When it comes to the Internet/web, I don’t see why having a variety of incompatible platforms would change things significantly. In my opinion it was the popularity of the web, not the standardisation on one platform, that spelled the end for those “network islands”.
Consider the fact that the first Web browser was created for NeXT hardware and software, while NCSA Mosaic was written for the X Window System and then quickly ported to Mac OS, Amiga OS and Windows; web browsers for RISC OS, OS/2, the Atari ST, and various other platforms followed quickly. At the time the market was quite fragmented in the real world, yet the web quickly spread to all those incompatible hardware/software platforms. I just don’t see why it would be any different in your alt-universe.
I know I’m repeating myself here, but I don’t think anyone has really managed to address these points and I’d like to understand your reasoning.
Anyone remember the TV program Sliders? This could be an episode from it.
<…>lipped about how exactly this is to be accompished, do hint…
accompished->accomplished
This is the result of Bad Drugs.
I’m sorry to say this, but this article has more holes than Swiss cheese. The idea that a monopoly OS running on a sub-standard commodity platform created all the innovation of the last ~15 years, including the huge adoption of the Internet, is flat-out absurd.
The computers by Apple, Commodore, and Amiga were all ahead of the IBM PC in sophistication. Why? Because they were competing with each other! The competition had nothing to do with clone makers, price wars, and semi-standardized platforms and had everything to do with the fact that if someone compared Company A’s platform with Company B’s platform and Company B’s was more powerful/easier to use, Company B won the sale.
The Web was invented on a NeXT machine. It and the other Internet services of the early-to-mid 90’s were largely powered by Unix software running on a variety of hardware systems, and the clients were a wide range of systems including Macs. However, most consumer OSes had to have Internet access software added on by third-parties. Windows, for instance, DIDN’T EVEN HAVE a TCP/IP stack until Windows 95! (I’m talking consumer-land here, not NT.)
All the major Internet protocols were invented from a Unix standpoint, and Unix has always been at home on a variety of hardware platforms. There’s little reason to believe that only by having consumer PCs based on a single semi-standard could the Internet flourish. The first major Web browser, Mosaic, was quite Mac-oriented, in fact. So was Netscape. Windows for some time was considered rather a joke in terms of Internet support, and until IE 3, any Microsoft-specific Internet software was highly dubious in its impact.
I believe the late 90’s dominance of IBM clone PCs, Windows, and the fall of alternatives actually stifled innovation. Only in the 2000’s with Linux, Mac OS X, open source Web browsers, and open source server technologies like LAMP is the Internet really on an innovation track again.