Fans of just about anything alternative all seem to suffer from a similar affliction: a naïve underestimation of the pains of switching. This goes for U.S. fans of the metric system, alternative fuel proponents, vegetarians, and yes, OS fanatics. Now, personally I’m all for a lot of those things I just mentioned, but as a lapsed vegetarian, I know full well how, despite the advantages of the alternative, sometimes it’s hard to switch and easy to go back.
Rather than try to mention all the reasons why it’s hard for an alternative OS to make it in the marketplace, I’d like to focus on one of the main obstacles to embracing an emerging platform: vendor lock-in. In essence, in the software world, vendor lock-in is when a customer is dependent on a particular vendor’s product, and the costs of switching away from that product are prohibitively high. The cost of switching may be kept high by various means, but most of them center on a lack of compatibility or interoperability, often intentional.
Data Formats
One of the classic tricks in the software industry is to keep a lid on data formats to promote lock-in. Customers will happily input and import data into a system, but find that exporting the data back out is very difficult, if not impossible.
In desktop applications, companies do this with document file formats. Users create a library of files that are readable only by the vendor’s application, resulting in platform lock-in. There’s another twist on file formats too: if a vendor changes its formats so they are not backward compatible with older versions of the software, users who work in industries that share a lot of files may be forced to upgrade, because once people start sending them files based on the new version, they will need the upgrade just to read them.
This tactic rears its head in the OS marketplace when often-exchanged files such as documents and media files are saved in proprietary formats that are not readable on an alternative platform. Over time, ingenuity often wins out, and Mac and Linux machines can now read most files created by Windows applications, but it’s a constant struggle.
Thankfully, most customers will not stand for such user-unfriendly tactics, so most software today offers some mechanism for exporting to interoperable formats, though in most cases you lose some of the special software features in the export.
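To make that lossiness concrete, here is a purely hypothetical sketch (the document model and field names are invented for illustration): an application’s native file knows about styles and formulas, but a CSV export can carry only the plain values.

```python
import csv
import io

# Hypothetical in-memory document: each cell carries vendor-specific
# extras (styling, formulas) on top of its plain value.
document = [
    {"value": "Q1 Revenue", "style": "bold", "formula": None},
    {"value": 1200, "style": None, "formula": "=SUM(B2:B4)"},
]

def export_csv(cells):
    """Export to an interoperable format: only the plain values survive."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for cell in cells:
        # Styles and formulas have no CSV equivalent,
        # so they are silently dropped on the way out.
        writer.writerow([cell["value"]])
    return buf.getvalue()

print(export_csv(document))
```

Round-tripping through such an export discards the vendor-specific features, which is exactly why customers who depend on those features stay locked in even when an export option exists.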
Those special features are a corollary to the data-format method of lock-in. A perfectly legitimate form of “soft” lock-in is for an application vendor to constantly introduce new functionality that is reflected in the saved files. Old versions of the software and competing applications will not be able to read the data because they don’t offer the functionality. In this case, users are locked in if they use those features. Microsoft and Adobe are especially adept at this method, and it can be good for users, because the software gets more and more feature-rich. Of course, a lot of this functionality ends up being gratuitous, resulting in merely a more bloated install and a more complicated user interface. Application vendors walk a fine line when attempting this “soft” lock-in.
APIs
A European Commission report on Microsoft’s business practices quotes a Microsoft executive in an internal memo, cited in Wikipedia: “The Windows API is so broad, so deep, and so functional that most ISVs would be crazy not to use it. And it is so deeply embedded in the source code of many Windows apps that there is a huge switching cost to using a different operating system instead.” In other words, it’s easy to write software that’s intimately tied to the Windows OS, and takes more effort to write more-portable software; therefore, much of the software out there would require extensive rewriting to work on another platform.
And this does not just apply to off-the-shelf software that you might want to use. Most of the software written in this world is never sold to anyone. It’s written for in-house consumption. So if a particular company has written some custom software (and they have the source code and the skills necessary to port it) they may have relied so heavily on the Windows APIs while writing it that it would be prohibitively expensive to port it. And in-house software is more likely to have been written using this quick-and-dirty method, precisely because it’s meant for in-house use.
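One standard defense against this kind of entanglement is to quarantine platform-specific calls behind a thin abstraction layer, so that a port means rewriting one small module rather than the whole codebase. Here is a minimal sketch in Python (the function names and paths are invented for illustration):

```python
import os
import sys

# All platform-specific knowledge lives in these two private helpers.
def _config_dir_windows():
    # APPDATA is the conventional per-user data location on Windows.
    return os.path.join(os.environ.get("APPDATA", ""), "MyApp")

def _config_dir_posix():
    # ~/.config is the conventional per-user data location on Unix-likes.
    return os.path.join(os.path.expanduser("~"), ".config", "myapp")

def user_config_dir():
    """Neutral API: the rest of the application calls only this,
    and never touches Win32 or POSIX details directly."""
    if sys.platform.startswith("win"):
        return _config_dir_windows()
    return _config_dir_posix()

print(user_config_dir())
```

Code written this way still runs best on its home platform, but the switching cost is confined to a couple of helpers instead of being smeared across every source file.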
Of course, offering great developer tools and a deep and wide API isn’t evil. I’d venture to say that the developers for every platform want the best tools and the most developer-friendly hooks possible. It makes their job easier. But it also takes a massive expenditure of resources on the part of the OS vendor to make that possible. If a great API weren’t an effective anti-competitive tool, I wonder if Microsoft would have made it such a priority.
Embrace, Extend, Extinguish
Lock-in is a very old problem; it’s as old as commercial software, and decades ago people realized that one way for users to combat it was to demand and embrace open standards. So software companies “embraced” open standards. Some of these standards have been very successful, such as those the Internet was built on, like TCP. Others, like SQL and HTML, have thrived, but have had various vendor extensions added that have threatened to balkanize the standards.
Again, Microsoft is notorious for its practices in this regard. Its attempt at hijacking Java might have been successful if not for Sun’s deep pockets and willingness to fight in court. It has also been accused of trying to appropriate JavaScript, Kerberos, BASIC, LDAP, HTML, and many others.
The most egregious example of pressing an advantage with HTML, however, wasn’t Microsoft; it was Netscape. Netscape, having created what was indisputably the best web browser of its time, had designs on controlling the direction of the World Wide Web. It extended the agreed-upon HTML standards, adding scores of new tags and popularizing tools like JavaScript to make web pages more interactive, but in doing so it undermined the standards and forced other browser makers to play catch-up. Web developers rejoiced, because they widely believed that standard HTML was too constrictive of their creativity. But Netscape wasn’t even an OS vendor, right? Not so fast. It actually had visions of the browser becoming an application deployment platform, which would have disintermediated Microsoft, so the impetus to control the standards was there. Ultimately, Netscape’s threat to Microsoft was its undoing, and Microsoft appropriated Netscape’s role as hijacker of the HTML standard, adding the DOM and ActiveX to the mix.
Even Open Source software uses the Embrace, Extend, Extinguish model. What is Linux if not an embrace of all the positive aspects of Unix, including many of the familiar utilities, along with many useful extensions? And it’s protected by a strict license that prevents the remnants of the commercial Unix vendors from taking Linux’s innovations and rolling them back in. The consumer is better off in many ways, as most observers would admit that Linux has achieved a level of widespread acceptance, utility, and, yes, even uniformity that Unix never managed in its many pre-Linux years of existence. And though being free (gratis) was a major factor in its popularity, its low cost relative to the generally very expensive proprietary Unix hardware-software solutions it replaced places it firmly in the Embrace, Extend, Extinguish pantheon, since undercutting a competitor’s price is one of the time-honored tactics of the practice. (See: Internet Explorer.)
So while E-E-E isn’t lock-in per se, its result is a de facto lock-in: as one piece of software uses an established standard as a wedge into a market, if it can gain users and bury competitors using E-E-E, it will eventually reign supreme, leaving users with no alternative. Often, the E-E-E tactic is followed with our next lock-in method, as the dominant player exploits its position by ensuring that the major third-party applications for its platform work only on that platform. Windows-only, Oracle-only, Photoshop-only, and yes, even Linux-only, applications and extensions are all too common.
Anticompetitive Partnering
The software world is a symbiotic mishmash of different software with varying levels of interoperability and dependency. Nothing exists in solitude. A PC with no software is a doorstop. A kernel by itself is worthless, an operating system with no applications is useless, but applications won’t run without an OS, and extensions and plug-ins won’t run without an app. Especially in the enterprise computing space, a particular application will generally be supported only for a particular mix of necessary software: a particular OS (and even a particular version, which might be an older one), a particular database, a particular application server, a particular webserver, a particular ERP system, a particular accounting system, and so on.
It’s difficult and costly to code, test, and support your software against all the various available versions of the above types of software, so companies typically pick one or two and focus on them. Companies that achieve dominance therefore often find it easy to hold on to that dominance, at least in the short term, because the major application vendors work only with them. And these relationships are explicit, often backed by contracts, co-marketing relationships, revenue sharing, and bundling.
Sometimes, it’s not even market dominance that’s the impetus for anticompetitive partnering, but mere shrewdness. IBM could easily have supported a variety of operating systems for its new PC, or picked a better-backed one, or written one itself, but it picked a second-hand one proffered by a young college dropout, and the rest is history.
Application Availability
Very closely related to the previous example is the obvious case of Microsoft’s Windows OS and the thousands of applications that run on Windows and nothing else. It’s different from anticompetitive partnering only in that there is no formalized relationship beyond the developer-vendor one. Most developers go Windows-only because it’s cheapest and easiest to focus on the dominant player. But the practice is by no means confined to Microsoft and OSes. In fact, it’s one of the most widespread practices in the industry. Any maker of software that encourages other developers to build dependent applications is hoping to lock in some customers that way.
Peripheral Availability and Drivers
One of the major challenges an emerging operating system faces is supporting the various peripherals and accessories a user might have. Virtually every peripheral in existence supports Windows, and though some commoditized peripherals, like keyboards, might be easily supported on an alternative platform, others, like video cards and printers, can be very difficult, especially when they contain special features that require proprietary information from the vendor. Even widely used platforms like Linux and Mac OS suffer from this problem, to say nothing of marginal platforms like SkyOS or AROS.
There is very little that an emerging platform can do about this. Solutions include: 1) begging manufacturers to write drivers for your platform; 2) trying to write them yourself, through various methods of hacker heroism and brute force; 3) focusing on a small subset of peripherals, perhaps even manufacturing them yourself, and advising your users to use only those.
Some non-OS platforms have this sort of lock-in. Specialized applications in scientific, musical, or other niche fields have hardware devices that work exclusively with particular software.
Hardware Compatibility
A closely-related method of lock-in is also probably the oldest. The earliest computers were all hardware-software combos. In fact, as many people have pointed out, the “sharing culture” that was prevalent in the software world before software was widely commercialized emerged because hardware vendors made their money from hardware, and the software was seen as almost incidental. But hardware vendors locked their users into their software because you needed their software to run their hardware. It wasn’t until IBM created the “open” PC platform that it became feasible to write very low-level software for a platform without the say-so from the hardware vendor.
Apple is of course the most notorious practitioner of this method in today’s consumer computing world. If you want to run the Mac OS, you buy a computer from Apple. Even their imminent move to the x86 platform won’t change that.
Look and Feel
As people become accustomed to the way a particular tool works, that familiarity can sometimes act as a disincentive to switching. For example, I’m used to driving on the right side of the road. When I went to South Africa a few years ago, I had a fun time driving on the left. I was even able to become accustomed to shifting with my left hand. Luckily, the clutch was still on the left, or I would have been in big trouble. For some reason, though, in the car I was driving, the turn signal lever was on the “wrong” side, so whenever I would make a turn, I’d end up turning the windshield wipers on. Even after I was well-accustomed to driving on the “wrong” side of the road, and using the “wrong” hand to shift with, I was never able to kick the habit of using the windshield wiper lever to signal turns. It was just too ingrained.
Similarly, people have become accustomed to the way their computer’s user interface looks and works, and switching to another platform can be frustrating. For Mac users trying Windows, and vice versa, the first days can be a challenge, despite the platforms’ similarities. Microsoft knew this. When the first versions of Windows came out as a blatant copy of the Mac’s UI, Apple sued, unsuccessfully, setting a precedent for the unprotectability of look and feel. As a consequence, many tools for Linux adopt the look and feel of Windows, or of common Windows applications. Many true believers and UI snobs decry this, but there’s a sound reason behind it: lowering the barrier to entry.
Hosted Applications
There’s an interesting trend in the computing world that serves to undermine vendor lock-in, but also enables some scary new forms of potential lock-in that make all these other ones look like child’s play. The internet opened up the possibility of applications that run on a server far away but can be accessed by a user over the internet, through a standard web browser. As I mentioned earlier, Netscape saw this new reality as a way of lessening the importance of the operating system for much of everyday computing, and so did Microsoft. If we look at today’s world, much of what a computer user would have used a standalone application for just a few years ago is now routinely, or even exclusively done online: looking up a word in a dictionary or thesaurus, finding a map, reading an encyclopedia, browsing a magazine’s archive, even everyday email use, managing photos, keeping a journal, updating an address book, scheduling.
In fact, almost all of what people regularly use a computer for can be done today online with hosted applications, even word processing and photo retouching. Some online applications, like Salesforce.com, have become fantastically successful commercially, with companies paying relatively hefty per-seat fees, because they’re so good, without all the management hassles of traditional software.
The scary part is that with the vendor having absolute and total control over the application, its features, and even your access to it, they’re in a position to keep you locked in like never before. They even have custody of all of your data in most cases, and in some cases, they have the only copy. Only time will tell what impact this trend will have on the individuals and businesses that use these services.
Connection to Proprietary Services
In some cases, software, and even hardware, is usable only when connected to a vendor’s proprietary service. A very widespread example of this is mobile phone carriers, who sell hardware and software that, for the most part, are only good for their service. If you stop paying, your hardware and software become inoperable.
But this is the case for other widely-used software, such as, increasingly, games. Most of today’s hottest games are primarily internet-based, and you play by interacting with other users over the internet. The industry has been slowly shifting from that multi-player use being a free value-add to being the only method of playing, and the connection to the online world of the game, with a monthly fee, is necessary for the game to be played at all.
In most of these cases, the various components, such as the software and the service (and sometimes the hardware) are seen as inseparable parts of a complete package, and they are. But they are also an example of the most successful and complete lock-in.
Ways of Combating Lock-in
So how can customers combat the various forms of lock-in? Well, they’ve been trying, with various degrees of success, for decades. The fact is, there are some forms of lock-in that are especially injurious to the user, and others that aren’t. Some of them, such as the gaming example, are really quite useful the way they are, are straightforward, and therefore don’t call for any kind of active resistance. If you’re not interested in paying the monthly fee, then buy a different kind of game. Similarly, if you detect lock-in that you feel might set you up for onerous terms in the future, then you can vote by not buying or using those products.
However, there are many forms of lock-in that are not so avoidable, and they can, and perhaps should, be resisted. Various tactics have been tried over the years, and some have worked. Building isolation layers or emulation in software that breaks down artificial barriers is a high-tech solution with promise and a storied history. Promoting and defending open standards with teeth has had success, but only when the standards are resistant to Embrace, Extend, Extinguish. Promoting platform-independent development environments like Java and CGI-based internet applications has worked well, and attempts at co-opting popular closed platforms, APIs, formats, or standards (the Mono project, or the reverse-engineered Word and Photoshop file formats) are another approach.
The software industry is fairly unique in the amount of control a vendor can exert over users throughout their experience with the product. Manufacturers of other products can only dream of such control. But users can, and should, push back when necessary, lest the industry run roughshod over them. We can’t forget that software companies exist first and foremost to make money now and to set themselves up to make more money later. Even the free software movement has a larger goal of influence and market control in which you are a mere pawn. Recognizing the various methods of lock-in is the first step toward being a conscientious advocate for your own consumer software rights and forcing software makers to be user-centered and look out for your long-term interests.
Very good article. So why can’t we switch to alternative fuel proponents, beginning with hybrid technology today? If the estimations are right, we have to do it eventually within the next 30 years or so. And already now, oil prices are climbing daily, and that won’t change…
But being “alternative” is truly not easy. I noticed this the hardest when trying to use the Linux desktop full-time. Eventually switched back…
The metric system is not alternative it is the standard. Everybody uses it including USA’s organizations like NASA. Even in UK it is almost uniformly used as I have been told by UK citizens I know.
The metric system is certainly an “alternative” here in the US, unfortunately. And frankly, because of the cost of switching, I wonder when we’ll ever be able to adopt it. And as for fuel goes, hybrid is an interesting way around lock-in, because it’s not really alternative fuel since it still uses the existing gas infrastructure, but it’s the gateway for bringing electric car technology into the mainstream. Can you think of an analogy to hybrid technology in the software industry?
“but it’s the gateway for bringing electric car technology into the mainstream”
open source cross platform software like OO.org is the gateway for bringing open source technology into the mainstream….
well, i tried…
i switched to using Linux exclusively on the desktop several years ago, it was a little bumpy at first but its all mostly smooth sailing now, occasionally i have to RTFM but i remember even Windoze has more than its share of problems too…
no OS is perfect, just depends on what hoops you are willing to go thru for making your OS work for you, with Linux there is slightly more of a learning curve (but even that is almost gone now with the more user friendly distros) or you can accept an over-priced OS with draconian EULAs and big brother product activation and call home to the mother ship – and get a marginally more user friendly OS that is more fragile and more prone to infection to malware & viruses…
i rather use the one that just requires more learning and user knowledge than the latter…
Vendor lock-in is fine until vendor bites customer on a$$. Always have done, always will – look at .NET:
Microsoft released some (but not all) parts of .NET to the open source community, and are STILL trying to push .NET 1.0 even though they admit both that it has failed and that .NET 2.0 will be incompatible with .NET 1.0.
When that happens, it becomes a turning point – IBM-compatibles killed off all microcomputers but Apple, and UNIX adoption slowed because of the UNIX wars, whilst before that people switched to VAXen running UNIX instead of VMS because DEC had dropped the PDP-10 and all but one proprietary OS for the PDP-11.
(VAX hardware was originally VAX-11, the Virtual Address eXtension to the PDP-11, with compatibility mode for the older software. However, later they dropped compatibility mode, and from the start VMS was meant to look like RSX-11, leaving users of RT-11 et al. out in the cold.)
Just when I decide to switch full-time to Fedora. I just need to get all my ISO’s down and onto CD’s. Anyway, I’m going ahead with it.
Here’s hoping for the best.
I think with sufficient cause, one can easily switch to a new platform and leave the old one behind–particularly if the old one has a rude community and continually fails, or is led down wrong paths over and over again by either liars, backstabbers, incompetent individuals, or people who simply demand that the OS or its features are going to be on the side of illogical, ill-conceived, poorly-thought-out ones that ignore what the end-users want. (Not that I’m thinking of any particular platform, mind you :-)).
Too many power plays and not enough common sense and decency. And the vendor thing I can readily see (but then, eventually they see the light of truth when people get fed up and leave).
–EyeAm
64-Bit SuSE and SuSE alone for me! SuuuuuuuuuuSE. Novell is beautiful and listens to user concerns (even if it’s the exokernel idea).
While I support and prefer the metric system, the one annoying flaw in its popular adoption that almost no one realizes is using mass as weight/force.
If the US does switch, then I hope we do it the right way, using Newtons instead of grams. In the imperial system, pounds is truly a unit of weight btw, and mass is measured in slugs.
.. for me this misuse of units is the equivalent of changing the semantics of “ls” for example..
no OS is perfect, just depends on what hoops you are willing to go thru for making your OS work for you, with Linux there is slightly more of a learning curve (but even that is almost gone now with the more user friendly distros) or you can accept an over-priced OS with draconian EULAs …
Apparently, you did not RTFA, or else you would realize that there are several reasons why many of us do not have that choice. And I guarantee you that user friendliness doesn’t really fit into the equation either. Even if I can use the OS without ever looking at a manual, if it doesn’t do what I need it to do, what good is it? On the other hand, if the OS requires a degree in rocket science but does a certain task that no other OS does, then it is certainly worth learning if that task is important to you.
If your computing needs basically boil down to simple, ‘bread and butter’ tasks (eg – browsing the web, listening to mp3s, etc), then you can pretty much take your pick of any OS you want to use. But once your needs begin to get even ‘semi-specialized’, your choices become much more limited. And those semi-specialized tasks don’t always and automatically lead to Windows either … sometimes they lead away from Windows. And sometimes the opposite is true.
I’ve been using Linux and assorted other FOSS software in some capacity or another since the late nineties, and, by all accounts, I am what could fairly be characterized as an enthusiastic supporter of FOSS “alternatives” with a decidedly optimistic view of the future possibilities for free and open computing platforms. That said, it is my opinion that altogether too many of my fellow Linux/FOSS advocates routinely fail to distinguish true analysis from advocacy, a fault shared in equal measure by true believers of all stripes and by sales and marketing folk, to their own detriment and possibly to the general detriment of Linux/FOSS as well.
This article offers an excellent presentation of the realities surrounding vendor lock-in and the high cost it potentially imposes upon anyone attempting to switch to a new technology or platform. Linux/FOSS advocates should read this carefully and thoughtfully, not because they’re unaware of the costs imposed by vendor lock-in strategies (freedom from lock-in is, after all, one of the traditional core “talking points” of FOSS advocacy) but because all too often they want to have it both ways: lock-in exists when pointing out the downsides of traditional closed-source proprietary software, but doesn’t exist when it comes time to tally up the costs of switching to an alternative.
MS’s recent “Get the Facts” campaign does, in fact, rely upon many dubious studies, riddled with the type of sampling errors and false inferences that would earn a failing grade in most introductory statistics courses, but there is more than a kernel of truth to be found within the FUD. MS has been very successful in creating lock-in, which means that in certain market segments it has indeed become prohibitively expensive or even practically impossible to switch to an alternate platform. That’s not FUD; that’s the truth. And an Enron-inspired accounting method, where the costs of escaping lock-in are placed solely in the cost column for proprietary solutions, isn’t going to convince many whose job it is to manage IT infrastructure.
If the US does switch, then I hope we do it the right way, using Newtons instead of grams.
While I appreciate the differences between weight (force) and mass, I fail to see the US system as being significantly better. While the “slug” is “on the books” as a unit of mass, most places use the “pound mass” for such things. In common usage, for measurement of materials and the like, the US system generally reverts to giving data in tons, pounds, and ounces, all of which are strictly units of force and thus measurements of weight, and would vary depending on the local gravitational field. Doing chemistry, cooking, engineering, etc. based on weights is just silly. Recipes should be done in terms of masses.
Granted, most commonly used devices to measure “mass” actually measure weight and rely on the relatively constant acceleration of gravity under most situations to derive the desired mass.
OSnews
Overly Scientific news anyone….
“Microsoft released some (but not all) parts of .NET to the open source community, and are STILL trying to push .NET 1.0 even though they admit both that it has failed and that .NET 2.0 will be incompatible with NET 1.0.”
.Net 1.0 was a failure? Since when? If it’s such a “failure”, why is the OSS realm working on Mono? Also, last I checked 1.x will be fully compatible with 2.0. There are some deprecated objects/members, but the compiler will graciously let you know what to do.
Re: This article. Very well written David, kudos.
Yes, it’s like my vegan frenemies, who think that switching away from cheap, red meat sources to their expensive soy-based, limited-availability products is merely a matter of choice.
And it’s protected by a strict license that prevents the remnants of the commercial Unix vendors from taking Linux’s innovations and rolling them back in.
While the GPL forbids the appropriation of source code for closed source products, it does not forbid corporate interests from using that source code if they play by the rules. If a vendor wishes feature compatibility without playing by the rules, they are free to implement those features without using GPLed code (or they can ask the copyright holder for the rights to used that code under a different license).
Needless to say, I didn’t see the point in continuing to read a misleading or misinformed article after that point.
while you’re right about that comment, it’s about the only one in the whole article that didn’t make sense…
still, the rest in there is almost old hat if you have been watching the IT sector for some years…
This is a very well written article. Thanks, if only all articles were like this.
On a side note, I read this one article on CNN that said vegans were actually healthier than the average person because they eat only natural food without chemicals or other additives. They had more/stronger bone mass than the average person.
I will concede that I oversimplified the GPL issue and that it would be a little misleading to someone who didn’t know anything about the GPL. Excuse me for giving OSNews readers a little credit.
For the remedial OSNews readers out there: The GPL prevents a commercial OS maker from taking GPL-licensed code from Linux and rolling it into their code unless the derivative work is then also licensed under the GPL. A commercial vendor can of course take utilities, applications, libraries, and other components that are GPL’d and distribute them with their commercial OS, but they can’t take open source code and fold it into their closed source code.
Unlike the BSD license, the GPL is very strict in ensuring that the code licensed under the GPL stays free. In fact, it is because of that strictness that many developers prefer to use the GPL. They don’t want some other person or company taking their code, adding to it (or not) and then selling it as closed software. They want others to be able to use or add to their code, but they want to have the right to take those additions back, and add upon them too.
MacTO, I fail to see what you found objectionable in my oversimplified characterization of the GPL’s role in keeping Unix vendors from appropriating Linux innovations, other than the fact that I didn’t write two paragraphs about the subtleties.
“And it’s protected by a strict license that prevents the remnants of the commercial Unix vendors from taking Linux’s innovations and rolling them back in.”
TOTAL LIE
you can implement whatever you want from linux. it’s protected by copyright using the GPL licence, not patents
proprietary versions can implement a clone of the linux kernel without paying anything to anyone and keeping their product proprietary
osnews is trollnews
the only objective of this site is to start flames to achieve high hit rates
just like sensation media
eugenia … i would like to get an adjective to describe what you are doing but i think it would mean the censorship of my comment
Was it the GPL comment that you objected to, or was it the whole idea that Linux did an Embrace, Extend, Extinguish on Unix? I can certainly see why some people would find that to be a surprising assertion, and since we generally feel like E-E-E is evil, the knee-jerk reaction from a Linux true believer would be to balk at such a suggestion. I was actually a little astounded myself when I thought that up, since I generally believe that the emergence of Linux was one of the best things that ever happened in the OS world.
But I’d be curious to hear your thoughts on that, in greater detail, rather than just a cheap shot like, “Needless to say, I didn’t see the point in continuing to read a misleading or misinformed article after that point.” It doesn’t really strengthen your position to admit that you can’t even bring yourself to read through something that you think you might disagree with.
“you can implement whatever you want from linux. it’s protected by copyright under the GPL licence, not by patents”
I think you’re misinterpreting what I meant by the word “innovations.” Anybody can copy any feature from any piece of software they want, GPL, commercial or whatever, unless some dumbass patent gets in the way (as you point out, correctly). But you can’t take GPL code from Linux and roll it into Solaris. Sure, you can clean-room the Linux kernel, but that’s not what I was talking about. Good grief! What was it about that comment that struck such a nerve?
The metric system is just an alternative for the public. All engineering work is done using the metric system; otherwise standard constants and equations would need to be changed. The use of the pound/mile/… is just part of the old British heritage, and is what’s given to the public. All the big signs on the road are spelled out in miles, but if you look by the side of all roads in the US, you’ll see the km marks, which is what’s actually used when engineering just about everything. The metric system is awesome. Try to convert from inches to feet to miles. Good luck! That’s not the way engineering is done in the US! All engineers use metric!
lulu
i will explain my anger:
because the free software movement is all about avoiding lock-in and giving freedom to everyone. where i live (in europe) the free software movement is also trying to fight software patents. people that are making decisions are confused because people like you are mixing copyright with patents, changing their minds. please don’t ever again mix copyright and patents.
also patents help vendors keep up vendor lock-in. so please think better before expressing yourself.
Wouldn’t everyone moving to an Open Source Unix be a form of lock-in? That’s what I got from the article. What good is Open Source App X if it’s locked in to Linux syscalls, making it so that I can’t run it on my OS of choice?
if you are using free software you can change the linux syscalls so it works with another operating system.
and that’s why free software usually runs on bsd, proprietary unix and even windows, besides a huge number of computer architectures
if you are using proprietary software that uses syscalls to the linux kernel, then you are locked in. but that’s a proprietary software problem. that has nothing to do with free software. that’s why we still don’t have a flash plugin for Gnu/linux running on anything other than intel. so get smart and stop using proprietary software if you care about lock-in.
so stop claiming that free software gets you locked in. it does not make any sense. AT ALL
Re: .NET being a failure,
http://news.zdnet.co.uk/software/windows/0,39020396,39205912,00.htm notes: “Microsoft chief executive Steve Ballmer has confessed the software giant’s .Net strategy has come to a standstill, says he’s accepted SQL Server’s shortcomings and vowed to keep fighting search giant Google.”
As for Mono, it is clear there are proprietary offerings linked with .NET that Mono won’t get its hands on – so we’ll see how well Mono fares vs. .NET. Also, I’ve no knowledge of any product that actually USES Mono – just because someone is developing something doesn’t mean people are using it.
RE: .NET 1.0 (actually, re: 1.1)
http://www.eweek.com/article2/0,1759,1820225,00.asp
“Microsoft has identified situations where applications written to the .Net Framework 1.1 break when run against the .Net Framework 2.0.”
if you are talking about an operating system the correct term is GNU/Linux, which is Free Software
Linux itself is only the kernel and not very good for comparison with a complete operating system such as Microsoft Windows
it just doesn’t make sense comparing a kernel with an operating system. the kernel is only a tiny fraction of the big thing which is the operating system
As for Linux using “Embrace, Extend, Extinguish” tactics against UNIX (or anything else), that’s just FUD. Your capacity to charge for Linux is limited only by what the market will pay; you can get it free, yes, but if Red Hat are to be believed, people are paying them upwards of US$10,000 for their “Enterprise” versions.
UNIX failed because (a) it priced itself out of the market, and (b) different vendors tried to Embrace and Extend UNIX and Extinguish other UNIX vendors. Linux coexists quite happily with Windows, OS/2, and all the BSDs on whatever hardware they’re available on (not to mention the various proprietary Unices). The Linux kernel, because it’s GPL, CANNOT be used to extinguish others, because the technology is visible and can be re-implemented by others.
True, other software that’s not GPL’d can be added on top of the Linux kernel, but that’s the fault of the coder or distributor of that software, not of the Linux kernel.
Any attempt to distribute “core UNIX utilities” on top of Linux (the kernel) that were not under a FOSS licence would probably be seen as an attempt to “fork” Linux (the OS), and therefore fail, anyway.
Try to convert from inches to feet to miles. Good luck!
Don’t need luck. It isn’t very hard. The same people who have problems converting inches to feet to miles would still have a problem converting centimeters to meters to kilometers. I remember reading a rather enlightening little article about how lumber is sold in Europe in 120 cm lengths, since 1 m isn’t divisible by 3 evenly. Many countries have only converted to the metric system in the last 30-40 years or so – including Canada and England. Many people in England still use feet and miles, and you will find it that way in a lot of other countries as well. What about clocks? Calendars? Degrees in a circle? Why haven’t they been changed? Why are things still sold around the world by the dozen? So see, metric isn’t as perfect as some people make it out to be. Base-10 might be great for scientists, but normal everyday human life needs something that can be split up into 1/3, 1/4, 1/8, or whatever. Maybe I can make up a unit of measurement just because I want to be different and make my country adopt it. Of course, people 200 years later will still be using the pound – like they still do in France.
Back on topic, good article, btw.
“True, other software that’s not GPL’d can be added on top of the Linux kernel”
LIE AGAIN
you can add anything on top of the linux kernel and it doesn’t have to be GPL
see nvidia drivers
flash plugin
you just can’t use the code that is protected by the GPL on proprietary software
you can have proprietary drivers and software on top of the Linux kernel
While the points you’re making are discussion-worthy, you might want to try not interjecting “LIE…LIE AGAIN,” etc. Do they really add anything to your comment?
Take a deep breath. Relax. Keep posting. We’re listening.
Nvidia kernel drivers are widely acknowledged to be on somewhat shaky ground as far as GPL compliance goes. It’s quite difficult to make a kernel module that doesn’t use at least some header files from the kernel source, which are GPL. Nvidia’s sole saving grace right now is the fact that the kernel module part of their driver is actually open. The closed part is in the OpenGL libraries they distribute, which “talk” to the kernel code (I’m probably very much over-simplifying this, I admit).
The flash plugin doesn’t even come into play – it utilizes NO GPL code whatsoever – it accesses shared libs which are under *L*GPL, not GPL, which is how it (and other proprietary software) can exist on Linux.
Most libraries are licensed under LGPL, which allows linking them to a closed-source application, in certain circumstances.
Great. So now we get to hear from all of the apologists who try to explain why the “alternatives” are so much inferior to the status quo. What a bunch of self-justifying belly-achers. If only the inertia these people exhibit weren’t dragging the rest of society into the same hole they have ended up in. Not that I would presume to tell anyone what they should do — so kudos to everyone with the courage and fortitude to quietly set examples that others might follow.
Personally, I couldn’t care less about Bush monkeys, wealthy Gates dynasties, who does what to whom, or what OS they use.
I care about my personal freedoms. The ‘Pursuit of Happiness’ is one of those cherished liberties.
If my happiness, or, at least one of them, happens to come in the form of me being able to raise my ‘software’ middle finger to the ‘established’ ones – then so be it.
I just spent the last 4 hours installing and configuring win2k server and sql 2k on a spare machine because Microsoft provides no bloody way to get data out of an MDF file without going through the SQL server. And what’s even better, it appears that newer versions of SQL server can’t import data files from all the previous versions – at least, the hosting provider I use can’t import it with the latest version of SQL server.
Thank you, Microsoft – I knew next to nothing about Databases before this, but they’ve successfully converted me to MySQL / PostgreSQL.
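For anyone stuck in the same spot, the general shape of the workaround is roughly as follows, sketched in Python for clarity. The database name, file paths, server, and table names below are hypothetical placeholders, not the poster’s actual setup:

```python
# Sketch: generate the T-SQL and bcp commands involved in pulling data
# out of a lone .mdf file via a SQL Server 2000 instance.
# All names and paths here are hypothetical placeholders.

def attach_statement(db_name, mdf_path):
    """T-SQL that attaches a single .mdf (SQL Server rebuilds the log file)."""
    return (f"EXEC sp_attach_single_file_db "
            f"@dbname = '{db_name}', @physname = '{mdf_path}'")

def bcp_export(db_name, table, out_file, server="localhost"):
    """bcp command line that bulk-copies a table out as character data."""
    return f"bcp {db_name}..{table} out {out_file} -c -S {server} -T"

print(attach_statement("LegacyDb", r"C:\data\legacy.mdf"))
print(bcp_export("LegacyDb", "Customers", "customers.txt"))
```

Once the data is out as flat files, it can be loaded into MySQL or PostgreSQL with their own bulk-load tools.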
“Even Open Source software uses the Embrace, Extend, Extinguish model. What is Linux if not an embrace of all the positive aspects of Unix, including many of the familiar utilities, along with many useful extensions? And it’s protected by a strict license that prevents the remnants of the commercial Unix vendors from taking Linux’s innovations and rolling them back in.”
You got something completely wrong here – the licence only prevents commercial vendors from keeping the innovations to themselves, and that is the desired effect: it keeps open the possibility for sharing.
Example – how do you think Red Hat is doing business – by *not* keeping up with the new linux kernels?
The final release of Sql2k5 will support xcopy deployment of mdfs (backwards compat to 2k).
while this is true on the kernel level, the kernel is free for anyone to use. this includes the very companies that used to make in-house unix kernels. and from what i recall there was no company in the unix world doing business like microsoft does. the only commercial unixes i can think of were made in-house to run on top of their own hardware and sold as a package. ie, aix (iirc) on ibm hardware and solaris on sun hardware.
the vital difference is that while, say, microsoft’s extensions to kerberos can only be guessed at based on comparisons of network traffic and then have to be reverse engineered, the linux kernel can be moved wholesale into an existing unix environment and should in theory perform nicely with the existing software. remember that posix is an open standard, and while linux isn’t certified, it has all the basics of posix implemented from what i recall…
so while on the surface it looks the same, the vital difference is that while microsoft keeps their changes under lock and key, linux has them out in the open for anyone to look at.
basically, embrace and extend can be a good thing if all the changes are done in the open and stay open.
As for Linux using “Embrace, Extend, Extinguish” tactics against UNIX (or anything else), that’s just FUD. Your capacity to charge for Linux is only limited to what the market will pay for; you can get it free, yes; but if RedHat are to be believed, people are paying them upwards of US$10,000 for their “Enterprise” versions.
No, they are not to be believed. They are growing their revenue on trees.
When looking at how complicated unit conversions are, you have to look at what percentage of a population can successfully convert one unit into a different one.
For example, take a length of 654 m and let European people convert it to km during a street interview, then take a length of 823 ft and let American people convert it to miles.
You will see that in Europe the percentage of correct conversions will be significantly higher than in America, while simultaneously the average time needed for the correct conversions is less in Europe.
Even engineers (I have a master of science in mechanical engineering) find it much easier to make calculations in metric units.
If we calculate coupled thermodynamic/mechanical systems we NEED to do everything in SI units (Système International), because otherwise there would be SOME error somewhere. So in those cases we cannot even use millimeters or grams or degrees Celsius, but have to use meters, kilograms, radians, and kelvins, simply because it is LESS COMPLICATED.
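To make the “less complicated” point concrete, here is a toy calculation (my own example, not from the original comment): with coherent SI units the formula itself never needs conversion factors, but sneak in grams and millimetres and the result is silently wrong by orders of magnitude.

```python
# Toy illustration of why coupled calculations stick to base SI units:
# kinetic energy E = 1/2 * m * v^2 only comes out in joules if mass is
# in kilograms and velocity in metres per second.

def kinetic_energy_J(mass_kg, velocity_m_s):
    return 0.5 * mass_kg * velocity_m_s ** 2

print(kinetic_energy_J(2.0, 3.0))  # 9.0 J

# Feed in grams and millimetres per second without converting, and the
# "energy" is off by a factor of 10^9:
wrong = kinetic_energy_J(2.0 * 1000, 3.0 * 1000)
print(wrong / kinetic_energy_J(2.0, 3.0))  # 1000000000.0
```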
Selling wood by 120 cm is nothing special. If you want to divide something in three, you do well to choose an initial length that is easily divisible by three. You could with the same argument say: How funny, they are selling 60 inch wood logs, why such a strange number, why not 100? Simply because customers demand it like that.
You should notice: The UNIT is still cm, not yard or foot or inch.
Definition: I use “metric” as “units derived from SI units”, e.g. mm, km, MPa, … . And SI base units are underived units like m, s, kg, … .
PS: Of course you Americans will have trouble getting used to the metric system; we in Europe are already (mostly) past the old units. The next generation here in Austria will definitely no longer learn ANY old units in school. How will Americans then talk to Europeans about car fuel consumption and such things if Europeans no longer know that gallons and miles even exist?
“So why can’t we switch to alternative fuel proponents, beginning with hybrid technology today? If the estimations are right, we have to do it eventually within the next 30 years or so. And already now, oil prices are climbing daily, and that won’t change…”
When will you people get it: hybrid cars are no better for the environment than internal combustion cars. They don’t use less oil either. They burn less gas directly, but where the hell do you think the electricity they use comes from? Fossil Fuel burning powerplants. And no, there are nowhere near enough windmills and hydro plants to meet the demand, and building them would use a ton of oil for the machinery etc.
And please let’s not talk about hydrogen powered cars either FFS. Hydrogen is extracted from water using.. electricity. Which comes from blah blah blah.
No talk of corn powered cars either please. Without oil based pesticides and natural gas based fertilizer, loads of fuel stock would get eaten and we’d suck the ground dry of all its minerals. No-go there either.
None of the hybrid or hydrogen cars will result in less oil being used. There is seriously no hope, so please just get over the pie in the sky magical uses-no-energy car. It doesn’t exist, we are screwed.
You are just repeating what I actually said.
# David Adams : Unlike the BSD license, the GPL is very strict in ensuring that the code licensed under the GPL stays free
The code released under the BSDL stays free. I know a lot of people can’t believe that, but it’s a fact : code released under the BSDL stays free. It can be integrated into a proprietary product (or a GPL one), but it stays free.
# Rho : Of course, people 200 years later will still be using the pound – like they still do in France.
Very few people use the pound in France, and for most of them it’s a way to say 1/2 kg (and you’ll never hear anybody asking for 2 pounds or 4 pounds of something, but only for 1 or 3 pounds).
“IBM could easily have supported a variety of operating systems for its new PC”
They did. The early PCs also ran CP/M. However, the 8088/8086 chips were designed to allow 16x as much memory to be used. Further, since memory segments could start at virtually any byte (every 16 bytes), it was far easier to organize complex applications or collections of applications. CP/M died on the PC very quickly because DOS took advantage of the hardware to permit types of software to be easily written that couldn’t have worked under the older operating systems. No computer operating system/hardware bundle at the time could have supported a WordPerfect or a Lotus 1-2-3. Very soon thereafter this changed, but that’s another side point.
Further, Microsoft itself was selling multiple operating systems. By the time CP/M on PCs was completely dead, Xenix was out and IBM was partnering with Microsoft to write OS/2. Incidentally, Minix came out around that time too, and, well, the rest is history.
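The 16x figure comes from the 8086’s real-mode segmented addressing: a 16-bit segment register shifted left four bits plus a 16-bit offset yields a 20-bit physical address, i.e. 1 MB versus the 8080’s 64 KB. A quick sketch of the arithmetic:

```python
# 8086 real-mode address calculation: physical = segment * 16 + offset.
# Segments can start on any 16-byte boundary, which is what let DOS
# place programs flexibly within the 1 MB address space.

def physical_address(segment, offset):
    return (segment << 4) + offset

print(hex(physical_address(0xB800, 0x0000)))  # 0xb8000 (CGA text buffer)
print(physical_address(0xFFFF, 0x000F))       # 1048575, top of the 1 MB space
print((1 << 20) // (1 << 16))                 # 16, vs. the 8080's 64 KB
```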
But hardware vendors locked their users into their software because you needed their software to run their hardware. It wasn’t until IBM created the “open” PC platform that it became feasible to write very low-level software for a platform without the say-so from the hardware vendor.
IBM didn’t create an open platform. There was a tremendous amount of ROM which was semi-nonstandard. Further, there was a BIOS which was used by the operating system and by software. Compaq, by reverse-engineering the BIOS, created an open PC platform.
The next generation here in Austria will definitely no longer learn ANY old units in school.
I am pretty sure the usage of horsepower will stay around a lot longer than one generation, even if the correct unit would be (kilo)watts.
Same goes for units in areas like air traffic, where distances are in nautical miles, heights are in feet, and speed is in knots.
You certainly have a point about virtually all of the “alternative” fuels being promoted requiring electricity that right now would have to be generated primarily from coal (in the US at least) but you need a little information on how the hybrid car works. It generates its own electricity using the built-in motor. You don’t plug them in. The hybrid’s benefit is that the small gas motor is always running at optimal efficiency. Electic motors are much more efficient for stop and go traffic, while a small gas motor is very efficient at constant high RPM. So they’re a good way of making existing infrastructure more efficient.
I do agree with you that the alternatives are problematic, but there are plans in place for dealing with those, too. We won’t get into that here, but people are thinking about those issues.
Hosted Applications: heh, so far my favorite hosted application is Google Earth, wow!
Although, I’m continuing a mostly off-topic discussion, I find it interesting so what the heck.
The mostly Imperial-derived ANSI units of measure have long been hard-coded into US fabrication and tooling. Our machine tooling has generally been the indicated reason the US continues to use ANSI units rather than introduce error by conversion. For instance, 2 cm is a fairly precise measurement on metric tooling, but on ANSI tooling trying to find 0.787401575″ is a difficult task.
Not even ten years ago I would have said buying a metric lathe in the US was a futile, significantly overpriced undertaking. The new prevalence of embedded computers on the machine shop floor has allowed metric precision on par with ANSI units, thus opening new doors for US fabrication. Now, largely, the ‘raw’ materials vendor, who still uses ANSI, is generally the deciding factor on which unit will be used. Fabricating SI parts can become prohibitively expensive while working with ANSI materials.
Alrighty, inch-foot-mile: 1 inch = 1/12 foot = (1/12 ×) 1/5280 mile. Wow, that’s intuitive. cm-m-km: 1 cm = 1/100 m = (1/100 ×) 1/1000 km. Hmm, shifting a decimal place, that was hard.
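Both chains are just fixed ratios; the only difference is whether the ratio is a power of ten. A trivial sketch:

```python
# Both unit chains are constant factors; metric just makes them powers
# of ten, so converting is only a matter of shifting the decimal point.
INCHES_PER_FOOT = 12
FEET_PER_MILE = 5280

def inches_to_miles(inches):
    return inches / (INCHES_PER_FOOT * FEET_PER_MILE)

def cm_to_km(cm):
    return cm / (100 * 1000)

print(inches_to_miles(63360))  # 1.0 -- 63,360 inches in a mile
print(cm_to_km(100000))        # 1.0 -- 100,000 cm in a kilometre
```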
I actually like the analogy of metric vs standard in respect to this article, since any reasonable person can easily jump back and forth between the two (provided they have a cheat sheet to look up constants and such), i.e. no one driving from Canada to the US should be screwed when the signs change from km/h to mph (even if they didn’t have their odometer to help them out). Of course, when things go wrong, they go very wrong – wasn’t that last probe to Mars lost because half the work was done in metric and the other half in standard and no one took the time to check?
while the original bsdl-covered code stays free, it’s soon made obsolete by any corporation that doesn’t want to be charitable, by introducing changes that make their product incompatible with the original and not releasing the changes.
the only way to avoid this is by building up a kind of technological inertia around the bsd code, so that changing to the incompatible product would be more trouble than it’s worth. but it’s an uphill battle though, as the other side can take your open changes and incorporate them while not releasing their own. that’s what embrace and extend is about: one-way compatibility. you can get in but you can’t get out…
“but where the hell do you think the electricity they use comes from? Fossil Fuel burning powerplants.”
how is that? i don’t plug my prius into anything and i get 60+ miles to the gallon and a fairly nice ride…. thinking about an insight for the wifey too….
while the original bsdl-covered code stays free, it’s soon made obsolete by any corporation that doesn’t want to be charitable, by introducing changes that make their product incompatible with the original and not releasing the changes.
That makes sense?
How are we pawns in the OSS agenda?
I can understand your point that OSS practices E-E-E,
but to say that any user is captive to it,
in any way more than the GPL constricts them,
is totally inconceivable to me.
In addition, the Open source license used by the BSDs is as nonconstrictive as anyone could hope for…