Right now the situation for developers of minor operating systems seems somewhat bleak. Windows and the Unixes compete in the server world, and Windows and MacOS X compete on the desktop. Linux even gets ported to every embedded device, leaving few niches for the hobbyist or sidelined operating system developer. Some have even gone so far as to say that New Operating Systems Won’t Stand a Chance. As anyone who reads OSNews can tell you, however, there is a wealth of new systems with new ideas that just aren’t taking off. Given all these new ideas, some – capability security from EROS, for example – should be good enough to catch on, so why aren’t they?
There are many intertwined reasons for this phenomenon, and a good place to start looking at them is the tension between operating system standards – POSIX most prominent among them – and innovative, revolutionary ideas. On the one hand, implementing a standard such as POSIX gives a new operating system a leg up and a library of working or partially working software, always a boon to an OS looking to build a userbase. On the other hand, the real gain of many revolutionary ideas like orthogonal persistence (definition: making RAM persistent by writing it to disk at checkpoints rather than using a filesystem to permanently store data) can only come from breaking compatibility with these standards and with mainstream computer use, isolating the project in question! If we ever hope to go beyond Unix and Windows, a solution must be found so that new systems can do their own thing without cutting themselves off from the rest of the world.
The Unununium project found an interesting way to handle this that actually helps solve a couple of other issues, too. Rather than requiring that all programs for their operating system be written in a compiled language like C and linked against their system calls, they’re building a Python interpreter that can talk to the OS. This makes any standard Python code that doesn’t depend on special operating system features or other software (a sizable minority) almost instantly portable, even though Unununium implements the orthogonal persistence I was talking about above.
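The portability argument is easy to see in miniature. Here is a hedged sketch (pure illustration, not actual Unununium code): a Python function that sticks to the language core and pure-computation standard library modules runs unchanged on any host that provides an interpreter, because nothing in it touches filesystems, sockets, or platform-specific calls.

```python
import hashlib

def checksum_lines(lines):
    """Pure computation: behaves identically on any OS with a Python interpreter."""
    digest = hashlib.sha256()
    for line in lines:
        digest.update(line.encode("utf-8"))
    return digest.hexdigest()

# No file paths, no syscalls, no platform checks -- nothing here needs porting.
print(checksum_lines(["hello", "world"])[:8])
```

Code like this is the "sizable minority" the article mentions; the moment a script imports something that assumes a Unix filesystem layout or a Windows registry, the free portability evaporates.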
The next largest problem for new operating systems is, of course, hardware driver support. Thanks to fragmentation and disagreement in the OS community, every new system must laboriously build, over years of work, its very own library of drivers for the very same hardware everyone is already using on more mainstream operating systems! Some people think Linux is slow to gain hardware support, but at least it eventually does. Most hobby and academic projects can expect to be abandoned before they gain enough driver support to even match Torvalds’s kernel.
Here, at least, there is an apparent solution: a standard interface for device drivers to talk to kernels with. The problem is that it has been tried, and Project UDI’s efforts at a Uniform Driver Interface failed to be adopted by anyone with enough political clout or driver support to count. Even the companies that created, funded and poured effort into UDI haven’t actually adapted their operating systems to use it. The F/OSS community doesn’t support it either: when the debate about a driver API/ABI for Linux came up on Slashdot a month or two ago, the overwhelming opinion was that not forcing drivers to link against the kernel sources would lead to too many proprietary drivers! This sort of attitude might work for a system with Linux’s following, but no new OS is going to gain any support by forcing manufacturers, or even other hobbyists and researchers, to rearchitect their hardware drivers for each new, quirky kernel they want to run them on. UDI may not “benefit the Free Software movement”, but for those of us who hope to someday see our kernels running on real, everyday hardware it is a godsend. And just imagine if a major kernel like Linux or a BSD fork did support it, rendering many of its drivers usable under hobby system XYZ! UDI may be sponsored by proprietary software makers, even the latest “Great Satan” SCO, but it has reasonable technical merit and does the job it’s intended to do. More hobby and academic kernels should be UDI-compliant.
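The economics of the idea are worth spelling out: with a uniform interface, supporting D drivers on K kernels costs roughly D + K implementations instead of D × K. The real UDI is a C ABI with formal "metalanguages", but the shape of it can be sketched in a few lines of Python (hypothetical names throughout; this is an illustration of the pattern, not UDI’s actual API): each kernel implements one environment shim, and every compliant driver is written once against that environment.

```python
class DriverEnvironment:
    """Services a host kernel supplies once -- a stand-in for a UDI-style
    environment (the real UDI is a C ABI, not Python)."""
    def __init__(self):
        self.log_lines = []

    def log(self, msg):
        self.log_lines.append(msg)

    def read_mmio(self, addr):
        # A real kernel would read a device register here; we fake success.
        return 0

class PortableNicDriver:
    """Written once against the environment, reusable by every compliant kernel."""
    def __init__(self, env):
        self.env = env

    def probe(self):
        status = self.env.read_mmio(0x1000)
        self.env.log("nic probed, status=%d" % status)
        return status == 0

env = DriverEnvironment()        # each new kernel implements only this shim
driver = PortableNicDriver(env)  # the driver itself never changes
print(driver.probe())
```

The design point is that the driver never links against kernel internals, only against the stable environment interface, which is exactly what the Slashdot crowd objected to for Linux.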
Yet another peril awaits even those who can write or obtain device drivers: the lack of runnable software. Every operating system that is developed must have every single piece of software ported to it, even its own development toolchain. Again, while standards like POSIX and other similarities to mainstream operating systems can make this work easier, it is still a major undertaking. The integration of interpreted and high-level languages into the userland of the OS also helps in this regard, but more needs to be done. Every operating system ports certain important pieces of software like its toolchain, and usually some user-facing things like a text editor as well, and we have tutorial websites like BonaFide OS Development and the OSFAQ Wiki. Put them together! When you port a piece of extremely useful or common software you think everyone else will want, write about how you did it so everyone can learn from your experience! I know I will, and this collaboration should help with the effort of getting a userland set up.
Of course these few ideas won’t make the world an OSdever’s paradise, but I humbly think they will help. Standardized device drivers will aid new OSes in picking up the hardware support they often desperately need, while high-level and interpreted languages at the basic levels of systems will provide a small but substantial software base to systems getting onto their feet. Finally, I’d love to see an effort to document how common, everyday programs can be ported to new operating systems, because we all know mucking about in the GCC source tree without a map is nobody’s idea of fun.
About the author:
Eli Gottlieb is an operating system hobbyist whose pet kernel lives at Glider. He hopes he’s helpful, or at least sparks off good dialectics in the comments.
If you would like to see your thoughts or experiences with technology published, please consider writing an article for OSNews.
It sounds like Eli really hit the nail on the head. The worst part about developing a new OS has got to be trying to write device drivers for the millions of devices out there. If there were a unified device driver API like UDI out there, then one of the big hurdles of OS writing would simply go away.
That would be awesome!
I agree on this, if an OS-developer could write one driver which fits all soundcards and one driver which fits all network cards, it would certainly be very useful…
Um, I thought the UDI meant that a driver for a specific piece of hardware would work with any kernel with a UDI.
Correct me if I’m wrong though.
You’re perfectly right, he went off on a slight tangent about NVidia/ATI. It really would be nice if they’d release some specs, though, even if it were just for 2D acceleration.
I agree on this, if an OS-developer could write one driver which fits all soundcards and one driver which fits all network cards, it would certainly be very useful…
Well, that is what Microsoft is trying to do, create a *very* basic sound card based standard in which ALL sound card companies conform to, meaning you can have a simple base driver which supports all sound cards, and if you want special features, you go off and download the special vendor specific driver.
It’s a nice idea in theory, but like so many other things, all it takes is one company making a half-assed implementation, or simply not implementing all the features as they should, and you pretty much end up with the current ACPI mess – a great standard buggered by lazy motherboard manufacturers and operating system vendors not properly supporting it.
Impossible; different hardware always has different protocols.
Unless there is a “uniform sound card driver protocol” and all the hardware designers conform to it, a universal sound card driver won’t be possible. But that would put too much burden on the hardware design; you’d maybe need some embedded OS running on the sound card to provide a uniform interface.
So it is not likely to happen.
“BonaFide OS Development”-link should point to http://www.osdever.net/
It’s WAY more than any problem with device drivers; that’s delusional.
The problem is that people are sheep, and will refuse to try something new (aka not-Windows). Even if they could lose weight, find love, and get rich by using something else, they STILL WOULD keep using Windows. I had a guy the other day say “Man, you ever use that Mac stuff? It’s all backwards.” Like apparently anyone who uses Macs has AIDS or something. He was all serious and shit too, like it’s a big scary voodoo computer or something. Cracked me up, but THAT is what you’re dealing with in the real world.
It would be delusional to think that any new OS is the next windows. The people that are going to try a new OS are the ones that like to tinker. If a person could start up a new OS with Linux/Windows level hardware compatibility, it should be easy to get a few hundred or even a few thousand (if the OS is really good) people to form a community for it. That’s plenty to keep an OS going.
I think most people that are developing new OSes aren’t doing it to take over the OS market. I think they’re doing it because it’s fun and interesting and they want an OS that has the features they want.
The best OSes are written by those that write code because they love it. If you’re writing an OS to make money, you’ll end up like windows. An OK, well marketed, not-very-innovative OS that’s just good enough to keep its market share. Where’s the fun in that?
You are absolutely right. Linux GUI designers, for instance, when they think of “usability”, think of this idealized novice user who hasn’t been tainted by Windows. While such users probably exist in the third world, the vast majority of computer users are so affected by the Windows experience that to them “usable” means “Windows-like”. Incredibly brain-damaged, or at best arbitrary things that Windows do become completely natural and logical, and it forces alternative GUI designers to cater to these expectations. Even in things such as program names, like the article we had yesterday on Linux names. If someone had never used Windows before, they would have no way of knowing that Excel is a spreadsheet, Outlook an email client or Power Point a presentation maker, yet it’s too damn hard to figure out that MPlayer is a music player and XCDRoast is a CD burning application.
I don’t think you read the article; it is about development hurdles in building new operating systems, not getting people to use them. You are way off topic.
The fact is the world has gone too far in too short a time! But if one wants to get going at all, there is still a slim chance in OS development. Naturally, standardisation is the key to success. For example, VESA drivers for video work with almost all video cards; similarly, we need a standard for every piece of hardware, and then OS writers can start off. The second way could be like Apple’s, where you write, test and deliver code only for specific hardware!
A number of OSes are available, but they are not picking up because of lack of interest, unavailability of applications, the cost to buy them and, most important, the need for a new OS to stand out from what is available in the current market!
> Windows and MacOS X compete on the desktop.
Various Linux distributions have an equal or higher desktop share than Macs, so there _are_ many people for whom Linux is more than desktop ready. The fact that it doesn’t suit computer-illiterate users doesn’t lessen its desktop readiness.
Judging by the summary, the article seems pretty arrogant and ignorant, as its author probably is.
This is the author of TFA speaking. I USE Linux ON MY DESKTOP. I’m partway to convincing my mother to use it. I just don’t see it becoming synonymous with “desktop” the way MacOS X and Windoze currently are for their hardware platforms.
And, as are you. Whoever you are.
Instead of guessing at the reasons why it’s hard to develop a new operating system, why don’t you go and interview the developers of, say, 3 or 5 alternative and completely different operating systems?
SkyOS, MenuetOS, and so forth. Ones that aren’t based off of Linux and have their own kernels and interface layers.
Guessing is nice, but finding out what happened to people who actually went and DESIGNED OSes is far better.
Instead of guessing at the reasons why it’s hard to develop a new operating system, why don’t you go and interview the developers of, say, 3 or 5 alternative and completely different operating systems?
I don’t think most of them know for a fact why people don’t use their systems..
I doubt they did the proper surveys 🙂
See this little talk from R. Pike:
http://herpolhode.com/rob/utah2000.pdf
(and for fun, peek at http://fred.cambridge.ma.us/c.o.r.flame/msg00037.html as well 🙂
It’s hard to believe that Rob Pike wrote that in 2000–very prescient. I wonder what the future really holds. I wonder if multiprocessor, multicore computers will one day soon make microkernels actually more practical, whether the messaging “overhead” that makes them impractical at present will become a non-issue. Maybe in 2010 we will all be running GNU/Hurd.
I’m curious: why would multicore computers reduce the overhead of microkernels compared to a monolithic OS?
I don’t see why they should..
OTOH, the L4 people claim that their implementation has a very low overhead (I don’t really understand how they manage it where others failed), so HW/SW compatibility may be the “only” reason which would prevent people from switching.
While the monolithic kernel starts its development with one major lock (which makes it not scale well with more cores) and gets hand-tuned to use finer locks instead, the microkernel starts with a lock for each different component (how many modules run in user mode depends a bit on the OS) and so already scales a good bit better. Of course you can fine-tune the microkernel, too.
Probably if both were perfectly tuned they would scale equally well, but before both are perfect the microkernel has a bit of an advantage there – if the workload uses a lot of different OS functions.
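The scaling argument above can be made concrete with a toy sketch. This is illustrative only (real kernels use spinlocks in C, and CPython’s GIL means this models contention structure, not raw parallel speedup): with one giant lock, unrelated subsystems serialize against each other; with per-component locks, a scheduler call and a filesystem call never contend.

```python
import threading

class GiantLockKernel:
    """Monolithic starting point: one lock guards every subsystem."""
    def __init__(self):
        self.lock = threading.Lock()
        self.sched_ops = 0
        self.fs_ops = 0

    def schedule(self):
        with self.lock:
            self.sched_ops += 1

    def fs_read(self):
        with self.lock:           # serializes against schedule() needlessly
            self.fs_ops += 1

class PerComponentLockKernel:
    """Microkernel-style starting point: each component has its own lock."""
    def __init__(self):
        self.sched_lock = threading.Lock()
        self.fs_lock = threading.Lock()
        self.sched_ops = 0
        self.fs_ops = 0

    def schedule(self):
        with self.sched_lock:
            self.sched_ops += 1

    def fs_read(self):
        with self.fs_lock:        # never contends with the scheduler
            self.fs_ops += 1

k = PerComponentLockKernel()
threads = [threading.Thread(target=k.schedule) for _ in range(100)]
threads += [threading.Thread(target=k.fs_read) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(k.sched_ops, k.fs_ops)
```

Hand-tuning the monolithic kernel amounts to splitting `self.lock` into ever-finer locks by hand, which is exactly the multi-year effort Linux went through when it retired the big kernel lock.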
Ten years ago, discovering Solaris at the university, I understood what a real OS should be. I downloaded Linux (0.99.x) and then ran Linux at home. No more wfw3.11 crap or wfw3.11 in disguise (aka win95).
But now I don’t care if it’s Linux or Windows. Both are rock solid if the hardware and drivers are. Coding GUIs with wxWidgets lets me not care. Or why not some Java? Batch programming? ACE. My main development is now on WinXP and I have Cygwin.
OK, ten-plus years ago it did matter, but now? For embedded and RTOS it’s a different story, agreed. But not for the desktop. IMHO.
/Meng
Ten years ago, discovering Solaris at the university, I understood what a real OS should be.
But is Solaris what a real OS should be?
Roughly 25 years ago, coming from an Apple DOS 3.3 and TRS-80 environment, I was introduced to CDC’s KRONOS and NOS timesharing operating systems. In college I was introduced to Unisys’ OS1100 and DEC’s VMS.
There are so many things found in those mainframe and minicomputer OSes (as well as in various IBM operating systems) that don’t seem to be present at all in Solaris, and some of those things were (and still are) extremely useful for those who used or are still using those operating systems.
The computing world isn’t just Unix and Windows, but it seems like many of the so-called “experts” today have defined it in just those terms.
This was an extremely well-written article, but we have to remember that home computer users today really only have two hardware platforms to choose from:
1. PC.
2. Macintosh.
Back in the days of the 1980s, there were a multitude of hardware platforms the home user/enthusiast could choose from.
a. PC; various manufacturers (IBM, Compaq, etc.)
b. Apple; Apple II, II+, IIe, IIc, IIgs, Macs & more
c. Commodore; PET, VIC-20, C-64, C-128, Amiga 500-2000, and PC clones
d. Atari; 400 Series, 800 Series, ST Series
e. and the list goes on and on…..
These computers had their own individual requirements and hardware wasn’t interchangeable between vendors. They all had their own proprietary way of doing things and we made the best decision based on wants, needs & budgets.
When you compare the old hardware days to today, it’s somewhat similar. Various companies make video cards for the PC, but most center on Intel i810, NVidia or ATI chipsets and a few others.
Since they’re using the same chipsets, they have to make better drivers to out-perform the competition. They all claim to have the best resolution, performance, etc., while the drivers are viewed as ‘proprietary’ by the companies.
But they still use the same chipsets…….
While the situation for drivers makes many a developer’s life harder than it should be, the cynical part of me says that this is all about money.
And since this may very well be all about money, I wouldn’t be looking for this situation to change.
Virtualized I/O devices. The Xen project, for instance, could well prove to be a major boon to hobbyist and niche OS developers. Under Xen, device drivers for the hardware are provided by dom0 and then virtualized for use by the guest OSes. Guest OSes then only need a Xen driver for each type of I/O device supported by Xen. At the moment, IIRC, Xen really only supports storage devices and NICs, but other device types are certainly possible.
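The split-driver idea being described can be sketched as follows (a hypothetical shape only; Xen’s real mechanism uses shared-memory ring buffers and event channels, not method calls): dom0 keeps the real hardware drivers and exports one generic interface per device class, and each guest OS implements only a single frontend per class.

```python
class Dom0BlockBackend:
    """Lives in dom0 and holds the real driver; one backend can serve
    any number of guests."""
    def __init__(self, disk_bytes):
        self.disk = bytearray(disk_bytes)

    def handle(self, op, offset, data=None):
        if op == "read":
            return bytes(self.disk[offset:offset + 4])
        if op == "write":
            self.disk[offset:offset + len(data)] = data

class GuestBlockFrontend:
    """The only storage driver a guest OS has to implement."""
    def __init__(self, backend):
        self.backend = backend    # stands in for the shared-memory ring

    def read(self, offset):
        return self.backend.handle("read", offset)

    def write(self, offset, data):
        self.backend.handle("write", offset, data)

backend = Dom0BlockBackend(16)
guest = GuestBlockFrontend(backend)
guest.write(0, b"boot")
print(guest.read(0))
```

For a hobby OS this is the whole attraction: writing one frontend per device class replaces writing a driver per physical controller.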
On a side point, I agree with the talk about the difficulty of drivers. Personally, I’m trying to code my OS ENTIRELY in assembly, and trust me, it’s not easy! VESA VBE is decent, but I’d rather go directly to nVidia and ATI and ask for the specifications for their cards, to get the BEST drivers going. Correct me if I’m wrong, but it’s no secret that Windows video drivers, performance-wise, outpace Linux, Unix… and most other OSes. Is this right? Of course not, but MS has the cash to buy the specifications, while the small “hobby developers” don’t.
Anyways…..
The article DID seem a little biased, but did bring out some good points. However, I believe that if an OS is GREAT, it will gain acceptance no matter the competition. Everyone starts SOMEWHERE. Look at Windows, for example: did they start out as a multi-billion-dollar corporation selling a CoOl OS? No. They put out a good product at the time and gained the respect of the community. Same with the Linuxes (is that right? Linux plural lol) and Apple. Especially Apple. It IS hard for little-known OSes to gain popularity, but NOT IMPOSSIBLE. NOTHING IS IMPOSSIBLE!
To wrap things up: overall a good article, but it should have focused on more “hobby” OSes such as MenuetOS, Visopsys, NewOS… Two is good, more is better.
Look them up on Wikipedia: http://en.wikipedia.org/wiki/List_of_operating_systems#Hobby_OS
[Note: when I say linux or unix, I’m talking about all their derivatives (slack, debian, BSDs, gentoo………………..)]
–ZaNkY
>Personally, I’m trying to code my OS ENTIRELY in assembly, and trust me, it’s not easy
Well, running with both feet tied isn’t easy either..
Somehow coding in assembly doesn’t tend to make things easy, strange eh?
I never actually saw that Wikipedia page, thanks.
MS DID start off with a multi-billion-dollar company behind it. That company’s name is IBM.
If it weren’t for that, MS today would be a company the likes of Borland, because that was what they were doing before the hook-up with IBM.
They might not even be around.
If I were to start a new OS, I would develop it within a standard VM – like VMware – that way I only need to support a small number of essential devices and many people can use the OS.
Linux (with an assortment of software added) IS ready for the desktop (!), and guess what…
…here’s (a couple of reasons) why:
1) It runs (on) the desktop
2) It prints
3) It shares
4) It connects
5) It puts and gets
6) It doesn’t crash (more than others do)
7) It is (as) secure (as others)
8) It does non-English
9) It does OpenGL
10) It does windows, icons, mouse, pointer
11) It does Word documents
12) It does Excel documents
13) It does Powerpoint documents
14) It plays media
15) It has Firefox
16) It does Exchange
17) It runs on my old Macs (Mac OS X doesn’t)
18) It runs Windows software (Cedega)
19) It runs VMWare (for money)
20) It runs QEMU (for free)
21) It has corporate backing
22) It has millions of users
23) It runs Apache/PHP
24) It runs MySQL/PostgreSQL/Firebird/Oracle
25) It is NOT Windows and NOT Mac OS
And it’s time you people who think this is not indicative of a successful operating system started defining what is.
The typical definition is:
1.) Supports the latest fad.
2.) I’ll never be called a geek for liking it.
The latest fad changes every 2 weeks or so and it’s impossible for anyone (even Microsoft) to fully keep up with.
But, if you can qualify for 2 by being in the homes of 90% of people, or at least 10% of famous people, then you don’t have to do 1 all the time.
Bill Gates is called a geek. Does this qualify Windows as a geek operating system?
Steve Jobs is certainly a geek – remember NeXT? Does this qualify Mac OS X as a geek operating system?
How about Jean Louis Gassée? Makes BeOS a geek operating system, doesn’t it?
DOS? It’s a shell!
UNIX? VMS? Sure, it’s used by bank accountants, but they don’t know, so I guess that makes them less geeky?
And what about all the people using Google (Linux) to search their way through cyberspace? Geeks?
The latest fad is what somebody says is the latest fad. You just said that somebody is not you. It’s not Microsoft. Who is it, then? Times Magazine? Linus was there.
It’s Thom .
I said you had to not be called a geek for liking it. No one is calling Gates a geek for liking Windows :-p.
Geez, you wonder why people vote you down; maybe it’s your inability to recognize humor.
Sorry, Thom. English is not my native tongue, and the cultural differences are certainly reflected in one’s ability to see the fine nuances (used to convey humor) in language too. I might have taken it seriously when it wasn’t meant to be.
Nonetheless, Gates is indeed called a geek; in fact he thrives on the image, and his spin doctors and marketing department support it. He certainly uses Windows, too, and he certainly likes it and even boasts of its superior feature set and capabilities (directly or indirectly), so that would make Windows a geek OS by your definition. But, of course, the dominant OS can hardly qualify as niche by any common use of the term.
Sorry, Thom.
Huh? What? It wasn’t me! I’ve got nothing to do with it– I didn’t…
Eh seriously now, what’s up with me?
Ok. Got it wrong again
Somebody (ma_d) wrote “It’s Thom”. Since I don’t pay too much attention to who’s writing what, but more to what is written, I guess I took it that the signature “ma_d” was you, Thom, writing as a non-OSNews staff member.
My fault. Well, The “sorry” should go to “ma_d”, I guess.
Confusion is inevitable
is… how many people actually read the article before posting comments? A great number of these comments have little to nothing to do with the article.
I read the article. It’s about niche operating systems and their place in the global operating-system ecosystem. It postulates that niche operating systems won’t stand much of a chance in taking a significant share of the user base of the existing ecosystem, thus necessarily remaining niche.
Linux was a niche operating system in 1992, it is not today. People couldn’t fly 200 years ago, they can today. The internet was an exclusive 20 years ago, it’s inclusive today. This concludes my analysis for today.
Right now the situation for developers of minor operating systems seems somewhat bleak.
Stop right there. While I appreciate the technical analysis presented in the article, I think the details presented lose sight of the root causes that may help answer the “why” of the opening statement. To do that, I would take a more fundamental view of the issue and ask the following question:
What problems or deficiencies with existing dominant OSes are egregious enough that a new OS can/should be developed to solve them and therefore enjoy an accelerated adoption rate?
The response should center on the fundamental issues associated with an OS such as:
* Security – Everyone understands the problems associated with Windows security. The general consensus is that it was not built for the Internet age. That said, can Windows be made secure, or is that more trouble than it’s worth? Is the architecture of the OS flawed? Should an OS be more like modern gaming systems (locked down) and less like wide-open gates [to hell]?
* Quality and robustness – Is Windows XP a quality product or does it have glaringly obvious shortcomings? If so, where are they? Having a system that has, for all intents and purposes, its own immune system so that it can adapt and recover from things that go wrong with itself or those introduced by the outside world could be perceived as a quality system. This is not limited to viruses or malware, but by poorly written software as well. The user must be shielded from error. This goes beyond simple structured error handling in code. We need immunity on a macro level. Analogy – I can cut myself by being dumb with a knife and my body repairs my own error. But if I catch a cold (not my fault usually), my body still protects me by correcting the fault. I am shielded in that manner. Extend this to OS functionality. Apps, drivers, everything. I don’t want to see a fatal error. Fatal means I die and that is not a good experience to me or my OS experience!
* Ease of Use – the ability to make the user’s life simple and enjoyable is human nature and a valuable intangible asset of any OS. Yes, looks count. Form and function – it’s an old adage. We want smarts that are good-looking. Nobody wants dumb but good-looking, right? Don’t answer that. Rather, is Windows XP hard to use? Is the windowing system flawed in obvious ways that can be significantly improved? Folks criticize KDE and even Gnome for “copying” the Windows way. Is that because Windows is, on a fundamental level, a good system? Or is it hideously flawed and can therefore be significantly improved? If you think it can, explain how. Be definitive. Help the KDE and Gnome developers.
* Hardware Support – The ability to effortlessly install hardware is vital. Does Windows have inherent issues that can be improved? Of course we all recognize the dominance Windows enjoys in terms of hardware support. But how can the current system be “improved”? For example, what about a device that tells the OS how it needs to be communicated with – the device driver inherent, built into it. Kind of like the game console analogy: plug in a game and it works; its code is inherent. How about extending this to hardware too? Who says we need to continue with this idea of software-based device drivers? What if the hardware told the OS how it works natively? Firmware on steroids.
* Applications – having the tools required to do a job. Are there application limitations on Windows that a new OS will solve? Are the development tools necessary to provide solutions available? If not, will a developing system such as Mono serve to shore up the development limitations Linux has versus Windows, so that Windows application development “limitations” can be solved? Restated, at some risk: is riding the coat-tails of Windows development going to improve Linux adoption? Is that a good application of our best talent?!
* Communications – In what ways is the computer used to communicate, and in what ways is Windows deficient in those areas, so that an OS like Linux can be applied to solve/correct those problems/limitations? I think that convergence of devices is a serious problem that no OS really handles well. Linux seems to have a few niche edges in that space, but can Linux adoption be rapidly increased on the desktop simply because it runs Mom’s TiVo really well?
* Others?
Folks – my point here is we need to get to the root causes. Ask the fundamental questions and amass the responses. Analyze and focus on ways to improve. We can’t get so caught up in the trees that we lose sight of the forest. Restated: if we only focus on incremental improvements to an established paradigm, it will be very difficult to establish dominance from such evolutionary gains.
Recently we have tackled the definition of bloatware. That, combined with talks about software complexity in general seemed to lead us to an undocumented conclusion. The complexity of existing software is getting out of hand. Modularity is needed. I think that modularity of software such as in “breaking up OO.o into manageable pieces” is a good idea.
The ongoing delays with Vista – I think – underscore this point. One of the most powerful and talented software companies in the world is struggling with this issue. How much longer can an OS be extended, using what amounts to an entrenched mindset of feature creep/bloatware, before it becomes mind-boggling and unmanageable? Maybe we have already reached that point? If not, what will the system after Vista look like, and how much longer will it take to develop? These are hard questions that speak to the need for a new way. A fundamental sea change in the way we approach the OS.
* Supportive of an ecosystem – Can the system be improved to accelerate the pace of new developments and innovation?
This is, IMHO, the most exciting area of OS development. What platform is needed for optimizing usability through evolution?
We have decades of usability research that still hasn’t been implemented in the mainstream systems. Why is this?
Can the way OSS projects are handled be improved so that the adoption and evolution rate is accelerated?
I’m reading a book about UNIX design at the moment (so I’m in that mood 😉 ) that claims the reason mechanism is separated from policy is that mechanism tends to survive longer – X11 is old compared to GNOME, for example.
I was thinking that the key is to layer a system so that the policy side of things could be developed by anyone with a little time and the ability to learn. Coupled with an easy way to share, discover, try and modify these developments, a situation resembling the early days of the web could be stimulated. You know, when everyone could create a homepage (and did) just by copy&pasting from other sites and adding small tweaks.
About your post:
* I think that OS research in security is not just about correcting Windows, but about correcting Unix too: while Unix is much more secure than Windows, the number of vulnerabilities found is still much too high. Is it possible to have an OS which is secure without forcing the user to patch again and again?
And for Windows, the problem is caused by legacy from a past where, in a choice between ease of use and security, Microsoft always chose ease of use even when the annoyance was pretty low – which bites users in the end when their OS is corrupted by viruses (which didn’t wait for the Internet age to propagate).
* Somehow the talk about shielding users from error gives me shivers: it implies that the computer knows better than the user what the problem is and will fix it. Nice for the newbie when it works (aka never) and a pain in the ass for everyone else to debug such a complex mess.
* ‘Folks criticize KDE and even Gnome for “copying” the Windows way. Is that because Windows is, on a fundamental level, a good system?’ Stupid remark. They obviously copy Windows because it is what users know, so it eases the transition.
* HW support: two-thirds of Windows crashes are caused by poor HW drivers, so yes, Windows has issues.
Full plug-and-play without a CD may be easier for the user, but it ignores the fact that the CD or the firmware which comes with the HW is usually obsolete..
So not going to the HW maker’s website usually makes you feel the pain in additional crashes.
And the reason why firmware comes on CD is usually a cost one.
In the end, I think that your topic questions are not very good, sorry.
As for why Vista takes so long to finish, I’m not sure that this is a research topic unless you work for Microsoft; other OSes (Linux, the BSDs) apparently manage to evolve without such heavy modifications, so this seems to be a problem with Microsoft, not with OSes in general.
Thank you for your feedback. The purpose of my original post was to pose the question: what deficiencies exist in Windows such that the development of a new OS/platform is warranted?
I tried to foster thinking in this area with a broad set of bullet points and then provided some of my thoughts on each. You may not agree with any of the thoughts I express, and that is fine. But you don’t provide a tangible response to the question.
It is far easier to critique than to innovate. I am just saying: let’s rethink the approach, if the intent is to foster the adoption rate of Linux.
And with all due respect, you miss the point. Folks have indeed criticized the similar look and feel of KDE and Gnome to Windows XP. It is not “stupid” to state a fact. It is stupid to flippantly cast off the accusation and continue down the road toward slow adoption in spite of it.
> What deficiencies exist in Windows such that the development of a new OS/Platform is warranted.
Well, security obviously, but it doesn’t really need research to get to Unix-level security: just apply Unix solutions (which is hard to do because of legacy), so this isn’t very interesting. What would be interesting is coming up with something better than Unix that is still usable.
> And with all due respect, you miss the point. Folks have indeed criticized the similar look and feel of KDE and Gnome to Windows XP. It is not “stupid” to state a fact. It is stupid to flippantly cast off the accusation and continue down the road toward slow adoption in spite of it.
Well, the reason I used the word ‘stupid’ is that for you the natural explanation of KDE or Gnome copying Windows XP is that Windows XP is good. The obvious truth is that the copying happens because Windows XP is ‘familiar’ to most people. And most of the time, familiarity trumps good or bad HMI (except when it is really, really bad); see any usability study.
You also assume that the criticisms are valid. Are they really valid, or are they just from people who hate Microsoft (many do) or who are fans of another OS?
You make a good point. I think that implicit in my reply is my assumption that the XP windowing system is good. I think that it is. I also think that KDE is better (although a bit rougher around the edges in terms of font rendering, icon handling, and excessive verbosity throughout its settings and options); that will get ironed out in time.
That said, Gnome is the most elegant interface, and I say that with OS X 10.3 sitting two rooms down from the XP box I am on right now. I think this explains the Ubuntu phenomenon. Well, back to work for me. Thanks for your insights.
Linux is still a niche OS. Mac OS X is still a niche OS. Windows is also a niche OS. It just depends on your perspective and the platform being considered. The most blatantly obvious platform is just the desktop. That is why anything other than Windows is considered a niche OS in general.
Arguably, Linux has a larger market share than the Mac (and so could possibly be considered less of a niche OS), but while the Mac target is relatively small for hardware and software producers, the Linux target can be huge. So, since it is much easier to cover the Mac target, Macs are much more likely to have software and hardware support than Linux. Therefore, even though Linux has progressed and gained acceptance in leaps and bounds, it still suffers from some of the same problems as many hobby operating systems.
Linux will continue to suffer from many of these problems until something is done about it. Many Linux fans bemoan a chicken/egg situation. The real chicken/egg situation is gaining enough widespread adoption first so that Linux will have more power to influence standards later. That is what MS did, and there’s a reason why it works.
Linux has done a surprisingly good job of influencing standards to a point, but I have yet to see Ogg as a format that I can use anywhere, even though it is a fabulous free format that has been around for a long time. Instead I see WMA/WMV getting pushed everywhere that isn’t tied to iPod products. That is just one possible example.
I would say that the UDI and other types of solutions wouldn’t just help what the author considers niche operating systems to have an easier point of entry. It would help Linux and the computer industry in general.
A river carrying small stone particles for millions of years made the Grand Canyon. A drop a day will wear a hole in a stone just the same. The only thing Linux – or any other piece of free software, for that matter – needs in order to be successful is to survive.
This can never be said of software which needs to generate money – it must change, it must generate need, lust, excitement, want.
Linux does not need to generate money. Syllable does not. EROS (Coyote) does not. Haiku does not. AROS does not. These are operating systems which need hardware to run on, that is all.
Even Vista is not the be all/end all of operating systems, and neither is Mac OS X, Symbian or the operating system in my washing machine.
It is by the slow spread, the word of mouth, the binary-tree evolution, the one-at-a-time, the inescapable, unavoidable drop of water that the stone will be worn away, and as surely as I am still installing free software, that drop will keep falling.
Take off your rosy desktop only glasses and look around.
Linux, and now OS X, run the cluster market. That’s not a niche market. There are several companies making decent profits on it: Apple and Atipa for example.
Linux is doing quite well (ie, a major share) in the general server market.
Linux is the leader in the small webserver market.
Mac is the only system a large group of people have considered for years (they’re called English professors). It’s gaining steadily amongst scientific users and even consumers looking for a computer that works as well as their iPod. Apple’s sales are great; I honestly don’t see how one could call them a niche…
Maybe you could say Linux is about five niches:
Developers
Old Unix Lovers
Web Kiddies
Scientific Users
Angry Former Windows Users
And Mac is a few more:
Graphics People
English Professors
Old Unix Lovers
Scientific Users
Happy iPod Users
Oh, and btw, iRiver plays Ogg. If you shop around you can find good stuff; if you listen to a Best Buy employee (at $6.50 an hour) you’ll hear nothing intelligent.
“Take off your rosy desktop only glasses and look around.
Linux, and now OS X, run the cluster market…..”
You never did too well on reading comprehension tests, did you?
I didn’t say that they were a niche product in ALL markets. I simply stated that they ARE niche products in some way. I also stated that Windows by the same token can be considered a niche product since it is possible to find a market where it isn’t the dominant product. Hmmm… Would that line of thinking possibly go along with what you wrote?
The point that you seem so ignorant about is that many of the markets you mentioned are niche markets. How does being a dominant player in only niche markets raise you above the level of being a niche product? Maybe you didn’t do too well in English classes in general.
Oh, and speaking of English professors and scientific users… I have never seen an English professor use a Mac. And, the scientific users I know are much more likely to go for a PC with Linux or Solaris than a Mac. Your experience doesn’t necessarily reflect reality.
To finish off, I’m so glad that you could come up with one product that can play Ogg. That’s fabulous. Know what? Samsung makes a device that can play Ogg as well. I’m sure that someone else can come up with another obscure device. But what happens if I don’t like the quality/features/options/look/price of the relatively small number of devices that are capable of playing Ogg files? Where’s the competition and variety to allow me to get a product that I am willing to spend money on? Ogg isn’t a selling point to most people because they don’t care about Linux. WMA and MP3 licensing issues aren’t a problem for most people. What happens? Two words… chicken, egg.
Have you ever met an English professor? At least here at ISU, the English building is packed with Macs. Lots of brand new ones (lots of brand new Dells too, actually; they have nothing else to spend their budget on, I guess).
You said you couldn’t find a player, I mentioned one. Sheesh. No one finds exactly what they want you know? Sometimes you have to settle a bit.
Most scientific users are sitting on Linux boxes, you’re right. But more and more are buying Macs. I work for several, and I’m watching some of them (against my advice) switch to the Mac. It’s happening; Apple isn’t pimping the Xserve because they can’t sell it.
My point, as stated later, was that these OSes are popular in MULTIPLE niches. This means calling them niche OSes is a bit unfair. They’re more like “multi-niche OSes.” What if all you had were niches, and you still made up 10% of the market: are you still a niche?
To quote you:
“Linux is still a niche OS. Mac OS X is still a niche OS. Windows is also a niche OS. It just depends on your perspective and the platform being considered. The most blatantly obvious platform is just the desktop. That is why anything other than Windows is considered a niche OS in general.”
I did fine in reading comprehension. Although significantly worse than math and vocabulary :/.
Your topic sentence is: “Linux is still a niche OS.” And the closing sentence of your first paragraph: “That is why anything other than Windows is considered a niche OS in general.”
Maybe you should write more clearly if your intended thesis was not as stated.
I apologize for being so blatantly rude yesterday. I was in a very nasty mood.
Really though, you’re just arguing semantics at this point. I stand by the same point you just quoted. Even though Apple is benefiting quite a bit from the halo effect, Mac computers are still novelties (even if one of the niches they occupy is “the coolest computer on the planet”).
A product can occupy more than one niche and still be a niche product, but that wasn’t really the basis of my point. In general with computers, the desktop is the most obvious market. Therefore, if a product is considered a niche product on the desktop it is very often considered a niche computer product.
Let me see if I can put this a different way. Life existed for millennia without computers. There are still billions of people who don’t own a computer, and many of them still haven’t even used one. That’s not to say that computers can’t be great for a lot of things or that they don’t affect the lives of the majority of people, but for the majority of the world’s population computers still occupy a niche market. Google, Microsoft, IBM, Intel, AMD, Dell, HP: all of them can be considered niche products. Computers fascinate almost everyone, but they drive/are driven by relatively few people.
Linux has been embraced by millions. It drives some of the world’s most important technology. It still doesn’t have widespread software or hardware support in general, though, beyond what the open source community can provide. So it is still a niche product, because without those things it will have a difficult time on the desktop.
What defines a niche? Semantics, perception, marketing…
I use Linux and support Linux. That doesn’t mean that I don’t think there are things that it lacks, and I’m not afraid to call it a niche product while still recommending it to others. I’m just honest with people about why they probably haven’t heard of Linux. People realize that I’m being dead honest with them and usually are still willing to try out Linux.
Technically speaking, every definable “market” is a niche market ;-P Of course, in practice, “niche” is usually used to denote a “small” or “specialised” market, where “small” and “specialised” are defined in reference to the overall market. And then there is the disconnect between reality and perception, which is readily observable in the case of the desktop OS market. The desktop is the average consumer’s point of reference when it comes to computers, and thus all other OS markets are considered “small” and “specialised” in spite of the fact that the desktop OS market is not the largest in terms of either volume or overall economic size.
If the desktop OS market is considered in the proper context, the phenomenal and ongoing success of Linux is much more apparent. Cell phones, embedded appliances, and servers are markets where Linux is doing extraordinarily well. Linux has been a disappointment only for those people who believe that the entire computing universe revolves around the general consumer desktop, which in terms of both overall volume as well as in economic terms it plainly does not.
And thus if you’re looking for Linux’s (or really F/OSS’s more generally) influence on standards, you’ll need to look outside the consumer market space. TCP/IP, HTTP, SSH, SMTP, POP3, HTML, XHTML, CSS, and a nearly endless string of other acronyms your average desktop user has never heard of, yet relies upon daily, are in large part attributable to the role F/OSS has played in the development of networking infrastructure. And the ongoing importance of POSIX (or quasi-POSIX as the case may be), especially in the embedded device market, is in large part a consequence of the growing market for Linux and thus the ever increasing base of existing software which is roughly written to the POSIX standard.
Would the people who mod me down care to tell me what constitutes a successful operating system? Definitions would be nice.
Would you also care to respond to my second comment on the article itself? Please refute my statements and tell me how the article definitively demonstrates the inescapable future of niche remaining niche.
And lastly: The problem with the software industry is that for every good programmer there are a thousand not-so-good, for every person that understands a system there are millions that don’t. But we all want to play.
If you want safe, secure, high availability, you need to dump object orientation and substitute a predicate-based paradigm grounded in mathematics. You need guarantees at the hardware level, and this cannot be done in the operating system, however “secure” it is. You need encryption across the board, EROS-style keys, etc. You need the end-all/be-all. This you will never get in the corporate world, because that would mean the death of purchase.
But you might get it in free software.
You make excellent technical points, but your comments get modded down because the article simply isn’t about Desktop Linux, Toaster Linux, or anything with the word “Linux” in it. Linux gets classified as a server/embedded OS, which is an area in which it is downright major.
Though I must apologize if my article gave the impression of “inescapable future of niche remaining niche”. The point was more to show a few ways for systems to accelerate their own growth.
Your article indeed mentions Linux on more than one occasion, but that is not the reason I bring desktop Linux to the front. Rather, Linux is THE example of a niche operating system which actually DID succeed in breaking into the existing ecosystem and grabbing a significant share of the userbase. So, if your article – as I read it – tries to point to ways for other niche operating systems to grow, it would do well to have a closer look at Linux’s history. It may be the dot-com crash, Linus as a savior, its reputation as “safer” (being UNIX) amongst “illiterates”, its name, the confusion of Linux as a kernel and Linux as a brand name for all that is free, or any number of other factors, including actual technical merit.
I guess I should have stated my intentions more clearly
Edited 2005-12-29 09:10
The growth and lifetime of Linux isn’t all that applicable to other minor and hobby systems today for one reason: Linux is a Unix. Nobody who starts with Unix has to port all their programs anew, 100%, to move to Linux, they just have to port that minority which relies on kernel functionality.
“Nobody who starts with Unix has to port all their programs anew, 100%, to move to Linux, they just have to port that minority which relies on kernel functionality.”
That is not entirely true. If they implement a POSIX layer, the X Window System, etc., they are very much on the way to an easy port. Examples: BeOS, SkyOS, Syllable, Haiku, you name it.
Actually, you just need gcc (for the same object format) and you’re pretty much going, which most operating systems have and use.
And porting the minority which relies on kernel functionality could be done to any operating system.
I personally think the sheer size of a lot of software’s code, along with the complexity of its interdependencies, makes for many more complications.
I do see your point, but if someone chooses to implement such a different system, ignore so many standards, write their own compiler, or base the design on a hardware platform no one has ever heard of, then they have chosen to work uphill.
Good for them (and us)! The world needs more people who are willing to struggle for something – be it better or not.
Well, I guess nearly all hobby OS developers use at least Bochs and/or QEMU, and some VMware or Virtual PC, but none of these are perfect emulations (I’ve got floppy disk controller code that works on QEMU but not on real hardware, etc.; similar problems exist in the other VMs), and one day (at least if you want someone to use it) the OS has to leave the virtual machine and run on real hardware.
Xen is interesting, true, but of course not a perfect solution, as you have a Linux or NetBSD kernel running too. Yet it can help in cases where a native driver doesn’t exist yet.
Since when have hobby OSes often hit the mainstream? I think there are probably far more hobby OSes today than ever before. Remember Minix? That was a big deal at the time; no one else had tried to bring a decent OS to 16-bit personal computers.
The big names back then were Unix (commercial variants, and a dependent BSD) and VMS. Then there were things like MS-DOS and CP/M, which were nothing in comparison and were still developed by companies! Maybe you could argue that since one guy wrote the original QDOS that counts; but he didn’t manage to get people to use it, did he?
Device drivers are hard now, sure. But at least we are now well into the era of portable OSes, where you can write a system in C and move it fairly quickly to new CPUs. Back in the day people wrote systems for a single machine (ITS, for example) and lost them when the machine was no longer useful.
Remember Minix?
But of course.
http://www.osnews.com/story.php?news_id=12381
I was referring to the time when it was a big deal.
Ya know: new, fancy, young. Now it’s sort of … well … 25 or so?
I agree that it is not just device drivers. And yes, to a certain extent, all of us are guilty of being sheep. Most of us, unless we are just brimming with spare time, are not rushing to tear out our Fedora Core install even to try another Linux distro. You probably just got it set up the way you like it. Maybe you just got handed a project to develop a web site for someone. So, unless you are familiar with Bochs or Xen or VMware, and you don’t have lots of disks lying around, you will probably do what many do: wait.
I would submit that a lack of apps would not kill an OS. Look at how long it was before Linux got the kind of apps that were usable by the average office worker. This did not deter the faithful. I think, imho, that at least a strong catalyst would be:
ONE. Not one single feature, but a collection of features that make it very useful for a purpose. A purpose could be serving files and apps. Novell sold a system which had THREE good ideas: A. central storage; B. an application server; C. security and stability.
TWO. Cheap hardware & device drivers [proven by Linux].
THREE. Stability [please, please, please] and security.
FOUR. A POSIX emulation/compatibility layer.
FIVE. A system-level programming language that is not SO complicated that mortals can’t use it [many times this is C, but it could just as well be something else; it’s been done in other languages]. Notice that Novell did not really start out with a system language, and that did not hurt it. It did have a means to write applications.
There are several routes that you could take if you wanted to write a new operating system from scratch that would be viable and work. I’ll outline a few here.
1) Write it all around a higher-level virtual machine (Java, C#/Mono, Common Lisp, etc.). Then you just have to get people to port your virtual machine around and run it. This results in the new system being wrapped by the older one. Slowly replace the older components with better, newer pieces. Eventually you replace the kernel and the transformation is complete.
2) Target a virtualization technology (VMware, Xen, etc.). Build your system from the ground up. Preserve backwards compatibility by simply running all the older apps in their own virtual environments. This is a good way to run 32-bit apps on a 64-bit system, for example. I wouldn’t be shocked to see Windows take this route. It would also be interesting to see something like an antivirus/anti-malware kernel run in a Xen partition such that it could monitor or repair the main OS without being corruptible itself.
3) Write or port compatibility layers for your new OS. A light Glib, X11, and Wine would be a good start. You’ll end up with the OS X approach.
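Route 1 above can be sketched in miniature. This is a hypothetical illustration in Python (the class and file names are my own invention, not from any real project): applications target a stable system interface whose services are initially delegated to the host OS, so each host-backed piece can later be swapped for a native implementation without touching application code.

```python
import os

# Hypothetical sketch of route 1: apps talk only to the "System"
# interface; each service starts out host-backed and can later be
# replaced by a native implementation with no change to app code.

class HostBackedStorage:
    """Storage service delegating to the host's POSIX filesystem."""
    def write(self, name, data):
        with open(name, "wb") as f:
            f.write(data)

    def read(self, name):
        with open(name, "rb") as f:
            return f.read()

class System:
    """The new OS's stable interface; apps never see the host directly."""
    def __init__(self, storage):
        self.storage = storage

# An "application" written against the interface runs unchanged whether
# storage is host-backed or, eventually, native.
sys_iface = System(HostBackedStorage())
sys_iface.storage.write("note.txt", b"hello")
data = sys_iface.storage.read("note.txt")
os.remove("note.txt")
```

The point of the indirection is exactly the gradual replacement the comment describes: swapping `HostBackedStorage` for a native service changes one constructor argument, nothing else.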
Michael
I am no expert, but imho the thing that will ruin your day is supporting ANYTHING that is not reasonably intuitive. I am reminded of this when I have to teach a newbie how to do a task. PCs are old technology, but BIG BUSINESS is driving the idea that new features are what we need. This makes us forget about intuitiveness and experience. Then something like the i-p*d comes along, and we go “…yeah, like that!” Meaning that we probably had a collective shell of an idea in our heads – but nobody stopped and gave it life. The i-p*d is not exactly the best example. But consider this: why does it take days and weeks for a newbie to ‘get it’? It really is not SO unreasonable to expect a simpler computer experience.
Think of all the sub-systems in a car. Do you care about the air/fuel mixture? (Oh God, I am telling on myself, but) in days gone by, you had to, in order to start a car. It was a manual process and a poor user experience. Too much to know. Too sensitive. We just want to drive! To put it another way, you don’t want to know the correct impedance for the elements of your toaster – you just want a piece of toast! The knob says Light – Medium – Dark. Nice experience: no calculator, no anti-virus, no consultant needed. Just my 2 cents. Sorry.
To call an operating system successful or not is largely a matter of point of view. For any given OS, is one person’s definition of success the same as another’s? Do SkyOS enthusiasts, for example, consider SkyOS a success because of its many features and rate of advancement, or a failure because it isn’t being used by the majority of people out there (aka the “not-ready-for-the-desktop” syndrome)? Would Bill Gates rate SkyOS’s success the same way? While some consider small OSes participants in a bleak picture, other users, even if relatively few, are enjoying the ride.
I agree with a previous poster who proposed that OS News do more interviews and articles related to non-mainstream OSes and systems. Here’s one to get you started: http://groups.yahoo.com/group/CommodoreOne (renamed to C-One after the group was started)
Getting involved in open source projects is harder than it looks. Back when I was in school I was really interested in doing some open source development. I posted a message to a number of mailing lists and whatnot telling them I wanted to spend some time helping out. Of course people would typically post right back telling me to just go for it, to dive right in. But all this ran very contrary to what I had envisioned getting involved in an open source project would be like. The way I had it figured, it would be more like getting a real job. You’d apply, so to speak, then they’d hire you, so to speak, and the team leaders or whoever would assign you some tasks, much like at a real job. Once you had completed your tasks you’d get assigned some more, and so on, until you finally learned the ropes. And it seems I’m not alone; a lot of fledgling open source developers seem to have the same notions that I had.
In reality, how it works is quite different from what I envisioned. When it comes to open source projects, you basically just do it. The problem with this, though, is that it just doesn’t work for newbie developers. They have no idea what to do or where to get started. Imagine if, upon hiring their developers, Microsoft just told them, “Here you go, here’s the source code. Go for it.” You’d probably end up with a complete mess.
But this isn’t to say that open source doesn’t work. It works spectacularly as far as I’m concerned. I think the very modular nature of most open source operating systems helps out with that. But the way I see it, the community would be a whole lot better off if it took some of those mailing list newbies under its wing once in a while.
Anyway, that’s my two cents.
I see device drivers as the main problem. A unified device driver model for OSes would be a major step forward; the problem, however, is that it’s pretty bad when even under NDA ATI/NVIDIA won’t release specs to developers. Perhaps if the government were really worried about Microsoft’s monopoly, it would look into Microsoft’s relationships with video card manufacturers, especially since ATI does the Xbox’s chipset now. The old excuse for not releasing specs is competitive advantage, but it’s mainly believed they are worried they are stepping on someone else’s patents. If that is the case, the U.S. government should be all over them, since it allows bogus patents anyway.
I feel that these days the attempt to write a whole new OS to test new ideas is often a mistake akin to ‘premature optimisation’. Why? Because it could be done in user space, using the common POSIX system as a “device driver library”. For example, it should be possible to implement orthogonal persistence as a bytecode virtual machine (similar to the JVM). You get all the hardware drivers, it’s simpler to implement (speed optimisations like JIT could be added later, and there are libraries available for doing it), and most important, users can continue using their old OS with all its applications while at the same time using the New Cool System for new software, where it is useful.
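As a rough illustration of this user-space approach, here is a hedged Python sketch of checkpoint-style orthogonal persistence (the file name and state structure are invented for illustration, not taken from Unununium or any real system): program state is written to disk at checkpoints and restored on the next run, so explicitly saving files disappears from the programming model while the host OS still supplies all the drivers.

```python
import os
import pickle

CHECKPOINT = "world.img"  # illustrative name for the persisted image

def checkpoint(state):
    """Write the whole program state to disk at a checkpoint."""
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, CHECKPOINT)  # atomic rename: a crash never corrupts

def restore():
    """Resume from the last checkpoint, or start with a fresh world."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)
    return {}

# Each run picks up exactly where the last one left off; the program
# itself never opens or saves a file of its own.
state = restore()
state["counter"] = state.get("counter", 0) + 1
checkpoint(state)
```

A real system would checkpoint the whole VM heap rather than one dictionary, but the shape is the same: persistence becomes a property of the runtime, not an action the application performs.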
Well, unless you want to test how different driver models perform. That is a topic where it doesn’t work so easily.
“UDI may be sponsored by proprietary software makers, even the latest “Great Satan” SCO, but it has reasonable technical merit and does the job it’s intended to do. More hobby and academic kernels should be UDI-compliant.”
Indeed. I find the lack of support for UDI to be a great loss. If all non-Windows OSes supported UDI, a major problem for alternative OSes would be solved.
> Folks have indeed criticized the similar look and feel
> of KDE and Gnome to Windows XP.
Some religions must also be copies of each other, since in all of them, the teacher and his disciples eat together.
They must also be copies of each other because in most of them, there is the act of “blessing.”
Third and final proof that all the world’s religions are copies of each other: the teacher is given a name that implies that he cares for his disciples, or that he does good to them! They also use water to imply purification. Most talk about death, but get this, all talk about life!
The point is, having some candy colors does not mean one is copying WinXP. Gnome will be dead and lost until they fix GTK and go for Gnome 3, getting rid of all the waste libraries.
What should KDE do to a) be good and b) not be like Windows? What should it be to be just “fast and good” to use, as old Macintoshes were, instead of focusing on a specific “newbie” group? The newbies don’t care, they will learn, but all the stupidity that results from the newbie focus will alienate everyone else.
I think someone should show a rat’s ass to kernel hackers and fork Linux. And implement a stable binary interface for modules. UDI is a splendid idea. These stupid politics with the source have brought us this far. How do we go on from here?
What should KDE do to a) be good and b) not be like Windows? What should it be to be just “fast and good” to use, as old Macintoshes were, instead of focusing on a specific “newbie” group? The newbies don’t care, they will learn, but all the stupidity that results from the newbie focus will alienate everyone else.
Rethink whether the WIMP paradigm really is the best model for a modern system.
Modernize legacy designs such as manual memory management (save/save as…).
Throw out the concept of “applications”?
Are “files” and “file systems” really the best abstractions for a modern system?
There are a lot of things you could do to improve a system while _not_ being like Windows.
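One of the items above, retiring manual save/save-as, can be made concrete with a toy sketch. This is a hedged Python illustration (the class and file names are invented, not any project’s API): every edit is persisted the moment it happens, so the user never performs a save step at all.

```python
import json
import os

class AutoSavedDoc:
    """A toy document that persists every mutation; no save/save-as."""
    def __init__(self, path):
        self.path = path
        self.lines = []
        if os.path.exists(path):       # transparently resume prior state
            with open(path) as f:
                self.lines = json.load(f)

    def append(self, line):
        self.lines.append(line)
        self._persist()                # every edit hits the disk at once

    def _persist(self):
        tmp = self.path + ".tmp"
        with open(tmp, "w") as f:
            json.dump(self.lines, f)
        os.replace(tmp, self.path)     # atomic, crash-safe replacement

doc = AutoSavedDoc("draft.json")
doc.append("first line")
# A crash after this point loses nothing: reopening restores every edit.
reopened = AutoSavedDoc("draft.json")
os.remove("draft.json")
```

The interesting design question is the one the list raises: once persistence is automatic, the remaining user-facing concept is versioning or undo, not “files”.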
I seem to remember an open source project that once tried to write a program consisting of two parts:
1. Read binary Windows drivers and extract hardware specifications from them.
2. Automatically generate Linux (or whatever) drivers from this information.
Unfortunately, I can’t remember the name of that project. Anyone has got more information if this project still exists and if they made any progress?
Hmm, I understand what you mean. It’s a good thing in theory, but it might cripple the hardware companies and it might hold back innovation. It’s not impossible it would end up like the ACPI mess: both OS makers and hardware creators each having their own version of the standard. I was thinking rather of how it went for IDE disks, for example: the operating system has one driver which works for almost all IDE hard drives.
But it would probably not work out for things like sound cards and video cards (VESA is a good thing, but using special drivers gives better performance, especially in the world of 3D) – so I suppose you’re right.