“Technology professionals have loosely used the term ‘UNIX’ since the first person had to explain the difference between the Berkeley and AT&T flavors, so it’s not surprising to find as many UNIX standards as there are versions of the operating system. Peter Seebach wades through the wellspring of UNIX standards and sorts them out for you, concluding that the rumors of the death of UNIX are (as usual) greatly exaggerated.”
+1 to this article.
In particular, it gives an accurate, well-explained perspective of UNIX and Linux relative to each other.
Eric Boutilier
(http://tinyurl.com/n59fk)
OpenSolaris
(http://opensolaris.org/os/blogs)
This filled in everything I thought I knew and things I was completely clueless about. Good post, and good report.
It took a while to wander through the various links, but the end result taught me a lot about UNIX and POSIX that I didn’t know. Thanks!
This has been one of the best articles this site has ever linked to.
It was a bit heavy going in places, but stick with it and you will be rewarded.
Oh, and after all the reading, I did appreciate the humour beside the author’s picture.
Seems to me that the guy is stuck in the medieval age of the Unix shell and C. Yes, standardization matters, but what matters a lot more are the features provided by a programming environment. And to be honest, I don’t see many features provided by the Unix + C + shell combination. If C and some shell scripting alone were so powerful, no one would bother to use Java, .NET, Perl, etc. My favorite quote from the article:
By contrast, twenty-year-old UNIX utilities still compile and run.
I guess that’s because in the last twenty years, Unix didn’t really make progress.
Some forms of older tech are still hanging around for a reason — sometimes it’s simple inertia, of course, but often it’s because those older tools work very well for what they’re used for.
I still work in an older mainframe environment and write Fortran code on occasion. Why? Because for the specific application we support and the context in which it runs, that environment and language are the best combination we’ve been able to find.
Will it move someday? Yes, when it makes sense to move it. Until then, it stays where it is because it works.
Not everything easily maps to a scripting language or a GUI, and not all existing problems require a fancy or trendy solution. Sometimes a shell script or a little C program is enough.
Just because the API has stayed relatively stable doesn’t mean Unix hasn’t made any progress. It just means that the basic function calls and libraries needed to do a particular task have survived; what those functions do under the hood, however, could have changed radically.
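To make that concrete, here is a minimal sketch (my own illustration, not code from the article) of a classic UNIX-style filter written against nothing but the original POSIX calls. Something essentially identical would have compiled twenty years ago and still builds on any modern UNIX, even though the kernel machinery behind read() and write() has been rewritten many times over:

```c
/* A tiny cat(1)-like filter: copy standard input to standard output
 * using only the classic POSIX system calls.  The interfaces used
 * here (read, write, file descriptors 0 and 1) have been stable for
 * decades, which is why code like this keeps compiling and running. */
#include <unistd.h>

int main(void)
{
    char buf[4096];          /* arbitrary buffer size */
    ssize_t n;

    while ((n = read(0, buf, sizeof buf)) > 0) {
        if (write(1, buf, (size_t)n) != n)
            return 1;        /* short or failed write */
    }
    return n < 0;            /* nonzero exit status if read() failed */
}
```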
I had trouble continuing to read the article and taking it seriously. The author is either dense or willingly spreading FUD about Windows. Apparently Windows is a completely unstable target compared to the Mac, and especially compared to UNIX. Say what? Because APIs have changed so many times for Windows?
Newsflash: Windows is backwards compatible! That means you can write a DOS program or one for the Win16 API and have it run on every single version of Windows in existence! (OK, DOS programs on WinNT might not… but still.)
and now the “stable” target that you should be developing for, if you have an eye for the future, is Vista.
Ehm… right. How about people just keep developing WinXP or Win32 programs instead, since they’ll run perfectly fine on Vista? Actually, almost all WinXP programs run on Windows 98 as well, after you install one lousy patch from Microsoft which adds, for example, Unicode support. Unstable target my ass. Windows has managed to progress technologically while remaining backward compatible, which is, aside from any flaws Windows might have, an incredible technical feat.
The Mac, presented as ‘not quite so bad as Windows’, has gone through two architecture changes, for crying out loud, not to mention getting the NeXT API thrown in. Does the latest OS X run OS 9 software flawlessly? What about OS 7? And OS 6? (OK, I don’t actually know, beyond having trouble briefly trying to install OS 9 compatibility mode in OS X on a machine that wasn’t supposed to run OS 9; but I’m a little doubtful it’s as easy to run 15-year-old Mac software on OS X as it is to run 15-year-old DOS software on WinXP.)
And if UNIX is similarly backwards compatible, it’s for all the wrong reasons. Peter Seebach waxing lyrical about UNIX’s standards compliance seems slightly unjustified, recalling the old adage: “UNIX loves standards. That’s why it has so many of them.”
If you want to use anything using GUIs, there’s no such thing as a unified standard.
I’ve run across dozens upon dozens of games and applications that do not install and run on Windows XP. The biggest complaint I hear about XP is ‘Software X no longer runs!’ I’ve seen offices where they keep one Windows 98 machine lying around just because the accounting software breaks in weird ways on XP.
Not only have they changed the API several times (is it even fully documented yet?), but they continue to change other critical components in incompatible ways at breakneck speed. NetBEUI to WINS to DNS. Try getting a handful of NT4 workstations to talk on a Windows 2003 domain with Active Directory: they are incompatible, and since those workstations run critical applications that don’t port to XP, they are necessary! You call that backwards compatibility? These are real-life situations I’ve dealt with, people.
Windows is designed, developed, and expanded ad hoc, with little or no overall design philosophy. This works great on a desktop PC, but is a huge pain when you have a network of slightly different ad-hoc machines.
Unix is a bit different: it is an engineered system with overriding design philosophies. Everything is a file; terse communication; simple tools that do one thing well. I’m not talking about OpenOffice, the X server, or other heavyweight stuff sometimes found on Unix. Unix was designed to have many specific tools that work together in standard ways to accomplish goals not necessarily perceived by the original authors. It is simpler to handle large numbers of Unix machines (even different flavors) on a network. I can be reasonably sure that if I need NFS I can get NFS, if I need SSH I can get SSH, if I need NIS or DNS… you get the picture.
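As a toy illustration of that philosophy (my own sketch, nothing from the article): a tool that does exactly one thing (count lines) and, because it just reads standard input and writes standard output, composes with any other tool in a pipeline:

```c
/* A toy "do one thing well" tool: count lines on standard input.
 * Because it reads stdin and writes stdout, it works the same on a
 * file, a pipe, or a terminal ("everything is a file"), and it can
 * be combined with other tools, e.g.:  grep error log.txt | ./lines */
#include <stdio.h>

int main(void)
{
    long lines = 0;
    int c;

    while ((c = getchar()) != EOF)
        if (c == '\n')
            lines++;
    printf("%ld\n", lines);
    return 0;
}
```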
A lot of the issues with Windows software breaking arise because the software was broken to begin with. It relied on undocumented or even buggy behavior in the Win32 API, and then, when Microsoft changed things under the hood or fixed the bug, the program stopped working.
I’ve had NT clients authenticating to a Win2003 domain, and it’s not hard to get MOST Win16 programs working under XP if you apply a little elbow grease.
Even OS/2 is more “backwards compatible” with very old Microsoft software than Windows is, at least these days.
It was easy to run DOS programs in Windows 9x, since it had a copy of real DOS to fall back on when it needed to run such things. NT doesn’t have that luxury, and OS/2 doesn’t need it, because IBM understood how to write a 32-bit Virtual DOS Machine that actually functions well, can boot real DOS images, etc.
I remember Microsoft releasing WIN32S.DLL files every three months with new features just to break WinOS2, so don’t talk to me about Microsoft’s stable Windows APIs.
Even when there’s divergence, UNIX systems typically provide solid API documentation.
Ten years ago I subscribed to MSDN and Microsoft sent me ten (or was it 12?) CDs chock full of software and technical information. At first I was in heaven.
The problem came when I tried to find specific details. I soon learned that the Microsoft way is to thoroughly document the tool, not the technology. I remember trying to find the file format of .rc files, but all I could find was how to use the proprietary Microsoft tools that generate .rc files. It’s as if they were saying, “You don’t have to know–we’ll do it for you.”
The thing I love most about Unix/Linux/GNU/FOSS is that no one seems to have any agenda that entails hiding technical details. If something isn’t documented, it’s because no one has gotten to it yet, not because of a marketing decision. And even then, you can always search the source code.
Newsflash: Windows is backwards compatible! That means you can write a DOS program or one for the Win16 API and have it run on every single version of Windows in existence! (OK, DOS programs on WinNT might not… but still.)
have you ever actually tried using any non-trivial dos program on nt? simple stuff works, but you better not try anything that might actually be useful… that’s an “illegal operation” on windows, and your program “will be terminated.”
most win16 apps i’ve tried to run on 2000/xp do run, but usually crash within 5 minutes…
and what about all the programs written for windows xp that were broken by sp2?
Ehm… right. How about people just keep developing WinXP or Win32 programs instead, since they’ll run perfectly fine on Vista?
because they probably won’t run perfectly fine on vista. how many win16 apps run perfectly fine on xp sp2?
and they probably won’t run in whatever version comes after vista… of course that’ll probably be in about 20-25 years if recent trends continue…
If you want to use anything using GUIs, there’s no such thing as a unified standard.
X Window System Standard
http://xorg.freedesktop.org/X11R7.0/doc/PDF/xlib.pdf
X Windows is standard in the same way that the von Neumann architecture is standard. It’s the very lowest common denominator. You can’t do anything useful with bare Xlib, and your program won’t look like everybody else’s program (at least not without a large amount of work). You have to use a toolkit like GTK+ or Qt. But right there, you have to decide which one you want to use and hope that the end user has the right libraries (and the right versions of those libraries) installed.
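For a sense of just how low-level “standard” X is, here is roughly the smallest useful bare-Xlib program (a sketch of my own, assuming a standard X11 development setup): about thirty lines just to open a window and draw one string, with none of the widgets, fonts, or look-and-feel that a toolkit like GTK+ or Qt supplies:

```c
/* Minimal bare-Xlib program: open a window, draw one string, quit on
 * any key press.  Build with something like: cc hello.c -lX11 */
#include <X11/Xlib.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);       /* connect to the X server */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                     10, 10, 300, 100, 1,
                                     BlackPixel(dpy, scr),
                                     WhitePixel(dpy, scr));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    const char *msg = "hello from bare Xlib";
    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);                /* block for the next event */
        if (ev.type == Expose)               /* (re)draw on exposure */
            XDrawString(dpy, win, DefaultGC(dpy, scr),
                        20, 50, msg, (int)strlen(msg));
        else if (ev.type == KeyPress)        /* any key quits */
            break;
    }
    XCloseDisplay(dpy);
    return 0;
}
```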
Anonymous.: have you ever actually tried using any non-trivial dos program on nt? simple stuff works, but you better not try anything that might actually be useful…
Anonymous.: because they probably won’t run perfectly fine on vista. how many win16 apps run perfectly fine on xp sp2?
Well… speaking as someone who actually uses old programs under Windows frequently (though not as often now as I used to, since I switched my primary Windows machine to the x64 version): you do encounter a number of problems running old programs.
Some programs weren’t “written correctly” to start with and exhibit more problems under newer versions than they did under older ones for various reasons, including security reasons.
Some programs were written with special non-Microsoft “libraries” or directly interacted with hardware, or what have you; Glide, for example. Microsoft has absolutely no control over these things and can’t be expected to “port” them to a new OS. Even if it wanted to try, there would be a lot of libraries and pieces of hardware to cover.
As if that wasn’t enough, some programs use “low-level hooks” in the Operating System. These are expected to be changed every so often.
And so on…
(As a side note… Windows x64 presents its own special problems, in that it can’t execute 16-bit programs.)
However… if you write a good clean app that uses “normal” Windows APIs properly, in my experience the app will work right from Windows 3.0 through Windows XP (non-x64). For example, last time I checked, all of my old apps work. My old IDEs still work. The majority of my old games still work. My old office suites work. My old graphics-editing packages work. In fact, ALL of my software for manipulating graphics on my x64 machine is old 32-bit programs, many of them from around the time of Windows 95.
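For what it’s worth, here is the shape of the “good clean app” being described (my own minimal sketch, not the poster’s code): it touches only one long-documented Win32 call, which is exactly the kind of program that tends to keep running across Windows versions:

```c
/* A minimal Win32 program that uses only a documented, long-stable
 * API call.  MessageBoxA has kept the same signature since Win32 was
 * introduced; programs that stick to calls like this are the ones
 * that survive OS upgrades.  Build e.g.: cl hello.c user32.lib */
#include <windows.h>

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev,
                   LPSTR lpCmdLine, int nShowCmd)
{
    MessageBoxA(NULL, "Hello from a plain Win32 program.",
                "Backward compatibility", MB_OK);
    return 0;
}
```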
Guys, the difference between Unix API stability and Microsoft API stability is this:
– Microsoft puts a lot of effort into making its products very backward compatible.
– However, every few years you must rewrite all your code from scratch. That is, a Win16 application might run today, but its source code cannot be used to build a modern 32-bit Windows application.
– Unix puts less effort into backward compatibility.
– However, much Unix source code from 20 years ago is perfectly usable today and often needs only a few modifications to build modern applications (a sketch follows below). This is not 100% true; e.g., Motif source code might build today, but Qt/GTK are the modern standards. In general, though, source code tends to be usable for a very long period on Unix.
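Here is a contrived example of what “a few modifications” typically means on the Unix side (my own illustration): pre-ANSI K&R function definitions, common in 20-year-old sources, still compile in C89/C90 mode with gcc or clang, and modernizing them is a mechanical change rather than a rewrite:

```c
/* Old-style (K&R) C of the kind found in decades-old UNIX sources.
 * Compiles as-is with e.g.: gcc -std=c89 add.c */
#include <stdio.h>

/* K&R definition: parameter types are declared after the list.
 * The modern ANSI form is a one-line mechanical change:
 *     int add(int a, int b) { return a + b; }                      */
int add(a, b)
int a, b;
{
    return a + b;
}

int main(void)
{
    printf("%d\n", add(2, 3));
    return 0;
}
```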
Mainframe environments are the same way. When I worked at Northwest Airlines, we had both assembler and Fortran source code written as long ago as 1966 that would still compile without issues on our Unisys 2200 boxes, on hardware that had just been released.
There were 40-50 OS releases done in between, maybe more, as well as some fairly huge hardware and OS architecture changes, and yet basic applications still worked. 🙂
Indeed, the author doesn’t know … about how Windows internals work.
Like some people said, if you wrote a proper Win16 program 15 years ago, using documented API behaviors, it will still work PERFECTLY today. I did it myself.
Of course, I’m talking about compiled code. Source might need some work, because the Microsoft C/C++ compiler has improved a lot over the past years. Just like any compiler.
However, it’s true that some code won’t work today, mostly very old DOS code. Why? Ask Intel. Thankfully, most 386+ code will run natively, while older code might be emulated perfectly if it didn’t rely on dirty tricks.
Of course, I’m talking about compiled code. Source might need some work, because the Microsoft C/C++ compiler has improved a lot over the past years. Just like any compiler.
Talk about spin… “Might” need some work? No, it will. And it will need work not because the compiler has gotten so much better, but because things have CHANGED.
You windows fans are sounding like the Linux fans. The article is ABOUT UNIX. He barely mentions Linux. He barely mentions Windows. Get back in your cages.