You have to wonder how meaningful this news is in 2024, but macOS 15.0 Sequoia running on either Apple Silicon or Intel processors is now UNIX 03-certified.
The UNIX 03 Product Standard is the mark for systems conforming to Version 3 of the Single UNIX Specification. It is a significantly enhanced version of the UNIX 98 Product Standard. The mandatory enhancements include alignment with ISO/IEC 9899:1999 C Programming Language, IEEE Std 1003.1-2001 and ISO/IEC 9945:2002. This Product Standard includes the following mandatory Product Standards: Internationalized System Calls and Libraries Extended V3, Commands and Utilities V4, C Language V2, and Internationalized Terminal Interfaces.
↫ UNIX 03 page
The questionable usefulness of this news stems from a variety of factors. The UNIX 03 specification hails from the before time of 2002, when UNIX-proper still had some footholds in the market and being a UNIX meant something to the industry. These days, Linux has pretty much taken over the traditional UNIX market, and UNIX certification seems to have all but lost its value. Only one operating system can claim conformance to the latest UNIX specification – AIX is both UNIX V7- and UNIX 03-certified – while macOS and HP-UX are only UNIX 03-certified. OpenServer, UnixWare, and z/OS only conform to even older standards.
On top of all this, being UNIX-certified by The Open Group feels a lot like a pay-to-play scheme, making it unlikely that community efforts like, say, FreeBSD, Debian, or similarly popular server operating systems could ever achieve UNIX certification even if they wanted to. This makes the whole UNIX certification world feel more like the dying vestiges of a job security program than something meaningful for an operating system to aspire to.
In any event, you can now write a program that compiles and runs on all two UNIX 03-certified operating systems, as long as it only uses POSIX APIs.
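To make that concrete, here is a minimal sketch of what “only POSIX APIs” looks like in practice; it restricts itself to interfaces the Single UNIX Specification mandates (getpid(), sysconf(), write()), so it should build unmodified with the system compiler on any of the certified systems. The file name is just a placeholder.

/* posix_hello.c – a minimal, illustrative POSIX-only program.
 * Build with the POSIX c99 utility:  c99 posix_hello.c -o posix_hello
 */
#define _POSIX_C_SOURCE 200112L   /* request the POSIX.1-2001 feature set UNIX 03 is based on */

#include <stdio.h>    /* snprintf() */
#include <unistd.h>   /* getpid(), sysconf(), write(), STDOUT_FILENO */

int main(void)
{
    char buf[128];
    long max_fds = sysconf(_SC_OPEN_MAX);   /* a sysconf() query POSIX requires */
    int len = snprintf(buf, sizeof buf,
                       "hello from pid %ld, _SC_OPEN_MAX = %ld\n",
                       (long)getpid(), max_fds);
    if (len > 0)
        write(STDOUT_FILENO, buf, (size_t)len);   /* POSIX write(2) straight to stdout */
    return 0;
}

Nothing here leans on Linux-, BSD-, or Apple-specific extensions, which is precisely the (rather narrow) kind of portability the certification is supposed to guarantee.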
Interestingly, some Chinese RHEL forks have had UNIX certification in the past. (The certification can even lapse, which means companies have to keep paying to play.)
The fact that, of all things, z/OS, which isn’t even a Unix-like (but has a compatibility environment), has the certification kinda makes a joke out of all of this, though.
While the associated cost might be high, the UNIX principle is one of standards and interoperability.
Linux and the concept of “choice” has moved us away from that.
“Linux has pretty much taken over the traditional UNIX market”
A similar argument is that browsers being W3C-compliant is irrelevant now that Chrome is so dominant.
Standards in computing are important, but sadly new features tend to trump them, especially in the FOSS world.
Unix has always been a mess of incompatible standard revisions, bad code portability, and arbitrary reinventions of the same wheel across all its different commercial and academic versions.
In the end, Linux and GNU/LLVM managed to offer what Unix/POSIX were never really able to. Basically, ./configure and the rest of autotools managed to finally do what POSIX had tried, and failed, to do for decades. And Linux offered a common portable kernel architecture to build systems on top of.
And the market spoke accordingly.
Unix certification is mostly for giggles at this point. It’s basically just Apple, IBM, HP, and SCO paying a yearly fee for a cert that is likely only required by some ancient legacy software/support contracts that still rake in enough money to make the whole nonsense worthwhile.
Xanady Asem,
The lack of standards is the whole reason behind it, but a lot of Linux devs hate autotools: hundreds of checks for archaic behaviors on platforms the code will never be compiled on, performed every time we change a configuration setting, which is all the more frustrating when you’re trying to interactively make changes to test their impact. While Linux is a strong dev platform overall, autotools can be cumbersome. Ideally it’d be fast and just work everywhere, but this isn’t always the case. Even the newer tools seeking to fix this can become the cause of the very problems they sought to fix. Take cmake, for example. It looks good in theory; in practice, however, I’ve seen many build failures arising from cmake incompatibilities, where the version used by the authors is incompatible with the one shipped by the distro…
We’re here because the world revolves around legacy code with bad standards. Cleaner frameworks and standards would be refreshing IMHO.
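To make that complaint concrete: a ./configure run is largely a loop of compiling tiny throwaway probes like the sketch below, one per feature being tested, and redoing all of them whenever the configuration changes. The conftest.c name and the HAVE_MMAP-style macro are just the conventional placeholders here, not output from any particular project.

/* conftest.c – the kind of probe a configure script compiles over and over.
 * If this compiles and links, the build system concludes the platform has
 * <sys/mman.h> and a usable mmap(), and records something like HAVE_MMAP
 * in config.h. Purely illustrative.
 */
#include <sys/mman.h>

int main(void)
{
    /* Map one anonymous page; the check only cares that this compiles and
     * links, not what happens at runtime. */
    void *p = mmap(0, 4096, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    return p == MAP_FAILED;
}

Multiply that by every header, function, and compiler quirk a project cares about, and the slow, repetitive configure experience described above follows naturally.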
The point was just about autotools being an accidental, quick-and-dirty way of actually achieving in practice what a staunch standard had tried, and failed, to do. The supposed Unix code portability was never a real thing.
A lot of the interoperability and standards were developed on Unix, but they were not necessarily Unix itself.
Xanady Asem,
POSIX compatibility is real, but most software doesn’t explicitly target it anymore because it’s ancient. I guess we could say that Linux has become the new ad hoc standard for modern software – especially server side. TBH I feel Linux could benefit from more modernization too.
This is complete nonsense. POSIX compliance and UNIX certification are completely different things. OS developers care about POSIX compliance, and all active Unix-likes aim for it. Only military contracts care about UNIX certification.
I never said POSIX and UNIX cert are the same thing.
You got it backwards in terms of requirements. POSIX is a federal gov contract thing (or was ages ago).
How can an OS like MacOS that refuses to run “unsigned” software by default be UNIX 03-certified? If I have to jump through special hoops to compile a “signed” build, how can that OS be UNIX 03-certified? This is why UNIX certification was always meaningless: it allows just enough wiggle room for UNIXes to be theoretically compatible but actually not.
I haven’t made up my mind about whether OSes should allow the running of unsigned software by default (though I do think the default should be override-able), but my point is, UNIX 03 provides wiggle room for OSes to do both and yet we are supposed to believe it’s a “standard”.
Brew (or indeed any curl-installed software) isn’t notarized using the Apple Dev keys. They are signed in the sense that brew self-signs, if I remember correctly.
So UNIX compatibility is still a thing where you can run the same software in a consistent manner (notwithstanding things like architecture and so on).
The Unix standard for OSX applies mainly to its BSD layer. There are other layers that make up OSX that have nothing to do with Unix, and are thus outside the scope of the certification.
So, you can run CLI apps without notarization with the default MacOS settings? Even ones not downloaded with curl?
I don’t know about that. I was just addressing your concern about OSX being Unix, and that the BSD layer can in fact be certified as Unix. I assume that may imply that most *BSDs out there could be Unix-certified if they had the funds to pay for the process.
It seems that signed apps that run in the shell and Unix are not necessarily mutually exclusive concepts.
> So, you can run CLI apps without notarization with the default MacOS settings?
Yes, you can. Install the Xcode dev tools, fire up a terminal, vi hello.cpp; g++ hello.cpp; ./a.out.
Works just fine.
I came to the Mac back in the G3 days with OSX because of the Unix underpinnings. I appreciated that Apple bundled an X11 server and Java in the OS. Sure, things have changed a lot over the last 20 years, but small(ish) gestures like Apple bothering for UNIX certification aren’t nothing. Even with all of the changes, it remains a reasonable Unix with a great UI.
I am encouraged by how little Apple changed things with the move to Apple Silicon. If they wanted to make things fully locked down, they could. So far, they do not.
Linux looks and works a lot differently than it did 20 years ago. Same for macOS. Operating systems evolve.
I am glad that Apple is still bothering to jump through the hoops of certification every now and again.
benmhall,
I don’t own a mac computer, otherwise I’d test it myself, but if you were to send your binary to someone else wouldn’t it be blocked by default? I’ve run into this issue trying to run 3rd party command line software on a mac at work and wasn’t able to solve it without getting help…
https://apple.stackexchange.com/questions/436674/how-to-unblock-binary-from-use-because-mac-says-it-is-not-from-identified-develo
https://macpaw.com/how-to/unidentified-developer-mac
Obviously there’s still a learning curve even if you are familiar with unix like I was, haha.
Alfman,
I’m running binary packages from pkgsrc, so it’s possible. I think the trick is the perms are tied to iTerm2 and not the individual applications from pkgsrc.
GUI applications need to be signed though.
Yes. I’m running binaries from pkgsrc, and they work fine.
As others have said, it’s outside of scope, but also the kind of people that would want Unix certification probably care that the OS is locked down. That’s probably a good thing for them. I don’t like it as a user of a personal machine. But wait, Apple doesn’t have a server form factor… why does anyone want it to be Unix compatible?
The Mac Pro is available in rackmount configuration, yes even the new one with Apple Silicon:
https://www.apple.com/us/shop/buy-mac/mac-pro/rack
Although the most common use case is as a video/audio editing or compression server, you can also use it as a web server if you want.
Unix under the hood has been a selling point since NeXTStep became OS X, and it works rather well. I prefer a BSD or Linux distro, but macOS is a good middle ground. It’s *nix enough, and the GUI doesn’t break very often.
I’m of the opinion they shouldn’t allow unsigned software by default, but I’m fighting against people who do dumb things. Including myself.
This is probably more like a “why not?” thing for Apple. The cost of the certification is likely peanuts for Apple, and they can mention the certification on the Mac OS section of their website, even though it probably doesn’t have much marketing value.
It’s at least /somewhat/ remarkable how far they’ve come. NeXTSTEP had a lot of gaps in its original coverage of 4.2BSD, and even in the OS X days programmers had to add quite a few workarounds to deal with POSIX deficiencies.
They are also the one desktop Unix vendor left standing, so that’s something.
> The Open Group feels a lot like a pay-to-play scheme
I feel like The Open Group has been a pay-to-play scheme since the early ’90s. I love the big iron Unices, but have been using Ubuntu xx.xx LTS at home and at work for years. Sure do miss IRIX though. The Italian sports car of Unices.
The Open Group and their Single UNIX Specification was always a way for the different Unix vendors to feign source-code compatibility among their Unixes while allowing just enough wiggle room to make sure that each vendor’s Unix was different enough to have its own software ecosystem. It was a way to pretend “UNIX” was one homogenous thing, as a reaction to Windows NT which was actually one homogenous thing and worked everywhere the same.
This is what happens when you try to port source code from one Unix to another, complete with the OS outright lying about supported features to gain certification:
https://www.youtube.com/watch?v=XzhCGSE7KKw
Indeed. Traditional Unix vendors have always been extremely dissonant.
It ended up as a bizarre collective attempt at offering “same but different” products, to the point that all those commercial Unixes were, for all intents and purposes, pretty much different systems.
Old tech tends to be fetishized sometimes. RISC and Unix especially ;-). It is not as romantic when you realize that most of those commercial Unixen were basically quick and dirty ports, and the only reason for selecting Unix was that the source code was available and licensing it was cheaper than building an OS entirely from scratch.
Once NT and Linux reached a certain level of scalability, there was close to zero value proposition for all those commercial Unix variants, except for some very specific use cases and legacy applications.
Both NT and Linux established de facto standards, even if in unintentional/unintended ways.
Once upon a time, government contracts for computers required them to be UNIX certified.
This is all about RFP business (Request for Proposal). Companies and governments will put out a call for suppliers to bid on upcoming contracts. A set of minimum requirements will be listed that bids must meet in order to be considered. If all criteria are met, it basically comes down to price. The list will be filtered down to at most 2-3 (probably 1-2) candidates using this method, and then a deeper evaluation will occur.
One of the criteria will be UNIX certification. If you don’t have it, you don’t play. That simple. It may not even be used in the final solution. It is just a hoop to jump through that the procurement team will enforce. Sometimes, these requirements are designed so that only a single supplier ( the one somebody wants ) can meet all the criteria.
Windows NT had a POSIX subsystem for similar reasons. It allowed them to win US government contracts.
In this case, it’s keeping Apple from ripping out the parts of macOS that I really enjoy.
The modern version of this is RHEL compatibility.
It doesn’t really matter any more for them. They’re fairly niche operating systems with a long tail at this point.
macOS, and maybe AIX, are the only relevant OSes in the list provided, so it doesn’t make a lot of sense to put money into products which are riding out the life of the hardware they’re installed on. Or the life of the last COBOL programmer in the case of z/OS, and the amount of money businesses are willing to spend on lawyers in the case of HP-UX.
The POSIX APIs got us a long way. They sustained the early FOSS OSes by allowing software to be fairly portable, and POSIX not evolving fast enough has been a problem.
It’s probably time for FOSSNIX and FOBJIX projects to build cross-platform APIs and a test suite. LOL