Two interesting articles on OS X today. This one summarizes some of the less-obvious new features of Apple’s Tiger. The other one theorizes that Apple’s shift to Intel is an incentive for Windows developers to port their software due to the lack of the endian problem.
Somehow, I doubt the reason more apps aren’t ported to OS X is “the endian problem”. It might be the major differences between the Win32 API and Cocoa that are the reason. Or maybe it’s the differences between C++/C# and ObjC (I hate ObjC, btw). Or maybe it’s the fact that many Windows developers use DirectX components instead of SDL/OpenGL/etc. Or perhaps it’s just that they’re all too lazy.
You “hate” ObjectiveC? That’s interesting. Most people aren’t so shallow as to find a computer language so loathsome that they “hate” it… especially one so well thought out as ObjC. Then again, a lot of programmers I know prefer to avoid Java and ObjectiveC because they just can’t wrap their minds around anything non-procedural, but I digress.
As for your API points – very few apps actually use DirectX or OGL (games) when compared to non DirectX, straight ahead applications…
As for Win32 vs. Cocoa, you can write C++/C in Cocoa. You can even access CoreFoundation objects directly, as well as the Carbon APIs. If you really can’t wrap your head around any ObjC at all, you can even skip Cocoa altogether and just write to Carbon… you never even have to see ObjC code.
Why, oh why, do people feel the need to pontificate on something when they only know a little tiny bit about it?
Even if it is possible to write C/C++ in Cocoa, most of the documentation is based on ObjC. Even if the concepts are the same, the language differences are still a hurdle you will have to overcome in order to port your application to OSX. Even though any good programmer will be able to master several programming languages, it is still a nuisance to write in different languages at the same time.
No matter what problems different languages cause, endianness should not be that big a problem.
No, you cannot access Cocoa objects (nor Obj-C objects for that matter) in C/C++. Objective-C++ is meant for an Objective-C program to access C++ class objects. Meaning, if you already designed a class in C++, you can use it in your Obj-C program. But not the other way around.
The other thing about Apple’s graphical desktop APIs is that the graphics coordinate system starts at the lower left corner, just like the cartesian plane. In Windows, however, the origin is in the upper left corner. Without getting into which one makes more sense, suffice to say this is something to account for when designing GUI apps. I’ve personally worked on a few projects where all GUI elements were created on the fly, for a dynamic GUI. First of all, if I were to port such apps to Carbon/Cocoa, those measurements would have to be adjusted; secondly, Cocoa uses floats in all inputs regarding coordinates, so that would have to be considered as well.
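A rough sketch of the kind of adjustment I mean (every name here is made up for illustration, and it assumes the usual unflipped Cocoa view):

#import <Cocoa/Cocoa.h>

// Hypothetical helper: convert a top-left-origin (Windows-style) rectangle
// into Cocoa's bottom-left-origin coordinates. parentHeight is the height
// of the enclosing view or window.
static NSRect RectFromTopLeft(float x, float y, float width, float height, float parentHeight)
{
    // x is unchanged; only the y axis flips. Cocoa rects take floats.
    return NSMakeRect(x, parentHeight - y - height, width, height);
}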
You can create and manipulate Objective-C objects from other languages by means of the Objective-C runtime API which has a C interface.
For information regarding the NeXT runtime interface you can look at:
http://developer.apple.com/documentation/Cocoa/Reference/ObjCRuntim…
And for the GNU runtime interface you can peruse objc-api.h and objc.h
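As a minimal sketch of what that C interface looks like (this assumes the NeXT/Apple runtime header and a program linked against Foundation; the class and selectors are just an illustration):

#import <objc/objc-runtime.h>

// Create and release a Foundation object purely through the C runtime API,
// with no Objective-C syntax in the source file at all.
int main(void)
{
    id cls = (id)objc_getClass("NSObject");
    id obj = objc_msgSend(objc_msgSend(cls, sel_registerName("alloc")),
                          sel_registerName("init"));
    // ... send whatever other messages the program needs ...
    objc_msgSend(obj, sel_registerName("release"));
    return 0;
}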
The API difference between Win32 and Cocoa is a much bigger impediment than endianness. However I wouldn’t be too surprised to see Apple fork off a version of WINE sometime soon, Cedega style. People won’t want to buy Apples to run Windows software, but it’s free so it wouldn’t be such a barrier to switching either.
Michael
CodeWeavers has already announced that they will be selling CrossOver Office for Intel Mac OS X machines.
Here is the press release: http://www.codeweavers.com/about/general/press/?id=20050622 .
I just hope that with this we’re not starting another WINE-on-Linux tragedy, which happens a lot in Linux-land: people install Linux, then WINE, and then start using ALL their Windows programs under WINE. And when they get tired because some programs don’t work, they go back to Windows and say that Linux ‘is not ready’, because it doesn’t run all their Windows software, instead of trying to use Linux programs for the job…
Why peel off a version? Why not work with Mainsoft, who actually HAVE the Windows code, and bring their software across; sure, it isn’t the complete code, but for 90% of cases, it’s a tweak and compile away – it doesn’t work like WINE, in that it isn’t a virtual machine, but it’s an API which simply requires a recompile.
Oh, and the MFC/widgets used in it can be themed so that it fits into the rest of the desktop.
No one is going to want to recompile all of their Windows apps with some random Mainsoft environment in order to gain a few Apple sales. The only tractable solution there is if you can just run the Windows software as-is without thinking about anything other than how ugly it is.
Michael
Cocoa = ObjC – they are not different.
I think that porting apps is a combination of issues including:
1. language used
2. APIs used
3. various technologies used
APIs make programmers’ lives easier – but make porting harder depending on the APIs used. Language is an issue – if you use C# you will need to do a lot more editing to your code than if you had just used C++ to begin with. Finally, DirectX is a MAJOR thing when it comes to games.
I hope that either Apple makes a VM environment to fully take advantage of their hardware without major performance hits – or the open source community does this – just run darned Windows in a window and run your game or app, no need for porting.
Cocoa is an API. Objective C is a language. Objective C can be used in other operating systems for other purposes, with other APIs (such as the GNUStep API). Cocoa can be used with languages other than Objective-C. For example, Cocoa-Ruby lets you write Aquafied apps using the full Cocoa libraries (and handy nib files) from the comfort of the Ruby language.
See:
http://en.wikipedia.org/wiki/Cocoa_%28API%29
http://en.wikipedia.org/wiki/Objective-C
Yes, they are.
Cocoa is the API stack.
ObjC is the language.
What you’ve said is tantamount to saying “C++ = MFC, they are not different”. That would be retarded, much like saying there is no difference between ObjC and Cocoa.
Cocoa Overview
Cocoa is an object-oriented application environment designed specifically for developing Mac OS X-only native applications. The Cocoa frameworks include a complete set of classes, and for developers starting new Mac OS X-only projects, Cocoa provides the fastest way to full-featured, extensible, and maintainable applications. You can bring applications from UNIX and other platforms to Mac OS X quickly by using Cocoa to build state-of-the-art Aqua user interfaces while retaining most existing core code.
Cocoa is one of the application environments of Mac OS X and a peer to Carbon and Java. It consists of a suite of object-oriented software libraries and a runtime engine, and shares an integrated development environment with the other application environments. You can write Cocoa applications in either Objective-C or Objective-C++ (there are Java bindings as well) but you can also call Carbon C functions.
Cocoa provides a basic application framework for event-driven behavior and for application, window, and workspace management. In most cases, you won’t have to handle events directly or send any drawing commands to a rendering library. In addition, Cocoa offers a rich collection of ready-made objects to add to the interface of your application. Most of these objects are available in Interface Builder, Apple’s user design tool. You can simply drag an object from an Interface Builder palette onto your interface’s surface, configure its attributes, and connect it to other objects. And of course, you can always instantiate, configure, and connect these objects programmatically. To support user interfaces, Cocoa includes various technologies, including those that promote accessibility, perform validation, and facilitate the connections between objects in the user interface and custom objects.
Both Apple and third-party vendors are continually releasing Cocoa frameworks to support the most advanced features. The core Cocoa frameworks are Foundation and Application Kit, which contain the classes needed by applications and other tools. The Foundation framework defines a base layer of classes that can be used for any type of Cocoa program, but are used primarily for creating applications that don’t need a user interface, such as command-line tools and Internet servers.
The Application Kit framework contains all the objects you need to implement your graphical, event-driven user interface, including windows, dialogs, buttons, menus, scrollers, and text fields. The Application Kit simplifies your work as it efficiently draws on the screen, communicates with hardware devices and screen buffers, and clears areas of the screen before drawing.
When coupled with Interface Builder, Cocoa helps you create fully functional, object-oriented applications on Mac OS X in a fraction of the time you would need using procedural languages. The Foundation and Application Kit frameworks and Cocoa’s infrastructure take care of the details for you, so you can concentrate on features.
If you are ready to begin learning about the APIs and tools available on Mac OS X for Cocoa, go to Getting Started With Cocoa, for a guided introduction and learning path.
Getting Started with Cocoa:
http://developer.apple.com/referencelibrary/GettingStarted/GS_Cocoa…
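And as a minimal sketch of the kind of non-GUI, Foundation-only tool the overview mentions (the strings here are arbitrary):

#import <Foundation/Foundation.h>

// A bare-bones command-line tool using only the Foundation framework,
// no Application Kit and no window. Build with something like:
//   gcc tool.m -framework Foundation
int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSArray *frameworks = [NSArray arrayWithObjects:@"Foundation", @"Application Kit", nil];
    NSLog(@"Core Cocoa frameworks: %@", [frameworks componentsJoinedByString:@" and "]);
    [pool release];
    return 0;
}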
Porting Multithread Applications from Win32 to OS X
http://developer.apple.com/macosx/multithreadedprogramming.html
Objective-C is able to use C code directly without change. I think that’s why MS is pushing C# – it makes it harder to port the code to OSX. With Windows projects done in C, most of the code can remain unchanged. You just add a little ObjC to handle the system stuff. With C#, most of the code must be completely rewritten.
Except not really, since every single bit of Win32 code in the C program would need to be rewritten. Every bit of network code, every bit of GUI code, you name it. As a matter of fact, it’s probably easier porting C++/C# programs to Objective-C (although by no means is it easy – Smalltalk object syntax is quite different from C++ object syntax). Besides, who writes/wrote Windows programs in C? I thought it had been C++ or greater for a long time now, since before OS X.
A good programmer knows how to separate the main program from the little bits that tie into the OS. And in the case of the network code, the OS provides a robust networking system that a developer can easily tie into. If there is stuff in the app that relies on the network code, then you need to create an interface in your program that will allow you to tie in a different network API whenever you want without having to rewrite the entire program.
“The Endian Problem” is a bigger problem than some people think. But at the same time, it’s not the biggest problem I think many developers had in the past. The biggest advantage in my mind will be the ability to dual-boot Windows and OS X on the same Mac system, which will allow a developer to develop applications for both platforms using the same hardware. Right now, if a developer wanted to do that it would require significant hardware investments in both a PC and a separate Mac. Now the developer can just have a Mac.
I understand about big/little endian, but what problems stem from standardizing a platform on big vs. little endian, or vice versa?
The problem is simply that sometimes code is written which assumes one byte ordering over the other and when you try to port this kind of code, those sorts of errors aren’t flagged by the compiler (it has no way of knowing, really). Which means you basically have to hunt them down by hand which can be horribly time consuming and error prone.
I imagine this kind of thing affects games more than any other kind of software.
It isn’t a problem with one or the other; it is simply that the PowerPC is big endian and x86 is little endian. When you have a 32-bit data word (4 bytes), you might be tempted to access or swap some of the bytes directly as a shortcut. The bytes sit in a different order on PPC vs. x86 even though the number stored in the data word is the same when seen at a higher level. If you poke at the bytes on the PPC the same way you did on the x86, the PPC is going to give you a very different (and very wrong) answer in the end.
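A tiny sketch of what that looks like in practice (the constant is arbitrary):

#include <stdio.h>
#include <stdint.h>

// Print the in-memory byte order of a 32-bit value. On little-endian x86
// this prints "0D 0C 0B 0A"; on big-endian PowerPC it prints "0A 0B 0C 0D".
// Any code that indexes into those bytes directly gets different answers
// on the two CPUs, and the compiler can't warn you about it.
int main(void)
{
    uint32_t value = 0x0A0B0C0D;
    unsigned char *bytes = (unsigned char *)&value;
    int i;
    for (i = 0; i < 4; i++)
        printf("%02X ", bytes[i]);
    printf("\n");
    return 0;
}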
for the programming API’s being dictated by the apps makers and not by the OS makers
You mean like qt, wx, swing, swt, etc.? It isn’t that there aren’t alternatives. It is that the alternatives usually aren’t as good if you are going native only.
I don’t know what you are smoking, but qt, gtk, wx, and others are very nice to work with. Further, I would rather write to qt, gtk, wx, etc., if only because dealing with MFC and Cocoa can be a pain in the ass, and because one can get cross-platform without having to write multiple frontends.
Sorry, but MFC might be a pain in the ass to work with, Cocoa is absolutely NOT. Portability is the only reason to write a Mac program in those other tool kits.
People like me have Zero interest in running Windows on my Mac. In fact I don’t use any MS software on my Mac. The fact that Apple is going Intel will make zero difference for me as I’m no more likely to run Windows or MS programs.
Now, the chances of me running BeOS, OS/2 (eComStation) and other OSs jumps way up. But Windows? I have no use for it or apps for it.
It’s surprising how many people around here are spouting off about Cocoa, yet clearly know little about it.
Cocoa is a collection of two frameworks, Foundation Kit and Application Kit. Application Kit often is what people actually mean when they say “Cocoa,” as it contains all of the magic necessary for making GUIs.
Cocoa is implemented in Objective-C, and is heavily influenced by its philosophies and attributes–most particularly its dynamism. But they clearly are not the same, as, out-of-the-box Mac OS X provides a Java binding, and third-party developers have provided packages for Cocoa development in Ruby, Python, and C# (and there probably are others).
Application Kit strongly differs from every other GUI API in just about every way imaginable, to the point that experience in other GUI APIs makes things more painful while learning it. Application Kit has been lauded ever since it was introduced in NeXTSTEP, particularly for the significant increase in productivity it provides. Regardless, it remains unique.
To demonstrate the paradigm shift in moving from any GUI API to Application Kit, consider these analogies:
Win32/MFC/WinForms : GTK+/Qt/Swing :: Java : C#
Win32/MFC/WinForms : Application Kit :: C : Smalltalk
In other words, most would consider the move from Java to C# to be relatively trivial, yet the move from C to Smalltalk would be tremendous because of the huge difference in paradigms. It’s the same with Application Kit, and *that’s* a big part of what makes developers shy away from native Mac ports.
As for Objective-C, I happen to believe that it’s a great compromise between performance and flexibility, and adds painless backwards compatibility with C. Only the likes of Smalltalk and Ruby can compete in the flexibility arena, and they perform an order of magnitude slower.
Although all of Cocoa is accessible with Java, Java simply is too static to allow some of the magic that Cocoa makes so special.
Nobody would consider the move from C# to Java for an existing code base trivial. Rewriting an app from one language to another is virtually never trivial.
Further, I’d love to see some runtime benchmarks to back up this ridiculous claim that Smalltalk and Ruby are an order of magnitude slower. Do you actually know what that means? Do you think that something which takes 1 second in C takes 10 in ruby?
The penalty you might see from switching from a strictly procedural language to something with objects would be the so-called abstraction penalty, and even that is pretty minor these days. There’s also interpreter startup time, but over the course of a long running app (say 10 minutes) the 10-20 seconds of startup required by an ENORMOUS interpreted app is pretty minimal.
Objective-C isn’t a bad language, however everyone ignores its fatal flaw (shared with C and C++), which is a complete lack of garbage collection. GC is what makes Java, Ruby, Perl, Python, etc so much more pleasant to program in. Particularly for large projects. When it becomes hard to leak memory instead of hard not to you’ve really found something worth upgrading to.
Objective-C isn’t a bad language, however everyone ignores its fatal flaw (shared with C and C++), which is a complete lack of garbage collection
That is not *entirely* true; despite the fact that ObjC doesn’t have the “pleasant” GC that will handle generation of objects and will automatically detect when an object is no longer referenced and/or needed, in ObjC you have a little bit more control than in plain C or C++. I am not an expert in ObjC -as I am learning it- but it is not like C/C++ where a mis-calculated pointer will leak memory at once (or create unexpected fatal results).
From what I’ve read so far, ObjC is a little bit “better” in terms of memory management than C/C++; but I may be wrong. ObjC looks bad (different) the first time you see it, but once you understand its messaging system, you just can’t go back.
Objective-C has no intrinsic advantages over C with regard to memory management. You have no additional control and memory management isn’t any easier. Frameworks can be developed (like NeXT’s Foundation Kit) with classes for managing object lifetimes. NeXT for instance created NSAutoreleasePool for delegating object lifetime to the lifetime of an instance of NSAutoreleasePool. This doesn’t change matters for resources not descended from NSObject, and it still requires explicit management for correctness.
C++ probably has the most sophisticated/complicated/flexible language-level resource management of the three languages you mention.
” Nobody would consider the move from C# to Java for an existing code base trivial. Rewriting an app from one language to another is virtually never trivial. ”
I think it is quite easy to convert C# code to Java and vice versa. C#->Java is not applied much, because it is not needed. Java->C# is sometimes tricky because many Java libraries do not have counterparts in C#.
But on this subject, I am a little confused why people are talking about C# anyway. C# and .NET are not used in Mac OS environments at all (I mean real-world applications); Java is there and developed and supported by Apple itself.
If C# turns into a problem you could always go monkey about it (monkey = Mono). I’m learning Java right now, but it pisses me off that many local employers require knowledge of MS-specific programming.
ForeverChat
#OSx86
“Nobody would consider the move from C# to Java for an existing code base trivial. Rewriting an app from one language to another is virtually never trivial.”
In an absolute sense, you’re right. But /relatively/ it’s trivial, compared with the other scenario I proposed.
“Further, I’d love to see some runtime benchmarks to back up this ridiculous claim that Smalltalk and Ruby are an order of magnitude slower. Do you actually know what that means? Do you think that something which takes 1 second in C takes 10 in ruby?”
There are plenty of “language shootouts” out there with benchmarks showing languages like Ruby and Smalltalk to be at the bottom of the pack in terms of performance. Needless to say, this doesn’t make those languages bad in any sense–I use and love Ruby myself, in fact. But it is an issue that deserves some degree of consideration when comparing languages with similar features.
Let’s face it, if the dominant market share was Mac OS X, then everything we see now would be reversed, except exploits of course.
So given that, what would a mythical WinPPC do? Bring their hardware to the x86 code base, make it easier for x86 developers to code, and reduce the time involved so that it becomes a justifiable business expense.
That’s what is going to happen here with the new MacTels. Programmers can discuss specifics until they are blue in the face, but Apple is going to make it very attractive business-wise to code for Mac OS X. They are most likely going to make it easier and faster as well, which will bring more software titles to Mac OS X, which will in turn sell more hardware, because people would get what they get on WinTel with a much improved operating system.
Well, actually I don’t believe that the so-called endian problem could have been a very big problem that prevented Windows developers from compiling for Mac OS. After all, we have compilers for that, and there is a VERY limited set of software that would need to care about it manually (games, OS-specific code and a few others).
Apple is probably confident that they have good software in OS X and that they could gain market share by competing on the very same ground as the dominant x86 market. They probably thought that IBM and the PowerPC didn’t have such large advantages (if any, I’m not a benchmark expert…) as to drive many users to switch, especially because the x86 market is so competitive that users always get a fair price for their HW.
But I don’t see any war coming between MS and Apple. I think Intel, MS, and Apple have drawn up some kind of agreement. I can’t see MS agreeing to continue developing Office for a competitor which is threatening to steal market share…
I think that it would be more than reasonable to keep working on Office for Mac even if Apple managed to start to overthrow MS’s OS market share. MS makes a large percentage of their money from Office. So even if they started losing OS sales they would still have the most used office suite to make them lots of money. And having that suite built for a very popular OS would still be important.
Let’s face it though, I am no fan of MS, only use it when I have to, but Apple, *nixes, and so on have a very steep hill to climb. This is mainly due to the so-called “mind share.” Even though most people dislike Windows, due to its problems, people still use it because they really don’t know about their other options.
Is endian order a problem for anyone who uses any language above Assembler? Or maybe also for compiler writers or coders of that sort of low-level tool?
I can’t imagine any normal application being written in a way that cares.
Endianness is an issue for binary file formats and application-level network protocols (the transport level and below are standardized on big endian courtesy of TCP/IP).
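For the protocol case, the usual fix is the htonl/ntohl family from the sockets API; a minimal sketch (the length field is made up for illustration):

#include <arpa/inet.h>
#include <stdint.h>

// Convert a length field to big-endian "network order" before it goes on the
// wire, and back to host order on the receiving end. The same source compiles
// and behaves identically on PowerPC and x86.
uint32_t encode_length(uint32_t host_length)
{
    return htonl(host_length);
}

uint32_t decode_length(uint32_t wire_length)
{
    return ntohl(wire_length);
}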
It’s a pesky problem, but not the kind of thing that in itself would prevent a shop from porting from one platform to another if they thought they could make a buck. This is why you hire contractors for 3- and 6-month contracts.
Paul G
The incentive is, developers will not have to deal with endian issues.
The disincentive is, developers will tell Mac users to just use Virtual PC.
I’m not sure which will win out. But I would guess that the “just use Virtual PC” response will prevail…
is VMWare for Mac. You just know they’re working on it.
Well, Cocoa is a Framework/API and a way of living. Once you understand it, it leaves .NET in the dust.
But it’s a little bit harder to fully understand.
True is the phrase that reads: “Once you understand Cocoa+ObjectiveC, you will ask yourself, why wasn’t this being done from the beginning like this?”
😉
And I work with .NET every single day.
The root class NSObject, upon which all other classes are normally built, has built-in reference counting. Thus:
MYObject *myObj = [[MYObject alloc] init];
Will allocate the memory for my object; once you’re done using it you just do a [myObj release]. If you passed the pointer to myObj to any procedure and that procedure wanted to keep a reference to the object, it would also do something like:
MYObject *refObj = [myObj retain];
Whenever you are done with an object, you just issue the “release” message; once the reference count for an object reaches zero, the memory is released.
This is a very common way automatic GC is implemented for other languages, except that the runtime engine tries to determine when you no longer need the object (it mostly adds the “retain” and “release” calls behind your back).
Please note that most objects are created as “autorelease”, which will get flushed whenever the current memory pool is dropped. You can create a memory pool whenever you want (mostly at places where major processing is going to happen, like just before printing a report. Dropping the pool after the report is printed will free all the temporary objects created during the report processing).
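Putting that together, a minimal sketch of the idiom (the strings here are arbitrary):

#import <Foundation/Foundation.h>

// Manual retain/release plus an autorelease pool: the pre-garbage-collection
// Cocoa idiom described above.
int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSString *kept = [[NSString alloc] initWithString:@"report data"]; // we own this (alloc)
    NSString *temp = [NSString stringWithFormat:@"processed: %@", kept]; // autoreleased

    NSLog(@"%@", temp);

    [kept release]; // balance the alloc
    [pool release]; // flushing the pool releases 'temp' and anything else autoreleased
    return 0;
}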
“But on this subject, i am a little confused why people are talking about C# anyway?”
It was just an example of a language that’s similar to the other language used in that example–Java. Yes, there are differences, but both languages share the same philosophy, paradigm, and basic look.
If I’m not mistaken, there’s no force stopping anyone from using garbage collection (as it’s currently understood) with Objective-C. And if a developer simply can’t stand Objective-C, Ruby is an excellent alternative. Personally, I like Objective-C’s adoption of Smalltalk’s split message names; I find “[target loadDataRepresentation:foo ofType:bar]” to be much easier to follow (and practically self-documenting) than the more common “target.loadDataRepresentationOfType(foo, bar).”
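As a sketch of how that reads in an interface declaration (the class here is hypothetical, loosely modeled on NSDocument):

#import <Foundation/Foundation.h>

// Each argument gets its own labelled piece of the selector, so the call site
// reads almost like a sentence:
//   [document loadDataRepresentation:fileData ofType:@"rtf"];
@interface MyDocument : NSObject
- (BOOL)loadDataRepresentation:(NSData *)data ofType:(NSString *)type;
@end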
By switching to the i386, Apple might be able to create an ABI layer that reads in most of the code and translates it to a different location in memory, so it can run Windows binaries at a much higher speed than traditional emulation.
http://www.freebsd.org/doc/en_US.ISO8859-1/books/handbook/linuxemu-…
“Just use Virtual PC” is the easy answer that software shops are certainly going to use, but if I wanted to use a Windows box I would have bought a Windows box [or have one of the wonderboys rip the motherboard out of a coke vending machine and port a demon spawn of Ubuntu married to SUSE 9.3 onto it, put the whole thing in a mess tin and charge $129 USD for it].
I really hope there will be more apps available for OS X because that will really strengthen the platform.
I’m intrigued by the writer’s apparent assertion that it will be easier to write viruses or malware for the Mac, using vulnerabilities in the Intel architecture.
Is there any validity to this point?
Objective-C doesn’t have “Smalltalk object syntax.” It doesn’t really have much in common with Smalltalk in terms of syntax. It has named parameters in message selectors, and that is about as similar as the syntax gets.
The biggest hurdle for developers was: why spend tons of cash to develop for 10% of the PC market? Switching to Intel CPUs makes the Mac cheaper and more attractive to the broader market. As Apple’s market share increases, not only will new technologies emerge to ease development on MacTel, more developers will jump on board. People like myself who primarily use Windows for gaming will jump at the chance to game on a better platform.
AFAIK OS X won’t run on just any x86 PC, only those produced by Apple.
If you don’t own a Mac and want to build Mac software, you can always use OpenDarwin as your development platform (since it uses the same kernel but without the proprietary components), or get a MacTel and dual boot…
Well, Cocoa is a Framework/API and a way of living. Once you understand it, it leaves .NET in the dust.
1. Cocoa and .NET are completely different things.
2. Is Cocoa platform independent? Last time I checked it was OSX only; .NET (Mono) is cross-platform.
3. Cocoa is an application framework (written in ObjC) and .NET is a platform.
But it’s a little bit more harder to fully understand.
Yep, and you’re a real example. Time to start learning the meaning of basic computer terms.
True is the phrase that reads: “Once you understand Cocoa+ObjectiveC, you will ask yourself, why wasn’t this being done from the beginning like this?”
Ok, I understand Cocoa and ObjC. But I still don’t ask myself that.
Still, I wonder why everybody thinks that moving to Intel will bring developers. The GUI APIs for Windows and OSX are completely different. Even the basics of the user interface are different. A simple coder would be better off starting to code from scratch than correcting the previous sources (except if most of the software was not GUI based). Unless someone brings a complete set of Windows libraries that enables you to recompile your XYZ app, there just isn’t reason enough to start coding for a mere 1% of people.
And I work with .NET every single day.
And you should keep on doing that, at least if you don’t want to become platform dependent. But, to be fair, MS .NET is platform dependent too. Mono isn’t.
p.s. And I’m Linux biased (and OSX somehow forced); in fact the only operating system that doesn’t run at my place (except occasionally) is Windows.
“Objective-C doesn’t have “Smalltalk object syntax.” It doesn’t really have much in common with Smalltalk in terms of syntax. It has named parameters in message selectors, and that is about as similar as the syntax gets.”
Just the selector syntax is closer than most languages get. Many readers here need to stop thinking in such absolute terms.
“I don’t know what you are smoking, but qt, gtk, wx, and others are very nice to work with. Further, I would rather write to qt, gtk, wx, etc., if only because dealing with MFC and Cocoa can be a pain in the ass, and because one can get cross-platform without having to write multiple frontends.”
Cross-platform GUI APIs aren’t necessarily a Good Thing. They certainly make developers’ lives easier, yes, but they do not create especially rich applications. Although widgets may be drawn natively, that’s not enough to feel native; each GUI has a different feel and different philosophy, and those differences can’t be captured by a one-size-fits-all solution. Aspects like the rich, pervasive drag-and-drop, document-modal dialogs, standardized preferences system, etc. that make Mac OS X special get lost when using such APIs. To me, it’ll be a very sad day when developers stop making platform-specific GUIs for their apps.
Objective-C doesn’t look anything like Smalltalk, so vaguely referring to syntactic elements as “Smalltalk object syntax” is just wrong. Completely wrong.
For those of you who do not know, there are two languages that I know of that allow you to write cross-platform code without changing a single line of code. All you have to do is compile the same source code on the target platform:
Real Basic: http://www.realbasic.com
Pure Basic: http://www.purebasic.com
While the top one produces pure bloat, the bottom one is pure speed and small sizes. However, it takes care of the little endian problem *some* developers have.
Objective-C++ (supported since Jaguar, I think) applies the same Objective-C extensions to C++. This gives you the best of all worlds.
Cocoa is as cross platform as C#
Mono is the open source copy of .NET
GNUStep is the open source copy of cocoa
You can compile Obj-C (and I think Obj-C++) on Windows and can even download the Core Foundation framework from Apple (assuming you find GNUStep incomplete). What you can’t get from Apple is a Windows version of the AppKit. This is especially sad since OpenStep Enterprise (this became Cocoa) had this. I believe that Apple maintains it; I suspect they simply compile iTunes for Windows using it.
Until Apple releases it, use GNUStep to go cross-platform.
GNUStep is an open source implementation of OpenStep, though they do provide compatibility with changes Apple makes. Cocoa contains components with no GNUStep implementation, forms built with Apple tools need to be converted, and interoperability for distributed objects is a potential problem.
C# is just a programming language. Between Rotor, Portable.NET, and Mono, C# code can be compiled and executed on several platforms. .NET as a whole on the other hand is a different story in terms of maturity and robustness.
I really wouldn’t consider either platform seriously if I wanted to develop usable cross-platform software that end-users would find pleasant.
GNUStep won’t cut it though. It has a Cygwin dependency – at least according to the FAQ.
Too bad that Apple didn’t continue to make OpenStep available for Windows.
But there’s no way that GNUStep can be considered for cross-platform work.
Mono is a much more legitimate crossplatform framework than GNUStep.