It’s been two months since Steve Jobs announced that Apple Computer was going with Intel processors, and in that time Apple’s transition toolkit has made its way into the hands of the Macintosh cognoscenti. It has become clear that this transition will likely be the easiest Apple developers have yet experienced.
I wonder how much optimisation this is going to take.
The test machine is not what the finished product will be like, but it seems to me that optimising the code is going to take quite a bit of time.
And here I was, assuming that there would be dual-core 3.6GHz machines out that would just eat up all the overhead.
I know Rosetta is going to run my legacy apps, but I’m not looking forward to having to swap out all my apps for native x86… and I’m sure there are apps that won’t be built [or at least not immediately] for x86.
It’s a conundrum. Do I buy now or do I wait?
The positive feedback from the developer side means that in all probability there won’t be a serious problem developing for the platform. Apple seems to have prepared well for the transition.
I so hope they don’t put “Intel Inside” on it. I can do without it. An Apple logo is more than enough for me.
http://www.spiegel.de/img/0,1020,477662,00.jpg
Wouldn’t that be a good compromise? 😀
You know how to drive a stake through the heart, I’ll give you that.
If they do it like that, I’m painting over it. That is never going to sit on MY desk.
I can live with Intel if I have to. However, I do NOT want to run the risk of Intel exploits. One gets used to not worrying about virii or other nasties.
And it would be good if it brought on board more apps that currently aren’t considered viable for porting to Apple [I realise it’s still going to be all Apple, but it might persuade companies to port to the Mac, seeing as a lot of the underlying architecture should be the same].
Sure it’s going to be easy, when most PPC code can be, and was, compiled to run on the dev Mactels in just a few hours.
There’s a whole bunch of programs that are already MacTel-ready. The ones that used CodeWarrior, namely the Adobe and M$ programs, will have to be redone. But they’re all due for an upgrade anyway, and brand-new universal binaries will keep the PPC majority happy for many years.
I wouldn’t count on an Intel sticker on the front of a Mac, ever. Steve Jobs would rather drop dead first.
Here’s all you need to know about the transition: it’s going to be painless and seamless. 99% of Mac users won’t even know the difference.
http://appleintelfaq.com/
To be honest I think this is the way of the future – where the underlying hardware becomes an irrelevance and translation is done dynamically as required.
If the performance leap from the Intel switch is as high as suggested – up to 3.6GHz processors – then you’ll probably find most PPC apps run almost as fast as they would on, say, a 1.25GHz Mac Mini anyway.
The transition from mc68k to PPC was for the better. Motorola was unable to push the 68k series any further (modulo the great ’060), so this was the obvious choice. Now they take a great RISC CPU and replace it with a piece of shit made by Intel, which wastes transistors on 16-bit compatibility with stupid piece-of-shit software made by Microsoft in the ’80s.
“The transition from mc68k to PPC was for the better. Motorola was unable to push the 68k series any further (modulo the great ‘060) so this was the obvious choice.”
What do you mean, modulo? How can you say they were unable to push it any further and just ignore the latest and greatest effort?
And there’s no reason why the 68k could not have been implemented in a superscalar out-of-order core like the Pentium Pro and its successors all the way to the new Merom. In actual fact it would have been somewhat easier given the 68k’s cleaner and more orthogonal design. The instruction set was also well-prepared for vector and 64-bit extensions.
The supposed advantages of RISC have largely diminished if not turned into disadvantages anyway. Apple fell for the early-nineties RISC hype and thereby forced Motorola to join them in the PowerPC camp.
Not only did the PowerPC switch cost Apple a lot of customers, it also killed off other 68k-based systems like the Amiga or NeXT that could have helped provide the volume necessary to fund competitive processor development.
Who knows, Apple might not even have to switch to x86 now if they had stayed with the 68k back then.
Not so horrible, this x86 thing, after all, huh? Welcome to the world all of us *nix devs live in: a multi-arch world where code is closely maintained for compatibility across everybody’s popular architectures.
I have to port forward a 15-year-old application suite that ran under MacOS 6 (!). ANSIfication of the K&R C dialect and later Carbonizing are done, as is the switch to Xcode (I still like CodeWarrior better, but then I still work on a 400 MHz PowerBook). One component of the suite went from PPC to Intel in under an hour. Set up the transition system, copy files over, fix one ResEdit resource, and instant Intel glory.
Another component uses a filing and B-tree index layer with rather complicated journaling and a page cache, which writes to disk in native endianness. At the point of I/O I don’t know what is being passed on, and at the point of knowing the data, I don’t know when it will be written. The layer was written out of house, but fortunately the (15-year-old) sources are there. My joyous task is to keep the suite data-compatible. Serious hacking ahead.
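The byte-order problem described above is the classic case for swapping at the I/O boundary. Here is a minimal C sketch of that approach, assuming a hypothetical page-header layout (the struct and field names are invented for illustration, not taken from the actual suite): pin the on-disk format to one byte order and convert the structure in place exactly once on read and once before write.

```c
#include <stdint.h>

/* Hypothetical on-disk page header; field names are illustrative,
   not from the actual suite. */
typedef struct {
    uint32_t page_id;
    uint16_t key_count;
    uint16_t flags;
} PageHeader;

/* Reverse the bytes of a 32-bit value. */
static uint32_t swap32(uint32_t v)
{
    return (v >> 24) | ((v >> 8) & 0x0000FF00u)
         | ((v << 8) & 0x00FF0000u) | (v << 24);
}

/* Reverse the bytes of a 16-bit value. */
static uint16_t swap16(uint16_t v)
{
    return (uint16_t)((v >> 8) | (v << 8));
}

/* Convert a header in place; called once on read and once before
   write, so the on-disk format stays fixed (say, big-endian)
   regardless of the host CPU. */
static void page_header_swap(PageHeader *h)
{
    h->page_id   = swap32(h->page_id);
    h->key_count = swap16(h->key_count);
    h->flags     = swap16(h->flags);
}
```

On a big-endian (PPC) host the swap calls would be compiled out as no-ops so existing data stays readable; the hard part the poster describes is that the original layer does its I/O without knowing the record layout, so a conversion like this has to be hooked in wherever the structure is actually known.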
Economically I don’t suffer, though, because I charge the man-month it’ll likely take at regular industry rates.
I understood that in recent years 16-bit emulation overhead was moved from the CPU to the OS… the calls may still be in the CPU, but what isn’t used doesn’t slow anything down.
After all, PPC chips also have emulation hardware built into them, Apple made use of this to make 68k-on-PPC easier. PPC may be architecturally superior.. but then Itanium might also be, and yet both crash and burn due to slow speed growth and limited take-up.
Mac will benefit from being on a common processor architecture.
“I understood that in recent years 16-bit emulation overhead was moved from the CPU to the OS.. the calls may still be in the CPU but what isn’t used doesnt slow anything down.”
No, 8086-style segmenting, the A20 gate and all that crap are still in there alright, not least because the BIOS is still 16-bit.
But it hardly matters if you have a 150 million transistor budget.
” as is the switch to XCode (I still like Codewarrior better, but then I still work on a 400 MHz Powerbook)”
Trust me, Xcode sucks on a 1.5GHz PowerBook too. I’m fine with the Intel transition, I can deal with that. But being forced to port my project to Xcode, that’s what I’m mad at Apple for.
“Not only did the PowerPC switch cost themselves a lot of customers but it also killed off any other 68k-based systems like Amiga or Next that could have helped providing the volume necessary to fund competitive processor development. ”
NeXT was porting their platform over to the 88K series of processors and had no interest in taking the 68K line further. Then they decided to be an OS provider rather than a HW provider… which is what they had wanted to do since the beginning.
Commodore was dead by the time the PPC came out, so your point about the Amiga is moot.
There were 68K follow-ons; the ’060 comes to mind. It’s just that no one was interested. Motorola more than willingly joined the PPC consortium, since their 88K was going nowhere fast.
The Apple-IBM-Motorola alliance for PowerPC was created as early as 1991 (even though the first PowerMacs didn’t appear until 1994).
At that point NeXT had just started shipping the fairly successful 68040-based NeXTstation. Only after the effective abandonment of the 68k through AIM did NeXT start looking for and porting to other platforms.
As for the Amiga, mismanaged Commodore might still have gone bust, but the disappearance of an obvious future for the Amiga certainly did not help, because the resulting uncertainty eroded its previously big user and developer base. And it made the job for Commodore’s successors impossible, because emulating the Amiga with its special hardware and lots of hand-optimised programs was not a viable option until much later.
Apple wouldn’t use the 68060 because it easily outperformed the first PPCs on integer performance even for native code. They even refused to provide a MacOS update for cloners wanting to use the 68060 because it could have endangered the much-hyped (but ultimately pointless) PPC switch.
And yes, Motorola themselves certainly had their part in the 68k’s downfall, what with slow delivery of 68k improvements and the wasted m88k me-too-RISC effort.