“On the whole, I bought the story too, although I know that it won’t all go smoothly. Anyone who owns or is considering buying a Mac has to have questions; these are what I think the answers are.”
Can anyone else confirm that a fresh install of 10.4 solves the problems described in this article? I get that beachball WAY too often, but I really don’t want to go down the road of reinstalling everything if it isn’t going to solve anything.
Works for me; you can do it with an Archive and Install.
First, to your slow Tiger question: I have found that Dashboard leaks memory so badly that having a bunch of widgets will kill performance. Also, many less professional third-party widgets don’t turn themselves off when Dashboard is out of sight, and they continue to eat CPU. But the memory is the real problem; supposedly this is being fixed, or at least improved, in 10.4.2. I would recommend closing all your widgets until then, though. For what it’s worth, I also always do a clean install.
As for the article, the author “predicts” that we won’t see the first dual OS X/Windows-boot Macs until well after Apple begins shipping retail systems. This is silly and uninformed, because people have already installed Windows on the developer preview machines.
yes, but at horrible resolution
Dashboard is not leaking memory. Widgets are running programs and require memory when loaded in Dashboard; lots of widgets means lots of loaded programs. Dashboard widgets do not take up CPU resources when Dashboard is not in focus, however.
Wrong, and wrong. Dashboard IS leaking memory. See: http://lists.apple.com/archives/Dashboard-dev/2005/Jun/msg00063.htm…
This has also been reported on numerous websites. Thinksecret recently reported that 10.4.2 “improves widget memory management”.
As for dashboard widgets using CPU, a _correctly_ coded widget doesn’t use CPU when the dash isn’t in focus, but that behavior has to be coded into the widget, and I can tell you definitively that not every widget author is doing that. I have submitted a handful of bug reports on that issue already to widget authors.
“I predict that eventually, perhaps in June 2007, Apple will begin exerting pressure on users to upgrade to the newer technology”
I really don’t see that happening so early. There might not even be Intel-based Power Macs out yet in June 2007; there’s no way Apple will stop making PowerPC-compatible versions of iTunes and whatnot that early (as she states).
And then she goes on to bash Tiger, which I find totally unwarranted. Yes, she may have had some trouble, but the vast majority of people have not had show-stopping problems.
Agreed. Even with the problems I faced (documented on my blog), Tiger still hasn’t given me the same grief that Windows XP did (free OEM copy thanks to Microsoft NZ’s woo-a-thon for OEM builders).
Tiger is getting there, and going by the delay in the 10.4.2 update, I’d say there are a large number of bugs being fixed, and maybe even Quartz 2D Extreme being activated.
Surely the question concerning most current Mac users is whether Rosetta will be able to make Intel-only applications run on PowerPC Macs.
Does anyone know the answer to this?
Everyone who has read the Universal Binary PDF knows the answer: no. Rosetta is only for Intel Macs.
Rosetta is for current apps to run on Intel Macs, not the other way around.
Rosetta only handles current apps that DON’T need the G5 or G4 AltiVec unit. So if your app is 64-bit (not too many are) or is accelerated with AltiVec, there’s no way Rosetta will run it.
Oy, people don’t really seem to get this – at least the people writing articles. Of course, they aren’t programmers. Any programmer worth anything can tell you that you can recompile programs for different processor targets with no changes (unless you have done something weird). For example, I can compile Generic Linux Program A for x86 or PowerPC with no changes to the source code. What I can’t do is compile Generic Linux Program A for Mac OS or Windows without changes.
But, of course, article writers are treating this like porting from one OS/API to another with some magic Apple voodoo that may or may not work as planned. That isn’t the case.
Now, there are a lot of people also saying that more programs will be ported because the switch will make it easier. With the possible exception of games, this just isn’t true. The PowerPC processor was not what was stopping the porting of applications like Opera, MS Office, etc. The problem was that Mac OS X is a different OS. There are always ways around this, like using a language such as Python with a Tk GUI, but those approaches aren’t widely used. Now, with games, about a third of the work is endian issues (byte order). This will be solved, but developers will still have to port from DirectX (if used) to OpenGL and from the Win32 API to Apple’s (Carbon, Cocoa).
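To make the endian point concrete, here is a minimal, purely illustrative sketch in C (the record layout and values are made up): any multi-byte value read from a file or the network has to be converted to the host’s byte order instead of being used as-is. ntohl() happens to be a no-op on big-endian PPC, which is exactly why this class of bug only shows up once the same code runs on little-endian x86.

    #include <arpa/inet.h>   /* ntohl() */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical on-disk record, stored big-endian ("network" order). */
    struct record_on_disk {
        uint32_t magic;
        uint32_t length;
    };

    static void read_record(const unsigned char *buf,
                            uint32_t *magic, uint32_t *length)
    {
        struct record_on_disk raw;
        memcpy(&raw, buf, sizeof raw);
        *magic  = ntohl(raw.magic);    /* byte-swap on x86, no-op on PPC */
        *length = ntohl(raw.length);
    }

    int main(void)
    {
        /* "MAC!" followed by a length of 16, both stored big-endian. */
        const unsigned char buf[8] = {0x4D, 0x41, 0x43, 0x21, 0x00, 0x00, 0x00, 0x10};
        uint32_t magic, length;
        read_record(buf, &magic, &length);
        /* Prints the same thing on PPC and x86 once the swap is in place. */
        printf("magic=0x%08X length=%u\n", (unsigned)magic, (unsigned)length);
        return 0;
    }

A game that forgot the ntohl() calls would appear to work fine on PPC and then silently read garbage on x86, which is the kind of thing that eats up that “third of the work”.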
For 99% of applications, it is just a matter of checking the two boxes, since there won’t be any OS differences between the two platforms. Drivers and games are different.
No, you can’t. Firefox, for example, took quite some time to port.
Yes you can. Most applications will take very little time to port from PPC Mac OS X to x86 Mac OS X. A simple recompile, with the correct tick box checked.
If you have a Mac, you can try it for yourself. If you don’t have a Mac, get hold of one and try it. Wolfram ported Mathematica in a few hours.
Porting to Mac OS X from another OS though, is a different beast altogether.
Wolfram could easily port a useful Mathematica to x86 Mac OS X because it was written to be portable and already had an optimized x86 kernel. The amount of time any particular application will require to port will depend on the presence of endian problems, sensitivity to floating-point behavior, how dependent the program is on AltiVec, and how many new or different bugs leak through the frameworks to cause unexpected problems.
Even if you can compile a program for both platforms without the compiler issuing any errors, that doesn’t mean it will necessarily work correctly.
Saterdaies, this is not totally true.
Apps written in Java will (for the most part) just work.
Apps written in C, C++, Cocoa, or Objective-C will need to be tweaked and recompiled, but few changes should be needed.
Metrowerks developers will have a much harder time. They need to first port to Xcode, and that, depending on the app, could be painful.
Many Xcode and Carbon developers who were at the Worldwide Developers Conference tried porting their apps with Apple’s help and walked away with long lists of bugs.
I think the migration is a great idea, and for most people it will be easy, but not for everyone.
I’m not sure what spinning beach ball problems you guys are having. I admin about six Macs with combinations of upgrade installs, clean installs, and Archive and Installs of Tiger… none of which have problems with sluggishness; even the 400 MHz iMac G3 from 1999 works like a charm. Maybe you guys need to buy RAM. None of my Macs has less than 512 MB, and my 1.5 GB RAM PowerBook G4 runs smooth as silk.
For 99% of applications, it is just a matter of checking the two boxes, since there won’t be any OS differences between the two platforms. Drivers and games are different.
This is an oversimplification. 99% of Cocoa apps will compile for x86 just by checking the box. But most large commercial apps are still using the Carbon framework, and these often require a significant amount of tweaking. The reports so far are good; we’re not talking about a “Carbonization” level of retooling. This is a well planned, fairly easy transition. But not a completely painless one.
But, of course, article writers are treating this like porting from one OS/API to another with some magic Apple voodoo that may or may not work as planned. That isn’t the case.
Writers and consumers are right to be somewhat suspicious of the Universal Binary plan, but for practical, not technical, reasons. A UB is just a “fat binary”: separate compiles for x86 and PPC glued together. That means developers now have to support two separate architectures, with all the additional support headaches and tracking issues that involves. If you dismiss those headaches, you have no real-world experience supporting software. Large developers will trim these UBs back to x86-only at the earliest moment they can get away with it.
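For what it’s worth, here is a minimal sketch (in C, with a hypothetical file name) of what “checking the box” actually produces under the hood. On Apple’s toolchain, building with something like “gcc -arch ppc -arch i386 hello.c -o hello” compiles the source once per architecture and glues the two slices into one fat binary, and “lipo -info hello” lists which architectures are inside. Apple’s GCC predefines __ppc__ and __i386__, so the rare architecture-specific bits can be isolated like this:

    #include <stdio.h>

    int main(void)
    {
    #if defined(__ppc__) || defined(__ppc64__)
        printf("Running the PowerPC slice\n");   /* the slice a PPC Mac executes */
    #elif defined(__i386__)
        printf("Running the Intel slice\n");     /* the slice a Macintel executes */
    #else
        printf("Running on some other architecture\n");
    #endif
        return 0;
    }

The support burden is the real cost, though: the two slices are still two builds that each have to be tested and debugged, which is exactly the headache described above.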
Only idiots think there is going to be a problem switching from PPC to Intel. In fact, I VERY much doubt anyone would notice at all if you couldn’t look inside and see the processor and Apple hadn’t said anything. People would think nothing had changed.
I have Macs already. When I buy, and which model I buy, will have nothing to do with what processor is inside it. It will still be Mac OS X. And Steve Jobs isn’t about to sell something that doesn’t work. Apple isn’t Microsoft. They don’t put out beta-quality software and hardware and pass it off as production-quality products.
They don’t put out beta-quality software and hardware and pass it off as production-quality products.
Are you sure? I had dozens of problems with Tiger (bad battery performance, hangs, Safari at 100% CPU… fresh install, iBook G4, 1 GB). That isn’t quality software. About hardware: remember that G5 fan speedup? What about the big iBook G3 graphics issue that took months to solve?
There is no reason that digital cameras and printers that work on the current PPC OS X will fail to work on the new Intel OS X. Hell, OS X printer drivers work on any system with CUPS.
“No, you can’t. Firefox, for example, took quite some time to port.”
99% of the waiting time for the Firefox port was waiting for the developer machine to arrive.
You have never developed on NeXTSTEP, have you?
I did a full format and fresh install of Tiger on my 12.1″ PowerBook, and it’s rock solid and fast. The startup time is wonderful!
I can’t say that I’ve had the pausing problems that those who upgraded have had, but I can say that I -don’t- have them with my fresh install.
… but they do happen when I run some demanding applications, which is expected, so I don’t worry.
Upgrading from Panther or Jaguar to Tiger is asking the installer to perform miracles, considering how much has changed under the hood. Don’t do this. Even an Archive and Install will eventually have to convert some of your old preference files to Tiger’s new formats.
Just do an Erase and Install. Reinstall your 3rd party apps and copy your old documents back. The results are guaranteed. The hassle could end up being significantly less than dealing with an upgrade.
I meant that in a few years some apps won’t be compatible with both architectures. So, does anyone know if Rosetta can (or will be able to) translate from Intel to PowerPC?
If I bought one of the last G5 Power Macs in early 2007 and then was unable to run some newer Intel-only applications, I would be pretty annoyed.
From what I’ve read, at least in the initial press releases, Apple will continue to support both PPC and x86 architectures for an extended time before support is cut off. How long will this be? I’m obviously not sure, but I’d expect a fair amount of time, both for developers to port their applications (whether difficult or not) and for the users’ sake.
As for Rosetta having backwards compatibility for Intel>PPC: I haven’t heard anything about this, and I’m led to believe it handles only PPC>x86 code.
I don’t know if any of you lived through the 68k –> PPC switch; I did. There were a lot of apps that were PPC-only, and undoubtedly, once the Intel machines come out, within a year there will be some apps that are Intel-only. It’s just a fact of life; no matter how much it pisses you off, it will happen, and as time goes on the phenomenon will continue to occur.
There is no reason that digital cameras and printers that work on the current PPC OS X will fail to work on the new Intel OS X. Hell, OS X printer drivers work on any system with CUPS.
Actually there’s an excellent reason: in Mac OS X, kernel extensions (what we would call “modules” or just drivers in Linux-land) and IOKit drivers (drivers built around an IO abstraction layer – the bulk of Mac drivers) must be – at a minimum – recompiled for x86, if not tweaked. So unless your device is already supported by OS X out of the box, or is brand new when the Macintels ship, you might run into driver trouble.
Only idiots think there is going to be a problem switching from PPC to Intel. In fact, I VERY much doubt anyone would notice at all if you couldn’t look inside and see the processor and Apple hadn’t said anything. People would think nothing had changed.
Any serious user running demanding PPC apps or games, for example, would notice right away (performance drop of 25% to 50%). Anyone with a driver problem will notice right away (hardware no longer works). Anyone who needs to run an app that depends on Altivec will notice right away (won’t run). Anyone who needs to run a Classic app will notice right away (won’t run). And so on.
I’m not Apple-bashing; they’re doing a great job making this transition as easy as possible. But it is a transition, and there will be a little pain, some for developers and some for users.
Hmm… let’s think about this…
the comment was talking about current working devices.
“…the comment was talking about current working devices.”
Yes, it was. Being a programmer, I can understand why existing devices that ‘currently work’ would stop working on the new Mactels. Because of the completely different architecture, drivers will need to be reworked, and that’s bound to cause some bugs. I mean, think about it… PowerPCs have Open Firmware, which handles devices (although I’m not very knowledgeable about how much work it does); Intel machines don’t. The way devices interact is different. It’s a completely different set of hardware components, which just spells trouble to me.
Honestly I liked this article, as it answered several questions I had (I’m looking at buying one of the last PPC PowerBooks next year…) and I think that while it may not be completely right in terms of dates, that’s ok. Don’t throw the baby out with the bathwater.
Ummm… you don’t know much about Linux land… we call them modules too, lol… Windows calls them drivers.
“They don’t put out beta-quality software and hardware and pass it off as production-quality products…”
Puh-leaze… the first version of OS X was alpha software, and there’s no way Tiger should have been released when it was. 10.4.2 is what they should have released. Exploding PowerBooks, anyone?
Don’t get me wrong, I love Apple, but in their rush to get products out the door, their QA has gone into the toilet over the past few years.
Come on. They are switching to cheap chips, and all those Mac users will have to buy new Macs. That will make a huge profit boom for Microsoft.
Why do you think companies release stuff that is incompatible with existing stuff and urge users to upgrade? It’s profitable.
Now, I doubt their market share will drop. It will rise… They already sell iPods at Wal-Mart, and I bet they will get more distributors, as they can now lower their prices (or keep them the same to increase the profit margin).
The decision was a business one. I mean, there were a lot of people asking Apple to discontinue the Mac and focus more on devices like the iPod.
Actually, Mac users don’t have to break away from their 4+ year upgrade cycle. I bought my first Mac when the first PPCs came out, and I bought a 68k. I bought a new Mac 5 years later, and even though PPC Macs were out, 68k was still supported. There were 2 major OS releases (7.6 and 8.0) and several point releases (the last one 8.1) which supported the 68k platform. Many applications came out that were fat binaries; only a few came out initially that were PPC-only. As time went on more PPC-only apps came up, but by that time I was ready to upgrade anyway (a 33 MHz 68k processor got me through high school and my first 2 years of college without a blink of an eye! That was something!)
Yeah, but still… history shows us that users will upgrade faster than usual.
But Apple has been very good when it comes to transitions, and I’m sure their support for PPC will be long-lasting.
Macs at Wal-Mart
Ummm… you don’t know much about Linux land… we call them modules too, lol… Windows calls them drivers.
You don’t know too much yourself, do ya?
The point I was alluding to was that on the Mac, kernel extensions (kexts) are equivalent to Linux modules, but there is also a completely separate driver model on the Mac (I/O Kit). Drivers of both types will require a recompile and/or tweaking from PPC to x86.
And I’m sure Linux kernel developers would back me up when I say we (in Linux-land) call drivers… drivers. It’s hardly a Windows-only term. A module (just like a kext) is a loadable bit of kernel, and this is an often-used model for handling drivers, but any given module (or kext) is not necessarily a driver.
Facts:
1 – Rosetta is made to run PPC apps (with no AltiVec or 64-bit code) on x86 Mac OS X.
2 – Rosetta is not made to run x86 Mac apps on PPC Mac OS X.
3 – Apple is pushing developers to build all new apps as Universal Binaries. These are binaries that will run on both x86 and PPC Mac OS X (remember NeXTSTEP’s fat binaries? GNUstep has a similar concept too. Mostly the same thing, new name, new shiny coating).
Therefore, there is no need for an x86->PPC Rosetta unless the developer of the application is stupid and chooses to ignore the majority of the Mac installed base (which will be PPC for a long time still), lol.
May the fears rest in peace.
Will This Mean That Macs Get Cheaper?
I don’t expect that Macs will drop much in price as a result of this switch. Even if there is a greater supply of chips (and Apple did not cite short supply as a reason for moving away from the PowerPC), any price drop due to greater supply is likely to be hidden by high development costs, which will be passed on to buyers.
And why is it that when people like the author make statements, they never back them up? Oh, so now Apple moves all its chipset/motherboard design and so forth to Intel, and the cost gets higher? How does that happen?
It doesn’t add up, which is why I wouldn’t even bother with the truckload of salt; I’d dismiss the article altogether.
Tiger is so many miles beyond Panther that the author’s uninformed statements just give me a bitter laugh.
It seems only stupid people are allowed to write about technology.
Apple will have 10 times the price points available, if not more, after partnering with Intel, both higher and lower than existing ones.
I suppose the downside of Mactel is that all the retarded Mac journalists will get more circulation and be able to impose their noise on everyone else.
The main text in the article is just a (very slight) re-write of an article on the Macworld website and in Macworld magazine:
http://www.macworld.com/2005/06/features/intelfaq/index.php
It’s quite obvious that she has read this and changed it a little to try and make it sound original. FWIW, I’m another with no Tiger problems at all.
I’ve heard that IBM has just announced low-voltage PPC970FX G5 chips. They say their LV G5 CPUs take about 16 W @ 1.6 GHz.
Are they trying to tease Apple? :)
http://www.osnews.com/comment.php?news_id=11133
Fuck the stupid stack based Intel ProtoPentio system. If I wanted a space heater I’d buy one, not a computer.
stupid stack based Intel ProtoPentio system
Yes, x86 compilers do usually pass arguments on the stack, but that’s more by convention than by hardware design. GCC, for example, can pass arguments in registers via the -mregparm option (or the regparm function attribute), but sadly that isn’t used much.
But since Apple is new to x86 they could do it the right way from the start and establish a register-based calling convention for their software.
And x86_64 has a register-based calling convention anyway.
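To illustrate the calling-convention point (the function names here are made up for the example): with GCC on 32-bit x86, the default cdecl convention passes arguments on the stack, while the regparm attribute moves the first three integer arguments into EAX, EDX and ECX. Compile with something like “gcc -m32 -O2 -S example.c” and compare the generated assembly for the two functions.

    /* Default cdecl convention: the three arguments arrive on the stack. */
    int add_stack(int a, int b, int c)
    {
        return a + b + c;
    }

    /* regparm(3): the first three integer arguments arrive in EAX, EDX, ECX. */
    int __attribute__((regparm(3))) add_regs(int a, int b, int c)
    {
        return a + b + c;
    }

    int main(void)
    {
        return add_stack(1, 2, 3) + add_regs(4, 5, 6);
    }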
If I wanted a space heater I’d buy one, not a computer.
Err, which Cupertino-based company was it again whose overclocked processors require liquid cooling?