“Yesterday, I had an hour to work with Apple’s new Power Mac Quad G5, and for anyone in the broadcast, sciences, music, print and photography industries, I have some advice: Place your orders now!”
8 GPUs you say!?
Lol.
That caught me too…
sorry but i prefer to order an AMD Opteron dual core
i will surely invest in a dead platform now
The fact that it will eventually be eclipsed by an Intel-powered Macintosh doesn’t mean that it isn’t a kick-ass system today. In 6 months your Opteron system will be old news too. Apple may bring out Intel iBooks in January, but it could be another 6 to 9 months before the high-end Power Macs change over. Not quite a dead platform yet.
a system that few will soon be developing for is not a good investment
don’t want to waste my money
Then another 12 months for stable, optimised Intel versions of the applications that you want, giving the average customer at least a 2-year waiting time if one were to have bought an iBook in January (if released).
I’m interested to see just how many of these firebreathers Apple sells. I mean, my guess is that only a small percentage of computer users really need that kind of horsepower…
Of course not. You think this system is meant for Joe Sixpack? Read the very first sentence of the review: “for anyone in the broadcast, sciences, music, print and photography industries”. Apple has a considerable niche market in those areas for high-performance systems. I used to work for Shaw, the local cableco; the floor where the Shaw TV people worked was always stacked with the latest, highest-spec Macs available. Just because you can’t sell something to everyone doesn’t mean you shouldn’t make it…
a lot of these industries are beginning to switch to linux
To use Photoshop, Maya and _serious_ pro apps?
Stop tubthumping, will you.
you don’t know that industries use CinePaint and Maya under linux?
i hope you know some studios created their in-house tools under linux?
stop the FUD if you don’t know what you’re talking about…
I doubt whether many people would be looking to buy this machine solely as a Maya or Cinepaint machine. If you set up a box to run just these pieces of software you’d probably get an Opteron box running Linux. I would say that Linux and even Windows are better for 3D work than a Mac.
The fact is however that you usually buy hardware to run a suite of software, and Windows and Macs have lots of software (for the graphics, photography, video, publishing and audio markets) available to them at a reasonable price.
It’s a Mac and people buy these systems to run software:
After Effects, Motion, FCP, Protools (and all the other pro audio apps), Shake (although also available for Linux for much more) and all the Adobe graphics products to name a few.
Where is the affordable video editing and compositing, publishing, audio and graphics software for Linux?
Cinelerra? Little support, not particularly good for long-form work.
GIMP? No CMYK support.
Smoke? Flame? Nuke? Piranha? Buy it if you can afford it!
And how many individuals have the time, skills and infrastructure to develop their own tools for Linux?
Don’t get me wrong, I would love to use an Opteron box running Linux with one caveat, that all the software (or equivalent) I use was made available on that platform.
So I think that the new Quad will be great for busy creative industry organisations and individuals. Why? Because it is faster than anything else on that platform.
Why quibble over a few percent performance on the hardware when it is all primarily about the software and getting it to run as fast as possible? Time is money.
For sound engineering apps, Apple rules the roost. I’ve only encountered one studio in recent years using a non-Mac environment, Windows in that case, and the engineers hated it. I’ve never encountered Linux in studios, and it’s doubtful this will be changing soon.
Many studios haven’t even switched from classic Macs to OS X yet, because the system works just fine for them. It’ll be interesting to see if the new Intel Macs will change this situation or not.
“a lot of these industries are beginning to switch to linux”
please support your statements. As of now, there are zero high-end applications for photo and video editing on Linux. No, GIMP doesn’t count.
And what does Pixar and Dreamworks use? Macs?
Both those companies use Linux based clusters to do their video work.
There’s nothing wrong with the PowerPC architecture…
But once you no longer have a company like Apple putting it into the hands of the consumer, don’t expect widespread adoption.
“And what does Pixar and Dreamworks use? Macs? Both those companies use Linux based clusters to do their video work.”
Yeah, for rendering farms but their workstations are either Mac or Windows most of the time.
Pixar switched over to G5 Xserves and a diluted Mac OS X several months back for their render farm.
And only a small percentage of computer users will shell out $3300 for it. All is as it should be.
Back in 1995 any PPC Mac cost at least that. Still, lots of users bought them. If I could afford it, I guess I’d buy one also. Of course, if you can have a decent machine for much less, you probably won’t purchase such a beast unless you really need it or have too much spare cash.
For every office user with an overpowered 3 GHz PC (just to type letters in Word) there must be a graphic designer who’d like just a bit more than he actually needs.
You’d be surprised. I was in a local Apple Store, and a businessman walked in. From what I could gather from the conversation they were having as I played with Illustrator on the nearby dual 2.3GHz G5, the guy was just a manager at a business, with no need for anything powerful.
In spite of this the sales guy was pushing 16 (16!!!) gigs of RAM and a dual 2.3GHz (AFAIK, that was the best the store had in stock).
Apparently I wasn’t much of a “serious” customer, because I was asked to leave the computer while the sales guy pushed the dual G5 on him.
I stuck around long enough to play on some of the others, but that dual 2.3 was quite amazing. On the 12″ PowerBook G4, Blender (open source 3D suite) rendered a cube with GI on a small plane with 8 samples in 31 seconds. The 2.1GHz iMac rendered it in 21 seconds. The dual 2.3GHz rendered it in 13 seconds (using both processors with the “threads” option turned on).
The 2.37a binaries don’t use 64-bit though.
On my non-OC’d, 1 gig of RAM, 2GHz 3200+ Athlon64 (Venice) under ubuntu-amd64 (5.10), using binaries from apt, I was pushing 18 seconds; under Windows XP 32 using blender3d.org binaries it’s around 20 seconds.
So for some high-end things I think one can really use the quad processors. It might also be interesting to repeat my (totally unscientific) experiments on a dual-core Athlon64. That might actually run faster than the dual-processor G5 that was at the Apple store (it wasn’t the new dual core, it was the slightly older dual-processor version).
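If anyone wants to repeat this a bit less unscientifically, here is a minimal sketch of a timing harness, assuming Blender’s background renderer (the scene file name is made up):

    import subprocess, time

    # Time a single-frame background render of the same scene on each machine.
    # "gi_cube.blend" stands in for a scene like the one described above
    # (a cube over a small plane, GI on, 8 samples).
    SCENE = "gi_cube.blend"

    start = time.time()
    # -b runs without the GUI, -f 1 renders frame 1; check `blender --help`
    # on your build to confirm the flags.
    subprocess.run(["blender", "-b", SCENE, "-f", "1"], check=True)
    print(f"render took {time.time() - start:.1f} s")

Run it a few times and take the best result, since the first run pays for disk caching.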
I’ve tested Blender on both a 2.2GHz dual-core Athlon64 and a 2.3GHz dual-core G5. The Athlon64 is about 25% faster than the G5. In 64-bit mode, the Athlon64 is really fast at Blender.
Before Blender went GPL, the code was 64-bit clean, since the main development was done on SGI boxes.
Since it went open source I guess a lot of the new code that’s gone into it hasn’t been as clean. I tried building CVS yesterday with march=athlon64 (which could’ve been the problem, but GCC spat out some integer errors) and it dies horribly, as I said.
What I’d like to do is make a couple of basic but CPU-intensive scenes and then slowly overclock the CPU (which shouldn’t be too hard since my CPU idles at 25C and under load runs at 32C; with the 6600GT under load it runs at 34-35C) and test it for speed and errors after each reboot with those scenes.
Apple really should consider continuing to sell both x86 and PPC for years to come. At least up to the point where a Pentium D can encode H.264 2-5 minutes faster than the current highest-end G5 configuration. After that, drop production of new PPC Macs.
That’s the problem, isn’t it? Encouraging people to buy now is essentially encouraging people to buy a system that won’t be supported in a year’s time. In two years, there will be significantly less software written for it.
>>Encouraging people to buy now is essentially encouraging people to buy a system that won’t be supported in a year’s time.
Where did you get that? Steve Jobs said they’ll continue to support the G4 & G5 architectures for many years to come. Software compiled for the Intel machines will be compiled into a “Universal Binary” that will run on either PPC or Intel. No way is your Quad G5 going to be obsolete anytime soon.
No way is your Quad G5 going to be obsolete anytime soon.
Given the glacial pace at which processor clock rates are now increasing I suspect that machine will remain competitive for quite some time to come.
Universal Binary – you mean like Java? Hey, look, with what we’re able to do with the browser these days, it could be that old platforms will have a long life. But you’ll see, the numbers really won’t add up. Ask yourself this – in one year, how many people will be developing games for the original Xbox? It’s almost a reversed scenario, but remember that developing for the Xbox (with its x86 processor) is easier than developing for the Xbox 360.
even if mactels are an outstanding success from day 1, PPC macs will still be the vast majority of apple-branded machines for quite a few years. I suppose no serious software vendor will go the ‘MacTel Only’ route anytime soon. Most of them have really no reason to anyway. Same code, just a checkbox in the Xcode build dialog…
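And anyone who wants to verify what a vendor actually shipped can check the slices in the binary; a rough sketch (the app path is a made-up example; `lipo -info` lists the architectures in a Mach-O file):

    import subprocess

    # Point this at any application's main executable.
    binary = "/Applications/SomeApp.app/Contents/MacOS/SomeApp"
    out = subprocess.run(["lipo", "-info", binary],
                         capture_output=True, text=True, check=True).stdout
    # A Universal Binary reports something like:
    #   "Architectures in the fat file: ... are: ppc i386"
    print(out)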
No, it’s not the problem. The PPC line of Apple’s hardware will be supported for a long time. You can’t possibly believe this change will be done overnight. It will take years. I don’t expect things to be completely switched over until at least 2010.
will be supported by apple till 2010. considering that it can up to double development and support costs, it is not very likely that application developers will do the same.
(and before the clueless people strike again – no, it is not a single recompile to port from ppc to x86, at least not for nontrivial programs)
For $3,299 I would rather build a dual dual-core Opteron system and load Ubuntu 5.10. Don’t get me wrong, the G5 is a very nice CPU. But take a look at these numbers comparing a G5 running OSX and Linux, a P4 running Windows and Linux, and an Opteron running Linux.
http://sekhon.polisci.berkeley.edu/macosx/
this benchmark may be a little old, but I would imagine if dual 1.8 GHz Opterons do that to dual G5s, then it would scale up to current CPUs. (the Opterons may scale even better with onboard memory controllers)
Basically, only get the G5s if your app uses AltiVec
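If you want a quick probe of the same flavor of CPU-bound numeric work on your own boxes, here is an illustrative stand-in (not the R suite from the link, just a comparable kind of load):

    import time
    import numpy as np

    # Time a large matrix multiply, a rough proxy for the kind of
    # CPU/memory-bound linear algebra the linked benchmarks hammer on.
    n = 1500
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    start = time.time()
    c = a @ b
    print(f"{n}x{n} matmul: {time.time() - start:.2f} s")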
Um, I would rather buy the Mac and install Linux on it. The G5 kicks the Opteron’s ass if you get a decent threading OS on it. OS X with its hybrid Mach kernel is what kills Apple’s overall performance in multi-threaded server tasks.
Now don’t get me wrong, I love OS X on my PowerBook, and it’s a great desktop OS; it just can’t handle certain loads at all. OS X for the lusers, and Linux for the BOFHs. It’s all good then.
hmm, you must not have looked at my link because it shows:
Worst_________________________________Best
G5-OSX PM-LIN G5-LIN P4-WIN P4-LIN K8-LIN
So sure, the G5 with Linux is better than the G5 with OS X, but the opteron (k8) on Linux bested them both.
If only the G5 had an on-board memory controller
He claims his link shows:
Worst_________________________________Best
G5-OSX PM-LIN G5-LIN P4-WIN P4-LIN K8-LIN
I reversed the modding down. He is simply stating what he thinks his link shows, in an uncontentious and matter-of-fact way. We may not like what he shows, and it may be that his link doesn’t really show that.
In that case the thing to do is say so.
The interesting part of that page is that the G5 on OS X is so much slower than the G5 on Linux, despite the fact that the benchmarks seem to be CPU-bound.
I agree, that is what I was trying to demonstrate. I’m surprised I was voted down for saying so; I was simply stating my interpretation of the benchmark I linked. People are always too eager to vote down comments because they don’t like what they say, even if the comment is civil and just an attempt at discussing the matter at hand.
It’s not like my link was out of the realm of reality either; this two-part AnandTech article talks of similar issues with OSX:
http://www.anandtech.com/mac/showdoc.aspx?i=2436
http://www.anandtech.com/mac/showdoc.aspx?i=2520
So, the link seems to show that Linux on G5 is faster than OSX on G5, at least for some things.
The thing we should see in the spring, which should be very interesting indeed, is what the situation is on Intel, and also head to head, what the situation is on Intel not just versus Linux, but versus XP. We should also get it for ‘real world’ stuff like Photoshop actions.
We’ll get costings as well. Looking forward to this one!
I just looked at configuring a dual-dual Opteron on Colfax-Intl. Admittedly, I went fairly high end (dual 280s, 16GB RAM, 4 TB disk, dual 1920×1200 displays) with RHEL 4, but the total I ran up was about $16K… not pocket change.
Well, you will get those kinds of prices with 6GB of RAM and 4 TB of disk. Try configuring it with specs closer to the G5 being talked about. I would use Ubuntu 5.10 over RHEL 4.
Barring the fact that you can’t even build a comparable Mac (it maxes out at 1TB local storage), the quad will cost you almost $17,000 in that configuration.
Let’s see: not identical, but comparable. You get 7x500GB plus 2x500GB (4.5 TB, with expansion for another 3.5 TB), a 2x512MB cache, redundant power supply, and much more remote management.
Quad G5 with 2x500GB internal + Fibre Channel controller = $4,773
16 GB Crucial memory = $8,080
Apple Xserve RAID 3.5 TB, 7x500GB (half populated) = $8,499
Total: $21,352
More money, more hardware, but it can be done. It is arguable that this is a more robust solution.
“For $3,299 I would rather build a dual dual-core Opteron system and load Ubuntu 5.10.”
And do what with it? Edit video using “Kino”?
You do realize that many video shops that used to use IRIX now use Linux, right? And many other pieces of software like Maya are available for Linux?
Depending on what you do, Linux could be a very viable system for a media professional. The best situation is if you’re a 3D artist; Linux has most of the big 3D packages (Maya, Houdini, SoftImage, PRMan, etc). Very big names like ILM have moved to Linux for not just their render farms, but their artist desktops. For video, Cinelerra might very well do the job. For scientific computing, well, it’s varied, but a lot of places do use Linux for the job (especially high-end places moving from UNIX). The big issue is if you’re in music or publishing; the tools for that on Linux aren’t as good as what you’ll find on Mac or Windows.
I found the benchmarks and software referenced in the article (especially on the PC side) to be very esoteric and generally not useful for the majority of users out there as a measure of PowerMac performance. Although I agree that OSX needs some work, it’d be nice if the benchmarks were made using software that people were likely to use, or which don’t require knowing little-known software packages and codecs. I’m a video and media content creator, and have never heard of nor worked with the “mpeg-ts” variant of MPEG (as a matter of fact, googling for it returned pages of people wondering how to get it to decode and how to use it on standard systems.)
Perhaps, however, the benchmarks were simply a way to show how CPU/memory I/O-bound processes compare on the CPU/OS combinations. The fact that many people may or may not use a particular application related to a benchmark isn’t really the issue, rather how the raw CPU and memory subsystems compare under different OSes. PS. I use MPEG-TS a lot, encoding under VLC…
I want to be clear, I am not anti-Mac; I just think they have issues with their operating system that hold back the hardware. Should this be the case, Apple switching to Intel is just a band-aid over the OS’s real issues (discussed in the AnandTech article).
I’m a video and media content creator, and have never heard of nor worked with the “mpeg-ts” variant of MPEG
What “video and media content creator” has never heard of MPEG Transport Stream? Anyway, the point is, the wonderful and intuitive iLife can’t even turn a DVD-compliant MPEG-2 file into a Video DVD. Neither iDVD nor iMovie nor QuickTime can open the file (which, of course, plays fine in VLC on the same Mac). On the PC you simply demux the file with TMPGEnc (free) and Ifoedit (free) turns it into a DVD right away.
Are either of those easy or intuitive for most people?
There is always ffmpegX.
http://homepage.mac.com/major4/
It supports almost everything (including mpeg/ts) and is very simple to use.
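For the command-line inclined, the same job can be sketched with plain ffmpeg (file names are placeholders, and the flags assume a reasonably standard ffmpeg build):

    import subprocess

    # Remux a DVD-compliant MPEG-2 transport stream into a program stream
    # without re-encoding, which is all the poster above is asking iLife to do.
    subprocess.run([
        "ffmpeg",
        "-i", "capture.ts",   # MPEG-TS input
        "-vcodec", "copy",    # keep the MPEG-2 video as-is
        "-acodec", "copy",    # keep the audio as-is
        "-f", "dvd",          # DVD-style MPEG-PS muxer
        "capture.mpg",
    ], check=True)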
I found the benchmarks and software referenced in the article (especially on the PC side) to be very esoteric and generally not useful for the majority of users out there as a measure of PowerMac performance.
The whole point of the benchmarks was to determine if the PowerMac was suitable for statistical computing.
Sure. The author of the benchmark ran software he wrote to demonstrate that MacOS X was slow, with no mention of what parameters were used to perform the benchmarks and no instructions on running them, so independent verification is impossible. Also no details about the compilers used. And the end of the article clearly demonstrates an anti-Mac bias. And it is well known that GCC compiles non-optimal code for PPC. Since his other motive was to claim IBM deliberately crippled the PPC970 to protect sales of their own servers, his motives are even more questionable, no? Given that IBM sells OpenPower servers running Linux based on the PPC970, I would say the author is spreading FUD.
Funny, since he talks about freedom and transparency at the beginning of the article. I am sure I can come up with benchmarks to show any OS/CPU combination is slower than any other.
It would be prudent to always view claims of performance objectively and skeptically.
The author of the benchmark ran software he wrote to demonstrate that MacOS X was slow, with no mention of what parameters were used to perform the benchmarks and no instructions on running them, so independent verification is impossible.
He posted the source to the scripts in question. The scripts take no parameters, and the version of R in question is specified, so the instructions are trivial and omitted.
And it is well known that GCC compiles non-optimal code for PPC.
It’s also well-known that almost everything on OS X is compiled with GCC. Contemplating the G5’s performance with XLC or the like is mere intellectual masturbation. The performance of a CPU in artificial scenarios is irrelevant.
Since his other motive was to claim IBM deliberately crippled the PPC970 to protect sales of their own servers, his motives are even more questionable, no?
Needless editorializing, perhaps, but that doesn’t change the results of the benchmarks.
It’s also well-known that almost everything on OS X is compiled with GCC. Contemplating the G5’s performance with XLC or the like is mere intellectual masturbation. The performance of a CPU in artificial scenarios is irrelevant.
The author’s motive is clear. He wanted to show that MacOSX was slow, slower than linux and also that the G5 aka PPC 970 was slower than the x86 cpus.
Running Linux on the G5 isn’t any more real-world than using XLC. But the benchmark author did indulge in some amount of mental masturbation by posting the results from Linux on the G5. As a natural extension he should have indulged a little more and compiled the benchmarks with XLC and run them on Linux to make his point about the CPU.
Using XLC is probably more real-world, because anyone who cares about performance would use the best compiler for the job.
But that page shall remain above all else a FUD-spreading exercise.
The author’s motive is clear. He wanted to show that MacOSX was slow, slower than linux and also that the G5 aka PPC 970 was slower than the x86 cpus.
And he accomplished that quite well.
Running Linux on the G5 isn’t any more real-world than using XLC.
Judging by the fact that Ubuntu, YDL, and Fedora all have PPC versions, I would say there is a decent number of people running Linux on the G5. XLC, on the other hand, who uses it? IBM won’t even sell you XLC for OS X anymore (as of Nov. 9). Furthermore, the version of XLC that really improves performance on the G5 (8.0, which is what IBM used to get the very good SPEC numbers for the 970MP) is only available for AIX. The older version, judging by the SPEC results for the JS20 on spec.org, don’t seem any better than GCC 4.x.
Using XLC is probably more real-world, because anyone who cares about performance would use the best compiler for the job.
Scientists and engineers using open-source software aren’t going to recompile it using a $1000 compiler. Those using closed-source software, well, they can’t recompile it anyway, can they? The only people who could conceivably justify recompiling their stuff with XLC are supercomputing folks, for whom $1000 is a small price compared to a $15m supercomputer, but those folks are a pretty small group compared to those who use a G5 Mac as their desktop or workstation.
One more addition. What about those people who, god forbid, don’t use C/C++ or FORTRAN? What kind of code does the Java VM generate for the G5? How about Mono? Assorted Lisp or Smalltalk compilers? How are these people going to recompile with XLC? CPUs that depend heavily on the compiler for performance are a losing proposition. Itanium should have been proof enough of that.
And he accomplished that quite well.
Not really.
Judging by the fact that Ubuntu, YDL, and Fedora all have PPC versions, I would say there is a decent number of people running Linux on the G5.
Linux distros have support for multiple platforms. Just because the port exists doesn’t mean that many people are using it in the real world. NetBSD has a port for almost every conceivable platform, but its market share is nowhere close to Linux or Windows or MacOS X.
One more addition. What about those people who, god forbid, don’t use C/C++ or FORTRAN? What kind of code does the Java VM generate for the G5? How about Mono? Assorted Lisp or Smalltalk compilers? How are these people going to recompile with XLC? CPUs that depend heavily on the compiler for performance are a losing proposition. Itanium should have been proof enough of that.
The PPC 970 doesn’t rely on the compiler as much as the itanium does. Judging by sales Apple sells 4 million macs a year. In Q3 FY05 desktop sales had 65% Y-O-Y growth. Most people find the G5 to be a decent high performance machine. The compiler only comes into play when useless synthetic benchmarks are brought up to debate performance issues and aren’t done properly.
For real world use the G5s have proven to be excellent machines. Ask Virginia tech.
Linux distros have support for multiple platforms. Just because the port exists doesn’t mean that many people are using it in the real world.
YDL runs a business on just selling and supporting Linux for G5s. Clearly, there is a significantly-sized user community.
The PPC 970 doesn’t rely on the compiler as much as the itanium does.
No, but it’s pretty idiosyncratic in a way most highly OOO CPUs aren’t. The group-formation scheme is responsible for a lot of headaches, to the point where IBM gained a lot of performance in the Power5 by tuning it.
Most people find the G5 to be a decent high performance machine. The compiler only comes into play when useless synthetic benchmarks are brought up to debate performance issues and aren’t done properly.
There are two claims buried in your statement. First, that people find the G5 to be a decent high performance machine. This is probably quite true, but I’d also point out that a lot of Opteron systems much cheaper than your average G5 (the quad being something of an exception) are also comparable in performance. Second, that compiler performance only comes into play in synthetic benchmarks. SPEC isn’t a synthetic benchmark. It’s a bunch of real-world kernels. If SPEC’s 176.gcc is 20% slower with a given compiler, it’s highly likely that GCC compilation will be 20% slower with that compiler.
For real world use the G5s have proven to be excellent machines.
Now, that’s an entertaining statement, especially in light of what you just said. The G5 got VA Tech into the Top 500 because it runs LINPACK really fast. Talk about a synthetic benchmark! I’m sorry, but “real-world” software occasionally has to hit memory! “Real-world” software doesn’t just consist of a set of matrix multiplies with almost no integer component and tons of ILP for the FPU. I’m sure the G5 has worked out just fine for VA Tech, but it doesn’t prove that the G5 is anything more than just an acceptable CPU.
There are two claims buried in your statement. First, that people find the G5 to be a decent high performance machine. This is probably quite true, but I’d also point out that a lot of Opteron systems much cheaper than your average G5 (the quad being something of an exception) are also comparable in performance.
Well, you are making the classic mistake. You are assuming that I give a damn about an Opteron running any other OS. I bought my Macs for MacOS X; any other OS on any other hardware doesn’t interest me. I, like most people, run software and buy the machine that runs said software. I don’t get my jollies owning higher-performing, cheaper hardware and then dealing with the software it will run. Some performance-frenzied geeks might get a hard-on doing so.
Second, that compiler performance only comes into play in synthetic benchmarks. SPEC isn’t a synthetic benchmark. It’s a bunch of real-world kernels. If SPEC’s 176.gcc is 20% slower with a given compiler, it’s highly likely that GCC compilation will be 20% slower with that compiler.
SPEC has become a synthetic benchmark ever since CPU manufacturers started using compilers with special tweaks that specifically make their SPEC numbers look good and offer no real-world benefit. Take Intel’s or IBM’s top numbers and use GCC to compile SPEC and the numbers will be lower.
Now, that’s an entertaining statement, especiall in light of what you just said. The G5 got VA Tech in the Top 500 because it runs LINPACK really fast. Talk about a synthetic benchmark!
Really? Then Blue Gene/L is also just an exercise in entertainment. VA Tech bought 2200 G5s and spent months setting up a supercomputer in their lab for a few shits and giggles? Those G5s were bought for meeting the computing needs of students above all else.
I’m sorry, but “real-world” software occasionally has to hit memory! “Real-world” software doesn’t just consist of a set of matrix multiplies with almost no integer component and tons of ILP for the FPU. I’m sure the G5 has worked out just fine for VA Tech, but it doesn’t prove that the G5 is anything more than just an acceptable CPU.
Look at the Top 500 results again and you will see Opterons and Xeons on the list; I am guessing you think they are acceptable CPUs also. In some cases it took more Opterons to get lower results than the G5s.
Well, you are making the classic mistake. You are assuming that I give a damn about an Opteron running any other OS. I bought my Macs for MacOS X; any other OS on any other hardware doesn’t interest me.
The fact that you don’t care about non-OS X machines is really quite irrelevant here. The other machines run R just fine, and that was the topic of the sub-thread you butted in to.
SPEC has become a synthetic benchmark ever since CPU manufacturers started using compilers with special tweaks that specifically make their SPEC numbers look good and offer no real-world benefit. Take Intel’s or IBM’s top numbers and use GCC to compile SPEC and the numbers will be lower.
If you have compiler tweaks that make real-world kernels run really fast, then your compiler will run general-purpose code faster. And of course GCC’s numbers will be slower than ICC’s or XLC’s. GCC generates slower code than those compilers!
Really? Then Blue Gene/L is also just an exercise in entertainment.
Not only have you managed to mix two of my statements, but your conclusion does not follow from your premise. I never said the VA Tech G5 cluster was an exercise in entertainment. I was simply pointing out how entertaining it was that you chose to trumpet a supercomputer that is thus far merely famous for running an egregiously artificial benchmark really fast, despite the fact that just a few lines before you complained about artificial benchmarks.
Look at the Top 500 results again and you will see Opterons and Xeons on the list; I am guessing you think they are acceptable CPUs also. In some cases it took more Opterons to get lower results than the G5s.
Does simple logic really baffle you this much? Let’s break this down nice and simple-like.
1) You claim the G5 is a performant CPU, because of its Top 500 results.
2) I claim that Top 500 results don’t make for a fast CPU, because the benchmark on which it is based is particularly artificial.
3) You claim that the Opteron gets good top 500 results as well, and you bet I think that’s a good CPU.
4) You further claim that it takes more Opterons than G5s to get into the top 500.
Do you see how (3) doesn’t really answer (2)? Do you also see how (4) just sounds stupid without a refutation of (2)? The G5 isn’t as good as the Opteron, despite the Top 500 results, because LINPACK is a stupidly simple benchmark. The fact that the Opteron runs LINPACK decently well doesn’t change the situation.
The fact that you don’t care about non-OS X machines is really quite irrelevant here. The other machines run R just fine, and that was the topic of the sub-thread you butted in to.
I don’t care about R. The article under discussion wasn’t about R. An anti-Mac shill brought up a lousy benchmark of a software package that 99.9999999% of the world’s population, or even the Apple install base, will never run. My point is more relevant to the article posted than this sub-thread, which was started to troll on a Mac article.
If you have compiler tweaks that make real-world kernels run really fast, then your compiler will run general-purpose code faster. And of course GCC’s numbers will be slower than ICC’s or XLC’s. GCC generates slower code than those compilers!
No, Intel specifically tweaked ICC to compile SPEC to run faster. That doesn’t make other code faster.
Not only have you managed to mix two of my statements, but your conclusion does not follow from your premise. I never said the VA Tech G5 cluster was an exercise in entertainment. I was simply pointing out how entertaining it was that you chose to trumpet a supercomputer that is thus far merely famous for running an egregiously artificial benchmark really fast, despite the fact that just a few lines before you complained about artificial benchmarks.
I think you left your brain out to dry. Pretty much every supercomputer in the Top 500 is “famous” for running LINPACK. The VA Tech cluster is a real supercomputer for the students of VA Tech; it also happened to be an excellent performer in LINPACK. You made it seem that VA Tech built it to run LINPACK as an exercise in trying to be famous.
Read further down to see why I think you are speaking out of your nether regions.
Does simple logic really baffle you this much? Let’s break this down nice and simple-like.
Apparently it does you.
1) You claim the G5 is a performant CPU, because of its Top 500 results.
Reality check, you brought up Top500. I merely mentioned VA tech.
2) I claim that Top 500 results don’t make for a fast CPU, because the benchmark on which it is based is particularly artificial.
See point 1.
3) You claim that the Opteron gets good top 500 results as well, and you bet I think that’s a good CPU.
I never claimed the opteron wasn’t a good cpu. What are you ranting about?
4) You further claim that it takes more Opterons than G5s to get into the top 500.
Learn to read. I said “in some cases”. My point is supported by facts.
#8 on the list is a JS20-based cluster: 4800 PPC 970 CPUs @ 2.2GHz (aka G5).
#8 Barcelona Supercomputer Center (Spain): MareNostrum, JS20 Cluster, PPC 970, 2.2 GHz, Myrinet, IBM, 4800 CPUs, 2005, Rmax 27910, Rpeak 42144
Followed by the 2.4 GHz Opteron-based Cray system at 5200 CPUs.
#10 Oak Ridge National Laboratory (United States): Jaguar, Cray XT3, 2.4 GHz, Cray Inc., 5200 CPUs, 2005, Rmax 20527, Rpeak 24960
A 4096-CPU 2.6 GHz Opteron cluster ranked #14, followed by a 3072-CPU Xserve cluster at 2.0 GHz. The difference in Rmax scores was about 5%, and the G5-based cluster actually had a higher Rpeak.
#14 ERDC MSRC (United States): Cray XT3, 2.6 GHz, Cray Inc., 4096 CPUs, 2005, Rmax 16975, Rpeak 21299
#15 COLSA (United States): MACH5, Apple Xserve, 2.0 GHz, Myrinet, Self-made, 3072 CPUs, 2005, Rmax 16180, Rpeak 24576
Do you see how (3) doesn’t really answer (2)? Do you also see how (4) just sounds stupid without a refutation of (2)? The G5 isn’t as good as the Opteron, despite the Top 500 results, because LINPACK is a stupidly simple benchmark. The fact that the Opteron runs LINPACK decently well doesn’t change the situation.
Read my response above.
So how does the fact that R runs better on x86 make the G5 slow, when it does in fact beat the Opteron in other benchmarks? Your point is equally stupid, it would appear. I consider R to be a stupid, non-real-world benchmark; you consider LINPACK to be the same. The bottom line is either we are both speaking out of our asses or we are both making sense.
The difference is that real people, and more of them, are using the G5 for supercomputing work and publishing results in a respected benchmark for supercomputers than are using the G5 for R, or even running R at all.
You can’t point to SPEC and claim LINPACK is a simple benchmark. SPEC results never guarantee real-world performance, and Itanium is a classic example of that.
http://sekhon.polisci.berkeley.edu/macosx/
Mwhahahahahaha. What is this piece of crap:
“I have to reboot to install the Quicktime mpeg plugin. What’s up with that? I can’t believe that I had to reboot the machine just to install the mpeg2 codec player for QuickTime.”
This guy doesn’t know that QuickTime is part of the OS and that the QuickTime app is just a front end.
“That is just so lame it’s funny. Especially given that they are using the Mach Microkernel which is incredibly inefficient in part because it is a micro-kernel.”
looooool. The micro-kernel of Mac OS X isn’t used as a micro-kernel. Anyone who has so much as read the documentation of Mac OS X knows that.
“They take the efficiency hit of a micro-kernel but they still don’t have a modular system to the point that even a codec needs a reboot?!?”
cf. up.
“Other bugs? Malicious web pages can install dashboard widgets. Egad! Didn’t we hate pre-SP2 XP for allowing this? ”
Was fixed ONE WEEK AFTER Tiger was out.
“Who would have thought that a decade after the famous 1984 Big Brother ad, it would be Apple with the highly controlled largely cathedral OS and IBM would be spending hundreds of millions of dollars on code it allows anyone to share and to contribute to? ”
The funniest part: this guy thinks that IBM loves the Linux community. He doesn’t understand that they are all working for free for IBM.
Your link is just another link to a guy that doesn’t like Apple.
This guy doesn’t know that QuickTime is part of the OS and that the QuickTime app is just a front end.
Doesn’t change the fact that it actually makes no sense that QuickTime is part of the OS. I wonder why no one ever sued Apple over not being able to remove QuickTime from the OS…
looooool. The micro-kernel of Mac OS X isn’t used as a micro-kernel. Anyone who has so much as read the documentation of Mac OS X knows that.
Indeed true. Mac OS X uses a hybrid kernel, just like the WinNT kernel.
The funniest part: this guy thinks that IBM loves the Linux community. He doesn’t understand that they are all working for free for IBM.
Yup. Apple tries to achieve the same with Darwin, only it isn’t as successful.
“Doesn’t change the fact that it actually makes no sense that QuickTime is part of the OS. I wonder why no one ever sued Apple over not being able to remove QuickTime from the OS…”
Because Apple sells the hardware AND the software. It is pretty different.
Anyway, if this guy wants to read MPEG-2 without rebooting, he just has to install MPlayer or VLC.
That said, it is still a bit ridiculous to have to reboot the OS for such trivial things. BeOS could start a new media server without rebooting 6 years ago. QuickTime should just be a service of the OS, not its whole underlying foundation. At the very least, if GUI elements use it, updating it should only require closing the session, not fully rebooting the machine.
The funniest part: this guy thinks that IBM loves the Linux community. He doesn’t understand that they are all working for free for IBM.
Wrong way to look at it. IBM is working for free for us, the Linux community.
Wrong way to look at it. IBM is working for free for us, the Linux community.
You can say that, but IBM is the one with cash in their pockets.
Author’s use of the word “exponential” is really stupid. I can’t imagine a clearer example of a performance increase that is clearly “linear”, than doubling the number of processors. In the best case, if we could previously do n operations in unit time, now we can do 2n. What kind of function is n -> 2n? Right. Linear.
Yes, but 2 is an exponent of 1 (2^0)=1;(2^1)=2. Therefore 2 is exponentially bigger than 1. Of course, that’s where the similarities end. 3 Processors is linear power.
I do get what you mean; but people wildly bend the truth when it comes to performance _anything_. I could boot GEOS on a Commodore 64 in 30 seconds. Why is it that 25 years later my exponentially more powerful machine still takes 30 seconds…
think a bit. you’re booting an incredibly more complex system in the same amount of time…
True.
Though really at some point you start wondering what all that complexity goes into.
I recently played with an Xbox (hadn’t played with a console for years) and the first thing I thought was: hey, some games are well done, always smooth. I played Spartan: Total Warrior; lots of things moving, effects, always smooth. Are there any good games on a 733 MHz Celeron with a 64MB Nvidia GPU that play that smoothly on a PC, even at 800×600? I’d say no. That’s the hardware in an Xbox, however. The hardware doesn’t change, so devs have to give their best to fit whatever they want to fit in.
My point is that the more power you have, the more you use it, and often without consideration. One can write a sucky java app and as long as it works with the top machine the developer won’t care. You can just upgrade your hardware after all. Isn’t money free these days?
When either Mac OS X or Windows XP eats about 150 MB of RAM without actually having launched a single end-user app, maybe there’s something wrong. And yes I know, there are caches, smooth font-rendering engines and all that. I also know “easier-to-code-for” systems mean more apps faster, but I am pretty sure good optimization can still go into Windows, OSX and Gnome. On any of these OSes 256 MB is the minimum, and any comfort (e.g. no swapping when reading your mail) will require 384 MB.
If you wish to run linux on a 64mb machine you have to go back to small window managers, editing configuration files by hand to add apps to your menu. Why can’t we have both progress and keep a small size?
“I do get what you mean; but people wildly bend the truth when it comes to performance _anything_. I could boot GEOS on a Commodore 64 in 30 seconds. Why is it that 25 years later my exponentially more powerful machine still takes 30 seconds…”
How about the fact that the number of operations your exponentially more powerful machine performs has also gone up exponentially? I take it that you’re not booting into GEOS anymore…
I could boot Geos on my unmodified C64 and have 5k left to open stuff.
Of course that meant opening anything and trying to do some work in it immediately crashed the machine.
But I did like the idea behind it and it has survived albeit in a different form on another platform and I will always fondly remember my C64 days.
How’d this get modded up? You have no idea what kind of function this is. Since you’ve only got two data points, you can fit whatever curve you want to it. I say it’s a sinusoidal increase in processing power!
That said, exponential is probably the best term for it. As in 2^n. The PowerMac went through 1 processor, then 2, now 4. That’s 2^0, 2^1, and 2^2 … 2^n.
The author is not stating that the number of CPUs increased exponentially (in which case you would be correct, rayiner); he clearly states the *speed* has increased exponentially, which is crystal-clearly stupid.
especially considering the mentioned speed increases are between 50% and 80%
apparently the increase is neither linear nor exponential, but I don’t expect a rational article from a man who expects his desktop to be equipped with seatbelts
The author is not stating that the number of CPUs increased exponentially (in which case you would be correct, rayiner); he clearly states the *speed* has increased exponentially, which is crystal-clearly stupid.
He stated that the speed has increased exponentially, with the proviso that it’s for tasks that can be parallelized. Since the number of processors has doubled (twice: once from the PowerMac single, then again from the PowerMac dual), it is indeed an exponential increase in performance. The fact that it’s 50% or 60% isn’t relevant here. That is, say, 1.8^n, which is still an exponential function.
One can think of all sorts of exponential functions, so the statement that speed has increased exponentially is completely meaningless and therefore still crystal-clearly (and now super-clearly) stupid.
But apparently the increase depends on how much the software takes advantage of the new CPU setup.
I still fail to see the logic in how the number of processors doubling says anything about how the speed increases for the user, and find it amazing that actual measurements are discarded as irrelevant.
You must be a physicist
will apple plan to upgrade the xserve line again or are they just gonna stay where they are?
Disappointing to see Apple zealots modding down comments which question the performance of the machine/OS relative to other platforms _without_ having the guts to provide a counter-argument founded on any facts. And yes I’m a Mac owner …
These quad cores are an excellent deal, price-wise. To build a competitive dual-dual Opteron system, you’d need a pair of Opteron 280s. These are not cheap CPUs, running about $1350 a pop. Throw in a K8WE, and you’re already looking at $3100. A cheapo graphics card like the one in the quad will run you maybe $100, a hard drive another $100, and RAM like $40. Throw in a case and fans, you’re looking at $3500 minimum. Now, depending on the work you do (eg: scientific computing), the Opteron 275 might be more comparable to the dual-G5, but that’s still $3000, and for $300 more you get a warranty, technical support, and the ability to run OS X-only software like FCP.
EDIT: It just occurred to me that the Quad only has a single dual-channel memory controller, while the K8WE has dual dual-channel memory controllers. So in that case, knock $100 off the prices above to factor in the cheaper motherboard. Still a good deal, however.
That’s quad CPU, not quad core. Yes, technically it has 4 cores, but you don’t call a quad-CPU machine quad-core; dual/quad core implies one CPU with multiple cores. It confuses people.
er, it’s dual dual-core CPUs. sorry.
That machine is exactly like the PowerMac Quad, both have two dual-core CPUs.
This sounds like a great machine that is well out of my price and performance range.
But let’s just imagine if the BeOS, which was designed to be multithreaded/multiprocessor, could run on this monster.
Mmmmmmm, good.
I wonder why the author felt the need to spell out PCI on one of the pages of the article… Was he trying to hit some word-count quota or something?
Are you kidding? You’d think to yourself “I think I will check my mail” and *BOOM* BeOS would open mail, check the servers and filter out the spam for you, then display the message it thinks you want to see first before you even move your mouse!!
No, you need a big hog of an OS these days to run on these beastly-powerful CPUs/motherboards.
Say goodbye to insanely powerful and cheap systems like this one from Apple. Look at Intel’s roadmap for the next five years if you want to see just how bad it is going to be for Apple.
The only machines that are even in the same ballpark for performance are much more expensive AMD systems.
Not only did Apple getting the boot from IBM cripple their future desktop systems, but now that the first reviews of Yonah chips are starting to hit the Net (low clock speed and high power), the magnitude of the Intel fiasco will be hitting Mac users in full force soon.
What is so sad is that Apple was really starting to kick ass on all fronts, and now it is probably too late for Apple to try to salvage their relationship with IBM as long as Jobs is still in charge of the computer hardware side of the company…
This machine is very good for price/performance at the moment, but it’s just like the original dual G5 machines. Its edge is going to last for a few months, but the PC world will quickly overtake it, because it moves so much faster. This time next year, the top-end Opteron will be a 2.8 GHz model with a DDR2-667 (10.4 GB/sec) memory bus. In a quad-CPU machine like this one, it’ll have over 20GB/sec of aggregate memory bandwidth. Meanwhile, the Quad will be stuck with the same 2.5GHz G5, with the same DDR2-533 memory bus. If the PowerMacs transition to Intel in 2007, the Opteron equivalent to the quad will be an 8-CPU (dual quad-core) machine with DDR3 memory.
The problem with the G3, G4, and G5 has not been that they can never match PC processors; it has been that they cannot do so consistently. The last time the G5 hit parity with the Opteron was in 2003, when it came out at 2.0 GHz and the fastest Opteron was 1.8. Back then a dual-2GHz 64-bit system was a steal at $3000. Fast-forward a few months, and the 90nm Opterons come out, prices on them plummet, and dual Opteron machines can be had for 2/3 of that. In the 2 years since the releases of the original G5 and Opteron, the fastest air-cooled G5 has gone from 2.0GHz to 2.5GHz (a 25% increase), while the fastest air-cooled Opteron has gone from 1.8GHz to 2.8GHz (a 56% increase). Now, the 970MP has achieved parity again, but how long will it last? The simple truth that Apple had to face was that it was very likely that it’d last for a couple of months, then they’d have to be second-tier for another two years, just like before. If I were Jobs, I’d be sick of that state of affairs too!
well, even though only a few people would use this kind of power, it would be one hell of a set of bragging rights if you owned one.
It’s amazing how people call such a huge machine ‘expensive’. Is $3300 expensive? No, really? Imagine: 5 to 10 years ago the cost of a high-end workstation would be exponentially (heh) bigger than this. And SGI still manages to sell their stuff for $30,000 to $50,000. And it’s not really 10-20 times more powerful, right? So yes, $3300 is expensive for an average iPod buyer, but for a studio that earns thousands for its work, or for a photographer whose single shot could cost as much as this PowerMac, this is really not at all expensive. Hey, some people buy watches that cost $30,000 every week.
I second that.
If that machine at $3,300 lasts just 24 months, then it’s $137 per month (assuming after 24 months you don’t even sell it, you just burn it in a big joyful fire). Many, many jobs require more than that in monthly costs. Anyone using this would definitely be a professional, not the average home user. I am pretty sure it costs more in materials to be a taxi driver, and that’s not the best-paid job.
I’ll place my order when I win the Lottery.
“Steve Jobs said they’ll continue to support the G4 & G5 architectures for many years to come.”
Sure. He also told that to m68k users way back when. Know how long m68k support stayed around? It sure wasn’t many years.
Steve Jobs will lie whenever it suits him. Read any non-official history of Apple (and I’m not talking about Guy Kawasaki’s, either). You’re a fool to trust any of his assurances. The only person Steve Jobs cares about is Steve Jobs.
In any event, Apple’s support is almost irrelevant. Developers aren’t going to keep supporting two platforms, especially when one’s known to be a dead-end. Consumers aren’t going to buy a dead-end, either. Look for Apple PPC to be dead within two years.
That is not to say this system will be worthless, of course – load a decent Linux distro on, and you’ll be set for basically forever.
-Erwos
Er… the m68k transition was long over when Jobs took over Apple.
Really.
Be reasonable.
PowerPC came out in mid-94. MacOS 8.5 (the first to discontinue 68K support completely) was released in late ’98. A solid 4 years. Ditto for Photoshop, which supported 68K until 5.0, released in mid-98.
Bottom line is most companies are not stupid: as long as there is a significant PPC market, software will be updated for it. And unless you believe Intel Macs will be a smashing success (would be nice, I admit), most Mac users will be on PPC for a long time to come.
is that they will likely continue to sell PPC systems as the low-end option in their product lineups for a year or two, until the full product-line transition has been made and there is enough software (both Apple and third-party) to make the transition as smooth as possible.
The author apparently doesn’t know what “exponential” means. More processor cores can at most provide linear speedups.
You’re right that performance improves linearly with adding processors, but Apple hasn’t been linearly increasing the number of processors. Instead, they’ve been doubling the number of processors, thus achieving exponential growth.
You’re right that performance improves linearly with adding processors, but Apple hasn’t been linearly increasing the number of processors. Instead, they’ve been doubling the number of processors, thus achieving exponential growth.
Read the article. He says:
The speed increase on any computing work that can be broken up into parallel processing is exponential.
Face it, it was stupid marketing speak, and no clever semantics reinterpretation makes it less stupid.
Face it, it was stupid marketing speak, and no clever semantics reinterpretation makes it less stupid.
Hardly. Doubling processors is easy, doubling the performance of one processor is complicated. Work that can be broken up into parallel processing is easy to increase in performance exponentially, while work that cannot be so broken up is not. It’s only stupid if you are looking for a way to make fun of the author.
Let’s take your idea that exponential scaling is going on here and do the figures, also assuming perfect scaling with the number of processors, and making life really easy on your theory by choosing the base to be the fourth root of 4, ~1.414.
Taking n to be # of procs, we get:
1.414^4 = 4, hmm, good.
1.414^2 = 2, hmm, still good.
1.414^1 = 1.414, uhh….
1.414^8 = 16, amazing!
In other words, in this case your model works for only 2 data points, but as you pointed out earlier, that doesn’t mean much. Graph it and it’s a no brainer.
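The same check in a few lines of throwaway code, for anyone who wants to see it run (a sketch of the argument above, not a benchmark):

    # If speedup were exponential with base 4**0.25 (~1.414), what would
    # 1, 2, 4, and 8 processors give, versus at-best-linear scaling?
    base = 4 ** 0.25

    for n in (1, 2, 4, 8):
        print(f"{n} procs: exponential model {base**n:5.2f}x, linear {n}x")

    # 1 proc -> 1.41x (should be 1x); 8 procs -> 16x (should be at most 8x).
    # The model only matches at n = 2 and n = 4, the two data points.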
Hey, I’m as big a Mac fanatic as the next guy, but this is a waste of money unless you really need one now. You’re buying into dead technology any way you look at it.
Dead technology? If you use this for your job it will keep on working for years, and software will be released for PPC for quite some time.
In 5 years, like any PC you buy today, its value will probably be about $100 anyway, so any PC will be dead technology in some distant future.
Why always compare Linux to MacOSX?
They’re totally different in the most important thing: target.
MacOSX is targeted at ease of use, productivity and innovation for the *end-user*…
Define “end user” as someone who wants to use a computer for various purposes (download the video from his camera, maybe, and quickly & easily make a DVD out of it; make music; watch DVDs; buy software at the local retailer and hey presto!, it works with his hardware…): OSX wins hands down.
no configuration whatsoever, intuitive, and crammed with easy little programs that do all of this as soon as you plug in the machine.
No googling around, or cvs, or you name it… (really: use it for a couple of days before ditching it; it deserves this intellectual honesty).
More advanced users, of course, must be defined too: if they have the time/will to configure their machines and find the right programs (if they exist on their platform of choice… Adobe/Autodesk/Dassault Systemes/Macromedia, anyone?) then they usually just make their choice and are happy.
Linux seems faster; if it can do for you what you need, *and* the amount of time and knowledge you can devote to having it work your way is little or acceptable, go for it… it’s even free!
But for most of the average non-technical users, MacOSX is amazing. Say Toyota gave you a slick, cool, wonderful car for $20k, loaded with accessories and guaranteed to easily do what you need, on or off road, with a great market and loads of accessories that really are plug-and-play. Would you complain because you could self-build a car based on a different engine, with free controlling software, always rough around the edges (anyone knows how easy it is to sketch something or build the fundamentals, but refining a product takes a lot more time), that outperforms the Toyota 40% of the time (in our case, what would that be? networking? high-end 3D for cinema companies? programming? pure scientific calculations? less than 5% of the userbase, it seems), but that requires extremely detailed knowledge of everything and a lot of luck for everything to work OK (remember: the pieces you use to build your car are made by others, the Gaim team, the kernel team, the Xorg team, and sometimes they simply don’t have the time/resources to make something work), and which cost, say, $10k? What would you buy?
Linux is great, and never ceases to amaze me (although it’ll probably never be on my laptop again… I devoted the last 6 years to it, and enough is enough for me), but my next choice will be MacOSX.
I simply want everything to work out of the box, and the great programs that make loads of things possible, and easy.
Just my 2c comparison 😉
Lorenzo
And you really think those 2 cents can be deducted from the Quad beast price? 🙂
Nah, Lorenzo’s just on a dark conquest promoting the use of Apple’s fiendishly clever but oh so expensive platform.
I don’t think it’s all that expensive. If you look at the total cost of ownership, Apple isn’t really that much more expensive.
To really compare Apple to the competition it should be enough to approximate the hardware and software specification as offered on an off-the-shelf Apple; take your pick of which machine you want to compare.
If you put the custom-built PC next to a Mac and they both do about the same thing (they can’t be exactly the same, so just go for a functionally equivalent specification), is the custom-built PC significantly cheaper than its equivalent Mac? I would really like to know (it’s not going to make me buy Linux or Windows though).
Have a look at this link: http://www.alienware.com/intro_pages/aurora_m7700/selector.aspx
An alienware AMD X2 (dualcore) laptop
I’m anxious to know how Apple is going to answer this, especially since HT on Intel CPUs isn’t exactly the egg of Columbus.
The quad Mac is a great value for the hardware, but I would still prefer a quad Opteron, for the usual reasons of peripheral choices. Even with Linux on G5, I believe some of the important Mac PCI peripherals out there won’t work. It will be a hoot if Intel cannot deliver for Apple, but given Intel’s vast resources, it would be astonishing if they didn’t deliver.
… should get a life.
Since Steve came back to Apple, one thing that clearly changed was support. Steve does not like supporting legacy anything. By legacy, I mean anything that is not the current version. Once Intel chips start appearing in Macs, you’re going to read from Apple how quickly everyone has transitioned to Intel. They will encourage everyone to drop their dual-processor, dual-core G5s and buy the newest, fastest Intel Macs.
I sold an 800MHz G3 iBook for about $800 USD on eBay only two days ago.
by not supporting legacy, they keep people spending money. Plus everyone benefits from the best technology that isn’t held back by the clutter and compatibility issues of yesteryear.
Unfortunately X86 is a step in the wrong direction. Since PPC is open now, it would have been wonderful if Intel made PPC variants for apple. Even Microsoft is shifting towards PPC with the arrival of XBox360.
Universal binaries, in my best estimation, will remain the standard until Apple decides PPC is truly dead and x86 is light-years ahead, or it becomes very difficult to support it in Xcode for some reason (unlikely).
cheers
> Even Microsoft is shifting towards PPC with the arrival of XBox360
Not quite; refer to MS’s past PPC support, e.g. Windows CE 2.x and Windows NT 4.0. In applications, MS supported PPC through MacOS X.
Microsoft’s Windows and server biz group has stated that they only support IA-32, IA-64 and x64.
The XBox product line belongs to another MS biz group.
That was one of the worst reviews I’ve ever read. He tried the system for *one* hour? 8 GPUs? 0.01% of buyers *need* 4TB of RAM? Let’s just hope this guy enjoys his bribe money in the Bahamas right now. Ugly stuff.
Unless you are the curator of a Macintosh museum, spending a lot of money on a quad-core G5 is a complete waste.
OS X on Pentium 4 has proven to be very fast. OS X on dual dual-core Xeons will completely smoke the G5. Unless you just can’t wait for a little more grunt, buying a PowerMac in 2006 is a far better choice.
Most publishing companies use Macs for many reasons: WYSIWYG, ColorSync, workflow, and the capability to automate workflow using AppleScript. Until this new PowerMac came out, the speed increases were fractional; now it is suddenly twice as fast. For the author, who is used to manipulating huge Photoshop files, or say Aperture or FCP, this will essentially double the amount of work they can do in one day (given that that is not exponential). Give the guy some slack. Everyone knows that there are other processors that are faster and other systems that are free, but for companies that have invested heavily in Mac software, this would pay for itself in a couple of days at a big publishing or media company. If Intels or Opterons running their favorite applications come on the scene in a few years, I am sure publishing companies will go that route.
Shouting or screaming that everyone who does not use your favorite system/software is a moron is going to get you nowhere.
(Hyper) threading is so fast and flawless by design.
Plz Mr Jobs let me run OSX legally on my AMD X2.
I know, it rankles all those PC users and build-it-yourselfers, when they hear anything positive about the Mac. I am a mac user and this makes me feel all warm and fuzzy despite the error with exponential and the Mac obviously being very slow and overpriced.
While you guys were building faster PCs: We got QuickTime
While you guys were building faster PCs: We got MacTV (first media center PC with remote))
While you guys were building faster PCs: We got eMate (first Tablet PC)
While you guys were building faster PCs: We got QuickTake (first Digital camera)
While you guys were building faster PCs: We got Newton (First PDA)
While you guys were building faster PCs: We got FireWire
While you guys were building faster PCs: We lost legacy ports
While you guys were building faster PCs: We got iPOD
While you guys were building faster PCs: We got iTunes
While you guys were building faster PCs: We got iMovie
While you guys were building faster PCs: We got iPhoto
While you guys were building faster PCs: We got Airport (First consumer WiFi capable laptops)
While you guys were building faster PCs: We got Final Cut Pro
While you guys were building faster PCs: We got GarageBand
While you guys were building faster PCs: We got Os X Tiger
While you guys were building faster PCs: We got iMac G5 (slim All-in-One)
While you guys were building faster PCs: We got Front Row, Photo Booth
While you guys were getting Viruses, Worms, Malware and Trojans: We got nothing!
While you guys were building faster PCs: We got…well you get the gist.
You guys did get a lot of first person shooter games though!