Phoronix, known for their various speed tests and reviews, compared the latest in Ubuntu with what, until recently, was the latest in Mac OS X, across 29 different benchmarking tests. Some of the results were rather interesting.

The machine in question was, for OS X, an Apple Mac Mini with an Intel Core 2 Duo at 1.83GHz, integrated graphics, 1GB of DDR2 memory, and an 80GB hard drive. The reviewers failed to mention the Ubuntu system’s hardware specifications, sadly, so we’re a bit out of luck on those details. It was pointed out to me that Ubuntu 9.04 was actually tested on the same machine using Boot Camp; my mistake.
On the other hand, both operating systems in question were described in much detail: on the Mac, kernel 9.6.0 i386, X Server 1.3.0-apple22, OpenGL 1.2 APPLE-1.5.36, GCC 4.2.1, and Journaled HFS+ were used. Ubuntu 9.04 used the Linux 2.6.28 kernel, X Server 1.6.0, xf86-video-intel 2.6.3, OpenGL 1.4 Mesa 7.4, GCC 4.3.3, and the EXT3 file system.
Before giving detailed results of some of the specific tests: overall, the testing showed that Ubuntu was faster than Mac OS X in 18 of the 29 tests. Some wins were landslides, while others were decided by only marginal differences.
The first tests reported were the more 3D-game-centric ones. Since it’s no secret that neither Mac OS X nor Linux is the best system for gaming, and one doesn’t often read about them being compared in such circumstances, I was anxious to see the results of these particular tests. Mac OS X dominated this field with FPS measurements of 17, 18.2, and 16.2 at the respective resolutions of 800×600, 1024×768, and 1280×1024. Ubuntu only achieved FPS rates of 10.5, 7.15, and 4.4, respectively, a far cry from OS X’s performance. The game used for testing was “Urban Terror” 4.1.
Mac OS X also dominated the Java 2D Microbenchmark, rendering text 275% faster than Ubuntu; the Byte UNIX Benchmark (floating-point arithmetic); the SQLite v3.6.13 test, performing 12,500 SQL insertions in just over 25 seconds while Ubuntu took over 111; Crafty (a chess program), being six times faster in operations than Ubuntu; and PostgreSQL, with 1,028 TPC-B transactions per second while Ubuntu only managed 389.
Ubuntu’s dominant areas were the OpenSSL test, with 21 signs per second where OS X only managed seven; and two of the Java SciMark tests, posting about three times as many Mflops as OS X’s 91.05 and 20.14. Though Ubuntu was faster than OS X in just over half of the tests, many of its wins were only marginal, while a good many more of OS X’s were landslides.
In essence, Mac OS X 10.5.6 seems to be the better operating system according to these benchmarks, at least in the area of Mac Mini-esque hardware.
In other news, the website that conducted these tests could definitely use less obtrusive ads. This was murder to write.
OK, so you can justify testing graphics on Linux with an Intel card just because most people have Intel cards, but it’s no secret that Intel’s Linux drivers (whilst being open source and improving, blah blah) are nowhere near as good as Intel’s drivers on OS X or Windows.
Even so, it’s fair to say that this benchmark shows exactly how poor Intel performance on Linux/Ubuntu is, so maybe it might do some good and kick the xorg Intel driver team into making some better drivers.
The thing that gets my goat is that this benchmark, and the comments that go with it, suggest that Linux has poor graphics performance in general, which simply isn’t true; animation studios and people doing serious 3D graphics work use Linux, mostly with Nvidia graphics, since their support is way better than Intel’s and AMD’s.
They definitely should have used a system that would be decently supported by both OSes (Nvidia graphics). Linux lost all the benchmarks that involved graphics acceleration, which is unsurprising, I guess, considering the quality of the Intel graphics drivers.
Also, this article reminded me of why I never go to phoronix… page 1 of 34834, sigh.
I open The Register and what do I see
The thing about Ubuntu, and Linux in general, is that it evolves and improves so fast.
That might be true, but I have yet to see any graphics card which runs faster on Ubuntu (even the Nvidia ones, at least in the past, seemed quite consistently slower on Linux).
There are three problems here:
1) Having a kernel that forces people to provide open source drivers. Whilst Linus tolerates closed source drivers, it’s a risk to create them. And companies aren’t going to pour all their optimisations into open drivers from which any other company can simply lift those optimisations; otherwise it’s like handing money to the competition.
There is no good justification for this. If open source is that great, then such drivers will succeed regardless.
2) Ubuntu doesn’t develop Linux; it just grabs a bunch of packages which other distros have worked on. Canonical seems to concentrate only on its own projects.
Whilst the foundation of Ubuntu is shaky, Canonical is off spreading its resources further and starting other projects like the Netbook Remix, which has a shaky foundation too, because barely any of the drivers for either are complete. And I have seen NO evidence of Ubuntu trying to collaborate with other companies to determine their needs. Everything seems based on assumptions.
3) The community. I’ve learnt from the Ubuntu Brainstorm community that, frankly, the most vocal Linux users are idiots. That’s the biggest problem. I’ve argued with Linux users who believed that time shouldn’t be wasted on WYSIWYG editors because grandma should learn markup languages/CSS instead for her site. And I’ve argued against many users who were totally convinced that DEBs/RPMs are more secure than shell scripts. Ubuntu’s vocal population, I think, has turned too much into politicians who care more about spreading OSS than about making the best software.
Compare Qt/Cocoa to GTK, for instance. GTK obviously gets dominated in general cases, yet plenty of people seem to be on a crusade against C++. It’s ridiculous.
And because of the community, the end result is that Linux is still too risky to develop for.
I wouldn’t blame the xorg Intel team for this. If the community gave up their holy war and started once again writing the best software they can because they want to (not because of politics), you’d end up with an MIT-licensed kernel which was completely open in all ways, and software which was developed with users in mind.
This assumes that only companies can write optimised software. Not a valid assumption at all. The one and only advantage that companies have in writing software is that companies have access to secret information held by … companies.
Duh.
OK, so ATI have been good enough to release documentation recently, and even some code and a programming guide.
http://www.phoronix.com/scan.php?page=article&item=amd_r600_700_gui…
http://www.phoronix.com/scan.php?page=article&item=amd_r700_oss_3d&…
http://www.phoronix.com/scan.php?page=news_item&px=NzAxNg
http://www.phoronix.com/scan.php?page=news_item&px=NzE3Nw
Expect the decent, open-source 3D driver for ATI chips to follow within a month or so.
http://wiki.x.org/wiki/radeon
http://wiki.x.org/wiki/radeonhd%3Aexperimental_3D
Clearly not ready yet, but definitely on its way. Enjoy (when ready).
So ATI have released documentation (specifications) of their chips to open source programmers, and open source programmers are busily writing an open-source Linux driver for ATI chips. ATI chips will therefore soon become the most powerful graphics chips available with a decent (non-binary-blob) 3D driver for Linux, which will no doubt be supported directly within the kernel, and hence the Linux-buying public will tend to buy ATI chips.
This is giving money away … how exactly?
//3D driver for ATI chips to follow within a month or so.//
Sure. A month or so. Just wait, it’s right around the corner.
Take a look at the sheer AMOUNT of documentation released, and the complexity of such a graphics processor.
Now try to tell me that the open source community can write as good a driver as ATI’s for Windows in 1 month’s time.
Intel had open specs and open drivers for far longer than ATI, the chip is simpler, AND they actually have paid developers for the xorg driver. The Intel driver is probably the best-maintained driver for xorg right now.
The result?
The benchmark here clearly shows that the result is not what you would expect; the performance is awful. On my HP box at work, graphics performance is actually better in Vista than in Linux with the xorg Intel driver, even after tuning the driver’s migration-heuristics parameter (which made a HUGE difference).
Compare this to the Nvidia binary blob, which gives me the opposite result on another box. ATI is in an unusable state at the moment (I’m not taking sides with any graphics vendor; that’s just a fact).
Ummm…no. Intel drivers were suffering from a *BUG*.
This whole issue has nothing to do with having an open source kernel or gtk vs. cocoa or whatever.
It has everything to do with market share. Linux will get great desktop hardware support (3D graphics, wireless, ACPI) when it has decent desktop market share. Look at the server market right now. You don’t have to hunt for a server that’s supported by Linux; every server product is well supported. A manufacturer that put out a server product that wasn’t supported by Linux would be laughed at. All of that is because Linux is installed on a good portion of servers.
There is a lot of misinformation being spread right now about Linux having only a small market share. They are actually talking ONLY about the desktop market.
If we are talking about the entire market wherein the devices you mentioned (3d graphics, wireless, acpi) are used, Linux would have a very decent market share of that entire market. Perhaps 20% or so.
http://blog.linuxtoday.com/blog/2009/04/windows-owns-96.html
http://blog.canonical.com/?p=151
http://itmanagement.earthweb.com/osrc/article.php/3818696/Linux-Des…
The ONLY viable reason why a device maker would refuse to support Linux would be if they had been paid not to.
Yup, that has to be it. It’s a big dark conspiracy.
The idea that a company might actually sit down and work out the number of extra sales they’ll get by supporting Linux vs. the cost of supporting Linux, and come to the conclusion that supporting Linux doesn’t make financial sense, is preposterous. It has to be Microsoft, I tell you, they are EVIL!!!!
HA HA HA! I cannot believe how people think these days.
Developing software is a very expensive task. And developing drivers is even more expensive.
In particular, what Linux has achieved on the desktop with almost no commercial support is admirable, but thinking someone is paying people not to develop for Linux is really funny.
Well, I must say you are at least consistent in your ability to show you have absolutely zero clue as to how businesses, corporations, economies, markets, or even the world work. But then again, things like facts and truth are really just an obstacle for blind fanboys like you.
Wow, you’re delusional. Typical freetard response. http://linuxhaters.blogspot.com … learn some things, sonny.
Who cares about market share? If something works, good for it. Use it, or not, who cares?
Or if they’re stubborn, lazy, or clueless. (Note that I don’t mean to be insulting/condescending there, I consider myself all of those, heh.) It’s not easy supporting everything.
The problem is that it seems it is never going to happen until Linux gets its act together. I have heard “this is the year of desktop Linux” for the last 10 years.
Why? My guess: too many options, too many opinions, too many liberties, too much configuration.
Have you ever tried to teach a “normal user” the difference between RGB and BGR subpixel rendering, with hinting or no hinting, in the Ubuntu window configuration? People just look at your face and ask: how do I make it look nice? How do I make it look like Mac or Windows?
It’s been a while since I’ve seriously posted on OSNews, but this requires a response. In my professional opinion over the internet, you, sir, have no clue what you are talking about.
Hmmm, let’s see, what does Ubuntu develop? How about notify-osd, the new Growl-like notification system, which is controversial but very pretty? How about usplash, the userspace bootsplash (soon to be replaced, however)? How about upstart, the new event-based init replacement, good enough for both Fedora AND Debian to adopt in their default installs? Granted, it is (IMO) crap, but how about the Python ORM Storm? Do you even know what an ORM is used for?
How about patches.ubuntu.com? Yeah they’ve written a good bit of code. How many patches you ask?
jeff@desktopmonster:~$ wget -qO - http://patches.ubuntu.com/PATCHES | wc -l
2632
jeff@desktopmonster:~$
Seriously, how can you say “Whilst the foundation of Ubuntu is shaky…” when it is based on Debian? Are you saying there is a lack of Debian developers or packages? Sure, a lot of them are not friendly to people like yourself, but they aren’t ghosts. Frankly, you are talking out of your fourth point of contact. Call your proctologist and ask him to find your head, then take a shower and come back to play.
If you get the deb/rpm from a reputable source, like, oh, your distribution’s package repository, it will be GPG-signed. Yes, my friend, that is more secure than a shell script.
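For the doubters, this is roughly what apt checks under the hood: the repository’s Release file carries a detached GPG signature that can be verified by hand with gpgv (the paths and file names below are typical for Ubuntu of this era but will vary by mirror and release):

cd /var/lib/apt/lists
gpgv --keyring /etc/apt/trusted.gpg \
    archive.ubuntu.com_ubuntu_dists_jaunty_Release.gpg \
    archive.ubuntu.com_ubuntu_dists_jaunty_Release

A shell script pulled off some web page carries no chain of trust at all.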
Really, I shouldn’t have bitten, as you are obviously trolling. The only argument I totally agree with you on is Qt/Cocoa vs GTK: GTK is crap compared to either of those, but it is being worked on. Until next time, kids…
I have to agree with you on Ubuntu grabbing packages. Moreover, at least for me, the Intel performance of Linux is not that good. Why? On my AMD systems I see great improvements. Also, the quality of packages has degraded, not due to their development but due to packaging. A comparison with Gentoo would be fairer if Linux were the target. If GFX is the target, Ubuntu is fine.
Yeah, in a perfect world everybody sings Kumbaya.
In the real world we really need the GPL for all those individuals who’d like to take for themselves and shove the rest in the dirt.
I love hearing about movie studios using linux for their 3D work.
It would be great if OSNEWS could try contacting these studios and see if they would be willing to comment on it.
e.g. if they were using Mac or Windows on their client PCs before, how did switching to GNU/Linux improve things?
What Distro?
Anyway.
Not exactly Pixar, but for Big Buck Bunny at the Blender Institute we ran all 64-bit Linux workstations, with 4-8 GB of RAM and 2-, 4-, and 8-core PCs.
Whilst we didn’t benchmark the Nvidia cards on Windows and Linux, performance was generally good, and there was no way we were going to use 64-bit OS X or Windows for the short movie.
Time lapse of the studio
http://www.youtube.com/watch?v=6IcLxNVWBX4
Info about the PCs
http://www.bigbuckbunny.org/index.php/maqina-workstations-benchmark…
I think you are looking for this presentation:
http://www.linuxmovies.org/2008/fosdem.tux.with.shades.2008.pdf
And some people say graphics are shit on Linux. 😉
“And GIMP doesn’t compare to Photoshop”, well there is CinePaint ( http://en.wikipedia.org/wiki/CinePaint ).
It does things Photoshop can’t, and “CinePaint originated as a rewrite of the GIMP 8-bit engine in 1998 and still superficially resembles GIMP”. But it’s a bit specific to its field.
One thing that was obvious from that presentation is that Linux was mostly being used for the renderfarms, which is nice but doesn’t necessarily show how 3D graphics performs.
Even so, I have heard that a number of animation studios use Linux on the desktop too, though this wasn’t made clear in the presentation.
The hardware used was the same Mac Mini. At least, that’s the way I interpret it.
From the article:
“Ubuntu was running on this system via Apple’s BootCamp.”
I don’t know whether Boot Camp has anything to do with these results or not, but the system hardware is clearly the same.
Oh. So it is. I have no idea how I missed that; I read that paragraph maybe three or four times. Thanks for pointing it out!
BootCamp is just a boot loader (plus drivers for Windows); it shouldn’t have anything to do with the results.
The article was an interesting read, but it’s nothing more than bragging rights for a fan boy to say “My OS can beat up your OS.”
I’ve not seen this mentioned yet, so I thought I’d say it here.
The only thing you could say is that Boot Camp might mean the same hard disk was used, and certain parts of a hard disk are slower than others. I think partitions start at the outside of the platter and get slower the closer you get to the center, right? And the Apple software was probably installed first. I could be wrong, of course, but it got me thinking.
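If you want to test that theory on a Linux box, hdparm can time sequential reads from individual partitions (the device names here are just examples):

sudo hdparm -t /dev/sda1   # early partition, outer tracks: faster sequential reads
sudo hdparm -t /dev/sda4   # late partition, inner tracks: measurably slower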
I am sure OS X is optimised for a Mac Mini. It’s easier to get performance when you design both the software and the hardware.
Yes, and by the same token, Linux is not optimized for anything, so it’s slower.
As I mentioned in one of my recent comments, I’ve been troubleshooting a problem with a friend’s Macbook Air, so I’ve been getting re-acquainted with OS X.
There are probably real-world performance differences with 3D graphics on Intel, but in my normal use, the Air was significantly less responsive than my regular computer; in fact it wasn’t much faster than my netbook.
Embarrassing.
How were you using it? I find my MacBook Pro far more responsive than my Gnome/Linux desktop with similar specs. Did you leverage platform advantages like OSX’s consistent App-centric dock? For example, on Linux you wind up quitting the application constantly, just because you closed the last window. On OSX, quitting the application is typically a separate step, which means that your commonly used apps can be kept effectively preloaded. I really wish this would catch on for Linux.
In Linux, and most other OSes, recently used stuff gets cached, hence improving loading times. Just test it with OO.org: the first time, it’ll take a few seconds to load. Close it, open it again, and it’ll load almost instantly (see the quick demo below).
The OSX feature you’re mentioning is really about the application-centric approach of OSX’s interface; it doesn’t have much to do with technical aspects of the OS.
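You can watch the cache doing its job from any shell; the second read is served from RAM (the path is just an example):

time cat /usr/lib/openoffice/program/soffice.bin > /dev/null   # cold: read from disk
time cat /usr/lib/openoffice/program/soffice.bin > /dev/null   # warm: page cache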
Valid point, yes, but having something cached isn’t the same as having it loaded, running, and available to simply open a new window. Even comparing Gnome Terminal with the OSX terminal demonstrates that. In practical use, even with no terminal windows open I’ll have the application running, and a new window is nearly instantaneous.
Of course, you could always use XFCE.
If you want to keep the app. running, don’t close it. It isn’t rocket surgery.
Most dangerous job in the world: Rocket Surgeon … don’t cut the red wire!
See, that’s the thing. Just because you close the one window you have doesn’t mean you’re done with the application, and that’s the assumption many Windows and Linux applications seem to make.
Instead of hitting the X, why not hit the _ and minimize the app to the taskbar? Or put it on a separate desktop and pull it back when needed again?
Cosmetic troubles, IMNSHO.
For people who want to really do something about graphics on Linux, consider supporting the Open Graphics Project.
http://www.linuxfund.org/projects/ogd1/
And that is why I can’t be bothered with Linux on the desktop. For years there have been projects to do things right but they never get finished and then there is a call to arms to do the same task again but a different way.
Linux as a backend: no issues, and great. But for general desktop use, I’m over it.
I remember seeing so many comparisons between PowerPC machines and various machines running Linux and the whole monolithic kernel vs. micro kernel argument and how Mac OS X was always severely thumped in every performance comparison.
So, Mac OS X wins a few tests, good or bad, but why are people making excuses for Linux here? Wasn’t it fair in the past that the graphics drivers weren’t the best or that Canonical didn’t optimise anything when Linux was winning?
It’s not exactly that any of this matters since it’s not going to change anyone’s mind really. I’d be more interested in seeing how FreeBSD performs against Mac OS X since they share bits and pieces quite often.
Struggle is good. It gives us a constant goal, right?
I too saw those benchmarks, and it’s interesting how you ignore the follow-up, which explained why some of them were the result of the default configuration, as was the case with the MySQL benchmark. It has nothing to do with micro versus monolithic versus hybrid versus chocolate bar with sprinkles on top.
Mac OS X was designed first and foremost as a highly responsive desktop operating system. There are sacrifices when you focus on latency and responsiveness over throughput; and yes, when it comes to responsiveness, Linux doesn’t even come close to Mac OS X. If I counted the number of times my netbook became bogged down and poorly responsive with just a couple of applications open, versus Windows on the same machine, I’d be here all day.
It sounds more like you need to upgrade your RAM than get a kernel optimized for desktop use. It’s not exactly a secret that current mainstream Linux distributions are less memory-efficient than XP.
I find OS X to be downright viscous — as though there is perceivable latency between the input devices and the screen. It’s possible I’m just imagining things because of the way desktop effects slow down some other actions, but that’s how it feels.
Whoa, hang on: it has nothing to do with memory efficiency. This was running Arch Linux, whose total memory footprint was less than Windows XP’s, so it has nothing to do with the memory consumed. What it has to do with is the algorithms used to balance processes/threads so that the end user gets a responsive system.
Pardon? Nothing is slow for me; maybe it takes a second to open a window displaying the contents of the drive, or a couple of seconds for an application to load after clicking the dock, but what I am talking about is smoothness when running 3-4-5-6 applications at the same time. I couldn’t care less about the speed of one application all by its lonesome self; what I am talking about is a system under a reasonable load, and getting decent responsiveness from it.
If we’re talking about running multiple cpu-intensive applications on a single processor core I’ll have to concede the point; I don’t specifically recall the effect you described, but I can’t test it.
Out of curiosity, have you tried switching to a desktop-optimised kernel and seeing what happens? I would be hesitant to declare that the scheduler is the major factor in performance differences between two operating systems, but I’d be very interested to see what difference it makes.
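For reference, a rough sketch of how one might try that on Ubuntu of this era (I’m assuming the 9.04 kernel flavour names here; -generic is tuned for desktop latency, -server for throughput):

sudo apt-get install linux-server   # throughput-oriented flavour, for comparison
sudo reboot
uname -r                            # confirm which flavour actually booted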
As for OS X, my understanding is that low latency means a negligible delay between user input and the output appearing, which is exactly what I haven’t noticed while using it.
Funny, as responsiveness is one of my biggest gripes about Mac OS X. I’ve used the most powerful Macs you can imagine, and even those that herald the coming of the starborn ones (to paraphrase Yahtzee) have noticeable delay when launching applications or interacting with them (buttons, menus, etc.).
This is absolutely intolerable. Mac OS X is smooth, yes, but not when it comes to responsiveness.
Before I get the usual group of Mac fans on my bum: the above does not imply, in any way, that Windows does this any better.
I believe you are confusing things. Responsiveness has nothing to do with launching apps. It has to do with how the system takes care of your requests and events.
A slow application does not mean the system is not responsive enough; neither does a slow launch.
However, if a menu does not appear when pressed, that is a responsiveness issue. That is particularly true of applications written in Java, but those are unresponsive on every platform, even Windows.
Windows and Linux are very fast, but when you have your processors at full capacity, both systems get very unresponsive, especially Windows. Mac OS X usually keeps receiving events and behaving properly under the same circumstances.
Exactly.
So when I click a launcher, the app needs to be there instantly to receive brownie points. When I press the close button, it needs to disappear instantly for brownie points. When I press a menu button, the menu should appear instantly. Etc. Mac OS X simply does not perform optimally when it comes to responsiveness.
I’m from a BeOS world, and anything less than instant responses is evil and bad and should cause people to be fired.
Mac OS X totally sucks in this department, even on very powerful machines. Windows XP and especially Vista sucked balls here too, but Windows 7 seems to have nailed it pretty well (still not good enough, though). Sure, it needs tricks like SuperFetch and such to get there, but I’d rather have tricks getting me there than not getting there at all.
On 7, all the applications I use appear instantly – Chrome, Office, Miranda, you name it.
Except for blu, though. blu is a ridiculously beautiful Twitter client written in WPF, but it’s goddamn heavy on resources. In fact, it’s my most memory-intensive app.
I don’t count application loading times as “responsiveness.” It’s how responsive the system remains while loading something that matters. Caching apps in memory does not and will not work in every case, so when the system does need to start loading something, possibly even a really heavy app, it should do it in a way that keeps the rest of the system responsive and usable.
It is true. Mac OS X is not as responsive as it should be in the user interface department.
That’s not true. BeOS was very responsive (I used it), but it was not that responsive, especially considering how old the graphics interface in BeOS was. BeOS was not as responsive as Mac OS 9, for example. Mac OS 9 had its problems, but responsiveness was not one of them. The user always came first (except when the system hung itself).
Preemptive operating systems are not as responsive as cooperative multitasking systems, for obvious reasons, but you gain robustness.
BeOS’s display technology wasn’t even a tenth as sophisticated as what Mac OS X or Vista has. BeOS was pixel-based, very similar to Mac OS 9 and Windows XP. Everything was a bitmap.
Mac OS X, in contrast, is PDF/vector-based, heavily transparent (everything has an alpha channel, even if it is not used), double-buffered (it has to be slower, because everything is drawn to the screen twice), heavily anti-aliased, etc. How do you think those animations are made? Vista is similar to Mac OS X in that respect.
You’ve got to be kidding me, right? You’re pulling my leg, right? You’ve put up a camera next to my display, and in a few weeks I’ll be eating a bag of crisps and seeing my face on one of those crappy home video shows on TV.
Seriously. Have you ever USED Mac OS 9 and BeOS?
HA HA HA! Yes I used both of them.
BeOS was very fast, no doubt about it, especially because it was a true multitasking system. But we are talking here about responsiveness, and there are many things to consider. Let me explain.
For example, multitasking responsiveness… In that regard, BeOS was amazingly fast. The king of its time… and it might be the king even by today’s standards.
But when you talk about interface responsiveness within one application, like hitting a button or pulling down a menu… sorry, Mac OS 9 was faster. I did many tests at the time. The answer is simple: Mac OS 9 gave the whole processor to the frontmost app, while all the background processes were desperately craving attention. Event handling was usually coded on the same thread as the interface, so while the user was using the frontmost app, the whole processor was just waiting for the user.
If you tried to do something “heavy” in the background, like CD burning or printing, it was a disaster for the background process. Usually you lost CDs (buffer underruns) or spent hours printing a page. But the frontmost app’s buttons were responsive. Was it good to be so responsive in the user interface department? Maybe not. It has a lot to do with how you want the system to be perceived, but that is why cooperative multitasking was used on the personal systems of the time, Windows and Mac. Preemptive systems like BeOS, while more robust, did not “feel” fast. BeOS shared the processor among all processes, trying to give a fair portion of time to each one. Mac OS up to 9 did not.
Windows has a similar approach even these days; that’s why Windows does so badly in multitasking tests compared to Linux or any UNIX, and it’s also why so many users say Windows “feels” faster than Mac OS X, Linux, or any other UNIX.
See this, for example:
http://mobile.osnews.com/story.php/19769/Ubuntu-8.04-vs.-Windows-XP…
Of course, if you used BeOS on a 1 GHz Athlon you would not see the difference, just as many people swear Vista is fast on their 4 GHz quad-core machines… I remember I conducted my tests on a PowerPC at 180-225 MHz with 64 MB RAM. Many Mac users at the time noticed BeOS’s speed, especially at rendering things, but many said the user interface felt faster on the Mac. Again, the same words Windows people use today: “feels faster”.
I haven’t seen that kind of performance since version 10.4.x, even when it’s short on available real memory, though I’ve had stuttering from the virtual memory system when an application tries to implement its own system.
The graphics card has a lot to do with it, though, since OpenGL is used in many places. The early Intel-based machines with the early Intel graphics chipset lagged a lot, but then again, they weren’t able to access 226 MB of shared RAM in Mac OS X the way they could under Windows.
Until 10.4.x, Mac OS X was never interactively responsive for me. I know that was the intention but it never happened, even on dual processor machines. My Ubuntu machine feels better but even that has some odd performance foibles.
Responsiveness is heavily dependent on the machine’s load.
Under low-load conditions (meaning less than 100%), Linux is a bit less responsive than Windows. Under full load, Linux is by far more responsive than Windows (at least XP).
I had two machines with 4 cores each, the Linux machine even being slightly slower (2.8 GHz vs. 3.0 GHz).
I ran a finite element calculation on each using all 4 cores; both machines needed 1.5 GB RAM for the calculation and had plenty of RAM left over for other stuff. CPU utilisation was 100% on both machines, and both processes ran at standard priority. Neither machine had to swap.
On the WinXP machine it was not possible to do anything productive during the number crunching. On the slightly slower Linux machine, working was slightly less responsive than without load, but still good. And by the way, my work included software like Salome, GIMP and OpenOffice, where responsiveness definitely IS an issue.
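For anyone who wants to reproduce this kind of test without a finite element code handy, something like the stress utility (packaged as “stress” on Ubuntu; flags as I remember them) can stand in for the number cruncher:

stress --cpu 4 --timeout 600 &   # peg all four cores for ten minutes
# ...now use GIMP/OpenOffice and judge the responsiveness for yourself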
When looking into desktop performance, the high-load scenario is not the typical one; on the other hand, if you sometimes DO saturate your processor, still getting a good response is definitely a plus.
I think that everybody needs to decide on his own which behaviour is most satisfying to him.
I have experienced the same. However, Mac OS X under the same circumstances is even more responsive than Linux.
Some of these tests look like they should be compiler/processor limited so the OS shouldn’t really matter at all. I wonder, for instance, if they would have gotten the same result on the OpenSSL test from both OSes if they compiled the benchmark themselves with the same version of GCC.
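If someone wanted to rule that out, something roughly like this would take the distro-shipped binaries out of the equation (the version number is just a period-appropriate example):

tar xzf openssl-0.9.8k.tar.gz && cd openssl-0.9.8k
./config
make CC=gcc-4.2              # pin the same compiler version on both systems
apps/openssl speed rsa1024   # reports signs and verifies per second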
Tbh, they both suck if Urban Terror is the best game you can use to benchmark.
The main issue with the FPS test in Urban Terror is the difference in OpenGL versions. As stated, Ubuntu was using v1.4, while OSX was only using 1.2. Most games, especially ones as optimized as ioquake3 (the engine that drives Urban Terror), will use different render paths for different versions of OpenGL. Most likely, ioquake3 saw OSX was using an older version and turned off some of the features. In Ubuntu, the newer version triggered the newer features, which on Intel graphics is done via software since the Intel GPU hardly does anything in hardware.
So the difference wasn’t so much optimized Apple drivers vs unoptimized linux drivers as it was old rendering path with more hardware acceleration vs newer rendering path using software. I’d bet that if they redid the test with UT set to the lowest values for the rendering quality, both would show nearly the same FPS values.
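It’s easy enough to sanity-check what each driver advertises before drawing conclusions; glxinfo (from the mesa-utils package on Ubuntu, and under /usr/X11/bin on OS X with X11 installed) prints the string the engine keys off:

glxinfo | grep "OpenGL version"   # e.g. "1.4 Mesa 7.4" on the Ubuntu side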
No, I think the main issue is that they are benchmarking an irrelevant game which hardly anyone plays. What is the point of benchmarking an old game with low requirements anyway? Even if one OS were better than the other, both would still suck. I find the entire test irrelevant.
I guess there is some weird rationale for choosing this game for the benchmark, but while Linux and OS X are arguing over who has the better Urban Terror performance, Windows users are bragging about Crysis performance. See my point?
Would be really nice if all these test results and the accompanying pictures were set into some technical context.
How do you test?
What do you test?
What are the results?
What do the results say?
Just publishing test results does not give answers.
Anyone any idea whether fsync()/fdatasync() actually does anything on Mac OS X? The large performance difference for the SQLite and PostgreSQL benchmarks made me wonder about this.
It might be that Mac OS X is not turning off the hard drive’s write-back cache, whereas Ubuntu is. The Ubuntu figures for SQLite come out at around 112 transactions per second (assuming each insert is one transaction), while Mac OS X comes out at 500 transactions per second. The Ubuntu time is more consistent with a drive that has write-back caching disabled.
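Incidentally, SQLite exposes exactly this knob. On Mac OS X a plain fsync() does not force the drive’s write cache to the platter (that takes fcntl with F_FULLFSYNC), and SQLite only issues F_FULLFSYNC when asked, so the insert test could be rerun both ways on the same Mac:

sqlite3 test.db <<'EOF'
PRAGMA fullfsync = ON;   -- OS X only: use F_FULLFSYNC, which flushes the drive cache
CREATE TABLE t(x INTEGER);
INSERT INTO t VALUES(1); -- time a batch of these with the pragma ON vs OFF
EOF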
My experience with chess engines is that if they are compiled with the same compiler they run at exactly the same speed regardless of the OS. Could it be that for some reason the Mac OS X version used more processors than the Ubuntu version?
I am assuming both Crafty binaries were compiled with the same compiler. If not, then one also has to take into account that there is a large difference in performance among compilers.
At least until recently, the Intel compiler produced much faster code than GCC (when using both with runtime profiling).
I have a friend who built a £600 hackintosh which, when benchmarked, blew away the results of a £1500 Mac Pro.
Okay, the Mac Minis aren’t as completely overpriced as the Mac Pro; but why would anyone pay the extra for Apple hardware, except to be allowed to legitimately use Apple software?
Intel drivers in Jaunty are broken:
https://bugs.launchpad.net/bugs/314928
“Before giving detailed results of some of the specific tests: overall, the testing showed that Ubuntu was faster than Mac OS X in 18 of the 29 tests. Some wins were landslides, while others were decided by only marginal differences.”
(Sadly) this is wrong. Mac OS X won 17 of the 29 benchmarks. You probably missed the “fewer are better” labels on some of the tests.