With the release of Ubuntu 8.10 only a few days away, Phoronix decided to take a look at the performance figures over the past releases – from Ubuntu 7.04 to Ubuntu 8.10. Phoronix used its own extensive test suite on fresh installations, with the same parameters, on the identical hardware. The results are rather surprising. Update: I’ve added some more information about this, gathered from the Ubuntu mailing list. You can find it in the ‘read more’.
The Phoronix test suite is rather extensive, as the website describes.
The tests we used included the BYTE Unix Benchmark, SciMark 2.0, SQLite, Tandem XML, eSpeak Speech Engine, timed Apache compilation, timed PHP compilation, timed ImageMagick compilation, Bonnie++, Flexible IO Tester, GnuPG, OpenSSL, LAME MP3 encoding, Ogg encoding, FLAC encoding, WavPack encoding, FFmpeg encoding, OpenArena, World of Padman, Unreal Tournament 2004 Demo, GtkPerf, Bork File Encrypter, Java SciMark 2.0, and RAMspeed.
Almost all the tests show a worrying pattern: over time, Ubuntu is getting slower. The Phoronix guys were surprised by this, so they re-ran their tests to rule out the possibility of a fluke – even though the tests run autonomously and multiple times each – but the results were consistent. “Major slowdowns after Ubuntu 7.04 ‘Feisty Fawn’ in so many different tests certainly weren’t what we had expected.”
While the benchmarks offer an intriguing insight, it’s important to note that each new release also offered new features which might affect performance. In addition, it’s a more or less grudgingly accepted rule that as time goes by, software gets fatter, slower, and more complicated, and that you need ever faster hardware in order to keep up. It appears that Ubuntu isn’t exempt from this – almost – universal constant.
Update: The news of this story hit the ubuntu-devel mailing list. It appears that the people on the mailing list are glad that Phoronix did not point fingers towards Ubuntu itself, but rather tried to look at the components that make up Ubuntu. The developers are already trying to see which major changes could account for some of the performance drops. Phoronix itself has stated that they are planning on running the same test suite on other distributions and hardware as well – Fedora’s results might be up by the end of the week.
This thread is worth a read:
https://lists.ubuntu.com/archives/ubuntu-devel/2008-October/026794.h…
Phoronix didn’t do a very good job, which is why I personally continue not to trust them as a source.
While many of the graphical regressions could be attributed to ATI driver changes (they should have used the shipped drivers for all releases unless they specifically needed that driver), I don’t see what invalidates some of the other benchmarks, such as LAME encoding and compilation. Many people even confirm those observations in the thread.
As for not blaming this on Ubuntu or other distributors: why on Earth don’t they have their own regression tests to help them? That’s their job, not that of individual projects. After all, they’re the ones packaging the whole; they only have six months between releases, and fixes for LTS will not be backported.
This behavior is described in social judgment theory (psychology).
It’s called an auto-reference judgment error: one attributes one’s own success to internal factors, but one’s failures to external factors.
Ex.
Ubuntu is faster: This is a clear advantage of open-source and Linux
Ubuntu is slower: The test is wrong, the author is biased, and/or it wasn’t configured properly
Well, there is also the fact that they only tested Ubuntu. Users who don’t know better will blame Ubuntu when the fault could lie with GCC, or with the kernel itself.
Phoronix says they will be testing other distros, but really they should’ve done this before publishing the article.
To be fair, with these benchmarks I would be more interested in how the 64-bit version would compare.
The fault is still with Ubuntu, and I don’t think you got that. They package the whole.
These people obviously did a poor job of benchmarking Ubuntu.
Hello! Ubuntu is open source.
There are hundreds of billions of people working on getting every single ounce of performance and security out of every line of code.
It’s just not possible for any Linux distro to get slower like that. Only Windows does that kind of hanky-panky.
If someone were able to coordinate just 15% of all the hundreds of trillions of programmers out there cross-checking everything into farting at the same time, it would obliterate not only the atmospheres of Jupiter and Saturn but would smack Charon and Xena right out of the solar system (thus giving Pluto its planet status back).
When I’m picking a server to place in my closet, out of old hand-me-down parts, to hold all my home videos of me brushing my teeth or cleaning the fridge, you can bet I’m going to choose a Linux-based OS.
They are faster, more secure, and more feature-complete than anything out there. I wouldn’t trust my home videos to Homeland Security, NASA, the NSA, the NRA, or the IRS, but a group of people who could blow away Jupiter with just a percent of them trying? Dude. Nothing can beat that kind of robustness.
So finally and in conclusion, this article is wrong. Linux based operating systems can only get better. They need to fix the benchmarks to show this just like nVidia does.
Given that some of the Ubuntu devs have acknowledged there may be some truth in this, your response sounds like head-in-the-sand fanboyism.
I think he was being sarcastic.
Heh. Then it was so far above my head it was a danger to air traffic. :o)
“It’s just not possible for any Linux distro to get slower like that. Only Windows does that kind of hanky-panky.”
Whatever. Nice rose coloured glasses by the way.
> Whatever. Nice rose coloured glasses by the way.
What rose coloured glasses? I don’t see any… Oh, is that — sarcasm?
No, all software has regressions, even OSS. Software may get slower as new features are added.
You failed to pick up the OP’s sarcasm. Read what he says about fixing the performance and the comparison to nVidia.
It was satire.
LOL! (sorry, useless comment, but it was a nice read early in the morning).
This almost makes me sad that LH just closed up shop. He would have gone to town with this.
Why was this comment modded down?
He has been respectful with his comments and in a sarcastic way describes some of the zealotry we see everyday in the FOSS world.
Maybe the community should be more open to this kind of comment, because they reflect a lot of reality about how the FOSS world moves.
But something is still wrong. LAME is primarily a CPU-bound process; to nearly double in encoding time suggests either that the compiler used had issues with the Intel optimizations, or that something was throttling the CPU.
Further, it is not clear what they are comparing. It is not release vs. release. To do that, they should be using the packages that came with each release (with the optimizations and choices that a normal user would be using), not a custom compile. The question is whether, and by how much, Ubuntu has slowed. A custom-compiled LAME encoder does not show how LAME encoding speed varies between releases, but how much other things on the system slowed a common version of LAME at the source level.
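As a rough illustration of how a suite like this measures a CPU-bound test (a minimal sketch, not the Phoronix Test Suite’s actual code): the workload is timed several times and the median is taken to smooth out scheduler noise. A real run would invoke the release’s shipped binary, e.g. the packaged LAME encoder; the pure-Python workload below is a stand-in so the sketch is self-contained.

```python
# Hedged sketch: repeated timing of a CPU-bound task, as an autonomous
# benchmark suite would do. The workload is a stand-in, not a real encode.
import statistics
import time

def cpu_workload():
    # Stand-in for a CPU-bound job such as an MP3 encode.
    total = 0
    for i in range(200_000):
        total += i * i
    return total

def benchmark(task, runs=5):
    """Time `task` `runs` times and return the median wall-clock seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        task()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

print(f"median of 5 runs: {benchmark(cpu_workload):.4f}s")
```

Taking the median rather than the mean makes a single outlier run (say, a background daemon waking up) far less likely to distort the result.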
Still though, the Phoronix writers are a fairly technically adept bunch. If it is a problem with methodology it is one rooted in the way the default Ubuntu install is configured. I guess over time we will see what shakes out.
that the developers are being proactive and are taking this so-called performance slowdown seriously. I think that’s pretty cool. And considering how difficult it is to benchmark open source software like that, with so many libraries and dependencies etc., I think the Phoronix guys and their test suite do a pretty darn good job, considering the suite exercises so many things.
Phoronix better be careful. Criticizing Ubuntu in any way shape or form will activate the swift vengeance of fanboys far and wide. They’ll waste no time ripping to shreds anyone who dares criticize, constructive or otherwise, their precious Ubuntu. It’s sad really, that a community can be so delusional in the face of raw data. Oh well, to each his own.
“Phoronix better be careful. Criticizing Ubuntu in any way shape or form will activate the swift vengeance of fanboys far and wide. They’ll waste no time ripping to shreds anyone who dares criticize, constructive or otherwise, their precious Ubuntu. It’s sad really, that a community can be so delusional in the face of raw data. Oh well, to each his own.”
This has also been my experience of the Ubuntu community. I once made the mistake of asking in an Ubuntu IRC channel if the boot process could be sped up, as I found it to be slow on my laptop.
Needless to say, I was ripped to shreds and banned for being a troublemaker, so it’s not really a community I wish to be a part of.
I find Ubuntu gets more and more pleasant to use each year. If it weren’t for a different show-stopping bug each time, I’d be using it more. Hopefully 8.10 irons out the out-of-the-box faults I had on my netbook; I’m in need of Firefox 3 and Asus won’t get off their arses.
I think I’m almost an Ubuntu fanboy. However, Intrepid does seem a touch slower to me – but I don’t mind a marginal drop in performance if the overall experience improves with every release.
Of more concern are the bugs that creep in – usually not show-stoppers, but…
Be aware that Firefox 3 along with KDE 4 are two programs that suffer badly from performance issues due to a bug in the XRender function within the binary nvidia driver for Linux.
http://www.phoronix.com/forums/showthread.php?t=11044
Fortunately, most netbooks use Intel graphics, so Firefox 3 performance should be great, but nevertheless it pays to be aware of issues such as this.
There is quite a lot of “Firefox 3 is fine on Windows, but it is a dog on Linux on the same hardware … is Mozilla neglecting the Linux version?” type of FUD flying about the net over this issue.
If your netbook has Intel graphics hardware, don’t be put off by the FUD about Firefox 3 on Linux. It won’t affect you, since this is a nvidia problem not a firefox problem.
Well, however they were configured, the tests show some big differences. I think this is normal: as hardware gets faster, software can take advantage of that and add more features. I don’t think this should be a huge deal, although since the devs are saying it could be an issue, I guess we’ll be seeing more performance work done (hopefully).
The more features you add to software, the slower it is going to get. Over time, hardware speeds increase, so you end up with more or less the same speeds, but software that does much, much more than it used to.
That is not the problem. The problem is that cannonical did not know this. An automated performance test suite should be part of any halfway serious development process. You need to know the impact your changes are having; even if you choose not to act on that information, it is vital to have. With it you can say “introducing feature X did not impact performance as much as we expected it to”, or “we have a huge performance hit on this release that we were not expecting; we’ve got to figure out what is causing it”.
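The kind of automated gate the commenter describes can be sketched in a few lines (a hypothetical illustration, not Canonical’s or Phoronix’s actual tooling; the test names, timings, and threshold are made up): store a baseline from the previous release, then flag any test that got slower by more than a tolerance.

```python
# Hedged sketch of a per-release performance regression gate.
# All test names and numbers below are invented for illustration.
def find_regressions(baseline, current, threshold=0.10):
    """Return tests whose current time exceeds the baseline time by more
    than `threshold` (0.10 = 10% slower). Times are in seconds; lower
    is better. Tests missing from `current` are skipped."""
    regressions = {}
    for test, base_time in baseline.items():
        new_time = current.get(test)
        if new_time is not None and new_time > base_time * (1 + threshold):
            regressions[test] = (base_time, new_time)
    return regressions

baseline = {"lame-encode": 22.1, "apache-build": 94.0, "gnupg": 10.3}
current  = {"lame-encode": 41.5, "apache-build": 95.2, "gnupg": 10.1}
print(find_regressions(baseline, current))
# → {'lame-encode': (22.1, 41.5)}
```

Only the LAME encode is flagged: it is roughly 88% slower, while the other two are within the 10% tolerance. Run at every milestone, such a check would have surfaced the slowdowns long before release day.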
> The problem is that cannonical [sic] did not know this.
This is truly the heart of it all. The Ubuntu development team is more amateur and less trustworthy than I expected.
Ubuntu as a distro tends to concentrate on Joe End Luser. Like all platforms with this focus, it slows down over time as a result of adding all the bells, whistles, and gongs that end lusers love (graphical this-and-that, daemons to make network configuration foolproof). The problem is that adding all these features not only slows down the system, it also makes the system more fragile.
Proves the saying.
Build a system that even a fool can use, and only a fool will use it.
Mandriva, SUSE… are “Joe Sixpack” oriented… and are faster.
How do you know this?
Not to be confrontational, but I would at least expect numbers. Personally I have not been able to “feel” a difference in speed between the various distros.
I’m curious to know whether the tests that they’re going to do on fedora will be any different.
Try them and see the difference.
Not really helpful in even attempting to quantify if they are slower or faster than each other.
“It appears that the people on the mailing list are glad that Phoronix did not point fingers towards Ubuntu itself, but rather tried to look at the components that make up Ubuntu.”
Yeah, great!
When there are good words about Ubuntu, it’s Ubuntu’s merit; when there are bad words, it’s the “components’” fault!
LOL!
They just take all the glory (while contributing almost nothing to the whole ecosystem) and leave the criticism to others.
If they want to prove performance, these should be disabled before recompiling (but then you have 7.04):
pulseaudio
PolicyKit
Networkmanager
some last resort xorg stuff
….
kernel stuff
Completely Fair Scheduler
selinux
preemption
Everyone knows there are new features. The point remains that Ubuntu is slower.
People are trying their best to find ways around the fact that new versions of Ubuntu are slower, just like new Windows versions. Vista gets burnt at the stake while Ubuntu gets unlimited excuses. Biased people!
It’s nice to finally meet a Vista user who admits Vista has a performance problem; admitting it is the first step.
The harsh reality is that I have Ubuntu on a machine that has no hope at all of running Vista, and it runs XP quite comfortably.
The other reality is that one is most probably caused by a well-known regression in GCC, or possibly the scheduler, rather than an artificial problem created to protect “premium content”.
It’s likely that this will be resolved, especially if it is the likely candidates, as work is ongoing in this very area – and it’s likely to be fixed in the two revisions before Vista2.
If you actually read the article, and read between the lines, you’ll see where the speed went:
Change in GCC
Change in GTK
Change in ATI drivers
Change in kernel timing and scheduling
The first is being worked on. GCC 4.0 was a major rewrite, and the GCC crew knew it would take a while to get the speed back.
The second is improving as the current major rev of GTK is improved.
The third is making strides now that ATI has made enough info available to make the drivers open.
The fourth may/will improve as the kernel team balances the new timing and scheduling algorithms.
So it’s all being worked on, it’s just that a number of systems all being revised at the same time coincidentally overlapped to make a major slowdown.
LOL – performance has possibly dropped (Phoronix is awful at benchmarking), but it would hardly be surprising. Anyone using the open source Intel drivers who moved from XAA to EXA, and soon to UXA (what!?), has had more than a little pain as the software goes through a transitional phase.
The funny thing is that performance isn’t particularly paramount; a drop only makes a difference if features/security/stability/other don’t justify it. If performance has dropped, the benefits can justify the performance drop, simply because it’s a fantastic release (love love love my wireless networking out of the box).
Now, the harsh reality: especially when talking about a simple test like Dhrystone, it’s not features – it’s something pretty low-level that affects everything, which is kind of useful to know.
What is fun, though, is that they have tested programs that make up part and parcel of the Linux experience that are faster than they were before. OpenArena is a good example; it has had some massive performance boosts with the ioquake3 engine.
That all said, it looks like performance problems have been introduced and need fixing, as it does not look like upper-layer stuff.
Now what is really, really funny is that this has less impact than my personal bugbear: the lack of OpenOffice.org 3.0 in Ubuntu, which outweighs *any* performance regression. It’s really stupid, and having it in backports is simply not good enough.
Now what’s really funny is the anti-Ubuntu fanboy rants, which outweigh any defense by any fanboy by an order of magnitude.
OpenOffice.org 3.0 is still very new. You have to realize that Ubuntu officially supports a huge number of languages. People expect a fully working office package, with well-working spell-checking and hyphenation for the languages they write in (unless those languages are very small). For example, OpenOffice.org 3 still doesn’t have those things in my mother tongue, so I’d rather still use OOo 2 – although I could get OOo 3 for Ubuntu 8.10 quite easily from the PPA repositories too (and I’m too busy to compile and test alpha or beta versions of hyphenation & spell-checking just now).
I know the reasoning behind the choice. It was clearly a hard choice. I think the pros for including it outweigh the cons; others think differently, and they could be right.
I have OpenOffice.org 3 working on my machine, and it was fairly trivial… but it’s not yet trivial for those people who want to try a new OS. Ibex is in so many ways wonderful.
I suspect I’m more disappointed than damning over this. It’s been such a good release. Wireless is just one area that is spectacular. I hoped so much for a perfect release this time. X was not as singing-and-dancing as it should have been. Mono is on 1.9.1, not 2.0. I know my Intel driver sits at 2.1.0, because the Intel drivers have not stabilized yet.
I know I have to wait a matter of weeks for these to be corrected, but read my posts – I was busting for this release to be a shining example of Linux+X+GNOME+stuff, and it hits in some spots but falls short in others, and Ubuntu only gets *one* launch. I suspect those distributions that are prepared to wait a few weeks will reap the rewards.
Don’t be too sure about OpenOffice 3 either, if you have an nvidia card and are using the nvidia proprietary driver. It seems that OpenOffice is another program, along with KDE 4 and Firefox 3, that is hit very badly by the XRender bug in the binary nvidia proprietary driver.
http://www.breakitdownblog.com/nvidias-linux-performance-woes-espec…
How bad is this bug? Try a performance degradation in the XRender function of apparently up to 50 TIMES slower.
http://developers.slashdot.org/article.pl?sid=03/08/16/0034235
I think it may eventuate that quite a bit of Linux software is getting tarred with a “poor performance” bad rap because of this nvidia bug.
PS: Phoronix used ATI for their Ubuntu comparison test. The problem is that ATI graphics drivers for Linux were similarly terrible for a long while, and only now are decent-performing ATI drivers for Linux becoming available.