“We always knew that WebKit is going to make Konqueror fast; but how much faster? Today we test that by putting Konqueror with KHTML through the SunSpider JavaScript Test and then do the same with WebKit. To get an idea of how fast they are compared to other browsers, we also decided to put Firefox 4.0 Beta 2 through the tests.”
Mozilla is still in the midst of its planned optimizations and of integrating JaegerMonkey into Firefox 4, so it’s still too early to make comparisons.
See:
http://arewefastyet.com
https://wiki.mozilla.org/JaegerMonkey
Oh OK, I’ll just travel to the future and grab a copy of Firefox from then, instead of being interested in how speeds compare between browsers right now.
I think the point being made was that it’s a comparison against last year’s model, and a more useful comparison would have included bars for stuff typical of this generation of browser (Chrome 5 or 6, Opera 10.50+, Safari 5).
Wait until Firefox 4 is released, and compare then. The current beta is still significantly behind the planned release version in performance.
“Wait until Firefox 4 is released, and compare then. The current beta is still significantly behind the planned release version in performance.”
I don’t understand this mentality. It sounds like you’re just trying to defend Firefox and keep it from looking bad, i.e. an excuse.
What’s wrong with comparing _current_ development to _current_ development? There will always be something later on; you can’t ask everyone to wait until the next big thing before allowing comparisons.
Firefox v4 will come when it’s ready and will undoubtedly get benchmarked then, but that’s then, not now. This is now.
Firefox does not actually look bad.
The article in question used Firefox just because it makes it more “newsworthy” to talk about Firefox.
Firefox is not the fastest at JavaScript at all, although they’re working on it; I can’t feel anything going “slow” in real life, except the initial startup.
HTML rendering itself is extremely competitive.
“Firefox does not actually look bad”
Indeed it doesn’t. That’s why I responded to the poster: there’s just no need to jump to Firefox’s defence like that. People just wanted to see how Konq-WebKit compares to a recent development version of Firefox, and they did, nothing else. There is absolutely nothing wrong with comparing something current against another current thing; just do another comparison later on if you aren’t satisfied with the current results.
Sure, there is nothing wrong with comparing anything to anything. The point was about the language of the article, which made the comparison look like something unexpected (given the expectation that Firefox 4 is fast).
Maybe he meant that the current release of the WebKit engine that was integrated into Konqueror has reached… well… release quality, while development of the new engine for Firefox 4 is still in progress…
You behave as if you’ve read a different benchmark than me.
The benchmark I read focused on the difference between KHTML and QtWebKit. Firefox was just thrown in to have a more mainstream browser as a rough comparison. Nobody claimed “OMG QtWebKit is so fast, Firefox can never reach it”.
It sounded like something unexpected:
“Surprisingly Konqueror with WebKit is also faster than Firefox 4.0 Beta 2”
If this were all about the integrated JaegerMonkey, that would be interesting. Otherwise there is nothing surprising here.
Firefox is still not the focus of that benchmark. Konqueror with its two engine choices is.
Just wait until the WebKit version that will come out right after FF 4.0 final is released, then.
WebKit has been out for years, and for years there were plans to drop the home-brewed KHTML in favor of WebKit. In fact it wasn’t difficult at all; I built a WebKit-enabled Konqueror version myself years ago.
But the KDE crew has taken this step only now. And there are still people complaining about big evil corps’ policies that inhibit tech progress…
As I understand it, QtWebKit didn’t expose the APIs required for feature parity when KDE 4.0 was released, and KDE’s API stability policy requires that all APIs present in 4.0 remain throughout the release cycle, so that might have been a drag or disincentive on KWebKit development.
(That API stability policy, and having been burned by an earlier GStreamer release, is actually what led to Phonon… apparently at least one GStreamer dev showed quite a lack of maturity during that episode.)
Of course, API stability and functional stability are two different things. As much as I like the idea of KDE, I switched to LXDE after waiting through 4.4, still hitting too many bugs, and discovering that one guy in a basement can easily have a more thought-out release-testing plan than the KDE guys. (It also doesn’t help that they see no problem with the “chuck it all out and start over” pattern that appears at every major version increment with regard to features and stability.)
So in other words, they couldn’t port it themselves without Qt’s port.
I’m not sure I understand your point.
The KHTML devs split in half a while ago: half continued development on it, insisting that Apple and WebKit were evil and that WebKit was not integrated into KDE well enough to replace KHTML.
The other half went to work for Trolltech integrating WebKit into Qt. Eventually those same people, along with a few others from the community who were trying to use WebKit in their projects, finished the WebKit/KDE integration work.
I’d like to point out, even if it’s easily forgotten, that KHTML had a HARD time getting proper code back from Apple. So yes, they’re a big evil corp; thankfully the license prevailed, and we now have a very good engine which is getting contributions from everywhere.
And that’s why *GPL* OSS > x (call this flamebait if you wish, it holds true here).
Much of the code in WebKit has been completely replaced since KHTML. That code is BSD-licensed, with no obligation to release it.
Collaborative open development is just beneficial for all involved parties, including Apple.
Ahhh, yes, “proper code”, whatever that is. You mean that the code wasn’t returned in the same messy state Apple had to start with, but in a completely different messy state?
Yes, they returned the code in a different form than the KHTML developers wanted, and there was much moaning, despite wholesale improvements to the code.
So, are there still people defending KHTML and painting Apple as the villain?
KHTML has to stay for the 4.x cycle. Binary compatibility.
And besides, good performance makes no one more or less evil. You’re mixing things here just for trolling, right?
Not to forget that there are still people working on KHTML…
The problem over the last few years with the WebKit integration was that not many people were working on it.
Back in the KDE 3 days, KHTML was the best OSS rendering engine around. It was the second browser to pass Acid2, long before most of the others. Code was backported from pre-WebKit Safari dumps and it worked with all websites. That is not the case anymore; the web is more complex and evolving faster than it was. You have to give credit to KHTML instead of bashing it: its days are over, but it was an OSS masterpiece.
That position was only taken by the KHTML team (and maybe still is; I don’t know, and frankly I don’t care). It was never the opinion of the majority within KDE.
Apple is a villain regardless.
Sure, Apple are still the villains.
If I remember correctly, the KDE guys had to threaten them with legal action before they made an (incomplete) release of WebKit’s source. So, knowing Apple, we can easily envision that without the KHTML team, WebKit would be yet another Trident.
KDE never threatened Apple with any legal action regarding WebKit.
From day one Apple followed the LGPL by releasing the sources. However, WebKit was not following an open development model with a public repository etc.
Development model and licensing are different things, and no FOSS license I’m aware of forces open development.
Before forking KHTML, Apple was an active contributor to Mozilla (Chimera/Camino, to be exact), so the people involved already knew how to be part of a community project.
Apple wasn’t a FOSS poster child, since it didn’t use public development right from the start, but it also wasn’t violating any license.
You must be confusing the matter with NeXT (Steve Jobs’ old company, bought by Apple in the late 1990s), which created a proprietary fork of GCC in the 1980s or so and then was visited by a few FSF lawyers.
Serious question: Is this difference mostly negated by the performance of your Internet connection?
It depends. If a web site has few scripts but many graphics, yes.
If a web site uses many scripts and few graphics (many Google services come to mind), it can be very noticeable.
JavaScript execution is not the only client-side performance factor. Complex HTML and CSS rendering also takes its toll even after all of a web site’s files have been downloaded (e.g. cracked.com is totally unusable with my Firefox installation).
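To illustrate the split, here is a toy sketch (not a real benchmark; the loop sizes and the offsetHeight read are purely illustrative) that separates pure script time from DOM/layout time when pasted into a page that has already finished downloading:

    // Toy sketch: rough split between pure script time and DOM/layout time.
    // Loop sizes are arbitrary; offsetHeight is read only to force a synchronous layout.
    var t0 = Date.now();
    var sum = 0;
    for (var i = 0; i < 1000000; i++) { sum += i % 7; }   // pure JavaScript work
    var t1 = Date.now();
    for (var j = 0; j < 200; j++) {
        var div = document.createElement('div');
        div.appendChild(document.createTextNode('row ' + j));
        document.body.appendChild(div);
        var h = div.offsetHeight;                         // forces layout/rendering work
    }
    var t2 = Date.now();
    alert('script: ' + (t1 - t0) + ' ms, DOM/layout: ' + (t2 - t1) + ' ms');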
Thanks for that info.
Also, cracked.com isn’t that bad; performance seems okay for me.
KHTML is being actively maintained and developed. It supports most of today’s rendering engine features and is not ‘far, far behind’ the others like many claim.
Most of the issues are not because KHTML is bad, but because websites don’t let it work. For example, Gmail, Google Maps and most other Google websites will work just fine if you change your user agent (see the toy sketch below).
Also, KHTML is not there just to keep binary compatibility. It’s there because it’s an actively maintained and developed KDE project which is very well integrated into KDE.
(I’m not a KHTML developer; I follow its development, however.)
http://khtml-konqueror.blogspot.com shows KHTML’s activity.
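To show what I mean by websites not letting it work, here is a toy version of the kind of user-agent sniffing such sites do; the check and the two function names (loadFullApp, loadBasicHtmlView) are made up for illustration, not anyone’s actual code:

    // Toy user-agent sniffing: an unrecognized engine gets the degraded fallback page.
    var ua = navigator.userAgent;
    if (ua.indexOf('Konqueror') !== -1) {
        loadBasicHtmlView();   // hypothetical: serve the stripped-down page
    } else {
        loadFullApp();         // hypothetical: serve the full-featured version
    }
    // Spoofing the user agent string makes the very same engine pass this check.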
This benchmark is not about how many web standards are supported but about how fast KHTML’s JavaScript execution is, and in all the posted benchmarks KHTML is far behind.
I wasn’t talking about the benchmark. I was talking about KHTML vs. the others in general.
Yeah, selecting your tests is great. So which do you like, JavaScript speed or rendering speed: http://ie.microsoft.com/testdrive/Performance/01FlyingImages/Defaul…
I’ve tried making KHTML work with Gmail and Google Docs. It sorta works, but often many parts of it do not work correctly. I think the biggest problem, as evidenced by this article, is that it’s comparatively super slow at JavaScript.
If they’re getting just over 1000 ms for the SunSpider test, they’ve got a long way to go.
I’m getting 480 ms on a Pentium D 940 with 4 GB RAM, using Epiphany trunk.
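(For context: a SunSpider score is just the summed wall-clock time of a batch of small JavaScript tests, roughly like this toy harness; testFunctions is a stand-in for the real test list, so treat the sketch as illustration only.)

    // Toy SunSpider-style harness: the score is the total milliseconds over all tests.
    // testFunctions stands in for the real suite (3d-cube, string-base64, ...).
    function runSuite(testFunctions) {
        var total = 0;
        for (var i = 0; i < testFunctions.length; i++) {
            var start = Date.now();
            testFunctions[i]();
            total += Date.now() - start;
        }
        return total;   // lower is better: ~480 ms here vs. the ~1000 ms mentioned above
    }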
I’ll have to test Konq 4.5 when Debian releases KDE 4.5 after Debian 6.0 is out the door.