Late last week, Nokia dropped what many consider to be a bomb on the WebM project: a list of patents that VP8 supposedly infringes, in the form of an IETF IPR declaration. The list has made the rounds around the web, often reported as proof that VP8 infringes upon Nokia’s patents. All this stuff rang a bell. Haven’t we been here before? Yup, we have, with another open source codec called Opus. Qualcomm and Huawei made the same claims as Nokia did, but their claims turned out to be completely bogus. As it turns out, this is standard practice in the dirty business of the patent licensing industry.
First, let’s talk about Nokia’s patents. As we’ve discussed before, VP8 has been proposed as a possible codec for WebRTC, a W3C API definition to enable browser-to-browser voice calling, video chat, and file sharing without the need for any plugins. If browsers supported this, we’d no longer need separate, often closed tools like Skype and FaceTime – in other words, a major step forward for everybody but those running said closed tools.
As part of the process of defining the standard, the Internet Engineering Task Force wants to find out if any proposed technologies are problematic on the intellectual property front, which is a very wise thing to do in today’s litigious environment. This is partly why Google decided to essentially put the MPEG-LA offside (without actually admitting infringement), so that the largest producer of ‘fear, uncertainty, and doubt’ around VP8 no longer plays a role.
Sadly, just as everything seemed to be cleared up around VP8, out of the blue comes Nokia, which has made an intellectual property rights declaration to the IETF. They dumped a long and scary-looking list of patents that they claim VP8 infringes upon. They also stated that they are unwilling to license the patents on a royalty-free or FRAND basis, and gave the following statement to Florian Mueller, a consultant paid by Oracle and Microsoft:
Nokia believes that open and collaborative efforts for standardization are in the best interests of consumers, innovators and the industry as a whole. We are now witnessing one company attempting to force the adoption of its proprietary technology, which offers no advantages over existing, widely deployed standards such as H.264 and infringes Nokia’s intellectual property. As a result, we have taken the unusual step of declaring to the Internet Engineering Task Force that we are not prepared to license any Nokia patents which may be needed to implement its RFC6386 specification for VP8, or for derivative codecs.
But how unusual is this step, really? I distinctly remembered something like this happening before, and after a short bit of digging, it turns out I wasn’t going crazy. In fact, the mandatory-to-implement audio codec selected for WebRTC – Opus – faced the exact same challenges; except for Opus, they came from Qualcomm and Huawei.
Opus is an audio codec developed mostly by Mozilla and the Xiph.org Foundation, with contributions from Skype and Broadcom, as detailed over at Ars. Patents covering Opus are owned by Broadcom, the Xiph.org Foundation itself, and Skype/Microsoft; however, all of these organisations have committed to making these patents available for royalty-free use.
Two other companies, Qualcomm and Huawei, were not so keen on Opus. The companies claimed they owned patents that Opus infringes, and made similar intellectual property rights declarations to the IETF as Nokia has just done (Qualcomm’s, Huawei’s). Like Nokia, and unlike Microsoft/Skype and Broadcom, Qualcomm and Huawei were unwilling to license these patents royalty-free. The end for Opus, right?
Well, no, not really. The Xiph.org Foundation decided to investigate Qualcomm’s and Huawei’s patent claims, and after a lot of legal legwork, they came to the conclusion that the claims were completely and utterly bogus – as detailed by LWN. While the Xiph.org Foundation first considered the harsh option of seeking a declaratory judgment, they instead decided to simply document why the claims were invalid. In fact, they went so far as to employ external counsel to review the patent claims. The conclusion?
When it comes to patents, it is difficult to say much without making lawyers nervous. However, we can say something quite direct: external counsel Dergosits & Noah has advised us that Opus can be implemented without the need to license the patents disclosed by Qualcomm, Huawei, or France Telecom.
They won, and Opus was declared ‘mandatory to implement’ for WebRTC with a ‘strong consensus’.
This raises a simple question: why do companies file these IPR declarations if the patents they list do not apply? The Xiph.org Foundation’s Christopher “Monty” Montgomery explains this quite well:
We deal with this in the IETF all the time. Someone files a draft and a slew of companies file IPR statements that claim they have patents that ‘may’ read on the draft. Unlike other SDOs though, the IETF requires them to actually list the patent numbers so we can analyze and refute. And despite unequivocal third-party analyses stating ‘there is no possibility patent X applies’, these companies still present their discredited IPR statements to ‘customers’ and mention that these customers may be sued if they don’t license. This is not the exception; this is standard operating procedure in the industry. This style of licensing, for example, accounts for more than half of Qualcomm’s total corporate income.
The root of the problem is that while these intellectual property rights declarations have no legal standing, they are free and simple to make, and there’s no obligation whatsoever to actually defend these statements. The press and bloggers automatically assume the declarations are valid, write their articles from said viewpoint, and thus, ‘fear, uncertainty, and doubt’ is born, which can then be exploited by companies such as Qualcomm.
The detailed article at LWN, which actually covers a talk given by Montgomery, provides additional insight into this blatant abuse of the system. It’s quite shocking.
The patent game is essentially a protection racket, [Montgomery] said, and those who are trying to do things royalty-free are messing things up for those who want to collect tolls. “The industry is pissed at Google because they won’t play the protection racket game”, he said. Qualcomm and others just list some patents that look like they could plausibly read on a royalty-free codec, because it doesn’t cost them anything.[…]
Companies “have figured out how to fight ‘free'”, Montgomery said, by making it illegal. In order to fight back through the courts, there would be an endless series of cases that would have to be won, and each of those wins would not hurt the companies at all. There is a “presumption of credibility” when a patent holder makes a claim of infringement, and the press “plays along with that”, he said. But Eben Moglen has pointed out that an accusation of infringement has no legal weight, so there is no real downside to making such a claim.
Which brings us right back to Nokia. It is my firm belief that Nokia is very busy trying to make sure the world is aware of its patent portfolio. Nokia hasn’t exactly been doing well lately – they only sold marginally more smartphones in Q4 2012 than in Q2 2012, despite brand new handsets and a new operating system release – and a very good way to drum up acquisition value is to emphasise the strength of its patent portfolio.
This intellectual property rights declaration has been very successful in putting the spotlight on Nokia’s patent portfolio. The press the world over has reported the supposed infringement as fact, even though the declaration itself isn’t even worth the bits it’s written with. The fact that this has the potential to derail the creation of a fantastic advance for the open web is of no concern to Nokia. Or, maybe I should say: to an Elop-led Nokia.
As Montgomery noted, one good thing is working in our favour: the IETF requires companies to list their patents. This means that work is now underway to check Nokia’s claims, or to possibly invalidate the patents in question. While the list of patents looks daunting, this is just scaremongering on Nokia’s part; when broken down, the list isn’t even half as scary as Nokia makes it out to be, due to lots of duplicates and continuations.
Let’s hope we’ll see the same thing happen with VP8 as happened with Opus: independent counsel declaring Nokia’s claims invalid. The race is on, and I have a sneaking suspicion the outcome will match that of Opus. Nokia’s listed patents are all old and have been publicly available for years, so it stands to reason that On2 (and later Google) designed around them.
Still, the damage has been done.
Contrary to the other companies named in your article, Nokia wins most of its patent battles, i.e. they know when they’re right.
I’m not gonna bother with your wall of text, big rant/propaganda as usual and even when you get refuted on the subject, you refuse to answer.
Also I don’t see how Nokia’s statement is relevant to the fact that Mueller is an advisor to Oracle/MS (something he publicly discloses).
My favourite recently was Google coming to a deal with MPEG-LA … and somehow Thom thought it was a win for Google.
Everyone is entitled to their opinion, but he shouldn’t be judging people like Mueller when he (along with Groklaw, yes, the site that is consistently biased towards Google, among other things, http://linux-blog.org/Disagreements-+-Groklaw-Deletion/) is at the other end of the spectrum.
I was more talking about the behaviour where he starts arguing and, once he realises he is wrong, never replies.
http://www.osnews.com/permalink?555238
I don’t have problems with opinion pieces, and sometimes I get useful links from this site. But I really get a bit bored of the big conspiracy articles, because the world and the companies themselves are really too complicated for this to occur.
That’s funny, I didn’t see you or bowkota replying to Valhalla’s post (http://www.osnews.com/thread?556636), which showed that both of you are wrong.
Different Story 😐
I have now replied.
Nokia is right… If you rewrite what they said.
Of course it wasn’t a “cock-up” when their lawyers explicitly wrote “proprietary technology such as Ogg”: it was FUD, a claim that Ogg was patent-encumbered. They claimed it, and it was on purpose.
I do not see how you can use the rest of Nokia’s argument to change the meaning: all I see is
-“we made our egoistical solution by choosing a proprietary solution in which we have an interest as we have invested in it and can collect royalties from it, so in our interest, don’t choose another option”,
-“disregard that we failed in the past to make it royalty-free to make it acceptable to everybody else but us, next time it will work”,
-“”free” is a lie, look, we put weasel quotes around it, also we assert that Ogg is proprietary and it is so obvious we do not feel the need to develop here”…
Why would they try fooling a technical standards body on something that could be looked up on wikipedia in seconds?
I would love to know your answer!
They don’t need to. They just need to fool enough “Technology” journalists who will faithfully regurgitate their claims verbatim without any independent fact checking.
Hey look, it’s working!
Do you think a tech standards council cares what some journo thinks? I doubt it.
False equivalency…
Being biased is not at all the same thing as being a paid mouthpiece… The issue isn’t Mueller being biased, the issue is that he is paid to be biased. I know of no evidence that Thom or Pamela are being paid for their opinions…
Really? Like when they claimed Ogg was proprietary (the same lie as they are making here concerning VP8):
http://boingboing.net/2007/12/09/nokia-to-w3c-ogg-is.html
He ‘came out’ quite late, which was likely due to him being about to be exposed.
Before that he pretended to be an independent observer of software patents, something few if anyone believed, as his reporting was so incredibly anti-Google slanted.
Actually, while Nokia did say that, it is massively over-quoted considering the rest of the PDF:
http://www.w3.org/2007/08/video/positions/Nokia.pdf
Firstly, the proprietary bit isn’t really relevant to patents whatsoever, so I don’t know why you are bringing it up.
I also think it is actually a cock-up … you can replace “proprietary” with “existing”, though I would agree with you that I am reaching a bit with that.
However, when you read it in context with the full block of text you can see where they are going with their argument. Also, why would they bother trying to fool a technical standards body, when it can be looked up on Wikipedia?
Also, Ogg isn’t a codec, it is a container format … so what does this have to do with a set of patents on VP8, which is a codec?
Why would any of the past companies try to fool the standards body? Also, you typically can’t look up patents on Wikipedia.
So you’re saying Google wasn’t guilty of infringement, so they cut a check? What the fuck kind of logic is that? If that’s the case, can I sue you, and you cut me a check because you are innocent, without admitting guilt? Tell me you really aren’t this naive… please.
So we should probably wait for the gloating of MPEG-LA about it.
It’s called a business deal. They made an announcement, and Google blatantly gave ’em a bundle of cash.
Cash-money is more important than gloating.
What’s to gloat about? They infringed, they paid. But don’t act like it didn’t happen, don’t act like it’s just “oh, we’ve been here before” behavior. It’s more predatory behavior from the same fucks that brought us “don’t be evil”, and I’ll be damned if that slogan shouldn’t be sitting right alongside “fair and balanced”.
Proof they infringed? You know that just because someone settled does not mean they infringed. It could mean the cost of litigation was much more than the settlement. That is the largest problem with patent fights, in my opinion. The loser should shoulder the entire expense.
No one has said that Google were infringing. Not even the MPEG-LA. You or I have no idea if VP8 really did infringe on those handful of patents the MPEG-LA had pooled.
Google would not hand over cash money otherwise would they?
If their options were 1) Hand over the cash and the problem goes away 2) Spend the same plus some on a long, drawn out court case and subsequent appeals, why not chose 1?
Either way, there are only two entities who know whether Google were infringing: Google and the MPEG-LA. Google have continued to say that they didn’t infringe and the MPEG-LA are staying silent on the matter.
Well we will never know then.
I rest my case.
Another patent rant from Thom at PatentNews.com
Don’t go judging on Thomklaw, you’ll get voted down.
The problem is, you’re just trolling.
You’re trolling because you pretend that all is well and good in patent law land, that poor, innocent Nokia is just asserting rights trampled by big, bad Google. As if it weren’t well known by now that patents are indeed more a way to stifle innovation and attack competitors than a way to protect your own innovation; that this specific behavior has already happened in exactly the same context, as remarked by Thom; and finally that poor, mismanaged Nokia might have other goals in mind, like trying to appear desirable in an acquisition because of their “intellectual property”. Note how I left aside any “conspiracy theory”.
You’re trolling because you pretend not to know that F. Mueller is a well known paid shill, a pawn in a war against Google, and that he’s been proven wrong many many times (not really that difficult a task when you side 100% with one of the parties, his anti-Google bias is so clear that I often wonder at the press picking up his statements).
You’re trolling because you imply that Groklaw is, viceversa, a pawn in Google’s service, while if there’s a bias that’s towards open standards and open source software; sure some of PJ’s conclusions or arguments are not always convincing, but most of her work is based on (crowd-sourced) FACTS, if you have a bone to pick with her start showing where she’s wrong with facts, ok?
So if you don’t like being voted down, here’s a simple solution: stop trolling.
Rehdon
Unfortunately, trolling on here also includes having a different opinion.
Having a different opinion is fine. People have different opinions here all the time.
What’s blatantly obvious to anyone reading this particular thread is that the first few comments posted offered absolutely ZERO insight, ZERO arguments as to why the article was supposedly wrong or flawed, something that can easily be classified as trolling – and people picked up on that, and voted accordingly.
You at least properly partake in the debate, with arguments. The downvoted commenters did not.
Damn right, Thom! That’s how it is. You either make your point with clear, concise and supported statements, or you keep your filthy tongue behind your teeth.
lucas_maximus: perhaps this website is just not for you. But wait, I have some good news! I’ve found another one that may serve you better: http://turnofftheinternet.com/
Well last time you basically said I was almost as bad as some troll off of some Apple evangelical blog. Make your mind up.
Also those comments were giving feedback to you, which you obviously didn’t want to take on board.
What are you doing on this site? You apparently don’t like the topics, the articles, the way they’re written, Thom, or the responses to your comments.
Sounds like you need to find another blog.
Aren’t you one of the ReactOS devs? Funny having someone re-implementing an existant OS from a patent-wielding company like Microsoft being so pro-software patent as you are.
It seems likely that ReactOS potentially infringes on Microsoft patents, at the very least FAT32, which Microsoft has made threats over; or does ReactOS not implement it?
I’m sure ReactOS tramples on all sorts of patents.
However ReactOS is a research project, it’s not being sold and it’s certainly not being used to compete with any patent holders.
I may not agree with some of the more crazy software patents issued by the USPTO, but I would certainly try to respect them. Google knowingly infringes on patents, releases it as free software, and then tries to control the internet with its stolen art.
Only because it is as of yet (and sadly likely always will be) too incomplete to pose as a viable alternative to Microsoft’s own Windows. That is hardly through intent though, but rather due to lack of developers and resources. Or are you deliberately making sure it’s not ‘too compatible’?
What patents would these be? I’d say Google would be crazy to knowingly infringe on patents given how much of a target they are.
Not going to court over patent claims is not the same as actually being guilty over patent infringement. As we’ve seen over and over again, patents are promiscuously granted by USPTO and their actual worth is up to a court to decide.
Court cases are long, expensive, and sometimes hard to predict (unless they’re in east Texas). So it’s often a last resort, as it’s still a risk even if you are certain you are not infringing.
How would they ‘control the internet’ with the royalty free vp8 (I’m assuming that is what you refer to with ‘stolen art’) ?
Quite. I can’t help but feel that I’m in backwards world, where open source, royalty-free code is “proprietary”, trying to give it away is “control”, and deliberately throwing a spanner in the works of standardisation is “collaboration”.
It’s Google hate.
No it isn’t.
They can enforce whatever codec is dominant through Android and Chrome; let’s not forget they own YouTube, which is the largest video site in the world and the second largest search engine (if you consider it to be one, which some may not) after Google itself.
Yet they haven’t done that. They could, but they haven’t. In fact they’ve done quite the opposite, by working with standardisation bodies to get VP8 & WebM properly included as a future web standard, rather than unilaterally deciding to make it a de-facto standard.
The point is that everywhere else it is h264, and h265 is already an accepted standard.
By the time they get it approved, it won’t matter whether it is a web standard anyway.
It’s not like Google cares about web standards when it doesn’t suit them. Look at the markup for Google; it is fecking awful.
It’s not about toppling the established codecs or about becoming the new, #1 codec to use, it’s all about the lowest common denominator: if VP8 or VP9 or some other royalty-free codec were to be defined as the web standard then ALL developers — mobile, desktop, embedded, those aiming for the established OSes, those aiming for niche OSes — would have at least one single codec to fall back on if everything else were to fail. As such it’ll always matter, just not in the way you seem to understand or care.
Actually I do care, I am a web developer.
You generally support anyone who is actually giving you money, i.e. your existing user base.
You support those scenarios that benefit your userbase, not some pretend baseline that doesn’t exist.
Anyway, it requires encoding the video four times.
http://luke-robbins.co.uk/video-on-the-web/
I already spoke about this last year. It is far easier to just support Flash for most people and h264 for iOS.
Which raises the question why you oppose the adoption of VP8/9. If VP8/9 gets declared mandatory-to-implement (like Opus), Apple will have no choice but to implement it or become non-compliant with web standards. This would mean VP8/9 becomes a baseline – requiring only one, single encode.
Why do you oppose this?
Rome wasn’t built in a day, and H.264 is not an option in any way, shape, or form because it’s proprietary and thus incompatible with the open web. VP8/9 is the only viable option to solve this issue, whether you like it or not.
I don’t.
I just don’t think it matters as much as you do, as there are two decent, already existing solutions that work now and have decent GPU acceleration support.
h265 will likely have this also.
The open web: what does it really mean?
If the majority of people can use flash or h264 to view video content, then surely that is open enough?
What kinda frustrates me is this idealistic notion that every single part of the net must be 100% free, it just isn’t going to happen.
Who f–king cares if some large company has to give another large company a pile of cash … it doesn’t change actually viewing the content for the end user, does it? Which is the whole point of using the internet.
Even the x264 developers in that GitHub discussion reckon that a patent-free codec won’t happen in time, so it won’t matter.
At the end of the day the only approach you can take is the pragmatic one which at the moment is flash and h264.
It isn’t. The MPEG-LA has made it VERY clear they will NOT shy away from suing individuals if they violate the H.264 licenses. Considering anybody who makes money off an encoded video violates said license (since even professional cameras and software do not allow for commercial usage of H.264 content), this is far more likely than you think.
The web should not move from one crippling and shackling technology – Flash – to another – H.264. That’s moving sideways, not forwards.
http://diveintohtml5.info/video.html
http://shaver.off.net/diary/2010/08/27/free-as-in-smokescreen/
Distributing the content via streaming isn’t a problem; the problem is creating devices or software that consume it.
This cost is usually part of either the hardware or the software. It does not affect those that are streaming it or where it is streamed from.
Then they probably can afford to pay the license, so what is the problem?
VP8 is about decreasing the money Google have to pay for encoding videos on youtube and including codecs in Android. Not about a “free and open web”.
It isn’t crippling, at no point have I felt crippled when I have seen a flash video, or by watching h264 video.
At the end of the day it doesn’t change the openness of the web in any way that matters, and that is what is really important.
If everything supported VP8 tomorrow then great I would be all for it.
The problem is your “probably”.
VP8 is about decreasing the money anybody has to pay for encoding videos and sharing them.
This gets tiring.
If you are producing a piece of software that encodes or decodes it, you have to pay a license fee.
If your software is used by a small number of people, you don’t have to pay anything.
VP8 benefits Google far more than anyone else. In fact, I would argue it so outweighs everyone else that they are almost an irrelevance.
And you think only those companies exist and will exist? If I launch a video startup, should I cough up money too? And my website? And my app? Everybody just has to pay, even when they can’t, even when they shouldn’t. And that goes against exactly the principle that made the internet what it is: a very low cost, a very low barrier of entry to the basic tools of exchange and communication, enabling anybody and his dog to build contacts, a community, a business, a movement.
You haven’t understood anything about the internet. You should pay 100 dollars for each new post you write; maybe then you’ll understand. We’re entitled to it anyway, for the time, the storage space, and the computational power you are wasting. Cough up the money or shut up?
Have you looked at how much the pricing is?
It is absolutely nothing if you are streaming it from a web page. If you are producing something which consumes it, i.e. a decoder or an encoder, then you have to pay.
If you are getting the software that does the encoding/decoding from a reputable source, then you have in all likelihood paid for it already.
Like it or not, all those things you listed as possible activities will have a cost associated with them.
Angry are we?
Yeah, and have you looked at how much the pricing was? I mean, before VP8 appeared and threatened h.264? The MPEG-LA backed off thanks to the same Google you bash.
And now, could you tell me how much the pricing for h.265 will be? Especially once VP8/9 has been killed thanks to the lazy efforts of short-sighted geniuses like you?
I am not bashing Google; it is just in their best interests for people to use VP8/VP9.
Sorry, people producing devices that will consume it will bear the cost. There is no sense in MPEG-LA charging those who distribute content or consume it on the web.
But you can be as angry at me as you like; the market has already decided on h264 and is likely to use h265.
Be as angry as you like, it won’t be as draconian as you make out, because if it is, everyone will use VP9.
VP8 has fine GPU acceleration support where it is needed (ARM SoCs for mobile devices):
http://wiki.webmproject.org/hardware/arm-socs
VP9 will likely have this also.
No. You encode twice, WebM and then h.264 for legacy browser support. You use <video> to use either of those files and then you nest inside a Flash block to load the h.264 file. Then you offer the WebM for download to anyone else who cannot watch in-browser.
If VP8 hits MTI, then you can stop making the h.264 files.
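The fallback chain described in the comment above can be sketched in markup. This is a rough, hypothetical sketch following the widely circulated Dive Into HTML5 pattern, not code from anyone in this thread; the file names (clip.webm, clip.mp4, player.swf) are placeholders.

```html
<!-- Hypothetical sketch: WebM first, H.264 second, Flash as a last resort.
     File names are placeholders, not real assets. -->
<video controls width="640" height="360">
  <source src="clip.webm" type='video/webm; codecs="vp8, vorbis"'>
  <source src="clip.mp4" type='video/mp4; codecs="avc1.42E01E, mp4a.40.2"'>
  <!-- Browsers without <video> support ignore the tags above
       and fall through to the nested Flash player. -->
  <object type="application/x-shockwave-flash" data="player.swf"
          width="640" height="360">
    <param name="flashvars" value="file=clip.mp4">
    <!-- Final fallback: a plain download link for the WebM file. -->
    <a href="clip.webm">Download the video (WebM)</a>
  </object>
</video>
```

If VP8 were ever declared mandatory-to-implement, the second `<source>` and the Flash block could eventually be dropped, leaving a single WebM encode.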
I forgot Flash supported h264. However, WebM wasn’t supported by Firefox at the time of writing, so I should have said three times.
I will update it, thanks.
Ooooh, so you only support IE?
Flash works in Chrome, Firefox, Opera and IE.
Radio stop being a dick please.
Lie.
Flash works in Chrome, Firefox, Opera and IE on Windows. It does not work on any iOS device. Aren’t you pissed off at Apple that they caused fragmentation in their own interest?
But I was talking about the IE-specific interpretation of CSS, and the use of ActiveX. A webpage is not just a video player. IE is the dominant browser and that is not going to change; should you support only IE, and too bad for everybody else?
I wasn’t talking about mobile platforms. H264 has good support on iOS and Android. I don’t develop for mobile platforms.
Your reading comprehension is a little weak it seems.
h.264 is not an accepted standard, it is a de facto standard.
h.265 is a standard only amongst big companies who have put their money in it, and who want to tie everybody else in it to extract royalties. It is all about increasing their money, not the common good.
It is an accepted standard for video, just not for the web; however, it is, as you say, the de facto standard.
Except they can’t get any royalties from streaming which is 100% internet-centric.
So as I keep repeating it not a problem.
The point is that everywhere else it is h264, and h265 is already an accepted standard for commercial endeavors where there is someone willing to pay for it.
There, I fixed that for you.
So? Video on the web is working well enough.
There is no charge for streaming h264, and I doubt they will change this for h265.
I find it really, really frustrating that you guys are pretending this is a problem when it really isn’t.
The W3C could have chosen theora back in 2007-8 if they got their act in gear.
Whine all you like. It doesn’t change a thing.
Oh? You “doubt” it will change? How come? Have you forgotten how much they tried to charge for h.264 streaming?
http://www.osnews.com/story/22812/MPEG-LA_Further_Solidifies_Theora…
http://www.osnews.com/story/22828/MPEG-LA_Will_Not_Change_h264_Lice…
They can’t pull it again, otherwise everyone will use VP9. You are a moron.
Anyway, h265 is here to stay.
Does that count as enforcement?
If they change the codec to VP8/9 only and make themselves incompatible with existing browsers and devices, aren’t they just hurting themselves? Are they blocking the way for other browser makers to implement VP8/9 or extract a levy from them?
If they put it in Android and Chrome, why would it be “enforcement”?
H.264 has been enforced. I never had the choice, no public certification commitee has been consulted, and nobody should have been able to implement it freely – until the MPEG-LA suddenly came out to declare it free to use for usual folks* (*conditions apply).
When you have the second most popular desktop browser, the most popular mobile phone platform (maybe) and the most popular video site on the net … umm, if this were Microsoft I would say the sentiments would be skewed the other way.
It’s the better codec, in every single way.
Microsoft did not have “the second most popular browser”… What is so difficult to understand in the definition of monopoly? Soon you’ll tell us that a 25% market share is a monopoly!
and if it was Microsoft… They would try to impose their proprietary technology, tied with Windows, and different from W3C standards. Which is what the MPEG-LA/Nokia is doing, and the opposite of what Google is doing.
It is hard to be more wrong than you are, but damn you’re trying.
I was talking about Google and Chrome.
This gets boring; you are really angry and not really reading anything I say. Oh well.
So, you knowingly infringe on patents and release it as free software?
“Just a research project” is as bogus an excuse as excuses go, and the affirmation that you do not compete… Well, by releasing it “free”, you undercut the patent holders, who would have no problem arguing it in court. Bogus bogus bogus.
Google tries to control the web. Right. By making a better product, under licenses so permissive that they will never be able to “bait and switch” the users. While everybody else churns out brain-dead patents to be sure to scare anybody who could come close to the same ideas, and locks up wide portions of computing.
I can’t understand why anyone would mod this bullsh*t up…
Do you have any idea how expensive it is to write software that is knowingly free of patent issues??? You make it sound like people doing OSS are just running around intentionally trampling on patents like gleeful children… The problem isn’t anyone knowingly doing it – the problem is it is virtually impossible to find out if your code is infringing without spending huge amounts of $$$ on lawyers and staff to do patent clearances.
Sure, ReactOS is very likely infringing on some patents somewhere. But so is just about every piece of OSS in existence. So what? If the patent holder has a beef, they can take it up with the project and it will usually get worked out amiably. If your project is non-profit, for the most part there is no incentive at all for a patent holder to bother with it. Most of the OSS world ignores patents – and there is nothing at all wrong with that.
You imply that taking this approach is somehow “wrong”. Why? The entire system is geared to protect financial gain. Most OSS projects are not in it for financial gain – therefore there is no justification for the expense of doing patent clearances. Patents are simply ignored – and unless there is genuine damage being done, why would anyone care? Do you really think that ReactOS is hurting Microsoft sales right now?
No.
Otherwise your project would have been shut down (at best; I can imagine you would be in prison or bankrupt in the worst case scenarios).
So if it is soooo difficult, why do you not cut some slack to Google and keep accusing them of “knowingly infring[ing] on patents and releas[ing] it as free software”, of “trying to control the internet with stolen art” like a B-movie Fu-Manchu? With, guess what, OSS software.
Or are you too thick to notice the irony? Well, you already are oblivious to the contradictions in your own arguments, so…
Dude… You REALLY have me confused with someone else…
Someone is angry!
Do you have any idea how expensive it is to write software that is knowingly free of patent issues???
I was under the impression the VP8 guys were doing exactly that.
You make it sound like people doing OSS are just running around intentionally trampling on patents like gleeful children.
You may try to argue with ReactOS troll about “stolen art”.
That's my point exactly. Google can afford it – a small OSS project with no corporate backing cannot and shouldn't even be expected to.
I never said I agreed with him. He is perfectly entitled to his conflicted viewpoint as far as I am concerned; I'm not defending it… I'm just saying that picking on him for not doing a full patent clearance on a small, community-driven OSS project is over the top – no one else does either.
Long and happy, the sun gobbled up 98.86% of the matter. Now we revolve around its huge gravity and raging power. Corporate finance is that sun.
The original starring roles probably channeled moral energies to the willing of the construction of civilization. We do not worship the sun, do we?
Heirs naturally fail to appreciate. They too could contribute to an exponential drive, but can get lazy and abuse even each other.
The best financiers must unnaturally worry that the algorithm needs to be reinvented. Does one say that that judgement (reinvention) is unlikely to flow because of money?
Yes, and there are healthy carpenters with tools who refuse to work. Controlled breeding actually constructs things.
But patents cannot stop or otherwise control the romance of the find, the harem warehousing, and the reuse of algorithms or OS’s in any recent information age, and for this reason the corporate or other financiers who advise to cause what Thom wrote is (“cost” free, and free libertarian “marketing”) happening are reason alone to despise the outrageous competition between Jones’s whose “discernment” will radiate the reward that Thom feels are gamma-rays, unlike any gentle yellow-dwarf of a sun ever worshipped.
It costs the original starring company money to 1) organize, 2) pay the coders, 3) decide which code is truly unique, 4) apply for patent. The patents are then (assumably) rightly rewarded, and the financial investment is (rightly?) rewarded. That is all previous money.
Where is the problem we perceive as stoppage of flow?
Coders should be paid like Wall Street physics doctorates? We do not worship coders, do we? Yet they are the true (if Marxian) masters, aren't they? (Recent information ages are for thought, and is that food too?) One cannot stop or otherwise control the love of computing or OS's, or money, so when is the full flavor a mix of flavors? Which financial intermediaries who broker what? Protection? Information? Protection of information? Protection against processes?
There probably should be one large monetary reward for each improvement of an algorithm for each deemed homo-faber type person and helpmates, and none for pure echolalia. Then one does evolve a sort of Stepanovian/Stroustrupian maximum, understood or cared by only a few human financier beings.
WTH? :>
As far as I remember, the FAT32 – patent was deemed invalid. Microsoft does hold a patent on FAT, but alas, I think there are work-arounds for that, too. I can’t remember any details right now, but there was some discussion about this exact topic some while ago here on OSNews.
The patent patrol is out in force judging by the early comments.
Look, I’m in favor of protecting intellectual property rights (FLOSS wouldn’t be possible without such protections) but software patents give me the willies. They tend to be so nebulous as to be useless for purposes other than protection racket schemes. With that said, I look at something like this as kind of a good thing.
For starters, what the IETF is looking to implement is an inherently good thing. It's a royalty-free codec that allows for communication using the web with capabilities previously exclusive to proprietary tools. Considering the heady idealism that fueled the early internet, this is a logical continuation. Anyone can run their own web server, mail server, chat server, etc., so why not person-to-person video chat? Why not plugin-less video/audio? Why not create an experience that provides a great baseline for everyone who has access? This certainly doesn't preclude anyone from extending things to enhance it further.
These days, the internet is incredibly personal and incredibly corporate at the same time. Settling on a lowest common denominator for audio/video enables it to continue in that uneasy lock step. If Nokia wants to assert rights over that lowest-common-denominator video codec, better now than later.
If they are right and VP8 does infringe on their rights, then those “violations” can be coded around and we can achieve the ideal. If not, then we can proceed without that encumbrance. I don’t think this changes the end game at all.
So, all of that to say I hope things are sorted out and progress can proceed.
Edited 2013-03-25 23:10 UTC
What would be really cool about this patent mess is if we simply changed some rules:
1- Claiming you have patents under the hood while keeping them secret shouldn't be enough. Each time a company claims to hold a patent, it should open it for review.
2- If the claim was wrong, the patent is automatically invalidated and thrown into the public domain: never leave a robber with his weapon!
3- If some FUD was successful in the past, some anti-litigation contracts were signed, and the patent is finally found to be invalid, let the trolling company pay back the fees… twice!
These three simple rules would make claiming companies a bit more careful before making great noise.
Kochise
Edited 2013-03-26 10:13 UTC
I think software should be protected by copyright, not patents. It is a bit like patenting story ideas. Can you imagine a world in which you have to pay royalties to Romero each time you create something involving zombies?
It seems that Google eliminating the MPEG LA from the picture was not enough.
But now that the patents that Nokia thinks may apply to VP8 have been disclosed, they can be analyzed in greater detail and in the end the codec will have much uncertainty removed about its patent situation.
Nokia itself is currently digging a PR hole. Several previously impartial observers have now voiced their disapproval of the company and its practices. Actually, only the usual Microsoft apologists defend Nokia now.
It seems to me that Nokia is all but officially a Microsoft subsidiary these days. You can change ‘Nokia says…’ to ‘Microsoft says…’ as far as I’m concerned.
I concur. While I still like M$ for their useful Office products and still excellent Windows 7 release (don’t get me started on Windows 8), I really dislike their use of other companies as a proxy to fight their war against their competition.
So you're saying that because some companies said some bogus stuff about a given codec, that automatically means any similar claim by any company about a totally unrelated codec is likewise bound to be bogus?
You do see the sort of irrational nature of this argument, right?
That’s indeed a very irrational argument.
That’s probably why that’s not the argument I’m making.
Ok, well, that's the argument you appear to be making. Could you perhaps restate it for those of us not in the know?
That does not appear to be the argument he is making at all. He is pointing out that this sort of thing is not unusual (although Nokia claims it is) and that they really only have a small number of patents they are claiming (again contrary to Nokia's claims). Nowhere did he say that Nokia's claims were bogus, just that they had to list the patent numbers so we MAY see if they are bogus or not.
Reading comprehension, my friend.
My question is how much did they pay MPEG LA? Everyone acts like Google just bowled them over when in reality they paid MPEG LA, which means VP8 is now covered by license agreements. (Sounds familiar)
They're gonna have to pay Nokia too, or Nokia might start suing Google directly over Android.
Gonna be messy. LOL.
When these kinds of claims occur, I actually find the end results productive, rather than dwelling on the painful process. After all is said and done, if (and hopefully when) the codec comes out clean, the IETF will have legal grounds to claim the codec is indeed patent-free.
If Nokia really wins, we can still find something else. It will be a slight setback, but not a significant loss.
Basically, I prefer an open battle to sneaky shenanigans like the GIF/LZW patents back in the day.
It is not good. Google did a very thorough patent examination of VP8 before they bought On2, and that examination revealed that On2 had done their work well, and had avoided other patents.
The first troll claiming patents that read upon VP8 (MPEG LA) has fallen by the wayside, granting Google permission to continue to offer VP8 royalty-free to everybody (even downstream re-implementers) for a small one-off fee that is merely a pay-off to avoid the costs of a trial.
Google’s original assessment of VP8 patent status withstood that first challenge.
Now comes the second challenger to VP8, with even weaker-sounding claims. This will, no doubt, do great harm to Nokia’s rapidly deteriorating public image, and it may end up costing Nokia as much to attack Google (and fail) as it did for Oracle before them.
Who loses out of this?
Google: court costs in defending their VP8 IP.
Public: No codec for WebRTC, a W3C API definition to enable browser-to-browser voice calling.
Nokia: court costs, complete loss of any remaining goodwill, crashing share price.
Who wins?
Lawyers: fees.
Skype: no competition from browser-to-browser voice calling.
Hmmmmmm.
Edited 2013-03-26 08:10 UTC
Looking at the big picture here, MPEGLA members want to corner the market for video codecs so that they can continuously collect royalties on all things video, be it on the web or on physical media.
As such, vp8 is a huge threat to them on the web as it is a quality video codec which is royalty free to use and implement.
Worse than the existence of VP8 is the prospect of it becoming an HTML5 video standard and thus mandatory to implement for HTML5 compliance; the same goes for RTC.
Obviously this would mean even less incentive to license their h264 codec for anything web-video related, as you could suddenly be certain that by targeting the royalty-free vp8 codec you would support all HTML5-compatible browsers.
When MPEGLA agreed to back off vp8 through their deal with Google, it seemed almost too good to be true given what they stood to lose from vp8 becoming a web standard; as it turns out, it likely was. It seems to me now that MPEGLA were well aware of the pending Nokia patent claims, and possibly others lined up behind them.
As it stands, Nokia (or any other patent sock puppet which comes after them) doesn’t have to win in court, they just need to cast enough of a patent cloud over vp8 for it to not be accepted as a HTML5/RTC standard codec.
H.264 and H.265 can never become official mandatory standard codecs because they demand royalties, but that doesn't matter: if the standard can't be vp8, which is a technically competitive codec, the W3C will be forced to use a technically ancient codec like mpeg2.
And if mpeg2 is used as the standard and is thus mandatory to implement, it will still never be used in practice due to its poor quality by today's standards.
This will lead to h.264 (and later h.265) being the ‘standard’ in practice and allow MPEGLA members to harvest royalties.
Sad thing is that even if Google wins this patent fight with Nokia, there will likely be another patent holder outside of MPEGLA suddenly emerging with a claim on vp8 technology which will then have to be dragged through courts.
In short, MPEGLA will fight tooth and nail to prevent any competition to their video codecs, and by continuously keeping vp8 under patent infringement claims (which don't need to hold any water under actual court scrutiny) MPEGLA will be able to prevent vp8 becoming mandatory in HTML5 browsers and RTC.
As such, I now doubt we will ever be able to enjoy the benefits of a royalty-free quality codec being standardised across browsers. MPEGLA members stand to lose too much potential royalty revenue and will make sure there's always going to be some patent cloud lingering over vp8, at least until h264/h265 is so cemented as the 'de facto' HTML5/RTC standard that it doesn't matter anymore.
The result is that everyone but the MPEGLA members loses out, as competition in this field was non-existent until vp8 showed up and will go back to being non-existent without vp8.
The royalties also pose a barrier to entry which shuts out many smaller players as well as open source projects, a barrier which would not exist if we had a royalty-free standardised video codec.
Here's a bigger picture for you.
VP8 is and has been inferior to H.264 in every single way. Performance, power savings, adoption, you name it.
Moving forward, H265 and VP9 will turn out the same.
Have a look at the amount of info on each one of them.
http://en.wikipedia.org/wiki/VP9
http://en.wikipedia.org/wiki/High_Efficiency_Video_Coding
Good thing VP9 is "open". Look at the endless amount of information on it and the widespread adoption. Oh no wait, that's H.265!
Adoption, obviously, given how long h264 has been around. Power savings: I have not seen any comparison between hardware-based h264 and vp8 decoding; do you have any links? Performance: depends on what you mean by 'performance' – decoding/encoding speed? Visual quality (here h264 is better thanks to excellent mature encoders like x264, but certainly not by a wide margin)?
And it’s certainly not inferior in cost.
Yes it’s open, here is the git repository where you can follow the actual development in real-time, modify/build it yourself and use it, send patches, examine the code.
http://git.chromium.org/gitweb/?p=webm/libvpx.git;a=summary
The vp9 spec is not yet finalized but there are design documentation and progress reports:
http://downloads.webmproject.org/ngov2012/pdf/04-ngov-project-updat…
http://www.ietf.org/proceedings/85/slides/slides-85-videocodec-4.pd…
http://downloads.webmproject.org/ngov2012/pdf/02-ngov-product-requi…
What part of your definition of open does this fail to qualify for?
Free from patents.
Until there is a worldwide change to patent law, open source software can no longer live up to the dream it once held if its developers can get sued at any given moment.
This is an advantage of commercial software over open source.
Right, because Microsoft, Apple, and others never get sued over patents on closed source software.
Wait.
He didn’t say that.
More pointing out that this will continue as long as these patents are around, and that it doesn't happen nearly as often with commercial software.
That is completely the opposite of reality. Commercial software ends up in patent litigation FAR more often than open source software. Why on earth would you think it is the other way around???
…and even if that were not true… The issue here is not that the software is open – it is that it is being backed by a commercial entity with a bank roll and an agenda. Community-backed OSS (the far more common scenario) is virtually immune to patent litigation. There is no one with money to get a judgement from, and even if there were, try getting a court to calculate damages when the infringing product is free and there are no profits to go after…
Sorry, but the patent problem is ignored by most of the OSS world – it is commercial entities that are suffering with this problem.
Anything like statistics on this?
What happens if it isn't backed by a big commercial entity? What happens then?
Anyway, I live in sensible countries where there aren't software patents; there is copyright, which is all that is really needed.
Edited 2013-03-26 15:55 UTC
No, but do I need any? Isn't this obvious when you think about it? Name a patent lawsuit filed against a non-commercial entity…
I'm not saying OSS is never involved with patent litigation… What I am saying is unless there is a commercial entity to sue (whether it be a significant contributor to the project or even just an end user) then no one files suit. Google gets sued. Autozone gets sued. DaimlerChrysler gets sued. OSS projects rarely if ever get sued.
Even in this case with Google… On2 could have been sued at any time over the course of the last 20 years for their various codecs leading up to VP8. The reason there were no lawsuits wasn’t that they didn’t infringe, it was because there was no money in it (they were small potatoes).
I'm not saying they did infringe either – I'm saying it really doesn't matter much one way or the other unless you have enough money to be worth shaking down.
Usually nothing. That is my point.
If only that were true in the US…
If it is not backed by a big name there are plenty of third parties that will help you defend it, such as the FSF.
Probably only if I sign over my licensing to the FSF. No thanks, I don't want Stallman and his cronies having control of the licensing.
galvanash: save your breath. Everybody can see that you are 100% right and stating the obvious. Unfortunately, there’s no use arguing against stupidity, which I may say, is fairly pervasive in this thread…
If you want to argue, argue against my points.
They get sued over what is seen in the UI, not the implementation code.
Edited 2013-03-26 14:24 UTC
Well, aside from making very little sense by mixing two orthogonal properties in a kind of comparison (one being about productising, the other being about code licensing), I'd rather say it's the other way around.
A developer commercializing a patented technology is more likely to be sued by patent holders than one not doing so. Whether the software in question is open source or not doesn't change much.
Actually, people get sued once they are popular enough to be noticed. Regardless of whether it is open or proprietary.
What I meant is that with commercial software you don't get to see how something got implemented; as such, it is hard to come up with arguments to sue a given company.
Take the Oracle vs. Google case over Java: Oracle got Android's source code and used it as the starting point for their case.
Of course, even commercial software is not free of being sued based on the L&F of a given application or using unlicensed APIs or protocols.
With open source, patent trolls can go through the code trying to find things they can leverage patents against. SCO's case, for example.
They just need to raise doubt that the software might infringe the patents. If the community cannot support a court case, they win.
Not being able to see how something is implemented strikes me as a result of software sources being unavailable, not a result of the software being commercial.
In any case, patents like those for codecs are applied to all possible implementations, so it really doesn’t matter how someone implements the codec.
You don’t need to source of software to see that it is capable of encoding or decoding in a certain format and it makes no difference either if the software was paid for or distributed free of cost.
No software is, commercial or non-commercial, open source or closed source.
Even if we assume that this is so, how does it make commercial software less targetable? This seems to be about reading the source, not having paid for the product.
Or do you assume that the EULA would contain clauses that forbid using anything found in the sources?
Sure, but some communities, e.g. IBM, Microsoft, Apple, Oracle, Google and so on, are likely to be able to do so.
As we have seen in the SCO vs. IBM case, they were and SCO lost. Everything.
Yeah, like Opus. The best technical solution, open, but patent-encumbered, as Qualcomm and Huawei so rightly claimed.
Oh, wait.
You do realize that being open does not mean free of patents, right? There is no open source license that I know of that requires you not to patent anything you discovered.
No, but there are ones that require you to license any patents you have in the code on a royalty free basis to anyone who receives the code under the terms of the license. The GPL for example (Section 11 specifically). The net effect is therefore the same.
Perhaps because H265 patented obvious techniques, and since it's an evolution of H264, it doesn't require changing everything from an implementer's POV? Perhaps also the MPEG LA 'mafia', which 'forces' industry players that have some part in the game in order to get a cut of the royalties back?
Hard to say, because obviously everything is done to hold back technical progress and freedom with a proprietary racket. If only it were at least free – but no!
Kochise
Edited 2013-03-26 10:35 UTC
So, h.265, a codec developed and owned by major companies who are willing to earn royalties from it and lock competition out, is likely to become widespread? What a surprise!
And how lucky we are, us end-users! Corporations know what is best for us! Let us prostrate ourselves in deference! Please milk me.
VP8 is not inferior in performance to h.264 except for just one factor: encoding speed. In every other respect VP8 can match or exceed h.264 performance.
VP8 is actually less computationally expensive to decode than h.264. This is because VP8 puts much of the “hard work” into the encoder process rather than the decoder.
If by “power savings”, you actually meant a hardware decoder versus a software one, be advised that VP8 is a part of the Android Multimedia Supported Formats:
http://developer.android.com/guide/appendix/media-formats.html
Here is a list of ARM SoCs (which are used in mobile phones and tablets) from different manufacturers showing which support VP8 and which do not:
http://wiki.webmproject.org/hardware/arm-socs
Unless you buy Apple gear, your (recent) mobile device is more likely than not to support VP8 decode in hardware. VP8 is getting quite prevalent in terms of adoption; just about every current Android device on the market supports VP8 decode in hardware.
Since VP8 is easier to decode than h.264, and since it now has hardware decoding in mobile SoCs, then VP8 is actually likely to out-perform h.264 in terms of power savings.
“Open” means royalty-free, anyone may implement it. That is most certainly VP9 and not h.265.
Here is an alpha-version implementation of VP9 you may wish to investigate:
http://news.cnet.com/8301-1023_3-57561111-93/googles-new-vp9-video-…
[quote]
VP8 is not inferior in performance to h.264 except for just one factor: encoding speed. In every other respect VP8 can match or exceed h.264 performance.
[/quote]
What? Didn’t you claim this months ago and couldn’t provide any evidence for it?
http://www.osnews.com/thread?542644
The conclusion from most comparisons is that x264 outperforms VP8 encoders in quality/bit. So VP8 is inferior in two ways: encoding speed and quality. Maybe the quality is acceptable, but it’s still inferior to what a good H.264 encoder can provide.
I’d be happy to see some direct comparison that shows otherwise, but until then maybe you can stop repeating this unfounded claim?
Ouch! Nicely played!
Your recollection is utterly inaccurate. The conclusion was that, at web-video resolutions and bitrates, if you are prepared to put up with a slower encoding speed, then for a given number of bits you absolutely can make at least as good if not better encoded video with webm as you can with h.264.
You have to go to extreme high resolution/bitrate/quality profiles before there is any noticeable advantage (other than encoding time) for h.264 that webm cannot match reasonably closely. Such videos simply aren’t used over the web.
Edited 2013-03-27 05:22 UTC
Then see for example page 27 (the same page I pointed out in the old thread), where x264 consistently outperforms VP8 at ANY BITRATE. How can you still say that VP8 is better than x264? Do you have evidence?
VP8 is at least free. For most people youtube quality is enough; they are not expecting Full HD. The quality difference is not as abysmal as you say.
Kochise
Agreed. As you say, not only is it not that much of a quality difference (at quality levels as used on the web), but also one can match that quality per bit using VP8 by choosing a higher profile for VP8. There will be a penalty to pay in terms of encoding time, but as long as one is prepared to pay that encoding-time penalty, then with VP8 one can still achieve the same performance as h.264 in all other parameters (for the video quality levels that are commonly used over the web).
Remember WORM discs? Write Once, Read Many. The same could be applied to VP8 and any other codec (even MP3): Encode Once, Replay Many.
So for me, focusing on encoding time is bullshit. Except for video conferencing, which needs real-time encoding.
Kochise
I didn’t say it was abysmal, did you see that somewhere?
I just said x264 has shown to provide better quality.
All one has to do is go to a higher profile (at the cost of encoding time). You would argue that one can go to a higher profile also for h.264, and that is true, but one can go to a higher profile again for VP8, and so the two chase one another until we reach such a high profile that it is beyond what is used on the web.
As I said, for any h.264 video such as might be actually used on the web, for a given number of bits, it is possible to get a VP8 video to match the h.264 video as long as one is prepared to take a longer time to encode the VP8 file (normally done by having to use a higher profile).
This does not say that VP8 is better than h.264, it merely says that one can get the equivalent quality per bit (as far as any blind test can tell, within the range of quality as is used over the web), provided one is prepared to take the hit in encoding time.
Since videos over the web are commonly decoded & rendered thousands, if not millions, of times more often than they are encoded, if there is one area where a compromise has to be made, the encoding time is the best area to choose.
Edited 2013-03-27 07:36 UTC
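The decode-versus-encode trade-off above is easy to put rough numbers on. Here is a back-of-envelope sketch in Python; all the timings are hypothetical, purely to illustrate the amortisation argument, not measured from any real encoder:

```python
def break_even_views(extra_encode_seconds, decode_saving_seconds):
    """Playbacks needed before a one-off extra encoding cost is repaid
    by a small per-playback decoding saving."""
    return extra_encode_seconds / decode_saving_seconds

# Hypothetical numbers: suppose the slower encoder costs an extra
# 10 minutes per video, while the lighter decoder saves 0.5 s of
# CPU time on every playback.
print(break_even_views(10 * 60, 0.5))  # 1200.0 playbacks
```

A video watched a million times would repay that one-off encoding cost hundreds of times over, which is exactly why the encoding side is the sensible place to compromise.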
False, the high-quality presets were used in both cases. That test represents the upper limit, that’s why I used it. There is not an infinite ladder of profiles, there are just a handful.
Seriously, can you stop now? You haven’t presented any hard evidence, so there’s nothing to discuss until you do.
An up-to-date test done by a member of the IETF mailing list:
http://www.ietf.org/mail-archive/web/rtcweb/current/msg06787.html
http://downloads.webmproject.org/ietf_tests/vp8_vs_h264_quality.htm…
http://downloads.webmproject.org/ietf_tests/vp8_vs_h264_speed.html
Edited 2013-03-27 08:35 UTC
Great! Thanks for this!
It looks like VP8 can indeed outperform x264 on the baseline profile, which itself is a nice result, and perhaps necessary for their intended usage (realtime video conferencing).
Well, note that this is a PSNR test. A nice theoretical test, objective and quantifiable; but as any audio and video specialist will tell you, there are many other ways to trick human vision into perceiving better apparent quality, even with a worse PSNR: psychovisual optimizations. Hard to test, as you need a good hundred test subjects to rate samples in blind tests and to run an ANOVA on their subjective opinions.
Browse http://x264dev.multimedia.cx/ and Xiph’s Monty’s blog for a good description of the trickiness of defining what is “good” compression.
Edit: here’s what such a test looks like for an audio codec:
http://people.xiph.org/~greg/opus/ha2011/
http://x264dev.multimedia.cx/archives/458
Edited 2013-03-27 10:34 UTC
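For the curious, the statistic behind such subjective tests – a one-way ANOVA over blind-test opinion scores – fits in a few lines of pure Python. All the scores below are made up for illustration; a real test uses on the order of a hundred subjects and calibrated samples:

```python
def one_way_anova(*groups):
    """F statistic for a one-way ANOVA: ratio of between-group variance
    to within-group variance. A large F suggests the codecs really were
    rated differently; F near zero suggests no detectable difference."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: variation of group means around the grand mean
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: variation of ratings around their own group mean
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Made-up 1-5 opinion scores from six viewers per codec (purely illustrative)
vp8_scores = [4, 3, 4, 5, 3, 4]
h264_scores = [4, 4, 3, 4, 5, 3]
print(one_way_anova(vp8_scores, h264_scores))
```

With these two made-up score lists the group means happen to coincide, so F comes out essentially zero; clearly separated ratings such as [1, 2, 3] versus [5, 6, 7] give F = 24.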
Yeah, I noticed it was PSNR as well. I had heard (although I am not directly knowledgeable about it) that if one optimizes strictly for PSNR, the result can favour some "blurriness" in the output image. I thought SSIM had been proposed to compensate for that deficiency(?).
In any case, I think we generally conclude that we have at least two options (x264 and the VP8 encoders) which have both acceptable output.
Play time over, back to work for me.
Actually, I believe it is the reverse. VP8 tends to blur areas of “high motion”, whereas h.264 tends to add small “artefacts” (features which were not present in the uncompressed data). AFAIK, PSNR just objectively compares the still frame(s) of the uncompressed image with the image after compression, and so larger “areas of blur” are “marked harshly” under PSNR compared with smaller “artefacts”.
The thing is that when looking at a video played at normal speed, the human eye tends to see areas of high motion as blur anyway. So subjectively the as-rendered VP8-compressed video can look better to people than the as-rendered h.264-compressed video, even though the latter can have a slightly better rating as measured by PSNR.
I think SSIM may well have been proposed to compensate for the deficiency of PSNR in penalising areas of blur where there was high motion. If that is the case, then SSIM would give a better rating to VP8 than it gets under PSNR. Giving comparison figures in PSNR actually under-rates VP8, AFAIK.
PS: On investigation, I think PEVQ (not SSIM) might be the comparison method that attempts to take account of the way that people actually perceive the playing video stream.
http://en.wikipedia.org/wiki/PEVQ
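To make the PSNR discussion concrete, here is a minimal pure-Python sketch (toy 8-value "frames", not real video; all pixel values are made up for illustration). It shows why PSNR can be blind to exactly the kind of difference discussed above: error spread out as a mild blur and the same total error concentrated in one sharp artefact receive identical scores.

```python
import math

def psnr(original, compressed, max_value=255):
    """Peak signal-to-noise ratio in decibels between two equal-length
    sequences of pixel values. Higher is better; identical inputs
    give infinity."""
    mse = sum((a - b) ** 2 for a, b in zip(original, compressed)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_value ** 2 / mse)

reference = [100] * 8                                 # flat toy "frame"
blurred = [102, 102, 102, 102, 100, 100, 100, 100]    # error spread over 4 pixels
artefact = [104, 100, 100, 100, 100, 100, 100, 100]   # same total squared error in 1 pixel

# Both distortions have MSE = 2, so PSNR cannot tell them apart,
# even though viewers may judge them very differently.
print(round(psnr(reference, blurred), 2))   # 45.12 dB
print(round(psnr(reference, artefact), 2))  # 45.12 dB, same value
```

This is the gap that perception-oriented measures like SSIM and PEVQ try to close: they weight *where* and *how* the error appears, not just its mean square.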
Agreed. They both have acceptable output. One of them has far more acceptable terms of use, especially for a web standard.
Edited 2013-03-28 06:00 UTC
I think all quantitative measures share the same flaw, that they do not necessarily relate to real perceived quality, although SSIM was designed to relate better to how people perceive image quality (I don’t know if that’s true in practice).
True enough. I believe (I’m not sure since I can’t find any results to back it up) that VP8 actually wins out in testing which attempts to score real perceived quality.
http://en.wikipedia.org/wiki/PEVQ
I don’t even know if anyone has actually done such testing for VP8 vs H.264. It is important also to note when any comparison was done, because VP8 has had five releases (named Anthill, Cloudberry, Duclair, Evergreen and Foxtail) since the original announcement, and each new release has seen appreciable improvement.
Sigh! Most video on the web doesn’t use high quality presets, but rather a low quality or mid quality preset. Because they aren’t the same codec, the presets are not directly comparable. Sometimes to get comparable quality for VP8 you have to use a higher profile. Sometimes VP8 is better than h.264 at the “same” profile. The curves don’t follow the same slope, they cross over.
As poster Radio pointed out, in the recent tests done by the IETF, the quality curves for VP8 and h.264 cross over each other at the extremes, and are all-but-identical in the middle:
http://downloads.webmproject.org/ietf_tests/vp8_vs_h264_quality.htm…
If you want to get the same quality for WebM as you get for h.264, then in some cases (but only some) you have to go to a higher preset, and it will cost you in encoding time.
http://downloads.webmproject.org/ietf_tests/vp8_vs_h264_speed.html
Seriously, can you stop now? Why are you trying to pretend h.264 is better than VP8 when there is effectively nothing in it (apart from VP8 longer encoding time)?
Who is paying you?
Edited 2013-03-27 09:42 UTC
Do you have proof of this?
I’m well aware there are different presets; to help mitigate this, just look at the highest quality presets. It eliminates a variable from the experiment (i.e., just use the max).
Is this your favourite ad hominem attack? To suggest that anyone with an opinion counter to yours is a shill?
LOL. You were the one attacking, insisting I was wrong and should shut up.
Now you are hunting around for somewhere else to attack after the evidence contrary to your bias is staring you in the face.
I can’t think of a reason why you would spread misinformation to support the commercial, big-business, interests if you aren’t being paid to shill for them. Do you just like lying for the 1% or something?
Saying someone is wrong is not an attack; and I didn’t tell you to shut up, I said that you should stop discussing until you can base it on data, because it’s not a useful discussion without data.
I accepted the contrary evidence supplied by Radio, contrary data can exist at the same time, because if you read the details of the IETF data, it’s for WebRTC, and uses the baseline profile. This is a legitimate use-case, I said that already. Where do you find bias?
I find bias in your lack of belief that VP8 was competitive with h.264, when by the very measures you use to “support” your bias, VP8 actually performs better than h.264 in about a third of cases, the reverse is true for about a third of cases, and there is very little difference between the two in the remaining third of cases.
The actual truth is that these two codecs by objective measurements (e.g. PSNR, SSIM) have very little difference. Most non-biased observers would call it a dead heat.
http://en.wikipedia.org/wiki/PEVQ
By subjective measurements I believe that VP8 is the (slightly) preferred codec, but most people cannot really tell the difference.
How is it not bias to rant and rave when someone questions your dubious (and provably false) claim that h.264 was better by every performance measure?
It is trivially easy to show that some people would judge some cases of VP8 video as better than h.264 even though the VP8 video filesize was smaller. It is trivially easy to show that slow-motion or still frames from a VP8 video are marginally sharper than those from an h.264 video of the same filesize, resolution and bitrate. This better basic (still) compression is, I believe, the inspiration for WebP.
http://en.wikipedia.org/wiki/WebP
Edited 2013-03-28 07:53 UTC
Wrong, I didn’t say it wasn’t competitive, I actually said that BOTH were acceptable in a sibling thread!
What I said was that x264 tends to produce better results on objective metrics, and supplied some data to show this for a given profile setting.
How is this worse than what you did: claiming VP8 was better with zero evidence? So it’s bias in the case where I provide data, but not bias where you provide none?
Define “very little”? Some of these metrics are based on a logarithmic scale, so it’s not straightforward to relate the differences.
http://www2.tkn.tu-berlin.de/research/evalvid/EvalVid/vp8_versus_x2…
Here is a comparison on the metric you wanted, x264 “wins” at least 85% of the time (lower is better).
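To illustrate the logarithmic point: because PSNR is expressed in decibels, a fixed dB gap corresponds to a multiplicative difference in mean squared error. A small sketch (my own illustration, not taken from the linked data):

```python
def mse_ratio_for_db_gap(db_gap):
    # PSNR = 10 * log10(MAX^2 / MSE), so a gap of d dB between two
    # encodes means the lower-scoring one carries 10**(d/10) times
    # the mean squared error of the higher-scoring one.
    return 10 ** (db_gap / 10.0)

# A 0.5 dB deficit is ~12% more mean squared error;
# a 3 dB deficit is roughly double.
```

So whether a sub-decibel gap between x264 and VP8 is “very little” depends on where on the curve you are, which is exactly why raw dB differences are hard to interpret.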
You’re free to believe that.
I didn’t claim it was better by every performance measure, I just presented evidence that showed x264 was better (for the defined experiment). I don’t think there was any ranting or raving involved.
No, it is bias in the case where you say you did not claim h.264 was better when you did, and also in saying that I claimed VP8 was better when I did not. I said multiple times that one could get VP8 to perform as well as h.264 … claiming “as well as” is not claiming “better”.
“Very little difference” would be the words of some people who have performed tests, and of others involved in blind testing of subjective quality.
Using codec: Google: VP8 0.9.0-13-g6be1d93 from WebM
http://code.google.com/p/webm/downloads/detail?name=libvpx-0.9.0.zi…
Version 0.9.0 was current on May 18, 2010. That would have been the original version. Since then there have been the following releases:
Thursday, October 28, 2010: VP8 Codec SDK “Aylesbury” Release
http://blog.webmproject.org/2010/10/vp8-codec-sdk-aylesbury-release…
Tuesday, March 8, 2011: VP8 Codec SDK “Bali” Released
http://blog.webmproject.org/2011/03/vp8-codec-sdk-bali-released.htm…
Thursday, August 4, 2011: VP8 Codec SDK “Cayuga” Released
http://blog.webmproject.org/2011/08/vp8-codec-sdk-cayuga-released.h…
Friday, January 27, 2012: VP8 Codec SDK “Duclair” Released
http://blog.webmproject.org/2012/01/vp8-codec-sdk-duclair-released….
Friday, May 11, 2012: VP8 Codec SDK “Eider” Released
http://blog.webmproject.org/2012/05/vp8-codec-sdk-eider-released.ht…
Each release has brought improvements over the previous version. We are now quite a way ahead of the original release, and the improvements since have been substantial.
Of course I am. I have good reason to believe it too, since the original VP8 release was thought to be better (subjectively) than h.264 in 15% of cases, and it has been substantially improved five times since then.
I think we are simply going to have to differ there.
Edited 2013-03-28 09:41 UTC
This whole thing started because you stated:
Then I showed some relevant data where x264 outperformed VP8.
I’ll state why I feel x264 has a higher maximum quality/bit than VP8 (Google’s encoder)
– the comparison of x264-baseline and VP8 for WebRTC uses the baseline profile, which was written about 10 years ago. They also didn’t ask x264 to optimize for PSNR (what they measured), slanting the results away from x264. The test was done by a Googler (apparently) and the methodology was ripped apart on the x264 mailing list http://mailman.videolan.org/pipermail/x264-devel/2013-March/009913….
– the study done at MSU, which puts x264 above VP8 on SSIM, at least on the high profile setting (I think it does as well or better on the others too; I would have to check).
– the study from TUB which also puts x264 above VP8 on VQM, although it is from the initial public release of VP8 (which still had some years of development within On2).
– using recent versions of both encoders, this screenshot comparison https://gist.github.com/Hupotronic/4645784 – single screenshots are a limited basis for comparison, but c’est la vie.
– another comparison on various video clips measuring PSNR and SSIM http://blog.existentialize.com/tag/vp8.html
– the opinion of an expert on the topic, the author behind x264 and also a vp8 encoder
The above statement is true.
If you encode two videos to a certain filesize & resolution with “standard” or “nominal” options, then in about one third of cases the h.264 video will be better quality than VP8 (Eider), in a third of cases it will be the other way around (VP8 quality will exceed h.264), and in about a third of cases the video quality will be essentially the same. H.264 does better at the higher-bitrate end of the quality spectrum.
However, you can make up the quality difference in that one third of cases where h.264 is better by opting for a higher profile when encoding VP8.
Normally the VP8 video will take longer to encode, and in the cases where you have to use a higher-than-standard profile it will take even longer to encode the VP8 video.
Nevertheless, it is possible to do it. One can match the quality. Note the operative word can.
None of this says that VP8 is the better codec, it merely says that with some extra effort it is possible to match h.264 in those cases where h.264 ordinarily produces a better outcome.
What is wrong with any of that? Don’t you speak ordinary English?
Edited 2013-03-28 14:18 UTC
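The “extra effort” argument above can be made concrete with a sketch that builds two ffmpeg command lines: both encoders pinned to the same target bitrate, with VP8 given its slowest, highest-effort settings to close the quality gap. The input/output filenames and the 1M bitrate are made up for illustration, and the flag names follow ffmpeg’s libvpx/libx264 wrappers, which can vary between builds:

```python
# Hypothetical matched-bitrate comparison; filenames and bitrate are
# placeholders, not from the thread.
TARGET_BITRATE = "1M"

# VP8 at maximum effort: "-deadline best -cpu-used 0" is libvpx's
# slowest mode, which is where the encoding-time penalty comes from.
vp8_cmd = ["ffmpeg", "-i", "in.y4m",
           "-c:v", "libvpx", "-b:v", TARGET_BITRATE,
           "-deadline", "best", "-cpu-used", "0",
           "out_vp8.webm"]

# x264 at "standard"/nominal settings, per the argument above.
h264_cmd = ["ffmpeg", "-i", "in.y4m",
            "-c:v", "libx264", "-b:v", TARGET_BITRATE,
            "-preset", "medium",
            "out_h264.mp4"]
```

Run both (e.g. via `subprocess.run`) and compare the outputs at the same filesize; the VP8 run will simply take longer, which is the trade-off being described.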
I’m not sure where you’re getting your 1/3 number, can you provide data for that?
Ok, you pick the highest profile from each of VP8 and x264 and you find x264 gives you better quality: how do you now make up for the quality difference?
Edited 2013-03-28 17:12 UTC
Several times I said that this applied “to the quality of video as used over the web”. This is more or less “default profile”, as very little web video is encoded at high profiles. When you are looking at a video of a newscaster on some web news page, or looking at standard resolutions on youtube, you are simply not looking at the highest profile.
When you are constrained to use only one profile for each codec (even though profiles are not equivalent), then one can always find some cases where VP8 cannot match h.264. Then again, for another profile and another use case, the reverse can also apply … one cannot get h.264 to match VP8 in some profiles.
So??
Well, which one is it now, only the profile “used on the web”, or “you can always pick a higher profile” ?
How do you know what’s used on the web?
Youtube recommends H264 high profile to be used, and I would assume they have decent recommendations:
http://support.google.com/youtube/bin/answer.py?hl=en&answer=172217…
My 2-year-old phone can decode H.264 high profile; I don’t think it’s unreasonable to use it as the representation of highest quality – that’s what it is there for, after all.
I don’t know – a strong majority of YT videos are mostly ignored… only very few reach “thousands, if not millions” of views.
This is rubbish.
http://en.wikipedia.org/wiki/VP8
over twice as much data! So obviously it is not as good.
Edited 2013-03-26 13:30 UTC
May 2011 – a benchmark two years old.
I’m not saying anything about whether either of you is right or wrong, but we’ll need more recent benchmarks than that, especially considering how fast VP8/9 develop.
Fair enough, did a bit more googling and the file sizes differences are negligible now.
However performance wise …
https://gist.github.com/Hupotronic/4645784
That test compares “best vs best”.
The main issue is that for h.264, the mandatory-to-implement variant in WebRTC would be “baseline”. Higher h.264 profiles are incompatible with baseline decoders (though you might not find baseline-only decoders in the field anymore, once people have paid royalties).
The “higher” VP8 settings are more computationally expensive on the encoder side, but produce a bitstream that remains compatible with all VP8 decoders (there are no decoder profiles).
If universal compatibility is a goal (as it should be on the web), then h.264 baseline encoding should be used.
I won’t argue that you are completely wrong – in best-case-scenario tests for VP8 and h.264 I find both codecs are within spitting distance of each other in most measurable metrics, but VP8 loses more battles than it wins.
However, your example is ridiculous. Videoconferencing??? You pick a scenario that h.264 was specifically designed for (low resolution, extremely low bitrate, realtime encoding) and because it is better at that you say VP8 is “obviously not as good”.
Sorry, but there is nothing obvious about that. You cannot pick one edge comparison and make such a broad generalization.
Again, I am not saying VP8 is better than h.264. I will say that for resolutions and bitrates routinely used for web based video distribution (720p and 480p, 500-1200kpbs) it is definitely close enough in most measurable metrics that most people would not notice the difference.
Besides, frankly I think arguments on the technical merits of VP8 are wasted breath (for or against). No one uses VP8 because it is technically superior – they use it because it is open and royalty-free. The fact that it is actually comparable to h.264 when used for its target use case (web video) is just icing on the cake.
Edited 2013-03-26 14:22 UTC
Firstly, seeing that Google Talk is probably using it, it is pretty important that it uses twice as much bandwidth.
In any case, okay, how about this one?
http://www.osnews.com/thread?556675
Except it doesn’t – look at the above link.
Well that might not be the case, we don’t know yet.
Except it isn’t.
Google Talk uses h.264…
https://developers.google.com/talk/call_signaling
So I don’t think your point here has any merit at all.
Did you read what I said…? That comparison is purely about subjective quality:
and the target encode is:
No one streams 1080p50 on the internet at 13,600 (!!!) kbps – that is better than most commercial Blu-rays are mastered at. All I said is that within the parameters I specified it is close enough to h.264 in quality that most people wouldn’t notice. I never said (and do not think) it is technically better, and it is definitely slower. But for the common use case it is targeted at, it is pretty damn good.
Clue: the reason you won’t find many good comparisons between WebM and h.264 at commonly used resolutions and bitrates is that the results are very boring and neither side of the argument can use them as ammo…
Edited 2013-03-26 16:50 UTC
Actually it proves then that VP8 is still a bit crap for video conferencing.
H.264 is better, whichever way you shake it, whether it is perceivable or not. Most of the web is already using it.
Edited 2013-03-26 18:17 UTC
All it proves is that Google, like everyone else that does video conferencing, is using h.263/h.264/SVC – because that is the industry standard for video conferencing… Not the “de facto” standard, the ITU standard… At least at this point in time.
But there is no standard for web video. And the reason there is no standard is because of h.264 patent holders killing it in committee. We would all be using Theora now as a baseline if that were not the case.
The fact that most of the web is using it doesn’t do any good for people that need a royalty free codec. I personally could not care less about what everyone else is using – I care about what everyone can use – and not everyone can use h.264.
You can make all the arguments you want about how h.264 is better. I personally think Theora (as bad as it is) is better than h.264 – because I don’t have to pay anyone to use it…
I don’t care about what everyone can use, because there is always someone who will only move on from a rotary telephone because they have to.
You cannot support everyone well, you can only support the majority well.
Also, where is this royalty thing coming from?
There is no charge for streaming. Only for people that make an encoder or a decoder.
WHERE IS THIS ROYALTY FOR WEB CONTENT COMING FROM? There is no charge from the MPEG-LA for streaming broadcasts.
It keeps on being brought up but it isn’t relevant to web content, which is what web standards relates to.
Edited 2013-03-26 20:55 UTC
That is a patently stupid analogy in this discussion. You are equating h.264 with technological progress, as if users who don’t want to use it are trying to stay in the stone age. No one wants to use VP8 or Theora because they are Luddites – they want to use them because they are licensed liberally, and h.264 is not.
Why the hell not? You saying that Microsoft and Apple are incapable of including a liberally licensed video codec? They are perfectly capable of doing so – they choose not to…
There is no charge for streaming free content… If I want to sell my video (I don’t, but if I did) why should I pay MPEGLA for the privilege of doing so? They had nothing to do with creating it, and I don’t want them to have anything to do with me distributing it…
The streaming exemption also doesn’t apply to downloadable videos (free or not)… So if you supply a download link you have to pay for that (at some point)… Also, and I can’t stress this point enough, it isn’t only about streaming. You have to pay for licensing to include h.264 encode/decode in software. You technically can’t even use x264 freely (the most popular encoder on the planet) without purchasing a patent license from MPEGLA.
Do you really understand how the licensing works?
http://www.mpegla.com/main/programs/avc/Documents/AVC_TermsSummary….
ps. x264 does offer commercial licensing now. This is an excerpt from their terms of use:
Sure, the licensing only really costs money at high volumes… 100,000 units may seem like a whole lot of breathing room, until you create something popular and hit the limit. In my world 100,000 ain’t all that much…
Wouldn’t you rather have something that doesn’t have such strings attached?
Edited 2013-03-26 23:11 UTC
Way to misunderstand. The whole point is that h.264 is available to most people, and those who avoid it for bullshit reasons will only use it when they really need to.
Sorry, those who are using Haiku, Icaros and OpenBSD aren’t really worth supporting for most web developers.
Anyway, VP8 is technically worse than h.264, so why are you bringing up Luddites?
If I said I should support IE6 users as well I am sure your tone would change. This is what you are asking me to do outside of users of Windows, OSX, iOS and Android.
It is bullshit, and unrealistic. If WebM were dominant over Flash/h.264 then I would agree. It isn’t.
I argue with what is the current reality not idealism.
Quite easy: don’t use a codec that infringes on their patents. Oh, there probably isn’t any.
Only after 100,000 views and over 12 minutes long. Sorry, if you are able to provide that sort of bandwidth, your $2,500 is not spent on the f–king MPEG-LA license.
FFS. This is ridiculous.
YOU DO NOT UNDERSTAND THE TERMS OF USE IT SEEMS:
If said website is getting more than half a million views, it isn’t going to be independent except in extraordinary circumstances, and it is going to be hosted via YouTube anyway.
We are getting into the realms of silliness.
I don’t care because it doesn’t affect me or the work that I am doing. It doesn’t affect most of the end-users I support. It’s called pragmatism… look it the f–k up and tell me why I am wrong for having this point of view.
I make my living out of delivering the right solution, not what OSNEWS commentators think I should.
Also, why shouldn’t the MPEG-LA be reimbursed for a codec that is used so much and works so well?
Anyway I made my points clear. I am out of this conversation.
Edited 2013-03-26 23:46 UTC
Windows is available to most people and those who don’t want to run it out of bullshit reasons will only do so when they really need to.
That sound fair to you? Same thing.
Again with the implication that VP8 is old and/or broken… Why are you equating supporting an 11-year-old deprecated piece of software with supporting a well maintained and actively supported video codec? It’s not at all the same thing and you know it.
Besides… Who is asking YOU to do anything at all? I don’t care what video codec you use – use whatever you want. I just want(ed) the standard for internet video to be universally accessible, and that means NO ROYALTIES. Didn’t get my way unfortunately, so we have the cluster f*ck that is the video tag now.
You don’t have to like VP8 and you don’t have to use VP8, all I want is for you to understand why some people (like me) want it (or something equally as liberal) to exist – it opens up all kinds of opportunities for innovation that are simply not possible when you have the MPEGLA tax collector looming on the other side of success…
Why is it unrealistic to ask all browsers to incorporate a baseline codec? I totally don’t get this argument. It is certainly not because of patent concerns – the only holdouts hold the patents everyone is concerned about. What are Microsoft and Apple going to do? Sue themselves?
If VP8 infringes on someone’s patent and they are willing to go to court over it, we will all see how that turns out. Right now it is a moot point. I’d rather hold out hope that it will survive patent litigation than simply give up and use h.264.
I’m sorry, but I refuse to use a product with a licensing regime modeled after street-corner drug dealing…
Your understanding of the math for this is seriously f*cked up.
Streaming 2.5 TB of data (25 MB, roughly the size of a 12-minute video, times 100,000) over a period of a month costs about $300 using Amazon S3… That is $3,600 a year.
So month one is free for your MPEGLA license. In month 2 you would have to pay $2,000 US to MPEGLA (2 cents per stream). That ends up being $22,000 for the entire year.
bandwidth is BY FAR cheaper than MPEGLA licensing…
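The arithmetic above can be checked in a few lines. All the figures (25 MB per view, about $300/month in bandwidth, a 2-cents-per-stream royalty with the first 100,000 streams free) are the assumptions made in this thread, simplified for illustration; they are not actual MPEG-LA terms:

```python
# Re-running the thread's numbers: 100,000 streams of a ~25 MB video
# per month, bandwidth at ~$300/month, and an assumed 2c/stream
# royalty with a one-time free allowance of 100,000 streams.
streams_per_month = 100_000
bandwidth_per_month = 300          # USD, ~2.5 TB at S3-era pricing
royalty_per_stream = 0.02          # USD, assumed rate
free_streams = 100_000             # assumed one-time free allowance

bandwidth_per_year = 12 * bandwidth_per_month
billed_streams = max(0, 12 * streams_per_month - free_streams)
royalty_per_year = billed_streams * royalty_per_stream

print(bandwidth_per_year, royalty_per_year)  # 3600 22000.0
```

Under these assumptions the yearly royalty ($22,000) is roughly six times the yearly bandwidth bill ($3,600), which is the factor-of-several gap being argued about.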
Yes it is.
Didn’t I already concede streaming free videos is free? Why are you bringing it up again? I just don’t understand why otherwise intelligent people are so willing to accept this street-corner-crack-1st-hit-is-free licensing model like it is the greatest thing ever…
You are seriously sheltered or something. 500,000 views is nothing…
I’m pragmatic too. I don’t see how VP8’s existence in any way affects you – so why do you care? Its non-existence, on the other hand, affects me greatly, so why are you so hell-bent on arguing against its existence and industry support?
I never said they shouldn’t. I am not opposed to h.264, it is a great codec. Some people just want an option that doesn’t cost money – the web NEEDS an option that doesn’t cost money, and the only reason such an option doesn’t exist universally is because of a protectionism racket which intentionally squashes any attempt to create an alternative…
After this I am out arguing with you.
THERE IS NO CHARGE FOR STREAMING DATA WITH H264. SO THERE IS NO COST UNLESS YOU ARE MAKING THINGS THAT ENCODE OR DECODE. This has nothing to do with the specification.
In my line of work, I don’t care about some fringe use case that doesn’t visit the site I work for.
You just want things to be open because they must be.
And I am out of this.
Edited 2013-03-27 12:51 UTC
THERE.
IS.
AND YOU SAID IT YOURSELF:
And you got the cost completely wrong, BY A FACTOR OF TEN, as galvanash has shown.
There isn’t a cost.
Why don’t you actually read?
http://diveintohtml5.info/video.html#licensing
You are wrong.
Edited 2013-03-27 15:32 UTC
Again… Not everyone wants to only offer streaming, and not everyone wants to give their shit away…
So as you are so fond of doing:
IT IS ONLY FREE IF YOU ONLY OFFER STREAMING AND YOU DO NOT CHARGE FOR YOUR CONTENT!
Which is what I said like 10 posts ago. Twice.
So you are getting your knickers in a twist because they want reimbursement for a technology that they own the rights to, when you are using it to make money?
HOW DARE THEY! /sarcasm.
But I DO NOT WANT to use it! But h.264 has a… monopoly. (And some idiots want to impose it as a standard.)
Eh.
You can do sarcasm, but you are completely oblivious to irony.
I am English; we understand irony and sarcasm very well, thank you.
Google Talk / Gmail uses something somewhat better than your ordinary H.264 (Vidyo is mentioned at one point during the installation of the Gmail video plugin)
http://en.wikipedia.org/wiki/Vidyo
http://en.wikipedia.org/wiki/Scalable_Video_Coding
Edited 2013-03-29 16:50 UTC
Way, way, way out of date.
You can catch up a bit here, if you care to:
http://blog.webmproject.org/
The encoder also happens to be the part of the system that is exposed to patent issues. Luckily, it’s also the part that is the most flexible in dealing with them – all known issues can be worked around (at worst the encoder will be slower) and, even if you inadvertently trip over a patent, you can still control the damage by simply updating the code.
OTOH, it is really difficult to infringe on patents in a specification of the data format, unless the specification mandates such infringement. Some do, but it does not happen by accident – you have to force conforming implementations to use specific patented techniques.
You purposefully misrepresent the situation. Qualcomm SoCs don’t support VP8 in hardware – and Qualcomm is the gorilla in the room, in a great many Android handsets.
Edited 2013-04-01 22:32 UTC
There is no pending Nokia patent suit. An IETF IPR declaration is not a suit – it is a commonplace procedure for patent holders to make the IETF aware of 3rd party patents that apply to pending drafts so that they can be reviewed for suitability.
I would be very surprised if Nokia actually files suit against Google over this. They are simply trying to bar VP8 from being used in IETF standards, partially just to cause Google grief and get some free press, but more than likely to throw everything at the wall and see if anything sticks. They are essentially gaming the system to get free patent reviews – if there is anything to this, we will know in a few months.
The reality is that I doubt MPEGLA knew anything about this – it is probably the other way around… Nokia threw this together as soon as they found out Google caved to MPEGLA, hoping for an easy payoff (or possibly to have some additional leverage in settling their other lawsuits with Google). If any of these patents are found to be applicable that is exactly what will likely happen – no lawsuit required (blood sucking lawyers are very expensive)…
Edited 2013-03-26 15:34 UTC
I also see this primarily as a vehicle for marketing Nokia patents.
It is basically a thinly veiled message towards licensees of MPEG LA video patents (e.g. for H.264), saying “You thought your MPEG LA license covered your use of contemporary video codec technology. You were wrong. Please get in touch with your Nokia contact to, ah, avoid any unfortunate things happening.”
Well, Nokia is in bad shape today (thanks to Elop’s “Burning Platform” strategy), and therefore has good reason to use its patent pool to attack anywhere and grab some protection money whenever it can.
On the other hand, it can be noted that Nokia is now little more than an MS puppet, directly controlled by an MS lieutenant.
Therefore, it must be quite a coincidence that Nokia chooses to attack Google’s VP8 now, but doesn’t do the same to Microsoft assets.
It looks like the list contains six unique patents. With perhaps a couple more in some fringe countries.
Nokia is positioning itself to be purchased.
Oh for goodness sakes Google just buy Nokia already.