We’ve all had a good seven years to figure out why our interconnected devices refused to work properly with the HDMI 2.1 specification. The HDMI Forum announced at CES today that it’s time to start considering new headaches. HDMI 2.2 will require new cables for full compatibility, but it has the same physical connectors. Tiny QR codes are suggested to help with that, however.
The new specification is named HDMI 2.2, but compatible cables will carry an “Ultra96” marker to indicate that they can carry 96Gbps, double the 48Gbps of HDMI 2.1b. The Forum anticipates this will result in higher resolutions and refresh rates and a “next-gen HDMI Fixed Rate Link.” The Forum cited “AR/VR/MR, spatial reality, and light field displays” as benefiting from increased bandwidth, along with medical imaging and machine vision.
↫ Kevin Purdy at Ars Technica
I’m sure this will not pose any problems whatsoever, and that no shady no-name manufacturers will abuse this situation at all. DisplayPort is the better standard and connector anyway.
No, I will not be taking questions.
HDMI, as long as it continues to stay a closed expensive licensable spec, needs to die.
DP doesn’t have this problem.
Sadly, HDMI won. People don’t care that it is “closed expensive licensable”; all monitors and TVs come with it, cables are cheap, and everybody knows about it.
Out of sheer curiosity, what was the HDMI Forum supposed to do? You cannot have higher data rates without a more expensive cable, either we’re talking about a cable that has more conductors in it (the USB 3.0 way) or a cable that has higher-quality conductors in it (the HDMI way, since the HDMI connector is pretty packed with pins already). The best the HDMI Forum could do was to trademark a name such as “Ultra96” and enforce it as best they can. Which is what they did.
DisplayPort is irrelevant for TVs because it has no ARC/eARC, which is the only way to connect your device to a Dolby Atmos receiver (unless your device has a second dedicated HDMI port just for audio, which only a handful of Blu-ray players have).
For the nitpickers: Yes, ARC can work with lossless Dolby Atmos if your source has the ability to transcode it to lossy Dolby Digital Plus Atmos (aka E-AC-3+JOC); you don’t necessarily need eARC.
A problem to solve instead of investing time and resources in a closed format.
What is the “problem to solve”?
– If the “problem to solve” is the lack of ARC/eARC support in DisplayPort, the VESA dullards could have added it years ago (back in 2009, when HDMI got ARC); they just don’t care about such “consumer concerns”. This makes HDMI a must-have for setups with a Dolby Atmos receiver.
– If the “problem to solve” is ARC/eARC itself, it’s unfortunately a requirement due to HDCP. You see, the HDCP people charge a royalty per port, and ARC/eARC allows source devices to have only one HDMI output (carrying both video and Atmos audio) and lets the TV split the Atmos audio from the video and pass it to the receiver via ARC/eARC. Atmos will never happen over S/PDIF or even USB-C because those don’t do HDCP, and Hollywood requires that Atmos only travel over an HDCP link. Even GPUs don’t have a second dedicated HDMI port for audio, precisely because they want to save on royalties; only a handful of expensive Blu-ray players have one.
Really, I don’t see how the problem can be solved, regardless of the definition of the problem.
HDCP is horse shit that makes everything expensive and incompatible, without ANY benefit to anyone, including the content owners. Let’s not pretend it has any merit or value.
I don’t think kurkosdr was defending HDCP, just pointing out that we all have to deal with it. I’m sure they would agree that it has no merit or value. It certainly doesn’t for us mere consumers!
I agree with Thom that DisplayPort is overall the better standard, and if not for media companies’ desire for control over anything media related, it would be the default connector on consumer devices just as it is on PCs and other IT equipment.
HDCP served its intended purpose, which was to make HDMI inputs unattractive in most devices. No manufacturer wants to advertise an HDMI input on their Blu-ray recorder that will fail to work most of the time. Keep in mind that not everyone is tech-savvy enough to find decryption software such as MakeMKV, and then there are things like cable/satellite boxes where no straightforward decryption method exists. So, a lack of HDMI input in common devices goes a long way towards reducing casual recording and copying. This is what Hollywood wanted and they got it thanks to the DMCA (which outlaws “unofficial” implementations of DRM that don’t comply with Hollywood’s demands), end of story.
DisplayPort ports can have HDCP[1]; it’s just not mandatory in order to use the DisplayPort name and logo, so it’s present on an optional basis (and most TVs and monitors do indeed have it for compatibility reasons). My personal gripe is that there is no special logo for DisplayPort ports that have HDCP, so there is no guaranteed way to know whether a given DisplayPort port has it, but again, most do.
As I’ve said above, the main issue with DisplayPort is the total lack of support for even basic ARC (let alone eARC); it’s a boneheaded omission that gives HDMI a bona fide reason to exist.
[1] https://www.displayport.org/faq/
CaptainN-,
This is the standard refrain for all DRM. People hate it, but publishers keep demanding it and manufacturers keep delivering it. Yes it can interfere with legitimate uses, but it doesn’t matter. Yes, it’s broken so the real pirates can bypass it, but it doesn’t matter. Publishers still lobby collectively to keep everything infected with DRM anyway and they carry a lot of weight in our tech standards. It’s just the way it is and probably the way it will always be.
“Out of sheer curiosity, what was the HDMI Forum supposed to do? You cannot have higher data rates without a more expensive cable, either we’re talking about a cable that has more conductors in it (the USB 3.0 way) or a cable that has higher-quality conductors in it (the HDMI way, since the HDMI connector is pretty packed with pins already). ”
No, that’s obviously not true. The other option would be to develop a better encoding for the data. Think about how upgrading from a 14.4k to a 28.8k modem surprisingly didn’t require you to rewire your phone cord. Of course, if it were easy they would have done it. Changing the cable format is a worse choice for most of the participants, so I assume they didn’t have any easy protocol improvement they could throw at it. Or the tin foil hats are right and the accessory-minded participants won the argument. But I don’t believe in tin foil hats.
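That’s essentially what the modem jump was: same wire, more bits per symbol. A toy sketch of the idea (the symbol rate and modulation schemes below are purely illustrative, not taken from any HDMI or modem spec):

```python
# Toy illustration of "better encoding" raising throughput on the same wire:
# at a fixed symbol rate, packing more bits per symbol multiplies the bit rate.
# Real links pay for this with tighter signal-to-noise requirements, which is
# roughly why the better cable eventually becomes unavoidable anyway.

SYMBOL_RATE_GBAUD = 12.0  # per-lane symbol rate, an arbitrary illustrative figure

def bit_rate_gbps(symbol_rate_gbaud: float, bits_per_symbol: int) -> float:
    """Bit rate = symbol rate x bits carried per symbol."""
    return symbol_rate_gbaud * bits_per_symbol

for name, bits in (("NRZ (2 levels)", 1), ("PAM-4 (4 levels)", 2), ("PAM-8 (8 levels)", 3)):
    print(f"{name:16s}: {bit_rate_gbps(SYMBOL_RATE_GBAUD, bits):5.1f} Gbps per lane")
```

The catch is that every extra signalling level eats into the noise margin, which is where cable quality comes back into the picture.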
All good practical encodings for short-to-medium cables have already been invented. DisplayPort and USB-C have the same problem that HDMI has (you need a better cable for the higher data rates).
The difference is that HDMI has trademarked certifications for cables that can do the higher data rate while DisplayPort and USB-C are a wild west.
Bill Shooter of Bul,
The range of a cable is extremely dependent on signal frequency. You can go higher if the bandwidth isn’t already close to being maxed out, but most modern data cables are already being pushed near the max frequencies they’re engineered for. Look at the chart for “Values of primary parameters for telephone cable”…
https://en.wikipedia.org/wiki/Telegrapher's_equations
Notice that at low frequencies, doubling the frequency has very little impact on resistance. This is why modem speeds could be increased without upgrading wires. But as frequencies keep going up, the resistance climbs much more steeply. If the run is short enough, there isn’t much signal degradation and the signal can still overpower the noise. But as the cable gets longer the signal drops and the noise increases. This is where higher-quality cables are needed.
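If you want to play with the numbers, here’s a rough sketch of how attenuation climbs with frequency, using the propagation constant from the telegrapher’s equations and a crude skin-effect model (the per-km line parameters are made up for illustration, not taken from that chart):

```python
# Rough sketch: attenuation of a transmission line versus frequency, using the
# propagation constant from the telegrapher's equations,
#   gamma = sqrt((R + jwL) * (G + jwC)),  attenuation = Re(gamma),
# with a crude skin-effect model where series resistance grows as sqrt(f).
# All per-km parameter values are invented for illustration, not measured data.
import cmath
import math

R_DC = 50.0      # ohm/km, low-frequency series resistance (illustrative)
L = 0.6e-3       # H/km (illustrative)
C = 50e-9        # F/km (illustrative)
G = 1e-9         # S/km (illustrative)
F_SKIN = 100e3   # Hz; above this, skin effect starts to dominate (illustrative)

def attenuation_db_per_km(f_hz: float) -> float:
    w = 2 * math.pi * f_hz
    # Resistance roughly flat at low frequencies, ~sqrt(f) once skin effect kicks in
    r = R_DC * max(1.0, math.sqrt(f_hz / F_SKIN))
    gamma = cmath.sqrt((r + 1j * w * L) * (G + 1j * w * C))
    return gamma.real * 8.686  # convert nepers to dB

for f in (10e3, 100e3, 1e6, 10e6, 100e6):
    print(f"{f / 1e6:8.2f} MHz: {attenuation_db_per_km(f):8.1f} dB/km")
```

Doubling a low frequency barely moves the result, but once the skin effect kicks in the loss keeps growing, which is the longer-cable problem in a nutshell.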
You can go beyond the cable’s specs, but you’ll start to get errors. For example, some of us have tried 10GbE on Cat 5e cables. It actually works OK for short runs in environments with low electrical noise, but it doesn’t take much to start getting data errors. Even these errors might be tolerable depending on what the higher-level layers do. Computer networks are quite tolerant of this, and if you’re not explicitly testing for lost packets you may not even notice. As far as I know, HDMI does not do retransmission; errors will turn into audio/video artifacts. If you find someone who’s tested this stuff specifically, it could be an interesting read.
Personally I haven’t tested HDMI cables, but I have tested very long USB cables – around 30m, both passive and amplified. In my testing I found that active cables solve the signal quality problems, but it appears that some hardware/drivers have critical timing requirements. If you try to go too far (around 10m with two repeaters IIRC), data packets can start to drop. For example, hard drives would work absolutely fine at the full length, but my Logitech webcams wouldn’t work reliably. As far as I can tell this was not electrical signal quality but the hardware/driver’s inability to deal with latency.
kurkosdr,
This makes it even more confusing for USB-C / Thunderbolt connector users. Even though both my monitor and my computer support it, they would not negotiate properly.
Add in docking stations to the mix, and it gets even worse. Which ports work? Which don’t? Do we need adapters? And the limitations are sometimes hidden: a type-C cable can carry HDMI, but most cables are internally type-C -> DP and then DP -> HDMI, which “erases” capabilities.
Wish they could come up with a better “Lowest common denominator” between these three standards.
This is why most GPUs and laptops have one HDMI port, so you can connect to a TV without all those adapters. This gets you the full HDMI feature-set, including (e)ARC.
The “lowest common denominator” is the intersection between DP and HDMI. Just make sure the docks you buy have the ports you want, and avoid USB-C for video output; it’s usually buggy as hell.
kurkosdr,
Yes, in my previous setup I had to use a type-C port on the dock to connect my display. And I agree, it was really buggy.
We have all this technology, and not being able to solve something as simple as connecting two devices reliably (okay, with a dock in between, but…) is disappointing.
I don’t think that makes DisplayPort irrelevant. I have no idea what ARC/eARC or Dolby Atmos are. I could literally use it with any of my equipment and it would work fine.
You don’t know what Dolby Atmos is? Have you been living under a rock for the past 10 years?
No and I kind of doubt most people do? I’ve never heard of it and I really doubt I own anything that can use it.
HDMI is more than 20 years old, in most regions there is nothing patent-related they can enforce, and the connector is unchanged, so I suppose a new standard is the way to make sure the fees keep rolling in. Even if the new claims are as bogus as an ISP that can suddenly pump 2Gbps down the same fibres that could only deliver 200Mbps a few years ago!
I suspect it’s just another case of “Our gold cables are better, if you listen carefully you can hear it!”
Up until fairly recently (2017), new HDMI versions were very welcome, as they introduced things like eARC and the ability to do 4K HDR at up to a 144Hz refresh rate. This is especially true for TVs, since TV manufacturers don’t want to waste an entire port on a DisplayPort connector that can’t do Atmos passthrough, so HDMI is your only choice when it comes to TVs (so, progress on the HDMI front is very welcome). This is objectively measurable performance btw, not fake claims like audiofool cables.
So, HDMI is set until at least 2037 as far as patents are concerned.
Now, whether HDMI 2.2 gets adopted depends on whether 8K or 4K HDR stereoscopic gaming finds a market.
kurkosdr,
One advantage of higher generation cabling is being able to “over provision” to avoid signal issues.
For connecting a 4K monitor at 60Hz, HDMI 2.0 might be sufficient. However, I would step up to 8K-capable cables that are overkill for the purpose but give me significant headroom for the signal. And if I ever upgrade in the future, the cable will already be there.
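Some back-of-the-envelope numbers make the headroom argument concrete (raw uncompressed pixel rates only; this ignores blanking, link-encoding overhead, and compression like DSC, and the bit depths are just examples):

```python
# Back-of-the-envelope uncompressed video bitrates (raw pixel data only; this
# ignores blanking intervals, link-encoding overhead, audio, and compression
# such as DSC), just to show where the headroom goes as resolution, refresh
# rate, and bit depth climb.
def raw_video_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    bits_per_pixel = 3 * bits_per_channel  # RGB, no chroma subsampling
    return width * height * hz * bits_per_pixel / 1e9

for label, args in (
    ("4K 60 Hz, 8-bit  ", (3840, 2160, 60, 8)),
    ("4K 120 Hz, 10-bit", (3840, 2160, 120, 10)),
    ("8K 60 Hz, 10-bit ", (7680, 4320, 60, 10)),
):
    print(f"{label}: {raw_video_gbps(*args):5.1f} Gbps raw")
```

HDMI 2.0’s 18Gbps covers the first line comfortably, while the 8K-class figures are where the 48Gbps and 96Gbps link rates start to matter.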
(Wish I could just use type-C, though; we are so close to having a single standard cable for everything.)
It’s also worth mentioning that even if the HDMI patents have expired for early versions of HDMI, HDCP still remains royalty-encumbered because of the DMCA (in countries where the DMCA or similar laws apply), so you can only implement cleartext HDMI without paying royalties (either in a source or a receiver). Also, you can’t use the term “HDMI” since it’s trademarked, so you have to either leave the port unlabelled or label it something generic like “Digital AV out”.
Cleartext HDMI is practical on a source device if your device isn’t meant to officially implement any DRM; that’s how some FPGA projects have HDMI output: they only output cleartext HDMI (and they also leave the port unlabelled or have it generically labelled). Cleartext HDMI on receiver devices (for example TVs and monitors) obviously isn’t practical.
@kurkosdr – Thanks, that explains something I’ve wondered about on new-generation low-cost systems that often have an unlabelled HDMI port; most of what I’m thinking of is FPGA/ASIC-based, like low-cost digital oscilloscopes with unlabelled HDMI ports.
I gather, though, that on the trademark issue you can make the cable without the labels but list “compatibility” on the packaging, which won’t hinder the low-cost / grey market at all.