USB Implementers Forum, the support organization for the advancement and adoption of USB technology, today announced the publication of the USB4 specification, a major update to deliver the next-generation USB architecture that complements and builds upon the existing USB 3.2 and USB 2.0 architectures. The USB4 architecture is based on the Thunderbolt protocol specification recently contributed by Intel Corporation to the USB Promoter Group. It doubles the maximum aggregate bandwidth of USB and enables multiple simultaneous data and display protocols.
[…]As the USB Type-C connector has evolved into the role as the external display port of many host products, the USB4 specification provides the host the ability to optimally scale allocations for display data flow. Even as the USB4 specification introduces a new underlying protocol, compatibility with existing USB 3.2, USB 2.0 and Thunderbolt 3 hosts and devices is supported; the resulting connection scales to the best mutual capability of the devices being connected.
How many years until USB4 (or later) replaces HDMI and DisplayPort? Since everything is data packets now – analog is a thing of the past – do we really need to have separate video cables? I’d love to one day build a PC that just has an array of USB-C ports on the back, with the ability to plug anything – monitor, keyboard, mouse, serial port adapter for the Windows CE mini laptop I’ve been craving to buy for ages, you know, the usual stuff – into any of the ports.
One can dream.
Thom Holwerda,
As far as I’m concerned, micro USB, while a marketing success, was an engineering failure. It was shoddy, and after a hundred normal uses electrical contact and grip would fail. Mechanically, micro USB was just too fragile and never deserved to be a USB standard. The USB ports/cables on all of my micro USB devices either failed or were on their way. Some might accuse me of being too rough, but even a slight pressure over time would cause the pins to eventually lose contact. On top of this, it was too hard to tell the orientation in the dark.
Apparently a lot of manufacturers agreed it was terrible, and many of them stuck with mini USB or the old USB Type-B for non-phone peripherals, thankfully!
https://www.newnex.com/usb-connector-type-guide.php
IMHO USB-C is a good standard though, and I’ll be happy when micro USB finally dies.
I agree this ought to be possible, but you’ve got various consortiums that don’t want to give up power or become redundant, so unless they find a way to rise above the politics, I’m not sure this will happen.
And then you’ve still got things like Ethernet ports that still need magnetics for long-distance transmission.
That said, micro USB tends to survive the drop test while USB-C does not; I’ve ripped the shell off probably over 10 USB-C cables, but have rarely done that to a micro-B cable.
They have different modes of failure due to the differences in how they are made, but neither is actually all that resilient.
cb88,
I’m not sure exactly what you mean by drop test? I’ll grant you I haven’t owned USB-C devices as long, so maybe I just need to wait longer to start experiencing problems, in which case I’ll have to report my findings in the future, but so far it feels more solid. My experience with all micro USB devices over time was that they got loose to the point where I’d need to tilt the cable just to make contact… and this was after only ~2 years with no accidents. If I had dropped my phone, at least then I could rationalize its failure, but for a connector that’s officially spec’d at 10,000 insertions, micro USB obviously failed to live up to its promise in real-world use cases.
Well, I gave up on both of them and got an inductive-charging phone… a no-brainer for me really, as I have broken both micro-B and USB-C cables. It wirelessly fast charges almost as fast as cabled fast charging.
The C cables I’ve broken tend to be from me picking the phone up and fumbling it, or tripping over a badly placed cable… Note I’ve very rarely broken a micro-B cable that way, as they tend to just come disconnected rather than rip off the shell, though sometimes they do bend, after which they are typically not quite right but tend to work for a while anyway. This doesn’t happen often, but if you goober it once or twice a month, you can see how cables end up getting broken a lot.
cb88,
I haven’t broken any USB cables this way. While not the same thing, not long ago the kids yanked on my laptop’s power cord with enough force to rip the barrel connector off. I was very worried about permanent damage to the laptop, but with a new power brick it works fine. I agree with you that this is an advantage for wireless. I actually liked the magnetic power cord that apple used in older laptops. I think they could have fixed the cord fraying issues without abandoning the whole concept.
Can’t reply to your comment below as it is too deep in the thread… but: the wireless charger battle already came and went. It was Qi vs Powermat… Powermat lost, and now everyone uses the Qi standard.
Qi charging is currently good for up to 15W, though for instance my device maxes out at 7.5W.
Also, I think Powermat has made its older devices compatible with Qi via software update… and they are all now part of the same Wireless Power Consortium.
cb88,
Thanks for responding, I wasn’t aware of these two merging.
https://www.pcworld.com/article/3302838/best-wireless-charger.html
Wireless charging is likely the future that consumers will flock to, but honestly I still have reservations personally. You get worse specs & worse efficiency than wired contacts, which may not matter for practical use cases, but it bothers me that this runs counter to the green energy movement.
Some of these wireless chargers use a USB power supply that’s rated *much* higher than the wireless charger can actually deliver. You’re left with an implicit choice: sacrifice efficiency with wireless, or stick with wired to reduce power consumption.
https://www.pcworld.com/article/3328789/ravpower-alpha-series-fast-charge-wireless-charging-pad-review.html
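For a rough sense of what that efficiency gap can add up to, here’s a back-of-the-envelope sketch; the battery size and efficiency figures are assumptions for illustration, not measurements of any particular charger.

```python
# Rough wall-side energy for one full phone charge per day over a year.
# All figures below are illustrative assumptions, not measurements.
BATTERY_WH = 11.6           # ~3000 mAh at 3.85 V nominal
WIRED_EFFICIENCY = 0.90     # assumed wall-to-battery efficiency over a cable
WIRELESS_EFFICIENCY = 0.65  # assumed wall-to-battery efficiency via a Qi pad

def yearly_wall_energy_kwh(efficiency, charges_per_day=1, days=365):
    """kWh drawn from the wall to deliver BATTERY_WH per charge."""
    return BATTERY_WH / efficiency * charges_per_day * days / 1000

wired = yearly_wall_energy_kwh(WIRED_EFFICIENCY)
wireless = yearly_wall_energy_kwh(WIRELESS_EFFICIENCY)
print(f"wired:    {wired:.1f} kWh/year")
print(f"wireless: {wireless:.1f} kWh/year")
print(f"extra:    {wireless - wired:.1f} kWh/year per phone")
```

A couple of extra kWh per phone per year is invisible on one electric bill, but multiplied across hundreds of millions of phones it isn’t nothing, which is the “runs counter to the green energy movement” part.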
In terms of scaling up "wireless" charging technology, have you seen Dell’s wireless-charging two-in-one laptop?
https://www.theregister.co.uk/2017/07/12/dell_latitude_7285_wireless_charging_two_in_one_pc/
I understand why people like the idea of wireless, but it’s hard for me to accept it as a better technology. At what point do you just call it quits and say an old-school wired dock is the way to go?
HDMI was founded by TV manufacturers, and they charge a license fee for the privilege of using it. I do not think they would want to let go of that revenue stream.
DisplayPort, on the other hand, is driven by device manufacturers, and they do not care about license fees, only compatibility. They are okay with Type-C. However, it does not have the bandwidth for all applications, especially when sharing with USB3 (or USB4), since they use some common pins: https://www.displayport.org/displayport-over-usb-c/
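To put rough numbers on that pin-sharing trade-off: DP Alt Mode can claim either two or all four of the Type-C high-speed lanes. The per-lane rates below come from the DisplayPort spec, while the 4K60 payload figure is only an approximation that ignores blanking.

```python
# Rough DisplayPort-over-USB-C bandwidth check (approximate figures).
HBR2 = 5.4       # Gbit/s per lane, raw
HBR3 = 8.1       # Gbit/s per lane, raw
ENCODING = 0.8   # 8b/10b line coding leaves ~80% for payload

def usable_gbps(lanes, rate):
    return lanes * rate * ENCODING

# Approximate 4K60 8-bit RGB pixel payload (ignores blanking intervals):
payload_4k60 = 3840 * 2160 * 60 * 24 / 1e9   # ~11.9 Gbit/s

for lanes in (2, 4):            # 2 lanes leaves room for USB 3.x; 4 lanes does not
    for name, rate in (("HBR2", HBR2), ("HBR3", HBR3)):
        bw = usable_gbps(lanes, rate)
        verdict = "fits" if bw > payload_4k60 else "does NOT fit"
        print(f"{lanes} lanes {name}: {bw:.1f} Gbit/s usable -> 4K60 {verdict}")
```

With only two lanes left for DisplayPort (so USB 3.x keeps the other two), 4K60 is already tight, and higher resolutions or refresh rates push you toward claiming all four lanes, which is exactly the squeeze being described.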
There are already Type-C-only monitors, and I expect this trend to continue. There are now "bi-directional" cables that allow older DP ports to connect to new Type-C monitors: https://www.monoprice.com/product?p_id=39240. So new monitors could keep only Type-C for data (and even power!), and leave compatibility with older devices to adapter dongles.
However, I do expect TVs to go Type-C only in the near (or distant) future.
HDMI (the protocol) is a standardised “alt-mode” for USB Type C controllers. It’s possible to connect a device to a monitor via Type C connectors/cables, and output an HDMI signal. I don’t know if that signal supports HDCP, but most likely it does.
Similarly, the DisplayPort video stream is a standardised mode for USB Type C controllers.
Thom’s suggesting replacing the HDMI connector and the DisplayPort connector with a Type C connector. Let the device and the monitor work out whether to send DVI signals, HDMI signals, or DisplayPort signals for audio/video data. Instead of having a mishmash of DVI, HDMI, DisplayPort connectors on monitors, TVs, projectors, AV receivers, etc, just have an array of Type C connectors.
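Very roughly, that "work it out between themselves" step happens over USB Power Delivery’s structured VDMs (Discover Identity / Discover SVIDs / Discover Modes, then Enter Mode). The sketch below is schematic pseudocode of that flow; the message names follow the PD spec, but the host/device objects and their helper methods are hypothetical.

```python
# Schematic sketch of USB Type-C alternate-mode negotiation over USB PD VDMs.
# Message names follow the PD spec; the objects and methods are hypothetical.

def negotiate_alt_mode(host, device):
    """Pick a video alt mode both sides support (DisplayPort, HDMI, ...)."""
    if not device.responds_to("Discover Identity"):
        return "USB only"                           # plain USB device, no alt modes

    svids = device.respond("Discover SVIDs")         # e.g. DisplayPort or HDMI SVIDs
    for svid in host.preferred_order(svids):         # host policy picks the priority
        modes = device.respond("Discover Modes", svid)
        mode = host.pick_compatible(svid, modes)     # e.g. 2-lane vs 4-lane DP
        if mode is not None:
            device.respond("Enter Mode", svid, mode)
            host.reconfigure_mux(svid, mode)         # re-route the high-speed lanes
            return f"{svid} alt mode ({mode})"
    return "USB only"                                # no common video mode found
```

The point is that the connector is just a pipe; which protocol actually runs over the high-speed lanes is decided per connection.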
One downside to doing that is the loss of the locking connector (which full-size DP/mDP has, and which is one upside DP has over HDMI). Another downside is that it would take a beefy controller to provide support for everything (TB, USB, HDMI, DP, Power Delivery, etc.) over every port. And if you don’t provide it over every port, then you end up playing "which Type-C port supports which protocols", and you’re back to square one of having special-purpose ports.
Edit: ah, I see you’re also talking about capabilities and throughput, which could also be an issue with Type C (especially when using it for USB and not TB).
Apple does not want a standard charger because it would allow users to enable rapid charging with ease. And the reason Apple wants to block it is that they are on the brink of moving to portless phones, and wireless charging is much slower. It would be hard to sell the move to no charging port at all as innovation should it slash charging speeds by 70% or whatever.
sj87,
Not only is it slower, it’s also less energy efficient. What’s everyone’s opinion of induction charging? Does efficiency matter? My opinion is that we need to put more emphasis on efficiency going forward. I have no preference for induction charging. If there’s a cradle that’s well designed and charges using electrical contacts, then to me that’s just as good as a cradle/pad that works inductively. I know these went out of style though, haha.
Wireless charging much slower?? What? My S8 Active takes about 1.5-2 hours from zero to full on inductive charging. Regular slow charging is about 5+ hours. I really don’t care at that point… though I guess some people may have gotten spoiled by phones that can do 50% in 15 minutes or something akin to that.
If your inductive charging is slow, it is because of the implementation of the charger or phone, not a limitation of the tech.
cb88,
If you want to say that wireless charging is “good enough”, then I can accept that. After all, it hardly matters for those of us who sleep, haha. However, when talking about the limitations of the tech, I think it’s totally fair to say that inductive charging really IS limited with regard to the amount of current you can deliver without causing damage. With induction you face inherent inefficiencies, and that lost energy manifests itself as heat (sort of how an induction cook top works). You’re looking at the 8W–12W range. USB-C is rated for up to 100W. Obviously that is more for laptops than phones, but nevertheless the point remains that wired has far more headroom than wireless.
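For the headroom point, a quick back-of-the-envelope (the battery size is an assumption, and it ignores the taper at the end of a real charge cycle):

```python
# Ideal, no-taper time to deliver one full phone charge at various power levels.
# Battery size is an assumption; real charging slows considerably near 100%.
BATTERY_WH = 11.6   # ~3000 mAh at 3.85 V nominal

for label, watts in [("Qi pad, 7.5 W", 7.5),
                     ("Qi fast, 15 W", 15),
                     ("USB-C PD, 45 W", 45)]:
    minutes = BATTERY_WH / watts * 60
    print(f"{label}: ~{minutes:.0f} minutes, best case")
```

In practice the phone’s own charge controller throttles well below the cable’s limit, but the headroom argument stands: the wire can carry far more than any phone will ask for, while the pad is already near its ceiling.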
Even with your S8 phone, wired wins if you use a "fast charger"…
https://www.blogtechtips.com/2017/10/12/fast-wireless-charging-vs-fast-wired-charging-galaxy-s8/
Granted, all of this is probably irrelevant for the average consumer, who just buys whatever seems “cool”. But we have a new problem: without wireless standards, we could easily end up in a situation with numerous incompatible wireless chargers (just like wired chargers were many years ago).
https://www.edn.com/wireless-charging-the-state-of-disunion/
There already is a universal, standard, charger. Pretty much every charger out there has either a USB Type A port, or a USB Type C port. The chargers work with pretty much every mobile device out there (phone, tablet, netbook, laptop, etc). You only need to carry around a single charger for all your devices.
The cable you plug into the charger depends on the device you want to charge. Some use MicroUSB, some use Type C, some use Lightning. Just carry around the cable you need for the device you have, and you can plug it in anywhere to charge.
Hopefully never.
USB has some security problems (undetectable keyboard loggers, etc); video has some security problems (recording everything on your screen); and Thunderbork is a massive security disaster. By combining all of this unrelated stuff in the same port you create the possibility of “super trojan” devices that are able to take advantage of all of the separate security problems simultaneously.
At least when I plug a monitor into a HDMI or DisplayPort socket, I know the monitor isn’t able to do anything except see the video card’s signals.
LOL… you actually believe that. You realize that HDMI and DisplayPort snooping is just as possible; neither is a secure link in any shape or fashion. The only security they do have is there to more or less prevent people from playing DRMed streams and recording them, and basically all of that encryption is broken anyway… even for the signals that may end up encrypted with HDCP etc.
The main dangers of Thunderbolt are that it allows access to the system bus… Hopefully USB4 addresses this in some way.
Also HDMI has an Ethernet channel… as part of the spec.
cb88,
You brought up an excellent point about thunderbolt’s notorious security problems. What do you think the odds are they addressed it before making it part of the USB4 standard?
To mitigate the damage, in the past they relied on hardware virtualization features available on some x86 chips to isolate the vulnerabilities (Intel® Virtualization Technology for Directed I/O, or VT-d). But there are several problems with this. It requires the OS to virtualize everything, which requires driver re-engineering. This not only increases complexity but also risk if they get it wrong:
https://www.osnews.com/story/129501/thunderbolt-enables-severe-security-threats/
Furthermore, every DMA access needs to be re-routed to a safe staging area, which nullifies all the performance benefit Thunderbolt is supposed to offer in the first place. There’s no other way to be 100% safe using Thunderbolt’s faulty security model.
In retrospect the fault with Thunderbolt is so obvious: you don’t put a DMA controller inside untrusted peripherals. Seriously, when you plug in a Thunderbolt camera, it shouldn’t have frickin’ access to host RAM; this is so stupid. It would be fixable by explicitly moving the DMA controller out of peripherals and onto the chipset, but I’ve heard nothing of the sort. My understanding was that they were going to take Thunderbolt as-is, which means we either end up with speed-killing mitigations to babysit DMA transactions, or USB4 will inherit Thunderbolt’s vulnerabilities, which is a damn shame.
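To make the IOMMU point concrete, here’s a toy model (purely conceptual, not any real kernel API): without remapping, a device’s DMA engine can name any physical address it likes; with it, every access has to go through a per-device table that only the OS can populate.

```python
# Toy model of IOMMU-style DMA remapping -- conceptual only, not a real API.

class ToyIOMMU:
    def __init__(self):
        self.mappings = {}   # device_id -> {device_address: host_address}

    def map(self, device_id, device_addr, host_addr):
        """The OS grants a device access to exactly one region of host memory."""
        self.mappings.setdefault(device_id, {})[device_addr] = host_addr

    def translate(self, device_id, device_addr):
        """Every DMA request is checked; unmapped addresses are rejected."""
        try:
            return self.mappings[device_id][device_addr]
        except KeyError:
            raise PermissionError(
                f"device {device_id}: DMA to unmapped address {hex(device_addr)}")

iommu = ToyIOMMU()
iommu.map("tb-camera", device_addr=0x0, host_addr=0x7f000000)  # one frame buffer
print(hex(iommu.translate("tb-camera", 0x0)))   # allowed: the region the OS mapped
try:
    iommu.translate("tb-camera", 0x1000)        # anything else is rejected
except PermissionError as err:
    print(err)
```

Without the translate step it’s effectively “return device_addr”, which is the pre-mitigation Thunderbolt situation: the peripheral gets to read and write host RAM directly.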
And what about CPU architectures that don’t have hardware virtualization? Are they all going to be implicitly vulnerable going forward when they get USB4 ports?
Hopefully I’m just out of the loop and they’ve already managed to fix it somehow, but if not then this is extremely disappointing. Who would green light external buses with known vulnerabilities and why?
Note that this can never make it 100% secure. A malicious device can tamper with the code that sets up IOMMU before that code is executed. Even if the code is proven correct (e.g. digital signature check, or maybe TPM) a malicious device can modify the code after it was proven correct and before it’s executed.
To guard against that an OS could (in theory) set up a “secure enclave” (on Intel CPUs that support Software Guard Extensions), then prove the code in the enclave is correct, then execute the code. However; a malicious device could modify the code that sets up the secure enclave and… It’s just shifting the problem somewhere else and solves nothing.
One “100% secure” solution is for firmware to set up the IOMMU (using code in ROM that can’t be modified, taking special care to ensure that no data in RAM, including temporary data on the stack, is used by the code in ROM) before the firmware starts an OS. This is not possible in practice (it’s not part of any BIOS/UEFI spec and not something any OS will expect).
Another “100% secure” solution is to have a “Thunderbork enable/disable” flag somewhere, so that the device is isolated from the system bus until after the OS has configured the IOMMU. This would mean no Thunderbolt device could be used by firmware or during boot (e.g. no possibility of “boot from Thunderbolt device”); it would be a backward compatibility disaster (impossible for an older OS that doesn’t enable Thunderbolt to use any Thunderbolt device); and there is no such flag, and nothing in any standard (e.g. in ACPI tables) that would allow one to be described.
The only other “100% secure” solution (and the only solution that is practical) is to rely on physical security – e.g. literally pack the socket/s with epoxy resin (on every computer for the entire company) to ensure nobody (no unsuspecting employee working for the company) can ever plug anything into the socket/s.
When I see: “Universal Compatibility with Thunderbolt 3 devices”
https://gbhackers.com/usb-4-released/
That doesn’t sound like they changed much. 🙁
HDMI also transmits audio. And HDMI is bi-directional. The CEC standard allows one device to transmit user-control signals (e.g. volume up/down) to another device.
So how easy would it be to send a malicious packet over HDMI? It’ll take less time to crack HDMI than it took for USB. All the lessons from the USB-compromise research are being put to use against HDMI.
‘Not only is it slower, it’s also less energy efficient. What’s everyone’s opinion of induction charging? Does efficiency matter?’
Induction charging is just as much of a joke as 3D TV was. Most devices aren’t compatible with it and never will be, so what’s really the point of it?
Do you *REALLY* want to induction charge a USB battery pack? I’ll tell you right now, the answer is *NO*.
None of my phones has induction charging built-in, but there are adapters you can buy on eBay for under $5.00 that you can attach to the phone case or other device and plug into the USB port, making it induction-charging compatible.
Don’t bother.
You’re not saving time, energy, or money with induction charging.
It’s just another fad, like 3D TV was, and is pretty much aimed at the same crowd.
I’ll tell you the point… no physical connections. Maybe I’m a klutz, but a device plugged in sitting on my nightstand will get knocked off on average twice a week. That induces a lot of wear on the physical connection between the two. This gradually results in a busted USB port on the device, requiring excessive fiddling to get it to make a good connection. What happens with wireless charging? The device still goes flying, but the case is enough to protect it; no additional wear is induced at another stress point.
Bill Shooter of Bul,
I don’t say this to contradict you, because I hear what you are saying, but what would you think about a wired dock that the phone could fall off of without damage? (I’m not suggesting a USB-C connector per se, just hypothetically.)
‘I’ll tell you the point… no physical connections. Maybe I’m a klutz, but a device plugged in sitting on my nightstand will get knocked off on average twice a week. That induces a lot of wear on the physical connection between the two. This gradually results in a busted USB port on the device, requiring excessive fiddling to get it to make a good connection. What happens with wireless charging? The device still goes flying, but the case is enough to protect it; no additional wear is induced at another stress point.’
If you’re knocking it off that hard, a busted USB port is the least of your concerns.
You’ve most likely cracked or otherwise damaged the screen itself.