Intel and others plan to release a new version of the ubiquitous Universal Serial Bus technology in the first half of 2008, a revamp the chip maker said will make data transfer rates more than 10 times as fast by adding fiber-optic links alongside the traditional copper wires. Intel is working with fellow USB 3.0 Promoters Group members Microsoft, Hewlett-Packard, Texas Instruments, NEC and NXP Semiconductors to release the USB 3.0 specification in the first half of 2008, said Pat Gelsinger, general manager of Intel’s Digital Enterprise Group, in a speech here at the Intel Developer Forum.
It will be interesting to see how eSATA and FireWire evolve to compete. But a thought occurs: USB is nice because it is a relatively cheap (and virtually universal) way to connect things. And yes, I mean cheap (note that the $40 you paid for a printer cable from Best Buy was all markup from the store; those things don’t cost diddly).
Won’t the addition of fiber-optic links up the price considerably? Though for that kind of speed boost, I wouldn’t mind too much.
I come from the early days of the MiniDisc, from way before NetMD (hook up your MD recorder to a PC) was introduced, and I can tell you, back then, a good selling point of certain MiniDisc recorders was that they included the optic cable to hook the MD up to the CD player.
Dunno where it stands now. I still use the same optic cable that came with one of my MD recorders ten years ago, and it still works flawlessly.
How many CD players actually had optical out?
About every stand-alone (and even many integrated devices) after, say, 1992? I’ve never had a CD player without optical out – except for the one I’m using in my hi-fi setup now, because that one is from 1990 or 1991 (the reason I use it even though it’s old is that it’s such a sturdy and good piece of equipment – it plays every disc I throw at it).
In Denmark many were sold without any digital out (neither optical nor electrical) even in 1998 (and even after Y2K). I’ve yet to see a CD player with digital out. DVD players are, however, quite a different issue.
One of the advantages of optical data/audio connection is that there is no electrical connection between the two devices. This is why MIDI, for example, is opto-isolated. It sounds like this new USB connection is going to have electrical and optical connections running side by side however.
Offtopic…yes
The guy at best buy told me that I have to use a quality printer cable if I want quality prints.
Seriously though, you’re right. I’m waiting for the class-action lawsuit against Best Buy for their damn cable prices and lying to consumers.
HDMI is the worst. They’ll tell you the picture quality on your digital TV from your digital DVD player will be better if you use a $150 cable.
My $3.50 HDMI cable from monoprice.com works just fine.
Digital is great, 1’s and 0’s, it either works or it doesn’t, nothing in between. If I see a picture on my TV I know that it is impossible for the cable to be any better.
It’s frightening how much people pay for those cables. I simply buy them online, where you can get them for about 5 bucks. What’s even more disturbing is the fact that pretty much no printer I’ve ever seen even comes with a USB cable! Even my DSL modem came with a USB cable. Everything else that requires a cable comes with it, but printers for some reason never have, even the older parallel printers. Why is that? I guess it’s one of those mysteries, like the hot dogs versus hot dog buns…
It’ll be interesting to see what the real-world performance will be, but anything that makes USB faster is a Good Thing (TM).
I feel that calling this technology ‘USB’ is solely a marketing decision.
Most users don’t need that kind of speed from a serial bus, and most hardware wouldn’t utilize it anyway.
The question is: what’s that for?!
Hardware interfaces don’t really need it at this point, since 2.0 has more than enough bandwidth, but it would be incredible to run your whole studio right out of an external device with absolutely no lag.
"Most users don’t need that kind of speed from a serial bus, and most hardware wouldn’t utilize it anyway. The question is: what’s that for?!"
External (hard) drives, for example. I use them a lot and USB 3.0 definitely is something I’m looking forward to.
That is the same example I was going to give. Transferring a lot of data to an external harddrive can take time; it’s even worse if you have an older computer with USB 1.1. USB 3.0 is more than welcome for me.
The only thing is – it will take some time before USB 3.0 is out there and affordable, as somebody already said.
I actually am looking around these days to get eSATA card and external drive for it. Any first hand experiences?
“I actually am looking around these days to get eSATA card and external drive for it. Any first hand experiences?”
I just recently (like a week ago) replaced my (aging) external USB 2.0 drive with an eSATA MyBook… I can safely say that performance roughly doubled, if not more than doubled. I don’t have any raw numbers, but it sure does seem a lot faster.
BTW, MyBooks don’t come with an eSATA cable… you have to buy one, and they are about $20.
If you’ve got free internal SATA ports then you can use a simple bracket to convert them to eSATA ports. In my experience that works perfectly, I’ve been using eSATA drives for a while now.
Personally I prefer buying a separate enclosure and drive rather than a complete external hard drive. Those drives are sealed up and opening them voids the warranty, while an enclosure can be reused with a different drive if required.
Amen. That, combined with the fact that with enclosures sold with the drives already installed, you pretty much have to destroy the enclosure to get the drive out if anything goes wrong. Which seems to happen fairly frequently, since many external drives have plastic enclosures with no cooling.
"If you’ve got free internal SATA ports then you can use a simple bracket to convert them to eSATA ports. In my experience that works perfectly, I’ve been using eSATA drives for a while now."
Now, that is the advice I was hoping to get. I do have free SATA ports internally. And yes, separate enclosure and drive externally.
Thanks.
USB is a horrible interface for external harddrives. IEEE1394 (Firewire) and eSATA are much better suited to these uses.
The main problem with USB is that it requires shuffling all data through the host CPU. Transferring data to an external harddrive can send your CPU usage spiking, yet transfers between internal harddrives don’t (as IDE/SATA/SCSI don’t require the host CPU to be a traffic cop).
Use the right bus for the job. USB is good for devices without processing power (cameras, mice, keyboards, gidgets and gadgets, mp3 players, etc). For larger devices, or devices that need fast transfers without bogging down the host system, use FireWire or eSATA.
It is not THAT bad, please: it depends on what you’re doing. If you’re copying many not-that-big files, you’ll have no problems. At least I didn’t. Nothing really worth complaining about.
But if you need to move, for example, 10 GB of data to an external drive (including, say, 5 files that are close to 1 GB in size each), then it’s a horrible experience on Windows Server 2003 (Vista handles this *MUCH* better).
So I started thinking of eSATA. It would solve the freezing problems, and it is faster too.
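Back-of-the-envelope numbers, assuming ballpark real-world rates (my assumptions, not measurements: ~30 MB/s for USB 2.0 mass storage, ~80 MB/s for a disk over eSATA):

10 GB ÷ 30 MB/s ≈ 340 s ≈ 5.5 minutes over USB 2.0
10 GB ÷ 80 MB/s ≈ 130 s ≈ 2 minutes over eSATA

So even before the freezing issue, that 10 GB copy is noticeably less painful over eSATA.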
I imagine that streaming HDTV would be one good example.
You can forget that.
Only way that will happen is if they add all the HDCP BS to it like they did to DVI and called it HDMI.
Not if you’re using your own footage. High-def camcorders are on the market now.
Harddrives?
It’s version 3 already and they still haven’t figured a way to hook it up right to your brain’s axons…
Sadly that won’t be part of the specification until USB 4.0
*Bring on the cybernetics dammit!!*
If they change the cable, then they should call it something other than USB. USB 1.x, and USB 2.x devices use the same set of cables. If they have to change it for USB 3.x, then it won’t be backwards compatible and it’ll start confusing people.
Just imagine the conversations:
Customer: I have a USB cable, and this device, but I can’t plug it in.
Agent: Well, that’s because it’s USB 3.0, and you need a DIFFERENT cable, which we can sell you for (=cost*excessive multiplier).
I agree with the sentiment. Standards like SCSI were bloody annoying because there were different cables for the different revisions (and different connectors, sometimes just to suit an implementor’s whims).
But that’s not going to stop them from keeping the name. After all, USB is a valuable trademark and they are going to keep it just because the name will give it a leg up in the world.
I also wonder about having an electrical and optical interface in parallel. Surely people aren’t that easily confused, and will be able to handle USB for low speed peripherals and a separate optical bus for high speed peripherals until the optical bus takes over. Since the cables will be less complex, it should keep the price of entry lower and maybe even encourage adoption.
I too envisioned something that looks like the old cable, has the same four metal contacts, and adds a high-speed optical link. The thing is, how would an optical signal be passed through? Would USB 3.0 hubs have lasers?
"Would USB 3.0 hubs have lasers?"
I imagine that they would. Low-bandwidth data fiber channels are getting pretty cheap nowadays. What would be impressive would be an all-optical multiplexed uplink on an 8-port hub. Otherwise, you might get latency and bandwidth issues from re-encoding the signal.
Yes, the cable will be different, but they might be able to make it work with the same plugs, just like with USB 1.0 and USB 2.0. So if you plug a USB 3.0 cable into a USB 2.0 port, it’ll just work at 2.0 speeds. That might be possible, but I’m just talking w/o looking at any specs here.
I find that hard to believe. Have you seen a fiber-optic network cable before? You have to have a sheathing that protrudes into the device in order to shield the light path from anything outside the device and cable. Building that into the current USB cable would not produce a cable that would still work with older USB revisions.
And if they make it thinner, then it will be very likely to break.
Either way, you still have the issue of people trying to plug the old USB cables into new USB3 devices and wondering why they are not getting the speed they expected. So my original point is still valid.
The only real solution is to make it a different cable – perhaps a USB C/D cable, instead of the current USB A/B, would be the only way to avoid confusion (then the devices would have to support both to be backwards compatible, or at least the USB 3.0 hubs and root devices would have to) – but then you might as well call it something else.
"Have you seen a fiber-optic network cable before? You have to have a sheathing that protrudes into the device in order to shield the light path from anything outside the device and cable."
But USB already has that: the metal shield around the USB plug would act as a light shield. It’d be tricky, but you could build the optical coupling into the end of the plastic insulation that the contacts are mounted on. The tricky thing is getting the alignment right.
Yes, USB has the metal shielding, but you need a non-shiny, non-reflective shielding that is tightly wrapped around the fiber – which is why a lot of fiber cables use a plastic or rubber shielding on the end.
So no, that wouldn’t likely work.
The actual interlink would be in the non-shiny, non-reflective insulating part of the USB plug (if it could be made to fit), but the shield, and the shield part of the socket, would provide a thorough physical barrier to the light.
I can only assume that’s how they are doing it, but my god that’s fiddly. The plastic bit inside the plug’s metal sheath is, what, 2mm thick or so?
Either (a) an exciting manufacturing challenge, or (b) very easily breakable
http://crave.cnet.com/8301-1_105-9780794-1.html
The link shows a picture of the card and cable and it’s got a USB standard connector on it.
USB 3.0 will (supposedly) be completely backwards compatible, just like 2.0 was.
Why would you think that the cable is not going to be backwards compatible? They are ADDING an optical link, not replacing the copper wires with optical. Just like you need a shielded cable in order to run at USB 2.0 speeds, and you don’t need shielding for a low-speed USB cable, you will have an ultra-high-speed cable that has an optical channel in order to achieve the really high speeds. The jack is exactly the same, the protocol is pretty much the same, and there is no problem using the new cables with older devices, or new devices with older cables, as long as you don’t mind the reduction in speed. In every respect this IS a USB revision.
The big problem, however, is that there is quite a bit of overhead in most USB drivers that exist today. There will have to be some better quality control on those in order to reach the promised speeds. But then again, there are hardly any USB 2.0 devices that perform up to the standard…
Truth be told, I really don’t know of any device on the market today that can benefit from a USB 3.0 connection. Sure, it allows for more power to be drawn and has higher promised throughput, but you can get high speed with eSATA, and a lot of power and reasonable speed with FireWire 800. Yes, FireWire 800 seems to be an Apple exclusive (almost), but this is mostly due to the fact that there is no consumer interest in FireWire products when it comes to the Windows world. That still doesn’t mean that the technology does not exist, or that you need to create a super-expensive optical hybrid cable for the next revision of USB. Oh, and whatever happened to wireless USB?
I don’t want a new USB/ATA type of connection that taxes the CPU/memory. If USB 3.0 is as inefficient as USB 1/2, then I don’t want it. From what I have heard, ATA and USB waste many CPU cycles where FireWire/SCSI are much more intelligent and do much more on their own without requiring assistance from the CPU. When I build my computers I want parts that offload the CPU, not ones requiring it to work harder.
Then why would you keep on buying better CPUs if it’s just to let them sit there doing nothing?
Why do you think computing is getting cheaper and cheaper? Because we are actually building less into the hardware.
SCSI is a fine technology but it’s much more expensive than SATA.
Winmodems, for example, were cheaper because they offloaded much of their operations to the driver. That’s the way technology is going.
Use my CPU, damnit, if it’s making everything cheaper. Use it as much as you have to.
No, technology is going towards offloading the main CPU, not burdening it with more stuff. That’s why we have 3D accelerators, sound cards, TCP checksum offloading, etc.
I agree with you. We are seeing that CPU speed/performance isn’t increasing as before, and memory access is one of the BIG bottlenecks. As I see it, the only way to increase performance in the future is:
1. Offloading the CPU(s)/main memory with smarter devices/specialized processors that relieve the (main) CPU/memory of routine work.
2. More efficient OSes, built from the ground up for multiple CPUs/multithreading…
Vista doesn’t hardware-accelerate the sound part of DirectX, and that made users of the Creative X-Fi cry, because none of their old games are going to get EAX if they were using DirectX for sound. So you are wrong about the sound card part.
Now the games are either required to use OpenAL if they want to benefit from hardware sound acceleration or shut the f–k up and get used to it.
Of course, professionals having requirement for professional sound hardware will always be able to use it, with ASIO for example, just not by DirectX.
3D accelerators are going to merge with the CPU. You need to see the research done by AMD+ATi and Intel about this. AMD bought ATi for a reason.
And then there’s also the Cell in the PlayStation 3, which is actually offloading some of the work usually done by the graphics card.
[quote]From what I have heard, ATA and USB waste many CPU cycles where FireWire/SCSI are much more intelligent and do much more on their own without requiring assistance from the CPU.[/quote]
That’s because the USB specification has a horrible, terrible software interface. I can’t stress how badly designed it is. But then I am unsure; maybe it is done that way on purpose. Perhaps to create a more involved developer market.
The USB spec exposes the ridiculous transaction-level protocol details to the software. What this means is that when a transaction fails (a transaction is like a request that is part of a single data transfer), the software has to retry it, constantly monitor multiple request queues, reclaim and restore them, etc., and take account of very low-level USB error codes. So much crap you have to do. Another hint at this complexity: no other device protocol requires a whole stack.
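Just to illustrate the flavour of it, here’s a hypothetical C sketch of the host-side busywork (all names are made up for illustration; this is not any real USB stack’s API):

[code]
/* Purely illustrative -- hypothetical names, not a real USB stack API. */
#include <stddef.h>

#define MAX_RETRIES 3

enum xfer_status { XFER_OK, XFER_NAK, XFER_STALL, XFER_CRC_ERR };

struct transaction { int token; };
struct queue { int id; };
struct request {
    struct transaction *trans;
    size_t num_transactions;
    int endpoint;
};

/* Stubs standing in for actual host-controller access. */
static enum xfer_status do_transaction(struct queue *q, struct transaction *t)
{ (void)q; (void)t; return XFER_OK; }
static void clear_halt(struct queue *q, int ep) { (void)q; (void)ep; }
static void reclaim_queue_entries(struct queue *q) { (void)q; }

/* The point: the *driver* has to drive every transaction, retry on NAK,
   recover from stalls, and clean up the queues -- the hardware won't. */
static int submit_transfer(struct queue *q, struct request *req)
{
    for (size_t i = 0; i < req->num_transactions; i++) {
        enum xfer_status st;
        int tries = 0;
        do {
            st = do_transaction(q, &req->trans[i]);
            if (st == XFER_NAK) {          /* device not ready: just retry */
                tries++;
            } else if (st == XFER_STALL) { /* endpoint halted: recover first */
                clear_halt(q, req->endpoint);
                tries++;
            } else if (st == XFER_CRC_ERR) {
                tries++;
            }
        } while (st != XFER_OK && tries < MAX_RETRIES);
        if (st != XFER_OK)
            return -1;    /* give up; caller must reclaim/restore the queue */
    }
    reclaim_queue_entries(q);  /* bookkeeping the hardware won't do for us */
    return 0;
}
[/code]

Multiply that by multiple queues and endpoint types, plus constant reclamation, and you can see where the driver bloat comes from.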
Same goes for ATA. Damn, all you do is read sectors, write sectors, do DMA, and OK, some extra features are acceptable. But 8 ATA standards? So many registers and modes. For what? Hardware is overly complicated nowadays, and I blame incompetent spec authors. So much bloat.
Final word: try PCI Express. You configure it, then simply access your memory-mapped device conveniently at 2.5 GT/s.
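Compare: on Linux you can poke a PCIe device’s memory-mapped registers from user space in a handful of lines. A minimal sketch (assumes root, a hypothetical device address, and that BAR0 is a memory BAR; error handling kept short):

[code]
/* Minimal Linux sketch: map a PCIe device's BAR0 into user space.
   The sysfs path is hypothetical -- substitute your own device's address. */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void)
{
    int fd = open("/sys/bus/pci/devices/0000:01:00.0/resource0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* Map 4 KiB of BAR0; reads/writes now go straight to the device. */
    volatile uint32_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                   MAP_SHARED, fd, 0);
    if (regs == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    uint32_t value = regs[0];   /* plain memory access, no transaction dance */
    printf("register 0 = 0x%08x\n", value);

    munmap((void *)regs, 4096);
    close(fd);
    return 0;
}
[/code]

Once the BAR is mapped, talking to the device is just memory access, no per-transaction babysitting.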
Agreed 100%. USB is ridiculously designed, and overly complex. I swear it was designed just to make CPUs run slower.
Perhaps it is because it replaced the old serial ports, which provided a very basic interface that nearly any device could communicate over, creating its own protocol on top.
So this is likely on purpose, done specifically to allow the same amount of freedom in the protocol that the old serial devices had. That’s why you have to have an extensive software interface that monitors even the lowest-level errors – it’s really a driver interface to some unknown device that only the driver knows how to completely communicate with and interpret, and which, btw, has to run on the CPU.
Wireless USB will be used by then, taking the low bandwidth hardware, leaving USB 3.0 to fight over things like wired keyboard, mouse, webcam, hard drive… all of which can get by pretty well with standards other than USB 3.0. I’d like to have my external hard drive on eSATA and my mouse on USB 2.0. Oh wait — I already do!
USB 3.0 will have the additional disadvantage of more fragile cables due to the optical component. Not to mention the extra price you’ll pay for such a unique and complex cable. Copper and fiber optics in one cable doesn’t sound very consumer-friendly to me!
The potential market for USB 3.0 has already been fleshed out by Firewire 800. That is to say, there is not much of one at all.
Appendix:
Last I heard, regular old Cat5e was doing pretty well at gigabit speeds without needing fiber optics. The FireWire 800 standard even supports using it instead of traditional FireWire cabling.
So most data cables could be the exact same thing, but consumers are happy to pay $30 for a new plug at Best Buy. I’m not sure why that is. I do know why corporations are happy using different cables everywhere, however: fat profits on accessory sales.
Yeah, but we’re not talking Gigabit, we’re talking ~5 gigabits with USB 3.
Cat6 cable is capable of reaching 10 Gbps+ over short distances. The next Ethernet spec includes something like 380 Gbps over 10′ using copper cable.
Fiber optics is really only good for EMI-free transmission and/or long distances, neither of which is needed for USB-like connections.
On one hand, the initial devices using USB3 will be mighty expensive, as they’ll have to include optoelectrical transceivers, and so will the motherboard.
On the other hand, if Intel really manages to pull it off (and Intel tends to do this stuff fairly often), transceiver technology will become cheaper and cheaper still. Which is great!
On the third hand, USB sucks *an awful lot*, leaves everything unspecified and always leads to competing and incompatible solutions battling to become a de-facto –and then ratified– standard in a future revision. Like mass storage (not standard on 1.1), digital audio (same as mass storage), networking (still not standard on 2.0 and requires special dongles for no good reason), debugging interface (same situation as networking), and the list goes on…
Contrast to Firewire, where everything I wrote was either standardised from the get-go, or was just a matter of implementing a driver bridge instead of requiring yet another dongle. Firewire is a blessing, and I’m truly saddened pre-Jobs Apple of the 90s didn’t push it hard enough, and post-Jobs Apple of the 90s up to today didn’t lift the royalties on the trademark and didn’t push it hard enough as well.
BTW, Thom, every revision of ATRAC sucked. Its quality is only better than TwinVQ and ancient MP3 encoders that had a messed-up psychoacoustics engine. Modern encoders like LAME and modern decoders like MAD are leaps and bounds better than ATRAC. And I’m not even going to compare it to Vorbis or AAC, or the SBR versions of AAC and MP3.
Get over your MD fanboyism; MD was only good for bootlegging out of the sound table when one couldn’t afford proper DAT equipment. Let it go.
(Non-glass) fiber optics are pretty cheap nowadays. Optical TOSLINK connections are built into many motherboards (and the MacBook Pro), or available as $20 expansion cards. In 5-10 years’ time, the mass-market cost of high-bandwidth optical connections should be well below that.
MP3 sounds canned to my ears. I prefer the warmer, fuller sound of proper ATRAC. I don’t care what ten million tests say – the only ear I trust is my own. MP3 sucks for me, and I have every damn right to say that whenever I want.
Meianotaaoiew, if you have a problem with me (and we know you do), please keep it on email or something.
Unfortunately most people here are living 20 years ago. I have an MZ-RH1 MiniDisc player, which supports ATRAC3plus, and compared to MP3 it is wonderful at low bit rates.
Which is precisely the reason I mentioned the SBR variants of MP3 and AAC (MP3pro and AACplus).
ATRAC messes up frequencies, stripping some and *creating* others which don’t exist, and it seems to decide to do it in some all-too-arbitrary fashion, especially on sound pulses above 16 kHz (like hi-hats). AAC likes stripping stuff above 16 kHz as well, but it’s at least consistent and keeps details better. Later revisions of ATRAC do alleviate this somewhat, but they never solved it. And Sony is too stubborn to let it go and embrace modern, standard technologies.
And I really like Sony’s build quality. But not their stubbornness in keeping everything proprietary and implementing competing technologies (like MP3) in a way that makes them sound (artificially) worse than what proper codecs yield.
Saying a compression scheme creates “warmth” is the same audiophile myth that preaches coaxial S/PDIF sounds better than TOSLINK. It’s a digital protocol, dammit; any differences are due to an inability to perform error correction. Anything else is an excuse to justify paying an awful lot for equipment that’s not nearly as good as the price tag would suggest, so all one has left is to appeal to subjective emotions that nobody else can relate to.
The MP3s LAME can produce when disabling the psy models (--athonly and a high bitrate) are just *fantastic*. Try it sometime. QuickTime’s encoder doesn’t hold a candle. Nor does Fraunhofer’s.
IMHO, not when it’s completely off-topic. So here it goes, a taste of your own medicine…
Look, you should at least begin by showing some respect and spelling my handle correctly; then, instead of using subjective reasoning that by its own nature one can’t refute, you halt the flow of the discussion. That, too, isn’t the least bit polite; worse than going off-topic, IMO.
I have some beefs with you, I’ll give you that. The timing of your news posting; the changing of the comments system so it rewards “easy unanimities” over consistently good comments (I even tested it myself, by posting a very agreeable one-liner comment which yielded me 18 easy points!); the linkbaiting strategies (like on the recent post about ACK); the completely arbitrary modding behaviour (slamming -500 points on some poor dude that disagreed with you on a bad day, while OTOH you leave some notorious trolls untouched); the general attitude of “I’m the boss here”, which is historically the #1 most blatant sign that someone ought to be stripped of such powers.
Learn to take some criticism, Thom. Your inability to do so, coupled with abusing your modding powers, leaves quite a sour taste in the mouth. OSNews has lost the interest of some very valuable members (like rayiner, who now posts only seldom), and I have lost count of how many times I considered jumping ship as well. I withheld, because for better or worse you and Eugenia have quite a number of connections, and sometimes nail a good interview or two.
OSNews is becoming more of a meta-RSS-aggregator to me than a forum where I can learn from the experience of others (ormandj, dylansmrjones and rayiner taught me a lot, for instance), expose my opinions freely, and let the ecosystem take care of itself. The linkbaiting and traffic-generating strategies took away the appeal of this site over /. and some others.
Anyway, since I’m on the verge of going down the -500 drain myself, I wish a nice day for my OSNews pals, it’s been a good ride. Goodbye.
Have you actually tested yourself to see if this is anything but placebo effect? Your own expectations and biases can have a powerful effect on what you hear…
Seeing if you can actually tell the difference between the original WAV and a high quality MP3 would be a start, there’s software that makes a blind comparison very easy. If high quality MP3 is indistinguishable from the source audio (as it is for almost everyone), then it’s pretty obvious that ATRAC isn’t offering more accurate sound.
Personally I think that the perceived superiority of ATRAC (if it’s anything other than pure placebo effect), comes from it adding distortion that’s pleasing to some people’s ears. Sometimes less accurate encoding can actually sound better to some people, especially if it’s a sound they’re used to. For example, I’ve known a couple of people who prefer low bitrate WMA to the original CD. I think they described the sound as more ‘warm’ as well. You might be able to get the same effect by using certain EQ settings or messing with the MP3 files in a sound editor.
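Incidentally, the bookkeeping behind an ABX test is tiny. A toy C sketch, in case anyone wants to try it honestly (the actual playback of the A/B/X clips is left to your audio player; this only does the blind assignment and the “how likely by chance” math):

[code]
/* Toy ABX bookkeeping: random blind assignment + binomial "by chance" odds. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Probability of getting >= k correct out of n trials by pure guessing. */
static double p_by_chance(int n, int k)
{
    double p = 0.0;
    for (int i = k; i <= n; i++) {
        double c = 1.0;                     /* compute C(n, i) iteratively */
        for (int j = 0; j < i; j++)
            c = c * (n - j) / (j + 1);
        p += c / (double)(1u << n);         /* C(n, i) * (1/2)^n */
    }
    return p;
}

int main(void)
{
    const int trials = 16;
    int correct = 0;
    srand((unsigned)time(NULL));

    for (int t = 1; t <= trials; t++) {
        int x_is_a = rand() % 2;            /* secret: X is A or B */
        int guess;
        printf("Trial %d: listen to A, B, then X. Is X = A? (1=yes, 0=no): ", t);
        if (scanf("%d", &guess) != 1) return 1;
        correct += (guess == x_is_a);
    }
    printf("%d/%d correct; odds of doing that well by guessing: %.4f\n",
           correct, trials, p_by_chance(trials, correct));
    return 0;
}
[/code]

With 16 trials, 12 or more correct answers happen by luck less than 4% of the time, so at that point you’re probably hearing a real difference.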
It’s always pissed me off that HDMI came into existence. USB handles much more throughput than HDMI does. HDMI sucks balls, and only came to be because of DRM, which pisses me off more. With USB 3.0, I can just imagine what kind of content we could have if they had been smart and just used the already existing USB standard instead of HDMI.
HDMI isn’t a new standard. It’s just DVI with a more consumer-friendly connector and support for audio. All the copy protection stuff also runs on DVI. So get your facts right before you start flaming.
DVI and HDMI are the same thing, with different connectors. Both pointless, stupid technologies.
You don’t want to bash HDMI, but rather HDCP:
http://en.wikipedia.org/wiki/Hdcp
Also, before deeming HDMI and DVI “pointless”, beware that the transmission bandwidth is 3.7 Gbit/s for DVI and 7.4 Gbit/s for dual-link DVI, which is required for most 30″ displays. How would you produce dual-link-DVI-level bandwidth with USB 3.0?
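Rough math to back that up (ignoring blanking overhead, and treating the figures as approximate): a 30″ panel’s native 2560 × 1600 at 60 Hz and 24 bits per pixel needs 2560 × 1600 × 60 × 24 ≈ 5.9 Gbit/s, more than a single DVI link carries, hence dual-link. USB 3.0’s projected ~5 Gbit/s raw signalling rate would fall short of that even before protocol overhead.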
I’m surprised no one’s raised this topic yet: is USB 3.0 going to be an open or closed standard? Is Intel going to lock it up as a proprietary interface, protected by a wall of software patents, non-disclosure agreements, DRM, license fees and a requirement that you use Windows Vista? Or will it be totally open source?
Inquiring minds want to know.
The reason there are no USB cables with new printers is kind of historical: in the past, Macs had different printer ports, so that was an excuse not to provide any cable in the box (as the supplier could not guess which interface you’d use).
Re: USB 3.0 – logically thinking, there must be at least one copper pair, as how else would you power devices like pen drives? I doubt it would be by internal battery.
Once Macs and PCs came standard with USB, what was the excuse for not including a USB cable with printers that only had a USB connector? It’s not like the days when there were Mac-specific printers, or kits to use PC printers with Macs.
Honestly, this sounds stupid. Fiber-optic cables are usually pretty fragile. Not only that, but you can still get similar speed on a regular cable. The only reason to have fiber optics is for long-distance connections, where the limits of regular cable become noticeable.
Sorry if this has been mentioned before, but it’s itching me and I have to say it…
Keep backwards compatibility, but please adopt the Micro and Mini plugs more than the normal ones. The standard ones are becoming a little too big nowadays. Imagine how nice a USB flash stick would look with a micro/mini plug.
I’m guessing a mini/micro plug would be way more fragile on your system. Smaller shallower connectors == more chance of breakage somewhere. Not saying any connector is great.
As for fragile, what is up with the SATA power/data cable plugs?