When Intel unveiled its Light Peak optical interconnect (video) at IDF earlier this week, many noticed that the demonstration computer used to show the new technology was in fact a hackintosh. Well, thanks to Engadget we now know why: Apple is very, very involved in the conception of Light Peak. Let’s take this opportunity to look at some of Apple’s other connection standards from the past.
Apple sees the light
Many assumed that Intel was the main driving force behind the creation of Light Peak, but as it turns out, it’s actually Apple who is behind the whole idea. Of course, optical interconnects are anything but new (fibre-optic communication stems from the 1970s), but Apple brought the concept of Light Peak to Intel. Intel then developed it, and showed it off earlier this week. On a hackintosh. Mystery solved.
According to documents seen by Engadget, Apple first brought the concept of a super-fast optical interconnect to Intel in 2007, with the goal of replacing the several standards we use today (USB, FireWire, various display ports). In fact, initial conversations and disagreements took place directly between Steve Jobs and Paul Otellini, CEO of Intel.
Apple has grand plans for the technology, according to Engadget. It will start off with the ability to daisy-chain several peripherals into a single Light Peak port, with later plans involving Light Peak replacing every other port. In other words, all your networking, peripheral, and display needs will go through (a) Light Peak port(s).
The technology will arrive sooner rather than later, according to Engadget. The first Macs to have this new technology on board will arrive in Autumn 2010, and a low-power variant will arrive in 2011, destined for mobile equipment. While the full plans are not entirely clear, it looks like Apple is pushing hard for a single-port solution for all peripherals and communication needs. They could even skip USB 3.0 altogether (or at least not emphasise it too much). I do wonder, however, how Light Peak will send power through its cables, something USB, for example, is able to do. CNet is reporting that Intel plans to add copper wires to the cable for power transfer.
Apple and ports
Assuming this is all indeed true, this won’t be the first port standard introduced by Apple. In the mid-1980s, Steve Wozniak was looking for a project to work on, and someone suggested he create a new peripheral connection system. The story goes that he returned a month later with the Apple Desktop Bus, a bit-serial computer bus (its connector looked like a PS/2 plug) which could daisy-chain devices together.
ADB debuted in 1986 on the Apple IIgs and was used on Macs from 1987 until it was superseded by USB on the iMac in 1998 – however, Apple’s laptops kept using it internally until February 2005. ADB never really gained any traction beyond Apple’s platforms, although it was occasionally used by Sun, HP, and NeXT.
Another Apple connection standard was the Apple Display Connector, a proprietary modification of DVI, which combined digital & analog video, USB, and power signals all in one cable. It was introduced in 2000 on the PowerMac G4 and G4 Cube, and was used to connect to the Apple Cinema Displays of the time. In July 2004, the new Cinema Displays with an aluminium shell and DVI connector appeared, which sounded the death knell for ADC. The last Mac to ship with an ADC connector was the single-processor PowerMac G5, which stopped shipping in June 2005.
A standard which didn’t die, but never gained the traction Apple had hoped for, is FireWire. Apple initiated the project in 1986, but the standard itself was developed by the IEEE P1394 Working Group, with major contributions from other companies like Texas Instruments, Sony, DEC, and IBM. While FireWire “lost” the battle with USB, it is still used in a lot of specialised markets.
Every now and then, people will claim Apple invented the USB standard, but this is a misunderstanding. In fact, Apple played no role in the inception of USB, which was developed by Compaq, Digital, IBM, Intel, NEC, Northern Telecom, and Microsoft. The first specification, USB 1.0, was completed in 1996. Some do credit Apple with popularising USB by switching to it on the very first iMac.
As you can see, Apple has quite the history when it comes to these matters. Let’s hope they are able to make Light Peak a success, as the promised speeds and daisy-chaining abilities sound too cool not to have.
I like the idea of a cheap fiber optic cable replacing my ethernet cables in my home, but lord knows how much this is going to cost. If the plan is to really use it as a network cable then I expect a big push in Airport development unless Cisco/linksys and the like get on board with this idea.
The idea of daisy chaining I don’t think has been successful. The last time I really daisy-chained anything was SCSI devices: my PC only had 1 port, but I had a JAZ, a ZIP, and 2 external HDs all connected together with adapters going through that 1 port. I haven’t really seen anyone else do it either. I’m not knocking the idea, but maybe they should put lots of ports on the monitor so the consumer sees the monitor as a hub.
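The SCSI setup described above is the essence of daisy-chaining: each device passes the connection along, so one port on the host reaches the whole chain. A minimal sketch (the device names are just the ones from the example above, nothing Light Peak-specific):

```python
# Toy model of a daisy chain: each device has a pass-through port,
# so a single host port reaches every device in series.
class Device:
    def __init__(self, name):
        self.name = name
        self.downstream = None  # next device in the chain, if any

def daisy_chain(first, *rest):
    """Link devices in series and return the head of the chain."""
    current = first
    for dev in rest:
        current.downstream = dev
        current = dev
    return first

# One port, four devices hanging off it in series (as in the SCSI example).
head = daisy_chain(Device("jaz"), Device("zip"), Device("hdd-1"), Device("hdd-2"))

# Walking the chain from the single port reaches everything.
reached = []
node = head
while node:
    reached.append(node.name)
    node = node.downstream
print(reached)  # ['jaz', 'zip', 'hdd-1', 'hdd-2']
```

The contrast with a hub (star) topology is that each device only needs to know its immediate neighbour; the practical drawback is that unplugging a middle device severs everything after it.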
I think really the goal here is for Apple to make laptops cheaper, smaller and cut costs across the board by moving to a single type of port that can handle all connections. With all the complaints about the MacBook Air only having one USB port, Light Peak could solve the connectivity problem in these ultralights by providing screen, USB, networking and more through just one socket. I certainly like the idea!
My FireWire disks are daisy-chained, and people I know that try to stay with FireWire HDDs do so too.
Rather than the feature being unsuccessful (it works just fine), it is the interface that offers it which has not gained traction. Pretty much every FireWire disk has 2 FW sockets just for that; pretty much no USB disk does. That is probably why you have not seen it.
The reason why you don’t see USB HDs daisy-chained is that, while the B in USB stands for “bus”, it is of course not a true bus topologically. That’s why.
Also, when you read transfer/access speed comparisons from ye olden days, FW always lagged substantially behind plain IDE setups, so there was really no point in having HDs with a FW interface.
Actually, daisy-chaining has been the preferred connection method in FireWire, and FireWire can even work hostless. But without Intel on board it always had sort of a niche market – a big niche, but it never really was as popular as the inferior USB.
Apple seems to be searching for a universal FireWire replacement, and this time they have Intel on board.
Unless the OEMs (Dell, Acer, HP) get on board with this, though, it will just be another niche market that fizzles out. Sony and Apple were the only 2 OEMs I remember really pushing FireWire, which is one reason I think it never caught on with the mainstream.
OEMs got on board with USB when x86 chipset manufacturers started putting support for USB into their chipsets. The most important chipset manufacturer that did that, and drove adoption for the PC masses, was Intel.
As Apple wasn’t a consumer of x86 chipsets when they came up with and later introduced FireWire, for quite some time the only option for OEMs was a separate chip to support it. That made it expensive to add at a time when prices were being driven rapidly downwards. There was also the mistaken idea that USB was “good enough”… Anyone used to FireWire who has ever had to copy multiple gigabytes off a USB drive knows that’s a bunch of crap, but marketing droids and folks that are extremely budget-sensitive just don’t get that.
Who was/is Sony using to make the motherboards that have their i.LINK ports on them?
FireWire was invented in the ’80s and was a standard before USB.
I remember the first time I saw firewire. Apple was demo’ing it at an educational conference in 1994. A $1,000 card and a $500 camera.
At that time it was only being promoted as a video streaming technology.
I asked the engineer doing the demo if it could be used for other technologies like printers and hard drives. “Why would anyone want to do that?”
The idea of having only one type of connector for everything seems really cool. If it catches on, there would be very interesting applications. For example, imagine an iPhone with a Light Peak port that you could use either for data transfer or to plug in a display and watch movies from the device. Or, if it becomes really ubiquitous, you could even use the same port to connect external speakers and listen to your music.
As for ADC, it’s really a shame it didn’t become popular (I don’t know if Apple licensed it to other manufacturers). The idea was that you plug all your peripherals into the display and have only the ADC cable going from your desk to the tower, thus eliminating cable clutter. Current Apple displays have USB and FireWire ports on them and a single cable that branches at the end into DVI, USB, FireWire, and power connectors, achieving pretty much the same thing.
It also had a downside. The power button on the display would shut down the computer.
This is an issue if you want to leave your computer running and only turn off the displays.
This depends on whether the power button also switches off the bus; if it does not, then this is a non-issue.
In FireWire setups you often have a mixed environment: some devices do, some do not.
But it is not that bad, because usually you only daisy-chain 2-4 devices together. On the other hand, all this saves you from having to keep a hub up and running with a handful of cables going into it, and you don’t need a central computer for device-to-device data transfer!
While those Apple Cinema Displays are great, the connector technology is quite poor IMHO. I have several computers at home: PCs with Windows and Linux, and also a Mac Mini. It is really painful to switch the cable from one computer to the next, as that multi-headed cable is not too comfortable to work with. And guess what? They also put the power adapter on one branch, so every time I want to move the display cable, I also have to move a power adapter and find a place for it. Damn, I hate this. And all because of the idea of having a single cable coming from the display.
I know that there are KVM switches for things like this, but I do not switch computers that often and therefore have not invested in one yet. Still, the cabling solution is a pain to use with Apple displays. Whose wonderful idea was it to hang a quite heavy power adapter off the display cable?
I never had to move my ACD between different PCs so I hadn’t thought of that. I am sure it is a pain to move it even if the PCs are a few meters apart.
If you don’t want to buy a KVM switch, you may want to have a look at synergy ( http://synergy2.sourceforge.net/ ). It’s basically KVM over the network.
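For anyone curious, Synergy is driven by a small text configuration that names the screens and maps their edges to each other. A minimal 1.x-style synergy.conf might look roughly like this (the hostnames here are made up):

```
section: screens
    desktop:
    laptop:
end

section: links
    desktop:
        right = laptop
    laptop:
        left = desktop
end
```

You then run the server on the machine that owns the keyboard and mouse and the client on the other, and the pointer slides between screens over the network.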
Apple is anything but the driving force behind Intel. Apple ain’t driving anything besides Apple.
“The Big Guys” are “big guys”, and Apple does not play in their league. Intel has enough engineers and software developers to found a thousand new Apples, so they don’t need somebody to drive their business. To take an example, it wasn’t a big tragedy for IBM that Apple decided not to use their processors any more.
The whole idea of optical interconnects has been in the wild since the 1970s. But I guess that for Apple fans, it was Apple who invented the wheel some thousands of years ago. Maybe it was invented by Steve’s great-great-granddaddy.
Spot on trolling.
Even with a workforce of a billion engineers, if your company doesn’t have a true visionary then nothing exciting is going to happen. Intel would probably be happy with USB 3.0 and won’t put resources into any technology with an uncertain future; they’ve been bitten by Rambus and Itanium.
As of today, AAPL’s market cap is $163B while INTC’s is $108B. They both have similar revenue. AAPL also does custom chip design, media, retail, software and hardware design and cell phones. Which of the preceding factors disqualifies Apple as one of the “Big Boys” as you so intelligently put it? Does the company in China that makes $0.02 capacitors count as a “Big Boy” since they make a critical component that goes into every electronic device?
Agreed. While Apple’s marketshare may not coincide with that fact, they are one of the leaders in the industry when it comes to pushing technology, and have been since their inception. They seem to take pride in being first whenever possible. They are definitely one of the “Big Boys”. Consider that regardless of how low their marketshare is, they are still profitable, at times more so than other companies who actually have far more marketshare. Apple is not wanting for resources; they have plenty at their disposal and can rally behind a technology and get it accepted in the larger technological community pretty much on their own (though history has shown that things don’t always work out that way).
Just this year alone, you have OpenCL, which all major GPU makers have rallied around, even forgoing plans for their own implementations of the concept. Who here can’t name something that Apple has introduced to the masses that is now ubiquitous? They may not always invent everything they introduce, but dammit if they didn’t make it popular.
Bull – at MOST Apple is an early adopter of technology they didn’t invent & which would have become just as popular if Apple weren’t around to “introduce” it. And if they deserve credit for that, it’s more than outweighed by the numerous examples where Apple was WAAAY behind everyone else (PCI, AGP, IDE, USB2, etc).
And when they actually HAVE introduced technology (rather than just jumping on someone else’s bandwagon), that technology has ALWAYS failed in the long run (NuBus, ADB, ADC, Firewire, etc).
Not true. Even for tech they didn’t introduce, they’ve had a lot of influence on its use.
USB is actually a very good example. Neither Intel nor MS could get it working properly, nor could they get manufacturers to produce much product for it.
It was dying.
When Apple introduced the first iMacs, with USB instead of their own technologies, they got it working well for the first time. Manufacturers began to produce many USB products, many in iMac colors.
This inspired MS to finally get USB working in 98 SP2.
They had promised it would work with 3.1.
Even with FireWire, despite what is said in the article, Apple was one of the developers of the standard.
I love the way you make it sound like the iMac magically revived USB, an otherwise “dying” technology. Despite the fact that the USB specification was first released in 1996 and the iMac only came out in 1998, and despite the fact that USB was included in many PCs (particularly laptops) released at the same time as (and even earlier than) the iMac.
At best, the iMac caused a minor spike in the sales of USB devices – mainly because it was crippled (even by Apple standards of the time) by a lack of any other connectivity options.
Uh, no – try Win95 OSR2.
Are you high? When did they ever promise USB support in Win 3.1? They had been selling Windows 95 for almost two years by the time the USB standard was even finalized.
My FireWire audio interface says different. I can also name many attempts where Apple has succeeded. What’s your point? They are still usually ahead of the curve while other manufacturers are still deciding if they should remove the PS/2 port from their PCs. At least Apple tries something different; they may not always succeed, but they put their best foot forward. When was the last time Dell did anything innovative, or even different from any other OEM on the market? We can all name the most influential tech companies on one hand; Apple is part of that group, along with IBM, Intel, and even MS.
Firewire failed? really? You might want to let all the pro audio and video guys know. They need to stop using it I guess.
Considering that Apple was pushing it as an alternative to USB 2.0, yeah, it failed. Even Apple admits it: how long has it been since you could buy an iPod with a FireWire connection? And it’s only a matter of time before Apple drops FireWire support entirely, as they inevitably do with their failed technology (effectively giving the shaft to everyone foolish enough to adopt said tech).
Try reading the article next time. No one is saying Apple is the driving force behind Intel. That would be foolish.
However in this particular case with Light Peak, they are the driving force. Big. Difference.
I wonder if they could use filtered light on the same optical channel… red, green, and blue. 3 times the bandwidth.
There’s such a thing as wavelength-division multiplexing. You can run dozens of channels of anything over one fibre, each on its own wavelength.
It’s more involved than simply filtering three colours of light, but it’s also common.
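What the comments above are circling is wavelength-division multiplexing: several independent channels share one fibre, each on its own wavelength, so aggregate capacity is just channels times per-channel rate. A back-of-the-envelope sketch (the figures are illustrative, not anything Intel has announced):

```python
# Aggregate capacity of a wavelength-multiplexed link: each wavelength
# ("color") carries an independent channel, so capacity adds linearly.
def aggregate_gbps(channels: int, gbps_per_channel: float) -> float:
    return channels * gbps_per_channel

print(aggregate_gbps(3, 10))   # red/green/blue idea above: 3 x 10 Gb/s = 30
print(aggregate_gbps(40, 10))  # a dense telecom WDM system: 400 Gb/s
```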
Again, Apple’s going to hang themselves (or rather, their users) with another cable.
I can’t stand changes just for the sake of changes. This might be a nice marketing gimmick (oooh, 10Gb – pfft..) but who needs it? Show me a $100 mouse with fiber-optic connection and I’ll show you an idiot. A USB slot is just over a centimeter wide, and that’s apparently TOO big to put more than one? What a joke, go get a hub dummies. And Apple’s history of socket bastardization doesn’t make me optimistic either (remember the USB connectors with the Apple-Only notch?)
So let’s hit the points:
…”it daisy-chains”…
So does a friggin AC extension cord, what else ya got?
…”10-G”…
10G to _what_ exactly? A monitor? That’s about it. So apparently HDMI and VGA are replaced with an expensive breakable optical fiber – super! And it’ll probably be a Mac-only monitor as well – super duper!
…”only need one plug”…
Great! So when the ONLY plug breaks, the whole box is unusable – wheeee! Tell me, where do I insert this non-optical USB flash drive – oh, into yet another dongle adapter? Wooo, we’re having fun now!
The last cable Apple threw at its users was Firewire, and that was a painful process – not necessarily for the users, but for us in tech support. Having to say over 1000 times: “No, that’s a USB plug – you need a Firewire plug. Don’t ask me why, you just do. No, I don’t have one because I don’t need it. You do. Don’t ask why.”
I’m not against Firewire, but with so few devices that use it – it’s never really been a meaningful feature to me. I mean, it’s nice – if you have something that uses Firewire which are few and far between for PC users. And once USB 3’s commonplace, there really won’t be a need for it.
So get ready, all you 10GBps-speed-typists! Apple’s making a completely overpriced overblown fiber-optic keyboard just for you! You can even daisy-chain a noose around your neck..