Sometimes, you wake up in the morning, check your RSS feeds, and you know you just hit the jackpot. From the AT&T archives comes a video and description of Blit, a UNIX graphical user interface from 1982, on an 800×1024 portrait display. It employs a three-button mouse, right-click context menus, multiple windows, and lots, lots more. It was developed at Bell Labs, and is yet another case that illustrates how the technology industry doesn’t work in a vacuum.
Blit isn’t content with just being a graphical user interface with multiple windows and right-click context menus. The windows can also overlap, and the system employs, true to its UNIX base, full multitasking. The video below demonstrates how you could edit code in a text editor, switch to a different windowed terminal to execute make, and play Asteroids in another overlapping window while the code compiled. Further along, the video shows an application for integrated circuit design with a debugger running in a separate window.
It ran on a Motorola MC68000 processor in 1982, but was later ported to the Western Electric WE32000 microprocessor so it could be used on commercially available machines like the AT&T 5620, starting in 1984. It was created by Rob Pike and Bart Locanthi, and detailed in this paper on the subject. For instance, this is how windows (or layers, as they call them) were implemented:
The operating system is structured around asynchronous overlapping windows, called layers. Layers extend the idea of a bitmap and the bitblt operator to overlapping areas of the display, so a program may draw in its portion of the screen independently of other programs sharing the display. The Blit screen is therefore much like a set of truly independent, asynchronously updated terminals. This structure nicely complements the multiprogramming capabilities of Unix, and has led to some new insights about multiprogramming environments.
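The layers described above build on the bitblt primitive: copying a rectangular block of pixels from one bitmap to another. As a rough modern sketch (illustrative only, not the Blit’s actual code; the bitmap representation here is an invented simplification), the core operation looks like this:

```python
# Illustrative sketch of bitblt, the primitive the Blit's layers extend.
# Bitmaps are represented naively as lists of rows of pixel values.

def bitblt(src, sx, sy, dst, dx, dy, w, h):
    """Copy a w-by-h pixel rectangle from src at (sx, sy) to dst at (dx, dy)."""
    for row in range(h):
        for col in range(w):
            dst[dy + row][dx + col] = src[sy + row][sx + col]

# A tiny 1-bit "screen" and a sprite blitted into one program's rectangle.
screen = [[0] * 8 for _ in range(4)]
sprite = [[1, 1], [1, 1]]
bitblt(sprite, 0, 0, screen, 3, 1, 2, 2)
```

A layer, in the paper’s sense, extends this idea so that each program can bitblt into its own rectangle of the shared display even when that rectangle is partially obscured by another layer.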
Interestingly enough, the goal behind creating Blit was to make graphical user interfaces run on less powerful, cheaper hardware than the Alto. “The original idea behind the development of the Blit hardware was to provide a graphics machine with about the power of the Xerox Alto, but using 1981 technology (large address space microprocessors, 64K RAMs and programmed array logic) to keep size, complexity and particularly cost much lower,” the paper notes. This may remind you of some other projects developed in around the same timeframe – you know, independent groups coming to the same conclusions because they’re working within the same constraints set by the available hardware and software.
The Blit was connected to a timeshared host through RS-232, but this wasn’t a big problem due to its relatively powerful hardware. It was also small and “portable”, and could even be used on 1200 baud connections from the engineers’ own homes; not as smoothly as on higher-speed connections, of course, but “a Blit at 1200 baud is much nicer than a regular terminal at 1200 baud”.
Despite being a relatively early graphical user interface, it already sports some interesting ideas. For instance, inactive windows receive a dotted texture to indicate they are not the currently selected window. Furthermore, Blit doesn’t have focus-follows-mouse:
One decision which differs from the usual, but in which we are on firmer ground, is our requirement that a mouse button hit changes the current layer. In most systems, the location of the mouse defines the current window, but when the current window may be partially or even wholly obscured, this is unworkable.
The conventions around mouse operations are also incredibly fascinating. Blit uses a three-button mouse, set to the following conventions:
The mouse has three buttons, and the Blit software maintains a convention about what the buttons do. The left button is used for pointing. The right button is for global operations, accessed through a menu that appears when the button is depressed and makes a selection when the button is lifted. The middle button is for local operations such as editing; a simple view of the situation is that the right button changes the position of objects on the screen and the middle button their contents. For example, pointing at a non-current layer and clicking the left button makes that layer current. Pointing outside the current layer and pushing the right button presents a menu with entries for creating, deleting and rearranging layers. Clicking a button while pointing at the current layer invokes whatever function the process in that layer has bound to the button; the next section discusses a couple of programs and how they use the mouse.
The mouse cursor is modal; it changes its appearance depending on the actions one can perform. “For example, when the user selects New on the mpxterm menu, the cursor switches to an outlined rectangle with an arrow, indicating that the user should define the size of the layer to be created by sweeping the screen area out with the mouse,” the paper explains. “Similarly, a user who has selected the Exit menu entry is warned by a skull-and-crossbones cursor that confirmation is required before that potentially dangerous operation will be executed.”
There’s a whole boatload of interesting stuff in the paper about Blit, and it’s definitely recommended reading – easy to read, too. There’s also a detailed FAQ on the AT&T 5620, the commercially available machine from 1984 which ran Blit.
This is once again more proof that the industry heavily studied, implemented, and experimented on graphical user interfaces in the ’70s and early ’80s. It also shows that, unlike what some want you to believe, it wasn’t just one company that saw the value in bringing the graphical user interface to the masses – even UNIX people at Bell Labs saw its potential and ran with it. You wouldn’t believe it from reading about entitled corporations competing in courtrooms, but it almost seems like this is how the technology industry has always worked.
Truly fascinating stuff, and you owe it to yourself to dive into it.
Okay, so the Lisa development began in 1978. This also looks more like Windows 2.0 than a modern GUI, as seen with the products from those companies that you choose to discredit.
Nice find though.
Edit: it’s also worth noting that this is just a terminal. It’s not on the same level as the Xerox machines – they were independent workstations. This is more like a dumb X terminal. Not to belittle the achievement, but from what I can tell, you needed a mainframe in a server room to make the terminal do anything useful.
The video states that it has a 68000 processor and 256 kilobytes of RAM; that is hardly dumb-terminal hardware (same processor as the original Mac and twice the RAM, two years before the Mac was released). The question is how it is driven, but if applications on the mainframe can upload fairly general programs to the terminal to run and communicate with now and then, it is very much a hybrid system.
And? What do you think generates the bitmapped graphics? All terminals require some kind of processor; they don’t just work by magic. Indeed, a lot of the Citrix/Windows terminals you see these days are ARM-based. The processor runs the basic protocol drivers for input, comms and UI; the clever stuff is done on the mainframe.
Dumb terminals are defined by not having programmatic capabilities; it is likely that this terminal does have them, given its advanced hardware.
Yes, the Blit had the ability to download programs from the main computer. For example, Rob Pike’s text editor, sam, could download its GUI portion onto the terminal; then the only communication needed was the text being edited, instead of the pixels to render to the screen. This sort of thing was what made the terminal usable on a 1200 baud modem.
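A back-of-the-envelope illustration of why that saving mattered (the figures below are assumptions for illustration, not numbers from the paper): compare shipping a full bitmap of the 800×1024 screen over a 1200 baud line against shipping just the edited text.

```python
# Illustrative arithmetic only; assumed figures, not from the Blit paper.
SCREEN_BYTES = (800 * 1024) // 8   # 1-bit-deep 800x1024 display = 102,400 bytes
LINE_BYTES_PER_SEC = 120           # ~1200 baud, roughly 10 bits per byte on the wire

full_redraw_secs = SCREEN_BYTES / LINE_BYTES_PER_SEC

edit_bytes = 40                    # assume one changed line of source text
edit_secs = edit_bytes / LINE_BYTES_PER_SEC

print(f"Full bitmap over the line: ~{full_redraw_secs:.0f} s")  # minutes, not usable
print(f"Just the edited text:      ~{edit_secs:.2f} s")         # effectively instant
```

With the GUI logic resident on the terminal, the line only ever carries the short, text-sized messages, which is what made the 1200 baud home connections tolerable.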
Cool… so it really is the original graphical “thin client”.
What advanced hardware? It has 500KB of RAM and a 68000 processor. It would need that kind of power to generate the bitmapped graphics. Look at your average Citrix/Terminal services hardware client, aka “thin client”:
http://www.igel.com/uk/products/hardware/ud2-lx-multimedia.html
So this one, I randomly picked from a google search, has a Cortex A8 1Ghz processor and 1GB RAM. All it does (pretty much) is connect to a Terminal server/Citrix farm or VMWare virtual server and serve a remote desktop. It has an embedded Linux OS, but all that does is provide the interface to choose/log on to the provided clients and then run the client full screen.
When they say “terminal” they mean “terminal” in the true sense of the word. Just because it has a graphical UI doesn’t make it any less the grandaddy of thin clients.
Dude, you’re comparing apples to oranges here. This is 1982 we’re talking about. A 68k and 256k of memory was pretty huge back then; as vaette noted, the same CPU and twice the memory as the Mac that was released two years later. It’s like saying a PC with a Core i5 and 16GB of memory is a low-spec dumb terminal by today’s standards.
That being said, whether something is a dumb terminal or not depends mostly on the ability to run application software on the machine itself, not by the grunt in its hardware.
There’s not a single bit of discrediting going on here. I’m only trying to make clear how this industry works: independent groups of people working within the same constraints coming to the same conclusions, building upon one another, as opposed to how some people think – namely, that companies work in isolation, coming up with everything all on their own.
It seems to me that the only person discrediting anybody’s work is you with your comment, trying to downplay Blit to make other products appear more advanced.
Well no, Thom. No discredit here. The product was in development in 1982, but look at when the final product was released: 1984. So no need to argue. As I said, the Lisa was in development in 1978. You’re very right… in the late ’70s there was a lot of buzz about GUIs.
Again, this is a fine achievement, but it’s no Xerox Alto. It very much reminds me of screenshots I’ve seen of Plan 9, though I think that window manager tiles.
No reason to assume that Lisa development began before that of BLIT.
If so, it was well ahead of its time.
Not really. What is much more significant is that this is yet another early GUI. It further establishes the fact that there were a lot of pre-Apple GUIs.
I used to have a terminal at home, just a plain text one. It could dial in using a 2400 baud modem to my employer.
I only did it a few times, just for fun, because we used Solaris and for some strange reason the keyboard didn’t have a | key, which rather sucks if you use a UNIX system.
Or is almost every post now some underhanded or sly slap at Apple? Or am I reading too much into this?
Considering there’s not a single negative mention of Apple in this article, your comment is a bit off-topic.
“It also shows that, unlike what some want you to believe, it wasn’t just one company that saw the value in bringing the graphical user interface to the masses”
“You wouldn’t believe it from reading about entitled corporations competing in courtrooms, but it almost seems like this is how the technology industry has always worked.”
I mean the only company I hear you talking about in these terms is Apple. I guess I am just seeing fire from smoke? Good article and interesting besides the sly bashing.
So, stating simple facts is now considered bashing.
Okidoki.
If you use Linux you probably bash too.
If it is fact (which remains in dispute), it doesn’t need to be salted into every article posted. (Not every article, but you get my drift.)
Things like this can stand alone. Like I said great and informative article otherwise.
I sure do; and it smells.
Like the truth.
There is a typo in the hyperlink to ‘some’ (that should lead to Wikipedia page of McIntosh).
[just_kidding]
I’m sure you did it on purpose!
[/just_kidding]
Very interesting article, thanks Thom.
Just ignore the last bit of the article, enjoy the rest + video and you’ll be fine.
Even if it’s just a terminal, it’s a rather nifty one and having GUI access to a multitasking operating system was pretty cool at that time.
Many years later the first generation of Linux users didn’t do much more than run a few terminal windows in an X session either.
Given GNU/Linux was pretty much a complete rewrite from the ground up, it’s not all that surprising.
There are people around who still do this.
Mutt, for example, is a very powerful console email client; it’s very easy to have it running in a console window while doing other CLI stuff.
On the first Mac I used, I ran only some terminal program and the 30-day Netscape version.
Oh sorry, you meant out of choice. I thought you were making a swipe at Linux development.
Yeah, I basically run X just to run multiple terminals (though I typically use the tmux terminal multiplexer with tiled sessions – so technically that’s only one terminal emulator). I’m not a great example of a typical desktop Linux user though, given I’m a UNIX administrator by trade.
I used to have a number of terminal windows open until a guru noticed that and told me the secret of “screen”.
This was great, now I only had one terminal in which I had 9 or so “screens”, detach my session, go home, reattach.
http://www.rackaid.com/resources/linux-screen-tutorial-and-how-to/
Yeah, I know how to use screen. Like I said, I use Tmux – which is based upon screen’s code but developed into something much more advanced.
How long have you known this and why did you never bother to tell us???
lol I assumed you knew to be honest.
There’s some great pages online with Tmux config snippets too.
Tmux (and GNU Screen as well) has saved my bacon on a number of occasions.
What killed the Linux desktop:
Laurence assuming we already knew everything.
I’ll check out Tmux!
It’s not you… the bashing is obvious, especially in the phone area. Most have never owned an Apple product, but they just KNOW the junk they bought is superior – just ignore the malware and random freezing. I used to be like them, avoiding and bashing Apple. After using Apple products, I can appreciate a quality, stable design.
Feel free to vote me down now :}
Just because you used to bash Apple without having taken a deep look at their products first does not mean that everyone else does so…
The reason I say this is that I have met more than one Apple fan who cannot grasp the concept that even after using the company’s products, I can still prefer alternatives for their respective strengths.
Not idiots, even – they are all very smart people who are very competent in their respective areas of expertise. It’s only when it comes to computer-related discussions that they start to exhibit this weird religious faith that if you don’t like Apple products, you must not have used them yet, period.
It’s a bit like LaTeX advocates, only worse.
Great work Thom. Keep unearthing such early works in the computing industry. Despite all the muscle of the myth making machines of some corporations, the truth must ultimately prevail.
He didn’t exactly unearth this… given it was an article on the Verge and it was on the Verge’s RSS feed. I saw it there first too.
Aside from the technology vacuum, I think there is also a musical vacuum. I am pretty sure the music during the end credits of this film was the basis for Chocolate Rain (YouTube it if you don’t know).
This is becoming hilarious…
Did you know, Thom, that only the most polished products remain on the market? Hint: Mac OS X and iOS.
Regarding the past:
Atari rediscovered UNIX in 1992.
NeXT rediscovered UNIX in 1988.
Microsoft rediscovered VMS in 1993.
Xerox rediscovered Douglas Engelbart in 1971.
WHAT IS YOUR POINT?
The point is that the most *polished* products remain on the market (except for Microsoft Windows – it is a glitch in the system, but it should soon be eradicated).
I’m confused as to whether that (above) is a “bad thing”. But if it is, let me throw you a curve ball: both iOS and Mac OS X are based on the original NeXTSTEP OS and are therefore… based on UNIX.
Well, that explains why Linux is still in the market then.
Mac OS X and iOS basically *are* NeXTSTEP. Apparently you weren’t aware of this…
And much of VMS architecture has found its way into the NT kernel, which powers every single Windows PC machine on the market today.
Especially coming from someone who doesn’t even take care to check his facts.
Why do you think that? That I wasn’t aware of this? Or that Microsoft brought Dave Cutler and half of his team over to make NT?
apparently you did not understand me.
What I want to say is that there is nothing new:
As Thom is trying to say, “look, there was work on GUIs before the Mac” – of course there was.
The same can be said of NT, which is “based” on VMS (NT could have been a DEC product anyway),
or NeXT, which is based on UNIX,
or DOS, which is “based” on CP/M,
or GEM, which is based on GSX,
…
There is nothing brand new in this world; ultimately, everything comes back to TASTE. That’s what separates successful products from unsuccessful ones (except in the case of Microsoft Windows).
http://www.youtube.com/watch?v=DAc05PeNAuU
Knowing that there is a huge number of people who don’t think so, maybe it’s not a bad idea to say it in the articles 🙁
It was good enough for Pike back in the day.
*hugs his Honeywell opto-mechanical 3 button mouse*
…but I think a few people here have misunderstood what Thom intended with this article (so maybe it’s me that has misunderstood)
Basically, we have a machine that is not a PC but a terminal, so no one is about to go home with one, but that’s not the point.
However, the ideas behind the terminal are very similar to what the Lisa and then Mac teams came up with at Apple (yes, I’ll mention the company that dare not speak its name).
Apple and Bell had obviously both seen what Xerox was doing (and I am sure other companies had too), and they worked in parallel with each other, coming up with similar ideas.
Personally, I prefer the name Layers to Windows (maybe MS has dirtied the word).
I also love the idea behind the mouse, having each button used for local, global or pointing.
I wonder how many other projects like this will be unearthed in the coming years; surely these aren’t the only two that saw what Xerox was doing? I don’t count MS, sorry (they had a more indirect link to Xerox).
Some in this thread are comparing apples to oranges, almost literally ;-). This was an exercise in network transparency, multiprogramming and multiuser operation, using a GUI. The point of this project was not “just to create a GUI.”
There is also the misconception that PARC was the sole originator of the concepts behind graphical user interfaces. In reality, other companies and projects had already been there and done that by the late ’70s, not just Xerox. The “mouse,” for example, predates PARC. The tragedy for Xerox is that they had the majority of the key components that would define computing and office automation in the ’80s and ’90s (the GUI, local area networks, and several key programming paradigms) ready in-house by the late ’70s.
The GUI was inevitable by the early ’80s, with or without the Mac or Lisa.
Back in the ’90s I worked at a bank and we had a few graphics monitors for our mainframe. I made a pseudo Windows 3.1 OS on the “dumb terminal”. I say pseudo OS because it wasn’t an OS at all but a simple program that looked like W3.1, and I could click on one and then another window and have the focus change between them. I also set it up so that if I clicked on File -> Open and then a file name from a list (only one worked), text would appear in the window, making it appear that it was an editor.
The reason for this was to prove to someone that you could make any computer, even a mainframe, look like Windows 3.1 or any other OS if you wanted to. It was just a matter of taking the time to program it. Obviously a BIG job for lots of people to do this. But anything can be done if someone puts their mind to it.
PS: Yes they did think I had ported Windows 3.1 to the mainframe. They even went around telling people I had done this. Of course all the programmers in our group knew that I didn’t have the time or resources for this and that I had done this only to fool this one person.
First I want to say I’m not one of the people that think that only one company comes up with ideas that are similar.
People do think that Apple stole IP from Xerox PARC. The truth is that Xerox was asked by Apple if it was OK with them. They made a deal where Xerox got Apple shares in exchange for Apple using as much IP as they wanted.
Note that a few years later Xerox sold that stock for over $16 million. Keep in mind that this was in the mid ’80s when $16 million meant something to companies.
Nevertheless, Apple did not originate the GUI, and several companies (not including Xerox) were developing GUIs prior to Apple.
The only items that Apple truly contributed to the GUI are the trashcan and, perhaps, Exposé and the bounce-back moment of bounce-back scrolling. Note that all of these Apple items are decidedly optional.
There were also other pre-Apple GUI players, other than Xerox. As I recall, there was significant excitement about GUIs in computer magazines, before 1983.
Most notable in regard to the Blit is the Three Rivers Perq. The Perq first appeared (and was marketed) in 1979. Of course, its development began long before it was released. Here is a video about the Perq from 1982: http://www.youtube.com/watch?v=xOD4T442X6I
In light of the Perq, it is interesting that the later Blit was originally called the “Jerq.”
There are even overlapping windows after 4:13.
Thanks for the link to the video, which is fascinating, not just because of how early it was, but also the fact it was created by two final-year students who had “been working on this for a couple of months”. That’s pretty impressive.
The quote at the end is just great: “By the way, isn’t it somewhat interesting to think that these guys, these students who have made the program shown here, when they started computing education four years ago, the freshmen course in computing was a punchcard-based course in Forth.”
I am so tired of hearing how Steve Jobs single-handedly invented the personal computer AND the first GUI that I could scream. Steve Jobs stole every idea that he ever had. He was just a slick talking salesman. Thanks Thom for another piece that will educate the bleating mass of sheep out there who are part of the Steve Jobs personality cult. And BTW, Al Gore did not invent the internet!
I really wanted an Apple II back in the day, but had to settle for a Commodore VIC-20, then a 64. Apple pretty much did create the personal computer market. Yes, Steve Jobs was a sales guy; however, he knew something big when he saw it and helped push the market. The IBM PC couldn’t do anywhere near what you could on an Apple. The PC just took over due to a more open hardware platform, but it took about a decade to reach the same level.