Why did computers come to adopt the GUI as their primary mode of interaction, and how did the GUI evolve to be the way it is today?
I just love these tours down memory lane. It is fascinating to see how far things have come over such a short period of years. If you want a good insight into the development of the Macintosh and Lisa from an insider’s perspective, head over to folklore.org. You get to see what kinds of tricks people were playing to coax what they did out of relatively limited hardware.
was a visionary. amazing how some people can see technology in the future so accurately
He earned his Ph.D. in 1955.
That was an excellent write-up of GUI history. It cleared up a few of my misconceptions about the sequence of events. I hope that some readers of OSNews correct their constant rants of “Apple didn’t invent it” or “Microsoft didn’t invent it” (pick your own topic) or “(my OS) did it first.”
umm.. read again bub.. there are 2 different people. the guy in the 30’s was the inspiration for the guy in the 50’s.
Good call.
I feel the gui was adopted because humans have eyeballs that we use to see things. Sometimes we use our hands in conjunction with the eyeballs and that works pretty well for us.
The book Being Digital by Nicholas Negroponte has some interesting historical bits in about the computer/human interface (e.g., while many people believe Apple invented the mouse, it was actually an academic in the 60s IIRC). It’s a bit dated, but still interesting – and there’s the wonderfully-named section on computer-generated speech, “Teaching Computers to Talk Good”.
Where are the screenshots of Amiga????
“Where are the screenshots of Amiga????”
Hmmm. Did you miss this bit?
http://arstechnica.com/articles/paedia/gui.ars/5
It’s funny how a lot of people unfamiliar with just how much work and R&D has gone into the standard GUIs we use today are so quick to declare them ‘obsolete’ and pine for something radically different. I think what we have today is pretty optimal and will continue to be in use for at least a few more decades, mostly unchanged. We’ll see.
I found a really interesting site charting the history of various GUIs with tonnes of screenshots:
http://toastytech.com/guis/index.html
It will be interesting to see how the GUI progresses in the next decade.
The article stops right now, with modern GUIs. I would like it to talk a little about new experimental GUI trends in more depth, like vectorized, 3D or zooming interfaces à la Archy. I like the idea behind the latter: the idea of working on documents with only one set of commands that are all called the same way and can be expanded.
CONTIKI!
http://www.sics.se/~adam/contiki/
People actually think/thought that? wtf?
” I hope that the some readers of osnews correct their constant rants about Apple didn’t invent to MicroSoft didn’t invent (pick your own topic) or (my OS) did it first.”
I don’t think the argument is who invented it first (HP, Microsoft, Apple). The argument is who pushed it into our homes first? Who knew it had to happen the minute it was known? There is no argument there.
Rather good writing this, though maybe a bit shallow; it’s great reading for us young folks who didn’t live in those exciting times.
I’ve one gripe though:
Smalltalk was _not_ the worlds first object-oriented programming language.
Simula was.
Other than this, the article gets two of my thumbs up.
cool article. pretty good.
I have some MPEGs of Douglas Engelbart’s demo of the mouse, the *bug* (cursor), that weird 5-finger keyboard thing and the realtime video conferencing. Really neat stuff.
put those demos up somewhere!
As awesome as Raskin’s stuff is, it’s completely and totally alien to anyone who has used a computer, so I don’t see it getting widespread use. You see it all the time: the further something is from what the person is used to, the “worse” it is, and it’s rare that they will take even a fraction of the time to learn the new way of doing things.
A good example was iTunes. I tried it when it came out on Windows, and didn’t like it anywhere near as much as Winamp. Then I got an iPod. Since using two music players is kinda dumb, I switched over to using iTunes all the time. Fast forward a year, and now I find the Winamp way of doing things clunky and awkward, and the only Windows app I miss in Linux is iTunes.
Yeah… Missed out screenshots of Workbench 1…. Sorry!!
Ah, what a trip down memory lane! (Insert memories of the exciting day in 1983 that the family took a 2-hour trip to San Bernardino to buy a C64)
However, the article doesn’t do a really good job of explaining *why* the GUI has become the standard way of providing input to a computer. (I’m thinking of some articles I’ve seen posted here that “show” how a CLI is really a superior interface.)
—
Finally, Mattb, as an OS X user, this comment of yours made me giggle a bit ” and the only windows app i miss in linux is itunes.”
Yeah, may not count as much since I’m an ex-Mac user who has been lusting over OS X for the last year or two. But I can say that I genuinely didn’t like iTunes as much when I tried it, and it’s the only thing for which I can’t find an application I like as much in Linux.
As for GUI vs. CLI, pretty much every study done on it shows that the GUI is more efficient pretty much across the board. I’m not a designer, but from what I have read, interacting with a GUI leverages a different part of your brain than using a CLI. It makes it far easier to create patterns that let you do things “without thinking”, the same patterns (or “gestures”) that you come up with for things like walking and talking. You will still build them under a CLI, but it is much harder for the brain. A visual interface takes away some of the “translation” work your brain goes through when using/learning anything, which makes you better faster.
Of course, plenty of hardcore CLI users will chime in that archaic operation X is a heck of a lot easier from a CLI than a GUI. This is because the CLI is a far more abstract kind of interface, which allows for more abstract ways of operating it. A vim user can do more in one command line than a Word user could do in five minutes of playing with widgets. However, that’s outweighed by what I was talking about in the previous paragraph.
The last thing I want to mention is that, across the board, a CLI will always feel faster. That’s because your perception of time is subjective; if your brain is working harder, it will pay less attention to how much time is passing. When talking about computer interfaces, feelings are almost totally worthless for judging your efficiency. What’s (remotely) effective is judging how tired you are at the end of the day, which will give you a very rough and general idea. But nothing can beat a stopwatch.
I have come across a few people who still reject a GUI interface for the command line. The CLI is not always the most efficient interface. For instance, in HP-UX, why spend an hour or more recompiling the kernel by hand when you can load up SAM and get it done in a few minutes? I like to use both in my work personally.
Funny the way everything looks like ass up until OS X Aqua.
You mention NeXTSTEP, as you should, but don’t put it in the timeline bar at the end with the others. I’d fix it, but I think the better part would be to acknowledge that while Windows 3.1 and other OSes were hampered by crappy graphics, NeXTSTEP was nearly a decade ahead of everyone else.
NeXTSTEP has features that today’s GUIs still have yet to achieve, even OS X in some ways.
<<Yeah, may not count as much since I’m an ex-Mac user who has been lusting over OS X for the last year or two.>>
Give in to iLust. Listen to the siren song of the gently used G3 iMac and/or Mac Mini …
<<As for GUI vs. CLI, pretty much every study done on it shows that the GUI is more efficient pretty much across the board. I’m not a designer, but from what I have read, interacting with a GUI leverages a different part of your brain than using a CLI.>>
Oh, totally. Using a GUI is much more an associative task, while, generally speaking, using a CLI is a cognitive task.
<<It makes it far easier to create patterns that let you do things “without thinking”, the same patterns (or “gestures”) that you come up with for things like walking and talking. You will still build them under a CLI, but it is much harder for the brain.>>
Exactly, and since a GUI is built around principles and concepts, the basic skill set of one GUI transfers to another.
(e.g. it took me less than a day to find my way around OS X thanks to years of W98. My dad and brother have no problem making the adjustment when they come to visit.)
<<The last thing I want to mention is that, across the board, a CLI will always feel faster.>>
Well, that depends. As somebody with a real talent for transposing letters and numbers, the CLI is a real exercise in rereading to make sure I really typed what I think I just typed.
That and consulting the book to read what unique command I get to enter next.
Despite the fact that I do know my really basic command line commands (mv, ls, etc.), I find it faster to hit the GUI to accomplish the same tasks … which is why I find the lack of complete integration of GUI and OS so frustrating in Linux.
The Apple Lisa was not meant to be sold for $10,000 but for $1,000. But it became more and more expensive for several reasons, some technical (problems with unreliable floppy drives, Steve’s wishes for new functionality, and so on). The Macintosh was meant to basically replace the Lisa, also at a price of $1,000, but again it was delayed and delayed, and more RAM, 2 floppy drives and so on meant the price increased to just below $3,000.
/dylansmrjones
It’s interesting how bias shows in articles like this. For example, take this quote from the Windows 95 section.
Windows 95 introduced the concept of the Start Menu, from which all programs could be launched, and the Task Bar where all running programs could be switched between.
What he MEANT to write was obviously “introduced to Windows users.” I was doing both those things on my Amiga years before Windows 95 ever came out. Apple users also had similar capabilities long before Windows 95 shipped.
This bias also showed in the ordering of the “other OSes.” The Amiga was released before Windows 1.0, but he gives the impression of the opposite using ordering and certain phrases in the article. The Amiga 1000 shipped in July of 1985, while Windows 1.0 didn’t appear in stores until November 1985.
Overall, the article was biased to give the impression that Windows pioneered many things when in fact it didn’t. It’s not as blatant as some articles, but that just makes it worse since many people won’t be able to distinguish between the fact and the fiction.
It seems more like inaccuracy than a Microsoft bias.
For instance, the article seems to give credit for the “dock or shelf” or taskbar to Acorn, while the article shows one appearing in Windows 1.01 several years earlier, and it can also be seen here:
http://toastytech.com/guis/guitimeline2.html
It looks like this early Windows version can launch functions (floppy icon) and can hold running applications (Paint and spreadsheet{?} icons).
AnthonyC: I don’t think the argument is who invented it first (HP, Microsoft, Apple). The argument is who pushed it into our homes first? Who knew it had to happen the minute it was known? There is no argument there.
Perhaps. But that person would be Douglas Engelbart, who pushed it in the 60s, while Steve Jobs and Bill Gates were busy watching Jonny Quest.
The inspiration for the Start menu and taskbar came from the Apple menu and menubar. They just moved the global menubar to the bottom, made it show open apps, and had the Start menu do the same as the Apple menu.
Most Windows users don’t even realize the Apple menu was storing program shortcuts before Windows did with the Start menu.
I will try to get them up later tonight or tomorrow. They are on one of my drives. They were on some website I stumbled across like a year ago. I’ll look for it again.
Found it!
http://sloan.stanford.edu/MouseSite/1968Demo.html
very cool stuff.
The inspiration for the Start menu and taskbar came from the Apple menu and menubar. They just moved the global menubar to the bottom, made it show open apps, and had the Start menu do the same as the Apple menu.
The notion that the taskbar was derived from the Apple menu/menubar is conjecture.
In the first place, menus existed long before Apple incorporated them into their GUI. The article clearly states that Bravo (circa 1973) had a menu at the bottom of the screen, and menu items can be discerned at the top of each tiled window in the Xerox Star screenshot. These menus can be more plainly seen in this image of the Xerox Star from 1981 (two years before the release of Apple’s Lisa):
http://toastytech.com/guis/guitimeline.html
Secondly, application-launching icons also existed years before Apple started using icons and menus, so it is not a huge mental stretch to realize that these icons could be aligned and always visible, in a strip along the side of the screen (like M$ did in 1985). It is more likely that the taskbar was derived from icons rather than from menus.
Also, keep in mind that the Windows “Start” button first appeared in Windows 95 — the taskbar had already appeared in Windows ten years earlier.
Perhaps this “Start” button was inspired by the “apple-icon” menu, but it is interesting that the article shows the Alto file manager from 1973 with a list of “.RUN” programs and a “Start” button to launch them.
The whole point isn’t who did it first, it’s the fact that the article states that Microsoft did it first in Windows 95 when that is clearly not the case. The article is riddled with inaccuracies like that.
It’s not a bad article, and the mistakes aren’t glaring, but they’re still there. If you’re going to portray your article as history, be sure to do a little more thorough research and double-check statements like that.
“Why did computers come to adopt the GUI as their primary mode of interaction, and how did the GUI evolve to be the way it is today?”
Because the GUI was basically *FORCED* upon people by Apple and Microsoft and their supporters within the UI “community”.
The GUI was never really “adopted” by people in general, it was basically forced upon them.
You still see this attitude from the UI “community” especially where the Free Software/Open Source and especially Linux is concerned.
Notice how the UI advocates tend to go off the deep end when the Linux community informs them that it isn’t all that interested in their “ideas and suggestions” in regards to Linux design.
Ever notice how upset they seem to get over the fact that there *ISN’T* a company they can go running to to demand that Linux developers adopt their “suggestions”?
Of course, plenty of hardcore CLI users will chime in that archaic operation X is a heck of a lot easier from a CLI than a GUI. This is because the CLI is a far more abstract kind of interface, which allows for more abstract ways of operating it. A vim user can do more in one command line than a Word user could do in five minutes of playing with widgets. However, that’s outweighed by what I was talking about in the previous paragraph.
Well, that depends on the task. Every now and then, I run into tasks where “perl -pi -e …” saves me about an hour of point and click. In addition, the abstraction can really help with automation of complex tasks.
Vim and MS Word are a bad comparison, being two different application types for two different domains. Vim is also not a CLI. The usability problems with vim can be summed up by saying that vim makes it harder than is necessary to do the right thing.
However, MS Word has its own usability problems that are not well measured by timing clicks on widgets with a stopwatch. I’ve spent hours trying to reformat a document with conflicts between style definitions, paragraphs manually formatted as headings, and 100-word headings formatted as paragraphs, resulting in a mess in which trivial changes produced unexpected side effects. The usability problems with MS Word can be summed up by saying that MS Word makes it easier than is prudent to do the wrong thing. The problem is that the consequences of doing the wrong thing frequently don’t pop up in micro-level usability studies.
I remember being blown away the first time I read about the Alto, and how advanced I thought a lot of the late-60s and early-70s projects were. Pretty amazing stuff, especially since the general public thinks nothing really happened with computers till the early 80s.
I’d never heard of Douglas’s NLS though. I had no idea he invented the mouse. And that dude from the 30s should get as much credit as ol’ Babbage does. Definitely a visionary way before his time.
<<Because the GUI was basically *FORCED* upon people by Apple and Microsoft and their supporters within the UI “community”.>>
Really? Forced? Do you think that if businesses wanted DOS Mordorsoft would’ve made Windows?
Do you think that if businesses didn’t want it they wouldn’t have bought all those 512k Macs?
Forced … by overwhelming positive feedback of employees and by tinkering at home.
Hmm … which is harder … memorizing various DOS commands to navigate to and open a file, or the concept of how to navigate to it? Which is the less abstract?
<<The GUI was never really “adopted” by people in general, it was basically forced upon them.>>
Yes, at home, given the choice between doing things in DOS 5 & 6 or Windows 3.1, Windows won every time. Forced to the GUI … by nature, ultimately.
I didn’t have to call my dad every 15 minutes to ask some obscure Windows command.
The whole point isn’t who did it first, it’s the fact that the article states that Microsoft did it first in Windows 95 when that is clearly not the case.
I strongly disagree. Pioneers are rare and crucial. Carpetbaggers (Jobs/Gates) are inevitable. This article is about the history of the computer GUI — “who did it first.” If you are referring to who did the taskbar first, it seems that M$ wins the prize back in 1985.
The article is riddled with inaccuracies like that.
It’s not a bad article, and the mistakes aren’t glaring, but they’re still there. If you’re going to portray your article as history, be sure to do a little more thorough research and double-check statements like that.
I wouldn’t say that the article is “riddled” with inaccuracies, but some parts were “glossed over.” If you are referring to me as the one who “portrays your article as history,” please do YOUR research — the first to claim an “inaccuracy” in this article is me, at the top of this page. Also, please note that I have cited examples from other sources to support my claims.
What details and examples have YOU cited?
and we meet again….. 😉
Well, that depends on the task. Every now and then, I run into tasks where “perl -pi -e …” saves me about an hour of point and click. In addition, the abstraction can really help with automation of complex tasks.
Agreed, although it’s more black and white than it should be. Look at things like Automator in Tiger; making a good GUI for a truly abstract and flexible task is exceptionally hard, but not impossible. Since everyone is stuck in 80s paradigms and metaphors, though, such things are few and far between. I *always* have a desktop with three or four terminals open.
Vim and MS Word are a bad comparison, being two different application types for two different domains. Vim is also not a CLI. The usability problems with vim can be summed up by saying that vim makes it harder than is necessary to do the right thing.
Yeah, I know it’s not the greatest example in the world, but I was trying to get a general idea across. Vim lets you arbitrarily string pretty much any command together to make very complex operations. Vim’s usability problems also place a huge burden on the user’s memory, and make users far more error prone than necessary, but I was ignoring that 😉
However, MS Word has its own usability problems that are not well measured by timing clicks on widgets with a stopwatch.
Again, I was trying to find a generic example.
I’ve spent hours trying to reformat a document with conflicts between style definitions, paragraphs manually formatted as headings, and 100-word headings formatted as paragraphs, resulting in a mess in which trivial changes produced unexpected side effects.
There was a great Seth Nickell blog talking about design and user requirements being based on expectations formed by using existing products. (http://www.gnome.org/~seth/blog/allworknoplay)
“If I were commissioned by Microsoft to dramatically improve Office, my first step would be to position the project not as a next-generation Microsoft Office, but as a new product. I might even start with the Office codebase, but I sure as hell couldn’t work with the smothering mantle of user expectations that looms over Office. Done well, I think you’d largely displace Office in the market (assuming this was a Microsoft product, I don’t mean to imply that anybody could just make a better product and flounce Office in the market). So you are meeting the goals people have in using Office. What you’re not doing is slogging through trying to meet the specific needs people have of the existing software. If you do that, you’ll just end up writing Office again.”
The usability problems with MSWord can be summed up by saying that MSWord makes it easier than is prudent to do the wrong thing. The problem is that the consequences of doing the wrong thing frequently don’t pop up in micro-level usability studies.
Yeah, I agree, and I was presenting it as far more cut and dried than it is to make a point. Word does too much (far more than the average user needs), and is locked into the way it does it by user expectations. I would love to give Pages (http://www.apple.com/iwork/pages/) a try.
It may be conjecture, but look at the context. Windows 95 was a total OS 7 ripoff. I’m not saying that everything in OS 7 was invented by Apple, but it was like they took Win 3.1 and said “how can we make this look and act like a Mac?”. Most of their implementations of the concepts were vastly inferior (at times they made no sense whatsoever, like the spatial file browser). If we look at this in context, the Start menu looks like one of the many other things copied. There is even a Windows icon in it.
It may be conjecture, but look at the context. Windows 95 was a total OS 7 ripoff.
Consider the following context: In 1985, Windows 1.01 was the first OS to have a taskbar featuring icons that represent running applications.
http://toastytech.com/guis/guitimeline2.html
If such icons appeared years later in the OS7 menubar, who is ripping off whom?
I’m not saying that everything in OS 7 was invented by Apple, but it was like they took Win 3.1 and said “how can we make this look and act like a Mac?”.
Not sure what is meant by “look and act like a Mac.”
From 1988 through 1994, there were a lot of non-Apple OS players that acquired a look of finer resolution than Windows (NeXT, Amiga Workbench 3, PC/GEOS, QNX Photon, etc.):
http://toastytech.com/guis/guitimeline3.html
Nobody “invented” this finer look — it was merely an inevitable result of ever-increasing computer power.
Most of their implementations of the concepts were vastly inferior (at times they made no sense whatsoever, like the spatial file browser).
Perhaps. I don’t remember the spatial file browser. No doubt M$ has come up with some dogs, but Apple has had a lot of doozies, too.
You misunderstand me; all I’m saying is that in the specific case of Windows 95, they were really ripping off the Mac OS.
Apple never had a taskbar displaying running applications; the closest thing to that in Apple history is the Dock. It always had a “Finder menu” in the upper right hand corner. The taskbar definitely came from MS, as did things like the menu in the upper left hand corner of all windows (no clue what its name is).
But Windows 95 had a lot more than that. The Start button was one; a new instance of Explorer every time you clicked on a file was another. Control Panel differed from Apple’s control panels by removing the “s” at the end, and the Recycle Bin was obviously “inspired” by the Trash. The MDI was a way to have a Mac-like environment for an application. Hierarchical menus were done at Apple, but done better (Tog has talked about his design for the “buffer” for the hierarchies, and always points out how MS didn’t get it). Just in general, Win95 was a bit of a hack job from a usability point of view. Of course, looking at pre-Win95 non-Apple operating systems, it’s not hard to see they weren’t the only ones.
Now, I’m not trolling; MS has gotten a LOT better since then. But the ways it gets better are mostly by either dropping or severely modifying concepts introduced with Win95 that simply didn’t work well in a Windows system (how many people do you think kept the default Explorer behavior?). Of course, some things were done better at MS, like the full Recycle Bin icon not looking like it’s painful and *needs* to be emptied, or their mouse implementation (I can also dig up some stuff by Jef Raskin explaining why he went for one-button mice, why it was a bad idea, and what he would do differently now). I can harvest some AskTog links if you want; he still has a chip on his shoulder about the whole thing (they were mostly his ideas they were copying), and will revel in pointing out all the shortcomings of their “implementations”.
It may not be intentional, but the article is flawed.
For example, in the “other 1980’s GUIs”, we get, in this order:
VisiOn – the author states this came out in 1983.
Windows 1.0, announced in 1983, released in 1985.
Tandy DeskMate, released in 1984.
GEM & Amiga Workbench, released 1985.
Windows 1.0 should really be placed in 1985, but by placing it at the date of the announcement, it is given a 2-year head start.
Then, on the next page, a screenshot of Arthur is offered. The writer claims it was later renamed RISC OS, and continued to be used up until the late 90’s. This is the equivalent of showing a screenshot of Windows 2.0, and saying it later changed its name to XP and continues to be sold. Without a screenshot, and without knowing what XP looks like, one would imagine that Microsoft continues to sell something that looks like Windows 2.0. Incidentally, RISC OS is still being sold and developed.
In addition, for an article about the history of GUIs, very little is said about the way users interacted with the GUI. Why did different GUIs require 1-, 2- or 3-button mice? IIRC, the Amiga’s Workbench could have icons of any size – if so, why didn’t that feature filter into other GUIs? (I can answer that myself, but for an article about the history of the GUI it deserves a mention.)
Different drag and drop concepts should have received more mention – after all, they are one of the core features of a GUI, and a fundamental difference between GUI and CLI operation.
The history of Menus should also have been explored. As well as the Windows and Mac models, what about context menus and how they evolved and were implemented?
The whole point isn’t who did it first, it’s the fact that the article states that Microsoft did it first in Windows 95 when that is clearly not the case.
I strongly disagree. Pioneers are rare and crucial. Carpet baggers (Jobs/Gates) are inevitable. This article is about the history of the computer GUI — “who did it first.” If you are referring to who did the taskbar first, it seems that M$ wins the prize back in 1985.
So you admit that when you claimed Windows 95 did the start menu and taskbar first, you were only PARTLY right about the taskbar. You were wrong about the start menu and gave the wrong version for when the taskbar was done first.
The article is riddled with inaccuracies like that. It’s not a bad article, and the mistakes aren’t glaring, but they’re still there. If you’re going to portray your article as history, be sure to do a little more thorough research and double-check statements like that.
I wouldn’t say that the article is “riddled” with inaccuracies, but some parts were “glossed over.” If you are referring to me as the one who “portrays your article as history,” please do YOUR research — the first to claim an “inaccuracy” in this article is me, at the top of this page. Also, please note that I have cited examples from other sources to support my claims.
You just PROVED it WAS riddled with inaccuracies as I just showed above.
What details and examples have YOU cited?
I referred you to the fact that both the Apple Macintosh and the Commodore Amiga had the start menu long before Windows, and that the Amiga was being sold well before Windows 1.0. Two easily verifiable facts. If you miss such things, all the rest of your article is suspect.
So you admit that when you claimed Windows 95 did the start menu and taskbar first, you were only PARTLY right about the taskbar. You were wrong about the start menu and gave the wrong version for when the taskbar was done first.
What? When did I ever claim that the taskbar first appeared in Windows 95? The article made this assertion.
I did not write the article.
By the way, the article also stated earlier that, in 1987, the Acorn Arthur introduced “a “Dock” or shelf at the bottom of the screen where shortcuts to launch common programs and tools could be kept.” This dock looks a lot like the M$ taskbar which appeared two years prior.
You just PROVED it WAS riddled with inaccuracies as I just showed above.
My earlier statement about one inaccuracy does not prove that the rest of the article was full of mistakes. However, after reading later postings in this thread, I admit that it seems that the article has more problems than I first realized. Also, I don’t think the author intentionally misled.
I referred you to the fact that both the Apple Macintosh and the Commodore Amiga had the start menu long before Windows, and that the Amiga was being sold well before Windows 1.0. Two easily verifiable facts.
Okay. You gave two examples without details, references or links to verify them. I already know that the “apple” button lists applications (and has probably done so since the first Mac — I don’t remember). I do not doubt that you are right about the Amiga, but I cannot find a “start” button in any Amiga screenshots. Does one click on the title “Amiga Workbench” to list available applications?
Keep in mind that a “start” button does not constitute a taskbar, and that a start button that launches programs appeared in 1973 in the Alto file manager:
http://arstechnica.com/articles/paedia/gui.ars/3
If you miss such things, all the rest of your article is suspect.
The article is not mine — I did not write it.
Sorry about that… from the way you responded to posts from me and a couple others, it sounded like you were the author. My apologies.
As for the Amiga, the “start menu” was the Tools menu introduced in KS2.0. You could add your own tools (applications) to the Tools menu. As such, it wasn’t first, but it did predate Windows’ adoption of the feature.
I had a variety of applications I commonly used in the Tools menu, as well as a dockbar. The dockbar wasn’t part of the standard OS, but an add-on from Aminet (ToolManager).
Probably the best GUI of the time was NeXT. Those were great machines to work on. We had a lab of those at the university we used for CAD and circuit board layouts. Quite a few third party add-ons for the Amiga were modelled after NeXT GUI features.