Even though many people associate the dock first and foremost with Mac OS X (or, if you are a real geek, with NeXTSTEP), the concept of the dock is actually much older than that. In this installment of our usability terms series, I will detail the origins of the dock, from its first appearance all the way up to its contemporary incarnations; I will also explain some of the criticisms levelled at modern-day docks, finishing off with the usual conclusion.
Origins of the dock
As I already mentioned, many people assume that Mac OS X and its ancestor, NeXTSTEP, were the first to present the idea of what we now know as a “dock”. While these two certainly played a major (if not the only) role in the popularisation of the dock concept, the first appearance of what we would call a dock happened somewhere else entirely – far away from Redwood City (NeXT) and Cupertino (Apple). It all started in a small shed in Cambridge, England.
Well, I am not sure if it actually started in a shed, but that is generally where cool and original stuff in England comes from (British independent car manufacturers, people!). Anyway, I am talking about Arthur, the direct precursor to RISC OS (so much so that the first actual RISC OS release carried the version number “2.0”). Arthur, whose graphical user interface always reminds me of the first versions of the Amiga OS (the ‘technicolour’ look and pixel use), was released in 1987 for the early Archimedes ARM-based machines from Acorn (the A300 and A400 series). It was actually quite a crude operating system, implemented rather quickly because it was only meant as a placeholder until the much more advanced RISC OS 2.0 was ready (two years later).
That thing at the bottom of the screen is the Iconbar, the first appearance of a dock in the world of computing. The left side of the Iconbar is reserved for storage icons, and in this particular screenshot you can see a floppy disk; if you connected a new drive to the Archimedes, the Iconbar would update itself automatically. Clicking on a drive icon would show a window with the drive’s contents. The right side of the dock is reserved for applications and settings panels – here, you see the palette icon (which is used to control the interface colours), a notepad launcher, a diary launcher, the clock icon, a calculator, and the exit button.
Even though Wikipedia can be a good starting point for various computing-related matters, the entry on “Dock (computing)” is a bit, well, complete and utter rubbish; it claims that the dock in NeXTSTEP, released in 1989, was the first appearance of the dock concept (so not the Iconbar in Arthur). Further, Wikipedia claims that “a similar feature [to the NeXTSTEP/OS X dock], called the Icon Bar, has been a fundamental part of the RISC OS operating system and its predecessor Arthur since its inception, beginning in 1987, which pre-dated the NeXTSTEP dock (released in 1989). However, upon further examination the differences are quite noticeable. The Icon Bar holds icons which represent mounted [disc drives and running applications; these icons have] their own context-sensitive menus and support drag and drop behaviour. Also, the Mac OS X Dock will scale down accordingly to accommodate expansion, whereas the Icon Bar will simply scroll. Lastly, the Icon Bar itself has a fixed size and position, which is across the bottom of the screen.”
Those are minor differences, of course – not differences that set the NeXTSTEP dock that far apart from the Iconbar. It is obvious to anyone that the first appearance of the dock concept was the Iconbar in Arthur. Now, this whole dock thing was of course another example of similar people coming up with similar solutions to similar problems in a similar timespan (I need a term for that) – but the fact remains that the first public appearance of the dock was the Iconbar in Arthur. Credit where it is due, please*.
* I do not edit Wikipedia articles. I do not think that “journalists” like myself should do that.
So, the Iconbar was the first dock – but the dock has changed a lot since then. Let me walk you through the various docks since the concept was introduced. Firstly, NeXTSTEP 1.0 was released on September 18th, 1989, and it included the ancestor of Mac OS X’s dock, positioned in the top-right corner of the screen. It introduced some new elements into the dock mix: applications that were not running showed an ellipsis in the bottom-left corner of their icon – contrary to what we see in docks today, where it is usually the running applications that receive a marker. The dock in NeXTSTEP had its limitations in that it did not automatically resize when full, so you had to remove icons from the dock, or put them on the Shelf instead. The NeXT dock remained fairly unchanged over the years (until Mac OS X, of course). The below image of the NeXTSTEP dock has been rotated 90 degrees for formatting reasons. Clicking it will give you the proper orientation and size.
Before NeXTSTEP 1.0 was released, Acorn updated Arthur to RISC OS 2.0 (April 1989), which included the Iconbar we already knew, but in addition it had context-sensitive menus for the various icons in the Iconbar. The colour scheme was a bit less unnerving, too. In later versions of RISC OS, the Iconbar remained fairly similar, though it of course received visual updates. See the below shots of RISC OS 2.0 and RISC OS 4.
Other operating systems also received a dock, such as Amiga OS 3.1, but the one I want to highlight here is the dock in CDE – the Common Desktop Environment. The dock in CDE (my favourite desktop environment of all time – despite its looks) was quite the functional beast. It had drawers that opened upwards (a different take on context menus), and in the middle you had a big workspace switcher. The dock was fully configurable, and was quite easy to use. Keep the CDE screenshot below in mind, as I will dedicate an entire Usability Terms article solely to CDE running on Solaris 9. The CDE dock evolved onwards in Xfce, also seen below.
I never used RISC OS, so I have to ask, did that “dock” show:
1. Installed applications
2. Running applications
3. Both?
That is, if I launch an application from somewhere else, will it automatically show up in the dock? Otherwise it’s just an app launcher, no different to a “desktop” but limited to the bottom of the screen.
It shows running applications too.
OK, so then:
1. I click an app in the RISC OS dock, and it opens
2. I click on another app in the dock, and that one opens
3. I click back in the first icon and it switches to the first app (instead of launching another instance)
4. I open an app by some other means, and it appears in the dock as you confirmed
5. I close said app and it disappears from the dock
6. I can add non-running apps to the dock
Are all of those assertions true? Because that is not completely clear from the article or the Wikipedia page. I guess I’ll have to try the emulator to find out for sure.
1. I click an app in the RISC OS dock, and it opens
yes – and an icon for the running app appears on the right hand side of the iconbar.
2. I click on another app in the dock, and that one opens
yes – and its “running” icon appears on the right hand side of the iconbar.
3. I click back in the first icon and it switches to the first app (instead of launching another instance)
Not exactly. It would open a second instance of the application. To access the already running instance, you would use the right-hand icon. Some applications would only allow you to run one instance of themselves at a time, of course.
4. I open an app by some other means, and it appears in the dock as you confirmed
Yes – as another instance on the right hand side.
5. I close said app and it disappears from the dock
The right hand icon would disappear. The launcher icon you mentioned in (1) would remain.
6. I can add non-running apps to the dock
Yes.
It should also be noted that the iconbar supports full drag and drop (both to and from the bar). For instance, dragging a file to an app icon (either a launcher or a running app) on the bar would launch that app and open the file.
The iconbar was one of the huge strengths of RISC OS in my opinion – and it was a lot clearer what each icon was for than it is on OSX, because of the separation of left and right: launchers were on the left (disks, applications), and running apps were on the right. The two icons on the far right were always there and were for OS settings. (A toy sketch of this behaviour follows below.)
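To make the semantics above concrete, here is a toy Python model of the iconbar rules as described in this thread – purely illustrative, with made-up app names; it says nothing about how RISC OS actually implements any of this:

```python
class IconBar:
    """Toy model of the RISC OS iconbar behaviour described above."""

    def __init__(self, launchers):
        self.launchers = list(launchers)  # left side: persistent launchers
        self.running = []                 # right side: one icon per instance

    def click_launcher(self, app):
        # Clicking a launcher starts a *new* instance every time; it does
        # not switch to an already running one (point 3 above).
        self.running.append(app)

    def quit_instance(self, app):
        # Quitting removes the right-hand icon only; the launcher on the
        # left stays put (point 5 above).
        self.running.remove(app)

bar = IconBar(["!Edit", "!Paint"])
bar.click_launcher("!Edit")
bar.click_launcher("!Edit")        # a second click spawns a second instance
print(bar.launchers, bar.running)  # ['!Edit', '!Paint'] ['!Edit', '!Edit']
```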
The Tog (Bruce Tognazzini, one of the early usability czars at Apple) wrote this up a few years ago
http://www.asktog.com/columns/044top10docksucks.html
Personally, I think the Dock is great. It takes a bit of getting used to, but once you do, you quickly learn to love it. On Gutsy, I use Awn (http://code.google.com/p/avant-window-navigator/) and am quite happy with it.
FYI, Awn is now at Launchpad: https://launchpad.net/awn/
“Other operating systems also received a dock, such as Amiga OS 3.1”.
Only later on, as a third-party addon. You might have been thinking of 3.5, 3.9 or even possibly 4.0.
I must admit that I am not sure about the 3.1 version when it comes to the Amiga. I am fairly sure though, and Wikipedia seems to agree with me (not that that says much, but still).
If anyone can give me a conclusive answer on this one I’ll be sure to update the article.
The ‘Dock (computing)’ article found on Wikipedia and cited in the article by Thom states that the Dock application had been present since AmigaOS 3.x.
This, however, might be imprecise: the Wikipedia article on the various AmigaOS versions seems to suggest that the Dock application wasn’t added until AmigaOS 3.9.
http://en.wikipedia.org/wiki/AmigaOS_versions#AmigaOS_3.5.2C_3.9
“Only later on, as a third-party addon. You might have been thinking of 3.5, 3.9 or even possibly 4.0.”
Not quite – you could add one since at least 1991, with the advent of ToolManager.
One problem related to pretty-looking docks, huge wide panels and other supposedly helpful desktop accessories (a bit like the notorious MS Office paper clip assistant) is that such apps – especially if they are meant to look nice and pretty, with big icons etc. – tend to take quite a lot of desktop space and eat a lot of system resources that might be more useful for actual applications. OK, maybe you can hide the dock when you don’t want to see it, but often that may be rather troublesome too. Personally I tend to prefer very narrow panels with essential shortcuts that are always visible but do not distract or take much desktop space away from the actual apps that I want to use.
Sorry if it seems like an attack, but I have something to say on virtually every point you bring up.
Something to keep in mind about the notoriously awful Clippy is that it was the implementation, not the idea, that blew.
First off, there was a 10-second animation before and after EVERY event. That is an eternity when it comes to UI elements. Secondly, it was a large floating element that was constantly obscuring some element you wanted to access. Thirdly, it was next to impossible to get rid of; every time you thought you had, it just came back.
All that being said, a contextual help area that is constantly updated based on what the user is doing is a fantastic idea. The horrible implementation in Office has unfortunately soured people on it. If someone could come up with an implementation similar to tooltips (there when you want it, invisible when you don’t need it), IMHO it could be a fantastic way of doing inline help.
The problem with hiding the dock (or panels of any sort on other operating systems) is that the trigger area for showing it is WAY too large. If, for example, the trigger were in the lower left-hand corner, and as soon as your mouse hit it the dock would expand out, anchored to the left side of the screen, you would get the desired functionality while very rarely triggering it accidentally. When the bottom five pixels of the entire monitor trigger the show operation, you will trigger it accidentally far more often than on purpose.
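A minimal sketch of the difference, with hypothetical pixel values rather than any real desktop’s API:

```python
CORNER_SIZE = 16  # size of the corner trigger square, in pixels (an assumption)

def corner_trigger(x, y, screen_h):
    """Proposed behaviour: only the bottom-left corner shows the dock."""
    return x <= CORNER_SIZE and y >= screen_h - CORNER_SIZE

def edge_trigger(x, y, screen_h):
    """Criticised behaviour: the whole bottom strip shows the dock."""
    return y >= screen_h - 5  # the "bottom five pixels" from the comment

# A mouse pass along the bottom of a 1280x1024 screen at x=640 fires the
# edge trigger but not the corner trigger:
print(edge_trigger(640, 1022, 1024))    # True
print(corner_trigger(640, 1022, 1024))  # False
```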
As for it taking space: as can be seen from the Fitts’ Law article, the larger the hit area, the easier it is to acquire the target. 4×4 icons may be really pro, but it is far easier to hit 16×16. It really comes down to a tradeoff between work area chewed up and difficulty in hitting the target. I am a big fan of the quicklaunch in Windows (I hate, hate, hate the start menu, and always have), but even with that I will semi-regularly launch the wrong app by mistake, due to the small size of the icons.
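For reference, Fitts’ law predicts movement time as MT = a + b·log2(D/W + 1), so the gain from a bigger target is logarithmic rather than exponential – but it is very real. A quick sketch, assuming an arbitrary 800 px travel distance:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' law: ID = log2(D/W + 1), in bits.
    Predicted movement time is MT = a + b * ID, where a and b are
    device- and user-specific constants."""
    return math.log2(distance / width + 1)

# Same 800 px travel distance, the two icon sizes from the comment above:
print(index_of_difficulty(800, 4))   # ~7.65 bits for a 4x4 px target
print(index_of_difficulty(800, 16))  # ~5.67 bits for a 16x16 px target
```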
All that being said, a contextual help area that is constantly updated based on what the user is doing is a fantastic idea. The horrible implementation in Office has unfortunately soured people on it. If someone could come up with an implementation similar to tooltips (there when you want it, invisible when you don’t need it), IMHO it could be a fantastic way of doing inline help.
Newer versions of Word, most versions of WordPerfect since 8 or 9, and a few KDE apps do this. They put a ~2″ column along one side of the screen (right for Word, left for WordPerfect). This area is used for displaying context-sensitive help links, useful hints, and similar stuff. On low-resolution setups (< 1024×768) it’s more annoying than anything, as it takes up ~33% of the horizontal screen space. But on higher resolution setups, it’s not that bad.
The WordPerfect implementation is a lot nicer than the Word implementation. It’s actually useful, especially when it’s part of their wizards or PerfectExpert projects or whatever they call it.
The problem with hiding the dock (or panels of any sort on other operating systems) is that the trigger area for showing it is WAY too large. If, for example, the trigger were in the lower left-hand corner, and as soon as your mouse hit it the dock would expand out, anchored to the left side of the screen, you would get the desired functionality while very rarely triggering it accidentally. When the bottom five pixels of the entire monitor trigger the show operation, you will trigger it accidentally far more often than on purpose.
Not sure about GNOME or Xfce, but the KDE external taskbar and kpanel can be configured like that. Enable the left or right hide buttons and auto-hide. Three seconds (or so) after your mouse leaves the panel, it will zip off to the side, appearing as a thin bar with an arrow in one corner. Pop the mouse down to that corner and click to bring it back.
Personally, I can’t stand the dock concept, and prefer to put a taskbar at the bottom of the screen that only shows running apps. And an app launcher at the top of the screen with just the apps menu, some quick-launch shortcuts, the system tray, and clock, set to auto-hide. Since it’s only used to launch apps, it doesn’t need to be visible all the time. And since the taskbar only shows running tasks, it doesn’t need to be very tall (32px is plenty). It’s a beautiful setup in KDE; GNOME and Xfce are a little more difficult to get working right, but once it’s configured, it works the same.
Separate running apps from non-running shortcuts/repositories/launchers/etc. Only show the info that is needed. Hide everything else until it’s needed.
Edit: Why doesn’t the quote feature [ q ][ /q ] work on the v4 setup???
Well, my comment about MS Clippy help app may not have been a very good one anyway, as it is not so much related to the subject here. My point in mentioning it was only to give some kind of an example about looks vs. real usability. So putting emphasis on aesthetics does not always improve usability.
“4×4 icons may be really pro, but it is far easier to hit 16×16.”
Of course you’re right about that. And there’s absolutely nothing pro about too-tiny 4×4 icons IMHO… 😉 Anyway, in GNOME I make my top and bottom panels 21 pixels high (possible with certain fonts like Free Sans), which is enough for them to remain clear to see and easy to use, yet narrow enough that they take a minimum of space and can hold a maximum of shortcuts or applets if I prefer to have them there.
“I am a big fan of the quicklaunch in windows (I hate, hate, hate the start menu, and always have)”
But start menus are – for a very good reason – found in almost all desktop environments. You say that you hate them but don’t explain why. Care to elaborate?
I have yet to see a better way than a handy start menu to show, browse and get access to all the available applications. A start menu – of some sort – seems like a necessity as far as I can tell. Running commands would be another way to browse, find and open apps – but not a very newbie-friendly one. Where the start menu is located or can be opened is not essential. Some window managers have a “start menu” that can be opened by right-clicking the desktop background, but that is still the same start menu, and it is also more difficult to reach if the desktop background is hidden under open windows.
Still about docks in general:
The Mac OS X dock (and maybe many of its “copies” too) looks really nice. In aesthetics Mac OS X may be a clear winner. But when it comes to functionality, I prefer the old, though maybe a bit dull-looking, taskbar. Not only does a taskbar take much less desktop space, but its textual labels show much more clearly than mere graphical icons what each entry represents. If you have, say, 10 open folders, and you can see only 10 similar-looking folder icons side by side on the dock, which is which?
The Mac OS X dock has some really nice features, though, like docklings and the availability of extended menus that control applications without making them visible on screen – but that could be implemented with a taskbar too.
I don’t know why he hates it, but here is why I do: there is just too much information in them to easily find the application you are looking for. Especially in Windows, where each installed application by default puts its entries into the Programs root. I have seen such start menus all too often. And if you arrange it into folders (like Games, Utilities, etc.), it takes a lot of time to go through the folder structure. Under Linux it is a bit more sane, but it is much more difficult to edit the menu than in Windows, which again makes it a bit uncomfortable.
Of course, it is absolutely necessary to have something like that. But for programs I use every day, quicklaunch is much faster and more convenient. That was on Windows, though; on Linux, Katapult beats every other solution for me. I do not even have a quick launcher anymore: just Alt+Space, type the first 2-3 characters, and there is the app I want. Brilliant.
I have not yet tried out Katapult, but it seems to be a very good app launcher interface. For people who know what they are looking for (who know the name of the app), that is.
On the other hand, give a Windows user (or a complete newbie) the task of burning some data to a CD on a KDE desktop. What does he do? Right: open the “start” menu and look for “archiving” or “CD/DVD”, or maybe even “multimedia” – all sane places where you could find a CD burning application. It will take him some time, but he will be able to complete the task without having to ask for help.
But set him in front of Katapult, and he will try to type “CD”, “DVD”, “Nero”… none of which will bring him closer to “k3b”. Or did I miss some ability of Katapult there?
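The underlying issue is whether the launcher matches only on application names or also on descriptive keywords. A toy sketch of the difference, with a made-up two-app catalogue:

```python
# Hypothetical catalogue: app name -> descriptive keywords.
APPS = {
    "k3b":     ["cd", "dvd", "burning"],
    "konsole": ["terminal", "shell"],
}

def match_by_name(query):
    q = query.lower()
    return [name for name in APPS if name.startswith(q)]

def match_by_keyword(query):
    q = query.lower()
    return [name for name, keys in APPS.items()
            if name.startswith(q) or any(k.startswith(q) for k in keys)]

print(match_by_name("cd"))     # [] -- the newbie is stuck
print(match_by_keyword("cd"))  # ['k3b'] -- keyword matching saves the day
```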
I don’t have a problem with the start menu as sort of an installed-software directory, but I don’t like it as a main application launcher. On OSX I used Quicksilver (now I use Spotlight), on Linux I use an embedded run dialog (which has been replaced by Deskbar in GNOME, which I don’t like at all), and on Windows it always bugged me that there was nothing similar (until WDS on Vista).
Currently, I do all my work on Vista. I have 7 shortcuts in the quicklaunch that I use several times daily. These require one click.
I have 20 items “pinned” to the start menu (or panel, I guess you would call it now). These are apps I use semi-regularly. It consists of the Office apps, some Adobe apps, terminal, PowerShell, Safari and Opera (for website testing), torrent software, IM software, etc. These take me two clicks. Usually I will use several, but not all, of these every day.
So far so good, but what about the hundred or so small apps I use infrequently? Stuff like regedit, defrag software, server config utils, backup software, DVD authoring software, calculator, virtualization software, burning software, etc. That is what most people use the start menu for. I will use a few of these every day, but always different ones, and none frequently enough to put in an easy-access location.
The start menu requires up to four clicks, with a lot of sifting through folder names, to get to what I want. Considering the size of the items in the list, the number of items there are, and the bad organization (ESPECIALLY in Windows), it is significantly more of a pain to find what I want. Some people just litter their desktop with hundreds of launchers so they can quickly find what they need. This is a symptom of the start menu problem.
Nowadays, the only time I open the start menu (or application menu on Linux, or the Applications folder on OSX) is to see what is installed on a computer I am unfamiliar with. I know this may seem like nit-picking, but IMHO it is a badly designed UI element and I find it frustrating to work with as long as any alternative exists. The fact that so many alternatives exist kind of indicates that I am not the only one to feel this way.
Sorry for the huge response; I figured I may as well be thorough, as it is a hard thing to articulate.
There were plenty of third-party type-and-run apps for Windows.
Here is one I have used before:
http://www.donationcoder.com/Software/Mouser/findrun/index.html
I think more recent Office versions (or maybe the Works versions) have a help bar on the right, so that one can look at help and work at the same time.
Don’t know if it’s context-sensitive, though…
(I use OpenOffice in Windows and KOffice in Linux.)
Thank you for officially debunking the myth that Steve Jobs and/or Apple invented the dock/task-bar. Over the years, in this forum and elsewhere, I have maintained that the RISC OS Icon Bar preceded the NeXTSTEP and OSX docks and that all three had the same basic features. I hope that your article will end all contention on the matter.
However, the idea of an interactive dock/task-bar actually first appeared in Windows 1.01 in 1985: http://toastytech.com/guis/bigw101.gif This first dock/task-bar was interactive in that one could click on an icon to open a minimized program. Icons of inactive programs were not included in this task-bar.
The notion that an interactive dock/task-bar must have icons of both inactive and running programs is merely a subjective opinion, and it certainly would not have been a major intellectual leap to put permanent icons on the early Windows dock/task-bar. In fact, it is baffling that Microsoft did not offer such an obvious feature, and that Microsoft then abandoned the dock/task-bar entirely, not returning to it until Windows 95, ten years later.
I am not sure why you give so much weight specifically to the OSX dock (in the second half of this article), but, before the OSX dock, there were many *nix window manager docks/wharfs/task-bars. In addition, I take issue with your assertion that “the Mac OSX dock has singlehandedly popularised the dock concept, and brought it to the masses.” The task-bar on Windows 95 and on Windows 98 had the same basic function as the OSX dock, and both Windows OSs were used by millions.
Yes, but that’s a taskbar. Not a dock.
That’s one blurry line, IMO…
The dock is more tailored to the Apple UI ideas. In Windows/Linux, you launch an app and it runs fullscreen. At that point, you want as little OS interference as possible. This effectively embraces the idea that applications are Modes. The taskbar provides a way to switch modes.
The Apple way says that applications aren’t modes; they are operating system objects. The traditional Mac approach is that you never run windows fullscreen; you run them as large as they need to be. Transitions between one application and another are more seamless: they all look and act the same, have the same menubar, and you can often see the work you are doing in one even while you are in another. It is a concept that is very hard to explain to someone who has never really worked with it, but anyone who grew up on Mac Classic not only gets it, but finds the fullscreen approach kludgy.
In Linux you can work the way you want to; there is no set desktop in Linux.
But GNOME and KDE use a lot of Windows elements, as that is what most potential users are used to.
BTW, did you just call non-Apple users stupid without being direct about it?
Anyway, I would say both ways have their issues, and it’s related to using windows the way they do. I wonder if one shouldn’t take a step back to the days before Apple introduced free-floating windows.
Some of those WMs on *nix seem interesting in that regard.
I wonder if one shouldn’t take a step back to the days before Apple introduced free-floating windows.
Apple did not invent free-floating windows — the Xerox Alto and Xerox Star had them long before Apple: http://toastytech.com/guis/altost1.jpg
http://toastytech.com/guis/altost2.jpg The Three Rivers PERQ also had them before Apple: http://www.digibarn.com/collections/systems/perqt2/perqzoom.jpg
He said ‘introduce’, not ‘invent.’
Although a handful of other products had them before Apple, Apple was the company that introduced them to the public (and the industry) at large.
I said “invent,” not “introduce.”
However, I am not sure what you mean by “Apple was the company that introduced them [free-floating windows] to the public (and the industry) at large” Here is a page from an October, 1981 PERQ brochure, clearly showing free-floating windows: http://www.chilton-computing.org.uk/gallery/foreign/orig/f00368.jpg
Notice the headline, “The advent of the personal workstation” — you can’t have a grander “introduction” than that!
By the way, Apple finally got overlapping windows two years later.
Well, you learn something every day.
I thought the first GUIs had non-overlapping windows because overlapping was found too confusing for the user…
I thought the first GUIs had non-overlapping windows because overlapping was found too confusing for the user…
The first GUI in the 1960s had no windows. Likewise, I think the first Xerox Alto GUI had no windows – just a single application per screen.
Hmm, OK. Sadly I don’t recall where I got the “info” from, so…
Ah, I think I found it:
http://toastytech.com/guis/star.html
I see the Star tried not to overlap the main windows (but dialogs could still overlap).
Guess I, or someone else, got their wires crossed at some point…
Thanks for the link.
Not sure why the screenshots on the Xerox Star site that you linked show no overlapping windows on the first page, but they start on the second page.
Actually, free-floating, overlapping windows first appeared in the Xerox Alto, prior to the Star: http://www.digibarn.com/collections/software/alto/alto-cedar-enviro…
By the way, to all the fanboys who claim that Apple invented the scrollbar — in the Xerox Star screenshots, did you happen to notice something interesting on the edges of the application windows?
From what I gather, when a window was opened, it was arranged to make maximum use of the screen area, with the condition that it should not overlap the icons on the right edge of the screen.
But the user was then able to move the windows about in any way he or she wanted.
But that’s just me guessing based on the screenshots and accompanying text.
I didn’t call anyone stupid; I was talking about how it feels to work in the Windows-style modal paradigm vs the Mac Classic spatial paradigm. It has nothing to do with the intelligence of the users, and everything to do with the philosophy behind the design decisions.
I have yet to run across a really spatial WM in Linux. There isn’t anything that is even equivalent to OSX, and OSX is a far cry from Mac Classic in this regard. Of course, I could be wrong.
And really, I’m not trying to be insulting or anything. Mac Classic had this approach, BeOS had it, OSX has it to a degree, and even though I have never used NeXT, from what I have read it looks like it had it.
twm and ratpoison leap to mind.
That is again, a completely different way of approaching things. I was talking about the mac and the windows approach, and how the taskbar and the dock address the space of task switching differently.
I was talking about how it feels to work in the Windows-style modal paradigm vs the Mac Classic spatial paradigm.
I have yet to run across a really spatial WM in Linux.
What do you mean by “spatial?”
I mean where applications are treated as objects, and not as modes (as I went into in great depth in an earlier comment). Thom did a brief overview here http://osnews.com/story.php/18829/Common-Usability-Terms-pt.-I-Spat…
For something more in depth, there is an article John Siracusa did a few years ago that everyone points to as soon as the spatial metaphor comes up http://arstechnica.com/articles/paedia/finder.ars
I don’t think that spatial design is the be-all and end-all (I REALLY like what Jef Raskin was talking about in The Humane Interface before he died), but I do think that the spatial metaphor is still a more elegant solution than what is the norm today.
I see. You mean “spatial” in the limited, Mac-centric “spatial memory” sense.
Well, I am glad that Thom’s article on spatial memory was mercifully shorter than the John Siracusa article (on a subject that can probably be covered in a few sentences). However, I disagree with Siracusa’s point that path models are slower and less efficient than spatial models, but that contention is grounds for a whole other discussion.
Spatial memory comes under the broader usability topic of conditioning, and both topics are integral to most design and art disciplines (industrial design, interior design, graphic design, photographic design, GUI design, etc.). Conditioning also applies to many activities and processes and to certain non-spatial mental models (such as the command-line command-option-argument protocol). It doesn’t take a genius to realize that keeping things in their expected place, time and order can make a task speedier and more efficient.
I think you will find that many Linux WMs/Desktops do a good job of maintaining spatial conditions (and smart placement), and the same probably applies to most Windows versions.
I haven’t noticed anything “spatially special” in OSX. Perhaps you could be a little more specific on what you mean by objects and modes, because, as I mentioned in an earlier post, I can’t discern the models you describe, and I use all three OSs.
That is usually what one means in this kind of discussion, since Apple did a boatload of research on it back in the day.
It goes further than that. There is a Finder feature, present since at least OS 7, where if you hold Command and click on the folder title, a dropdown list of all the folders in the path shows up. It was kind of obscure, and only really useful in specific circumstances. In OSX, a little while ago I wanted to add the current folder I was in to the Places sidebar. Just to see if it would work, I tried holding Command and clicking the title, then dragging that to the Places bar. It worked. By contrast, Vista’s Explorer has GNOME-esque path buttons. The idea is a good one, and the implementation is good to a point, but try to drag a group of files to one of the buttons. Or try to drag one of the buttons to the favorites bar, or anywhere else for that matter.
On the Mac, a handle for a folder is a handle for a folder no matter where it is, even in a relatively obscure and next to useless place. I can count on it. In Windows, a handle to a folder could be many things depending on the context.
Another example would be dragging a folder to a taskbar icon. A third would be dragging a text selection to the desktop.
I am sure some or all of these are implemented somewhere in Linux, but what I am trying to illustrate is that treating UI objects consistently goes beyond window size and placement.
Maybe it’s just me, but virtually every serious application I use in Windows (and Linux) is designed to be run full screen. It may not launch full screen, and you may be able to operate it at less than full screen, but at the end of the day, 99% of the time you are using the taskbar to switch between several maximized programs. For example, I often have Visual Studio and Photoshop open at the same time. Due to the large number of palettes in both VS and PS, if you expect to get any work area at all you are going to run them full screen, using the taskbar to switch back and forth.
Thanks for being more specific.
There is a Finder feature, present since at least OS 7, where if you hold Command and click on the folder title, a dropdown list of all the folders in the path shows up. It was kind of obscure, and only really useful in specific circumstances.
Very interesting. Not sure I understand exactly what you mean, but I will have to try it next time I am on an OSX box. It definitely requires conditioning, but seems to have little to do with spatial memory.
By contrast, Vista’s Explorer has GNOME-esque path buttons. The idea is a good one, and the implementation is good to a point, but try to drag a group of files to one of the buttons. Or try to drag one of the buttons to the favorites bar, or anywhere else for that matter.
Won’t be at a Vista box for a while, so I’ll have to take your word for it. I am not sure I understand what you mean by “GNOME-esque path buttons.” However, I am typing this message on a Gnome box (lame, 64 Studio implementation — looks pretty, but bloated), and I can drag icons/files to different locations — somewhat. Conditioning applies to the use of this function (or its absence), if the user is familiar with it, but I don’t think spatial memory applies here.
I think KDE generally does a better job with dragging and dropping (even though Gnome is always described as more “Mac-like”). Don’t know for sure about the other *nix desktops.
On the Mac, a handle for a folder is a handle for a folder no matter where it is, even in a relatively obscure and next to useless place. I can count on it. In Windows, a handle to a folder could be many things depending on the context.
What is a handle for a folder? Do you mean a link? Again, this sounds like it involves conditioning, not spatial memory.
Another example would be dragging a folder to a taskbar icon.
Don’t think this function works in any *nix desktops or Windows. This capability really involves a conditioned experience/task, more than spatial memory.
A third would be dragging a text selection to the desktop.
I have never tried this. What happens when the raw text selection hits the desktop? At any rate, it again sounds like a conditioned task, that doesn’t really involve spatial memory.
I am sure some or all of these are implemented somewhere in Linux, but what I am trying to illustrate is that treating UI objects consistently goes beyond window size and placement.
“Treating UI objects consistently” goes beyond the Mac-centric, catchphrase topic of “spatial memory.”
Maybe it’s just me, but virtually every serious application I use in Windows (and Linux) is designed to be run full screen.
I haven’t had the same experience, even with the default settings, although my Vista usage is low. Again, I am not even sure exactly how spatial memory applies to this alleged phenomenon.
Due to the large number of palettes in both VS and PS, if you expect to get any work area at all you are going to run them full screen, using the taskbar to switch back and forth.
Would one not have to do the same with OSX and its dock, if one had the same monitor? Again, not sure how spatial memory applies here — don’t these applications launch with the same layout in which they were closed?
I tend to use a pager (virtual desktop) when it is available. KDE and Gnome (along with many other *nix WMs/desktops) often combine pager with a dock/task-bar. Some pagers are more spatial than others, but it doesn’t make that much difference — I am able to use almost any GUI set-up with similar speed and efficiency.
Never took to tabbed windows, though, but I can use them if necessary.
No problems.
OK, OK, I guess I was jumping the gun there.
Thing is, if you’re used to Windows and go to Mac, you run into just as many gotchas as when you go from Mac to Windows.
Different strokes for different folks. If we could just agree on file types and let people share data effortlessly, then one could use whatever one wanted. But right now that’s needlessly segregated thanks to attempted lock-ins, leading to people “having” to use OS whatever to work with specific kinds of data.
Edit:
IIRC, later versions of Office have gone away from the window-in-window (MDI?) interface and over to one window, one file.
And you can set Explorer to open each folder in its own window; if you try to open a folder a second time, it will just highlight the existing window.
Konqueror under KDE has the same option. In GNOME, I don’t know.
Still, beyond that I guess it’s every app for itself.
Recent versions of Adobe Acrobat have developed a weird behaviour: they mix MDI and spatial in the most confusing way. Yes, there is one button on the taskbar per opened file, but if you don’t watch out you can close them all by closing one of them…
And didn’t web browsers go with tabs because people got fed up with IE having a button per open page?
Sometimes spatial works, sometimes it doesn’t, apparently…
Huh? The Xfce I’m using even has smart window placement, so if I open up 4 terminals simultaneously, not only are they not maximized, but they’re automatically placed in separate corners of the screen so all 4 are fully visible at the same time.
Lumping all *nix DE’s/WM’s together might be a bad idea.
“This effectively embraces the idea that applications are Modes. The taskbar provides a way to switch modes.”
Well, applications are, after all, the reason we use computers. I find it amazing that so much time and energy is spent debating the relative merits of operating systems, when our interaction with the OS comprises perhaps 5% or less of the amount of time we spend working each day.
As for the idea that Windows is somehow designed around full-screen applications – hogwash. Windows is designed around choice. You want fullscreen, you get fullscreen. You want to tile 4 apps and switch maniacally between them? Bob’s your uncle – Windows will happily oblige you. On the Mac, though, Steve has made all the decisions for you; how dare you try to maximize an application!
What happens when you launch AbiWord? Because when you launch Pages, the window doesn’t fill your entire monitor, only the size of your document. The “accessory” apps are corner cases: they are treated in a spatial manner, but they are one of the few classes of applications that are.
What happens when you launch AbiWord?
It fills about 2/3 of the screen on the far left.
If, by “spatial” you mean smart placement, lots of *nix WMs/Desktops large and small offer that feature.
IIRC AbiWord does not open up full screen. OpenOffice does. In fact, OO.o is really the only one I can think of off the top of my head that does. If I open up two Firefox windows, the first one goes in the top left, and the second one fits in at the bottom right.
IIRC AbiWord does not open up full screen. OpenOffice does. In fact, OO.o is really the only one I can think of off the top of my head that does. If I open up two Firefox windows, the first one goes in the top left, and the second one fits in at the bottom right.
To be honest with you, I never noticed that OO.org opens in a fullscreen window by default, as KDE allows one to set that on a per-application basis, among a lot of other things. You can even specify the viewport where an application is supposed to open when you launch it: I always set GIMP to open on its own viewport, as it opens way too many windows sometimes!
The same thing that happens when you launch any program in Windows, KDE, GNOME, etc. – it looks at the application shortcut to see what you have set for the default window size.
Some people set it to Maximised in the .lnk/.desktop file. Some people set it to Normal or Default, which means it starts at whatever size you had it set to when you closed it, or whatever default is set by the app developer, or whatever the systemwide default is. Some people set it to Minimised.
There is no “all apps will open maximised” default set in Windows, KDE, or GNOME. These are all app-specific settings that can be changed.
Is there a way, in MacOS X, to configure an application to always open maximised, and to actually take up the entire screen?
“In Windows/Linux, you launch an app and it runs fullscreen”
They don’t, surely? They launch at whatever size you set them to the last time you opened them. They don’t even start out full screen, do they? Now you have me scratching my head and trying to remember how they worked in Win 98 – someone I do work for is still running W98 on one machine, and I don’t think even there the apps open full screen.
I just fired up a whole bunch in Linux, and not one opened up full screen. What is this about?
Is there any difference between MacOS and other OSs in this respect? There are the tiling and tabbed window managers in Unix/Linux, of course, but they are a rare breed.
How applications open is set in the shortcut (.lnk) to that app. You can set it to Normal Window, Maximised Window, or Minimised Window.
The default is Normal Window, which starts the app with whatever window size it was set to the last time you opened it, or a default size as set by the app coders.
This has been the behaviour since Windows 95 and the birth of the .lnk file. It was also possible to do this in the Windows 3.x days by hand-editing the .pif file.
There has never been a “default to maximised” or “always start maximised” setting in Windows.
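To illustrate the mechanism described above: the window state lives in the shortcut itself, as the WindowStyle property of the .lnk file. Here is a minimal sketch of reading and changing it, assuming Windows with the pywin32 package installed and a hypothetical shortcut path:

```python
import win32com.client

# WshShortcut.WindowStyle values, per the Windows Script Host documentation.
STYLES = {1: "Normal window", 3: "Maximised", 7: "Minimised"}

shell = win32com.client.Dispatch("WScript.Shell")
lnk = shell.CreateShortcut(r"C:\Users\me\Desktop\MyApp.lnk")  # hypothetical path

print(STYLES.get(lnk.WindowStyle, "Unknown"))  # how the app opens now

lnk.WindowStyle = 3  # ask Windows to start this app maximised
lnk.Save()
```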
Yes, but that’s a taskbar. Not a dock.
Please define the difference between a dock and a taskbar.
If Windows 95 had used a dock instead of a taskbar, it’s safe to say we’d all be using Macs today…
Yes, but that’s a taskbar. Not a dock.
With the difference being … ?
They are spelt differently. I would have thought that was obvious ;P
IIRC, in Windows 3.1 you were supposed not to run the main “window” maximized (but who didn’t?), as any minimized program would get an icon on the “desktop”.
Only when using the default progman.exe shell. Other shells worked differently. My favourite Windows 3.x shell was Xerox’ TabWorks shell.
The “desktop” was a tabbed notebook. Down the left-hand side of the screen was a quick launch area where you could put icons for your favourite 10 apps. 80% or so of the screen on the left was the open page, where icons would appear for the apps. The other 20% or so was taken up by the tabs. Click a tab and that page became active (similar to opening a program group in Progman). Running apps full-screen/maximised was the best way to work with TabWorks. And Alt+Tab to switch between running apps.
Kept things very neat and organised, everything was easy to get to, and it never got in your way. When we migrated to Windows 95, it took a *lot* of getting used to as we had to navigate through a hierarchical menu to find our apps instead of just “click the tab, double-click the app”. Too bad the Win95 version of TabWorks was so buggy. Guessing it had something to do with the amount of internal linkage between the Windows OS and the explorer.exe shell.
Heh, never used Windows 3.1 much. It was DOS most of the time for me.
But I think you could run progman.exe as an alternate shell on Win95 (maybe even 98).
Yes, progman.exe was included in Windows 95. However, it broke a lot of built-in features like the recycle bin and network neighbourhood, as these were hooks into explorer.exe.
You could also run a lot of different shells on Windows 9x (Litestep and Darkstep being the two most popular). But, these also broke things that were hooked directly into explorer.exe.
If MS had properly separated the GUI shell from system services (like in Win3.x), then running alternate shells would have been a much smoother/easier thing to do.
True that. I recall finding a replacement for the quicklaunch area of the Windows taskbar; it even came with the possibility of inserting a Winamp controller.
http://www.truelaunchbar.com/
There is also a free version:
http://www.freelaunchbar.com/
Regarding your last paragraph. Please take into account that the term “Dock” originates from NeXTSTEP. The docks/wharfs/harbors in fluxbox, openbox, pekwm and so on exist to make use of WindowMaker’s dockapps, which became a sort of X11 “applet” standard in the old days. And the whole point of WindowMaker (and AfterStep) was of course to recreate the NeXTSTEP experience, if only at a shallow level. OS X is really the latest version of NeXTSTEP, so to have had a dock before this OS, you need to do much better than 2001, more like 1989. That’s two years before Linus released version 0.01 of his kernel, by the way.
Whether something counts as a “Dock” depends on your subjective definition of the term. I don’t consider Windows’ taskbar with quicklaunch to be the same concept at all, for instance. You’re not “docking” anything in there, really. Actually, I even think the changes Apple made to the Dock drift it somewhat away from what the term describes.
Okay. The term “dock” relating to GUIs probably originated with NeXTSTEP, but “a rose by any other name…”
Thank you for reminding me of the term “harbor” — I was racking my brain earlier, trying to remember it.
Yes. Many of the *nix WMs that had “docks” after 1989 were emulating NeXTSTEP.
However, if “docking” an active application means having its icon appear along one edge of the screen, you need to do much better than 1989, more like 1985. Windows 1.01 had this capability then. That’s four years before NeXTSTEP’s dock was released (according to your post).
In an earlier post, I made the same statement as you regarding the subjectivity of “dock” definitions.
Don’t know “quicklaunch,” but I don’t see many functional differences between task-bars and docks.
As mentioned earlier, Apple made some changes to the Dock. Today, the definition seems to be: an icon bar where the same icons are used both to show running apps and to launch them.
Traditional taskbars don’t fit this description. So there is at least one difference right there. All the choices of ways to launch and/or monitor applications share some similarities. It can therefore be hard to distinguish them, and maybe there’s no real need to, but we humans tend to come up with words to distinguish similar but slightly different concepts.
However, if “docking” an active application means having its icon appear along one edge of the screen…
Well, in NeXTSTEP, nothing appeared in the Dock at all. You had to drag it there to dock it. See my post further down for a screenshot and some more comments on this.
By “quicklaunch” I meant the Quick Launch toolbar.
Today, the definition seems to be: an icon bar where the same icons are used both to show running apps and to launch them.
You have just described the RISC OS Icon Bar, from 1987 — preceding NeXTSTEP by two years.
Traditional taskbars don’t fit this description. So there is at least one difference right there.
My experience is that Windows 95 and Windows 98 fit this description – I could drag application icons to the taskbar, and if an application was running, a rectangle with an icon and the application’s title would appear. The Gnome and KDE taskbars exhibit the same behaviour.
Well, in NeXTSTEP, nothing appeared in the Dock at all. You had to drag it there to dock it.
Is it an advantage to lack an indicator for an active application that one forgot to drag to the dock?
By “quicklaunch” I meant the Quick Launch toolbar.
Not familiar with the Quick Launch toolbar.
You have just described the RISC OS Icon Bar, from 1987 — preceding NeXTSTEP by two years.
Ok. I know RISC OS by reputation only. Thom talked about this in the article, but I wasn’t entirely sure how the Icon Bar worked. I agree that Acorn had the first dock, but they used a different term for it.
The argument in my first post was that OS X and NeXTSTEP are evolutions of the same system, and that this is where the term “Dock” comes from. Claiming that WindowMaker had a dock before OS X makes no sense since wmaker is actually trying its best to copy the look and feel of NeXT.
My experience is that Windows 95 and Windows 98 fit this description – I could drag application icons to the taskbar, and if an application was running, a rectangle with an icon and the application’s title would appear. The Gnome and KDE taskbars exhibit the same behaviour.
I don’t understand any of this. Sorry.
Is it an advantage to lack an indicator for an active application that one forgot to drag to the dock?
Not “forgot”, but “chose not to”. Let’s take a look at a screenshot, please click the preview to see the full size image;
http://en.wikipedia.org/wiki/Image:NeXTSTEP_desktop.jpg
Here you see a user running approximately 10 applications. Each of these is represented by an appicon, which is by default positioned on a free spot at the bottom of the screen. The user has chosen to put some of the apps in his Dock, which means they have a completely fixed position across sessions. If he decides to “dock” another application, he can drag it either from the bottom of the screen or from a File Viewer.
Apple changed this by putting in the Dock not only the fixed elements, but also the ones that ran across the bottom edge in NeXTSTEP. This looks like a small change perhaps, but it means that the Dock now changes all by itself, while in the old system the user had 100% control. It also blurs the concept slightly since you have gone from “docking” something to selecting “Keep in dock” on stuff that is already in there.
I agree that Acorn had the first dock, but they used a different term for it.
I agree that RISC OS had a dock at least two years before NeXTSTEP, however, I say that the first dock was the strip along the bottom of the Windows 1.01 screen, in 1985. That strip contained the icons of minimized applications, and application windows could not overlap onto the strip. Of course, it would have been obvious to also include permanent icons on the strip, but, for some strange reason, the dock feature was immediately ditched by Microsoft before permanent icons were added.
Claiming that WindowMaker had a dock before OS X makes no sense since wmaker is actually trying its best to copy the look and feel of NeXT.
Okay. Let’s not argue semantics. There were several *nix window managers that had docks/task-bars, before NeXTSTEP “became” OSX.
***My experience is that Windows 95 and Windows 98 fit this description — I could drag application icons to the task-bar and if an application was running a rectangle with an icon and the application’s title would appear. The Gnome and KDE task-bars exhibit the same behavior.***
I don’t understand any of this. Sorry.
The task-bars in Windows 95 and 98 and in Gnome and KDE provide the same function as the docks in NeXTSTEP and OSX.
Apple changed this by putting in the Dock not only the fixed elements, but also the ones that ran across the bottom edge in NeXTSTEP. This looks like a small change perhaps, but it means that the Dock now changes all by itself, while in the old system the user had 100% control. It also blurs the concept slightly since you have gone from “docking” something to selecting “Keep in dock” on stuff that is already in there.
The task-bars just mentioned (and many not mentioned) provide both the NeXTSTEP and OSX functions you describe, and more: one can drag any icon (from both active and inactive applications) onto a reserved place on the edge of the screen, and the icons will stay there until they are manually removed. In addition, the titles and/or icons of active applications can appear on their own in this reserved space, *but the user can disable this function*. As an extra benefit, clicking on the active titles/icons one can manipulate “modes” of their applications (now I am beginning to see what google_ninja meant by “modes”).
Pardon? No one argues that Steve (personally) or Apple invented it – what they did do well is the popularisation of said technology.
Take a look at Japanese electronics companies like Sony; very few of the ideas that came out of Sony in the early years were a by-product of in-house R&D.
LCDs, for example, were an American invention; it took until the 1980s for American companies to finally realise the link between R&D, competitive advantage and producing products – and to finally protect the technology they developed.
Nothing today is original; the vast majority of it was already conceptualised years ago – perpendicular recording, which is finally being used in hard disks today, was developed over 40 years ago, for instance.
Let’s not try to start a turf war over who invented what – the question should be who implemented it best – that is, who reduced the number of downsides.
No one argues that Steve (personally) or Apple invented it
You’re kidding, right?
In spite of seeing proof after proof over the years on this forum and elsewhere, the Mac fanboys would not accept that Jobs/Apple did not invent the dock/task-bar. Many still won’t accept that fact even after seeing the proof in this article.
Mac fanboy denial is a powerful force. There is someone who just posted a couple of messages on this thread who evidently can’t accept that the dock did not originate at Apple. This poster is delving into minute and rather arbitrary behavioral differences between launchers, task-bars and docks, apparently in an effort to show that the Apple dock is distinctly different than the non-Apple versions, and, thereby, bolster the idea that Apple invented the “true” dock.
Even this OSNews dock article anticipates Mac fanboy resistance, taking a roundabout, sort of apologetic path before finally declaring in the sixth paragraph, “… the fact remains that the first public appearance of the dock was the Iconbar in Arthur. Credit where it is due, please.”
(However, the first dock/task-bar actually appeared in Windows 1.01, two years before Arthur.)
By the way, do you know “Steve” personally?!!
what they did do well is the popularisation of said technology.
Once a person has finally accepted that Jobs/Apple did not invent an item, nothing in the Universe can stop that person from saying, “… well, Steve made it popular!”
First of all, just saying that Jobs/Apple “made popular” or “introduced” something doesn’t make it a hard fact. Please see this response to someone’s claim that Apple “introduced” free-floating, overlapping, application windows: http://osnews.com/permalink.php?news_id=18941&comment_id=285052
Also, making something popular is a dubious achievement. Perhaps we should start a televised awards show for hucksterism — I wonder who will win this year’s “Huckie” for “largest reality distortion field.”
The inventor is the one who should get the credit and accolades.
LCDs, for example, were an American invention; it took until the 1980s for American companies to finally realise the link between R&D, competitive advantage and producing products – and to finally protect the technology they developed.
Not sure what you mean here, but I think that it was long before the 1980s when American companies realized the importance of R&D, “competitive advantage” and production.
Nothing today is original; the vast majority of it was already conceptualised years ago
An optimistic view.
A famous myth has Charles H. Duell, the U.S. Commissioner of Patents in 1899, saying, “Everything that can be invented has been invented.” Although he did not really make the statement, surely, a lot of people have held that sentiment before and after that fictional moment. It is a good thing that the sentiment also has never proved to be true.
Let’s not try to start a turf war over who invented what – the question should be who implemented it best
Okay. If who “best implemented” or “popularized” an invention is more important to you than who invented an item, then you won’t mind if I reiterate three facts proven earlier in this article and thread:
– Jobs/Apple did not invent the dock;
– Jobs/Apple did not invent free-floating, overlapping application windows;
– and, Jobs/Apple did not invent the scrollbar.
Before we start, your post score was at 0 – I’ve added a point onto it, by virtue of the fact that I don’t like seeing discussion and debate killed by the ‘tyranny of the masses’. I would sooner see your unpopular post stand and dissect it myself than have it buried by those who take a militant attitude to their platform.
1) Don’t confuse the ranting of a few fanboys for the vast majority of Mac people who know their computer history. To claim that ‘Steve invented the GUI’ is as stupid as claiming that ‘without Microsoft, there would be no PC revolution’.
No one owes Steve Jobs or Bill Gates (or their companies) anything; they aren’t prophets, nor special beings or organisations without whom all innovation would cease. They’re merely companies filled with humans using their knowledge to create things; if those specific companies didn’t exist, others would pop up.
2) Learn English; I put ‘personally’ in brackets to imply that he didn’t personally sit in a lab and create it. English isn’t a difficult language; please spend time learning it before butchering it or, worse, asking stupid questions.
3) There are loads of things which have been invented and never attributed to the original person; take Kellogg’s cereal for example – no one ever demands that John Harvey Kellogg should be venerated; it’s always his brother who gets the kudos for creating the Kellogg’s we know today.
There is no use pointing out who created it if you don’t acknowledge who put the money, marketing and ‘soft capital’ behind it to turn it from an idea on the drawing board into a usable and marketable product.
4) I don’t know where your history comes from, but the Europeans have had a heck of a lot greater success internationally when it comes to commercialising consumer products. Most things in the US which people rant on about never make it outside the borders.
Heck, there has been study after study regarding Europe vs. America when it comes to consumer products – ignore them if you want and keep living in the deluded idea of the ‘star-spangled banner’.
5) Name one product out there that is completely new and innovative – that is, created in a clean room without the input of any existing ideas or technology?
Everything today is built off the ideas of years ago; it’s the old story of ‘on the shoulders of giants we stand’.
6) You have major English issues; learn the difference between implementation and popularisation – the two are very different. Creation, implementation and popularisation can occur completely separate from each other.
Before we start, your post score was at 0 – I’ve added a point onto it
Okay. I’ve added a point to your post.
Don’t confuse the ranting of a few fanboys for the vast majority of Mac people who know their computer history.
Well, unless a few fanboys are posting under numerous usernames, a lot of them are out there posting wildly and ignoring rationality.
To claim that ‘Steve invented the GUI’ is as stupid as claiming that ‘without Microsoft, there would be no PC revolution’.
I agree that both claims are stupid. However, there are two important distinctions between the claims:
– (1) one claim refers to invention, which can have hard meaning and a definite, unique value, while the other claim involves the acts of “popularization” and “introducing a product,” which are not as unique as invention, and which are rather nebulous, conceptually, with suspect value;
– (2) one claim is definitely not true, while the other (according to the loose and dubious notions of “popularization” and “introducing”) could be interpreted to have some merit.
Learn English; I put ‘personally’ in brackets to imply that he didn’t personally sit in a lab and create it. English isn’t a difficult language; please spend time learning it before butchering it or, worse, asking stupid questions.
Sorry. I should have added “/sarcasm.”
One should probably not refer to Steve Jobs as “Steve” unless one knows him personally (or unless one is ridiculing him or his followers). It is sort of unbecoming. As you said, “Steve Jobs and Bill Gates… are merely companies,” and they are out to get your money. Referring to Mr. Jobs as “Steve” suggests a delusion that he is one’s altruistic, “good buddy.” So, I placed the quotation marks around Mr. Jobs’ first name to denote this deluded familiarity.
There are loads of things which have been invented and never attributed to the original person; take Kellogg’s cereal for example – no one ever demands that John Harvey Kellogg should be venerated; it’s always his brother who gets the kudos for creating the Kellogg’s we know today.
If John Harvey Kellogg single-handedly invented breakfast cereal, then he should get the credit for it, not his brother.
There is no use pointing out who created it if you don’t acknowledge who put the money, marketing and ‘soft capital’ behind it to turn it from an idea on the drawing board into a usable and marketable product.
As an inventor and a designer, I find this view rather disturbing, especially the notion that it is investors and marketing people who make an invention “a usable and marketable product.”
There are exponentially more investors, salesmen, marketeers and hucksters than there are people with unique ideas. Business people are interchangeable, inventors are not.
In other words, an invention can exist (and be successful) without business people, but no one can have a product without the inventor.
The inventor is more important than the business, marketing and sales people.
I don’t know where your history comes from, but the Europeans have had a heck of a lot greater success internationally when it comes to commercialising consumer products. Most things in the US which people rant on about never make it outside the borders.
Yes. I hate it when everyone rants about US-only products, such as Ipods, Iphones, Mac Books, etc… /sarcasm
Seriously, I am not sure why you are addressing this issue — did someone make a statement about US products versus non-US products? Certainly, the consumer market is much bigger outside of the US.
Name one product out there that is completely new and innovative – that is, created in a clean room without the input of any existing ideas or technology? Everything today is built off the ideas of years ago…
I guess a lot of non-inventors tend to think that invention is always an obvious progression, because, after the product has been released, it is easy to look back and understand the steps in the invention’s development. They do not comprehend the depth of the challenges faced by an inventor, engineer or designer when they have to come up with something that performs a certain task, while staring at a blank sheet of paper (or a blank screen). Nor do they realize that the significant “Eureka” moments often occur unprompted, and involve innovation that has nothing to do with one’s current project.
Non-inventors often seem to think that innovation is classified into only two categories — “completely new/unique” and “not completely new/unique.” However, the uniqueness of innovation is less “black & white” than this notion and is more a matter of degree, on a scale between the two extremes of very obvious and very unique.
Also, non-inventors seem to have trouble discerning an invention’s position on this scale. For instance, not much ever got mentioned about the Mac Trashcan, which sits near the top of the innovation scale, as it was completely unique and innovative, but the Iphone was declared “Invention Of The Year” by Time Magazine, and it actually sits near the bottom of the scale, being completely obvious and derivative.
By the way, the Mac Trashcan is probably the only Apple-originated item that is unique enough to be even close to the top of the innovation scale.
You asked me to name one product that is completely new and innovative, and I already have with the Mac Trashcan. Here are a few more off the top of my head: the Light Field digital lens http://www.refocusimaging.com/ ; the Weedeater; the Segway; the OPT water power buoy http://www.oceanpowertechnologies.com/index.htm ; the Hover Copter toy; the Air Car engine http://www.theaircar.com/ ; etc. There are a few more that I could name and undoubtedly zillions more of which I am ignorant.
You have major English issues; learn the difference between implementation and popularisation – the two are very different. Creation, implementation and popularisation can occur completely separate from each other.
I did use “implementation” and “popularization” as separate words. However, in Jobs/Apple debunking discussions, fanboys tend to lump the two acts into the single category of worship-worthy achievements.
“The task-bar on Windows 95 and on Windows 98 had the same basic function as the OSX dock, and both Windows OSs were used by millions.”
Not at all! You are missing the entire point. There are:
1. Launchers
2. Task bars
3. Docks
And they are all different concepts. In Windows you have a “quick launch” bar in addition to the taskbar, and that allows you to launch things, but the taskbar itself only allows switching between apps. Also, the quick launch bar in Windows behaves in a completely different way from a Dock, in that clicking an icon launches an app, but clicking it again launches another instance of the same app, whereas on a dock it *switches* to that app if there is an already running instance.
So:
– In a launcher, clicking an app icon opens an instance
– In a task bar, only running apps appear and clicking on a button switches to the selected app
– In a Dock, clicking an app icon switches to the selected app OR launches it if it isn’t already running (see the sketch below).
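To make those three behaviours concrete, here is a minimal Python sketch; the helpers (running, launch, switch_to) are made-up stand-ins for whatever the window system actually does, not any real API:

# Illustrative click semantics only – not real windowing code.
running = {}  # app name -> list of open instance ids

def launch(app):
    # Start a new instance of the app and record it.
    instance = f"{app}-{len(running.get(app, [])) + 1}"
    running.setdefault(app, []).append(instance)
    return instance

def switch_to(app):
    # Bring an existing instance of the app to the front.
    return running[app][0]

def launcher_click(app):
    # A launcher always starts a fresh instance, even if one is open.
    return launch(app)

def taskbar_click(app):
    # A task bar only lists running apps; clicking switches, never launches.
    return switch_to(app)

def dock_click(app):
    # A dock switches if the app is running, otherwise launches it.
    return switch_to(app) if running.get(app) else launch(app)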
Where would you put Kicker, in that it has a quick-launch area, a taskbar area, can hold icons for programs, has a systray area, has a clock, has a trash can, and can take a whole whack of other plugins, including things like a desktop switcher?
Firstly, for the uninitiated, full-screen is not automatic in Windows nor in Linux, unless, perhaps, one is using a Linux tiling WM with no applications open. In every version of Windows (as I recall) and in every Linux desktop/WM that I have used, the only applications that run full-screen are the ones that the user wants to run full-screen.
The “modes” and “the Apple way” are interesting models of what is happening, but, because I have never had all apps full-screen in Windows and Linux, those models never occurred to me at any time I ever used a computer, including my first computer, a 1984 Mac. I would guess that most others who use both Mac and non-Mac systems also do not see things in “modes” nor in the “Apple way.” The size of the window is not important until there is a reason for the size to be important — mainly, when it is obscuring something or when something in the background is distracting.
And, if one wants an app full screen, there are numerous ways to make it happen in Windows and Linux, but it is not as intuitive nor as easy to get a “true” full screen on a Mac. Also, there are problems with running applications “as large as they need to be”: who decides how large they need to be (Steve Jobs?); and, sometimes, one has to scroll to reach content in a window reduced from full-screen.
As a product designer and as one who has extensively used Mac, Windows and Linux, I think that the Mac menu-bar being detached from the application window is a huge usability mistake. In addition, having all applications look and act the same can also cause usability problems — there are definitely times when a window interface needs to be different.
Anyway, it is a minor point that the Windows task-bar has an additional feature of switching “modes” — the feature is there if one wants to use it (but I don’t really see how it differs much from clicking on an icon in the OSX dock).
I think there is an algorithm that tries to size the window so that the scroll bars go away, and goes full screen if it hits that size while still not getting rid of the bars.
But that’s just me guessing.
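If that guess is right, the heuristic would be something like the following sketch (my reading of the guess above; hypothetical, and in no way Apple’s actual code):

def zoom_window(content_w, content_h, screen_w, screen_h):
    # Guessed heuristic: grow the window to exactly fit the content,
    # so the scroll bars disappear; if the content will not fit on
    # the screen, fall back to full screen (and the bars remain).
    if content_w <= screen_w and content_h <= screen_h:
        return content_w, content_h  # bars go away
    return screen_w, screen_h        # full screen, bars stay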
As for full screen or not, that depends.
I usually run my file manager, IM and audio apps non-full-screen; office apps, mail and browser full screen; and videos either taking over the screen or in a window resized to fit the video.
As someone who owns a 12″ Powerbook, not being able to fullscreen an app is annoying at times. On larger displays it’s not a problem (who wants to look at OSnews fullscreen at 1600×1200?), but on screens with limited real estate, like 1024×768, it’s a hassle.
I’m not really sold on Mac OS X having a more modeless GUI than any other GUI out there. As far as I can tell, there aren’t a lot of apps that follow the vi mentality of a command mode and an edit mode. We may be comparing apples to oranges, but it’s all fruit. There aren’t any vegetables thrown in.
… who wants to look at OSnews fullscreen at 1600×1200…
🙂
…there aren’t a lot of apps that follow the vi mentality of a command mode and an edit mode
I would like to have a version of OSX that runs alternately in command mode and edit mode.
We may be comparing apples to oranges, but it’s all fruit. There aren’t any vegetables thrown in.
Agreed. Clementine is a tangerine: http://clementine.sourceforge.net/
As a product designer and as one who has extensively used Mac, Windows and Linux, I think that the Mac menu-bar being detached from the application window is a huge usability mistake.
Agreed. And what is most paradoxical about it, as an error, is that the point at which you most need the menu-bar on the application window is exactly when you are running many apps at the same time. Yet this ability was always one of the strengths of the old Apple OS, and there were lots of studies showing how Mac users did use more different apps. How weird that your HIG rules end up being exactly what makes the original unique selling point of the system harder rather than easier to use!
It’s a classic instance of how Apple’s HIGs, which started out in a different era as rules fostering innovation and usability, became, as the world moved on, a sort of dead hand of conservatism. The continued insistence on the single-button mouse was exactly the same sort of thing.
The other classic instance of it is the way in which the guidelines are all designed around one non-tabbing desktop as the way to do things. Whereas in fact the right way in many cases is virtual desktops. If you wrote your HIG with multiple desktops in mind, half of it would have to be thrown out.
Well, they did finally get to multi-button mice, and they have now at last admitted virtual desktops with Spaces, so there is progress, however glacial.
Which error would that be? Assuming it to be true so you could declare it a paradox?
It shows that under a false premise anything can happen…
Very much like our hands… bounds, bounds bounds everywhere…
I thought two mouse buttons were commonplace; sorry, I have never used a single-button mouse on a Mac, although I might as well (sometimes I use just one button for fun – you know, my two-button mouse still has the left one ;)). On Windows, however, a one-button mouse is cumbersome to use…
Please don’t tell that to those who made Leopard; they might get into trouble…
Wake me up when Linux has something worth being called a UI…
I was quite stumped by that comment as well; I know that many Windows/Linux users intuitively maximise the window of the currently running app, but that by no means makes it standard.
I, for one, despise maximized windows and run my apps the size I feel they should be, so that I still have my other windows visible beneath them – and I’d never owned a Mac until a few weeks back, when a friend gave me an obsolete PowerMac G4, so it’s not a habit I’ve carried over or anything like that. I like toying around with OS X, but my main PC’s still running Ubuntu, and my windows are not maximized.
Going back, even on Windows 3.11, I never ran apps maximized – I even resized the main window to make the minimized app icons visible (a crude dock/taskbar?).
Again, maximizing the window of the currently running app is left to the discretion of the user, and I think that’s great. Forcing the user to not be able to do so is… not that great.
Mac Menu bar isn’t detached from the application, the window itself isn’t the window; just look at Photoshop, Dreamweaver, etc. to see what those apps look like; even Finder windows aren’t applications by themselves, so why would you repeat one menu for each one?
Given that the function remains the same, it should be in the same spot so users know where to look when they want to perform an action, in the same way that once you learn how to drive you do it automatically – no need to search for the wheel, because things are where you expect them to be. And where better to place it than on one of the edges of the screen, where you can bump right into it with the mouse?
I don’t want to tell you how to do your job, but invoking the title of ‘product designer’ without even knowing Fitts’ law just makes me wonder how that happened…
Photoshop is still different from Cinema 4D, from Dreamweaver, from iDVD, from Pages… Things which are the same should remain the same, and not be implemented in a multitude of ways just for the sake of it.
Mac Menu bar isn’t detached from the application
Really?
the window itself isn’t the window
If that statement were actually true, OSX might have a slightly bigger usability problem than the detached menu-bar.
just look at Photoshop, Dreamweaver, etc. to see what those apps look like; even Finder windows aren’t applications by themselves, so why would you repeat one menu for each one?
With the menu-bar attached to the window, menus don’t have to be repeated in multi-windowed applications — look at how multi-windowed Photoshop is handled in Windows, look at how multi-windowed Gimp is handled in *nix. And even if a menu is repeated, there is no usability conflict, nor will the computer explode.
Given that the function remains the same, it should be in the same spot so users know where to look when they want to perform an action, in the same way that once you learn how to drive you do it automatically – no need to search for the wheel, because things are where you expect them to be.
Yes. Yes. The subject of spatial memory and conditioning has been covered ad nauseam in this thread (and elsewhere), and the Mac GUI does not have an advantage in that regard.
And where better to place it than on one of the edges of the screen, where you can bump right into it with the mouse?
Having the menu bar on the edge of its application window is better — it prevents a lot of confusion/disorientation, which is usually more important than occasionally taking a split-second longer to hit an application menu.
Often, users interact with the content, window buttons and window borders more than the application menu. Application toolbars and palette buttons can get even more interaction. So, if you really want to take advantage of the “infinitely large” targets on the screen edge, put the toolbars and palette buttons there (some *nix WMs/desktops allow this configuration with certain apps).
By the way, a target isn’t easy to hit just because you put it on the edge of the screen — try hitting on the edge of the screen an “infinitely large” target that is one pixel wide.
I don’t want to tell you how to do your job, but invoking the title of ‘product designer’ without even knowing Fitts’ law just makes me wonder how that happened…
Please stop. The last thing that a forum such as OSNews needs is one more Mac fanboy incessantly barking the term “Fitts’ Law” like a flipping, hyperactive Jack Russell Terrier.
Most Mac fanboys who reference Fitts’ Law don’t really understand usability nor the psycho-motor model postulated by Paul Fitts in 1954 (when there were no computer graphical interfaces).
The notion that the Mac menu-bars are detached at the top of the screen to “comply” with Fitts’ Law is “BS.” According to Fitts’ law, the time to acquire a target is a function of both the distance to it and the aiming precision required. So, with the Mac menu-bar always at the top of the screen, the targets are always the furthest distance away from the work — an OSX detriment.
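Since Fitts’ Law keeps getting invoked in this thread without the formula, here it is in its common Shannon formulation, relating movement time MT to target distance D and target width W:

MT = a + b \log_2\left(\frac{D}{W} + 1\right)

where a and b are constants fitted empirically for a given device and user, and the logarithmic term is the “index of difficulty” in bits. Note that both variables matter: halving the target’s width costs roughly as much time as doubling the distance to it.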
And if Apple were really concerned with making it easier to click on targets, it would enlarge the clickable area of more of its widgets — for instance, the clickable area of the “jelly-blobs” should encompass the full window border outside of the “jelly-blobs.”
Unlike OSX, some OSs/desktops/window-managers are actually designed to take advantage of the screen edge/corners, such as SymphonyOS’s Mezzo desktop (note the corner and edge widgets in this screenshot): http://www.symphonyos.com/ss/sos-2007b.jpg
Photoshop is still different from Cinema4d from DreamWeaver, from iDVD, Pages… Things which are the same should remain the same and not implemented in multitude of ways just for the sake of it.
Okay.
Where it reads “the window itself isn’t the window”, it should instead be read:
the window itself isn’t the app.
Of course, it emulates the behavior of Mac…
BTW, GIMP and usability don’t mix, even in the same paragraph…
You know, the menu happens to be there already…
By the way, a target isn’t easy to hit just because you put it on the edge of the screen — try hitting on the edge of the screen an “infinitely large” target that is one pixel wide.
How do you know? You obviously don’t use it.
No Mac fanboy here; although I’ve been using the menu system since the dawn of time (i.e. the Amiga), I know what feels better, and Fitts’ law just happens to show I’m right.
So, with the Mac menu-bar always at the top of the screen, the targets are always the furthest distance away from the work — an OSX detriment.
Although it is far away, it is reached faster – and it’s not even me saying it, usability tests are…
My corners are working fine, thank you very much; it only seems that you haven’t worked with all Mac OS X has to offer…
cheers
No Mac fanboy here
Sure you aren’t. Nobody is a Mac fanboy.
Of course, it emulates the behavior of Mac…
Perhaps you would care to be specific on exactly how the window configurations of Windows Photoshop and the Gimp emulate the Mac.
BTW, GIMP and usability don’t mix, even in the same paragraph…
Not really. The Gimp just gets a bad rap because a lot of people who try it are already conditioned to Photoshop, and, perhaps, because it lacks a few features. Often, one assumes that a new application has inferior usability because that application doesn’t work like the one with which one is familiar.
So, if you really want to take advantage of the “infinitely large” targets on the screen edge, put the toolbars and palette buttons there (some *nix WMs/desktops allow this configuration with certain apps).
You know, the menu happens to be there already…
In OSX, you are correct — almost everything has already been decided for you and you cannot change it. However, with *nix and Windows, one has many more choices.
although I’ve been using the menu system since the dawn of time (i.e. the Amiga), I know what feels better, and Fitts’ law just happens to show I’m right.
What are you right about? You never seem to make any specific claims.
By the way, GUI menus appeared in the Xerox Alto over a decade before the Amiga. I am not going to bother linking another screenshot — look at the ones posted earlier in this thread.
So, with the Mac menu-bar always at the top of the screen, the targets are always the furthest distance away from the work — an OSX detriment.
Although it is far away, it is reached faster – and it’s not even me saying it, usability tests are…
Perhaps you could reference these tests. Did they test varying distances between the starting position and the targets on the screen edge?
By the way, a target isn’t easy to hit just because you put it on the edge of the screen — try hitting on the edge of the screen an “infinitely large” target that is one pixel wide.
How do you know? You obviously don’t use it.
Well, how about if I put it another way:
I will bet you US$1000.00 that nine out of ten random people cannot, in a single attempt, click on a white, 1-pixel target centered on the top edge of a black, 1024×768 screen, given a standard pointer positioned on the bottom edge of the screen and two seconds (a usability eternity) to accomplish the task.
Care to put your money where your mouth is?
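For what it is worth, plugging the bet’s parameters into the Fitts formulation quoted earlier in this thread suggests the task really is borderline. With D = 768 px and W = 1 px:

ID = \log_2(768/1 + 1) \approx 9.6 \text{ bits}, \qquad MT \approx 0.2 + 0.15 \times 9.6 \approx 1.6 \text{ s}

where a = 0.2 s and b = 0.15 s/bit are illustrative mouse constants of my own choosing, not values from any cited study — that is most of the two-second budget gone before a single correction for overshoot.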
“Sure you aren’t. Nobody is a Mac fanboy. ”
I have a bachelor’s in web design; it’s quite different. I had to learn about usability issues… As for using a Mac, I’ve only had one since January; I don’t have the time needed to be a fanboy.
“Perhaps you would care to be specific on exactly how the window configurations of Windows Photoshop and the Gimp emulate the Mac.”
Who cares about Gimp? Photoshop has a menu at the top of its window which, when maximised, puts its menu at the top, as the Mac does.
“In OSX, you are correct — almost everything has already been decided for you and you cannot change it. However, with *nix and Windows, one has many more choices. ”
OS X is *nix, if you didn’t know. As for having more choices: consistency is a good thing, things shouldn’t run amok, and you shouldn’t have to learn different tools for the same job; you can have choice as long as it stays consistent. If you open a file requester you should be able to choose what kind of file requester it is, but it should be consistent across the whole system – that’s what Explorer and Finder are…
“By the way, GUI menus appeared in the Xerox Alto over a decade before the Amiga. I am not going to bother linking another screenshot — look at the ones posted earlier in this thread.”
Joe User-wise? Who cares if some special cult used/developed it before; they came up with a concept, while the Amiga (and Apple) actually had apps that used that concept. There’s a difference between knowing the path and walking the path…
“Perhaps you could reference these tests. Did they test varying distances between the starting position and the targets on the screen edge? ”
You know the mouse pointer has acceleration, or can be made to… As for the tests, please find them yourself; they will enlighten you.
“I will bet you US$1000.00 that nine out of ten random people cannot, in a single attempt, click on a white, 1-pixel target centered on the top edge of a black, 1024×768 screen, given a standard pointer positioned on the bottom edge of the screen and two seconds (a usability eternity) to accomplish the task. ”
I don’t give a damn about fairy-tale ‘use cases’; I care about real-world use cases. If you come up with one that qualifies, be my guest; otherwise you’re talking to the wrong guy and failing to make a point altogether…
As for using a Mac, I’ve only had one since January; I don’t have the time needed to be a fanboy.
Because of the permeating Apple/Jobs hype and the rabid Mac user-base, many are Mac fanboys before their first purchase of an Apple product.
Even if most Mac fanboys became so after their first Mac purchase, it doesn’t take long before they become frothing-at-the-mouth, Apple-slogan-spewing zombies.
Here is a quote made by a Mac user immediately following a keynote speech by Steve Jobs: “I’ve had a Macintosh now for a total of 35 days, and I’m really excited to be part of the Mac community.” Keep in mind that this gushing mac user merely bought a computer one month prior.
By the way, this quote can be found in this ABC News article: http://abcnews.go.com/Technology/Story?id=2782509&page=2
Who cares about Gimp?
Uh, perhaps the thousands of Gimp users and the hundreds of post-production people in the movie industry who use Cinepaint (a fork of the Gimp) and not Photoshop.
Photoshop has a menu at the top of its window which, when maximised, puts its menu at the top, as the Mac does.
Of course, when Photoshop is not maximized the menu is not at the top of the screen — which is not as Mac does and which is often the case when one has a large monitor.
OS X is *nix, if you didn’t know.
SHAZAM!
As for having more choices: consistency is a good thing, things shouldn’t run amok, and you shouldn’t have to learn different tools for the same job; you can have choice as long as it stays consistent. If you open a file requester you should be able to choose what kind of file requester it is, but it should be consistent across the whole system – that’s what Explorer and Finder are…
Yes. Consistency is important but not always imperative, and, sometimes, it is advantageous to have certain inconsistencies. Consistency comes under the usability heading of “conditioning” which, as I have said, has been thoroughly covered in this thread.
However, before a user gets conditioned to his gui, the user can make a lot of choices about the configuration of the GUI, without detriment. Also, after the conditioning process, users can often make choices about their gui which are improvements.
The Mac gui does not allow a lot of choice, while most other guis do.
“By the way, GUI menus appeared in the Xerox Alto over a decade before the Amiga. I am not going to bother linking another screenshot — look at the ones posted earlier in this thread.”
Joe User-wise? Who cares if some special cult used/developed it before…
Okay. So, you are saying that Xerox is a special, multi-national-corporation-who-revolutionized-the-gui cult.
In regards to “Joe User,” he doesn’t care about the Amiga nor the Mac — he is using Windows (and, soon, probably, Linux: http://www.desktoplinux.com/news/NS8642294935.html).
they came up with a concept [gui menus], while the Amiga (and Apple) actually had apps that used that concept
The insidious Xerox cult must have fabricated these screenshots of pre-Mac/pre-Amiga applications showing hierarchical, gui menus, just to fool Joe User:
http://toastytech.com/guis/altorainbow.jpg
http://www.digibarn.com/friends/curbow/star/2/p4-lg.jpg
there’s a difference between knowing the path and walking the path…
And I am beginning to realize that point more and more, with each one of your posts.
I have a bachelor’s in web design
Congratulations on that.
I had to learn about usability issues…
Judging from your ignorance of Xerox, the program didn’t exactly stress gui usability history. I wonder if there were any other gaps in the curriculum.
“Perhaps you could reference these tests. Did they test varying distances between the starting position and the targets on the screen edge? ”
You know that mouse pointer has acceleration or can be made to… as for the tests, please find them yourself, they will enlighten you.
Who said anything about mouse acceleration?
By the way, I read the studies that you have chosen not to reference, and they found that distance affects time/accuracy in reaching pointer targets positioned on the top of the screen, according to Fitts law.
I don’t giva a damn with fairy ‘use cases’ tales, I care with real world real use cases, if you came up with one which belongs to it be my guest otherwise you’re talking to the wrong guy and failing to make a point altogether…
I see. You only give credence to real world cases, such as your usability “tests.”
I think that most will agree that things get sobering and “real world” almost instantly when one is wagering a serious chunk of money. If you don’t believe me, go into a Las Vegas casino and try to snatch back a lost US$1000 table bet with your argument that the bet was a “fairy ‘use case'” — you will very quickly find yourself in a real-world jail.
So, how about it? As I said earlier in my bet, just because a target is on the screen edge, it doesn’t mean that it can be hit within two seconds. If you think that I am wrong, then you could win an easy US$1000, otherwise, you are admitting that I am right and that you are wrong.
“Judging from your ignorance of Xerox, the program didn’t exactly stress gui usability history. I wonder if there were any other gaps in the curriculum. ”
My ignorance? Since when did Xerox go mainstream? I was talking about implementations that real people used, along with tons of apps. And I DO know Xerox, and I do know that both the Mac and Amiga GUIs were way ahead of Xerox’s functionality-wise; Xerox’s were mainly GUI primitives, concepts…
“I see. You only give credence to real world cases, such as your usability “tests.” ”
Pay the money and I might take the trouble of seeking them out again; otherwise, why bother? It’s you losing out anyway…
(as it was a 2s search, I will give you this one freely: http://www.asktog.com/columns/022DesignedToGiveFitts.html )
“I think that most will agree that things get sobering and “real world” almost instantly when one is wagering a serious chunk of money.”
It would win against a single pixel randomly placed in the center of the screen.
“If you don’t believe me, go into a Las Vegas casino and try to snatch back a lost US$1000 table bet with your argument that the bet was a “fairy ‘use case'” — you will very quickly find yourself in a real-world jail. ”
Who said I was going in for silly bets?
“So, how about it? As I said earlier in my bet, just because a target is on the screen edge, it doesn’t mean that it can be hit within two seconds. ”
Your bet is worth as much as most of what you say. What relevance does one pixel on the screen have when there’s NO implementation relying on it? It has no relevance whatsoever; by being right, what exactly is your point? Is it actually easier to hit a randomly placed dot in the middle of the screen than a randomly placed dot on the edge of the screen? Now that would be something worth betting on, but I’m not into bets; I prefer to let you keep your money.
“I see. You only give credence to real world cases, such as your usability “tests.” ”
Pay the money and I might take the trouble of seeking them out again; otherwise, why bother? It’s you losing out anyway… (as it was a 2s search, I will give you this one freely: http://www.asktog.com/columns/022DesignedToGiveFitts.html )
Well. You certainly linked to a page that I haven’t seen a zillion times already.
You call this a usability “test.” This is just Bruce Tognazzini ranting about basic concepts (and acting like he is the gui god).
Since when did Xerox go mainstream?
Yes. Xerox is a corporate cult. It has never been mainstream.
I was talking about implementations that real people used, along with tons of apps
The Xerox Alto, Star and the Three Rivers PERQ were used by robots? And now you are claiming that the Apple Lisa and the original Mac had more applications than the Alto or the Star or the PERQ?
I do know that both the Mac and Amiga GUIs were way ahead of Xerox’s functionality-wise; Xerox’s were mainly GUI primitives, concepts…
You know that? I don’t. Please explain exactly how the original Mac and Amiga guis were more advanced functionality-wise than the Star (or the PERQ).
It would win against a single pixel randomly placed in the center of the screen.
It probably would. However, you challenged the assertion that a target isn’t easy to hit just because it is on the top edge of the screen.
What relevance does one pixel on the screen have when there’s NO implementation relying on it? It has no relevance whatsoever; by being right, what exactly is your point?
It has relevance in that it will: debunk the common, Mac fanboy notion that the screen-edge is a “magical Fitts’ Law wonderland,” in and of itself; prove that you are wrong; and put an extra US$1000 into my bank account. I can create the implementation.
I am right, that is why you are balking.
By the way, here is an excerpt from the answer to question #9 in the Tognazzini quiz: “The farther away the target is, the larger it must be to retain access speed.” So, my assertion about the importance of target size is correct, even according to the ranting, Mac fanboy usability guru that you cite.
“You call this a usability “test.” This is just Bruce Tognazzini ranting about basic concepts (and acting like he is the gui god). ”
I didn’t call it that; I called it a 2s search…
“The Xerox Alto, Star and the Three Rivers PERQ were used by robots? And now you are claiming that the Apple Lisa and the original Mac had more applications than the Alto or the Star or the PERQ?
You know that? I don’t. Please explain exactly how the original Mac and Amiga guis were more advanced functionality-wise than the Star (or the PERQ). ”
You know what a primitive is? I didn’t call Xerox primitive – the core is there – but where are the thousands of apps that both the Amiga and Mac OS had? I thought so…
“It probably would. However, you challenged the assertion that a target isn’t easy to hit just because it is on the top edge of the screen. ”
The only single-pixel implementations in a GUI are the corners; the menu isn’t 1 pixel wide. Case dismissed – back to your little cave now…
“It has relevance in that it will: debunk the common, Mac fanboy notion that the screen-edge is a “magical Fitts’ Law wonderland,” in and of itself; prove that you are wrong; and put an extra US$1000 into my bank account. I can create the implementation.”
It has no relevance whatsoever; why not call it a zero-pixel target then, it has as much relevance… It proves, if anything, that hitting a single pixel is difficult in itself.
“By the way, here is an excerpt from the answer to question #9 in the Tognazzini quiz: “The farther away the target is, the larger it must be to retain access speed.” So, my assertion about the importance of target size is correct, even according to the ranting, Mac fanboy usability guru that you cite.”
It’s always easier to target a known entity than a changing one. I don’t know who that guy is, although I had seen this article in the past; obviously Google holds him in pretty high regard, as he was amongst the first results in the search I did.
If you have a high resolution monitor, and a bunch of applications open all with the windows sized such that you can see and interact with a bunch of them, don’t you get annoyed at having to go to the top of the screen to do things in the menus? I know I’ve been annoyed when playing with the demo Macs at the Apple stores here in town.
Having a nice, smallish window open in the bottom-right of a 1600×1200 or larger screen, wanting to activate something in the menu, and having to move both my eyes and my cursor out of the window to get to the menu is a royal PITA.
Put the menus in the window, right close by where the cursor is.
If you run all your windows/apps full-screen-ish, then it makes sense to put the menu bar at the top. But if you have many smallish windows all over the screen, it makes a lot more sense to put the menus in the window.
The Mac menu bar made sense on 9″ screens when it first came out. It doesn’t make as much sense on 22″ screens nowadays.
“I know I’ve been annoyed when playing with the demo Macs at the Apple stores here in town. ”
Someone tries the system once, then all of a sudden they think they’ve mastered it, as if they’d always used one… Do you know what acceleration is? Your mouse has it…
“The Mac menu bar made sense on 9″ screens when it first came out. It doesn’t make as much sense on 22″ screens nowadays.”
No problem on my 22″…
Is anyone else suddenly having problems posting here?
Like crazy. They’re moving servers.
I like CDE as well, but the front panel cannot be resized, only moved or minimized.
“I like CDE as well, but the front panel cannot be resized, only moved or minimized.”
I think this is intended. As many things in CDE are designed with good intentions behind them, any action that would be useless (e.g. resizing the front panel “just this way”) cannot be done.
In the CDE “reimplementation” XFCE 3, you can change the size of the icons in the front panel; the panel will resize according to geometry and the number of icons. I’m not sure if the original CDE has this feature too – I’ve got no Solaris/CDE available at the moment, but out of curiosity I’ll check next week.
“Now, this whole dock thing was of course another example of similar people coming up with similar solutions to similar problems in a similar timespan (I need a term for that)”
Sounds similar enough to use the term ‘parallel evolution’ to me.
“Now, this whole dock thing was of course another example of similar people coming up with similar solutions to similar problems in a similar timespan (I need a term for that)”
Sounds similar enough to use the term ‘parallel evolution’ to me.
Yes, it is parallel evolution if “similar timespan” means 1985 (Microsoft dock/taskbar), 1987 (RISC OS icon bar) and 1988 (NeXTSTEP dock).
Oh, sorry – I wasn’t trying to pick flaws in the article or say that people copied Microsoft or whatever; I just didn’t catch that!
Be sure to check out Leopard Docks.
http://www.leoparddocks.com/
You can change how your dock looks on Leopard.
It’s still a manual way of changing it, but I’m sure there will be an App written really soon to do it automagically.
I personally love the Dock in OS X, it’s very handy.
Thom, not knowing where the application is going to be and destroying spatial memory is a carefully constructed design feature.
If the user can’t rely on muscle memory, the user has to stop and think about what needs to happen. It’s like a speed bump, it slows the user down making them think about what they are doing. Since the user is going slower thinking about what they are doing, they make fewer mistakes.
The funny thing is that your three major pains with OS X’s Dock were more or less solved in the days of NeXT.
Positioning issues
Nothing ever moved by itself in the NeXT Dock. Running applications and minimized windows appeared at the bottom of the screen. Each icon found a free spot, and stayed there until it either disappeared or you dragged it somewhere else. As you can see from the screenshot linked below, removing an icon would not make the others reorganize. The Dock itself contained only what you had chosen to drag there, and things stayed in the exact positions you had placed them.
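As a toy illustration of that slot behaviour (my own sketch in Python, not anything resembling NeXT’s actual code):

# Toy model of NeXT-style fixed slots: five positions at the screen edge.
slots = [None] * 5

def next_style_add(icon):
    # A new icon takes the first free slot; existing icons never move.
    for i, occupant in enumerate(slots):
        if occupant is None:
            slots[i] = icon
            return i
    raise RuntimeError("no free slot")

def next_style_remove(icon):
    # Removing an icon leaves a hole; neighbours stay put
    # (an OS X-style dock would instead close ranks and shift everything).
    slots[slots.index(icon)] = None

next_style_add("Mail")         # lands in slot 0
next_style_add("Terminal")     # lands in slot 1
next_style_remove("Mail")      # slot 0 is empty again...
assert slots[1] == "Terminal"  # ...and Terminal has not moved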
Trash icon
The Recycler was by default placed in the bottom slot of the Dock, which meant the lower right corner of the screen. That’s Fitts’ Law kicking in, although Cmd-d is probably faster anyway.
Labels on files and folders
Here the difference becomes bigger because NeXT didn’t show open files and folders in any kind of task list. Only minimized windows would have icons, and they would have a black ribbon at the top with the window title in it, except for file viewers which would have the actual folder names. Shortcuts to files and folders were not placed in the Dock, but in a file viewer’s Shelf, with proper labels. Strictly speaking, the mini windows were not part of the Dock either.
Here’s a screenshot Google was kind enough to come up with: http://homepage.mac.com/troy_stephens/OpenStep/screenShots/OPENSTEP…
I’m not saying the Dock in OPENSTEP was perfect. It could have allowed more than one screen height of icons, and it could have added a layers concept like in the Dock replacement known as the Fiend. But still, I do feel it had some sort of elegance to it, even if it was not dripping with eye candy.
You give examples of hacks for the Dock in OS X – do you have any links to these hacks?
You know after using Quicksilver for a couple of weeks, I’ve found I’ve completely lost interest in these interface debates. I’ve removed everything from my dock and desktop, turn on autohide and only bring it up with a keyboard shortcut for use as an occasional application switcher (although I normally use Quicksilver for that too). Down with the mouse, keyboard interface uber alles!
Anybody know how to keep the dock from popping up on mouse-over? That’s really my only annoyance with the dock at this point. I suppose I could just ditch it altogether by removing the dock app to prevent it from launching at startup.