The article assumes the command-line utilities are to be used on a *nix system (e.g. GNU/Linux, BSD, Mac OS X, Unix), and it frequently references common tools on such systems.
If you are only going to consider *nix then, IMHO, you are on a mission that is doomed to failure.
If you are going to provide guidelines on what should go into a decent CLI then you MUST consider other systems. By failing to do that you are setting out on a marathon with one leg in plaster.
For example,
As I stated before, the hierarchical help system used in the VMS DCL environment is wonderful. It makes a flat help system redundant. So why can’t we divorce the help system from the application?
Supply the binary and the help file that is accessible through the generic shell help widget (or whatever).
Let’s get away from the limitations imposed on us by the original Unix devs, who were working on very slooooowwwww devices (e.g. Teletypes like the ASR33 at 110 baud). Times have changed.
So should the command line.
Now I’ll take cover because I expect to get blown away.
I hate info pages. I find manpages much easier to manage. It’s for this reason that when I find documentation or guides or books online, I always go for the “entire manual in a single PDF/HTML page” option instead of the piece-by-piece option. It’s much easier to scroll around or use search than navigate a hierarchy that someone else devised and that may not actually organize things in a way that helps *you* find what you need.
The Unix shell is superlatively wonderful because it comes with an extraordinarily powerful set of tools. Yet using it as an example of how to design command-line interfaces, well, that’s just insane.
The Unix shell can afford to be more descriptive. This is true for how users issue commands (command completion means that no one’s going to develop RSI from using ‘list-files’ instead of ‘ls’ or ‘--force’ instead of ‘-f’), but it is also true for how the program responds to the user (since it is important to display progress for time-consuming or complex processes).
The software can also afford to be more interactive. Yes, you need to be able to turn off interactivity for scripting. On the other hand, there are times when the software needs to ask. If it doesn’t ask, it either fails to do what it’s supposed to (e.g. ‘rm’) or it defaults to a destructive action (e.g. ‘rm -f’).
Those are just two examples stolen from the article, but the reality is that there are many ways to improve command line interfaces. The problem is that we live in a ghetto where the old way is the right way, so there hasn’t been much progress over the past 20 to 30 years.
To paraphrase a character from ‘Little Britain’:
Yes. But No But Yes But…
There has been an alternative to the Unix shell that has been around for 30+ years.
As you said, no one is going to get RSI from using
copy/log
instead of
cp -v
But it goes way beyond this.
The Unix shell is, as pretty well everyone agrees, a powerful tool. So is the VMS DCL shell.
Both have their +ve’s and -ve’s.
However, in 30+ years of using both, I’ve found the DCL shell far more natural and a pleasure to use.
Now I am the first to admit that people moving to it from Bash etc are mystified by it.
Once you reset your perceptions it becomes perfectly logical.
What I’m trying to say is that we can do a lot better than the Unix shell.
I agree that using it as a reference point in CLI design is just plain crazy.
I’d expect that if you asked this question of the original Unix devs they’d give you the country mile answer.
viz, ‘I wouldn’t start from here’.
It is a pity that more people aren’t interested in moving the CLI forward.
Lately I have been using PowerShell and it is quite powerful, merging Unix concepts with .NET and an object pipeline.
It might have its quirks, but it sure is a whole lot better than using command.com or cmd.exe.
I wouldn’t call it pleasure. It’s more consistent, yes (e.g. only one way to pass options) but highly unorthogonal too. SET and SHOW are the best examples — it’s the same old idea “let’s make it feel like English” that begat COBOL.
Actually, my two issues with PowerShell are the stupid ps1 extension even for PowerShell 2.0 scripts, and the way you need to sprinkle your code with .NET annotations when creating scripts.
I use it as a better shell that is Windows native, but when the time comes to write scripts I prefer Python or F#.
I was referring to DCL, not PowerShell. Maybe I’m a bit unfair and what I actually don’t like is VMS underneath DCL. I definitely don’t like its kitchen-sink philosophy, though I’m impressed as hell by its stability, security features and clustering capabilities. OTOH OpenBSD has those too, in addition to a much saner environment.
“I have always wanted a program that shows a movie in my terminal by converting it to ASCII art in real-time, that would be sweet.”
That’s easily achieved using MPlayer or VLC; these play video in the TTYs pretty well.
If ASCII is really desired, ‘-vo libcaca’ or something like that would do the trick, I think. If I remember well, caca got colored ASCII, so much fun can be had.
E.g. you could display /dev/video0 (probably your webcam) in colored ASCII in tty1 if you like. :p
Hehe, that question has been answered 9 times in the comments on the original site, including a comment about the number of times it has been answered.
righard, I responded here before I finished the article, so I thought I was the first one mentioning it.
Looks like my memory failed me a little too..
It should of course be “-vo caca” not libcaca.
It’s a pretty fun thing to do though, especially these days when fancy GUIs are almost mandatory everywhere.
To make sure, I wasn’t complaining or anything, I just thought it funny.
I admit, it’s fun to think about playing video on character-cell displays. Yet I also think that many people mistake CLIs for something that lives only within a text terminal.
A less limiting version of the CLI would allow for graphical displays. Take something like gnuplot: it definitely has a command-driven interface, but it also pops open a graphical window to display a graph. ImageMagick is probably a better example though, since I seem to recall it fitting into the shell better (thus working as part of a coherent system), and it still allows for graphical display.
We need to accept that hybrid of text and graphics, or the CLI is doomed to die. Very few people want to play with a masochistic geek toy after all.
MacTO,
“I admit, it’s fun to think about playing video on character-cell displays. Yet I also think that many people mistake CLIs for something that lives only within a text terminal.
A less limiting version of the CLI would allow for graphical displays.”
Yes, this is actually what I meant by my maps query-interface example under the CLI article. The GUI part should remain, but a CLI query engine could be added, something like AutoCAD or SQL software uses.
The CLI should function with or without the GUI portion to give the user maximum flexibility.
I think that your comment reflects where the CLI vs. GUI debate gets complicated.
I get the distinct impression that there are two major camps that support the CLI. There are the enthusiasts, who seem to be enthralled by the novelty (though they may not admit as much, even to themselves), and then there are systems administrators who appreciate being able to manage and script systems in the same language (which includes the SQL crowd).
There are, of course, oddities. The AutoCAD people probably fit into that group (my recollection is that it uses a variant of LISP). I’m originally from the physical sciences where LaTeX is the medium of publication (a markup language which integrates moderately well into CLI) and that continually discusses data pipelines (which fits into the Unix philosophy for CLI utilities fairly well).
Now my highly biased opinion is that those ‘oddities’ hold the best hope of CLIs regaining a respectable foothold in the industry. A big part of the reason is that CLIs are a necessity: scientists and engineers require software systems that are so complex that it is difficult to create a manageable GUI. Yet another reason is the diversity of the CLIs presented to those users. Some, like Unix, are based upon commands and parameters and pipes. Others, like IRAF, are based upon similar principles but offer stateful and visual ways to manage parameters (my understanding is that it’s based upon VMS, though I don’t know how similar it is to VMS). Even environments like Mathematica or Maple should enter the picture since, even though they are specialized and standalone applications, they are command driven at their core.
And the list could go on. The only caveat is that most of their programming experience is limited to data processing.
Though your comment on SQL brought up another point: CLIs are much more common than most people think. In essence, any interactive programming language can be used as a CLI (e.g. Python).
I agree 100%. My understanding from other comments here is that people think CLI = less GUI. It shouldn’t be. You can have CLIs in the form of terminals flying around all over your favorite desktop environment, express yourself with the power of CLI commands in any of those terminal windows, while watching high-definition videos (or 3D videos) at the same time. This is the future.
We can’t just revert to the old dumb terminal of the ancient Unix days.
Maple, Mathematica, and other CASes are very good examples of such graphical CLI programs, which I don’t see going GUI-only anytime soon.
Actually, consider the address bar of your browser. It’s a pretty simple CLI now, but is in many ways evolving – particularly since you can already run JavaScript in it. Or for that matter, the Firebug console, for a more powerful example…
Broaden your perspective beyond *nix.
Cryptic commands with a non-uniform naming convention and “switches” are archaic, a symptom of an environment which was built piecemeal rather than designed.
Check out the command line environment used by the IBM i.
http://publib.boulder.ibm.com/infocenter/iseries/v7r1m0/index.jsp?t…
lol
I’m not going to defend some of the choices in Unix land, but what you linked to is hardly any more sane. Also, the attempt to turn it into English (a common disease in certain parts of the computer world) is disheartening.
I don’t agree with the part that said you should always add long versions of options. The reason being that, for one, it is much more efficient to use a switch case with single characters like ‘v’ instead of doing a strcmp(“verbose”, argv[x]). Just look at the difference between the short and long versions of getopt to get the picture.
Also, in general, although Unix programs seemingly do one thing and do it well, they contain a lot of bloat. Just look at all the available options for ls as an example.
There is also the issue of output. Any seasoned user knows how to redirect stdout to a file (using ‘>’, for those who don’t). This means there is no need to supply an option saying where to write the result of a program. It should always go to stdout by default.
Another issue that this article didn’t talk about is programs that act as wrappers around subfunctions. Take git as an example: you can use git add or git commit or git clone, for instance. In this case git acts as the name of the wrapper and the keyword following it is the function it runs. This approach was probably used because different functions require different options. It is a clever way of grouping many functions together into one program. You might think this is bloated, but by clearly defining a subfunction using a keyword that doesn’t begin with ‘-’, it clearly separates the intended function from the options. I can accept that. The other option would have been to have many programs called git-add or git-commit, which would in most cases require duplication of code.
Unless it outputs more than one file. Extracting an archive, for example, or running a codegen tool.
It’s old and is missing some of the more modern features found in other CLIs, but a lot can be said for the AmigaDOS shell. With a standard API for interpreting arguments, all tools can follow the same pattern of arguments, can be optionally case sensitive, and I find it to be a good balance between the cryptic arguments of *nix and being too “English”.

One thing that I’ve always liked is the way it supports a template, which prompts the user with a short list of possible arguments, rather than reading man pages or whatever. For example, the copy command. If you type “copy ?” at the prompt, it gives you this: “FROM/M,TO/A,ALL/S,QUIET/S,BUF=BUFFER/K/N,CLONE/S,DATES/S,NOPRO/S,COM/S,NOREQ/S” and lets you just type the arguments you need without typing copy again.

/S says that argument is a switch, /A says it’s always required, /M means it accepts multiple files or objects, /K means it’s a keyword and /N means that keyword takes a numerical argument. BUF=BUFFER shows that the command accepts the keyword “buf” as a shortened version of “buffer” for those who use it frequently, to save a bit of typing.
So it’s easy to see that copy supports a command line like “copy sys:testfile ram: quiet noreq” which copies the testfile to the RAM: drive, suppresses verbose and suppresses “overwrite?” requests.
That sort of thing would be very handy in a new CLI for the more complicated tools!