Bill Weinberg, OSDL Linux evangelist and speaker at next week’s LinuxWorld conference, discusses the state of Linux adoption and its ongoing rivalry with Windows.
Here we have a good discussion without signs of “zealotry” and trolling. Just what the doctor ordered. As for myself, I agree that Linux desktop adoption is slow but will soon be a complete reality (call it Linspire, Xandros, Mandriva, SuSE, Ubuntu, Mepis or whatever). BTW, as of this time I also hear people saying that the only desktop OS is windows (even at the CS department at college), and that Unix-based systems are good only for research, servers and the like (like the Linux cluster they have).
As long as you have rabit zealots (on both sides) who treat a software license as if it were a religion, you’ll get nowhere. In other words, “Your license is immoral, so it must be done away with.” You just can’t argue with that kind of logic.
Good thing we just have rabid zealotry. Rabit zealotry sounds painful…
speak of rabits, linux and windows…
It’s the difference between a carot and a stick
[mis-spelling intentional]
Screw the proprietary Microseft XP system.
When I first started using linux, I wasn’t even thinking about windows as a rival. I use linux over windows because I know more unix than I do windows. And most of us really did not care if linux did not go mainstream, as long as we have our own OS that’s useful to us and lets us do whatever we want with it. Then all of a sudden I read someone at MS saying that linux is like a cancer and un-American.
What is interesting to me is what he cites as the main barrier to adoption. I will continue to believe that the main barrier to adoption is the level of GUI integration between the desktop and the vast majority of functions that are performed on a computer. It’s getting better with Linux, but the focus of Linux is still centered around the command line. This is fine, and even desirable in some ways, for a server setting, but completely unacceptable for a desktop OS. In fact, I would argue the main reason why Microsoft has had any success in the server arena is due to its installed desktop base and how they have made server configuration and management as driven by the GUI as desktop usage.
What’s totally crazy is some people will say that this shouldn’t be the case, that it’s a good thing that there aren’t MCSE-types managing Linux and that the command line is a good filter for technical know-how. I find this attitude abhorrent at best. The main benefit of having such a GUI-accessible and -controllable system is that other applications, such as backup software, DNS servers, mail servers, web servers, have the same level of GUI configurability. It makes it much, much easier to switch between different UIs and be productive without a seriously insane learning curve just to do what needs to be done inside an app. Having a GUI leads to much, much better “feature discovery”, where you can, often largely without documentation, start to figure out how to use a system just by looking through the options visually with a mouse. It’s much easier to learn, and the context switch is very low between different systems.
The same is not the case with Linux and all of its utilities. Samba configuration, sendmail, apache, rpm, etc. are all extremely, extremely difficult to use if you only use them on occasion. It’s almost impossible to remember the advanced switches, and so each time it requires reading through the docs and undertaking a small project just to get something configured and working. Sure, if you’re a web server administrator and you spend all day with Apache, it’s perfectly great. However, each time you have to use a new application and learn a brand new set of tools, it starts to become overwhelming. If you, like me, rely on many software pieces such as apache, mysql, postgres, samba, bind, postfix, cvs, subversion, etc., it becomes a royal PITA to switch contexts so often and remember how to do things. Then, when you combine that with just basic, basic system admin tasks on a Linux system, like init.d, modules… even changing the freaking TIME becomes a complete chore.
Sure, there are other things in Linux that need work, such as easy software installation from third parties, but the major, most fundamental problem in the Linux DNA is how rooted in the command line it is. I think the CLI is a powerful tool, but to continue to deny or go against the GUI paradigm in even the slightest way, even if it’s just subtly ingrained in the culture for historical reasons, is to remain stagnant in the overall personal computing sphere.
Why teach kids how to read analog watches? Just give them digital watches. Or why teach kids how to divide and multiply using pencil and paper? Just give them a calculator.
A GUI may be nice and easy for administration, but you still need to know what you are doing. The CLI on a *nix machine teaches you that. There are tools to admin *nix in a GUI, but most *nix guys do it on the command line because they are more comfortable using it.
[i]A GUI may be nice and easy for administration, but you still need to know what you are doing. The CLI on a *nix machine teaches you that. There are tools to admin *nix in a GUI, but most *nix guys do it on the command line because they are more comfortable using it.[/i]
The problem is not using the GUI or the CLI; the problem is that something like apache requires several hours of learning/experimenting to learn how to configure it half decently. Having a visual “view” of the available options and their interactions is 100 times more intuitive than a 35k text file on a terminal using vi.
I’m not saying whether the IIS GUI is better than apache’s httpd.conf. What you cannot deny is that the learning curve in Linux is sometimes cumbersome just to change a line or two.
“it’s a good thing that there aren’t MCSE-types managing Linux and that the command line is a good filter for technical know-how.”
Good point. I agree with you here. Writing lame GUI’s for every function (especially on a server) is a massive waste of developer skills, time, and talent.
It is better to have a CLI with a GUI built afterwards to handle it as an “automatic user” than a built-in GUI with a CLI to use when scripting is needed, for the simple reason that the most basic interface should be the most flexible, and the CLI is more flexible when it comes to combinations than a GUI.
Going through the menus and guessing what action does what takes as much time as writing
command --help
and reading the explanations (and without guessing at functionality!).
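To illustrate the point, a minimal sketch assuming GNU tools (exact output and available flags vary by platform):

```shell
# Feature discovery from the terminal: every GNU tool answers --help,
# and man -k (a.k.a. apropos) searches the man pages by keyword.
ls --help | head -n 5    # prints a usage summary and the common switches
man -k rename            # lists installed tools related to renaming
```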
An administrator that guesses what the program does instead of KNOWING what the program does is not a good administrator and is more error prone.
GUI is nice for manual work by people that don’t type and read fast. But it is a disaster for automated work.
And a large part of an administrator’s work is automating tasks to prevent them from becoming a burden on his future work.
Any beginning GUI programmer can write a nice interface for a CLI administrated program.
But for building the program itself you need the good programmers.
And that is exactly what Ubuntu and friends do.
They give you the powerful CLI tools, with a nice GUI for easy manual work. But demanding to REPLACE the CLI by GUI is a very unwise request.
> Going through the menus and guessing what action does
> what takes as much time as writing
> command --help
Sorry, but this is a poor example. In a good GUI, pointing at a menu item and pressing the “help” key does the same (yes, a GUI can be coupled with a keyboard – and you could also imagine left-click for selecting, right-click for help or similar things).
> GUI is nice for manual work by people that don’t type
> and read fast.
Then ask a secretary. She’ll probably be able to read and type fast. Would she prefer the GUI or the CLI?
The key point is not reading and typing. It’s about remembering the commands and (sometimes very cryptic) special-character sequences needed to perform some action. Spatial and visual processing in the human mind has developed much better for such tasks.
> But it is a disaster for automated work.
> And a large part of an administrator’s work is
> automating tasks to prevent them from becoming a
> burden on his future work.
Of course the GUI is a disaster for automated work. Automated work means *programming* the computer to do something without your intervention. The GUI isn’t meant for programming, only for user intervention. It is designed with the fact in mind that programming is done with programming languages, and that programming languages and user interfaces are NOT the same.
– Morin
Dude, if you need a GUI to run a server, I feel sorry for your Employer.
Oh, you also need to remember the GUI is not an expressive or scalable interface. Most experienced system administrators would rather work with scripts and command line anyday, than fiddle around with restrictive poorly designed GUIs.
The command line, or shell, is not evil, it is a powerful and expressive interface. Even Microsoft has realized this and are beginning to provide command line interfaces to their services that has existed for decades in Unix.
Finally, every software application has a learning curve. Have you ever tried to use an image editing software or a movie making software or a dvd authoring software, or even a game? Many of these software are GUI based, however, that doesn’t make them easy to use or learn. In my experience they all have long learning curves that need to be complemented with a lot of practice.
Of course setting up a web server, or any kind of server demands some technical knowledge, if not much of it. Servers by their very nature are not designed for Joe n00b to use or setup. You need technical knowledge, experience and some understanding of networking to set up servers correctly. If you think a GUI can mask this, I’ve got news for you.
… and now commenting on the CLI vs. GUI topic
There are several problems in the discussion itself. Firstly, mastering a CLI does not mean technical competence. It requires a certain competence, but competence does not require mastering the CLI. The second problem is that the CLI is a design mistake that should have disappeared long ago. The reason is that the CLI tries to be a *user* interface and a *programming* interface at the same time. The result is that it does neither well. There are both better programming languages and better user interfaces out there. Combining both is bound to fail.
On the other hand, I’m not promoting GUIs here. GUIs have their own special problems, including the fact that they are simply overkill for many simple tasks. Another, very big problem is that almost all GUIs of today are totally dumbed down. They sacrifice features for simplicity, then add weird colors and animations everywhere. Next, some “wizard” is added that never does what you want. “GUI” is not equal to “simple” or “magic”! It’s a user interface, not a lifestyle.
That said, a combination of a powerful programming language (without the burden of being usable as a CLI) combined with a powerful GUI (without the burden of being programmable, and NOT dumbed down) would be better than most systems in existence today.
– Morin
get off the fence mate, what is it you want ?
a decent CLI or a decent GUI ?
Where does Weinberg actually say in the article that Linux shouldn’t war with MS? Personally I think the title is misleading. Linux isn’t at war with MS but competes fiercely and wins terrain at a fast rate, at least in the server rooms. And that rate will dramatically increase as soon as IT managers get a bit more security awareness and realize that MS isn’t really a genuine advantage where on-time patches are concerned.
Commenting TFA first…
> Linux thrives, but should not war with MS
This statement simply disregards the background. Firstly, Linux automatically fights a war in whatever area it is applied, with those companies already present in that area. Linux on servers automatically fights MS products on servers. Linux on the desktop automatically fights MS Windows as well as the whole Mac area. If Linux were to fight nobody, it would simply have to disappear. Secondly, Linux is by its roots a combination of the Linux kernel and the GNU utility programs and libraries. The article thus suggests that Linux divorce from GNU entirely (considering that the attempt to convince RMS to stop his war against proprietary software is simply a waste of time). Such a divorce *will* be painful.
@Morin
That’s where you get it all wrong. GUIs are not expressive and therefore should be “dumbed down”, as you put it. They are supposed to provide a simple interface to the most used functions in an application. The problem with today’s GUIs is that they are trying to push the limits of the GUI paradigm. They are turning an interface designed for simplicity into one that is complex and frustrating.
The shell on the other hand is an expressive interface. They are designed for automation, repetition, prototyping, iterative testing, throw away scripting, etc. They are designed for folks who know what the hell they are doing and who have the knowledge and the wisdom to do it efficiently and effectively e.g System/Network Administrators, Programmers, Advanced Users.
Saying the Shell or CLI should have died years ago is embarrassing. For instance, designing a GUI over Apache or Samba that exposes all its functionality is silly.
There are GUI wrappers available for the majority of functions that anonymous mentions. I don’t think a user is forced to a command line nearly as often as is implied. I don’t _have_ to use the command line on my Linux computer to do what I need to do; when I do, it is by personal choice.
Certainly my wife, and 11 year old daughter, who both have Linux computers, don’t know a thing about the CLI, (literally wouldn’t know where to look for it,) and never have to use it. If Linux were focussed on the command line, how would that be possible?
What anonymous apparently fails to see, though, is that having command line utilities is invaluable when scripting together various utilities and commands into custom functionality that truly shows the power of Linux/Unix. Doing this kind of thing in a GUI-centric environment is not a lot of fun, and digging through a dozen dialog boxes trying to find what you are looking for is tiresome. Maybe it’s not a surprise to see studies which show the average unix/linux admin can effectively manage more boxes than the average Windows admin.
I find it easier to remember the most common switches, and can check man or --help info on the others when need be.
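The kind of throwaway pipeline being described might look like this (the path is illustrative; standard GNU tools assumed):

```shell
# Combine four small tools: list everything under /var/log with its size,
# sort numerically in reverse, and keep only the five biggest entries.
du -a /var/log 2>/dev/null | sort -rn | head -n 5
```

Reproducing this in a GUI file manager would mean clicking through properties dialogs one file at a time.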
Yes, but you setup the Linux computers for them and I bet you do the complex things.
Still, your wife and daughter are not about to setup an Apache web server. My point was that while those products typically have things like a graphical installer on Windows and even a GUI configuration tool, they don’t on Linux!!!! Why do you assume that just because I can figure out how to do it the “Linux way” (i.e. CLI, example config files, searching google groups constantly) that I don’t mind its being so hard?
I was going to comment that in the OS wars, I thought it was completely one sided, with MS doing most of the punching with their TCO ads and self-funded studies. Has anyone seen any lately? I went to some of my usual sites and haven’t seen any. I have noticed that if there are any favorable articles about Linux on ZDNet or eWeek, the MS rabid zealots attack relentlessly. This site is pretty tame, unless there is an OS X article.
On one hand, there exists a wealth of open source middleware, utilities and increasing applications with robust deployments over Windows and Solaris. The most obvious example is AMP [Apache, MySQL and Perl/PHP/Python]. While most commonly deployed as LAMP (Linux plus AMP), there is a growing business for WAMP and SAMP, as evidenced in the growth of companies like SourceLabs in Oregon.
What about BAMP (BSD+AMP)??? Isn’t it fairly widely used too? Last I heard, Yahoo! ran on FreeBSD, whereas Google opted for Linux.
I agree with you. The fact that there is even a debate about whether the GUI or CLI is better shows the problem. It’s very rooted in the philosophy of Unix not to have things configurable via a GUI. Way too elite. Sure, if you have something extremely advanced, you might want a CLI (better yet a programmable interface, a la Windows), but Linux should be taking the lead. Doing things like adding a new device and configuring a piece of hardware are way too hard. It turns everyone into a system administrator, and if you want more people to become comfortable with Linux they have to have control. This is why people like Andrew Morton say it’s not ready for “power users”. Those people like to know the ins and outs of a system, and they use the GUI almost to create a complete visual mental map of the entire system. Then, when someone asks “how do you do x, y, z”, they can literally describe it without even having to see it. That is so, so powerful, and you can’t do it with a CLI. Actually, many times I can help someone remotely without ever using the software myself, just by asking them to describe the options that are available and then making analogies to the way other software works. Please people, wake up!!!!!!!!
Today, the vast majority of people look at a computer as if it were an appliance. It has some features, you can use them in their intended manner.
In reality, computers can do much more. They are programmable devices. Choosing to use a computer as an appliance is a choice. Knowing how to use the tools necessary to use a computer as a fully programmable device is a choice.
The GUI is a limiting assumption on the functionality of the computer. How limiting is dependent on the implementation. Some would argue that the Windows GUI places fewer limitations on usage than does KDE or GNOME. However, these GUIs explicitly include more usage patterns than Windows does.
What tasks are more-or-less “point-and-click” oriented Linux users forced to accomplish at the command line? What are the commands that are particularly hard to use?
I believe that the common complaints about the command line are not really about the command line at all, but about configuration files. Am I right?
It bothers me a little bit when people make examples of what tasks need to be more “friendly” in *nix. Here are some examples we all see:
1) Linux takes at least 2-3 hours to configure everything to work with standard hardware.
2) I hate having to recompile my kernel every time I change my hardware.
Most general purpose linux distributions compile support for a ridiculous amount of hardware as modules, which udev and hotplug will autoload at boot time. This has gotten pretty reliable in the past 2 years. Even hotplugging USB, IEEE1394, and PCI(!) has become automatic with HAL.
What confuses me is that the same people who make these types of comments also (sometimes) appear to be Linux users, so I wonder why their experiences vary so much. I wonder if they are using a modern distribution with a 2.6.x kernel. I wonder if they have hotplug and udev running. I don’t know why people are not taking advantage of these features and instead post about their gripes in forums. I don’t know what the Linux community can do to help these people have a more seamless experience configuring their hardware or adding new hardware, because we are doing about all we can.
Although I can understand why the problem is usually perceived in this fashion. The best way I can devise to get at the real underlying issue is to draw a distinction between, on the one hand, products and, on the other, toolkits.
In the Windows world, products are the norm. A product is a collection of underlying functional components, which are tied together with an eye toward a particular workflow and embodying a set of best practices. The underlying components which make up a product are most likely capable of quite a bit more actual functionality than is exposed within the product (whether by GUI or CLI). In fact, the real value of the product is that it intelligently restricts the raw functionality of underlying components in such a way that the remaining functionality is grouped into well-defined processes, which in turn map to real-world processes that a user may wish to accomplish.
On the other hand, the *NIX tradition is to provide toolkits, which an administrator then uses to build up customized solutions appropriate to their specific site requirements. The traditional appeal of *NIX has been the relative ease with which an admin can piece together components of raw functionality into a solution tailored to their particular needs. The peculiar magic represented by stdout and stdin, coupled with pipes, and all wrapped up in a powerful shell interpreter that supports running premade scripts, can quite easily escape someone whose entire experience with computers has been within the domain where products reign supreme. To someone coming from the world of finished products, *NIX can appear half-finished, at best, and, in the worst view, totally broken.
For example, a Windows admin may judge *NIX to be inadequate for a task because product X, or an equivalent, isn’t available. The typical *NIX admin response upon hearing this would be along the lines of: “Of course *NIX has that functionality. Here’s this script X which I’ve written and it does everything product X does and more.” Then the Windows admin and the *NIX admin both look at one another and smile, each unshakably secure in his belief that he has proved his position beyond any reasonable doubt.
Each approach has its merits, but it has also become rather obvious that if *NIX is to break out of its traditional markets, it will also have to become a suitable platform for deploying products instead of just remaining a powerful toolkit for building solutions. *NIX can remain a toolkit while still being a vehicle for the delivery of prefabricated products.
In the FOSS space, there is quite a bit of potential for commercializing the underlying components by tying them together into more traditional products. Although not commercial products in the traditional sense, XAMPP and the Kolab server are both good examples of what I’m talking about. Neither solution allows anywhere near the same degree of flexibility which would come with building a solution from the component pieces up, but rather ties together a limited subset of that overall functionality into a cohesive whole appropriate to a range of defined usage scenarios.
I don’t think the “toolkit” vs. “product” argument is that critical. The problem exists, but mostly in the minds. Every product is assembled from parts, or said in CS-speak, a system is built using simple tools. In other words, Unix may not itself be a product, but can be the basis for a product. Furthermore, it might be very well possible to expose the built-from-tools view to expert users who want to do magic with their systems.
The basic idea is exactly the same as in “a GUI above the CLI”. I just think that the CLI miserably fails at it. It should read “a GUI above the code”.
> Then all of a sudden I read someone at MS saying that
> linux is like a cancer and un-American.
You forgot to mention the CEO of Red Hat said to stick to Windows on the desktop. So who’s right?
RE: Mystilleef
> That’s where you get it all wrong. GUIs are not
> expressive and therefore should be “dumbed down”
> as you put it. There are supposed to provide a
> simple interface to the most used functions in an
> application.
Originally, GUIs were invented to provide an interface to ALL functions in a system. And GUIs are, in fact, *very* expressive if done right. Imagine a command-line interface for Photoshop…
> The shell on the other hand is an expressive
> interface. They are designed for automation,
> repetition, prototyping, iterative testing,
> throw away scripting, etc.
In other words: The CLI is a *programming* interface, not a *user* interface. That’s exactly what I’m trying to say. A user interface wouldn’t need sequencing, loop statements, conditionals, regular expressions and whatever – that’s *programming*. By separating programming and the user interface, one could add to both independently.
> Saying the Shell or CLI should have died years ago is
> embarrasing. For instance, designing a GUI over
> Apache or Samba that exposes all its functionality
> is silly.
Thanks for throwing around magic words like “embarrassing” and “silly”. Now could you also explain why a GUI over Apache and Samba is beyond your vision? Take into account also that I consider anything beyond configuration and *using* the functionality (for example, scripted reactions to incoming HTTP requests) to be programming, and thus suggest using neither a CLI nor a GUI for it, but a real programming language (a lightweight one though, intended for scripting).
RE: Anonymous (IP: 24.156.177.—)
> What anonymous apparently fails to see though is
> having command line utilities is invaluable when
> scripting together various utilities and command into
> custom functionality that truly show the power of
> Linux/Unix.
I guess that was also directed at me since I forgot to add my name to my comment.
You mentioned here the use of the CLI as a programming tool. Yes, I do not understand in what way a CLI, together with CLI tools, is superior to a programming language with an equivalent set of libraries. The latter (if done right) is able to give me the same functionality, at roughly the same code size (yes, you do add some extra characters), with a nice bonus such as load-time (static) type checking with type inference, lots of control structures, and of course the whole set of standard libraries present in said language. I’m not arguing for language X here, but merely for the presence of such features in other programming languages that don’t even try to also be a user interface.
– Morin
> In other words: The CLI is a *programming* interface, not a *user* interface. That’s exactly what I’m trying to say. A user interface wouldn’t need sequencing, loop statements, conditionals, regular expressions and whatever – that’s *programming*. By separating programming and the user interface, one could add to both independently.
How would you, in a plain Windows install, do the following (a mass rename):
for F in *.jpg ; do mv "$F" "$F.jpeg" ; done
Clicking and typing 100 times the same? Does not sound good. Go to the web and search for a special program to do it? Does not sound good either.
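For what it’s worth, a POSIX sh variant of that loop which swaps the extension instead of appending one (the filenames are illustrative):

```shell
# "${F%.jpg}" strips the old suffix, so photo.jpg becomes photo.jpeg
# rather than photo.jpg.jpeg.
for F in *.jpg; do
  mv -- "$F" "${F%.jpg}.jpeg"
done
```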
GUIs and CLIs are different and for a good reason. There are programs, which only make sense in GUI – like Photoshop. On the other hand, CLIs give you more power (see above example).
You cannot pipe GUI programs together.
You can barely use regular expressions in GUIs (this can be an often-used functionality during server maintenance). I know this is not inherent to GUIs, but still a valid point.
The biggest problem with GUIs is that they limit what they allow you to do. Imagine a GUI for Apache. It allows you to set every parameter and does so in a clear and understandable way. Radio buttons for boolean values, etc. Then a new version of Apache appears and has new parameters. Your GUI can’t set them, because it is not prepared for them. Someone has to fix it. On the other hand, using a text editor to edit httpd.conf, you can use the new parameters immediately.
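As a sketch of that point: a directive that ships with a newer Apache can be typed into httpd.conf the day you upgrade (ServerLimit and MaxClients are real Apache 2.x MPM directives; the values here are purely illustrative):

```apache
# httpd.conf — no GUI update needed to reach the new knobs:
ServerLimit 256
MaxClients  256
```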
Also, to use a CLI version of a program (or edit a text configuration file), you need much less resources (cpu, ram, network bandwidth) than to do the same through a GUI. If I had an Apache server to care for, it would be away from me – in a server room. I’d rather control it through SSH (which is usable even on a GPRS dial-up connection) than through a GUI, which would require a better network connection and “real” computer. I could run SSH on my Palm (when I had one). No real chance of running VNC or Remote Desktop on it.
At work, we upgrade database systems for our customers. Updates can be installed either through a GUI installer (requires ca. 20-30 clicks during different parts of the 20-minute process) or through a CLI installer, into which you feed a “response file” (which is always the same). The first few times, I used the GUI way. Now, I prefer the CLI one. I just don’t want to click the same things again and again if I can run just one command and have the computer do all the work! The computer is the tool, not me. It should do the boring work, not me!
GUIs have their use in the computer world. Server configuration and maintenance, it is not.
Server administration requires some knowledge. I would not have much confidence in a person who claims to have that knowledge and is unable to use a command line and a text editor to apply it.
> How would you do, in a plain Windows install do the
> following (a mass-rename):
Windows is one of the worst GUIs in the world. So this doesn’t really surprise me.
> You cannot pipe GUI programs together.
Yes, because GUI programs are regarded as applications and not components. I know very well what you mean – again and again I have wanted to combine GUI programs and it wasn’t possible. But the key is that I shouldn’t combine applications but components. The sad part of the story is that on most GUI systems, there are only applications and no components. This is something that has to change to give GUIs real power.
Piping could also be replaced by GUI gestures, for example dragging and dropping a HTML document on some HTML2ASCII converter application, the result on a word counter application, the result on the calculator application, with the obvious meaning to count the words in the pure text form of the document and entering that number in the calculator. Always with a click on such an application meaning “drop this item and pick up the result”. Of course, in the spirit of the above paragraph, for example the HTML2ASCII “slot” in the GUI is programmed as a one-liner, combining a HTML2ASCII component and some “slot-maker” component.
> I know this is not inherent to GUIs, but still a
> valid point.
But it *is* possible in GUIs. Of course many GUIs don’t get it right. So do many CLIs.
> Then a new version of Apache appears and has new
> parameters. Your GUI can’t set them, because it is
> not prepared for it. Someone has to fix it. On the
> other hand, using a text editor to edit httpd.conf,
> you can use the new parameters immediately.
But a text editor is, in its user interface, inferior to a configuration utility (because it allows entering invalid values, for example). I’m suggesting auto-generated configuration utilities for such cases (generating loader code, run-time data structures, validators, pretty printers and configuration editors from a single format description). That way, they cannot be outdated (since the application’s loader code would be outdated too).
> Also, to use a CLI version of a program (or edit a
> text configuration file), you need much less
> resources (cpu, ram, network bandwidth) than to do
> the same through a GUI.
Agreed. The CLI is probably one of the least resource-hungry UIs.
> Server administration requires some knowledge. I
> would not have much confidence in a person who
> claims to have that knowledge and is unable to use a
> command line and a text editor to apply this
> knowledge.
Agreed – CLIs as an “idiot filter” have probably done their job well.
> I’m suggesting auto-generated configuration utilities for such cases (generating loader code, run-time data structures, validators, pretty printers and configuration editors from a single format description). That way, they cannot be outdated (since the application’s loader code would be outdated too).
This sounds very good. However, “backporting” this to existing servers could pose a problem.
> This sounds very good. However, “backporting” this to
> existing servers could pose a problem.
Hehe… whenever I come up with such ideas, they seem to be totally incompatible with existing stuff.
– Morin
K3b is a wonderful example of a GUI on Linux (at least for me). I have absolutely no desire to figure out the very long command strings I’d need to burn a CD.
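For comparison, this is roughly the two-step incantation K3b wraps (a sketch from memory of the classic mkisofs/cdrecord pair; the volume label, device name and speed are illustrative and vary per machine):

```shell
# Build an ISO image with Rock Ridge and Joliet extensions...
mkisofs -r -J -V "backup" -o /tmp/backup.iso ~/burn/
# ...then burn it. dev= and speed= depend on the drive; see cdrecord -scanbus.
cdrecord -v dev=/dev/hdc speed=8 /tmp/backup.iso
```
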
It nicely encapsulates a wide range of functionality.
“Yes, but you setup the Linux computers for them and I bet you do the complex things. ”
Like most PCs come with Windows pre-installed. And I rarely have to use the CLI on their PCs either.
If one has to choose yes or no, or choose between already known choices, or choose a number, or choose one of n responses, the thing can be done with a GUI. If one has to alter the flow of events or add new possibilities, there is a need for CLI, programming or scripting.
The problem with Linux is “inability through obscurity.” For example, sshd has some dozens of yes/no toggles in /etc/ssh/sshd_config. By default, all the choices are listed, but for documentation one has to look elsewhere. Most of the time no choices are present at all. Just an empty file. “Google the name of this file and see what outdated HOWTOs and FAQs you can find.”
To set up an NFS server, one has to study, study, study and study. One GUI tool could accomplish that much in five minutes, while explaining and showing the user just as much detail. The user would immediately know most of the possibilities, even all of them if the tool is current enough.
There are many good reasons why a server must be configurable over a serial link or a remote terminal. But an ordinary user has no way to enable remote logins other than learning to edit sshd_config. Why couldn’t there be an easy GUI tool for editing ssh_config and sshd_config? And maybe a dozen other files in /etc?
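For reference, a commented sshd_config fragment of the kind such a tool would manage; the values here are illustrative, not a recommendation:

```
# /etc/ssh/sshd_config (fragment) - each line is a toggle or value
# that a GUI could present as a labeled control with its documentation
Port 22
PermitRootLogin no
PasswordAuthentication yes
X11Forwarding no
```

Every one of these maps naturally to a checkbox or a numeric field, with the man-page text one tooltip away.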
The problem is that the GUI is not used enough. The capabilities a GUI could bring to configuring Linux are enormous. There is no need to throw away the CLI if the two can be combined in a compatible fashion. I don’t get why some blockhead insists on discussing httpd.conf at the expense of the real issue. Why this useless blathering about the GUI being “bad” or “good”, when it is used for less than 10% of what it is good for?
Look at BeOS, look at SkyOS, look at the Macintosh, and see what Linux is missing. This is what it is all about. Controlling the low-level stuff, without the need to learn obscure spells for .asoundrc and loads of other config files.
The feature is useless if I don’t know the incantation, or am too angry to read a bit about it here and another bit there, try out whether my configuration matches the author’s, and then troubleshoot it by hunting for more bits and pieces of advice.
A GUI tool could give me 80% of the power of .asoundrc and the other configuration files, and that would be more than I’d ever need. The CLI is not giving me 100%, but maybe 10%, because I don’t want to spend hours upon hours trying and testing all the combinations to see what works best.
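To make the point concrete, here is roughly what the .asoundrc “spell” looks like for the common case of pointing everything at the first sound card (a minimal sketch of ALSA’s configuration syntax):

```
# ~/.asoundrc - send default playback and mixer control to card 0
pcm.!default {
    type hw
    card 0
}
ctl.!default {
    type hw
    card 0
}
```

Two stanzas for one idea; a sound-preferences dialog could write this from a single drop-down.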
> To set up a NFS server, one has to study, study, study and study.
Are you implying one should be able to run an NFS server without knowing what he is doing? I, for one, don’t want clueless people running servers.
The CLI gives you 100% in the sense that it hides nothing from you. A GUI hides everything that a) it decides to hide, or b) did not exist when the GUI was built but exists now.
I suggest you read my previous entry about the other complications GUI brings with it.
As for the NFS configuration: mine, very simple, looks like this:
$ cat /etc/exports
# /etc/exports: the access control list for filesystems which may be exported
# to NFS clients. See exports(5).
/data 192.168.1.0/255.255.255.0(async,rw,root_squash,anonuid=65534,anongid=65534)
* exported directory – I see no significant difference between typing the directory name and navigating to it through an “Open File” dialog. Actually, I think typing can be faster in some circumstances.
* allowed IP addresses – in every GUI I’ve seen, this is something you type. So no help there either.
* options – OK, you could handle the first three with check boxes, but you would have to type the others anyway. Unless your GUI would give you a pop-up list with 65535 lines in it (ugh!).
If you want to complain about “but you have to know what to type!” –
a) of course. You are running a server. You should know what you are doing. Or would a GUI magically teach you about networks the first time you see it?
b) The configuration file mentions where to find information – in the OS’s built-in help system – the manual pages. No googling necessary.
Creating GUIs is far more time-intensive than a CLI or a text config file. Since an interface does not directly contribute functionality, developer resources are better spent on other things.
There is nothing inherently more difficult about a CLI than a GUI. One just has to get into the right mindset. It’s not really that difficult to remember syntax or directives, considering they are named to be as sensible as possible to work with. If there’s any difficulty in deciphering what something means, it’s better naming or more inline documentation that’s needed, not a GUI.
> It’s not really that difficult to remember syntax or
> directives considering they are named to be as sensible
> as possible to work with.
Yes, but it’s an additional informational burden to remember. The burden is smaller with GUIs, because spatial memory is involved, which is far better developed.