In this lesson of the Clueless Computer User series, Ed Hurst discusses more about stability and interface issues. A popular buzzword these days is “interface”. That’s just a fancy word implying that two or more people are face to face. In actual practice, it usually means anything but face to face. It’s simply a means of interacting with something else; you are said to “interface” by some means. So it is with computers.
In addition to CLI and GUI there’s also TUI, which stands for “Terminal User Interface”. Personally, I’d like to call TUIs “keyboard-navigable GUIs” to separate them from “mouse-navigable GUIs”, but the fact remains that you have to run TUIs from a terminal. Midnight Commander is an example of a TUI application.
IMO, many TUI system-configuration utilities are faster and more stable than their GUI counterparts. Take, for instance, Synaptic, the GUI package manager in Debian: besides grouping packages in an unconventional way, it has been very buggy, and its search function is dog slow. Compare it to the Aptitude TUI, which has the same functionality as Synaptic but has been more stable and is lightning fast by comparison.
In FreeBSD, Portsman is an excellent TUI for managing ports.
“That’s because the electronic computer was based on — surprise! — electricity. With electricity, it’s a simple matter of on or off.”
Actually, it can be in between. A computer could be based on ternary logic (or any other base) instead of binary. But we use binary because it’s been calculated to be the most efficient.
When I said binary was more efficient, I didn’t clarify. It’s known that base e has an economic claim to being the most efficient representation: the cost of writing a number, measured as (number of digit values) times (number of digits), is minimized at base e, and 3 is closer to e than 2 is. This ignores the actual engineering, though, where base 2 takes the lead in efficiency.
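For the curious, here is a minimal sketch of that “radix economy” measure (my own Python illustration, nothing from the article), where the cost of representing a number n is modeled as the base times the number of digit positions needed:

import math

# Radix economy: hardware cost of writing numbers up to n in a given base,
# modeled as (digit values per position) * (positions needed).
def radix_economy(base, n):
    positions = math.log(n) / math.log(base)  # continuous digit count
    return base * positions

n = 10**6
for base in (2, 3, 4, 10):
    print(f"base {base:2d}: cost {radix_economy(base, n):.1f}")

Base 3 comes out slightly ahead of base 2, and the continuous optimum sits at base e (roughly 2.718); the engineering realities mentioned above are what swing things back to binary.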
//or i’m totally wrong about everything and should shut up
Um, I thought computers were binary at the lowest level because switches can be either on or off (1 or 0). If you have a ternary base, what states would the circuit have? On, off, sort-of on? Please clarify.
Instead of ON/OFF, you have HIGH/MEDIUM/LOW. The first computer based on this was the Setun, created in Russia in 1958. A few others have been built since then. There’s even a university project devoted to creating modern ternary microprocessors, but I can’t remember which school it was at. (I could keep looking if you like.) Of course, a project like that involves creating unusual transistors, new adders and multipliers, language support… It’s involved.
A six-part American Scientist series from 2001: http://www.americanscientist.org/template/AssetDetail/assetid/14405…
What looks like a FAQ: http://xyzzy.freeshell.org/trinary/
A volunteer project to eventually design such chips: http://www.trinary.cc/Tutorial/Tutorial.htm (has some circuits, but it doesn’t look like it has gotten very far yet)
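For what it’s worth, the Setun used balanced ternary, with digits -1, 0 and +1 rather than 0, 1, 2. Here’s a rough Python sketch of my own (not taken from any of the projects linked above) of how an integer maps to balanced-ternary digits:

def to_balanced_ternary(n):
    """Balanced-ternary digits of n, least significant first; each digit is -1, 0 or +1."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:            # treat remainder 2 as digit -1 and carry 1
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits

print(to_balanced_ternary(10))   # [1, 0, 1] -> 1*1 + 0*3 + 1*9 = 10
print(to_balanced_ternary(-7))   # negatives need no separate sign bit

One nice property on display in that last line: negative numbers fall out naturally, which is one of the things balanced ternary is often praised for.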
That ternary stuff’s a lot more interesting than this article!
I read some time ago about a company that was working on a flash-like memory technology. Instead of silicon (as in Flash memory), they used some organic ‘designer molecule’ that could absorb or release single electrons.
Now here’s the trick: they further developed that molecule so that it could absorb or release not one but two, three, or four electrons (maybe more; I don’t have a URL anymore, sorry). So with 4 possible states, a single molecule could store 2 bits of information.
Actually, they used small groups of those molecules for redundancy and to compensate for production errors.
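Just to spell out the arithmetic behind that claim (my own illustration, not from the article I read): a cell that can reliably hold L distinguishable states stores log2(L) bits, which is the same idea behind multi-level flash cells.

import math

# Bits stored by a single cell that can hold `levels` distinguishable states.
def bits_per_cell(levels):
    return math.log2(levels)

for levels in (2, 4, 8, 16):
    print(f"{levels:2d} states -> {bits_per_cell(levels):g} bits per cell")

So 4 states per molecule gives exactly the 2 bits mentioned above.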
Digital circuits with more than two states were researched ages ago; I guess the 0/1 kind simply turned out to be easier, more reliable, and cheaper to build and operate.
It’s not on or off at the hardware level; it’s high or low voltage states. Low doesn’t always equate to 0, either. It’s always faster to drain a line to ground than it is to raise it up, so it just depends on the problem at hand.
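To illustrate that last point with a toy example of my own (the voltages and threshold here are made up): whether a low level reads as 0 or 1 is purely a convention, usually called active-high versus active-low.

# Toy sketch: the same electrical level can mean 0 or 1 depending on convention.
LOW_V, HIGH_V = 0.2, 3.1   # example voltages; purely illustrative
THRESHOLD = 1.5

def read_bit(voltage, active_low=False):
    is_high = voltage > THRESHOLD
    return int(not is_high) if active_low else int(is_high)

print(read_bit(LOW_V))                    # 0 under the usual active-high convention
print(read_bit(LOW_V, active_low=True))   # 1 on an active-low line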
Binary is used for no other reason than that it’s the easiest to implement.