“Today’s operating systems are conceptually upside-down. They developed the hard way, gradually struggling upwards from the machinery (processors, memory, disks and displays) toward the user. In the future, operating systems and information management tools will grow top-down.” Read the editorial at ComputerWorld by David Gelernter (CS professor at Yale and chief scientist at ScopeWare).
I don’t know. Sounds like he is describing a souped-up file system more than an OS. An OS also has to interact with hardware as much as provide a platform for software. Maybe something like a program that searches for keywords. But his argument is flawed. The file in his example is no better than just sticking all the relevant information in the same directory. If I’m running a business, why would I scatter relevant data in different files all over my hard drive per customer/client/product, etc.? Most database software can already do what he is describing, only he wants to make it an entire OS. I don’t think the average user would need something so all-inclusive for their computer.
He writes papers just for the sake of writing papers, as if it wouldn’t be more annoying to rewrite every app his way and train everybody in the world to use it.
In fact, he doesn’t even say how to do it his way. I know how, and it’s called AI, and it’s more work than he thinks. Also, how can you ignore the bottom-up approach? Does he want the processor to be built as a last step, according to whatever the top layers require? If not, he will have some code that can’t be run because it can’t be hooked to the real-world hardware.
It still fascinates me how easily “technologists” these days can dismiss the hardware in favor of a less dirty, software-only world. Reminds me of the pure-energy creatures from Star Trek.
I blame that on the all-simulator, no-more-breadboard experimentation philosophy that has been going on in universities for quite a while now.
And that guy is a CS professor? Sheesh, at least sci-fi movies try to give us a date in their opening.
Totally agree with sz3344d.
I don’t see the point of the text as far as the OS is concerned. All I see is just some sort of file organization. And even then, the idea is neither revolutionary nor far from what databases can already do.
It’s a slow day … 🙂
This guy seems really excited about Longhorn. I wonder if he knows that any good features from Vision will be lovingly “borrowed” for inclusion in MS’s next release.
Anyone actually give Vision a try? It looks hokey as hell to me and I don’t feel like setting up an email addy to get the serial, but maybe someone here has had a good experience?
Its database wouldn’t work in my case. I know it works for some people, but after more than ten setups and attempts at creating the database with no result, maybe next beta.
I agree with AlienSoldier that Gelernter’s criticism saying everything is wrong sounds a bit cheap, and the 1946 example is disastrous. Not his best article on the subject. It all comes down to “let’s wait for Longhorn.” I guess he is just frustrated, like most people, by the lack of innovation in UI, but his ScopeVision (formerly Lifestreams) isn’t revolutionary either.
He’s talking about the User Interface, not the operating system. Anybody can write a new UI or data management system and run it on top of just about any OS. But that doesn’t mean that it will be a good UI or system. Simplicity is the *goal*, and saying that the future is about programming with the end-user in mind doesn’t automatically mean that your UI or info system will actually be simple.
Now, if he had some solid ideas about how to make it simpler, that would be something else…
That is a total failure IMHO. All we got was a centralized repository of data that, if it screws up, screws YOU beyond belief.
Longhorn will be loved by the KM community, but it will take quite some time to change the paradigm most people use: files in folders. That’s the real-world metaphor, and it’s how most people think.
. . . tizzyD
I tried out Scopeware Vision, and it sucks. We need a 3D OS, not some kind of 2D crap. I want to move files around with my gloved hands on my LCD wall. Who needs another Windows file management program?
That sucked. If that guy is a Computer Science professor at Yale, remind me to never go there!
First off, the concept of an operating system growing toward the user makes no sense. An operating system doesn’t even need to provide a user interface. An operating system manages resources for programs. One of those programs might be a user interface. Second, all useful applications put the user experience first. So what’s his point?
If he’s having such a hard time keeping his files organized: turn on File Indexing and use the Find Files feature. (Okay, not quite what he was suggesting.)
And so far every implementation of a 3D interface I have seen has been useless. Everybody keeps crying about improving interfaces but nobody can provide useful concepts. From the screenshot, Scopeware looks like a fancy Tile Windows function with two arrow buttons to navigate. Yay.
My two favourite interface improvements have been the mouse wheel and mouse gestures. You aren’t required to understand or utilize them. But if you do, you can work more efficiently.
That is a total SUCCESS IMHO…
Yup, I totally enjoy the registry the way it is. And in 10 years I have never had the registry get corrupted, ever, nor has anyone I know. So I don’t think it’s THAT much of a failure.
That said, I have to point out that I’ve always used NT-based Windows. I just can’t stand any 9x-based crap. Maybe that helped me get a positive view of the Registry.
There’s no really good reason a file hierarchy can’t work to organize data. Better yet, a traditional hierarchy plus some db-like features (e.g., BFS) is ideal. The problem that hierarchical organization of data has is the same problem that a database-oriented system would have: users are lazy.
Seriously. If someone can’t be bothered to organize their data into folders, why do people think they’ll want to enter all the data to organize it in a db? And don’t say “the computer will do that for you!” because there is no silver bullet that solves that problem. Having the computer do that accurately would approach having an AI. Simple tricks like indexing all the words in a document are convenient, but WinXP has a similar feature in its search for finding a specific string within the files you are searching by brute force… and it is F-A-S-T. (Or ‘grep -r’ for you Unix fans.)
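For the curious, that kind of brute-force search is nothing exotic: walk the tree, scan each file. A toy sketch (not how WinXP’s indexer or grep is actually implemented, just the idea):

```python
import os

def grep_r(root, needle):
    """Brute-force recursive search: return the paths of files under
    `root` whose contents contain the string `needle` (a crude 'grep -r')."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as f:
                    if needle in f.read():
                        hits.append(path)
            except OSError:
                continue  # unreadable file: skip it
    return hits
```

Real tools add an index so they don’t rescan every byte on every query, but the fallback path looks a lot like this.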
People are just becoming obsessed with this idea because it seems like its time has come. Yes, having the OS be smart enough to ‘grok’ MP3 ID tags and add that data to the filesystem as attributes of the file is definitely a great thing to have, but it’s not a revolutionary idea. It’s just a very nice bit of polish. Anything more than that is going to be impossible for many, many years.
“That is a total failure IMHO. All we got was a centralized repository of data that, if it screws up, screws YOU beyond belief.”
Newer versions of Windows take a backup copy of the registry on each boot and will automatically restore the registry to the last known good copy if it becomes corrupt. So corrupt registry problems are pretty rare on the newer versions of Windows.
And the registry does have certain advantages over the text-based configuration files of previous Windows versions (and the text-based config files that UNIX still uses). The main advantage, of course, is that the registry can be accessed a lot faster.
Does anyone else here recall his NYT editorial last summer/fall? He made basically the same case as he does here, but under the irresistible troll headline of “Operating Systems are Irrelevant”. (Well, Mr. Gelernter, even if you were talking about OSes and not file management, why is your product only available for Windows?) While I like the idea of new interfaces, and organizing by time could be useful, I felt the whole thing was a giant buildup to a plug for ScopeWare. Again, new things are good, but I don’t know about a scientist who a) crosses his concepts, and b) advertises his own product as a step forward.
Read someone like Dijkstra to see clear, logical thinking.
http://www.cs.utexas.edu/users/EWD/indexEWDnums.html
I’m amazed that David Gelernter is a professor at Yale. When computing science has such “scientists,” we are in trouble.
Gelernter directs his vast intelligence at solving such well-defined objectives as, “it reflects the shape of your life. Its role is to track your life event by event, moment by moment, thought by thought.” After all, the future of operating systems is simplicity.
“And the registry does have certain advantages over the text-based configuration files of previous Windows versions (and the text-based config files that UNIX still uses). The main advantage, of course, is that the registry can be accessed a lot faster.”
Yeah, but if there was enough demand, there would be a nice graphical interface for all of this. The problem is, there is a Vi/Emacs war, not an XConfGui/YConfGui war. 😉
People who use UNIX just love their keyboards too much.
I as well can’t believe that this guy is a CS prof at Yale. If I had suggested this to one of my profs when I studied CS, they would have marched me out and shot me!
a) He mentions that the OS should track our life event by event. We humans were given a brain for this … not an OS. There is something seriously wrong when the OS becomes responsible for keeping your data organized and accessible. I thought that was our job. 🙂 Really, there is NO substitute for good organizational skills.
b) He also mentions he’s buddy-buddy with Bill Gates (bad idea) and that Longhorn will incorporate this type of UI. Frankly, I think this is an attempt to make the OS slower, thus forcing people to upgrade their already super-overpowered PCs. All the functionality he mentions could be done using something like BFS with a few modifications.
c) It seems that this paradigm just doesn’t match real life the way the desktop does. I don’t keep my phone numbers stored chronologically, or my school papers, or my digital photos. It seems like this paradigm is being forced onto the masses by MS and this guy because it is something new. My philosophy is “Don’t upgrade unless it actually provides a better system.” New is definitely not better in most cases, and that is something the public has never learned …
Anyway, enough of my ranting. I’ll let others take over.
Eugenia, weren’t you retiring?
“it reflects the shape of your life. Its role is to track your life event by event, moment by moment, thought by thought.”
This could probably be accomplished by some combination of Palladium, DoubleClick, and the Clipper chip.
“Yeah, but if there was enough demand, there would be a nice graphical interface for all of this. The problem is, there is a Vi/Emacs war, not an XConfGui/YConfGui war. ;-)”
I don’t mean it can be accessed by the user faster. I mean it can be accessed by the system faster. The registry is a database that can use random access to read any variable stored in it, and much of the registry is stored in a binary format. UNIX text configuration files can only be read sequentially.
As far as user access time, it can often be faster to change one line of a text file than to navigate through a control panel in Windows, so user access speed is not the issue. The speed at which the system can read its configuration settings is.
This is neither here nor there, but Gelernter was one of the Unabomber’s victims (seriously injured but survived, obviously). He made a small splash with a book called “Mirror Worlds” in the early ’90s, about his vision of a virtual reality, and Kaczynski apparently saw him as a leader of the brave new world.
OFF TOPIC ABOUT EUGENIA: I’m thinking that she can retire from this site and just contribute an interesting nugget of news now and then.
ON TOPIC: Yeah, he’s been preaching this gospel for a while. Now, organizing data in chronological order in a database is good for SOME THINGS. This is why people have PDAs and appointment books and such, because brains are wonderful things, but not perfect.
But I don’t see how chronological order can help when people just have stuff.
Counterexample #1: I take a bunch of pictures at my mom’s birthday and put them on her computer. Six months later she wants to look at them. Would it be easier to A. zoom through a narrative stream and then pick apart everything that happened there, or B. go to Pictures -> Mom -> Birthday_2002?
Counterexample #2: I’ve got a humongous music collection. I want to burn a special mix CD for a party. Do I A. slog through my stream to figure out what I ripped when, or B. go through my folders organized by Genre, Artist, Album, etc.?
In other words, different metaphors work better for different purposes. A good UI would simply seek to unify the basic methods used to access those metaphors, and allow differences among applications.
Oh, and if (when) voice recognition and AI become ubiquitous, it will be a LOT easier to link, organize, and comment on stuff via running commentary. Ex: “These are all pictures from my mom’s birthday party in 2001.”
The AI and db engine would associate “all” with the currently selected pictures, associate “my mom” with the current user’s mother’s name, and the status of being “mom”, and link them all with the words “birthday”, “party”, and the year “2001” (regardless of when you actually scanned the pictures, though that would be stored as well) as reference metadata.
Thus you could ask your computer for “All pictures from 2001” and your mom’s birthday pictures would come up.
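Leaving the AI and voice layers aside, the metadata store underneath is just a tag index: attach attributes to files, query by any combination of them. A toy sketch (class and file names are purely illustrative, not any real product’s API):

```python
from collections import defaultdict

class PhotoIndex:
    """Toy metadata index: tag files with free-form attributes
    and retrieve the files that carry ALL of the queried tags."""
    def __init__(self):
        self._tags = defaultdict(set)   # tag -> set of file names

    def annotate(self, files, *tags):
        """Attach every given tag to every given file."""
        for tag in tags:
            self._tags[tag].update(files)

    def query(self, *tags):
        """Return files carrying all of the given tags."""
        sets = [self._tags[t] for t in tags]
        return set.intersection(*sets) if sets else set()

index = PhotoIndex()
index.annotate(["img1.jpg", "img2.jpg"], "mom", "birthday", "2001")
index.annotate(["img3.jpg"], "vacation", "2001")
index.query("2001")          # all three pictures
index.query("mom", "2001")   # just the birthday shots
```

The hard part, of course, is the natural-language front end that turns “all pictures from my mom’s birthday in 2001” into those `annotate` and `query` calls; the storage side is trivial.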
THAT would make computers more useful. A natural language equivalent of the efficiency of shell scripting.
That’s all for this rant
–JM
just my humble opinion but…
its great that OSes are developed bottom-up… you have the low-level building blocks first… because we know that yesterday, today and tomorrow we will have CPUs, PCI, IDE, SCSI, VGA, IEEE 1394, etc etc… and we can BUILD ourselves our high-level OS world from this… (echoes of unix small tools)…
now, if the author is saying we need to start at the top – ie photos, music, invoice documents, birthday card designs,… and build downwards from that… then i think that is seriously flawed. first of all that would not be an OS as i like to think of it… that would be a toy… like those dedicated electronic typewriters or “my-first-pc” boxes….
more importantly… if our idea of what should be at the top level changes… say someone invents a new kind of photo file tomorrow, or invents virtual 3d-worlds, or a new type of tax form that the govt introduces… then the OS would be dead. straight. any attempt to add the new “media” or “top-level types” to the proposed OS would in my view be a dirty, dirty hack.
no way… i’ll stick with a bottom-up OS thanks… i can use it to reshuffle and recreate my top-level world without re-engineering my OS!
yours truly, merely a CS student
t
The delta between the amount of time Windows takes to read its registry and the time a unix station takes to read its flat files is insignificant. When we discuss performance, we are concerned with performance during normal operation, not a ~1ms delta in boot time.
I have personally found the registry to be a big ugly kludge, which tends to degenerate over time. I’ve had issues in the past with registry corruption. I would much rather rebuild a flat file that was corrupted and restart that service than work on cleaning up a cryptic registry.
Also, it is much faster for a human to update flat files than the registry. The amount of time saved from a user perspective when the registry or flat files need to be edited more than makes up for the boot time delta as discussed above.
All in all, I don’t see there being enough advantage in flat files for it to be worthwhile for windows to migrate back, nor do I see enough advantage to the registry to make it worthwhile for any *nix flavor to migrate to one.
– Kelson
To echo Kelson’s comments: The registry does not help Windows in performance by a long shot. The registry on my desktop is probably 3 orders of magnitude larger than all the relevant text file configurations on my OpenBSD server. Yes, it can index directly into it, but this is easily solved by reading the config once upon boot and creating a hash in memory. Reload if the config files change. MUCH better solution.
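That “parse once, cache in memory, reload on change” scheme is only a few lines of code. A minimal sketch, assuming a flat key=value config format (the format and class name are illustrative, not any real daemon’s implementation):

```python
import os

class ConfigCache:
    """Read a flat key=value config file once, keep it as an
    in-memory dict, and re-parse only when the file's mtime
    changes (the 'read once at boot, reload on change' scheme)."""
    def __init__(self, path):
        self.path = path
        self._mtime = None
        self._values = {}

    def get(self, key, default=None):
        mtime = os.stat(self.path).st_mtime
        if mtime != self._mtime:          # stale: re-read the file
            self._values = {}
            with open(self.path) as f:
                for line in f:
                    line = line.strip()
                    if line and not line.startswith("#") and "=" in line:
                        k, _, v = line.partition("=")
                        self._values[k.strip()] = v.strip()
            self._mtime = mtime
        return self._values.get(key, default)
```

After the first read, every lookup is a hash probe; the sequential scan only happens again if the file actually changed.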
Second, the registry is an all-your-eggs-in-one-basket problem. If a single byte is out of place, the entire registry can be corrupted.
Third, I have personally trashed (by accident) the registry of both W2K and XP boxes so that they wouldn’t boot AT ALL. I appreciate that it makes backups, but there ARE failure modes where incomplete corruption, which is backed up, leads to complete corruption upon the second boot.
The windows registry always was and will continue to be a terrible idea. Not just a weak spot but an honest-to-god failure of software engineering to properly address the shortcomings of the design and user behavior.
I can’t back it up without special tools. I can’t edit it from a stand-alone file, which means that I can only work with the LIVE REGISTRY (how stupid is _that_?!?). And the OS vendor doesn’t even provide adequate tools for manual editing and recovery (I’ve had to hack together my own tools and boot disks for this). And its failure modes are disastrous, whereas simple text files, which can also be corrupted, can be fixed by hand.
The registry is like a lot of poor software designs: it looks great on paper, but if you don’t implement it 100% perfectly and absolutely fail-proof it is much worse than if you stick to a simpler but less sophisticated solution. Recognizing these designs is the first half of avoiding this common problem. Being honest about your ability to never be able to create something 100% perfect is the other half.
If I had scrolled down, I would have realized that Jared White had similar ideas…d’oh.
–JM
I installed ScopeWare and promptly deleted it within the hour. My brief observations follow:
“..it reflects the shape of your life. Its role is to track your life event by event, moment by moment, thought by thought.”
Translation: Sort by Date/Time -> Nothing new.
“Simulated 3-D interfaces are a perfect match to the needs of a narrative stream — which just happens to be a parade of documents.”
Why take a compact, orderly list and turn it into a parade of documents (i.e., a big 3D stack of cards)? Each item in the list becomes a bulky 3D widget that takes up too much screen space. The 3D interface in ScopeWare only gets in the way.
Check out the OSNews article “Why Automatic Information Management is Doomed to Fail”, which is listed 3 articles down from this one. That article should actually be titled: “Why Scopeware Vision is Doomed to Fail.”
I agree with him.
“When we discuss performance, we are concerned with performance during normal operation, not a ~1ms delta in boot time.”
There’s more to it than just this. For example, change one variable in a UNIX config file and the entire file has to be reread. The registry can just reread the one variable that has changed.
“I have personally found the registry to be a big ugly kludge, which tends to degenerate over time. I’ve had issues in the past with registry corruption.”
I haven’t had a single registry problem since Windows 95 OSR2. Windows 95 OSR1 did occasionally corrupt the registry. But that has been largely fixed since the registry started doing automatic backups.
Of course, a central registry also makes configuration backups somewhat easier. Otherwise you have to back up tons of separate configuration files, some of which may only contain a single line. It’s also nice for storing program registration keys, because if you reinstall your software, it will already remember your registration key.
And besides… The registry is fun. You can play with cryptic keys in the registry and make Windows do things that Microsoft will tell you it can’t do.
“The registry does not help Windows in performance by a long shot. The registry on my desktop is probably 3 orders of magnitude larger than all the relevant text file configurations on my OpenBSD server.”
What if it was 10 times bigger? It’s random access. It can be accessed just as fast, unlike sequential access of text files.
“Third, I have personally trashed (by accident) the registry of both W2K and XP boxes so that they wouldn’t boot AT ALL. I appreciate that it makes backups, but there ARE failure modes where incomplete corruption, which is backed up, leads to complete corruption upon the second boot.”
This shouldn’t happen because Win2k and XP maintain the last three registry backups. So even if a corrupt registry is backed up on the next boot (which usually won’t happen because Windows checks for corruption BEFORE creating the backup), you still have the registries from the last two boots to fallback on.
“I can’t back it up without special tools.”
You don’t need a special tool to backup the registry. You can even do it from the command line with reg.
My general experience with the registry has been less than optimal. I’ve seen it corrupt less with the NT series than with 9x, this is true, but it still can and has happened. By the by, only XP backs up on every successful boot from what I’ve seen; 2K at least only backs it up when you make a major config change like adding a driver, based on every time I’ve ever had to use “Last Known Good”.

Also, the registry does have many design issues. For example, when you delete a key in the registry, it doesn’t actually get deleted; that part of the binary file simply gets marked as unusable. End result: your registry will perpetually grow, leading to increased memory usage over time, leading to system slowdown. I might further note that MS’s engineers have admitted before that the Registry was their single biggest design mistake with NT.
Back in the day, a GUI was considered radical. The command line was the one true “best” way to put data into a computer, quoth the purists, despite the scientific evidence that in many ways a GUI was better because it took advantage of the natural “hard-wired” ways the human brain perceives and processes data.
(Hell, it was a GRAPHICAL browser that really caused the explosion of the internet.)
Remember the days when you had to know all sorts of cthuloid booleanisms to really use a search engine? We had to try to think and write the way a computer does, using a syntax unlike most human languages (and particularly alien to English structures). Now we all use Google because the folks behind it figured out how to make natural-language searches work and how to rank/sort the way the human mind would.
The overwhelming trend as I can see it has been not to try to teach humans to think like machines, which is expensive and time-consuming, and which some folk will never be able to master. (Like my 78-year-old, purple-and-green-haired, blind-in-one-eye, dyslexic great aunt.)* But instead, men and women of vision have worked toward creating interfaces that are much more in tune with how the human brain is hardwired to perceive and use data.
Creating a computer from the top down … not practical at the current point in time, and the current attempts are kludgy and flawed at best, but … why not? Wouldn’t that really make the best tool?
I think that this is where things are ultimately going, and tackling these issues will lead to some real innovations in user interface design and the roles computers play in our lives.
*This woman really does exist. She really did dye her hair those colors. She’ll never ever be able to use a command line (she can barely type an email addy), but hot damn can she paint, sculpt, and design. She’s not a stupid woman, but she’s a prime example of why it’s ultimately better and cheaper to mould the computer’s software to fit a human than to attempt to train a human to think like a computer.
She literally can’t RTFM, but if you *show* her which *pictures* to click on, she’s laying out her own Christmas cards and burning CDs full of the pictures from her digital camera, and is thus quite productive with the computer.
[And if the filing system of a computer could somehow be as flexible and as idiosyncratic as her way of organizing things around the house, she’d probably spend a lot less time in “find file”.]
“End result, your registry will perpetually grow, leading to increased mem usage over time, leading to system slowdown.”
I’m pretty sure areas marked “unusable” don’t take up any memory since they aren’t actually loaded.
I tend to agree with the author to a certain extent. The whole design of all OSes is wrong in my opinion. And the design of computers in general is wrong for the average user.
First and foremost, the operating system should run in a completely read-only part of memory (like ROM or CD-ROM). It should be completely separate from the user data. Yes, I know that in UNIX/Linux an ordinary user can’t delete OS files, but you still need an administrator there to install stuff. OS upgrades should be a matter of plugging in a new “OS ROM card”. The user data should remain in a separate area of memory, whether that be a hard disk, flash memory, whatever.
Also, backups should be built into the machine and not some expensive add-on device that’s difficult to set up. The user should be made aware of the importance of backups. There should be an icon on the desktop or wherever that clearly says “Backup my data”. The machine prompts the user to insert his/her CD-RW or whatever.
I don’t know how many people I’ve known who have lost all their years of Word documents and other stuff when a hard drive died, because they thought that once they typed it into the computer it was there forever.
These are just general ideas. The machines should be more geared towards the user and not the sysadmin or programmer.
“The registry does not help Windows in performance by a long shot.”
Agreed. I’m a big fan of the registry architecture, but I definitely know “speed” is not the reason at all. “Perfect organization” is the keyword.
“The registry on my desktop is probably 3 orders of magnitude larger than all the relevant text file configurations on my OpenBSD server.”
Normal, as the Registry can contain security information for every key, so that you get full access control on every single entry in the database. That takes space. But it’s a *great* feature in my opinion.
Plus, binary data can be put in it (e.g., public keys). So the registry is maybe more like a parallel file system than a flat list of entries.
“Yes, it can index directly into it, but this is easily solved by reading the config once upon boot and creating a hash in memory. Reload if the config files change. MUCH better solution. “
I don’t see it as a *much* better solution, but only as a *different* solution. Period.
“Second, the registry is an all-your-eggs-in-one-basket problem. If a single byte is out of place, the entire registry can be corrupted. “
False. The registry is actually split across multiple files, and as we said before, each is backed up and restored individually in case of trouble. Personally I have *never* had any issues whatsoever, nor has *any* person I know. Strangely, only Windows-bashers ever seem to have corruption problems. 🙂
“Third, I have personally trashed (by accident) the registry of both W2K and XP boxes so that they wouldn’t boot AT ALL.”
How could this possibly happen??? The only way is by messing around inside the Windows folders. In that case, I can tell you that *ANY* OS can be screwed up when its files are played with without any knowledge.
Plus, if you want to be safe before tooling around, just make a registry backup. Easy: it’s two mouse clicks.
“The windows registry always was and will continue to be a terrible idea.”
That’s YOUR opinion, maybe. In my case, I couldn’t live without it. It’s SO useful that I literally rage when I’m fooling around in a *nix OS with all their flat config files spread around. Now *THAT* is a bad idea, from my point of view.
“Not just a weak spot but an honest-to-god failure of software engineering to properly address the shortcomings of the design and user behavior. “
I still don’t see where it fails. The registry delivers all the goods we can expect from a centralized configuration architecture. It’s efficient, secure, easy to use, and most of all: CLEAN.
“I can’t back it up without special tools.”
What are you talking about? It’s two clicks from the OS, with no special tool to install whatsoever. Can it be easier than that???
“And the OS vendor doesn’t even provide adequate tools for manual edit and recovery (I’ve had to hack together my own tools and boot disks for this). “
Again: two clicks to back up. Then if you want to revert: double-click on the .REG file. If you think that’s complicated, then you’re in trouble using a PC at all 🙂
Perhaps the windows-bashers became windows-bashers because they had their registries corrupted.
If you really wanted simplicity, you would go back to DOS.
This guy has a real knack/annoying habit of writing a promotional article about Scopeware and passing it off as insightful commentary about OS design.
“Perhaps the windows-bashers became windows-bashers because they had their registries corrupted.”
That’s about the gist of it. I’m not even a windows basher! My desktop is XP and I’m quite fond of 2K as a workstation OS. It’s quite solid. But it seems that just when I’m thinking “You know, this bit of software isn’t half bad!” I restart and find that the machine won’t boot because “Windows cannot find the file //WINDOWS/SYSTEM32/” or some such and I have to ‘Last Known Good’, often losing application settings in the process.
I will admit that generally ‘Last Known Good’ has saved me, but I have clobbered the registry to the point that the only choice is to reinstall.
“Normal, as the Registry can contain security information for every key, so that you get full access control on every single entry in the database. That takes space. But it’s a *great* feature in my opinion.”
1000 times as much space?
“Plus, binary data can be put in it (e.g., public keys). So the registry is maybe more like a parallel file system than a flat list of entries.”
Why is this considered desirable? Let’s keep things simple: _ONE_ filesystem.
“The registry is actually split across multiple files, and as we said before, each is backed up and restored individually in case of trouble.”
Bullshit! You get “SYSTEM” trashed and it’s all over, except for the backup.
“Plus, if you want to be safe before tooling around, just make a registry backup. Easy: it’s two mouse clicks.”
So you guys keep saying. Well, my ears are perked up. Educate me. I want to:
1. Roll back specific application settings without rolling back the entire registry. (file based: keep sequential backups. app.conf.1, app.conf.2, app.conf.3, … )
2. Copy the registry, open it in regedit, change some stuff, and then boot from the new, ‘test’ version without worrying that the original will be trashed.
3. Open an old registry to retrieve application settings from a system backup. (file based: copy the file)
4. Be able to tell EXACTLY what a program is adding/changing/removing from the registry. (file based: during install, just watch the filesystem)
5. Finally, I want to just be able to copy the registry files. Plain and simple. It should be OK for me not to trust the OS and to do things behind its back. It’s not OK for the software vendor to say “don’t mess with that, it’s not for you to touch”, especially when the software is my OS. I’m not asking to go in and bit-fiddle the things; I just want to make a copy of the files because I’m paranoid and have been burned before. The registry requires that I have trust, and yet the registry has done nothing but prove itself unreliable. But I guess you guys/gals who have NEVER had it fail have nothing to worry about, eh?
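The sequential-backup scheme from point 1 is trivial with stand-alone files. A sketch of the rotation (the app.conf naming is just the example from that point):

```python
import os
import shutil

def rotate_backup(path, keep=3):
    """Keep numbered sequential backups of a config file:
    <path>.1 is the newest copy, <path>.<keep> the oldest.
    Shift existing backups up by one, then snapshot the live file."""
    for n in range(keep, 1, -1):
        older = f"{path}.{n - 1}"
        if os.path.exists(older):
            shutil.copy2(older, f"{path}.{n}")  # shift .1 -> .2, etc.
    if os.path.exists(path):
        shutil.copy2(path, f"{path}.1")         # snapshot the live file
```

Call it before every edit and rolling back one application is just copying a file, with no effect on any other app’s settings.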
“I still don’t see where it fails. The registry delivers all the goods we can expect from a centralized configuration architecture. It’s efficient, secure, easy to use, and most of all: CLEAN.”
You know what’s most annoying about this ‘feature’? I have a lot of specialized tools that I need to carry the configuration of forward when migrating to a new machine or reinstalling the OS. Putty (Free ssh client) is a great example. I have a bunch of preset hosts, keys, color schemes, etc. in there. It stores all the data in the registry. I’d love to be able to just copy the ‘putty.conf’ file from the program’s installed directory, but that’s not possible. Instead I have to search through the registry and export the relevant registry keys and then reimport them when I’m ready to restore to the new OS. And even then how can I be sure that I’m getting all the settings? Multiply this by 25 and I’ve got a serious PITA on my hands. Sometimes it’s easier just to reinstall vanilla and re-setup the software to my needs. As a developer this is a huge time waster and as a user it makes me not trust my machine to ever be ‘comfortable’ for more than a few months at a time. How is this a better solution than having the configuration of an application independently manipulatable like it would be in a stand-alone file?
BTW, have you looked through it in much detail? I definitely wouldn’t call things like
{32714800-2E5F-11d0-8B85-00AA0044F941}, (REG_SZ), {7790769C-0471-11d2-AF11-00C04FA35D02}!05,00,2014,0211 ‘clean’. Yes, I know… it’s a reference to another registry entry. That doesn’t excuse the fact that the registry is full of crap like this, defining a new extreme for the word ‘indirection’. (‘Clean’!! Man, are you crazy? I can’t think of a single piece of modern software that is less elegant than the registry!)
I literally rage when I’m fooling around in a *nix OS with all their flat config files spread everywhere. Now *THAT* is a bad idea, in my opinion.
We can agree on that, then. Unix is the opposite extreme, and just as unpleasant. Config files in random places (some in var, some in etc, some in the app dir, some in your home dir, etc.) and each with its own configuration syntax. That is one of the key things holding unix back, IMHO. It drives me crazy, too. But at least I know that once I have the config the way I like it I can just back up the config files to disk, cd, keychain flash drive, etc. and not worry about the heartache of having to reconfigure in the event of a rebuild.
Bottom line is that the registry was as big a mistake as not facilitating/enforcing versioning on DLLs, and MS knows it. Expect it to be gone in the next version of Windows after 2003 Server.
Machine locked up hard. The configuration was Win2000 on boss’s desk. Machine would come up just fine afterwards. The only thing was no network connection.
The driver was there and the machine seemed to recognize the device. Odd.
I go into the networking section. Protocols, click on TCP/IP and then try to click on Properties. Errors. I try to remove it but I can’t: more cryptic Windows messages. I even try to add another entry under Protocols but I can’t. The local Windows guru looks at the error message and declares the registry is hosed. I open the thing up in the registry editor and whole entries are filled with garbage characters or are literally inaccessible, as in error messages when you try to click on the entry. So that’s what happens.
Listen, I can tell you sad mac horror stories. Sure, I can give you the dead lilo kernel panic horror story for Linux. Yes, I can even recite the one about the batch of randomly rebooting Sun 420Rs.
It is not about Windows bashing necessarily. The registry has issues. Text files are insecure and hard to keep track of in Unix. Macs used to have system extensions and there were issues with those. Blah…blah…blah
Let’s all have one huge silly pissing match and stop having anything resembling a conversation about the future of Operating Systems.
Loser 1: My OS is bigger, better, faster, cooler and more stable than your OS!
Loser 2: Is not! Your OS sucks, and let’s list eighteen reasons why your OS sucks and mine is better.
Loser 1: I must habitually reply to your insult to the honor of my OS and list 20 reasons to the contrary…
etc…etc…etc… until all said OS bigots die and go to some geek hell where they have to live as Amish people plowing endless fields of crap while lugging a computer they can never use on their back running their least favorite OS.
Well, I am developing my own OS for fun and I have been considering the best way to store system and application configuration. A registry briefly crossed my mind, so did a set of flat files, and I dismissed both because of terrible experiences with those extremes (Win & *nix).
I am arguing with Steve and Simba about registries in general and the Windows implementation in particular as an example. I’m down on the idea; they seem to feel it solves more problems than it causes. OS bashing? No. A heated conversation to air out some different viewpoints so that I can maybe get a fresh idea for my own hobby OS? Yes.
Is “OS bashing sucks, everything has its problems, everyone shut up” the most you can contribute to this?
First of all, I wondered what planet this guy was on. It is almost as bad as some of the other rectum-plucking that journalists do about new technologies. The best one that comes to mind is the so-called “preview” of Longhorn.
So, what is the problem with computers? There are no problems with computers. This is just yet another guy with way too much time on his hands so he can rectum-pluck yet another pointless paper at the taxpayers’ expense.
Secondly, the reply to the comment on what an operating system is. Yes, an operating system does allocate resources and so forth; it also offers the user a human-like way to interact with the computer so that tasks can be done, either via a series of commands when interacting via a shell, or via a graphical representation of an office environment, as in the case of GUIs and the use of “desktop”, “trashcan” and “folders”.
The GUI, for example, once explained to a user, is a pretty easy way to interact with the computer. I have trained users and explained how the GUI is merely a representation of their desk, hence the reason for the “My Computer” icon, the trashcan, and the ability to place things on the desktop. After explaining the GUI to them like that, they then see how everything else falls into place.
Thirdly, a database “operating system” is yet another marketing-driven word. Anyone remember the “Database OS” that Larry Ellison believed would be the future?
RE: Simba
Regarding text files vs. registry. You are correct ONLY IF you had one big, huge text file, and in that case, looking up information would take a very long time. The alternative is quite simple: break the file down into smaller parts, and thus the speed difference between a small number of text files vs. one large binary is pretty much non-existent.
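To make that concrete, here is a minimal sketch (the `CONF_DIR` path and the per-app JSON layout are hypothetical, just to illustrate the idea) of looking up one setting by opening only the relevant small file instead of scanning a monolithic store:

```python
import json
from pathlib import Path

# Hypothetical central config directory; one small file per app.
CONF_DIR = Path("/etc/conf.d")

def read_setting(app: str, key: str, conf_dir: Path = CONF_DIR):
    """Open only that app's small file; nothing else is touched,
    so lookup cost does not grow with the number of other apps."""
    return json.loads((conf_dir / f"{app}.json").read_text()).get(key)
```

The point is that the lookup cost is bounded by the size of one app's file, however many applications are installed.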
Now, regarding XML-based registries, I am not too sure about how speedy they are. Now, if they turn out to be reliable and fast, then maybe that’s the avenue to go down later on. For example, GConf2 uses an XML-like registry for GNOME2’s configuration; KDE does something very similar as well.
In the Windows world, when you install Office XP IIRC, *.manifest files are installed as well, which, when viewed by a browser, have an XML layout. Could this be the future direction of Microsoft?
Matthew, AFAIK the manifest XML files you are referring to could only be for corresponding MSIL “assembly” DLL components of Office XP that rely on the .NET Framework. I realise that it is possible to install Office XP without having the .NET Framework installed, but I still think that what you are looking at has dependency on .NET nevertheless and is presumably non-functional if .NET is absent.
That being said, it is very interesting that MSFT has completely abandoned the registry as far as the .NET platform is concerned and I don’t think they made this move to ease compatibility for implementations on other OSes like Mono or Rotor. I guess, like the GNOME guys, they decided that XML flatfiles for configuration are the way of the future (which suits me just fine).
So much for remaining on-topic in this thread
Well, both the views I see here on the registry are correct. The Registry *was* a good idea and is certainly a good thing when you consider what was used before, reams and reams of INI files. At the moment UNIX is still sort of stuck in a slightly more advanced version of that.
On the other hand, the registry can easily get screwed up. It’s also utterly incomprehensible, Windows stores everything from toolbar placements to plug and play data in it. It’s practically impossible to find what you want unless you’ve been told about it.
So, some people have realised this. GNOME uses GConf, which is like the Linux equivalent of a registry, except non-sucky. Not everything uses it, perhaps in its next iteration it’ll become desktop neutral but ATM it’s gnome only. GConf is:
* Meant for user preferences and settings only. GConf is part of the gnome usability push – by moving really odd or unusual settings out into gconf you can make the UI simpler for everybody else. System info isn’t stored here, nor is object data, xml files on disk are kept for that (because they don’t change all that often).
* Based on the MVC paradigm: preferences dialogs in GNOME write to GConf, then GConf updates any listening apps, which gives gnome the snazzy instant-apply UI I love so much. That means if you alter the window border theme in GConf, the Theme dialog will instantly update, as will all the windows. It’s cute.
* Network transparent: the instant apply thing takes place across sessions on a network, if you’re logged in 3 times and alter a setting it’ll alter on all the sessions.
* Based on XML – you can see all the settings stored human-editable in ~/.gconf2, I think. In reality, it’s usually far easier to use the regedit-style GUI program. By splitting stuff out like that, the FS controls integrity (and normally does a much better job of it) so it doesn’t easily corrupt.
* Lockable. Admins can lock down info at the key level, it’s very flexible in that way. Apps must not assume they can write to GConf at all, any cache or extra data should be stored in dotfiles.
Pretty cool eh? I wish everything used it. Unfortunately people look at the GUI and say “eurgh, a registry, yuck, gconf must be evil”. Ah well. We’ll get there, I’m sure.
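For anyone curious what the instant-apply MVC bit amounts to, here is a toy observer-pattern sketch in Python (not GConf’s actual API, just the idea: setting a key notifies every listening view):

```python
from collections import defaultdict

class ConfStore:
    """Toy sketch of GConf's model/view split: writing a key
    notifies every registered listener, which is what makes
    instant-apply preference dialogs possible."""
    def __init__(self):
        self._values = {}
        self._listeners = defaultdict(list)

    def listen(self, key, callback):
        """Register a callback fired whenever `key` changes."""
        self._listeners[key].append(callback)

    def set(self, key, value):
        self._values[key] = value
        for cb in self._listeners[key]:
            cb(key, value)  # e.g. a Theme dialog redrawing itself

    def get(self, key):
        return self._values.get(key)
```

A preferences dialog would call `set()`, and every open window listening on that key repaints immediately; no "Apply" button needed.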
A few other things I forgot:
* GConf has built in documentation for the keys, when you install the apps it installs schemas that dictate typing info and short/long descriptions of what the keys do. So you can just explore GConf and find all kinds of cool hidden stuff.
* It’s pretty high performance. GConf apps connect to the gconf daemon (it’s started if it’s not found), and this caches config data transparently. That, and it doesn’t store huge amounts of irrelevant user info, so it’s pretty fast.
* To backup, just copy the gconf home dotdir.
There are probably other things I’ve forgotten. GConf is really pretty slick.
Top down computers do exist, in large numbers. You will find them
advertised in magazines aimed at musicians.
They are called multi-track recorders, sampling sequencers,
synthesizers, etc. All of these are computers with particular
interfaces. Design starts at the user end.
It would be possible to make a portable tablet computer for artists,
too, but that hasn’t been done yet AFAIK.
General-purpose desktop computers tend to be designed bottom-up as
they are basically kits of hardware and tools which can be applied to
many tasks. That is why they are complicated and hard for many
non-technical people to use.
Windows took a good idea (a central store for configuration information) and implemented it badly.
It’s better than Unix’ conf files all over the shop approach, but that isn’t saying anything.
Each app should have its own conf file. Each section of the OS its own conf file. I’d prefer they were ascii, but I could survive having them binary where there is a clear performance gain to be had. They should all be stored in a central directory (probably with separate sub-directories for OS conf files and App conf files).
That’s clean, simple. The registry is neither.
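A sketch of what that central-directory layout could look like (the `/conf` root and the `os`/`apps` subdirectory names are made up, purely to illustrate the scheme):

```python
from pathlib import Path

# Hypothetical central store: one conf file per app or OS section,
# split into separate subdirectories as described above.
CONF_ROOT = Path("/conf")

def conf_path(name: str, system: bool = False) -> Path:
    """Return the one place a given component's config lives."""
    return CONF_ROOT / ("os" if system else "apps") / f"{name}.conf"
```

Backing up every setting on the machine is then just archiving `/conf`, and rolling back one app means restoring one file.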
I’ll admit that NT is pretty robust, but if you’ve ever had to spend 2 days rebuilding a server because one hive got corrupted… And as for Win9x, a corrupt registry that gets backed up and leads to an endless boot loop…
That’s a rich argument. I wouldn’t call “configuration files all over the place” a very logical argument.
/etc and /usr/local/etc are where the configuration files lie. /home/luser/* is where the user configuration files sit. As for /var, log files and constantly changing things like the package database sit in there. /tmp keeps temporary files that are wiped on each reboot.
So no, your argument is misleading at best and pathetic at worst.
The real deal in OS development is to combine power and simplicity in such a manner that everyone from the full novice to the nerd will be happy with his computer.
In my life I have had many computers under various OSes. I have to admit that Apple OSes have at last become my favourite, because they have a different behaviour. I have spent a long time playing with different versions of Windows (3.1, 9X, NT, 2000). I have to admit that they have made a lot of improvements, but these OSes are still stupid. It doesn’t mean useless or not powerful; it means without any intelligence, or too little to be enough.
Thanks to the infamous Registry, each Windows is particularly weak and static, and thanks to the globally unorganized OS structure everything is made to be unclear to everyone. In that sense Microsoft has a great power over you. If your OS is not on the C drive you must expect to have trouble; if your applications are not in “Program Files”, you must expect to have trouble. With Windows there is only one way to do things, no flexibility. You have to fight against your own applications to make them respect your wishes: just have a look at the Start Menu overloaded with tons of useless icons, the system tray full of unwanted things, shortcuts on the desktop. Look at the time you spend keeping your environment clean and simple to access. Too bad. The half-hidden Start Menu doesn’t help at all when you only need it once every 6 months.
In MacOS I like the intelligence behind it. First, when moving files some kind of dynamic internal registry is updated, so you can move your apps or change the drive where they are, and they will still work because all the links have been updated. That’s intelligent. Try it under Windows: only single-file applications like notepad will survive such manipulation.
Launching an application over the network: you can make a real server holding just applications for all your clients, and it works. Windows will only share your document files… It’s so poor, isn’t it? The effort to do more is so big!!!
Simply, Apple has studied what the user behaviour would be, and what the expected OS behaviour is. Most of the time intuitiveness is the main word in Apple’s OSes.
One other thing, the filesystem on MacOS is a database. The name and the place of a file are some of its characteristics but not the identifying key. You can change the name of the exec file of an application, it will still work. Yes it will. That’s intelligence. Try it under Windows…
When you are searching for a file, the default behaviour of the file-searching application is to look for names containing the words you want. That’s simple and intelligent. Searching for “achtung mysterious” is different than searching for “*achtung*mysterious*.mp3” when you look for a file that may be named “U2-achtung baby-mysterious ways.mp3”.
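That default can be sketched in a few lines of Python (a hypothetical `word_match` helper that matches when every query word appears in the name, contrasted with making the user write the glob themselves):

```python
import fnmatch

def word_match(filename: str, query: str) -> bool:
    """Mac-style default: match if every query word appears in the name."""
    name = filename.lower()
    return all(word in name for word in query.lower().split())

def glob_match(filename: str, pattern: str) -> bool:
    """The explicit-wildcard style the user would otherwise have to type."""
    return fnmatch.fnmatch(filename.lower(), pattern.lower())
```

With `word_match`, typing “achtung mysterious” finds the file; with `glob_match`, the same words find nothing unless the user remembers to sprinkle in the asterisks.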
Definitively, look at MacOS, it’s simple, it’s a real companion OS.
An intelligent OS is an OS that understands what you would like to do and helps, not an OS that constrains you.
“Each app should have its own conf file. Each section of the OS its own conf file. I’d prefer they were ascii,
but I could survive having them binary where there is a clear performance gain to be had. They should all
be stored in a central directory (probably with seperate sub-directories for OS conf files and App conf
files). ”
I can’t imagine that the performance penalty from using an ASCII
config file would be detectable. It could perhaps at worst add 1/2
second to the time it takes to launch the program, and “Save settings”
would take slightly longer.
The config file is not accessed while the program is running – all the
settings are read in at launch time.
The logical places to keep the file are _either_ locally in the
program’s directory _or_ in a central directory (for each user).
Either would be OK, but the central directory seems more suited to a
multi-user system.
Actually bg I was addressing the message to Steve and his comments that only windows bashers have problems with the registry.
No, it is not all I have to say on the idea of a registry style system. The Gnome haters will have a field day with this but I like the Gconf idea. The configurations are stored in xml text file format. They are kept in a central location with user set options residing in a home dir. This does NOT resolve a major problem with most text file configuration systems which is of course security. However, corrupting part of the configurations does not wax the whole system and make it unusable which is nice.
The key if you go this route is to give the user a proper editor for these options. In gnome currently the gconf-editor is pretty sparse, kind of ugly UI-wise and reminds too many people of regedit which a lot of people hate. On gnomesupport.org in the topics another person is writing another editor that is quite good in terms of use.
Actually the funny thing is I liked the discussion you were having until it started into this only people who do not like Windows have Windows problems type OS bigot nonsense.
That is why I addressed the comment to Steve. Yes, every path will have its potholes and you have to choose the one that is best for you. However, if the discussion rises up from the OS partisan talk, I do have quite a lot to say about the issue including stating my favorite idea (given above) and its problems.
I can’t imagine that the performance penalty from using an ASCII config file would be detectable. It could perhaps at worst add 1/2 second to the time it takes to launch the program, and “Save settings” would take slightly longer.
Half a second, or 500ms, is an extremely long time. Seriously, it should take an application a lot less than that for the entire startup sequence. Unfortunately, this isn’t true for many applications, but that doesn’t mean those applications do it the Right Way.
OTOH, reading in a textfile for configuration will _never_ take half a second unless you make some serious mistakes.
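Easy enough to check. Here is a quick sketch (plain Python, hypothetical key=value format) that writes a 10,000-line config file and times parsing the whole thing; on any remotely modern machine it finishes in a few milliseconds, nowhere near half a second:

```python
import time
import tempfile

def time_config_read(lines: int = 10_000):
    """Write a key=value config file of the given size, then time
    reading and parsing all of it into a dict."""
    with tempfile.NamedTemporaryFile("w+", suffix=".conf") as f:
        f.write("".join(f"key{i} = value{i}\n" for i in range(lines)))
        f.flush()
        f.seek(0)
        start = time.perf_counter()
        settings = dict(line.strip().split(" = ", 1) for line in f)
        elapsed = time.perf_counter() - start
    return elapsed, len(settings)
```

Even a config file an order of magnitude bigger than most apps ever need parses in well under the half-second figure being debated.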
“Half a second, or 500ms, is an extremely long time. Seriously, it should take an application a lot less than
that for the entire startup sequence. Unfortunately, this isn’t true for many applications, but that doesn’t mean
those applications do it the Right Way. ”
I was going to write 1/20 second, but then I thought how big some
config files are (Apache for example).
I can’t imagine it taking as long as 1/2 second normally.
You know, looking at the screenshots of this thing again, is it just me or does its UI look a lot like the old Windows 3.1 cardfile app, ignoring the fact that the latter didn’t sort by time?
LOL! That’s exactly what I was thinking of: Cardfile. I actually rather liked Cardfile–it was small, fast, had a rather intelligent search feature–and I continued to use it in Windows 95 and 98. Now, though, I’ve discovered a freeware program called Repertoire that uses the Cardfile format, it’s just as easy to use and it has a few extra features.
But I can’t imagine running an OS with a Cardfile-like program, at least not efficiently. It could be helpful in some cases, but it’s not appropriate for all computer situations.
> it’s great that OSes are developed bottom up… you have the low-level building blocks first… because we know that yesterday, today and tomorrow we will have CPUs, PCI, IDE, SCSI, VGA, IEEE 1394, etc etc… and we can BUILD ourselves our high-level OS world from this… (echoes of unix small tools)…
When we were working on our INDI system, I first thought of Top-down for the reasons he gives. Then I talked about it with a hardware guy in our org, and he pointed out you would end up with some major efficiency problems, if you didn’t take bottom-up into consideration.
From there, with our INDI design, I came to the conclusion, that what was needed was middle-ware that took all issues into consideration for design. So bottom up for modularity, and efficiency, and top-down to provide middle-ware that allowed user-friendly UIs.
I add into that coming from the side, so taking into consideration for networking issues, like shared filesystems, and migratible desktop screens etc.
The middle-ware layer is, IMO, still relatively low-level, and not tied to the UI, but instead provides a solid framework for user-friendly UIs. And obviously some compromises will have to be made, slanting in any direction, but overall IMO these will be small.
“Each app should have its own conf file. Each section of the OS its own conf file. I’d prefer they were ascii, but I could survive having them binary where there is a clear performance gain to be had. They should all be stored in a central directory (probably with seperate sub-directories for OS conf files and App conf files). ”
From Workbench2.0 on the AmigaOS, it had a central repository for all app and system preferences. It was accessed as the device ENVARC: (a dir off the preferences dir). It also had the device ENV: (a dir in the Ram disk).
ENVARC: stored any permanent changes, and ENV: stored any temporary changes (if you pressed the USE rather than SAVE gadget). This allowed for safe experimentation (if you screwed up royally, you just rebooted) and let you make temporary changes to an app’s preferences which weren’t automatically saved on program exit; both of which are extremely handy!
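That USE/ENV: versus SAVE/ENVARC: split is easy to model. Here is a toy Python sketch (a hypothetical `Prefs` class; `reboot` re-populates ENV: from ENVARC:, as the OS did at startup):

```python
from pathlib import Path

class Prefs:
    """Sketch of AmigaOS-style preferences: ENV: holds the live
    (temporary) copy, ENVARC: the permanent one. 'Use' writes only
    to ENV:, 'Save' writes to both, so a reboot discards unsaved
    experiments."""
    def __init__(self, env: Path, envarc: Path):
        self.env, self.envarc = env, envarc

    def use(self, name: str, data: str) -> None:
        (self.env / name).write_text(data)      # lost on reboot

    def save(self, name: str, data: str) -> None:
        self.use(name, data)
        (self.envarc / name).write_text(data)   # survives reboot

    def reboot(self) -> None:
        # At startup the OS re-populates ENV: from ENVARC:
        for f in self.env.iterdir():
            f.unlink()
        for f in self.envarc.iterdir():
            (self.env / f.name).write_text(f.read_text())
```

So “Use” a risky setting, and if the machine misbehaves, a reboot quietly restores the last saved state.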
Copying all the small text files to the Ram Disk, was rather slow (especially on FFS), so there were patches like Happy-ENV that only copied preference files to ENV: if they were actually accessed.
IMO having all preferences in one dir, combined with a good package manager system, makes backing up, and reinstalling so easy and simple, it’s the way to go in future IMO.
And having a multi-user system, so each user’s preferences are all in their own dir, makes things simple too. I’m not sure whether all the system stuff should be customised in each dir, or separate ala Linux yet, but having it all in one dir sounds good to me so far.
With a system like this I’ve only been forced to re-install an Amiga OS once in 12 years! And even then, if I’d put work into it I could have fixed it. With mine and other Windows systems, it happens way too often. OK, so most of my problems have been with Win95b, but even the other day a complete reinstall of XP was needed on a friend’s machine. I do realise that there’s more to it than the registry, but it seems to be a major reason from experience. On the Amiga, if I screwed up badly I could boot from floppy and delete or edit the preference file for the preference concerned, and I could get back into my machine again. Simple!
PS: this isn’t meant to be a pro-Amiga speech, just noting some experiences