The Zero Install system makes software installation not merely easy, but unnecessary. Users run their applications directly from the Internet from the software author’s pages. Caching makes this as fast as running a normal application after the first time, and allows off-line use.
I suggest you actually read the article. This is not Slashdot.
It is not just like any other thin-client software because software is cached locally and needs to be downloaded only once.
Slow? Once cached, software runs just as fast as if it were installed.
Inefficient? It’s much easier to use than any package manager.
Insecure? You do not need to be root to cache and run a program, so there is actually less risk than with a package manager that requires you to be root when installing a package.
Once a program is cached you do not need to be connected to the internet to use it.
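The download-once, run-from-cache behaviour described above can be sketched in a few lines of shell. This is a toy model only, not the real Zero Install code; the cache directory and the `fetch_and_run` function are invented for illustration.

```shell
#!/bin/sh
# Toy model of "download once, then run from cache" (not real Zero Install).
CACHE="$HOME/.cache/zero-demo"          # made-up cache directory

fetch_and_run() {
    prog="$CACHE/$1"
    if [ ! -x "$prog" ]; then
        echo "cache miss: downloading $1"   # a real HTTP fetch would go here
        mkdir -p "$CACHE"
        printf '#!/bin/sh\necho hello from %s\n' "$1" > "$prog"
        chmod +x "$prog"
    fi
    "$prog"
}

fetch_and_run demo      # first run: cache miss, then "hello from demo"
fetch_and_run demo      # second run: straight from cache, no download
```

The second call never touches the network, which is why cached programs also work off-line.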
And this all adds up to: a Java WebStart clone! Yay!
I find this an amazing step forward in computing for users. I have always had ideas about how programs could be installed without being root, and this is similar (though I don't know all the inner workings yet) to what I had in mind. I always thought of having a /home/programs folder that any user could access, set up like the Program Files folder in Windows. Also, when a program got installed to that location, it would place an icon on the desktop of whoever installed it, along with an "Installed Programs" list for GNOME, KDE, and possibly XPde.
But since all that has yet to happen, I still want to say good work and good luck in the future!
This seems like it could be the next revolution in the software industry. However, is it limited to Linux only? Also, how about distributing a preconfigured OS through this, cached with the help of a live OS? Is that possible? Just guessing.
http://osnews.com/comment.php?news_id=5394
By the article, I mean the link in this news post. Try reading that, and then my post. Got clue?
Ooooooh! Right. I didn’t get the irony in your post. I feel so silly now.
Well, it seemed more amusing to me when I first read the 0install page and wrote the post. And now, in classic OSNews troll tradition, it has been modded down. Join me in this moment of quiet reflection on its fate.
All joking aside, 0install looks like a good implementation of a not wholly original idea and I’ll be giving it a try.
Well, I’ll say here what I said over there. I think this fixes the wrong side of the issue.
I think it’s a nice idea.
As I understand it, end users don't have to worry about dependencies etc., as all these are accessed from the author's system.
But… doesn’t that mean a lot of “stress” for the system of the author?
Suppose I create a program using some rather large libraries, and it gets used a lot… would there be any affordable webhosting to host it?
Amazingly, I find this to be just like Zeroconf: an interesting idea, but unusable in practice.
Actually, what is the difference between this and 'apt-get install foo' or 'urpmi foo'? Also, doesn't this require the author of a program to either make each package statically linked or create a binary for every possible system? Both of these are very inefficient methods of distribution.
Actually, I always thought that the best approach to non-root Linux software installation was a variation of the one used by Windows: install the contents of a self-extracting file into an application directory named for the application, with the dependencies in a lib directory located in the user's home directory, and modify the user's profile file to find the new lib directory, so that he does not have to install software as root. The su password and root install would remain, however, for sysadmins installing software to intranets for large businesses. I am currently working on such a system, which I will be placing under the GNU GPL.
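A minimal sketch of that per-user layout in shell. All paths and the application name here are hypothetical, and a real installer would unpack a self-extracting archive where this sketch writes a one-line stub:

```shell
#!/bin/sh
# Per-user install: each app gets its own directory, shared libs go in ~/lib.
APPDIR="$HOME/apps/myapp-1.0"           # hypothetical application directory
mkdir -p "$APPDIR/bin" "$HOME/lib"

# Stand-in for unpacking a self-extracting archive into the app directory:
printf '#!/bin/sh\necho myapp running\n' > "$APPDIR/bin/myapp"
chmod +x "$APPDIR/bin/myapp"

# The lines a profile file (e.g. ~/.profile) would gain, so the shell and
# the dynamic loader can find the new locations without root:
PATH="$APPDIR/bin:$PATH"
LD_LIBRARY_PATH="$HOME/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export PATH LD_LIBRARY_PATH

myapp                                    # prints: myapp running
```

Since everything lives under $HOME, no su password is ever needed for a personal install.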
I have never been too keen on ANY dependency on the Internet for running applications, which the system being discussed seems to suggest. This could lead to an even worse monopoly than Microsoft's, owned by the corporations that control the Internet's "backbone" servers, even if a FOSS system like GNU/Linux is the basis for it.
It sounds cool, for small apps. To be honest, though, I wouldn't want to think of my office suite or IDEs as "cached." I realize it is essentially the same, but eventually you may want to clear the cache without losing your big applications.
The other thing is, understandably, some people don’t care about recent software. They just want software that doesn’t fail.
"I suggest you actually read the article. This is not Slashdot."
Yeah… On Slashdot, they had this story a couple of days ago… Why is it that when someone here reads a post they don't agree with, they label it "a Slashdot troll"? Do any of you actually visit Slashdot any more? Or are you too busy sucking up to Eugenia?
Anyway… Back on topic. I found Zeroinstall to be a neat idea when I tried it, but I had problems using it from behind my firewall at work. As someone else said: nice idea, but not practical in the real world.
"The other thing is, understandably, some people don't care about recent software. They just want software that doesn't fail."
Agreed. That is exactly what I hear today from end users, “I don’t want any more features, I just want it so my application doesn’t crash all the time”. Most people are happy with the applications they have, they just want stability, possibly better documentation and help; that is about it.
I can see this type of thing being very useful in some cases, and it's interesting to think of an entire distro that is never installed, just 'cached' from the network/Internet. Still, the usefulness seems limited. Maybe in very large intranets with many clients that all do similar tasks, such as call centers or large data-entry groups. The IT guys can set up all the software required on the main server, and the client machines will simply pull it off as needed. Instant client swapping, without the effort of making sure each machine has all the required software up to date, sounds attractive in that kind of environment.
nah, there are better ways… http://www.csis.gvsu.edu/~abreschm/uafhs/
BTW, think about the LGPL or BSDL; such things under the GPL limit its chances of becoming a standard and being widely adopted.
ZeroInstall is an interesting idea, but with hard drives today, there is not much of a reason to avoid a permanent install. I think UAFHS (User Accessible File Hierarchy System) with current tools and a nice front end would be a better approach for simply installing stuff as a non-root user.
Suppose that you’re sitting on a Linux box that you don’t have root on. Suppose that it doesn’t have bzip2 installed. Suppose that the email you just got has an attached bzipped file in it. Suppose you don’t want to do this:
wget http://url.to/tarball;
tar zxf tarball;
cd tarball;
./configure --prefix=/your/home/dir && make && make install; export PATH=$PATH:/your/home/dir/bin;
cd .. ;
bunzip2 file.bz2
Suppose you have 0install. Now it becomes this:
/uri/0install/url.to/bunzip2 file.bz2
Can you see the point?
To continue trolling with examples, let’s take a look at what it takes to get Mozilla running and browsing OSNews with some usual suspects.
Windows:
Download Mozilla installer package.
Double-click the installer package.
Hammer enter until it goes away.
Find the Mozilla icon.
Double-click it.
Enter http://www.osnews.com in URL bar, hit enter.
OS X:
Download Mozilla disk image.
Go to Mozilla disk image in Finder.
Drag Mozilla.app to where you like keeping your apps.
Drag it to Dock too.
Eject Mozilla disk image.
Click Mozilla Dock icon / double-click Mozilla.app.
Enter http://www.osnews.com in URL bar, hit enter.
Apt (or pretty much any other distro installer):
sudo apt-get install mozilla
mozilla http://www.osnews.com
Zero Install:
/uri/0install/wherever.it.is/mozilla http://www.osnews.com
I don’t think 0install creates nice menu entries though.
And forgive me for not wanting to try writing an example of RPM dependency hell / compiling-from-source dependency hell.
Coral Snake wrote:
“I have never been too keen on ANY dependancy on the Internet for running applications like the system being discusses seems to suggest. This could lead to an even worse monopoly than Microsoft owned by the corporations that own the Internet’s “backbone” servers even if a FOSS system like GNU/Linux is the basis for it.”
Zero-Install doesn’t depend on any kind of central server. In fact, Zero-Install works through standard webservers, so anyone can put software up through Zero-Install. So no, I don’t think this could lead to any kind of monopoly, unless the internet itself is monopolised.
Kick The Donkey wrote:
“Why is it when someone hear reads a post they don’t agree with, they label the post “a slashdot troll”.”
Blimey, it was just a joke. It seemed clear from the post that the poster had not read the article (actually he had and the post was intended ironically). Also, on Slashdot people joke all the time about how nobody ever reads the articles. “Read the article? You must be new here.”
“Do any of you actually visit Slash any more? Or are you to busy sucking up to Eugenia?”
“Sucking up to Eugenia” sounds so harsh! I prefer to think of it as “getting on Eugenia’s good side”.
“Anyway… Back on topic. I found Zeroinstall to be a neat idea when I tried it, but I had problems using it from behind my firewall at work.”
That's strange. I'm pretty sure Zero Install uses nothing but standard HTTP connections to download stuff. So if you can browse the web, you should be able to use Zero Install. Are you sure nothing went wrong during the installation?
Wouldn't this be impractical without some kind of clever update tool? FOSS gets updated pretty often (at least the bigger programs). I don't see how anyone other than Microsoft, or Sun, or some other massive corporation can afford the constant high-bandwidth costs.
"Wouldn't this be impractical without some kind of clever update tool?"
It works like your web browser. When you click on a program, the cached version runs. If you click the Refresh toolbar button in your filer, a progress box appears briefly, and after that you'll get the latest version.
(if you don’t have a graphical filer that supports this, you can run the 0refresh command from the shell instead)
Clicking Refresh only updates the index (so the system knows the new version exists). The new version is actually fetched the first time you run it.
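The split between refreshing the index and lazily fetching the binary can be modelled in a few lines of shell. The file names and function names below are invented for illustration; this is not how Zero Install is actually implemented.

```shell
#!/bin/sh
# Toy model: "refresh" only updates the version the index advertises;
# the program itself is fetched lazily, on the next run.
INDEX=/tmp/zi-demo-index; CACHED=/tmp/zi-demo-cached
echo 1.0 > "$INDEX"; echo 1.0 > "$CACHED"

refresh() { echo 1.1 > "$INDEX"; }       # stand-in for 0refresh: index only

run() {
    want=$(cat "$INDEX")
    if [ "$want" != "$(cat "$CACHED")" ]; then
        echo "fetching version $want"    # the real download happens here
        echo "$want" > "$CACHED"
    fi
    echo "running version $(cat "$CACHED")"
}

run        # running version 1.0 -- cache already current
refresh    # index now advertises 1.1; nothing downloaded yet
run        # fetching version 1.1, then running version 1.1
```

Note how the refresh step itself transfers almost nothing; the bandwidth cost is deferred until a program is actually used.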
Note that previous versions are still available. Refreshing just lets the system know about newer versions. There are some screenshots showing how the multi-version thing works in the freshmeat editorial about it:
http://freshmeat.net/articles/view/1049/
(see “Upgrading software”)
"FOSS gets updated pretty often (at least the bigger programs). I don't see how anyone other than Microsoft, or Sun, or some other massive corporation can afford the constant high-bandwidth costs"
Unless this system encourages people to upgrade more often, the load is no higher. Also, only the index has to come from the master server. Peer-to-peer and mirrors can be used for the data (MD5 sums are in the index).
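The mirror-verification idea can be sketched like this. The file names are invented and the real index format is not shown; the point is only that a checksum from the trusted index lets you safely use an untrusted mirror.

```shell
#!/bin/sh
# Verify a copy fetched from a mirror against the checksum recorded in
# the master index, falling back to the master server on a mismatch.
echo "demo payload" > master_copy        # what the author published
echo "demo payload" > mirror_copy        # what an untrusted mirror served

expected=$(md5sum master_copy | cut -d' ' -f1)   # sum stored in the index
actual=$(md5sum mirror_copy  | cut -d' ' -f1)

if [ "$actual" = "$expected" ]; then
    echo "mirror copy verified"
else
    echo "checksum mismatch: refetch from the master server"
fi
```

Only the small index has to come from the master server; the bulk data can come from anywhere, because tampering is detected by the checksum comparison.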
Also, the amount downloaded is typically less.
When I 'apt-get upgrade' my dad's Debian system while I'm home, it fetches a few hundred MB of programs. Much of this data won't even be accessed before I do another upgrade next holiday (but I want it upgraded just in case he does use it). Zero Install only fetches what you actually use, so you usually end up downloading less.