The Ubuntu Live conference began yesterday in Portland, Oregon, with Mark Shuttleworth's keynote presentation. Shuttleworth, the Ubuntu project's charismatic leader, discussed a wide variety of topics relating to Canonical's business prospects and the future of the Ubuntu Linux distribution.
Torrent of the keynote?
I haven’t found any video of the keynote at all.
Without paying the RIAA a license fee? Surely you jest?
The immediate future of computing is going to get very interesting. Jobs, Gates, Shuttleworth.
I was thinking the same thing. Shuttleworth and Ubuntu are taking on a life of their own. It’s becoming a sort of phenomenon, whether it really is the best packaging of linux out there or not, it is being “sold.”
yep, just that dell is selling desktops and laptops with ubuntu pre-installed, to consumers no less, is telling.
i just wonder how well those sales are going.
all in all it seems that ubuntu is now turning into what red hat was for linux in the early days (a de facto standard that everyone else followed).
And based on that, in another year or two people will have a new favorite distro and leave ubuntu behind. Let's look at history: first it was slackware as the favorite, then red hat, then debian, then gentoo, now it's ubuntu.
Someone could argue that history hasn’t been long enough to let us predict such a pattern.
That said, of course something will replace Ubuntu at some point. Heck, even Windows will be replaced by something else one day.
The interesting thing about the examples you’ve given is that every one of them has been *more* popular than the preceding one – which would mean that the eventual successor to Ubuntu would be even more popular. That can only be good for Linux (which I’m pretty sure wasn’t what you were thinking about when you wrote your negative little comment…)
Slackware was the favorite when it was the only one around. RedHat became predominant soon after. Debian always had its fans, but never became dominant over RedHat in terms of users. Gentoo was never particularly significant market-wise, though it certainly has a very vocal and energetic user-base.
bingo. red hat held on for the longest time, but when they forked off the base distro as a community project under fedora it kinda dropped off the radar. and soon after that i believe ubuntu walked onto the scene.
Agreed. I began to look elsewhere when Fedora came to be after RH9. I much prefer Red Hat over Fedora, though I suspect a good portion of that is more about attitude than technical merit.
sales must be going well as they are planning to expand the offering beyond the US!
http://www.linux-watch.com/news/NS4156929011.html
Sadly, in the UK linux adoption within business is quite limited at the moment; this could potentially change that!
Anyone can agree that Linux has exponentially improved over the past couple years. But this conference shows how much the high-tech businesses are investing in Linux. Look at the sponsors. They are all industry movers-and-shakers.
on the topic of improving. i had an almost anticlimactic experience setting up an hp home network printer using gobolinux and kde.
a friend and mac user was talking to me on im about linux being problematic when it came to desktop tasks like installing printers.
while the conversation was going on i fired up kde's control panel, selected "add printer", entered the ip and port (the default was shown in the dialog and i used that), selected the printer from a list of available drivers, and the test page printed fine (except the config defaulting to us letter while i was using A4. minor issue that was quickly sorted).
it was so simple that i was wondering if the printer would blow up the moment i turned my back to it or something.
linux has indeed come a long way these past years.
Linux's problem has never been high-quality equipment and specialty hardware, mostly due to its roots in universities.
The problem (although it's gotten a lot better in recent years) isn't a $50,000 ultra-high-resolution printer of which only 10,000 have ever been built, but the cheaper-by-the-dozen $24.95 crap that came with your PC, and your monitor, oh, and another one with your cereal.
true, with pci and usb the old "winmodem" issue has blossomed into absurd proportions.
more and more hardware is a single chip, and a massive driver blob.
Good for you. I have been trying to get the sound working on my lenovo 300 n100. It's been almost 4 months now, but I have not yet got it to "just work". The chipset and motherboard are also intel 845, so not really obscure. The only hack is to switch off acpi (which seems ok, as acpi never works for any linux distro I have tried).
using the snd-hda-intel driver?
also, a common mistake is that when alsa loads the driver for the first time, all volume settings are 0 and muted. so even if the driver loads you may not get sound out of it.
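if it is the muted-by-default thing, a rough sketch of checking it by hand (control names like Master and PCM are the usual ones but vary per card, so treat them as guesses):
lsmod | grep snd_hda_intel      # confirm the driver actually loaded
amixer set Master 80% unmute    # raise and unmute the main output
amixer set PCM 80% unmute       # PCM is often muted out of the box too
alsamixer                       # or just do it interactively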
but yes, sound seems to be much more of an issue than printing these days.
While I use and like Ubuntu, and approve of many things mentioned in this talk (virtualization, LTS releases, hardware and software certification, mobile platforms), I really don’t like the following:
…call for the open-source community to collectively synchronize major releases so that all new versions are issued within a unified time window. It's a good idea if you're an Ubuntu user/developer, because each release will contain the latest versions of everything, but if you're on any other distro you're out of sync and suffer. The six-month release cycle is potentially too long for upstream developers (a six-month-old version of your software is too old; e.g. wine goes through 12 releases in that time). Also it keeps alive the philosophy that "everything is part of the distribution"; there is still no unified way of installing third-party software on any Linux distribution.
I really like Mark Shuttleworth's idea. He has also promoted it at akademy (Video: http://home.kde.org/~akademy07/videos/1-06-Keynote-Shuttleworth.ogg)
It is not only about Ubuntu.
– Think about the media echo: if we released the latest and greatest of the free software world in a constant rhythm.
– Think about (non-free) competitors: If they see us coming (if we are behind them) or running away (if we are already ahead) in a constant rhythm like release… release… release… release…
While they are developing and developing to get out something new in maybe 3 or more years.
– Think about the developers. Most programs depend on other programs or libs. If I know when a new lib or program that I need will be released, I can plan my development much better. I can also rely on a developer version of a program/lib while developing my new version, because I will know when that developer version will become stable, so that my release will not depend on developer versions of programs/libs but on stable versions.
– Think about distributions. Never mind whether they have a 6-month, 12-month, 18-month, … release cycle. They could plan to release their new version with the latest and greatest of the free software world. Not like today, where you typically get the latest and greatest version of X, an almost up-to-date version of Y and an outdated version of Z.
I'm not saying it's the one and only way to go. But it's definitely a nice idea which is worth considering.
Using 6 month old software is not outdated. It’s mature, and stable.
This is also only a problem for distros that live off selling CDs. The true spirit of open source is in Debian unstable, where on any day you have the latest and greatest regardless of release schedules, or Debian testing/stable, where on any day you have the currently most stable release.
>Using 6 month old software is not outdated. It’s mature, and stable.
Just because software is getting old doesn't mean that it becomes mature or stable.
You can see this quite well if you look at relatively young (GUI) apps. They get so much better from version to version, both in features and in stability, that it is just insane to use old versions.
Also, a common release cycle doesn't mean that all distributions have to ship at day one with all point-zero versions.
Depending on the goal of the distribution, they can ship on day one with X.0 versions (and then deliver updates when bug-fixes come in), in the middle of the release cycle with an X.2 or X.3 version where some major bugs are fixed, or at the end of the release cycle to ship the most polished versions.
Using 6 month old software is not outdated. It’s mature, and stable.
This is conventional wisdom that often doesn't translate to the OSS world. It's not like the GNOME folks release patches for old versions of GNOME. Any bugs that were in GNOME 2.6.1 are still there. GNOME's development methodology tends towards each new release being the most stable version of what is by now a very mature platform (GNOME 2.x).
“Using 6 month old software is not outdated. It’s mature, and stable.”
What about Windows XP? It is old, but it sure isn't mature or stable. Not compared to what it should be.
By that I mean that it should be, out of the box, secure from viruses and malware (or something close to what Linux and Mac OS X are). Until that happens I will NEVER agree that Windows is mature.
I think the idea is deeply incompatible with the open source software development philosophy.
The idea is that developers for a given project are free to manage their project however best suits their needs, and the release schedule is one of the most important aspects of project management.
Yes, there could be some advantages in all projects following the same schedule, but forgoing those is IMO a necessary trade-off to obtain most of the other benefits of open source development.
there is still no unified way of installing third-party software on any Linux distribution.
Like Windows and OS X? Windows still has a dozen different commonly-used installers, and even though a lot of OS X apps use drag & drop .dmgs, Apple’s own apps still use packages, and lots of third-party stuff has its own installer.
I don’t really understand what the whole argument is about. All three systems are more or less comparable. The only feature that really shines is OSX’s appfolders, but then this may not be needed due to the more advanced automatic package management in Linux.
Windows installers are different programs, but behave (from a user's view) more or less the same, and few users seem to have problems with that. That's your standard way of distribution.
OSX’s .pkg files behave a lot like installers in Windows. Drag’n’drop appfolders are simpler, but only valid for simple applications. That’s all you need.
On Ubuntu, distribution would be as a .deb file, which is enough for practically all needs.
One thing that would be nice – and which none of the three can do – is that installation and removal of packages works completely without any specific code being executed (that is, without pre-/post-scripts). Such code tends to produce unexpected behaviour, even without actual bugs in the scripts.
the issue is more that on windows or mac, the app and whatever third party libs it uses come as a bundle.
this can be both a blessing and a curse. i recall microsoft finding a problem in one of their base libs. but the only real fix they could provide was some patches for their own products, and a program to sniff out other copies of the lib in the folders of the individual third party apps.
but from a usability standpoint, having one-stop downloads of software from places like filehippo, snapfiles, download.com or the classical tucows makes it simpler for third party developers to just toss the program up there and hope someone finds a use for it.
then there are ever so slight variations in where files go. when i recently tried to install some stuff for kde i found that one had to set a compile-time option pointing to the base directory of kde. this is because some distros, like debian, put it in /opt/kde, while others put it under /usr.
this, and having rpm, deb, and maybe a host of other variants, makes it a nightmare to maintain installers.
thank god that i chose to use gobolinux, where i can run makerecipe on a url to the source code tar-ball if there isn't one up on the recipe store, and 9 times out of 10 it will install neatly in its own versioned sub-dir under /programs.
if one is lucky one can even invoke newversion when a more recent version than the one the recipe store has available shows up, and the old one will be used as a basis to make a new recipe.
and if one wants to make a binary package: go createpackage after the compile. this basically tar-balls the versioned subfolder and that's it.
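roughly, the whole cycle looks something like this (a sketch from memory, so treat the exact argument order as a guess and check the Compile docs; Foo and the url are made up):
MakeRecipe Foo 1.0 http://example.org/foo-1.0.tar.gz   # generate a recipe from the source tar-ball
Compile Foo                                            # build and install into its own versioned dir
NewVersion Foo 1.1                                     # clone the recipe when a newer release shows up
CreatePackage Foo                                      # tar-ball the versioned subfolder as a binary package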
mind you, the current gui available for managing all this, manager, is dreadful. i wonder if i should read up on python and see if i can rework it.
This is an option, though I would not suggest doing it this way, since it will write locally compiled files into the location controlled by the package manager.
Usually it is easier to just use the default prefix (/usr/local) and extend $KDEDIRS to include it.
This often has the additional advantage of /usr/local being writable by a non-root group (e.g. "staff") and thus not requiring root access to install local software.
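A minimal sketch of what that looks like, assuming a typical autotools-based KDE 3 application and a group-writable /usr/local:
./configure --prefix=/usr/local
make
make install                                       # no root needed if /usr/local is writable by your group
export KDEDIRS=/usr/local:$(kde-config --prefix)   # make KDE search /usr/local as well as the distro prefix
kbuildsycoca                                       # rebuild the resource cache so the new application shows up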
true that, but when one has such vital stuff as desktop environment files existing in different places based on what distro one is using, things get quite hairy, quite fast…
Well, I have no idea how other desktop environments are handling this, but KDE uses the list in $KDEDIRS for all its own resources, thus installing into any of them works also for config or application data, etc.
Paths can easily be queried using kde-config, including some of the desktop environment related paths from the freedesktop.org specification (e.g. location of application .desktop files, menu files, etc).
Check
% kde-config --types
for a list of resources
heh, i should have made it more clear. this was theme/style files. and at compile time no less.
don't know why it was done the way it was done, as i was simply reading the install file for instructions. and two separate files listed more or less the same instructions.
Why?
Just pick a distribution and stick to it.
Pick a distribution with a large community (Ubuntu, Debian or Fedora) and someone else will have already compiled it for you before you even finish your source code download.
If you are a developer … then this won't be a problem either. Set up a few different compiler environments (Ubuntu, Debian and Fedora) and compile your app once for each one.
if only one could have a single compile environment that would spit out different packages for different uses. or some kind of meta-package that could morph to fit into different packaging systems, maybe.
but the more complexity or variation one adds to something, the less likely it is that anyone will bother.
There is a huge difference though. On Windows, installers tend to work. Whether an app was packaged with InstallShield, WISE, Inno Setup, NSIS or whatever, in 99% of the cases it will work. The same thing cannot be said about Linux. RPMs don't work on Ubuntu and DEBs don't work on Fedora (*). Heck, Fedora RPMs usually don't work on Mandriva or SuSE, which also use RPMs. I'm not sure about Debian-based distros as I've never used anything besides Ubuntu. The problem there is much larger than on Windows.
(*) Unless you use Alien to convert RPMs/DEBs. But even then many packages tend not to work, for the same reason why Fedora RPMs usually don’t work on SuSE.
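For reference, the conversion itself is usually just something like this (a rough sketch; the package name is made up):
sudo alien --to-deb somepackage.rpm    # produces a somepackage_*.deb next to it
sudo dpkg -i somepackage_*.deb         # may still fail if library versions or file paths differ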
This has many causes, including compilation against different library versions, and ABI incompatibilities between compiler versions. But the fact is, the problem on Linux is much larger than on Windows. A maybe even larger problem is that many Linux people don’t recognize this as being a problem.
But the fact is, the problem on Linux is much larger than on Windows. A maybe even larger problem is that many Linux people don’t recognize this as being a problem.
I completely agree.
However, I don't think we'll ever have a common package format, because that's not the way Linux distros work. The end user pays for this in time spent.
The only sure way of having up-to-date packages on eg. Ubuntu is to learn how to compile stuff yourself.
On Windows installers tend to work (for some version of "work") because they target a single OS. Ubuntu and Fedora are different OSs (even if they use the same kernel), so it's no surprise that software packaged for one doesn't work on the other!
No packaging system or "unified installer" will fix this problem, and talking about installers or packaging systems is really a waste of time. The problem isn't packaging, it's ABIs.
Getting every Linux vendor to agree to a common ABI is probably out of the question. The best approach is to pick one of the popular distributions and hope they package all the software you need. In practice, I haven't seen this to be a particularly huge problem; it's been years since I compiled anything for a Linux machine because it wasn't available in a repository. Heck, I've compiled more software for OS X because there were no binaries available!
"On Windows installers tend to work (for some version of "work") because they target a single OS. Ubuntu and Fedora are different OSs (even if they use the same kernel), so it's no surprise that software packaged for one doesn't work on the other!"
The problem still exists because people consider distros to be entirely different operating systems instead of variations of the same operating system. As I said, people don't recognize it as being a problem.
Most seem to be pretty much compliant with the ABI the LSB specifies.
Somehow people seem to assume that distribution packages are the only way to install software on Linux and then request/demand/suggest the invention of some kind of installer-based approach, while there are already several options for this, e.g. zeroinstall, autopackage, InstallJammer, Bitrock's InstallBuilder, …
The fact that most popular free software projects are available through the distribution package repositories (because it removes the system integration burden from the developers) does not imply that other projects can't just use one of the third-party installers.
I would be surprised if any of those who repeatedly rant about the installer thing on comment sections like this one have ever added a respective wish item to any project’s issue tracking system.
There is a huge difference, though. On Linux, package management tends to work.
Seriously, this argument crops up time and again, and it makes no sense. Why the hell would you have to convert anything? If your distro doesn’t package something you need, file a bug for heaven’s sake!
Different distros do things differently. That's the point. Different methods, different standards, different layout. An installer that would work on all of them would not work well on any of them.
Linux is not Windows, nor should it be.
http://linux.oneandoneis2.org/LNW.htm
EDIT: while you’re at it, why not hop on to comp.lang.lisp and suggest to replace the parentheses with whitespace?
“There is a huge difference, though. On Linux, package management tends to work.”
And on Windows it doesn’t? I can’t even remember the last time that a Windows app refused to install.
“Seriously, this argument crops up time and again, and it makes no sense. Why the hell would you have to convert anything?”
Because I’m using Ubuntu and Epson doesn’t provide .DEBs for their scanner drivers?
“If your distro doesn’t package something you need, file a bug for heaven’s sake!”
If your distro refuses to package that app, then what? If they have "higher priority issues" to take care of first, then what? If you have to wait a year for them to package it, while you need it now, then what? What choice do I have other than using Alien or compiling manually?
"Different distros do things differently. That's the point. Different methods, different standards, different layout. An installer that would work on all of them would not work well on any of them."
“Different web servers do things differently. That’s the point. A web server that would be compatible with all web browsers would not work well on any of them.”
Do you realize what you're saying? The whole point of standards is to make things compatible. Are you telling me that nobody should follow the HTML standard? That nobody should follow the HTTP standard? That nobody should follow the British English standard or the international metric standard?
“while you’re at it, why not hop on to comp.lang.lisp and suggest to replace the parentheses with whitespace?”
Well, what you said in your last paragraph is similar to "different Lisp implementations should use different characters in place of parentheses".
Get back to me when it updates Total Commander for you, instead of Microsoft programs only.
I notice you didn’t include a launchpad link.
Please point me to the RFC where package management is defined.
No, if you want to stick to that example, that would be like expecting the .fasl files of different implementations to be compatible, and proclaiming lisp doomed when they aren’t acknowledged as a problem.
You’re refusing to acknowledge that only source compatibility can be guaranteed by the model employed by the GNU/Linux world.
“Get back to me when it updates Total Commander for you, instead of Microsoft programs only.”
I don’t have Total Commander. But Ragnarok Online, Starcraft, Skype, Daemon Tools, GTK, Gimp, Gtk-Sharp, Firefox, Crimson Editor, ActivePerl, Delphi, Cygwin, they all install correctly.
“Please point me to the RFC where package management is defined.”
Of course that RFC will never come if you continue to think that ABI (which is actually more important than package management, as ABI is what makes things compatible) *shouldn’t* be standardized. If we are talking about HTTP then the conversation would be like this:
Person A> “Web browser X works with web server H but not web server I. The WWW protocol should be standardized.”
Person B> “What a load of crap. Different web servers do things differently. If web browsers work on all web servers then it wouldn’t work well on any.”
Person A> “But we’ve already standardized HTML, why not web server protocols?”
Person B> “Show me the RFC for web server protocols then.”
Well duh, of course that RFC doesn’t exist and will not exist if you keep going in circles.
“You’re refusing to acknowledge that only source compatibility can be guaranteed by the model employed by the GNU/Linux world.”
No, I’m saying source compatibility isn’t enough. Software between distros *should* be both source compatible and binary compatible.
And I suppose Windows Update takes care of those as well, then? You’re deliberately missing the point of whole-system package management. Installation is but one.
You keep bringing up protocols and web servers, when you’re debating the webserver internals. The ‘problem’ is a level removed from where you’re projecting it.
Tell you what, write up that Request-For-Comments, and I’ll provide comments on it.
From where I stand, it seems plenty enough. Upstream provides the source tarball, your particular distro makes sure it works within their environment and provide the binary package for your convenience. How does this not work?
It’s not the same as Windows, but this isn’t Windows.
“And I suppose Windows Update takes care of those as well, then? You’re deliberately missing the point of whole-system package management. Installation is but one.”
Yes I am. I'm not talking about whole-system package management. I'm talking about *binary compatibility*. I agree that Linux package management makes installation and updating a lot easier, but the binary incompatibilities between distributions bug me.
“Upstream provides the source tarball, your particular distro makes sure it works within their environment and provide the binary package for your convenience. How does this not work?”
And this is exactly what I meant by “people don’t recognize this as being a problem”.
Imagine the following. You’re using SomeDistro version 1.0, which ships Firefox 1.5. Today, Firefox 2.0 was released. You want to upgrade to Firefox 2.0, but uh oh, your distro only ships 1.5. You have to upgrade to SomeDistro 2.0 just to get Firefox 2.0. Doesn’t that seem very odd to you? To upgrade an entire distro just to upgrade from Firefox 1.5 to 2.0?
Now, wouldn’t it be a lot easier if you can just install upstream Firefox 2.0 binaries, that work on all Linux distros? And *this* is where binary compatibility jumps in. Mozilla has to maintain a dedicated, highly tweaked “build box” – a machine with a highly tweaked Linux distro – just to build binaries that work on most Linux distros. Do you think the average developer even has the resources to maintain such a dedicated build box?
Also, imagine yourself as a commercial developer (yes yes, closed source is evil yadda yadda). You have developed an extremely popular program called Photoshoot. Lots and lots of customers started to ask for a Linux port. Because of patents and the fact that you licensed third party libraries (and many other reasons), you cannot open source Photoshoot. And now what do you do? You can't ask distros to package your app. Isn't it a huge pain to make 5 different binaries, compiled on 5 different Linux distributions? Why wouldn't a single binary that works on all distros be a good thing?
“It’s not the same as Windows, but this isn’t Windows.”
That’s no excuse. If you say “Sure we don’t have a GUI in FooOS like Windows does. But this isn’t Windows.” then you’ll have a hard time convincing your grandmother to use FooOS.
Imagine that I’m running Gentoo, Arch, or any other distro with rolling updates. Oh, wait! I am!
If SomeDistro is worth its salt, it will backport security fixes.
Because granny is likely to care that FooBarBaz 1.4 went to 2.1 and upgrading to it right now will make or break the deal? Pure nonsense.
“Imagine that I’m running Gentoo, Arch, or any other distro with rolling updates. Oh, wait! I am!”
In other words, the same old “I’m using Gentoo/$FAVORITE_DISTRO and fsck everybody who uses something else” elitism.
"If SomeDistro is worth its salt, it will backport security fixes."
(Firefox 2.0) != (Firefox 1.5 + security fixes)
“Because granny is likely to care that FooBarBaz 1.4 went to 2.1 and upgrading to it right now will make or break the deal? Pure nonsense.”
FooBarBaz 2.1 supports the new webcam that she bought, while 1.4 does not. So yes, not being able to upgrade right now *will* break the deal.
Or, a more realistic example: Firefox 2.0 supports http://www.example.com while 1.5 has rendering bugs which prevents http://www.example.com from being rendered correctly. Grandma is an avid example.com visitor, and not being able to upgrade to Firefox 2.0 is a huge deal for her.
“If you’re in that kind of bind, you have the resources to ensure your binaries are LSB compliant and just drop it all to /opt. I understand its purpose was this. Then all you have to watch are glibc updates.”
Good for them. But big news for you: I am not a commercial developer. I’m but a lowly open source developer who wants to make Linux binaries available for my end users. Why should I file 20 bug reports to 20 distros to have my app packaged? Why don’t you let me do it myself? Why would it be a bad thing if I can do it myself?
“Also, I couldn’t help but notice that you have the autopackage website as your homepage, and you’ve posted before claiming to be a developer. That certainly explains your unwillingness to acknowledge that the current system works.”
What does autopackage have to do with this? I didn’t mention autopackage because it’s not the topic. Now you’re just finding an excuse to use personal attacks on me. What a poor way to have a discussion.
It’s “there’s a solution out there for people bothered by this already” elitism.
See above.
But then, you’re not upgrading a distribution as you originally stated, are you? Because if that version does not meet her needs, it won’t get installed, period. The newer $distro will, if it does. At no point does a needed update enter the situation. A newer WhatEver might provide more features, sure, but to even install a distro presupposes that it meets all requirements. You’re arguing for a situation that will never reach that state.
Pointing out you have a vested interest in the topic is a personal attack? Heh. Cute.
Eh.. I notice you added a paragraph, but I’ve run out of editing time.
Let’s for a moment suppose software patents are valid here (they aren’t), and that I would want to run non-free software (I don’t).
If you’re in that kind of bind, you have the resources to ensure your binaries are LSB compliant and just drop it all to /opt. I understand its purpose was this. Then all you have to watch are glibc updates.
Also, I couldn’t help but notice that you have the autopackage website as your homepage, and you’ve posted before claiming to be a developer. That certainly explains your unwillingness to acknowledge that the current system works.
If you have to wait a year, I'd suggest you are definitely using the wrong distro.
Why don’t you simply go for a distribution that is awake, rather than a dead one?
Use a better distribution.
Ubuntu, Debian testing, Fedora, SuSE or even a relatively small but "on the ball" distribution such as PCLinuxOS would be the go.
You obviously don’t install many Windows applications then, either.
Windows-think. Only a person stuck inextricably in Windows-think could have a complaint such as this.
Across the FOSS world, applications are compatible via their source code, not via binary executable file compatibility.
In this way, I can run a FOSS GTK application on any x86 box, on a MIPS processor, on a PowerPC, on an old Mac, on an Xbox, on a PS3 with a Cell processor, on an IBM mainframe, on an ARM PDA, on an iPod or an iPhone or even on a Windows x86 box.
I don't rely on one particular rigid processor architecture and an ancient, legacy-problem-ridden binary API.
Ease of installation and portability of package management on Linux beats the socks off anything on Windows.
What is more, if I stick to a policy of “install only from my distribution’s repositories”, then I am guaranteed to have no malware.
With rare exception* the file install process begins the same way in Windows and OS X: Double click the file.
I cannot say the same for Ubuntu, my favorite (chocolately good) Linux.
Open Synaptic … it’s probably there.
Add restricted repositories … it’s 99% certain to be there.
If it’s in Synaptic, check the box, click the button to install. (And frankly, Synaptic makes it very easy, if the file you want is there.)
If it’s not in Synaptic, or the version I want isn’t in the repositories yet …
1)Download file
2)Open terminal
3)sudo dpkg -i filenamehere if it's a .deb package.
OR sudo something completely different for a .tar file
OR sudo something completely different for a tar.gz file
OR sudo something completely different for a .tar.bz2 file
—
Because starting with a double click, or some sort of swiss-army-knife command that will open and install any program written for linux, is too convenient and would make sense.
And that is one of the things I found most frustrating when starting Linux — there wasn’t a standard way to start a file install.
—
*Yes, the install process for “windows update” and “software update” is different, and Ubuntu’s software update process works much the same as Apple’s; but if I go to Microsoft or Apple’s sites and download those same update files by hand, the install process starts with a double click. I cannot be sure what I’d have to do to start an install of any files I download by hand in Ubuntu.
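For what it's worth, the "sudo something completely different" steps above typically end up being something like this (a rough sketch; the file names are made up, and a source tarball also needs the build tools installed first):
sudo dpkg -i somepackage.deb               # a .deb downloaded from outside the repositories
tar xzf someprogram-1.0.tar.gz             # .tar.gz (tar xjf for .tar.bz2, plain tar xf for .tar)
cd someprogram-1.0
./configure && make && sudo make install   # the usual source-build dance if there is no package at all
Which rather proves the point: none of that starts with a double click.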
I updated to Opera 9.22 by downloading the .deb, then double-clicking it to run the graphical installer. Auto-update from the repository happened later.
there never will be a unified way to install third party software across the linux distros
I don’t like the synchronization wish either.
Ubuntu's job is to make a distro, not to dictate how upstream works.
Of course, it will be easier for Ubuntu if they know when a specific component will be released, but I think this will introduce one constraint too many for upstream.
I admit it. I’m a Mac user. But I can’t wait until the time that there are ads on TV for Ubuntu.
I’d love to see something like:
“Welcome to Ubuntu Linux. Welcome to the outdoors. Step out from behind Windows and see what the internet and the world is really like.”
Or something like that. Play on the fact that MS's OS is called Windows by telling people to step out from behind it.
“Tired of cleaning Windows from viruses? Step outside to Ubuntu and never worry about a virus again.”
Couldn’t reply to just one, since I’m trying to cover both threads…
As far as unifying the release schedules, I’m sure Shuttleworth is only talking about the major ones, like X.org, Gnome, Kernel, etc. This would be a fantastic thing. If indeed there were a 6 month release schedule for these, or even a year, then every version or every other version would have the newest.
We also wouldn’t have to worry about major versions of a project missing feature freeze of the distribution. How many times has a distribution planned on getting a new version of software into their release, but then it barely slips, and then they have to tell their users “well, wait another X months until our next release.”
As far as the packaging issues. I think this would be fairly easy to ‘fix’ (assuming you think there is a problem. I know I think there is one, but I don’t think it’s as big as what some make it out to be.)
This fix, in my opinion, would be for developers to simply provide a debian folder and .spec file within their source. Generally after that it’s easy enough for any distribution to create packages of the software. Slackware just uses tarballs anyhow, and the .deb and .rpm packages would cover 90% of other distributions.
Of course the negative side to this is that it places the problem on the actual developers of the software. Well that really isn’t that big of a problem, considering it is after all, THEIR software. If they’d like to see wider use of their software, making it easier for distributions to package it would be in their interest.
The part that would fall to the distributions would be for easier creation of the /debian and .spec. If developers could just run a script that would for example just read the configure script to see what libraries were required, then intelligently look at what packages would be needed for dependencies, then create the proper file(s), we’d be set.
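As a rough sketch of what shipping those files buys a packager (the package name and version here are made up): with a debian/ directory in the source tree and a .spec inside the release tarball, the builds become one-liners:
dpkg-buildpackage -us -uc        # run in the unpacked source tree; produces foo_1.0_*.deb
rpmbuild -ta foo-1.0.tar.gz      # reads the .spec from the tarball; produces foo-1.0-*.rpm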
There really doesn’t need to be a ‘universal’ package, though the Loki installers have always worked for me (though the actual software that they install sometimes has issues with newer libraries.)
Ubuntu wants to be the desktop OS.
With Desktop OS source compatibility just isn’t enough…not at this time or date.
This is why all Linux distros need unified standards… it's not the GNU way, but it's the only way to make a potential desktop Linux… and that's what Ubuntu is aiming for.
The fastest and by far easiest way for desktop Linux to succeed would be for one distro to become increasingly dominant and in doing so force others to follow its trail… That wouldn't break the "four freedoms" and would work significantly for the benefit of the end user… but the popularity of linux distros is such a wave motion that it's impossible.
We don’t want a free software monopoly, thank you, be on your way.
does it bother anyone else that the most popular distro is made by a proprietary software vendor?