“Utilities like urpmi, up2date, apt4rpm and OpenPKG may be a step in the right direction, especially if they become widely adopted. Until we have a universal tool for resolving application dependencies, we can’t expect casual users to be able to install their own applications.” Read the article at NewsForge.
Why does everything have to be “universal”? There are a number of distributions out there, with a number of tools suited to those distros. You will have to rip Portage out of the cold, dead hands of Gentoo users before they’ll adopt something else as a standard.
What matters is whether there is a single easy, widely available, and most importantly, *complete* repository tool. APT + Synaptic pretty much fits that bill, but the distro which has the most complete APT repository (Debian) is not exactly newbie friendly.
I think Fedora could be a very important step in this area. Ever since RedHat 8.0 came out, I’ve thought it was an excellent distro for the mainstream. Out of the half-dozen times I’ve installed 8.0+, there was never a time the system required more configuration work than a comparable Windows install. In fact, the most annoying part of the install was always the J2SDK, whose installation procedure sucks on Windows as well.
As a community-based distribution, Fedora brings together the advantages of Debian’s huge package repository and RedHat’s excellent distribution core. One of the stated goals of the Fedora project is to package pretty much every available software package into its repository. If Fedora gets a large amount of community support behind it, it’s very possible it could build a repository as large as Debian’s. Plus, with the distribution’s RedHat base, it will be compatible with a wide range of commercial software as well.
Despite the issues involved with the repository-based mechanism, software installation is one area where Linux is clearly superior to any other OS. After being able to use a single command to install a program, or use a simple GUI to just double-click on the program you want and have it installed, all the stuff you have to do in other OSs just seems so primitive.
>There are a number of distributions out there, with a number of tools suited to those distros.
This is exactly the problem. For the author, and the way he perceives the Linux market and how to grow that market, this means inconsistency, incompatibility, and library hell; in short, a major problem.
People want universal so that it works across all distributions. It may not be technically feasible, practical, or even the best solution for everyone – but that is what a lot of people want. If I’m getting some neat utility some person wrote off an edu server, it would be nice if that person could make one package that could install itself (binary), make a shortcut for the DE/WM being used, and then pop up a README or instructions. It’s not currently possible (afaik), but projects like autopackage are working on it.
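Purely for illustration, here is a rough sketch of what such a self-contained installer could look like as a plain shell script. Every name and path in it is made up (the “neatutil” program, the install prefix), and real projects like autopackage handle all of this far more carefully; the point is only that the three steps described above are not magic:

#!/bin/sh
# hypothetical self-installing package for a made-up "neatutil" program
set -e
PREFIX="$HOME/.local"
mkdir -p "$PREFIX/bin" "$PREFIX/share/applications"
# 1. install the bundled binary
cp ./neatutil "$PREFIX/bin/neatutil"
chmod 755 "$PREFIX/bin/neatutil"
# 2. write a freedesktop-style .desktop entry so KDE/GNOME show a menu shortcut
cat > "$PREFIX/share/applications/neatutil.desktop" <<EOF
[Desktop Entry]
Type=Application
Name=NeatUtil
Exec=$PREFIX/bin/neatutil
EOF
# 3. pop up the README
${PAGER:-more} ./README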
I think once again Windows is not being addressed in its current state. This is not Windows 95 anymore. I actually had the pleasant experience of installing a Win95 box about 2 months ago, and yes, I ran into “dll hell”. But with XP now, this is simply not an issue. MS has basically standardised on the “common files” that need to be installed. Combine that with the Windows practice of packaging those files with the application just in case, and you’re in the clear.
I do think MS has the right idea, even if it occasionally breaks down. Have the OS maintain and update common shared libraries and have all applications count on those being there. If your library is not on that list, package it with your application.
Yamin
apt4rpm really works, but I can’t find any repositories with unofficial packages (packages not on the CDs) for SuSE…
Do you know of any?
Because using apt for system upgrades is cool, but using apt for installing new software is fantastic.
It is a crying shame that so many of you are trying to kill the messenger without really hearing the message. The author of the article has a very valid point.
I have been using Linux for a few years and spent the last week trying to get Plone from plone.org to compile. There are no RPMs for the latest version, and rebuilding the RPM does not produce a working RPM.
I moved to Linux because of choice, and now I am trapped in distribution prison, where if I find an interesting piece of software I cannot use it unless a kind soul has packaged it for my distribution, and this doesn’t happen as often as it should. Mac and Windows folks can use Plone in a matter of minutes. This is an application with hundreds of thousands of installations.
If you don’t think this is a problem worth solving, you obviously do not deal with end users or are not an end user yourself. And end users are not stupid because they cannot compile a huge application from source. If they came from a Mac or Windows, they know how easy that application was to install.
So, let’s get this thing solved. It is the last thing holding Linux back.
Packaging software is not the domain of end users or system administrators.
Thanks for hearing my little rant. I wouldn’t sympathize as much with the author if I hadn’t gone through hell and back because of this very issue during this past week.
Unix/Linux is fundamentally not a desktop operating system. It’s like trying to make a shelf out of mashed potatoes. Have a free software desktop OS by all means, but there are more suitable animals than Linux for the job.
To state the obvious, you are just repeating the tired line we have been hearing for years.
When KDE began, they said it couldn’t be done. KDE is absolutely incredible and it gets better all the time.
The same thing can be said for many aspects of Linux. The problems will be tackled one at a time.
So make your choice, be part of the solution or be part of the problem. Telling us it can’t be done is not going to cut it.
As GNU/Linux evolves, solutions to the problem described in the article will need to be found.
I am not saying that we should take away any of the freedom (that’s impossible); go ahead and do what you like, but it won’t be certified.
I would like to see specifications released every year that include:
1. All the libraries, software, etc. that are guaranteed to be present.
2. A standard file system layout.
3. Guaranteed backwards compatibility for at least 3 years.
Call it the GNU/Linux OS Specification or whatever. Have subsets like Desktop, Server, Development, etc. This way any developer interested can write software that includes any additional libraries needed in the package.
It’s late, so I hope this makes some sense.
There actually already IS one. It is called the .tar.gz source package. I am NOT trying to troll with this (seriously), but I am trying to get people to see that sources work with all distros (unless the distro maker, for one reason or another, patches them to hell). A “universal” tool could be something like Portage – something that works with source packages. After all, this is Open _SOURCE_; people should learn how to compile source. ./configure tells you what dependencies a source package needs – why hasn’t anyone written a tool that could get the dependencies from that, then search the web/a database/something for them? Well, Portage does this already, but it’s limited to one distro. It doesn’t have to be that way – I bet it wouldn’t require that many changes to use the Gentoo repository worldwide, like FreeBSD does with Ports. And no, I am not a Gentoo user; I’m a long-term Slackware user. But I still like the idea behind Gentoo and FreeBSD. Anyone have/had the same feelings? Well, those are my ideas. Feel free to flame, bait, or crush me, but a universal “source repository” is not really a bad idea – it would also work with all architectures! Probably only some configuration options would need to change in the Makefiles…
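As a very rough illustration of the dependency-from-configure idea above: configure output varies wildly between projects, and mapping an error message to a distro package name is exactly the hard part such a tool would have to solve, but the skeleton might start out something like this sketch (everything here is an assumption, not an existing tool):

#!/bin/sh
# naive sketch: run configure, collect the lines that complain about missing
# pieces, and show where a per-distro package lookup would hook in
./configure 2>&1 | tee configure.log
grep -i -e 'not found' -e 'cannot find' configure.log | while read -r line; do
    echo "configure complained: $line"
    # a real tool would map this complaint to a package name for the local
    # distro and then call apt-get/urpmi/emerge; that mapping is the hard part
done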
>It’s like trying to make a shelf out of mashed potatoes.
this quote is from the Unix Haters Handbook, and was originally made by Jamie Zawinski.
>sources work with all distros
except when your default distro install does not have yacc, lesstif, … installed, or does have them installed, but just not in the right place.
./configure -> damn no yacc installed.
install yacc
./configure -> damn no lesstif installed.
install lesstif
./configure -> damn the dev libraries of XFree86 not installed.
install the damned libraries
./configure -> damn it can’t find the libraries.
fix this by trial and error
./configure -> ok it seems to work.
make -> compiler error 1 (always error 1, without any message, whatever the real cause)
At that point I gave up.
This was when I tried to install ddd from source on a default Mandrake 9.0 install.
Hmm, Plone wasn’t a good example. You see, it’s already in Portage and APT. The installation was literally an “emerge plone” away. I don’t know what it does, but it was easy…
Anyway, the problem is not that there is no standard model, but that developers are often lazy with their Linux packages. For software that’s primarily for Linux (GNOME, Totem, KDE, etc) developers can concentrate their efforts on that market. If every developer would make an RPM of their software and submit it to a popular repository site, then 99% of the problem would be solved. It doesn’t even matter which repository site, because, currently, RedHat/MDK (maybe SuSE) can access almost all of them thanks to APT and Yum. Non-RPM distros can almost always access RPMs relatively easily, or have mechanisms (large community of packagers in the case of Debian, super-simple from-source packaging in the case of Gentoo) that make “official” packages unnecessary.
Also, I think that you have a very warped idea of the “mainstream” market. Mainstream users do not install random software off the internet. They install MS Office, some popular utilities (WinZip, AIM) and leave it at that. For these sorts of uses, you will almost always find the software you need in a repository. The type of users who would install random software tend to be intermediate users who like fiddling with their computers.
The Windows software model is fundamentally unsuited for modularized software. You see, what many Win/Mac users see as dependency hell is actually just a side effect of extremely modular programming. There are huge benefits to this. In KDE, you have a consistent way to manage shortcut keys and toolbars in every app, because all apps use a common library to handle those tasks. You can type an ftp URL into any app because apps use the KIO components to do file access. If you keep building upon this model, you end up with a very modular system where each app has tons of dependencies, but those dependencies are what allow them to integrate with each other as much as possible. If you followed the Windows model of software distribution, you’d have to package most of those modules with your software, which would lead to immense bloating. There is no need to give up modularity and consistency in the interest of easy software installation. Just as OSS developers use technology (mailing lists, distributed version control, etc.) to manage distributed software development, we can use technology to manage distributed software packaging. “Smart” packaging systems like APT and YUM are exactly the sort of things that allow both modularity *and* ease of use to be retained.
PS> And yes, I am an end-user of desktop Linux. Linux has been my primary OS for nearly two years now, and in that time, the only time I’ve ever had issues installing software is during the Debian/KDE3 thing, and for in-development projects like Subversion that are not yet ready for public consumption. Of course, I’ve been using “smart” distributions, specifically Gentoo and Debian. I’ve been playing with Fedora for a couple of days, and I can definitely see it becoming a…
You already have the LSB, which does most of what you say. The things it doesn’t do are:
1) Pick a default desktop
2) Guarantee binary compatibility for any number of years.
For these two problems, I have a feeling that the biggest thing working against a standard is Linux users themselves. One thing that people seem to forget is that the Linux userbase still has the final word. If something meant to attract new users makes the existing ones unhappy, developers probably won’t do it. It’s simply a matter of catering to your userbase. The problem with both of these things is that they stifle the pace of development. You either break compatibility, or load up the OS with tons of almost-the-same libraries. Windows definitely suffers from this problem. My Fedora install is a mere 2.5GB, with pretty much everything I need to work, including two office suites, compilers, etc. My Gentoo install, which includes everything I’ve ever needed in a year, is only 4GB not counting my personal data. Hell, the default install of Windows XP is a little over 1GB, and as soon as you add some needed apps, it eclipses most any Linux install while providing a fraction of the functionality.
Also, I think people overly obsessed with standards are missing some fundamental understanding of the free market. There seems to be this idea that if there are multiple standards, then there is “wasted” effort. If there are great office apps on KDE and GNOME, then it must be that, if there were a standard, there would be one even better office app. That is emphatically incorrect. It’s a perception that comes from the extremely warped software world Windows folks live in. In Windows, there is pretty much one option for any major category of software. That’s not due to some natural balance, but due to an extremely sick industry that is bound by proprietary protocols and formats. Basic economics tells you that this is a sure recipe for poor products and consumer price gouging. In an ideal capitalistic system, there are numerous firms selling nearly the same product at the lowest possible price. That is, mathematically, the most efficient use of available resources. The explanation is a bit long (you can find it in any basic economics text), but it has to do with the fact that software is very quick to suffer from diseconomies of scale. Thus, the “dilution of effort” exhibited in the OSS world usually isn’t an issue. In addition, the ease with which competing OSS projects can learn from each other’s improvements (thanks to openly available code) means that OSS reaches a naturally efficient competitive equilibrium even faster, thus making maximal use of available resources.
> There actually already IS one. That is called .tar.gz source package.
Not all software for Linux is open source. A commercial software developer doesn’t want 10 different ways of distributing software, and since it’s unlikely that they will distribute their source just for Linux, they just drop it.
Well, since we are talking about average users, I can offer quite a bit of experience about that. My girlfriend and her family are quite typical users. As a matter of fact, I’m dropping by this weekend because the Accessories menu on the start bar “disappeared” (the father deleted it, of course). These people do NOT give two shits that Fedora installs in 2.5 gigabytes (what’s a gigabyte?) of space, and that the default install of Windows fits in one (I thought it was more, but who cares). And you are _wrong_, completely wrong, about them downloading and installing software. Maybe you have used Linux too long? (Not meant as flamebait, honestly). If someone wants to install a new game, they just click on the link, click “Open,” and next next next, and now there is an icon on the desktop or start menu. That’s all there is to it, ever. And it is very within their reach to do so, and keep up with basic updates through Windows Update.
>this was when i tried to install ddd from sources on a default mandrake 9.0 install.
Yes, it can be pretty problematic if you don’t install a development environment. But all of those utils are relatively small. Please read between the lines and you’ll understand what I’m trying to say. All those tools would be the equivalent of what Windows installs by default – all those DLLs…
>Not all software for Linux have open sources. A commercial software developer doesn’t want 10 different ways of distributing software, and since it’s unlikely that they will distribute their source just for Linux, they just drop it.
Yes, true. My _suggestion_ for commercial software is to build static binaries. With today’s hard drives it really doesn’t matter whether a binary takes 1 MB or 10 MB 🙂
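For what it’s worth, a sketch of what that choice looks like for a hypothetical hello.c (the size difference is an illustrative guess, not a measurement):

gcc -o hello hello.c                  # dynamic: tiny binary, needs the system libc at run time
gcc -static -o hello-static hello.c   # static: much bigger binary, no external libraries needed
ldd hello                             # lists the shared libraries the dynamic binary depends on
ldd hello-static                      # reports "not a dynamic executable"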
Do not let Linux follow in the footsteps of vendor products, instead be a leader, find your own solution to the problem by examining it from the point of view of an open source platform.
Maybe you have used Linux too long? (Not meant as flamebait, honestly).
>>>>>>>
I probably just know different types of users. All the newbie users I know never download random programs. They’ll use stuff like AIM or whatever, but that’s the kind of stuff you can expect to find in a repository. None of them use the kind of random low-userbase software that you’d expect not to find in a repository. All the people I know that use such software would also know enough to follow a README on how to compile it themselves. I’d wager that the first scenario (especially for the corporate and “grandma-using-computers” sectors) would be more common than yours.
If someone wants to install a new game, they just click on the link, click “Open,” and next next next, and now there is an icon on the desktop or start menu.
>>>>>>>>>>
That’s so primitive. Again, the problem isn’t the repository model, but that the software developers you’re talking about aren’t publishing their products on popular repositories. If a game is in the repository, it’s a hell of a lot easier to walk your mom through “okay, click on app-games, then double-click on frozen bubbles” than “okay, first go to w-w-w-dot-f-r-o-z-e-n-b-u-b-b-l-e-s-dot-com, no, no dash between the two words…no, that’s an f as in frank…”
That’s all there is to it, ever. And it is very within their reach to do so, and keep up with basic updates through Windows Update.
>>>>>>>>>
Your “newbies” are obviously more skilled than my newbies. Windows Update is too complicated for me, for god’s sake. My attention span won’t handle anything more involved than “emerge world” anymore…
… if it still wants to be called an “operating system”, that is. IMHO, maybe even as part of the kernel…
Imagine if there were no standard for C:
user1> I cannot compile this source…
guru1> What’s your problem? Use the SU distro, just type apt-cc.
user1> Can’t you guys have a common standard for C?
guru2> I prefer rpmcc! Having a common standard for C undermines freedom of choice and stifles innovation!
regards
But there are different languages just as there are different package management systems, and there were reasons for every one of them to be created. I don’t see the point of your language analogy.
My guess is that you can fix this library problem with another layer of abstraction, or else, you can fix the original design problem.
I think that a vendor is more likely to choose abstraction because it is more feasible, flexible, and easier to apply to a sales strategy.
Yet this does not mean that Linux should choose to do the same thing that vendors have done, or else Linux is doomed to be a follower. Linux cannot realize its own potential as a follower.
This type of architectural bottleneck should be an opportunity for Linux, but Linux has to place more priority on the source code. There should be more focus on learning about the source code, every part of it, and establishing a more organic architecture that can be shared and developed more freely by the community. Remember that Linux is based on a vendor architecture, and it is limited by this flaw.
My background: 4 years writing, reading, doing research with Linux. I teach international politics, so computers are a tool for me. Mandrake has been my distribution all this time.
I have learned a lot since I began using Linux and continue to learn a lot.
Gentoo: Too involved for me. Too much time for stuff to compile.
Debian: I installed Debian, actually Libranet, because of Plone. That’s how badly I wanted it. When I began updating a few programs, it hosed my installation.
So back to Mandrake via a short stop at Red Hat’s house.
None of the provided Red Hat or Mandrake rpms -for the latest version – install and the proposed dependency resolution for the latest version conflicts with the existing packages installed by Mandrake 9.1 and Red Hat 9.0.
I even rebuilt from the source rpm and that would not run.
I don’t want to go through this to install an application. I am a very average user. At work on a windows 2000 machine, installing Plone took 2 minutes.
I finally have an older version up and running with a few minor problems as that’s the only one that will install on Mandrake 9.1.
Rayiner, we want the same thing. Without being a developer, I appreciate the modularity of KDE and have read enough about it to value it. Yet if you read through the KDE development lists, you will see that they too are aware that application installation is horrid right now. The thing is that they have made a conscious decision that it is too early for them to stabilize their ABIs and APIs.
And KDE is probably a bad example, as its pervasiveness makes it easy to get. I am just using it to illustrate the fact that although developers may have good reasons for not providing an easier way to install software, they need to keep in mind us poor souls who use the software.
If you were my neighbor, I would ask you to set up gentoo for me and all would be well, but you are not. Again, I stress that I am a very average user with some demanding needs. I don’t think my expectations are out of the ordinary.
The article just misses the point and shows a lack of understanding of how open source is structured: developers develop, distros package programs. If a user wants to install a program that doesn’t have a package, he’ll have to wait until there is one. apt-get and other tools like it are much better than the Windows solution (DLL hell). Linux is not Windows; it’s different. Installing a program with apt-get is much easier than the Windows installer wizard. All the author wants is for everything to be unified, and that’s not going to happen; much of the point of open source is freedom of choice, different people, different flavours. So there are many distros and many ways. For the so-called average user (so stupid in the minds of some people) it doesn’t matter: all he has to do is install the program from his distro’s packages, with his distro’s tool.
This is my main gripe, really. The real OSes like FreeBSD, MacOS X and Windows all have no such troubles at all. Linux just isn’t ready for the desktop as long as the distributions which are mainly touted as contenders on the desktop leave the user frustrated with a task as supposedly simple as installing new applications.
As far as I can understand, the main argument against linking statically is that programs should immediately profit from fixed/improved libraries.
Wouldn’t it be possible to ship applications with the needed libraries and install the whole package in one folder?
If a newer, compatible library is installed system-wide, the application should make use of it, and fall back on its own library if necessary.
This should also make it more easy to install applications without admin privileges.
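A number of commercial Linux apps already ship roughly this way, with a wrapper script around a binary installed in its own directory. A minimal sketch follows; the /opt path and app name are invented. One wrinkle worth noting: the dynamic linker searches LD_LIBRARY_PATH before the default system directories, so a simple wrapper like this actually prefers the bundled copies rather than falling back on them, and doing it the other way round (system first, bundled as fallback) takes more care.

#!/bin/sh
# wrapper for a hypothetical app shipped with its own libraries under /opt/someapp
APPDIR=/opt/someapp
# make the bundled libraries visible to the dynamic linker
LD_LIBRARY_PATH="$APPDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
exec "$APPDIR/bin/someapp" "$@"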
But there are different languages just as there are different package management systems, and there were reasons for every one of them to be created. I don’t see the point of your language analogy.
The point is that a language standard provides a “language API” for the programmer if he/she writes in that language; when a programmer writes for a particular OS, he/she uses the “OS API” to make the program usable under that OS. IMHO, library management must be part of that “API”.
A more direct analogy: the kernel provides, for example, an API for IDE in/out. It seems no one argues the merits of having different, incompatible IDE APIs in different distros…
regards
apt takes care of everything the author complains about. The “limitations” he mentions are:
1. not all software is packaged as .debs.
2. most distros don’t use apt.
These are not technical problems. The dominance of rpm-based distributions is an accident of history, not a law of nature. The fact is that an extremely robust, functional, and fairly user-friendly (I’m not joking) package management system for Linux has been in existence for years.
The “tool” exists – if it isn’t “universal”, it is not due to technical limitations.
(Gentoo’s portage also seems to address this problem well, although I haven’t tried it).
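To make the contrast with the ddd ./configure saga recounted earlier in the thread concrete: on an apt-based distro, getting ddd (assuming it is in the configured repositories, which it is on Debian) is just

apt-get update
apt-get install ddd
# apt downloads ddd plus whatever libraries its packager declared as
# dependencies (lesstif, the X libraries, …) and installs them in one go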
I think it would be nice if every distro published, in one directory (for example /ports), a style sheet with the default compiler options, app directories, lib directories, conf directories, and the packages installed (and possibly where) – in the last case not to control the installation process, but as a guide and report.
Also, if some app or lib needs a custom build process for the distro, this information could be put there, in something like /ports/<app_or_lib>.
Besides, make could be modified a little bit to read these directories and grab the information that is there.
At least for .src.tar.{gz|bz2} it could ease the process a lot.
Also, it doesn’t interfere or limit the liberty that we all love to have in linux distros.
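Just to make the idea concrete, such a published description might look something like the sketch below. The path, the file name, and every field in it are hypothetical; nothing like this is standardized today.

# /ports/distro.conf -- hypothetical per-distro description
CC=gcc
CFLAGS=-O2 -march=i586
PREFIX=/usr
LIBDIR=/usr/lib
SYSCONFDIR=/etc
# per-package notes would live in /ports/<app_or_lib>, e.g. /ports/openssl,
# listing any distro-specific patches or configure flags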
For the short term, without addressing the architecture, each distribution company should package their software offerings and make them available in a public database. If the software that you want is not packaged for your distribution, then complain to the distribution company. I think that Red Hat is trying to improve the process for Red Hat Linux by assisting and educating developers, providing tutorials about how to package software in the Red Hat package format for their current releases.
Red Hat and other Linux distributions might be able to create tools that package source code for their distribution. Another poster suggested ‘APIs’; well, perhaps that tool could be maintained and updated instead of an API. It could just be a common tool for package creation.
I think that it is possible for AI programs, that is, machines to do this work, but we can start with tools.
For example, I write my QT program using my IDE, such as KDevelop, and then I just have to choose an option in the IDE that packages the application for whatever distribution I want. Somehow, though, the distribution people have to be able to update that feature, so we need a component architecture.
That would work for me. Is it that easy to beat Java?
The concept is have the distribution company do the work once and have everyone reuse it, rather than make everyone do the work themselves. People get lazy.
Personally, packages and dependencies aren’t the worst part of keeping a system running.
By FAR the worst part about *NIX is the million configuration file formats and application preferences that are involved. It seems like every application MUST have its own configuration file format (if it has one at all), and it must also enforce a certain method of administration. You need only install any common service to see what I’m talking about: Apache, BIND, Qmail, OpenLDAP, SNMP, MRTG, OpenPAM, OpenSSH, OpenSSL.
And so I think the thing that has to be sorted out first is a common human readable configuration file parser and writing library, and then attempt to persuade all of the groups to at least try to adopt it. It must be lightweight, and portable across all the unices, not just Linux.
This won’t happen, just like a standard package system won’t happen. Because people are ignorant and anything they do is better than what the other guys are doing. It’s this characteristic that drives all of the projects I imagine.
Hmm.. thoughts?
Problems with conflicting libraries, in Linux or Windows, are not caused by competing packaging schemes. They are caused by the developers who wrote code that depends on libraries they assume are on the user’s machine.
A smart Windows or Mac developer – who is in business to sell code – will code to the libraries that he knows are installed by Windows or OS X. If his program needs libraries that he knows may not be on a user’s machine, he includes those libraries in the install package. Imagine someone’s anger if the Windows program they just bought pops up a notice that tells them it won’t install because a dozen or so library files need to be installed first, and leaves it to the user to go chase them down. That user would have every right to think the developer was lazy and that the program was broken.
Yet, this is standard in the Linux world. Yes, distributions like Debian and Gentoo attempt to offer solutions, but they effectively require the user to have a broadband connection and take the time to learn the annoying foibles of a distribution-specific packaging scheme. (Learning to use a new OS is bad enough; why scare off prospective new Linux users by requiring them to learn about apt, rpm, emerge, or whatever?)
Isn’t the ballyhooed “choice” of packaging schemes trumped by the fact that your choice of software is confined to programs built and packaged only for that distribution?
Dependencies exist because developers don’t do what they need to do to ensure that all the code their program needs is included in its install package. This will require more standardization across distributions. Frankly, that’s unlikely to happen among the commercial distributions because they use their packaging schemes as a lock-in device.
What if, though, a single distribution started offering dependency-free programs that could be installed like a Windows program: click on an icon, answer a few questions, click “Finish”?
On RedHat 7.3 (a rather new distro) I had enormous problems with cross-linked library dependencies, and finally, with glibc. It cannot be resolved; every Linux head tells me to install a newer distro!
Whereas, I had NO problem installing Firebird on both Solaris 8 and Solaris 9. I mean, Solaris 8! That’s how much older than RedHat 7.3?
Now, let me hear the excuse for this situation from a Linux head, I’m really curious.
yep
There should be an open source forum for project development and a knowledge base resource for developers who want to be involved in projects.
I think that you guys have the answers, but you just need the workshop.
Saying “Linux is not supposed to be a desktop OS”,”Stop imitating Windows”,”users should learn to do it from scratch” and “It works for me” are PURE trolls and completely narrow minded.
MacOS also has standardized ways of installing, not just Windows. You don’t see the great Adobe and Macromedia packages that made MacOS THE desktop OS for more than a decade (until Windows 95 came) being compiled from source by the users.
Why do you want to limit Linux just for yourselves? “yeah Linux is the OS, but it’s for me only, unless you turn geek like me”.
Imagine advertising designers installing design tools from scratch. Or having only RPMs to install on their Debian/Slack. Don’t you think these people have work to do instead of learning the Linux structure? They don’t have to do it on Windows or MacOS; that’s why they are productive.
Notable exception is security knowledge, of course.
The same 1337 geeks who think like that are the ones that complain when proprietary companies only deliver RPM packages. You deserve it.
I’m sure that eventually the problem will be solved because there is nobody stopping you from doing the work.
I’m sure that eventually the problem will be solved because there is nobody stopping you from doing the work.
How about hordes of angry Linux users who bristle at the thought of change? Do you really want to ask Debian users (for example) to switch to autopackage? Good luck! Every distribution views switching from its package manager to a generic one as “losing its advantage.” The package manager is what sets them apart, in most cases, for the end user. You would be hard-pressed to get any of them to consider giving this up.
Where is the demand, then, for a new way to deploy software if these hordes of angry Linux users bristle at the thought of change? Why do we need to supply a solution if there is no demand, and if everyone is content with the way things are now?
And let’s keep it that way, there is no need for a second Windows. If Linux can get a share 10% I’m happy. But only if those 10% are ‘smart’ users.
And with ‘smart’ I mean: users that read what’s on screen.
Linux doesn’t need to be the best Operating System for everyone.
I do think that there will come a single platform for Linux. And it is already more or less here: Mono. What would be nice is a folder in which you can drop Mono applications, and those would appear immediately in the Gnome/KDE menu. This would IMHO be very cool… And ideal for commercial software developers. Perhaps also some sort of drop standard for other languages. But Mono seems to be the best fit for a fast implementation of this idea.
If Linux users want a better solution but at the same time do not want to give up their distribution’s deployment utility, then the answer has to allow not for the replacement of that utility but for greater decentralization: a tool that facilitates a cross-platform automatic process and does not interfere with or replace manual or distribution-specific package management.
I consider the different distributions to be different operating systems. The reason software tends to install easily on Windows is that the developers know exactly what is on the system. The same holds if I use the Red Hat package manager on a Red Hat system. Installing a generic source package, or a SuSE package, on a RedHat system and expecting it to work is unreasonable.
Wanting SuSE, Redhat, IBM, Sun, etc to cooperate is not likely. They are competitors.
re: Yamin (IP: —.sympatico.ca)
Combine that with the Windows practice of packaging those files with the application just in case, and you’re in the clear.
Um… that’s how the DLL hell was created in the first place. Your application installs a previous version of common application X over the latest version of common application X. For DLLs that means some entry points no longer exist, and for COM objects that means some interfaces no longer exist.
Sometimes upgrading a common application will break existing programs that rely on it. Witness how MDAC 2.7 broke applications that invoked Oracle stored procedures requiring return values from their parameters (if the parameters were declared as type BSTR).
re: HAL (IP: —.arcor-ip.de)
The real OSes like FreeBSD, MacOS X and Windows all have no such troubles at all.
So if we compare apples to apples, then OpenBSD applications will install on FreeBSD and NetBSD without a single problem. And Windows 98/ME applications will install on both Windows XP and Server 2003. That’s amazing! I never knew that.
<sarcasm>I’m glad you cleared that one up. It must be some other reason why my Windows 95 applications don’t run on Windows XP. And there must be some other reasons why a driver needs to be specific for Windows 95, 98, 98se, ME, NT 3.1, 3.5, 4.0, 2000, XP, and 2003!</sarcasm>
Sorry, I fed the troll!
re: enloop (IP: 69.26.137.—)
Dependencies exist because developers don’t do what they need to do to ensure that all the code their program needs is included in its install package. This will require more standardization across distributions.
Some “common” applications may have licensing issues that don’t allow the inclusion into an installation or a distribution. Therefore, they must always be installed from a third party. For example, the nVidia drivers.
Please stop insinuating that developers are lazy or taking shortcuts – there are legitimate reasons why they must not be included. Why do people attribute so much power to the programmer? The programmer is just the implementor of specifications and requirements.
If you have tried Ximian Desktop, you will have noticed that certain libraries have been fixed by Ximian and redistributed. Since they are not the maintainers of the code, the version number given to the updated software doesn’t follow the version numbers created by the original developers; Ximian can’t dictate those numbers. The end result is that software that depends on those libraries may not install, because it doesn’t understand the version numbers. This is very similar to Windows’ DLL hell.
What if, though, a single distribution started offering dependency-free programs that could be installed like a Windows program: click on an icon, answer a few questions, click “Finish”?
Do you mean like OpenOffice.org for Windows? You find out during the installation that you should have had java installed, then you have to cancel the installation, install java, and then start the installation of OOo over again, and hope that you didn’t need to install anything else.
IMHO
Why does the installer need to be a single application in Linux? In Windows you have your choice of installers: Windows Installer, InstallShield, InstallAnywhere, …
The problem that I see in the posts is that all Linux distributions are lumped into one entity and compared against single entities like Windows XP, FreeBSD or Mac OSX. It is true that the common thread in the distributions is the codebase that the kernel is compiled from, but that is where the sameness stops. Comparing SuSE to RedHat is as fair as comparing Windows ME to Server 2003, or FreeBSD to OpenBSD. In other words, Linux distributions are in the same boat as all the Windows operating systems and all the BSD operating systems.
Personally, I found software installation in Mandrake 9.1 to be very easy. I could find all the software I wanted and it installed without any problems. The fact that other distributions use different software installation methods just doesn’t seem like a big problem to me.
There were a number of reasons that made me switch back to Windows:
The lack of decent dual headed display support
The lack of reliable hibernation/software suspend
Inconsistent applications (menu layout, keyboard shortcuts, open/save dialogs, etc.)
Being limited to plain text cut/copy/paste between apps made with different toolkits
The fragile filesystem that started breaking down after a single crash
The lack of any Linux apps that can open MS Office or Photoshop files successfully (without running Windows apps in Linux)
If all of those ever get fixed I’d switch in an instant, I found Mandrakes graphical utilities and software installation perfectly good.
Why don’t the makers of the major distros form an independent open standards group?
Come up with a standard file structure, library collection, packaging system… hell, maybe even some standard software to be included in every distro or some Human Interface guidelines.
Not that they should come up with these standards out of thin air – they should constantly look to the development community and see what developers are coming up with, and what’s working and what’s not. Good standards are almost always devised AFTER the systems are being used, and they just iron out the crinkles.
It shouldn’t need to be forced (i.e. limiting choice somehow), but just suggestions, so that different distro makers have standard guidelines to know what users/developers will be expecting on their system, and so that users/developers have some continuity between systems.
There already is one, called LSB – http://www.linuxbase.org
Windows leaves installation up to the software developers. Big mistake. The real problem for Linux is having what is considered the “underpinnings” of Windows being developed as hundreds of separate packages, many doing the same thing but conforming to a different philosophy. The Windows “solution” (read: hack) would never work in any Linux system.
Windows lacks package management, and as a result .DLLs often either go missing, or linger, or are caught between software they are too old for, and software they’re too new for. Package management is the only way to solve all of the issues, but unfortunately without being done properly, it brings in dependency hell.
Many, many distributions have solved dependency issues already. Debian has for a long time, urpmi solves it for .rpm based distributions, and any distro with a ports system also resolves them fine.
What distributions now need is a click-n-run type interface to their package systems. Remember, Windows isn’t expected to present the user with all the software he’ll ever need, nor does it need to have perfect hardware support out of the box. Linux is required to be better “out of the box” than someone’s existing Windows setup to get a switch to happen. Sad but true.
APT has broken my installation a couple of times already or come back and told me that installing the package would require removing another package. Which is of course unacceptable and not a solution.
LSB? How many libraries are actually included in that? Example: I want to install Rhythmbox on Mandrake without being part of the club. Won’t work without a fight. Try it on SuSE. Search the web for RPMs. Manage to find some old ones. I want the latest version, so I try installing from source; same thing, you need to fight with it. LSB might be good in theory, but in reality it does not work. Just too much is left out. We need a standard filesystem and library set. If there were a packaging format that was a standard, and you were on a system certified to have a standard file system and libraries, developers could just create one package without having to worry about it not working.
If Gentoo guys still want to use Portage, great but having a reference platform or whatever you want to call it would solve many issues.
P.S. KDE and QT apps are much more reliable when installing from source than Gnome and GTK+ apps.
I have been using Linux for 2 years, and I must admit it was sometimes tiresome that you had to keep looking for packages to install some app – sometimes even 10 or 12 missing libraries. It is OK for a production environment where you prepare a station for a specific program. But just to play around and test apps….. hmmm…
=)
Have a nice day people.
Dan
Not all software for Linux is open source. A commercial software developer doesn’t want 10 different ways of distributing software
There are binary installers for Linux. OpenOffice.org uses one, there’s the Loki installer, and finally there’s Autopackage. These are distro-agnostic, and commercial (by which I guess you mean proprietary, since you can have commercial open-source software) software developers can use any of these methods for their products.
If someone wants to install a new game, they just click on the link, click “Open,” and next next next,
You just basically described the Loki installer… 🙂
Personally, at the risk of repeating myself, I do think that Autopackage is the way of the future. Developers need to start using it more.
Packages should strictly adhere to the major.minor.teeny version numbers system. When binary compatibility breaks, you need to bump major version numbers. Also, two different major versions should co-exist fine in any setup. If you add new features or extensions without breaking binary compatibility, you should bump up the minor number. Any other changes fall into the teeny version number.
That way, if I have one app that depends on library foo, version 1.2.x, another on 2.0.x, and another that depends on 2.2.x, installing the latest foo in the 1.x.x series will make the first app work, and installing the latest foo in the 2.x.x series will make the other two work.
That way, you can get rid of the old code and not have to hang on to it, and nobody gets screwed.
Many library developers choose their own versioning system. Many others use this one, but don’t take care to make sure things don’t break. If every library followed this pattern, however, you would never see things like apt-get or emerge break systems.
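For what it’s worth, this is essentially what shared library sonames already make possible when maintainers follow the rules. A sketch of how two major versions sit side by side (the file names are illustrative):

/usr/lib/libfoo.so.1 -> libfoo.so.1.2.7   # newest 1.x.x, satisfies the app built against 1.2.x
/usr/lib/libfoo.so.1.2.7
/usr/lib/libfoo.so.2 -> libfoo.so.2.2.3   # newest 2.x.x, satisfies the apps built against 2.0.x and 2.2.x
/usr/lib/libfoo.so.2.2.3
# each app links against libfoo.so.1 or libfoo.so.2, so minor/teeny bumps never
# break it, and both major versions can be installed at the same time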
LSB standardizes the wrong thing. We need to standardize how software gets developed.
I’m not sure what you’re asking for…standardized configuration options? That’s kind of impossible, since different programs use different options and even different types of configuration.
Standardized file formats? Uh, all the config files are already human-readable text files. You can’t get more standardized than that…
Standardized configuration method? Well, you can always use Webmin. It’s powerful, versatile and has a consistent interface. In any case, it’s a lot easier to use than the Windows registry.
There are also GUI tools made for some of these programs that help with configuration (I just installed the new gproftpd and it is sweet…)
How about hordes of angry Linux users who bristle at the thought of change? Do you really want to ask Debian users (for example) to switch to autopackage? Good luck!
The thing is, they don’t have to switch. As long as volunteers will make .deb packages, they won’t have to use Autopackage even if an Autopackage version is available. Also, don’t automatically assume that Linux users are opposed to change – that’s a very reductive point of view. I’m a Mandrake user and even though I love urpmi and Rpmdrake (the GUI frontend), I also appreciate Autopackage very much. I doubt that I’m the only one in this situation.
I think what he means is some kind of standard formatting for the text files. Windows, for example, has the ini file format. It’s a bit outdated now that the registry exists, but it was good.
There are standard functions used to get/set items (GetPrivateProfileString…). So when you open up an ini file, you know what the sections, data, and comments are. You also don’t need to write or find a new parser.
Yamin
On RedHat 7.3 (a rather new distro) I had enormous problems with cross-linked library dependencies, and finally, with glibc. It cannot be resolved; every Linux head tells me to install a newer distro! […]
Now, let me hear the excuse for this situation from a Linux head, I’m really curious.
Did you try to install a RedHat 7.3 RPM? You know, like this one?
http://dag.wieers.com/apt/redhat/7.3/en/i386/RPMS.dag/mozilla-fireb…
That took me less than 30 seconds to find on Google…from your past posts, I know you have a strong anti-Linux bias, but you should have worked harder on this particular Troll attempt…
Another possibility is that you’ve upgraded part of your 7.3 installation, so it doesn’t have the original glibc. Would you expect replacing important system dlls in Win98 with ones from a Win2K installation, and still have a working system?
Speaking of Win98 and upgrading…you do know that MS doesn’t support Win98 anymore, right? A lot of new software is simply not available for Win98…despite the fact that about 30% of PC users are still using it!
If these distributors would write their applications to the LSB standard, we would not have that many problems. The whole problem with incompatibilities is the fact that, unlike on Windows, Linux apps are packaged without the libraries; the LSB solves this problem, and when utilized it will help close the gap between all the distributors.
There were a number of reasons that made me switch back to Windows:
The lack of decent dual headed display support
Twinview works great with GeForce Ti 4400 (using KDE). I was even amazed that the new version of xscreensaver displays a different screensaver on each screen…
The lack of reliable hibernation/software suspend
This is supposed to have been fixed in 9.2
Inconsistent applications (menu layout, keyboard shortcuts, open/save dialogs, etc.)
Well, if you stick to KDE apps, you’ll have better consistency than in Windows. Of course, GTK apps and other X apps have their own menu layouts and such…although the “File Edit… Help” paradigm is pretty ubiquitous by now.
Being limited to plain text cut/copy/paste between apps made with different toolkits
Again, the solution is to try to use apps from the same toolkit. But I understand how this can be a problem, especially with using Gimp under KDE. Still, it’s but a nuisance, not a showstopper.
The fragile filesystem that started breaking down after a single crash
Use ext3 or reiserfs4, and you’ll get a filesystem that’s stronger than NTFS…
The lack of any Linux apps that can open MS Office or Photoshop files successfully (without running Windows apps in Linux)
I agree. This is why I personally use MS Office and Photoshop with Crossover Office.
Some of these are enough for people to switch, personally the benefits of using Linux outweigh the drawbacks – I’d lose more going back to Windows. Fortunately, these issues are being resolved (not as fast as we’d want sometimes, but that’s how it is). All in all, though, this is pretty irrelevant to the subject at hand! 🙂
I think what he means is some kind of standard formatting for the text files. Windows, for example, has the ini file format. It’s a bit outdated now that the registry exists, but it was good.
This is not necessary if config files are well-commented. Personally, I’ve learned how to configure a lot of programs just by looking at the config files and following the instructions contained therein. I find this a lot more intuitive than using *.ini files…
Every time there is a discussion about this issue, I feel odd. Because I never ever ever have a problem installing software in Debian.
Makes me wonder:
“What are you guys talking about? Debian has it all. It’s easy, maybe easier than windoze. It’s just apt-get, and you’re done”.
The *only* issue I see with ‘apt-get’ is the fact that it’s a CLI program. Yes, there is Synaptic, but it sucks. It really, really sucks for a newbie. The interface has tons of buttons (“update”, “upgrade”, “install”, “remove”, “dist-upgrade”, “proceed”), tons of searching methods… there’s no way a newbie will find it friendly.
The ideal GUI tool should have only 1 search field, 1 install button, and 1 “see details” button. But soon we’ll have something like that.
Victor.
<< LSB? How many libraries are actually included in that? Example: I want to install Rhythmbox on Mandrake without being part of the club. Won’t work without a fight. Try it on SuSE. Search the web for RPMs. Manage to find some old ones. I want the latest version, so I try installing from source; same thing, you need to fight with it. LSB might be good in theory, but in reality it does not work. Just too much is left out. We need a standard filesystem and library set. If there were a packaging format that was a standard, and you were on a system certified to have a standard file system and libraries, developers could just create one package without having to worry about it not working. >>
The LSB does work, and it works well. The problem is that no one wants to write to the standard. All my development work at my job is done to the tune of the LSB, and I have no problem installing any of my proprietary applications on any machine with any distro. Get the developers to start writing to the standard and you will see that it does work. We don’t need a universal packaging system, we just need the developers to improve the quality of their work. Right now there are several apps written to the LSB standard at the LSB webpage, and they all work.
<< P.S. KDE and QT apps are much more reliable when installing from source than Gnome and GTK+ apps. >>
That I do agree with. But QT has always been better than GTK+; the only reason GNOME seems more popular now is the commercial support provided by Sun and Ximian D2. At Linux Journal, the #1 desktop in the readers’ choice awards was KDE.
Personally, at the risk of repeating myself, I do think that Autopackage is the way of the future. Developers need to start using it more.
I second that. There are already a few programs packaged with autopackage – which proves the thing works – now what we need is for people (developers) to support autopackage.
Victor.
Oh gosh, it took you only 30 seconds?! How come I didn’t think about that! Maybe because it wouldn’t have worked anyway? Look at my session:
1. download Mozilla Firebird (binary package)
2. gunzip and untar
3. Now watch this:
[root@neonfish MozillaFirebird]# ./MozillaFirebird
./MozillaFirebird-bin: /lib/i686/libc.so.6: version `GLIBC_2.2.4' not found
(required by /place/MozillaFirebird/libxpcom.so)
[root@neonfish MozillaFirebird]#
[root@neonfish MozillaFirebird]# cd ..
[root@neonfish /place]# rpm -i glibc-2.2.4-32.i386.rpm
error: failed dependencies:
glibc-common = 2.2.4-32 is needed by glibc-2.2.4-32
glibc-devel < 2.2.3 conflicts with glibc-2.2.4-32
glibc > 2.2.2 conflicts with glibc-common-2.2.2-11_Linox
Now I dare you to find a solution for this problem that is NOT ripping out the OS and installing RedHat 9! Your “That took me less than 30 seconds” really irritates me, when you haven’t actually even tried to understand what the problem is. This problem was the subject of a long thread on alt.os.linux, if you are curious about the more thoughtful answers, which all pointed to the fact that my only avenue is to reinstall the OS. But I guess you figured it out better than all those folks on alt.os.linux; you must be so much smarter than they are.
And what about this comment Speaking of Win98 and upgrading…you do know that MS doesn’t support Win98 anymore, right? A lot of new software is simply not available for Win98…despite the fact that about 30% of PC users are still using it!
Who even mentioned upgrading? Who even cares whether MS supports Win98 customers or not? I can install Firebird no problem, that’s all. (In fact, I would NOT use support lifecycle as an argument in this discussion, if you want to make RedHat Linux look good.)
> Now I dare you to find a solution for this problem, that is NOT ripping out and installing RedHat 9!
What about upgrading to RedHat 9? You can do that without destroying your system, ya know?
The real problem is that Firebird is built against one version of glibc. glibc 2.2.4 and 2.2.3 are, iirc, binary compatible, meaning that forcing the installation *shouldn’t* cause any problems. That’s probably what the parent did.
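If that is indeed what happened, the force would have looked something like the line below (the file name is a placeholder); it only works safely when the libraries really are binary compatible, which is exactly the judgment call rpm refuses to make for you:

rpm -ivh --nodeps MozillaFirebird-<version>.rh73.i386.rpm
# --nodeps skips the dependency check entirely, so the burden of knowing that
# the glibc versions are compatible falls on you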
In my opinion, developers should only release the source code, and leave the packaging up to the distributions. That would solve *all* of those problems.
Why do they package things themselves? Because Windows users are used to packages coming with their own preferred installer (many of which are utterly broken), because Windows lacks package management. If Mozilla/Firebird’s Linux installer is really necessary, make it a script that determines what distro is installed, and downloads and installs the package accordingly.
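A minimal sketch of such a dispatcher script (the release files checked and the package names are assumptions, and a real installer would need far more care):

#!/bin/sh
# pick the native packaging route for whatever distro we find ourselves on
if [ -f /etc/debian_version ]; then
    apt-get install mozilla-firebird
elif [ -f /etc/mandrake-release ]; then
    urpmi mozilla-firebird
elif [ -f /etc/redhat-release ]; then
    rpm -Uvh MozillaFirebird-*.i386.rpm
else
    echo "unknown distro, falling back to the generic tarball" >&2
fi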
This is not “Linux sucking,” this is mozilla.org shooting themselves in the foot.
People that criticize “Linux” should at least know who’s responsible for the shortcomings they see.
Oh gosh, it took you only 30 seconds?! How come I didn’t think about that! Maybe because it wouldn’t have worked anyway? Look at my session:
1. download Mozilla Firebird (binary package)
2. gunzip and untar
Uh, dude, I provided a link to the RPM package, not to a .tar.gz package. Again, my question is, have you tried to install the RedHat 7.3 Mozilla-Firebird RPM package? The one I gave you a link for? Please answer this question first.
If RH 7.3 uses glibc 2.2.2 (as it appears) then the 7.3 RPM should also use glibc 2.2.2. BTW, you do not need glibc-devel to install a rpm.
This problem was the subject of a long thread on alt.os.linux, if you are curious about the more thoughtful answers, which all pointed to the fact that my only avenue is to reintall the OS.
First, I very much doubt that you need to reinstall the OS for this – except if you did all kinds of upgrades since your last install. It’s true that, if you’re going to upgrade glibc, you might as well upgrade the rest of the OS to preserve binary compatibility. But since there is a Mozilla-Firebird package for RedHat 7.3 (as I already pointed out), I fail to see why you would upgrade your glibc.
Second, I would take anything written on alt.os.linux – or any newsgroup for that matter – with a big grain of salt. Given your arrogant and aggressive posting style, I wouldn’t be surprised if people told you that just to tick you off…
Who even mentioned upgrading?
You did. You specifically said that “linux heads” (see what I mean by arrogant attitude?) have suggested that you upgrade your OS.
In fact, I would NOT use support lifecycle as an argument in this discussion, if you want to make RedHat Linux look good.)
Actually, I don’t care much about RedHat. I use Mandrake, and Mozilla-Firebird works very well on my installation, thank you very much.
Again, I ask: did you try installing the RedHat 7.3 RPM? Do you still have the stock RedHat 7.3 glibc installed?
As I said, you wouldn’t mix and match dlls for different Windows versions – then you shouldn’t upgrade glibc to a higher version number unless you upgrade your distro at the same time.
Well Great Cthulhu,
I didn’t say editing Linux config files was hard. I’ve had to do it myself, but it’s definitely not as easy as INI files. You know beforehand, before you even open the INI file, what it’s going to look like. But things like this are often taken subjectively, so I won’t say any more in this regard.
***************
;my comment
[section]
parameter=value
***************
Apart from that user perspective, it’s much easier to code with. It helps alleviate tokenizing mistakes and other issues. Since it’s an API call, it also allows the OS to maintain consistency of configuration. I think in XP there is an INI-file-to-registry mapping, right? Which may or may not be a good thing.
For 95% of config information, the INI file format works wonders. There are of course constructs which are not found in it. Things like lists need to be done in a less-than-pure manner. But then nothing stops you from writing your own personal config file format à la Linux.
I’m sure there are very good Linux config file libraries or whatever, but it’s just not standardized. I love the Qt one, even for Windows.
Yamin
“I have been using Linux for a few years and spent the last week trying to get Plone from plone.org to compile. There are no rpms for the latest version and rebuilding the rpm does not produce a working rpm.”
The same thing also happened to me: I downloaded the source RPM for Plone and tried rebuilding it. It just didn’t work. Anyway, I ended up downloading the .tar.gz, and it seems that you don’t really have to compile Plone at all! It’s just a bunch of CMF scripts; you just have to copy them to some weird Zope “product” directory.
I also had problems with Zope: it didn’t exist in the repository, and rebuilding the source RPM didn’t work because it doesn’t compile with Python 2.3, so I had to install Python 2.2, which also didn’t exist in the repository, so more compiling and lib hunting.
In the end it took me a whole afternoon to get it to work. It’s a shame that what works for one distro doesn’t work for the next.
Apart from that user perspective, its much easier to code with.
That may be true – I’m not a programmer myself…I do agree that some amount of standardization might be a good thing, but since no one controls Linux app development (in a way similar to how MS controls Windows app development), it’s difficult to enforce.
On the other hand, all config files for Linux are text documents – the same cannot be said of Windows config files. So, comparing the two, I’d still say that I much prefer Linux, even though there is always room for improvement.
Standardizing the syntax of the config files would be useless. Syntax is trivial. The hard part is figuring out what the elements in the config file do. It’s only minimally harder to open up the Apache config (which has some pseudo-XML syntax) and understand it than to open up a hypothetical Apache registry tree and understand it. What is really important is to ensure that we have good GUI tools for manipulating those files. XML would help in writing those GUI tools, because it would make the parsing step easier, but most popular config files already have GUI tools, and it would be more productive to spend the effort improving those.
The problem with Linux package management is not with the underlying architecture of the system, but with the execution. It’s a matter of getting application developers to write good packages. It’s no different than getting application developers to write good installers — does nobody remember the absolute shittiness of Sierra’s Windows installers?
The problem with Linux package management is not with the underlying architecture of the system, but with the execution. It’s a matter of getting application developers to write good packages.
Really, application developers shouldn’t even bother with writing packages. Good distributions have good package maintainers that stay on top of the ballgame and make sure things don’t break. That’s why I always hand it to the Gentoo Portage maintainers, since they stay on top of things and *listen* when people have problems.
This leads to fixes getting sent upstream to the developers if needed.
I’m sick of this “Linux sucks because developers have to write for thirty different distributions” crap. It’s wrong, wrong, wrong, and the real problem is that distributions like Mandrake have, in the past, kept their package repositories stale until the next release unless the user explicitly switches to “cooker” or whatever the “unstable” repository is.
IF developers stick to just releasing source, and follow good procedures with versioning as I mentioned above, then any ports or apt-style system with active maintainers beats the pants off Windows, hands down. The system can literally evolve on its own, in a safe manner.
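For instance, on an apt-based box the whole “find the dependencies and install them” dance is one or two commands (the package name here is just an example – names vary between distros):

apt-get update
apt-get install mozilla-firebird

On Gentoo it’s a single emerge. That’s what an active maintainer base buys you.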
Add a click-n-run type interface, and you really have a winner.
“Good distributions have good package maintainers that stay on top of the ballgame and make sure things don’t break.”
The problem is that the commercial distributions would never accept this line of argument. They don’t want to have the overhead of packaging every piece of software that their users want; they’d much rather the community and/or the developers do it for them, for free.
Why don’t any of the commercial distributions have an equivalent of the Debian package tree? Simple: they don’t want to pay for it.
“Twinview works great with GeForce Ti 4400 (using KDE). I was even amazed that the new version of xscreensaver displays a different screensaver on each screen…”
I spent a few days trying to get twinview working with my Ti4800SE but had no success. Maybe I’ll have another look when there’s a graphical configuration tool, life’s too short to waste so much time messing about editing config files.
From what I’ve heard even twinview is very basic compared with dual headed display support in Windows. IIRC there are still problems with some apps thinking it’s one large screen and you can’t position screens exactly how you want. Anyway, there’s a good chance I’ll upgrade to a Radeon before that happens, AFAIK there’s no equivalent of twinview for Radeon cards.
“Well, if you stick to KDE apps, you’ll have better consistency than in Windows. Of course, GTK apps and other X apps have their own menu layouts and such…although the “File Edit… Help” paradigm is pretty ubiquitous by now.”
If I just stuck with KDE apps then Linux would be pretty much useless to me. The only Linux apps that come close to replacing their Windows equivalents are OpenOffice and GIMP IMO.
“Again, the solution is to try to use apps from the same toolkit. But I understand how this can be a problem, especially with using Gimp under KDE. Still, it’s but a nuisance, not a showstopper.”
Speak for yourself. For me the time wasted by the lack of this basic feature more than outweighs every advantage Linux has over Windows. It makes even basic DTP slow and frustrating; I’d rather use Windows 3.1 than Linux for document creation. This is a feature that the Apple Lisa had in 1983 – it’s amazing that an OS lacks it 20 years later.
“Use ext3 or reiserfs4, and you’ll get a filesystem that’s stronger than NTFS…”
I’ve had ext3 filesystems become damaged after crashes several times, I’ve never had a similar problem in Windows.
“I agree. This is why I personally use MS Office and Photoshop with Crossover Office.”
If you’re just going to use Windows apps, why bother with Linux?
Okay, now you sound a little bit like you’re trolling…
I spent a few days trying to get twinview working with my Ti4800SE but had no success. Maybe I’ll have another look when there’s a graphical configuration tool, life’s too short to waste so much time messing about editing config files.
Hey man, I just followed the simple instructions given in the NVIDIA file. If you can’t edit a text file, then that’s your problem. It took me about five minutes to get TwinView working…like, typing the following in XF86Config-4:
Option “TwinView” “on”
Option “TwinViewOrientation” “RightOf”
And it works perfectly under KDE – with all apps, since it is the WM that determines where the apps will open, and KWin is TwinView-aware. Oh, and if my cursor is in the second screen when I start an app, it opens there…neat!
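For context, those Option lines go inside the Section “Device” block for the NVIDIA card in XF86Config-4. A minimal sketch – the identifier, sync/refresh ranges and MetaModes here are examples only, the real values come from the NVIDIA README and your monitor manuals:

Section "Device"
    Identifier  "NVIDIA GeForce"
    Driver      "nvidia"
    Option      "TwinView" "on"
    Option      "TwinViewOrientation" "RightOf"
    Option      "SecondMonitorHorizSync"   "30-96"
    Option      "SecondMonitorVertRefresh" "50-160"
    Option      "MetaModes" "1280x1024,1280x1024; 1024x768,1024x768"
EndSection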
The only Linux apps that come close to replacing their Windows equivalents are OpenOffice and GIMP IMO.
So my question is – do you really care that much that the widgets and menus are slightly different in these apps? I mean, really? As far as I know, Photoshop’s UI is quite different from Office XP’s…now how can I use both of these programs without being terminally confused!?!
Excuse me, but this is a false problem.
Speak for yourself. For me the time wasted by the lack of this basic feature more than outweighs every advantage Linux has over Windows.
Wasted time? Come on! How much time is wasted because GIMP doesn’t have the same UI as KDE apps? Even Windows apps don’t have consistent UIs! Again, I question your good faith here. I think you’re focusing on this because your mind is already made up.
I’ve had ext3 filesystems become damaged after crashes several times, I’ve never had a similar problem in Windows.
Funny, I’ve never had a damaged filesystem after a crash with ext3. On the other hand, I’ve had this problem several times with Windows, recently with Windows 2000. Are you sure you don’t have hardware problems? Because your experience is extremely atypical, suggesting a deeper problem.
If you’re just going to use Windows apps, why bother with Linux?
Freedom of choice. I like Linux, but sometimes I must open proprietary file formats. When this happens, I open the file with the proprietary app. When I don’t need to use those particular apps, I use the great programs available natively for Linux, in addition to getting the benefits of a rock-solid, secure OS. Also, I try not to encourage an illegal monopoly, by limiting my Microsoft usage to the Office 2000 I purchased 2 or 3 years ago. I’m not using any other of their products, because I believe they are a bad corporate citizen and they do not deserve any more of my money. The other Windows programs I use on my Linux box (Photoshop, Quicken and Half-Life/Counter-Strike) are not from Microsoft, so I’m not encouraging them.
Oh, and a distinct advantage of Linux: I can play Counter-strike in a window and check my mail when I’m waiting for something important…
Might I add that, for anything that doesn’t require CMYK, GIMP 1.3 is a kick-ass program?
Okay I am going over to LSB to learn some more.
But from what I know, LSB-certified systems provide some kind of layer or abstraction. That does not seem like a good idea to me; rather, just agree on where stuff goes. I stand to be corrected, though.
Does the LSB specify which environment variables are to be set? I mean, when you compile Bluefish you need the latest GtkSourceView library. Okay, download that, compile it, copy the pkg-config file to the location it needs to be in, and try again. Still fails. So I set the environment variable – still no good.
Please don’t get me wrong I love Linux and the OSS projects that make it great but I would love to see things get better with regard to installing apps outside of those originally packaged by the Distro.
The LSB does not add an abstraction layer. It simply specifies what libraries are available and where things go.
Having a standard package file does provide some benefits. The package file could be auto-generated: developers could point the package generator at the application files and let the generator take over. The generator could also check for and record dependencies. The standard package file wouldn’t need file-placement directives – just directives stating what dependencies need filling, which files are libraries, which files are resources, and which files are the application. On the distribution side, there would be an install program. As long as this install program can read the standard package file, the distribution’s install program can have any features the distribution wants to add. The install program can check the needed dependencies, place the libraries in the folders that the distribution uses for libraries, and place the program files in the appropriate folders. If symlinks need to be made, the install program could do that too.
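Purely hypothetical, but a descriptor along these lines is the kind of thing I mean – none of these section names or files exist in any real tool, it’s just to illustrate the idea:

***************
[package]
name=someapp
version=1.2

[depends]
glibc >= 2.2
gtk+ >= 2.0

[libraries]
libsomeapp.so.1

[resources]
someapp.png
someapp.desktop

[application]
bin/someapp
***************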
The developers can worry about creating the application without spending too much time putting a package together. Distributions know where they want to install files, so they can create an install program specific to their distribution.
A simple example of this type of system is text files. Text editors of all kinds can read text files. The features of each text editor are different, but a file created with Vim can be edited by Notepad or Emacs.
All too true. I’ve had various different problems with different distros, all kinds of weird things happening, but I’m a Gnubie. I have very little idea what I’m doing when it comes to Linux, so I don’t just decide that Linux sucks. I do what a lot of these people consider unthinkable…I admit that it’s very likely I goofed up and keep trying to figure it out. Why is that so hard for people? People who are afraid of the challenge of learning a *Nix should stick to Windows.
“Hey man, I just followed the simple instructions given in the NVIDIA file. If you can’t edit a text file, then that’s you’re problem. It took me about five minutes to get Twinview working…”
Good for you, shame the simple instructions simply didn’t work for me. If it was so easy to set up then surely the posts I made to Linux newsgroups and help forums would have solved my problem? They didn’t have a clue what was wrong either.
“Wasted time? Come on! How much time is wasted because GIMP doesn’t have the same UI as KDE apps?”
A hell of a lot if you need to paste lots of little bits of an image into a document. In Windows I select the part, copy, and paste. In Linux I have to save each selection to a directory and then import it into the document.
“Funny, I’ve never had a damaged filesystem after a crash with ext3. On the other hand, I’ve had this problem several times with Windows, recently with Windows 2000. Are you sure you don’t have hardware problems?”
It’s happened on 2 different PCs and both still run Windows perfectly.
If it was so easy to set up then surely the posts I made to Linux newsgroups and help forums would have solved my problem? They didn’t have a clue what was wrong either.
Again, I think it might have been a hardware problem. You have supported hardware, supported features, an official driver…it should work! I’ll check out the thread on alt.os.linux and see if I can figure it out. What name did you post under in the newsgroup?
A hell of a lot if you need to paste lots of little bits of an image into a document.
Okay, there was some confusion here. I thought you were talking about UI inconsistencies, not the cut/paste issue. I agree that this is something that needs improvement. On the other hand, whenever I work in Windows (at work, during the day) and have to place Photoshopped images in Word, I still save them to disk (as GIFs, because Word doesn’t compress JPEG images) and import them from disk. It’s a lot safer that way, and it’s not really that much slower. Granted, my documents only have about 1 or 2 images per page, so perhaps if there were more I’d look for alternate ways. But in this case, there’s no loss of speed for me between the two OSes, because I don’t paste images from one app to the other anyway.
It’s [ext3 crash + loss of data] happened on 2 different PCs and both still run Windows perfectly.
Again, since your experience is so atypical, I’m led to believe that a) somehow you misconfigured it and it wasn’t really ext3, b) there’s something wrong with your hardware, or c) you’re making this up.
Seriously. I’ve never lost files in three years of use. I must say I haven’t crashed that often, except when NVIDIA’s driver brought down X, maybe a dozen times over a period of three months, but I’ve had at least four power failures and I’ve never lost any data on my Linux box – though I did lose data on my Win2K box during one of these power failures.
Your case seems to be special – which again seems to point towards a hardware problem. Don’t judge an OS because of a deficient setup…
“Again, I think it might have been a hardware problem. You have supported hardware, supported features, an official driver…it should work!”
If it’s a hardware problem, why does Windows work perfectly?
“I’ll check out the thread on alt.os.linux and see if I can figure it out. What name did you post under in the newsgroup?”
Here’s the newsgroup post I made at the time:
http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&selm=9888d3ea.03…
“though I did lose data on my Win2K box during one of these power failures.”
I’ve never lost data on Windows because of a crash, despite using it for over a decade and experiencing hundreds of crashes. Because of this I’m led to believe that a) somehow you misconfigured it, b) there’s something wrong with your hardware, or c) you’re making this up.
“Your case seems to be special – which again seems to point towards a hardware problem. Don’t judge an OS because of a deficient setup…”
If my hardware caused the problems on both of the PCs I’ve used for Linux, why have I had no trouble with Windows on the same hardware?
Here’s the newsgroup post I made at the time:
I’ll check it out against my own config tonight. So far it looks okay – btw, I’ve noticed that when plugging in only one monitor using TwinView, I had to plug it into the first (normal VGA) port, but when I plug in two monitors, I have to plug the main monitor into the second port (the digital one, using the DVI-to-VGA converter that came with my card)…have you tried switching the monitors around when trying to use TwinView? Just a question.
Because of this I’m led to believe that a) somehow you misconfigured it, b) there’s something wrong with your hardware
How would I have misconfigured Windows, since it is a basic install? It is possible that the hard drive in the Win2K box is a bit fickle, though.
If my hardware caused the problems on both of the PCs I’ve used for Linux, why have I had no trouble with Windows on the same hardware?
I don’t know. Could be a hardware combo, or BIOS setting (you did have the Plug’n’Play setting in the BIOS set to off, right?) Okay, it wasn’t nice of me to suggest that you might have made it up, I apologize for that. But since your experience seems rather unique, you’ve got to admit that it’s puzzling…
Yep, it is the same thing; at least Windows has an easier install process.
Windows doesn’t have an install process.