Linux Magazine has a profile of Daniel Fore and the Elementary project. Elementary is a Linux distro that’s committed to a clean and simple user experience, but it’s more than a distro – it’s actually a multi-pronged effort to make improvements to the user experience for a whole ecosystem of components, including icons, a GTK theme, Midori improvements, Nautilus, and even Firefox. The work that elementary is doing isn’t limited to their own distro, and some of their work is available in current, and perhaps future, Ubuntu releases. The results are really striking, and I think it’s probably the handsomest Linux UI I’ve ever seen.

The Linux Magazine article asks, “Do we really need yet another distribution?” It’s a rhetorical question, and of course the answer turns out to be yes, but I don’t think that that’s the salient question. What anyone who’s been in the perpetual holding pattern of “Linux on the desktop” hype has wondered, really, is what’s it going to take to get Linux on the desktop to go mainstream? Is it a question of ease of use, or are ease and aesthetics completely beside the point?

Back in the 90s, using Linux wasn’t for the faint of heart, and there was a major ease of use deficit, and let’s just say that the graphical UI choices were rough around the edges. Back then I think a lot of people made the assumption that cleaning all that up would make Linux more appealing to regular folks. And they did clean it up, and they did make it more appealing to regular folks. But that didn’t lead to a huge increase in take-up. Turns out, the ease of use issue was one of the barriers, but it wasn’t the only one.
Interestingly, though, some of those other barriers seem to be falling now, due mostly to the new realities of Software-as-a-Service and mobile computing. Microsoft’s lock on PC applications has been largely bypassed, as the average PC user doesn’t need to run any native apps, and the ones they do need generally have workable analogues on other platforms. Of course, Microsoft recognized the threat to its desktop monopoly from the web a long time ago, back in the Netscape days, and took steps to try to control even web apps, but the proliferation of non-PC platforms such as mobile, netbooks, smartbooks, and tablets has made it unworkable for anyone to try to exclude alternative platforms from full participation, so open standards now rule. Linux on the desktop now has its chance to emerge.
Unfortunately, the same forces that have given Linux this opportunity have also lessened the importance of any potential victory. The personal computer is now just one of many mainstream computing devices, and, interestingly enough, Linux has risen to be a major platform for most of the rest of the device types, which has sort of diluted the idealistic fervor that Linux on the desktop fans used to cultivate. It seems that much of the institutional investment around Linux is now focused on alternative form factors, since that’s where the land grab is happening.
I believe that the rise of non-PC platforms will pay dividends that continue to benefit Linux on the desktop, and as people become more comfortable with a non-Microsoft computing experience on mobile, they’ll be willing to take a chance on the desktop, or their Television, or automobile, or neural implant, or anywhere else that Linux ends up.
It looks more like a “multi-pronged effort” to copy the Mac.
I protest! Desktop environments and WMs are OS-independent [at least in the *nix case], so please, stop telling the world that ‘Linux has it all’. It makes people think that KDE is Linux, Fluxbox is Linux, CDE is Linux, etc. … we also have the BSDs, [Open]Solaris, HP-UX, AIX, [Open]VMS [not Unix though], Minix …
Some of the DEs/WMs were ported from Linux [KDE, GNOME, more], but some were not [CDE, more].
Linux is just the tip of the iceberg in the OS [and *nix-based, or similar] world.
We’re talking desktop, here, and realistically, Linux is the only UNIX variant relevant in that environment – if Linux desktop has just a few percent of the market, all those others you mention aren’t even a fraction of a percent. Not a lot of Minix, AIX, or VMS desktops out there…
Hey, we’re talking IT, not marketing or market share here. Why would anyone care for the stats? The fact is the fact: these DEs and WMs don’t belong to Linux exclusively, and some of us use them on other OSs.
Don’t forget that KDE runs on Windows as well.
Can we come up with something besides ripping off OSX? That screenshot looks identical to 10.6; even the icons are clones of the Mac. Apple Computer… err, sorry, “Apple, Inc.” is not the be-all and end-all of UI design. Some people don’t like how the top and bottom panels work in GNOME, but at least it’s something different from some candy-coated dock that looks great for the first year and then just becomes annoying to use. (dock icons bounce in my nightmares)
Different isn’t good enough. Ultimately, I’d hope the goal is to surpass OS X, but in order to surpass it, it must first be equalled. I like global menu bars, I like application-centric window management, I like an interface that works whether you have fifteen mouse buttons or just one. On the other hand, I grant you the dock – it’s a usability nightmare, drastically overdue for a ground-up rewrite.
Well, those features you like were different once too, before MacOS/OSX, but at least in GNOME you can change things around somewhat and use different themes, as opposed to OSX. I guess Apple no longer wants anyone to “Think Different”.
Wait, are you really suggesting that in order to be better than Apple, an interface must first become identical to Apple’s?
I agree with the previous poster that an overemphasis on recreating OSX doesn’t bode well for future UI development.
It probably goes for the same look, but it’s far from identical.
In any case, the elementary project is about more than just looks – it also aims to improve aspects of the software that are ignored by upstream developers (as in the case of Nautilus). It aims at making the Linux desktop refined and coherent.
The only thing keeping me from running a Linux desktop is the lack of attention to polish and elegance in UI details in the major DEs and Linux desktop applications. It seems that some Linux devs (and users) don’t care about such things. The elementary project does, though.
Oh, and stop focusing on the theme, for God’s sake.
Maybe not identical, but certainly derivative.
http://i.imgur.com/xzZwb.jpg
That’s pretty bad. I’m looking to STOP using OS X in the home — Everything at work is FBSD and Linux and I’ve been using FBSD since the 90s in one form or another. This is not what I want to see when I finally pull the trigger.
That screenshot is so sexy.
Every time another distro is created, some complain “Why do we need another distro? It’ll never have enough users to matter; it just takes away from distro A or B”.
I think it’s actually a good thing, the distro doesn’t have to become the “next greatest thing”, it just needs to scratch a particular itch, try some ideas that may be too drastic in the established distros…
If it has good ideas, the common large distros will adopt (or “copy”, if you like to complain) some of the changes, and we’ll all be better for it. Maybe the new distro presenting the alternative viewpoint won’t be needed at some point in the future, because everyone else implemented the good ideas from this one, and it’ll naturally die off because it’s unneeded.
Some good ideas are a bit extreme and may never get adopted or may take longer to get adopted (filesystem structure from Gobo comes to mind); but I definitely think they should be tried.
It’s OK to make a distro that looks good and supposedly improves usability, with what seems like an OS X workflow, but what about bugs?
What would help is fixing bugs that Canonical still haven’t fixed. Intel drivers, Plymouth, copy&paste without needing to keep the app open, stuff like that. This distro will still have the same filtered down bugs or problems from all the other Ubuntu derivatives.
It seems to me they are just taking the best parts of OS X and trying to make something that’s more usable. Well, the Linux desktop is usable, so I don’t see where this is going.
Huh?!?!?
klipper (KDE’s clipboard manager) has been able to do that since, what, KDE 3? 2?
– Gilboa
My original thought was “oh freaking great, yet another freaking distro”, but, damn, that screenshot looks good! This really is what Ubuntu should be. (Yes, I use Ubuntu every day.)
What exactly is wrong with copying the way OSX looks? Apple spends a lot of money working on the UI (along with MS), so why not take advantage of their work? Consider the original goals of the GNU project: from what I have read, Stallman used Unix but became frustrated because he could not modify the original closed-source Unixes, so he set out to create a free OS based on Unix. Was it a copy? Sure, but it fixed the problem with the original Unix, namely that it was closed source.
I think a similar argument holds for OSX. I love OSX, but it has one fundamental problem: it is tied to insanely overpriced Apple hardware. In my experience Linux has a much better-performing kernel, and certainly memory manager, than OSX. The problem with Linux is that the UI (any flavor, take your pick) generally sucks compared with OSX, and even Windows. Ubuntu 10.04 has improved things, but it’s still nowhere near the usability of OSX.
So, I’m going to give elementary a try.
Do you know what GNU stands for?
It doesn’t matter what it stands for; it’s just a name. GNU was developed as a free alternative to UNIX, similarly to how WINE was started to run Windows software on x86 Linux.
I guess the GNU and WINE acronyms (or just the N’s in them) were adopted to avoid trademarks and to stress that no copyright violation took place (both products are independent reimplementations of some APIs).
If not that, we’d probably settle for names like “Free UNIX” or “Windows Emulator”.
Moron, have you *EVER* used WINE? It *DOES NOT* emulate Windows. It translates Windows software calls into their Linux equivalents.
That’s why some Windows programs either don’t work under WINE or don’t work correctly. For instance, if a Windows program makes use of the Windows firewall, odds are it won’t work, or won’t work correctly, unless you can tell it not to use it, because the Linux firewall is not the same as the Windows one on a basic level.
Oh, that makes you look so intelligent. Bye.
FYI:
emulator
Main Entry: em·u·la·tor
Pronunciation: \ˈem-yə-ˌlā-tər\
Function: noun
Date: 1589
1 : one that emulates
2 : hardware or software that permits programs written for one computer to be run on another computer
I don’t mean anything negative by butting into the conversation, but I just feel like I could perhaps help clarify this. Yes, I do understand why people often liken WINE to an emulator; after all, it does indeed let you run applications designed for a different OS under an OS they weren’t meant for.
However, WINE does not emulate a computer. It does not modify the application’s code in any way, nor does it modify parts of the underlying OS. Instead it just passes certain function calls through to the underlying OS, perhaps adjusting the parameters so the function works properly. The code of the application itself is untouched. WINE itself mostly consists of a reimplementation of the WIN32 environment. Just as Mono isn’t emulating .NET, WINE isn’t emulating WIN32; they’re both just new implementations of the same old thing.
So, number 1 doesn’t apply to WINE. And since WINE does not allow you to run x86 applications on non-x86-compatible hardware, number 2 doesn’t apply either.
What does that leave us with? WINE, a program that allows you to run software designed for Windows under Linux — what would a proper term for it be? I personally would call it a Windows-compatible environment, or a Windows compatibility layer. Feel free, however, to offer any insightful comments or better phrasings if you feel inclined.
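As a toy illustration of that call-forwarding idea (this is NOT WINE’s actual code; the function names merely mimic Win32-style calls), a sketch in Python:

```python
import os
import tempfile

# A make-believe "foreign" file API, serviced by native calls.
# The application code below is never modified; only the calls it
# makes are translated, which is the compatibility-layer idea.
def CreateFile(path, mode):
    return open(path, mode)      # forwarded to the native open()

def CloseHandle(handle):
    handle.close()               # forwarded to the native close()

# "Application" written against the foreign API, running unchanged:
path = os.path.join(tempfile.gettempdir(), "wine_demo.txt")
h = CreateFile(path, "w")
h.write("hello from the foreign API")
CloseHandle(h)
print(open(path).read())
```

Nothing is emulated here: the calling code runs natively and its calls are simply serviced by native implementations underneath, which is why translated programs can run at near-native speed.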
How about API layer?
I don’t mean anything negative with butting in the conversation
Sorry about that. You’re more than welcome to join the discussion.
So, number 1 doesn’t apply to WINE.
I think you’re narrowing the definition a bit too much. Look:
emulate
2 entries found.
1. emulate (transitive verb)
2. emulate (adjective)
Main Entry: 1em·u·late
Pronunciation: \ˈem-yə-ˌlāt, -yü-\
Function: transitive verb
Inflected Form(s): em·u·lat·ed; em·u·lat·ing
Etymology: Latin aemulatus, past participle of aemulari, from aemulus rivaling
Date: 1582
1 a : to strive to equal or excel b : imitate; especially : to imitate by means of an emulator
2 : to equal or approach equality with
And since WINE does not indeed allow you to run x86 applications under non-x86 compatible hardware number 2 doesn’t apply either.
Definition says nothing about the CPU type or the implementation.
Yes, internally WINE is just an independent implementation of Windows API. But, aren’t VMWare or Bochs just software implementations of the x86 hardware interface? Isn’t VMWare limited to x86?
Like f.ex. Mono isn’t emulating .NET neither does WINE emulate WIN32, they’re just new implementations of the same old thing.
Very interesting point. I would say that Mono isn’t emulating .NET the specification but it is emulating .NET the implementation. In the end it’s pretty fluid, depending which of these really defines the platform.
Finally, let’s not be so fussy about a name. I’ve seen names like “Windows” or “Apache” applied to software products. 😉
Finally, let’s not be so fussy about a name. I’ve seen names like “Windows” or “Apache” applied to software products. 😉
Yes, let’s *DO* be so fussy about a name when idiots like yourself try applying the *WRONG* definitions to the item in question.
Do you actually have a valid argument for why wine is not an emulator, or can you only handle name calling?
Adam
As someone already said, WINE is not an emulator because it doesn’t emulate a full Windows machine; it just provides a translation of Windows system and API calls into equivalent Linux ones. This results in much better overall performance, at the expense of slower startup times.
Sorry, but if I look up “emulate” in the dictionary, wine meets the definition.
Adam
No it doesn’t. Unless *YOU* are using the new Apple-Approved English dictionary for the I-Pad.
http://dictionary.reference.com/browse/emulate
Yes, wine most certainly emulates.
Adam
Not according to the people who created and maintain WINE, and quite frankly, they have the *FINAL* word on the subject.
Move along, please…..
Hah… They can say whatever they want. It doesn’t make it true.
Adam
WINE is an API/ABI abstraction layer. Why can’t it be called an emulator? Because Windows is an OS and WINE is nowhere near that. Though WINE with Linux, as a tandem, could be called a Windows Emulator.
Yet, Mono does try to emulate MS.NET in all and every aspect, top to bottom.
Fair enough. I agree that in isolation (without Linux) WINE is not an emulator.
I don’t think this is a valid context, though. Software always relies on some sort of primitives (be it an interpreter, libraries, OS calls, or CPU instructions). By that logic every program should be called “a layer”.
Back to GNU & WINE: don’t you find it interesting that, when expanded, they actually include the words “Unix” and “Emulator”? This is a clear invitation to consider whether or not they have anything to do with Unix or a Windows emulator. Just think of all the discussions that have happened already. That’s how far you can go without violating someone’s trademarks and still get your message across.
WINE stands for ‘Wine Is Not an Emulator’.
I can say that I’m not a 34 year old male from the United States. That doesn’t mean I’m not a 34 year old male from the United States, though.
Adam
That is not the accepted definition currently in use in the industry.
The user-contributed Wikipedia article best describes the current definition in use in the industry.
http://en.wikipedia.org/wiki/Emulator
The problem with all these so-called experts who are trying to “improve” the Linux desktop is that they believe they know what’s best for users. Sorry. Users know what’s best for themselves, and that’s why I love Linux. I can have it my way; you can have it your way; and we’re both happy. Why does there have to be a standard? For newbies? Sorry, I don’t buy that excuse. There are plenty of distros that cater to Windows refugees by theming the Linux desktop to resemble XP, Vista, or even old Win9x. There are also distros that cater to the Mac look and feel. The GNOME and KDE default setups are pretty intuitive in their own right, so why even resort to theming? Half of these so-called “easy” distros are Ubuntu re-spins, the other half Debian, so compatibility with the most popular software is there. In the meantime, leave my distro alone!
Couldn’t have put it better myself. The real problem here is that the *VAST MAJORITY* of these so-called UI experts are leeches trying to make a name for themselves at the expense of the Linux community.
I wouldn’t be that mean. When you compare a well-configured GNOME on a well-maintained Linux distribution, the difference as far as UI consistency goes is no worse than Windows. Sure, it is no Mac OS X, but I sit back and laugh when I hear people try to claim that somehow the OSS UI is far behind the times.
What is wrong with the desktop, from my standpoint, is the incompleteness: KDE still using HAL; projects that use HAL aren’t moving (check out GIMP: HAL is being deprecated, and a GIMP programmer literally told the GNOME project to go f-ck themselves, as one example); and GNOME applications aren’t having their bugs fixed and are too rigidly bound to Linux, with patches to support alternative operating systems being turned down (there is a reason the *BSDs and OpenSolaris maintain so many custom patches). So it is the underlying infrastructure that is the problem, not the presentation.
As for the rest, OpenOffice.org is simply horrible; here I am in 2010 and they still don’t support Chicago-style citations, for Christ’s sake. It is things like that that hold back student adoption, blocking an avenue which would otherwise have won over an end user for life. The UI is aesthetically unpleasing, and yes, when something isn’t attractive it doesn’t help usability or productivity either. But again, that is outside the purview of KDE and GNOME, given that OpenOffice.org is an entirely different project altogether (I always got the impression, though, from the likes of Miguel that he wished there was something better).
But back to the article: what is the focus of these UI improvements? There seems to be a small group of people hell-bent on taking over the world who do zero real programming. On the other hand, there are those who do the heavy lifting who aren’t interested in taking over the world, simply working on what they want for their own benefit. Until those with grandiose visions of world conquest actually contribute to the development of what needs to be developed, the situation that exists today will persist.
As for what I’d like to see: FreeBSD + better hardware support + a KDE that hooks into FreeBSD’s system features instead of relying on HAL. If that were delivered tomorrow it would be goodbye Apple, hello hypothetical desktop operating system. What I demand from an OS isn’t unrealistic; it is depressing that such an easy goal cannot be achieved.
Just to avoid misunderstandings.
KDE does not rely on HAL.
HAL is one possible way to access data about the system KDE is running on, i.e. there is a KDE platform plugin that accesses the D-Bus API usually provided by HAL.
Therefore there are two options on how to get system data from other facilities:
– implement the HAL D-Bus interface and use KDE’s HAL plugin
– implement a KDE plugin for this other system facility
Option two is probably the better choice, especially in the face of HAL having been abandoned/deprecated by its own developers.
Until someone actually sits down and codes up a non-HAL-based Solid backend, KDE relies on HAL.
… stop pretending that any more than a small fraction of users actually care.
…stop flame-baiting
We need two kinds of UIs for Linux/*BSD: one ultimate GUI for consumers who aren’t going to compile anything but just want to play a few casual games, browse the web, and manage e-mail; and another GUI or TUI for those who want to tinker.
The lack of a single face still confuses ordinary consumers, among other things.
It’s great to see people working toward such a goal but they’re not going to get far until everyone agrees. At least, the latest Ubuntu release looks appealing for the most part, so a few are blazing a trail.
Don’t like the interface at all. Haven’t brushed-metal aluminium interfaces and blue icons already been done to death?
Blue icons I’ll grant you — and the dock and the translucent-black windows — but where did you see any brushed metal? It looked to me more like the smooth-gradient gray that Leopard and Snow Leopard use.
I actually really like this theme; it’s obviously very similar to the Mac but something is better about it. I just can’t put my finger on it.
I might give it a spin on this Ubuntu install I’ve been using; I do miss my Hackintosh install on this computer but I’ve come to find that OS X really doesn’t work right on non-Macs.
The problem is they are all copying. While it is nice to have an interface you can use because you already know it, that isn’t really something that makes desktop users switch. Both Windows and OS X have always been copied. The two things that usually interest the average desktop users I know are its security and the fact that it’s free. It used to be faster (from the view of a desktop user, so I’m speaking of responsiveness), but I think Windows 7 pretty much changed that. They are now both responsive enough.
The real problem is the same as it has always been: a user can’t use the (very same!) application he used to use on Windows. This is changing with web applications and the browser being the primary application nowadays. So I think Mozilla Firefox is something that makes people switch. But not all applications run in your browser (yet?), so if you want to increase the market share, one should make sure that many software vendors build native applications and that every Windows application runs using WINE.
But honestly, I don’t really care which OS people use.
If you’re going to copy OSX, at least do it right. Use GNUstep/Étoilé and finish the damn APIs. Get all the open-source OSX software ported to Linux; by that point your APIs should be close enough to OSX to get a lot of the professional Mac software ported. Sure, you’ll never get Apple porting apps, but you can still get to a very usable system. You might even get Adobe porting if you do a good enough job.
There are 200,000 iPhone/iPad developers; hitting them up for code on Linux would be a decent way to expand the developer base of Linux. Plus there are a bunch of cool toys we could get ported. ATM GNUstep is working on porting CoreGraphics to Linux. It would be interesting if Linux could run most iPhone/iPad apps.
Finishing GNUstep would be nice, but I don’t think it’s ever going to happen. I tried GNUstep (Ubuntu 10.04, installed from Synaptic) a few weeks ago, and it is an absolute disaster. It’s still about 5% there; I really can’t tell what has changed since freaking 1996 or 1997 when I tried it last. WTF have the GNUstep devs done in 14 freaking years???
There might be some hope for a new re-implementation of the Cocoa apis called cocotron: http://www.cocotron.org/
In a few years he’s done more than GNUstep has in its entire history.
Cocoa really is a wonderful toolkit; it should be easy to wrap it around GTK or Win32. No idea why GNUstep wants to roll its own rendering layer.
So, yeah, it would be my dream to have a 100% free, open-source, OSX-compatible OS; if that ever happens, I doubt GNUstep will have had any part in it.
We have Mono already, and it seems gazillions of Windows developers didn’t suddenly start writing Linux applications.
BTW, how do the GNUStep guys feel about recent actions of Apple? Do they still feel objc ecosystem is something they want to support on their volunteer time?
He is doing awesome work.
It shouldn’t be improved. It should be killed. We don’t need desktop anymore. We need user interface that is better than MacOSX UI and/or iPhone OS UI. I invented it, you should just start to implement it. Cheers.
I’ve said this before, the biggest reason I see for so many damn flavors of Linux is that NOBODY gets it “just right”. Current Linux vendors just can’t throw the manpower at UI designs and get every aspect of the look and feel right. I have tried many Linux FOTM releases and I end up tweaking every one of them to overcome some deficiency. I might die of a heart attack if I ever installed a new Linux OS that didn’t require hands-on attention.
Apple gets it right because they have virtually unlimited resources to make the user experience a pleasant and consistent one.
Apple spends tons of money on usability, designers, GUI R&D and so on… Linux desktops must reimplement the Mac OS X desktop. That’s all.
Please Linux developers/vendors… don’t waste precious resources reinventing the wheel. Just rip Apple off!!
Errr… No. Really, no.
Had Apple spent much on usability and GUI, they wouldn’t have rolled out horrors like the Dock, application-centric window management on a single virtual desktop, and the Finder.
If desktop Linux distros start to mimic OSX, I’ll consider that the Linux desktop has failed to deliver an interesting product, and just go back to Windows and say goodbye to easy development. Or go arcane and try my best to get used to Haiku. OSX is far from wondrous in terms of user interface; that’s one of the reasons I’m not using it.
I don’t like it either. It’s ok if you just launch a few programs but switching through a dozen programs in the dock gets annoying. The other issue I have with OSX is that it doesn’t make proper use of the right mouse button.
I think a better strategy for the Linux desktop would be to build around a cross development platform (Qt) to attract developers. People turn on a computer to use applications, not screw with the UI. Distros need to work to make life easier for cross-platform, proprietary developers. Stallman’s plan of having the people’s army code everything has been a failure.
I wouldn’t go as far as saying that, since most of the software I use on a daily basis was coded by the people’s army and works fairly well. I’d rather say that we don’t need proprietary software for everything, but that we should not bar it access either.
Two tasks of a Linux distro that are often ignored, in my opinion, are to reach API consistency (instead of GTK here, PulseAudio there, Qt up above, and VDPAU somewhere in the wild) and to heavily document said APIs in an easily accessible way.
As an example, for scientific calculations the combination of Python and some science-oriented APIs is becoming astonishingly widespread, but most people still prefer Matlab over Python. (I chose Matlab because, contrary to other scientific software like Mathematica and Maple, its syntax is on par with that of Python in terms of awfulness and unsuitedness to the job, in my opinion.)
What are the two top differences, before anything else? Matlab has a huge and helpful help system, and its various commands are tightly integrated with each other. By contrast, with things like Numpy and Scipy, all you get is a bunch of HTML pages (which already feels clunky and unprofessional to start with), and the commands do not feel integrated with each other (as an example, when you want to introduce a formal parameter in Python, you can’t just use a variable without assigning a value to it; you will get an “undefined symbol”-like error).
Today, on Linux, when Adobe wants to decode an H.264 stream using the hardware in Flash Player and asks the community which APIs are available, the answer is something like “Xv, VA-API, VDPAU (and some others)”.
On OSX, the answer is like “Use api X in the latest safari or fallback to software rendering”. Guess which platform gets the most polished release in the end…
In my opinion, by using a single coherent set of APIs and a good documentation that’s fully available at a single place, the Linux world would ease the life of both proprietary software developers and amateurs. Better software availability would ensue.
Most of it? You must use a lot of GNU utilities. Most of the open source software I use was built by corporations like Google and then released as open source to the public.
At home, I’m most often using…
-Core : Linux Kernel and various free drivers + software (FOSS, indep.+GNU)
-CLI : Bash (FOSS, GNU)
-UI layer : Xorg+KDE 4+KDM (FOSS, indep.)
-Firefox (FOSS, can arguably be considered as a new codebase seeing the time which elapsed since Netscape code was released so indep. or Netscape as you wish)
-GCC (FOSS, GNU)
-GNUplot (FOSS, GNU)
-Binutils (FOSS, GNU)
-Bochs (FOSS, indep.)
-OpenOffice (FOSS, Sun)
-GIMP (FOSS, GNU)
-VirtualBox (FOSS, Sun)
-Audacity (FOSS, indep.)
-Nvidia’s proprietary driver (Proprietary, NVidia)
-Adobe Flash Player (Proprietary, Adobe)
Proprietary software has its place on my computer, and Sun has done a lot for the open source community before the Oracle thing, but GNU and independent software still takes the biggest part…
I don’t quite understand that sentence. Do you mean you use Matlab because it is not quite as awful as Maple and Mathematica, and on par with Python?
I have to disagree: Matlab syntax is pretty awful compared to Python. I actually switched to Python because Matlab syntax was annoying the crap out of me. 90% of errors are finding the missing dot. Also, I guess you never had to write a GUI for your Matlab code. Python is so far ahead of Matlab in that respect (Matlab GUI code makes my eyes bleed).
Why are HTML pages unprofessional? That’s what pretty much every software manufacturer uses for help pages, AFAIK. Also, I actually never use the HTML pages or the Matlab help system. I just type “help command” in the prompt (I do the same in Matlab whenever I have to use it). That said, numpy/scipy still do need to work on their documentation (actually numpy made huge strides last year in their summer of documentation).
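For what it’s worth, stock Python offers that same prompt-driven workflow through its built-in help system; a quick sketch using only the standard library:

```python
import pydoc

# At the interactive prompt, help(len) pages this same text;
# pydoc.render_doc returns it as a string, handy when scripting.
doc = pydoc.render_doc(len, renderer=pydoc.plaintext)
print(doc.splitlines()[0])   # the "Python Library Documentation: ..." header
```

NumPy also ships a keyword search over docstrings (numpy.lookfor("polynomial")), which goes some way toward the command-discovery use case rather than just syntax lookup.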
Another thing, if I compare the documentation of the Matlab language (not the commands) to the documentation of the Python language, the Python documentation is way better.
I don’t quite get what you mean, something like
def fct(x, *args) ??
No, I chose Matlab because its syntax is on par with Python’s in terms of awfulness, in my opinion. Python is a good programming language for prototyping and low-performance software, and insane people can even use it for other use cases and get cursed by their users. Just like C#, ActionScript, and Java. But its syntax is ill-suited to scientific calculation. You can use it for that, but it is painful. Matlab has an incredibly crappy syntax which manages to be on par with Python’s in unsuitability, even though it is theoretically suited for the job.
You’re preaching to the converted, though for me GUI programming and calculation should be left to separate languages. I HATE Matlab’s syntax. Every time I have to use it, I try my best to remember the commands used to launch the GUI tools and never, ever use its command line, except when forced to do so.
Seriously, should any sane person have to use tf([1],[1,1]), or some other trick like s = tf([1, 0]), in order to input 1/(1+s)? I don’t care if transfer functions are treated separately by the software; for me it is just a rational function and should be treated as such.
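For comparison, SciPy (a third-party Python package) exposes essentially the same numerator/denominator interface as Matlab’s tf, so it inherits the same complaint; a minimal sketch:

```python
# SciPy's analogue of Matlab's tf([1], [1, 1]): the transfer function
# 1/(s+1) is still entered as numerator/denominator coefficient lists,
# not as a plain rational expression.
from scipy import signal

H = signal.TransferFunction([1], [1, 1])
print(H.num, H.den)   # coefficient arrays for numerator and denominator
```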
No, they aren’t that widespread, except maybe for the content. People often use Windows’ CHM or some in-house solution, if only for one good reason: searching. In a good help system like that of Maple, Mathematica, or Matlab, you search for keywords like “polynomial”, “linear algebra” or “inequation plot”, and you get all the related functions, sorted by relevance if the developers took some time to polish it. With HTML help, you get a web browser, a tool made for something else which doesn’t provide indexed search facilities, so you have to full-text search the index, which requires you to know the exact name of the command or the category it’s filed under, and to wade through loads of irrelevant answers.
1/ Is Python able to provide, in such a case, an extensive help page describing command use, options, and examples with results, with links to related topics, in a non-cluttered way like that of Mathematica and Maple? (the “click-to-expand” way)
2/ Again, this only fits one use case of the help system, namely checking the syntax of an already-known command. Command discovery remains poor with this help system.
I was talking about the help system’s shape and use cases; the contents are another matter. (And that Python’s core syntax is better documented is logical, since it is that of a general-purpose programming language, more complex than a specialized one, and generally not taught in courses, contrary to Matlab which, alas, is.)
Mathematica:
a = 3*x
Result: 3x. Plot[a, {x, 0, 4}] will work.
Python:
a = 3*x
Result: NameError: name ‘x’ is not defined. Plot will not work.
Things like that make a general-purpose language a pain to use for everything that’s mathematics-related.
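For completeness: this gap is what the third-party SymPy package tries to fill. A minimal sketch, not a claim that it matches Mathematica’s behaviour exactly:

```python
# With SymPy, undefined symbols become parameters, much like in
# Mathematica, though they must be declared explicitly first.
import sympy as sp

x = sp.symbols('x')     # the extra declaration step Mathematica doesn't need
a = 3 * x               # a symbolic expression now, not a NameError
print(a)                # 3*x
print(a.subs(x, 4))     # 12
# sp.plot(a, (x, 0, 4)) is the analogue of Plot[a, {x, 0, 4}]
```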
Just one thing. Whether or not existing HTML help systems do this, I want to say that there’s no reason an HTML-based system can’t have a kickass search feature if it’s running on a server somewhere. Actually, I think some systems even do it in JavaScript, so you don’t need a server.
What I really take issue with is your statement that a web browser is not meant to display documentation. That is bollocks. A web browser is designed, and has been from the beginning, to show interlinked content, mainly of a textual variety, in a formatted way. Displaying documentation pages is probably closer to the basic principles of HTML and the web than many of the other uses that web browsers now serve.
Why do you separate calculation and GUI? That sounds awfully complicated to me; suddenly you have to learn two languages plus the glue between them.
It seems to me you want symbolic math manipulation, not really scientific computing. So something like Mathematica, or Sage on the Python side, would be better suited to your needs.
Actually, that was my point: the content is very often HTML, they just write a specific help browser for the HTML pages.
Agreed, a specific documentation search at docs.scipy.org would be great. I disagree, however, that a web browser is the wrong tool; if the HTML pages are well structured and written, a browser is just as good as a specific help system.
I can’t comment on Mathematica and Maple, but the Python packages do provide rather extensive help, which describes all options etc. There’s also the short help, which provides you with a short usage string if you use ipython’s ?<command>.
Agreed, but in my experience that’s not really much better in Matlab’s system either. E.g. the last case I recall was searching for how to write to a text file in an arbitrary format; if I hadn’t known fprintf from C, I don’t think I’d ever have found out using the Matlab help system.
Well, especially if it is taught in courses, I’d expect the syntax to be well documented.
Well, I can only repeat what I’ve said above: you really want symbolic math manipulation, and that cannot be provided by a programming language like Python or Matlab alone. I’d argue that this is another use case entirely, and although there are some packages for Python (and I think for Matlab as well), they will never perform as well as a language specifically built for that purpose. However, if you want to do numeric simulations (like I do), which often don’t fit into systems like Mathematica, I find using a proper language like Python a lot more comfortable than using Matlab.
Because mathematics has a grammar and user needs that are different from those of a general-purpose programming language. In mathematics, an undefined symbol is a parameter.
In a GP programming language, which all toolkits are based on, an undefined symbol is an error.
In mathematics, almost everything is expressed at the command line.
GUI programming, on the other hand, can only be made a fun task provided you’ve got the right tools.
Software that does its maths correctly already weighs several GB and eats up considerable power.
Adding a GUI on top only slows it down further, while for most people its performance is already unsatisfactory.
In my opinion, adding GUI functions to scientific calculation software means adding huge bloat and making the language or its interpreter (your choice) more complicated, for a benefit that’s pretty small, because 99% of the time you’ll be using it to do calculations, and for most people it is 100%. It’s not optimizing the common case; it’s adding features for the sake of adding features. Just like when Microsoft manages to eat up twice the HDD space in a new release of Windows while the user experience is left unchanged for most people.
Everything which does calculations must handle symbolic expressions to a certain degree.
When you type Plot(3*x, x=0..5) or whatever in numeric calculation software, you’ve entered the symbolic expression 3*x. What makes symbolic math software different is the way it allows you to act on those symbolic expressions, to apply transformations to them (expand, simplify, factor, and so on). But all software which claims to be able to do some maths (and Matlab fits in this category) should be able to parse and store a symbolic expression in a variable.
When you tell Matlab s = tf([1, 0]), a kind of symbolic expression is effectively stored in s. So why separate transfer functions from other rational functions?
It depends on the level of professionalism of the documentation team ^^ The guys who wrote the help systems for Maple and Mathematica took the time to write them in Maple/Mathematica, which allows the user to dynamically change parameters in the examples and see the result. Though this is a small feature which shouldn’t get much attention, it gives a pretty nice feeling when it is present. It looks like those people actually wrote online help suited for online-help purposes, instead of just putting an electronic version of the paper manual in the software bundle.
I agree. If the developers take the time to make a good web page, they can reach the functionality of a good help system. However, I’m not sure that it is doable without some kind of PHP/MySQL programming and hence an external server (professional software should be independent of any form of internet connection). And it will still feel unprofessional, because you’ll get several things around it which have nothing to do with help: web bookmarks, private browsing…
Kind of like UNIX man pages, or something more colorful and mouse-friendly?
Well, I’m pretty sure that keywords like “writing text file” could have done wonders, but I don’t have Matlab set up on this computer, so I can’t test it…
Well, for numeric simulations I either use Mathematica’s NDSolve-like tools when I’m lazy (they work quite well if you take the time to set them up when the simulation is nontrivial) or C++ code when I want to fine-tune everything. I suppose that Python would fit the second use case quite well if you need some higher-level stuff or want to code quickly and don’t mind the performance hit (which can be big in numerical simulations; that’s the reason why I learned C after playing with Python for some time). But it just lacks the “lazy” aspect of other scientific software which takes equations and various settings as input and returns results as output.
The most elementary fact here is that this distro is a dead end. Copycats of other UIs don’t really succeed. Surely people involved with Linux should have grasped this basic idea by now.
From a UI standpoint, when an interface mimics a more popular one to such a detailed degree, it actually causes MORE user frustration, not less. The reason is that when it LOOKS the same, people expect it to behave and respond the same as well. No Linux distro is going to behave the same as a Mac, so aping the UI simply damages the distro.
Sure, a few people who can’t afford real Macs might run it, but other than that… it’s irrelevant.
Why not work on something truly useful instead, like creating an easy-to-use distro where upgrading apps is as simple as it is on Windows and Mac? Linux is surely a decade past due on THAT simple request.
O_o How? In my experience, the central repository system of Linux, on the contrary, makes updates a much easier and smoother experience than on Windows and OSX…
Except, of course, if you’re referring to those distros which just push buggy updates right away, instead of heavily testing them before moving them into the repositories, at the cost of not having newly released upstream updates available immediately. But that’s not the case for all of them. I’ve yet to see an update break the system on Debian Stable (or even Testing) ^^
/begin ad
More seriously, for people who don’t like the make-your-own approach of Debian: my distro of choice, Pardus, never broke with an update; pretty much everything worked out of the box and has continued to do so ever since. Even when upgrading to the newest 2009.2 release, which I did with extreme caution and backups because of my past experience upgrading Windows and Linux to new releases, everything just worked perfectly fine. Its packages are of acceptable freshness considering the excellent stability.
/end ad
My advice in that area is to use a rock-solid distribution and then add bleeding-edge repositories for specific software only if you need them.
I’ve yet to see a simple, easy-to-use, universal solution to upgrading apps on Linux.
I’m talking about not having to upgrade your whole distro just to use the latest release of your favorite music player, etc…
This happens all the time.
“For example, if Ubuntu ships with OpenOffice.org 2.0.x, it will remain at OpenOffice.org 2.0.x for the entire 6-month release cycle, even if a later version gets released during this time. The Ubuntu team may apply important security fixes to 2.0.x, but any new features or non-security bugfixes will not be made available.”
Sure there are “backports” but lots of apps never get this treatment.
This is not normally an issue on Windows or Mac OS X.
Oh, but you *can* always have the latest releases, or even pre-releases, with either rolling-release distros like Arch, “testing” repositories, or by downloading and installing a .deb/.rpm of the update.
I prefer not to do so when I don’t need to, because I prefer the increased stability of a stable installed base. But you can do it. As an example, my old Ubuntu box had repositories for GIMP betas and Emesene nightlies. Others have one for Opera.
Just imagine for one second, knowing the famous quality of Nvidia and ATI drivers on all platforms, if they were always updated to the latest release on Windows. Your computer would effectively be broken quite often.
Option 1: “Rolling-release” distros *are* essentially upgrading portions of the OS as you upgrade apps, etc… it just comes pouring down the wire and stability suffers, as a result.
Option 2: “Testing” repositories are not a solution at all, because a) not every app has one, and b) you generally have to manually set up the repository for each one (that might, possibly, maybe, happen to be available).
So, Option 1 is the equivalent of having to continually install beta releases of Windows or OS X just to get the newer apps available. Option 2 is not particularly user-friendly and it’s incredibly spotty at best.
Neither of these options provides the previously mentioned ease of use in comparison to Windows or OS X. Heck, just about all the apps I currently use on OS X even check for their own updates and install them for you, upon approval, without me ever having to do a thing – and I never have to worry about the stability of the OS or that the core might be tampered with…
Remember, I never said that there weren’t ways to possibly upgrade apps, but these kludges are simply not suitable in a modern desktop operating system. The OS and the applications should not be so tightly bound to one another and I think the main failing here is the way libraries and resources are handled by the OS.
What about option 3? Not every application provides binary packages, but most of the big ones (the ones you probably want to upgrade) actually do, just like on Win/OSX…
This may have changed since I last tried using binary packages, but doesn’t that bring into play the issue of dependencies – both in possibly having to install something secondary that affects the core of the OS (and potentially breaks another app), and in having said dependencies available for your particular distro?
Oh, yeah, that’s right… Yes, as far as I know, there’s no way to avoid this issue, except upgrading. Linux distros are kind of an “all-in-one” product…
(Again, I didn’t care enough about that to look into it. As long as something works, I’m cautious about updates which may stop it working. Consider all those Office users who upgraded to 2007 and had to re-learn the UI from the ground up…)
Hit the nail on the head. With disk space as abundant as it is today, app encapsulation is the way to go.
Actually, the overall layers of userland libraries first need to be better defined. When it comes to e.g. graphics and multimedia libraries, there is no standard base. So it’s an all-or-nothing approach: every library depends on every other library, all the way down to the core of the system. That makes app encapsulation quite a bit more difficult.
Exactly! …and if a group is going to chase after OS X for anything, it should be the way it handles applications/libraries, not simply cosmetics.
There’s no reason that a Linux distro can’t do this and the fact that there have been all these attempts at landing on the desktop without anyone satisfactorily solving this issue, at this late date, is kind of mind boggling.
The deal is that, with the desktop becoming a less and less interesting target, reusing already-loaded libraries is interesting again. It’s not as much about disk space as it is about memory usage. Having multiple versions of the same library loaded at the same time eats all your RAM.
That’s no longer an issue when every new laptop has at least 2 gigs.
What does Linux have to lose by moving away from the shared library system? Marketshare?
I’m really surprised by how many people are defending the status quo at this point.
Security is an issue too. If I maintain a GTK app, I shouldn’t have to care about GTK’s security flaws and constantly keep the GTK version my app bundles up to date. Keeping my own software secure is time-consuming enough already.
Library interdependencies aren’t needed to have a secure repository of software.
That’s right: there should preferably only be software->library dependencies and not any kind of library->library dependencies. Sadly, this is hard to achieve as API size grows. Modern releases of Windows also have library interdependency issues, as far as I know, by the way.
(PS: I just remembered… When I set up a Vista partition on my grandma’s Linux box some time ago in order to make a small LAN with my cousin and my brother, I noticed that you have to apply all updates before the newest OpenOffice dares to run. This was before the SP1 days. So isn’t upgrading system packages somewhat required on Windows too?)
This is a fair point in regards to certain application upgrades on Windows requiring a certain level of OS updates in order for the new version of the app to work, similar to updating system packages on Linux for upgraded apps.
The big difference I see between the two, however, is that on Windows those app upgrades are only asking for “official” system updates.
While it’s not always the case, with Linux, you may have to resort to unofficial 3rd party, unstable, or bleeding edge sources for dependencies that are not available via “official” channels. I don’t think Windows or OS X has a comparable situation to this – except maybe Fink but I’ve never really dabbled with that and I’d wager that 98%+ of OS X users haven’t, either.
Programs sometimes need the latest version of .net or java but that’s a huge difference from requiring a major OS upgrade to update an office suite. There are shared libraries in Windows and OSX but they are stable and provide backwards compatibility. You don’t have to worry about a .net upgrade breaking an existing .net program.
It isn’t like Linux, where programs share all kinds of third-party libraries, which leads to endless dependencies. Even if a distro follows the LSB, you still have programs that dump new libraries into the system, which another program might want to update in order to upgrade, which in turn requires the first program to be recompiled; or the user can just stick with a frozen system. It’s a mess.
It isn’t like Linux, where programs share all kinds of third-party libraries, which leads to endless dependencies. Even if a distro follows the LSB, you still have programs that dump new libraries into the system, which another program might want to update in order to upgrade, which in turn requires the first program to be recompiled
That ain’t true. You’re twisting things and you most likely know it. Programs do NOT need to be recompiled if you update one shared library or another.
And yes, I know you are on some sort of personal vendetta against Linux and all of its users, but continuously ignoring all the failures of Windows and OSX just makes you look ignorant, not trustworthy or knowledgeable. For example, on Windows and OSX every single goddamn application has to provide its own updating mechanism, and you either get a gazillion system-tray icons for all of them or you have to open every single application separately to keep them all up to date. No modern OS should be without a centralized update system. Just as you so vigorously keep touting this or that feature as the reason Linux is not ready for mass consumption, so too could I mention the utter and complete lack of a centralized application catalog and updating system as a reason Windows and OSX are not ready for mass consumption.
Hell, the lack of a trustworthy, secure, centralized application catalog is the worst one of them all: the user is either forced to buy an application from a well-known vendor, and the application might not fulfill their needs or might be complete overkill, but the user can’t really do much about it as (s)he might not know of any alternatives. Or (s)he could Google around, download installation files from the internet, and possibly infect the computer with malware.
I’m giving an example of a library conflict. App A requires lib 1.1 while App B requires 1.2 which breaks compatibility with 1.1. App A has to be recompiled against the new library if the user wants to install both A and B. Or are we going to start denying library conflicts now? This is what I’m talking about when it comes to the mob not even being able to acknowledge that problems exist.
This is OSNEWS and I’m giving my opinion on why an OS has been stuck at 1% for over a decade.
You can certainly argue that all modern systems should have central updates. However, with novice users I can solve the Googling-malware problem by providing a link to Softpedia and telling them not to download software from anywhere else. The only way I can prevent Ubuntu from breaking things with updates is to freeze the system, which is a security compromise.
I’m giving an example of a library conflict. App A requires lib 1.1 while App B requires 1.2 which breaks compatibility with 1.1. App A has to be recompiled against the new library if the user wants to install both A and B. Or are we going to start denying library conflicts now? This is what I’m talking about when it comes to the mob not even being able to acknowledge that problems exist.
I have not seen such a case in YEARS. If a new version of a shared library is incompatible with an earlier version, it will be installed ALONGSIDE the old version, not replace it. That’s why there are version numbers like x.y.z: if only z is incremented, the library is compatible with previous versions of the same x.y; if y is incremented, it means it’s incompatible with the previous version and will be installed alongside it.
You haven’t used Linux in ages, have you?
That’s what library version numbers are for.
App A links against lib-1.1. App B links against lib-1.2. Unless someone did something really stupid (which, I will admit, can happen), those two versions of the library can be installed side-by-side. Each app will just use the version it was linked against.
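A quick sketch of the side-by-side scheme described above, using made-up library names and version numbers in a scratch directory:

```shell
# Two incompatible major versions of a (fictional) libfoo coexist;
# each app follows the symlink for the ABI it was linked against.
demo=$(mktemp -d)
cd "$demo"
touch libfoo.so.1.4.2 libfoo.so.2.0.1
ln -sf libfoo.so.1.4.2 libfoo.so.1   # ABI 1: compatible fixes retarget this link
ln -sf libfoo.so.2.0.1 libfoo.so.2   # ABI 2: incompatible, installed alongside
ls -l libfoo.so.*                    # both major versions live side by side
```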
That’s better than the traditional Windows solution – stick your fingers in your ears, and pretend the problem doesn’t exist. That bought us DLL hell. Later, it bought us applications that use their own private copies of every library they use, making them impossible to update (like the GDI Plus security issue from a few years back), and eliminating all the advantages of using shared libraries.
That also appears to be Apple’s solution, for what it’s worth. Really, shifting all the burden onto application developers is no solution at all. At least on most Linux systems, there IS a solution, even if it’s not perfect.
Guess what Microsoft’s eventual solution to DLL hell was? Side-by-side assemblies – libraries are installed into the operating system, and with a unique version number. Apps link to a specific version of the library. If you install an updated version that maintains backward compatibility, the installer creates an alias that allows older programs to use the new version (the same as the symlink set installed with a typical library on Linux). If it breaks backward compatibility, you get a new version number.
It has some of the same advantages as using a package manager. You get to safely use shared libraries without breaking applications. You’re able to update shared libraries with critical bugfixes or security updates, without breaking existing applications. No central update solution though – the library developer still has to wait for application installers to update the version.
Of course, nobody but Microsoft actually uses the damned thing. Everyone keeps using the same old non-solution of using private copies of all the DLLs, with all the problems that brings. That same solution works on Linux as well, by the way. A good chunk of commercial Linux software does this.
What office suite? Last time I checked, you could download new versions of openoffice for Linux as well.
Not sure about .net, but things can certainly break when updating java, and even when moving away from one single specific version.
Case in point: HP Service Desk (and probably newer iterations of it).
1/ OpenOffice does not require .NET, and the required version of Java is bundled in the package I chose to download. There must be something else in the Windows ecosystem which is required to run this app but is installed by an update. Sadly, contrary to the Linux world, where at least CLI messages are self-explanatory about which library is missing, the app kept dying with an unexplanatory message, and I thought for a moment that it was incompatible with Vista…
2/ Backward compatibility is broken during major updates on most desktop OSs. I’ve yet to see any 2D version of Worms running on Windows >= XP without colors randomly going into LSD mode after some hours of playing (whatever compatibility mode is set only helps by preventing LSD colors from appearing right from the start). This is exactly like GTK 1.x keeping backward compatibility with itself, but GTK 2.x being incompatible with it. The good thing is that with Linux, you can keep GTK 1.x *and* GTK 2.x on your hard drive without getting 12 GB of bloat (ever had a small HDD filled up by that annoying WinSxS folder, without knowing what you can remove inside it?)
2D Worms was programmed with Windows 95/98/Me in mind, not the NT line… most games that are 95/98/Me-only are a total PITA to get running. The solution: most of these games have been re-released for nothing (GTA 1 & 2), or re-released on Steam for less than a pint of beer down the pub. (BTW, Worms Armageddon runs fine on 2000 and later if you download the latest beta patch.)
I suspect this goes for many applications that were programmed only for Windows 95/98/Me, which, unless they are bespoke business apps, can be replaced.
Even my laptop (a Dell D430) with a 60GB hard drive, which has Windows 7 installed, can deal with Windows 7, Office 2007, Visual Studio 2010 and SQL Server 2008, and I still have about 30GB free… what takes up room is bullshit like MP3s, movies and games, which all exist on my desktop, not my ultra-portable… really, who gives a monkey’s that 12GB is taken up by the default install? 250GB hard drives are less than £40 now.
You mean like the all-in-one bundles you see on OSX? I *heavily* disagree with that. I gave my girlfriend a pen tablet for Christmas, which came bundled with a Photoshop Elements licence. I discovered that there was no DVD in the box; it was available for download only. I then said, “well, no problem”. That’s what I thought. But downloading 2 GB of data over a crappy Wi-Fi network is a pure nightmare. After the third time the download stopped without warning an hour in and refused to resume except by re-downloading from byte 0, I just gave up and taught her how to use GIMP, even though the OSX version belongs in the hall of shame of the crappiest software ports, in my opinion. Software packages are *much* easier to download when they’re small.
Yeah, GIMP on OSX sucks; use Seashore.
But I would fix her Wi-Fi or plug into the LAN, since Photoshop Elements is worth the download.
Well, maybe I will advise her to try that, though it seems a bit too primitive in some areas at first sight (does it support brush dynamics and tablet pressure? My girlfriend uses her tablet extensively for painting, so lacking that is a showstopper). Plus, the UI looks easy to learn but not very powerful for everyday use, since it doesn’t have a powerful docking system like that of GIMP and PS, which lets you have everything available at first/second sight.
I tried my best with the Wi-Fi, but her house is one of the few places that manage to make me hate computer networks even more than I usually do.
Let’s quickly picture the setup: on the ground floor, there’s the router/modem, which can’t be moved, along with a computer which is close to permanently in use by her little brothers. On the first floor, just above, another computer is connected by Wi-Fi to said router. Her room is on the opposite side of the house, also on the first floor.
The issue is that there are very, *very* thick walls in there. You put a Wi-Fi G router on one side, and you can’t detect the tiniest bit of signal on the other side. I had to hack together an artisanal repeater myself with a WRT54G and install it before she could even grab some signal and painfully manage to watch YouTube in her room. Powerline networking wouldn’t work either, because the first floor, where her room is located, is on a separate electric circuit. Making the LAN reach her room would require at least 100 meters of wire, and her parents are not keen on letting me drill a few holes in the floor and the walls, though they agreed to pay for the WRT54G plus a small fee for my work in lunch vouchers. She obviously doesn’t want to leave her computer next to her brothers’ computer.
Plus, if I remember correctly, we had to download the software within the week following registration. And I can’t find anywhere on Wacom’s website where customers can retrieve their software bundle anymore. At the time, I just copy-pasted some URL mentioned on the various pieces of paper bundled with the tablet.
I’ve gone over this a thousand times here, so don’t waste too much energy on the status-quo defenders.
Program management in Linux is an insult to software engineering. I’m glad that I’m not the only person that can see this.
Have a nice day sir.
Yes, I’m seeing what you mean! lol
Oh please …
repeating it over and over does not make it true. Everything in a single package is a huge waste of resources:
Disk, because you have the same libraries several times on your disk (and this is actually relevant if you use an SSD, for example).
RAM, because I’d rather use my RAM for useful things instead of keeping multiple copies of the same library in it.
Bandwidth, because if a library that is part of a lot of packages needs a security upgrade, you suddenly have to download all of those packages. Now let those packages be things like Photoshop, and suddenly you’re looking at multi-GB downloads. I’m not even talking about the fact that you have to wait for a new version of every one of the packages.
CPU, because every app needs to check for updates. Great use of resources!
Time, because looking for software in places all over the internet is definitely not a better use of my time than looking in one central location.
I can’t believe you’re actually touting all-inclusive packages as the better engineering solution.
Don’t just criticize: what’s your solution? Because the current ways are utterly broken, and that is a relevant part of why Linux does not have wider adoption.
…and the satisfactory alternative here is to a) force the user to wait 6 months and re-install their OS all over again for the newer libraries and apps, or b) resort to an unstable rolling-release distro?
No, the size of modern hard drives is not an excuse for the currently shoddy app-upgrade methodology used by various distros. If you’re running some sort of resource-deprived device, then chances are you need a stripped-down distro rather than a desktop one.
Actually, if the apps you are running use the same library revisions as the core OS, they wouldn’t need to load their own. However, if an app requires a newer version, it already has it within the bundle, and it doesn’t affect the rest of the OS.
Not to mention the fact that you could continue to run the apps as they were delivered with the OS and not worry about upgrading (since you can’t normally anyway) until you upgrade the whole OS. Choice… how great is that? I’ll gladly take app bundles in trade for a few extra MB of drive space or memory allocation.
So what if you have to wait? Chances are the app in question has newer versions of the library in question than the core OS anyway; otherwise it’s using the OS libraries and not loading its own. Also, Photoshop updates are not multiple GB to download on OS X now, so why would they be on Linux? …or maybe that highlights another issue with Linux, where you have to download the entire application over again every time you upgrade, instead of just the parts that have changed.
Are you running a 386 with 16MB of RAM and a 9600 baud modem or something? All these arguments you’re bringing up were barely relevant to resource starved systems of 10 years ago let alone today.
I don’t see what that has to do with anything talked about. Mac users don’t need to do that. The app checks for them. They don’t even need to load a package manager to check, don’t have to configure 3rd party repositories, don’t have to go through repo hell to solve random dependencies if they’re trying to upgrade to newer apps than their current release officially supports, etc…
It’s certainly better than what Linux is currently offering in its attempt to be a modern desktop OS. Your excuses seem to be tied to systems with a performance level I threw out years ago.
You are the one criticizing, making statements like “is a relevant portion of why linux does not have wider adoption” without actually backing it up.
This statement is just rubbish; most people don’t care to always run the latest and greatest. Windows has the worst software management system ever and is the most used OS; by your line of reasoning it should be irrelevant. And finally, everything you want to do, you can already do in pretty much all package-management systems under Linux.
1. You let the system take care of all security updates.
2. If you really need a newer version of a specific piece of software, you either:
2.1 Use the package provided by upstream. Many packages even automatically add their repository to your system (Mendeley, Opera, etc. do this on Ubuntu), and the normal update system takes care of updates.
2.2 If upstream does not provide a package for your system, use the provided static-build tar.gz package. The package then has to take care of updates itself.
2.3 If there is no static build, you can always build from source. Sure, it’s not as easy as double-clicking a package, but if you’re such an advanced user that you need the latest and greatest version of a specific package which does not provide builds, then you should really be familiar with building from source.
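For the curious, route 2.2 can be sketched end to end with a toy tarball. Everything below is invented for illustration (the app name, version, and file layout); the point is just that a static build unpacks anywhere and runs in place, with no repository and no root:

```shell
# Fake a vendor "static build" tarball, then do what a user would do with it.
cd "$(mktemp -d)"
mkdir -p app-1.2
printf '#!/bin/sh\necho "app 1.2 running"\n' > app-1.2/app
chmod +x app-1.2/app
tar czf app-1.2-linux.tar.gz app-1.2   # what the vendor would ship
rm -r app-1.2

tar xzf app-1.2-linux.tar.gz           # the user's entire "install" step...
./app-1.2/app                          # ...and it runs in place, no root needed
```

A real static build would carry its bundled libraries alongside the binary in the same directory, which is exactly why it sidesteps the distro’s dependency resolution.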
First, there are many rolling-release distros which are stable. Second, what package could you possibly need so urgently that you can’t wait 6 months (note that most proprietary software doesn’t bring out new versions very often)? And finally, you can always resort to one of the options mentioned above.
But you can already do that with the current systems in place. If you really want the newest version of a specific piece of software, just use either the packages provided by upstream or a static build.
Your system is vulnerable while you wait. Yeah, I know, Apple users think they are safe even without fixing security issues, because Steve the Great said so.
Again, people complain about memory usage and resource usage all the time. I’d rather have my system actually do work than have 6 update managers running and checking for updates all the time.
Your system finds your software for you, great how does it do that? How does it know what you want?
“Rubbish,” as you say. I hate Windows with a passion, but you can surely install app upgrades more easily than on Linux. You don’t have to worry about installing Windows 7 and then not being able to install a new release of Office 2 weeks later. However, Windows has a whole set of other issues.
No. you. can’t. This has been discussed up thread already.
Great. What are these magic, desktop-ready distros that don’t rely on unstable repositories?
What a ridiculous remark. The better question is why would I run an OS that always leaves me either 6 months in the past or relying on unstable/untested sources for updates?
Library interdependencies are not required to have a central update system. See: iPhone.
Library interdependencies are not needed for this feature either.
They are and they wouldn’t have to be all-inclusive. You could provide some stable system libraries for programs to share and then keep everything else separate.
apt-get install [package] works on most Debian based distros.
Most other distros have their version of this command and/or a GUI version.
Dude we all know about apt-get.
It comes with problems like users being unable to upgrade software until system dependency issues have been resolved. Updating an application should not require a major system update.
Dude. Apparently some of us don’t know much about apt-get.
Please link a few recent examples of such a problem.
Also, please name an OS that is better at handling program uninstalls.
Edited 2010-06-19 07:46 UTC
Ubuntu 8.04 users were unable to upgrade to OpenOffice 3.0. They were told to upgrade their OS to 9.04 if they wanted the latest version of an office suite. That’s ridiculous. 8.04 was released in 2008.
OSX. Most of the time the uninstaller isn’t needed since everything is kept in a single file. But the best implementation of this I have seen is in RISC OS.
I’m just going to take your word on this one, because I can’t find the problem with a shallow web search and you didn’t provide a link.
Yes. That is probably terrible to have to upgrade the whole OS for Open Office. Not having ever had such a problem nor used Ubuntu for more than a few days, I would not know what it entails to upgrade Ubuntu.
However, Ubuntu is only one of hundreds of Linux distros.
Furthermore, what could Open Office 3.0 do that the previous version couldn’t?
Okay. I meant to say “upgrade,” not “uninstall.”
However, I fail to see how searching for a directory in Finder and then deleting it is superior to or quicker than typing apt-get purge [package] in an already open terminal. I also don’t see how the OSX method is better than some GUI package manager — with a package manager, you know you are removing hidden symlinks and user config files.
Also, Gobolinux has everything in a single directory.
That may be due to the fact that you use a rolling-release distro. However, the OO example is not unique to Ubuntu, nor is it even unique to a specific version of Ubuntu. Any number of distros that are not unstable rolling releases also frequently have this issue: Ubuntu, Fedora/Red Hat, OpenSUSE, etc.
This type of problem can affect the ability to upgrade apps both big (OpenOffice) and small. (I once needed to upgrade Rhythmbox because a feature I needed was not available in the current official release of the particular distro I was using. In order to do so, I had to resort to unstable repositories, despite that release of Rhythmbox most certainly not being a major revision, and had to accept “unstable” dependencies just to get the upgrade.)
Isn’t that completely irrelevant? A user shouldn’t have to justify why they want a newer release.
Why would I, or the average user, need an open terminal on a regular basis? …or why should we need to run one app just to delete another one? Nothing is quicker than simply DnD’ing the app bundle and its config file to the trash. Out of all the apps I use on OS X, I think only one (Photoshop) requires any sort of automated help uninstalling it.
Man, this is a totally common problem. I once tried to install a decent media player on CentOS, Songbird. There just wasn’t any version for it. Later when switching to Ubuntu LTS there was a third party package I could use (downloadable through a website that didn’t list the Ubuntu versions by number but by name like ‘Dapper Drake’, great…) but FF and OOo couldn’t be used in their newest incarnations. Awful.
Yeah, why not stick to Claris Works, had everything you’d ever wish…
Seriously, this is an insulting question!
You must be kidding me… Yeah, we all open 10 terminal windows when logged in, just in case…
BTW, I would guess that uninstalling an application is actually a very rare task a typical user performs.
There are hardly any hidden symlinks in OS X applications. And config files aren’t an issue. They are tiny, who cares? Only a nerd would. And he can do the task manually.
Yeah. That’s almost as bad as when Apple requires one to upgrade OSX with almost every new release of Final Cut Pro and other big programs. The main difference is that OSX upgrades can sometimes cost money!
Awful.
Well, is there actually much of a difference between OO 2.8 and OO 3.0?
Gee, I am really sorry if you were insulted.
A lot of Linux power users usually have one terminal open — no kidding.
So what?
Right. Who cares?… because uninstalling apps is easy on OSX: http://discussions.apple.com/thread.jspa?messageID=11572945
I second that. From my point of view, OO 3.2, which recently replaced 2.4 on the Windows 2000 computer I use at work, changed exactly three things in Writer:
1. Look: the toolbar changed slightly, and text and array selection look much nicer now.
2. Startup times: on this old Latitude C840 piece of shit, every reduction in disk accesses is most welcome.
3. Better docx compatibility: who cares? Office 97/2000/XP’s doc is still the standard here, and I open docxes about once a month…
There were some nice additions to Draw in 3.0, though, if I remember correctly. But that’s about all I can think of. It’s still as shitty as ever compared to competitors in the areas of bibliography, table handling, and UI clutter. I was pissed off that Sun called that update a major release, just like when they released VirtualBox 3 (though with the DirectX/OpenGL stuff, VBox 3 could at least somewhat bear this name…)
This works very well for most applications, but for applications that do require an installation package, OS X does *not* provide an uninstall system. Some seemingly simple app bundles will also install a package when they’re first run. That works great for those programs, but many (most?) leave behind remnants in /Library when they are drag-to-trash “uninstalled.”
As a software engineer who is also responsible for packaging cross-platform per-machine software, I can say that Apple does not provide a good uninstall mechanism. To make matters worse, they change the installer with each release such that build scripts tend to break between releases.
Actually, the software author can provide an “uninstall” .pkg that the user can run to remove files if they wish to uninstall. Off the top of my head, Flip4Mac does this as an example. It is run via the OS X Installer and is really a script that simply checks for and removes any files related to the app in question. Adobe does something similar. Even the OS X Developer tools do something similar when you want to remove them.
As far as Apple providing an uninstaller, I’m not really sure that should be their primary responsibility, given that the greatest majority of applications are simply DnD’d to the Applications folder. If the author is doing something that installs a file via Installer, they should provide the mechanism to remove it, imo, and since they can do it via a pkg, it shouldn’t be any harder than making the install package originally.
It is true that most apps have at least one file in ~/Library (the user library), but a quick Spotlight search reveals those (which are generally small and affect nothing if left in place, in the event you should want to re-install the application).
Most Windows applications have uninstallers but still leave the Registry littered with program related entries…
Yes, this is a hack — a workaround.
Sure, it’s not *hard* but it’s an extra thing that’s mildly difficult to get right and can actually litter the receipt database with an “uninstall” package.
Windows installer provides an uninstall mechanism. It actually works pretty well in most cases, and it’s usually the fault of the packager when it fails. It’s not perfect, but it’s pretty darn good.
OS X does not provide this, even though it usually knows what files were installed by the package.
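For what it’s worth, the receipt idea can be sketched generically. The OS X Installer does record per-package file lists in receipts (queryable with `pkgutil` on recent releases); it just ships no removal step, so a vendor “uninstall” pkg ends up being essentially this loop. All file names below are made up purely for the sketch:

```shell
# Toy "uninstall from a receipt": the manifest records what was installed,
# and removal is just replaying that list. Invented file names throughout.
cd "$(mktemp -d)"
mkdir -p Library/Demo
touch Library/Demo/plugin.bundle Library/Demo/helper        # "installed" files
printf 'Library/Demo/plugin.bundle\nLibrary/Demo/helper\n' > receipt.txt

while read -r f; do rm -f "$f"; done < receipt.txt          # remove each entry
ls Library/Demo | wc -l                                     # 0 files left behind
```

Since the OS already holds the manifest, the missing piece is policy, not information.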
Actually, a Mac OS X application typically consists of a number of folders nested within a special folder with the .app extension, which then has specific properties assigned to it, allowing users to double-click on it and have the application run.
Many applications install parts into the /Library/Application Support folder, that you would also have to remove in order to perform a complete removal of the software.
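Roughly, the layout being described looks like this; the skeleton below is built by hand purely for illustration (a real bundle carries more metadata, and any Installer-placed parts live outside it):

```shell
# Skeleton of a typical .app bundle, built by hand to show its shape.
cd "$(mktemp -d)"
mkdir -p Demo.app/Contents/MacOS        # the actual executable lives here
mkdir -p Demo.app/Contents/Resources    # icons, nibs, localized strings
mkdir -p Demo.app/Contents/Frameworks   # libraries the app bundles privately
touch Demo.app/Contents/Info.plist      # metadata the Finder reads
find Demo.app | sort
```

The Frameworks directory is what lets a bundle carry its own newer library versions without touching the rest of the system.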
That really doesn’t provide a solution to the previously mentioned issue afflicting Linux distros.
It always provided flawless upgrades for me and countless others.
There’s also aptitude, which is touted by many as better than apt-get.
Perhaps somebody could go into detail about a few specific upgrade problems.
I’m sorry but I don’t think you’ve been reading this discussion very closely. The problem is endemic to the majority of popular, if not all, Linux distros in general. This is not a case of isolated issues, it’s a problem with the methodology at a basic level really.
There are workarounds that are successful to various degrees, but as highlighted in this thread, none of them solves the issue completely or satisfactorily.
Yes, it is a methodological problem; more specifically, a policy problem. Most apps really don’t need a base-system upgrade to be upgraded. Distros just choose not to do app upgrades because it’s easier for them, or they are idiots (not sure which is true yet). If major system libraries do need to be upgraded, there is no reason why a distro can’t have side-by-side installations of different versions of libraries. Windows does this. ELF certainly supports versioned libraries. The distros just don’t do it. Why, I don’t know. But it’s definitely not a technical problem. It’s an idiocy problem.
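The side-by-side claim is easy to picture. ELF sonames put the major version into the file name, so two versions can sit in the same directory; the mock-up below uses empty files and an invented library name (real systems maintain these links via ldconfig as packages are installed):

```shell
# Two versions of the same library coexisting, soname-style (mock files).
cd "$(mktemp -d)"
touch libfoo.so.1.0.0 libfoo.so.2.0.0   # both real versions on disk
ln -s libfoo.so.1.0.0 libfoo.so.1       # old apps were linked against .so.1
ln -s libfoo.so.2.0.0 libfoo.so.2       # new apps resolve .so.2 instead
ls -1 libfoo.so.?                       # both sonames, no conflict
```

Apps built against `libfoo.so.1` keep loading the old version while newly built apps pick up `libfoo.so.2`, which is the whole point of soname versioning.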
Personal attack… okay.
I’ll tell you what I am reading — I see a lot of vague conjecture without any hard facts to back it up.
I’m sorry but I don’t think you know what you’re talking about. There is a great variety in basic methodology of Linux distros.
Furthermore, I have noticed that the only supporting arguments that you provide seem to focus on Ubuntu, and even those assertions lack solid detail and links to actual forum complaints. It is a fatal flaw to assume that Ubuntu is the whole of Linux. Ubuntu and its derivatives make up only a small fraction of the many, varied Linux distros.
Ubuntu might have the few package manager problems that you mention (I wouldn’t know, having never really used it that much). However, most other distros do not have such problems, and that includes the non-Ubuntu Debian-based distros.
I can tell you from first hand experience that the repository of my 2009-01 version of Sidux continually and actively updates its packages. I do an apt-get update every week or two, and I have to wait for it to finish updating. Never had any problems with an upgrade from the sidux repos.
In addition, I use non-sidux repositories, with almost no trouble. The independent Debian Multimedia repository ( http://debian-multimedia.org/ ) is updated daily. Certainly, there will be occasional problems when changing builds on a daily basis, but I recall having only one upgrade problem with a package from this repository, using sidux, and it was automatically fixed when I upgraded the package a few days later.
Any distro based on Debian Sid will have very recent versions in its repo.
Also, there are plenty of other non-Debian, cutting-edge distros which have no problem upgrading packages to the latest version. Fedora, Arch, Gentoo immediately come to mind, but there are many others.
Actually, it wasn’t a personal attack; it was an observation based on the fact that you offered up an “answer” that did not, in any way, shape or form, provide a solution to the issue discussed.
In fact, you just did the exact same thing *again* with this reply and proved my point in the process. Sidux is a HORRIBLE example to bring to the table as it’s yet another “rolling-release” distro, based on the “unstable” branch of Debian.
So, it’s not a matter of not knowing what I’m talking about, it seems you don’t comprehend what I’m saying (or are intentionally ignoring the crux of the problem). Seeing as most of the other people I’ve been conversing with in this thread do “get it” – whether they consider it a detriment to their personal usage or not, then I can only suggest yet again that you start reading from the beginning. I’m not going to repeat myself and I’ve already provided clear real world issues with the way distros handle this. I don’t need to link to forum complaints as I’ve experienced everything I’ve said first hand, time and time again, across multiple distros since I first began testing Linux in the 1990’s. Other than band-aids here or there, it really hasn’t improved that much since the advent of repositories.
When all Linux can offer up to this are PPAs and rolling releases, it’s a mark of just how far behind “desktop Linux” really continues to be… If you’re happy with that, more power to you, but it has made Linux unsuitable for my own needs and my clients’.
I see nothing but vague, general statements and a refusal to support them with facts.
By the way, Sidux is not a rolling release, and the fact that it works so flawlessly as a distro based on Debian unstable pretty much proves my point on the lack of upgrade problems with Linux.
Umm, unless something has changed in the last few minutes, Sidux very much is considered a rolling release based on Debian Unstable.
Okay. I was wrong, I read the sidux manual and it is a rolling release. One can upgrade the entire distro to the current packages by issuing a single command.
However, I don’t see how that capability hinders anything. One can still upgrade individual packages with no problems.
So, again, how is it that Linux has package upgrade problems?
A “rolling release” is, in theory, a great idea. However, I think virtually all rolling releases are inherently unstable. Sidux tries to get around this by providing certain custom packages when something is borked in Debian “Unstable” but the fact is that rolling releases eventually break if allowed to upgrade willy-nilly.
You can’t, typically, upgrade individual packages without upgrading dependencies that also affect other apps in the system. If you want a stable system, you have to pick and choose carefully.
You can’t provide unstable desktop systems to the average business user, to people that require typically trouble free computing.
This has been discussed repeatedly in this thread and others. It’s a common issue across virtually all distros to one degree or another. The issue is not proving there’s a problem – that discussion is 10 years old already. No, the issue is that no distro has solved this yet – and I dare say they won’t until Linux moves to app bundles.
Let’s totally reduce this to something utterly silly that everyone can understand, and apologies if I make anyone hungry:
Linux distro = plain hamburger, cheese = new application.
Customer A: I just ordered a hamburger and I’d like to add cheese.
Linux King Waiter: I’m sorry sir, the current plain hamburger we have available does not offer cheese.
Customer A: .. but I see a stack of Kraft cheese slices right there behind the counter.
Waiter: I’m sorry sir, those cheese slices are being reserved for a future version of our hamburger.
Customer A: Can’t you just give me one slice anyway. All I want is that one slice.
Waiter: I’m sorry sir, that’s against policy. We do offer our Rolling Release Bacon Veggie Burger. It’s a vegetable-based burger patty with cheese, relish, mustard, tomato, bacon, fish sticks, onion rings and a side order of clam strips, and you can have as many as you want for the same price.
Customer A: …but I already have this perfectly good burger in my hands and I don’t want all that additional stuff added to my existing burger just to get cheese. It’s messy. I just want to add cheese to my existing hamburger.
Customer B (whispers to A): Hey, you know you can just add cheese yourself after you get the hamburger. I’ve got some sitting out in my car. It’s all good.
Customer A: Really? And it’s the same cheese?
Customer B: No, dude, it doesn’t come from Kraft. I churned this cheese myself but it should taste okay… You just need to change out the buns on your existing burger cuz this cheese is shaped a little differently. Oh, it might have chunks.
(Customer A declines and goes to the restaurant next door)
Customer A: I just ordered a hamburger and I’d like to add cheese.
MACdonalds waiter: Why certainly, here’s an individually wrapped Kraft cheese slice just for you…. You can place it anywhere in the hamburger you like, anytime. Enjoy!
It’s great to say that, but saying it doesn’t make it true.
My actual experience has been quite the opposite.
Again, I can upgrade packages all day long with no problems.
Uh, isn’t that one of the main reasons why there are package managers — to handle dependency conflicts? Certainly, different package managers handle these problems in differing ways. Some are more informative than others, some are more thorough, etc.
This is a very general comment.
So, Windows 3.1 was used by most average business users. Are you claiming that Windows 3.1 is more stable than Linux currently is?
Sorry. Just saying that it is a problem doesn’t make it so. Furthermore problems that may have existed ten years ago may not exist today.
I think that you have a solution looking for a problem.
Also, I don’t think that app bundles will solve your “problem,” because I have already linked several severe upgrade problems with OSX app bundles, and there are countless more that I did not link.
I had forgotten, but I have also seen such problems firsthand quite a few times with OSX and big programs, such as Final Cut Pro.
But you don’t have to take my word for it. A simple web search quickly reveals this very problem: http://www.apple.com/finalcutstudio/download/
This is the page for the latest updates to all Final Cut Studio (2009) applications. Scroll down to “Requirements to install all Final Cut Studio applications.” Note the sixth and seventh bulleted items: Mac OS X v10.5.8 or later; and QuickTime 7.6.2 or later.
Do you know what that means? If you have OSX 10.4.8, you are S.O.L.! You have to upgrade your proprietary OS that uses app bundles, just like you claim it has to be done in Linux.
The main difference is that it is very possible that you might have to shell out money for OSX!
Of course, you have to update Quicktime (as specified), too, but who knows what other software will have to be updated after the new version of OSX is installed, and all software that has to be reinstalled will probably have to be done with disks! Not an attractive proposition.
You wouldn’t believe the number of editors that I have seen screaming bloody murder because they have had to upgrade their entire OS just to run a newer version of FCP.
These upgrade requirements escalate with almost every FCP release, and I’m sure that Apple and other software vendors have the same escalation with other big programs.
So, app bundles will not solve the “problem.”
Let’s not, because that rather involved and confusing analogy also has to apply to app bundles for OSX (and probably for Windows programs, as well), as shown above.
…and a comment like that means absolutely nothing! Whether I say it or not, it’s already true. Sidux is a rolling release based on “UNSTABLE”. They call it that for a reason, for starters. As for the myriad other problems, you are free to read about them on the ’net. The fact that you didn’t even know it was a rolling release until I told you (and you supposedly use this release) doesn’t inspire confidence in anything you say. Last I checked, Sidux includes a rather comprehensive manual. You really ought to read it sometime.
Should we trust a single person’s experience?
Uh, no one should need to be installing dependencies at the OS level just to install an app. The fact that we need “managers” just to avoid dependency hell is a telling point as to how creaky this whole foundation is…
It’s general because it’s true on a widespread level, especially with rolling releases based on Debian UNSTABLE. It’s so unstable that Sidux has to set up their own repositories in order to avoid all the broken/defective packages coming down the pipe.
I’m sorry, what year is this again? I’m quite sure Windows 3.1 was released in 1992. Do you want to compare it to the version of Linux available in 1992? Haha!
Oh, sadly, it’s still very much true. We may have better “managers,” more safeguards (custom packages and repositories), and other add-ons (PPAs), but it’s a bit like putting band-aids on broken legs at this point.
Why do I have to replace core files of the OS on a daily basis just to install apps???
I’m beginning to think you are a fanboi trying to remain in denial. You didn’t even know the distro you were running was a rolling release a day ago, and I’m supposed to believe you have any knowledgeable experience with a problem that is widespread and has been well known for well over a decade? lol
I’m sorry but hunting and pecking around on forums and sites that are dedicated to users who have problems does NOT indicate that a particular user’s issue is a widespread one.
I’ve been running OS X since the week of its release in the spring of 2001 (and much earlier, actually, if you consider that I’ve also run NeXT’s OPENSTEP – I’m sure you know how OPENSTEP and OS X are connected, right?) and have never had any of those problems. My experience has been quite the opposite of those you cite – it’s been trouble-free except for a bad video card in a G4 Cube 7 years ago or so.
Are you serious? lol Let’s see, OS X 10.4 is FIVE YEARS OLD. There’s a hell of a big difference between running the same OS for 5 years and finally having to upgrade versus installing a Linux distro, then not being able to upgrade to the latest version of OpenOffice a WEEK later.
The average computer has a functional desktop lifespan of 2-5 years (depending on how good of a system you purchased to begin with…). Most people will have upgraded their computer before a particular version of OS X becomes obsolete (this is also true of Windows).
There is NO release of Linux, which I’m aware of, that is supported for 5 years. Even Ubuntu’s LTS releases with 3 years of desktop support don’t provide any special accommodations to install software NEWER than what shipped with the release.
So, what you are bringing up as evidence is irrelevant, and the situations are not comparable. When a 4-year-old copy of Linux can support the newest versions of applications without having to dramatically modify the core OS for each app, you let me know. Can you install the latest version of Firefox on Fedora Core 4 without upgrading the OS?
Uh, no, it’s very likely I will have naturally upgraded my computer in 5 years time and have gotten the latest edition of OS X for free with the computer.
However, money issues are irrelevant. I’ve never cared that Linux is “free” and used to shell out money for new editions of Red Hat every year.
Otoh, one could argue that if it weren’t free, the majority of people wouldn’t even consider Linux.
OMG! CDs and DVDs are so heavy! I think I’ll need help.
You’re right, I don’t believe you… and if they’ve been running the same OS for 5 years then they should consider themselves lucky. They might be on Linux and have to update every few months to run the newest versions of software.
Considering that every release of OS X is supported, on average, for at least 4 years then I’d say it’s solved the problem fairly well. As I said, when a specific Linux distro can do that, you let me know.
Yes, involved and confusing… just like the myriad bandaids all over the various app upgrade methods employed by Linux desktops.
Look, I’m not against ever upgrading/updating… but one shouldn’t have to do this for every app they install. …and one shouldn’t need a bunch of package managers and PPA’s just to get the job done.
The installation of apps and the installation of OS updates should be a reasonably separate process.
It means that you probably should start linking facts/articles a lot more, instead of just spouting notions.
OMG! The repository is called, “U-N-S-T-A-B-L-E!!!”
Let’s put the Debian repository-naming scheme into perspective. It’s kind of like the U.S. Department Of Homeland Security “Advisory System.” Most of the time, we exist in the second highest DHS warning level: “High.” However, that doesn’t mean that everyone should move to Canada just because they keep the warning at the “High” level. There have been very few instances of actual terrorist attacks in the USA.
Likewise, sidux (and other Debian Sid distros) warns you about potential problems with the newer, untested packages, but there are very few that actually occur, which is fine for the desktop. Sidux is rock solid compared to the myriad of problems experienced by OSX (instances have been linked elsewhere in this thread) and Windows.
By the way, in sidux, one can use a package manager to install packages from the Debian “Stable” repository.
Sidux is a “rolling release” merely because it has a single command that allows one to upgrade the entire system and packages to the most recent (sometimes daily) builds. Although this is a great feature (one which Apple and Windows will never have), it really is incidental to sidux. Sidux has regular, solid, comprehensive “distribution releases” (along with beta “preview” releases), and most sidux users probably install such a regular system release and, then, merely update individual packages, as seen fit — just like other “non-rolling-release” distros.
You really should work on your reading comprehension, as I already stated in this thread that I found the rolling release command mentioned in the sidux manual.
I do agree that sidux has a great manual, just like many other solid Linux distros.
As I have repeatedly stated, you don’t have to take my word for anything that I say. I have provided numerous links that show problems with OSX (and a few for Windows). I say that there are more OSX (and Windows) package upgrade problems per user than there are in Linux. All one has to do to verify my assertions is to go to the sidux forum (or some other distro’s forum) and find similar complaints and compare the numbers/percentages to those of OSX.
But I will tell you in advance, that you will find far fewer of the same problems and a lower percentage of problems per user. Go ahead and check, if you doubt my word.
First of all, the typical Linux user doesn’t encounter any of the inner workings of the OS level.
Secondly, there are several Linux distros that can/do keep apps in one file/directory, such as the aforementioned Gobolinux, and others (which I cannot recall at this moment). It is not inherent in Linux to have the “apps” handled in a single, standard manner.
Thirdly, by “installing dependencies at the OS level” do you mean merely separating library/config/component files from the location of the executable file? If so, OSX and Windows enjoy/suffer the same condition. Some large OSX apps require a separate uninstaller program, for example: http://www.digitalrebellion.com/fcs_remover.htm So, what are you going on about? Don’t pretend that all OSX programs install themselves as tidy, self-contained bundles.
Another thing, if OSX’s larger, advanced apps are designed to spread themselves outside of their directory and all over the system, then, there must be a good reason for it. You wouldn’t want to disparage Steve Jobs’ methods, would you? So, fragmenting the apps throughout the system is a good thing. :^)
You see, the developers of most Linux distros are keen to the advantages of spreading application components, and such distros utilize these advantages even more extensively than does OSX.
We don’t — again, if package managers bother you, you can use Gobolinux, etc.
However, there are efficiency advantages to using a package manager system (hence, FCP fragmentation), and in many cases, using repositories ensures stability.
Once again, I have (and will, subsequently) link numerous, recent stability problems with the OSX app bundle system.
By the way, I have used both types of package handling scenarios, and each has its advantages. Certainly, one is no better than the other. I prefer using a package manager, because it is so quick and easy.
It is definitely untrue! There are countless successful desktop Linux installations all over the world right now being used by absolute newbies who have no inclination of “picking and choosing carefully.”
More and more hardware companies now offer machines with a desktop Linux distro preinstalled: Dell, Toshiba, Lenovo, Acer, ASUS, and HP, to name a few.
Entire governments have converted all of their desktops to Linux, and, apart from initial complaints of unfamiliarity, hardly a peep is heard; people go right back to work. Likewise, companies large and small have converted all of their user systems to Linux, with the same positive results.
With Linux, the rate of app-installation problems per user is the same as, or lower than, that experienced with OSX and Windows.
Regarding sidux being a rolling release based on the Debian Unstable repository, again, I can only tell you of its shining success, based on my own four years of experience.
Perhaps the sidux repository you mention is one of the reasons why sidux works so flawlessly.
Just because a single person, you, has had a trouble-free experience does not make it the rule.
Sidux has so many update problems that sidux.com has a forum dedicated solely to “Upgrade Warnings”:
http://www.sidux.com/index.php?name=PNphpBB2&file=viewforum&f=29
Also, the Debian branch is called UNSTABLE because it’s intended as the development branch. It’s not intended for those who need a rock-solid desktop environment.
A completely unsubstantiated claim. You complain that my statements are vague, unsupported, etc, and you come out with this humdinger. lol
Also, link wars are silly and prove nothing at all. The user base of any single Linux distro, and indeed of all of them together, is an order of magnitude smaller on the desktop than that of Mac OS X or Windows.
In addition, the “type” of person using Linux is drastically different. There are a lot of tech heads using Linux and a lot of non-tech people using Mac OS X and Windows. This is not to suggest that there are no technically inclined people on those platforms; rather, being more mainstream and more readily available, they are used by less knowledgeable people who often never venture beyond what they know and who can’t effectively troubleshoot their own issues.
I’m sure you can, but if you run the entire installation off of Debian Stable then it’s not really Sidux anymore, is it?
No, that’s not really the meaning of a rolling release. Lots of distros have this feature, Ubuntu included.
Releases of “rolling release” distros are, more often than not, merely snapshots of the distribution at the time of release, not heavily tested and polished the way a normal release is. Sidux, being sourced from Debian Unstable, the development branch, is just such a rolling release. Running these rolling releases in a production environment is liable to cause problems, and they cannot be well supported.
Sure, you can “freeze” your installation, but then why run a rolling release to begin with? You might as well run something that has official support and is better tested before it’s released to the world.
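For what it’s worth, “freezing” an unstable-based install like that is usually done with apt pinning rather than anything sidux-specific. A sketch (the file name and priority value are illustrative, not taken from any sidux documentation):

```
# /etc/apt/preferences.d/freeze  (illustrative sketch)
# A pin priority below 100 means apt keeps already-installed packages
# but will not upgrade them from the unstable branch automatically.
Package: *
Pin: release a=unstable
Pin-Priority: 90
```

Of course, once a system is pinned down like this, the one benefit a rolling release offered is gone.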
…and apparently you stopped there… keep going, keep going!
Nope, you can’t quantify that at all, for many many reasons including the few I’ve mentioned above.
It’s irrelevant whether they see it or not. It still has the same issues I pointed out originally.
GoboLinux is indeed very different from most distros; however, I don’t believe what it’s doing is quite the same thing as what’s being discussed here. At least they are trying to organize the filesystem into something that makes a little more sense.
Also, IIRC, GoboLinux installs apps from source, not binaries, so that’s even FURTHER away from the issues here.
I never once claimed that ALL OS X apps do this. However, the great majority do, and I even select which apps I use and install partly on the basis of this feature.
Moreover, even when apps install components outside of the Applications directory, those components still go into areas specifically designed for app storage, not into the core OS. They don’t, as a rule, go “all over the system.”
I think DnD is pretty darn easy. I think installing an app on OS X and not having to install dependencies is pretty grand, too.
…and, no doubt, a great majority aren’t using unstable distros. However, when they go to upgrade that OpenOffice or Rhythmbox, they may very well find they have to update the whole OS, too.
Unless you have data from an independent study then this comment and the prior ones that I skipped are completely unfounded.
I can say the same about OS X for the last 9 years.
Well, it’s not flawless, as can be seen in the sidux.com forums. However, I’m sure it saves sidux from being more of a headache than it is. I could never call a distro solid or flawless when simply running a dist-upgrade on the wrong day could deliver a system-crippling package.
x-server dies during startup
http://www.sidux.com/index.php?name=PNphpBB2&file=viewtopic&t=21013
You have a special gift for missing the point.
As you may recall, you made another vague, general declaration: “You can’t provide unstable desktop systems to the average business user, to people that require typically trouble free computing.”
I mentioned the fact that Windows 3.1 was once the dominant desktop, which “hints” that perhaps your broad, nebulous declaration is in no way absolute, and, thus, dubious. That is the point.
Furthermore, desktop Linux is more stable than either OSX or Windows. Just go to the forums and compare the numbers and percentages of complaints. If I were to guess from my experience, I would say that, currently, Windows has fewer system crashes than OSX does.
Impressive of you to just say that again.
Incidentally, I never acknowledged that any Linux problems existed ten years ago.
Again, continual general statements — no specifics and no links.
I wouldn’t know — I don’t use OSX!
Okay. You were correct about one little detail, and I admitted that I was wrong about it, and now you cling to it for dear life as the only support for your flimsy generalized notions.
Well, that little detail is incidental. It is so trivial that I (and surely many other sidux users) missed the memo that a distro having an easy system upgrade command constitutes a “rolling release.” A lot of sidux users blissfully run it as a standard, non-rolling-release, without any trouble.
You want to know something else, according to the above definition, most distros would be considered to be “rolling-release” just by adding the same system updating script used by sidux!
Perhaps you should stop harping on this detail and start providing a little more solidity and precision with your other arguments.
I provided numerous links pertaining specifically to upgrade problems posted in the last couple of days. You can try to find the same problems on the forum of any major Linux distro (even Ubuntu), but you will very likely find that the numbers and percentages of complaints on the Linux forums are dramatically lower.
Here are several more from the same recent time period, concerning the upgrading issue:
Problems after upgrade to 4.03 creating events from address book: http://discussions.apple.com/thread.jspa;jsessionid=37251A671B753B2…
Fix 9.2 upgrade: http://discussions.apple.com/thread.jspa;jsessionid=09E5C53FD8F16BD…
Software upgrade problem: http://discussions.apple.com/thread.jspa;jsessionid=D54E8ADD6DE9961…
Problem opening mail attachments after snow upgrade: http://discussions.apple.com/thread.jspa;jsessionid=B79524BD9DC2DB3…
Very frustrating problems since OS 10.6.3 upgrades: http://discussions.apple.com/thread.jspa;jsessionid=439B5A7DADD0E60…
Error -15000 since upgrading to 10.6.4 and Itunes 9.2: http://discussions.apple.com/thread.jspa;jsessionid=9CCFF5025B61208…
Upgraded Mac Mini to 10.6.4 now have sound problems: http://discussions.apple.com/thread.jspa;jsessionid=3D05FE5FD2AD5BA…
problem with disappearing emails since the upgrade this morning: http://discussions.apple.com/thread.jspa;jsessionid=70C1554E8BBE6DD…
These selections were culled. If you would care to see a whole lot more, go to http://discussions.apple.com/ and search for “upgrade problem” and “update problem.” Sort the search by date, and you will see the significant density of upgrade difficulties suffered by Apple users.
I know. I could smell it.
User adoration of Supreme Leader?
You obviously had better luck than all of those posting upgrade problems on the Apple forums.
I don’t spend my time studying OSX version numbers, but the specific number doesn’t matter; the point is that any version earlier than the one Apple allows means that some faithful Mac chump is S.O.L.! The version Apple disallows might be the very one that came out just before they posted those restrictions, and that is EXACTLY what I have seen happen to FCP users on several occasions. And they scream bloody murder!
The same problem is inherent in all Apple app bundles, because they all have continually escalating minimum requirements. Enjoy your upgrade tinkering and your regular software payments!
Not so with Linux. I’ve got SliTaz running fine on my 1998 Toshiba. There are countless other stories of old hardware being rejuvenated with modern versions of Linux. There are many organizations everywhere that recycle working older computers and give them to deserving, underprivileged kids and families: http://www.heliosinitiative.org/news.php
Perhaps you should inform these underprivileged kids of the fatal stability problem of the OS in which they have been doing their homework and enjoying for years without incident. Tell them that they should get a Mac instead.
Of course, one can’t use OSX or Windows to refurbish older computers. Aside from the significant expense, the modern, bloated versions of Windows and OSX would bog down the old hardware. It goes without saying that it would be difficult or impossible just to install OSX on older hardware (not to mention the legal ramifications of such installations).
Enjoy your constant hardware/OS upgrade cycle!
Your point is deeply and utterly flawed. What might have been considered a mainstream “stable” desktop in 1992 does not necessarily apply today. I’m not surprised you lack this understanding, however; you seem to lack it when it comes to Linux and stable vs. developmental repositories, as well.
Completely unfounded remarks, and as I pointed out in a previous reply your flawed pointing at support forums can’t be used to reliably support such statements.
That’s quaint, but considering your lack of understanding about what you yourself are running and how it operates, the fact that you don’t seem to fully understand what a rolling release is, and your impressively flawed means of backing up unfounded statements, I think it’s a bit like the pot calling the kettle black in regards to generalized notions.
They missed that memo because you got it wrong – again.
Considering how small the user base is, I’d be hard pressed to use the phrase “a lot,” but, regardless, a lot are also not in blissful repose, and they’re still running a system based on an untested, less-than-stable, developmental repository. But hey, if you don’t mind running a distro that’s forever in “beta,” then go for it! I’ve been there, and it can be fun; just not entirely stable.
…but your “definition” is wrong, or at least very incomplete.
You keep tossing this unfounded chestnut out and it continues to be flawed as previously shown.
Yes, those are wonderful arguments to support your suppositions. They make you seem so knowledgeable… or something.
Yes, the millions of people not having upgrade problems probably aren’t posting about upgrade problems.
Are you aware of how impossible that statement truly is? Now I know you just like to make stuff up.
You just claimed that a version of FCP could be released that will NOT run on a current version of OS X, also just released. So Apple is releasing software for versions of OS X that have not been made available yet???? Are they using a TARDIS to obtain these copies of FCP? lol
This has nothing to do with app bundles at all. Virtually all software has continually escalating requirements overall, yes, even Linux. And even if Apple software had artificially escalated requirements, that would have nothing to do with using app bundles. It seems like your arguments are fraying at the edges here.
I don’t generally have to tinker with my upgrades on OS X – EVER. It’s been trouble free for 9 years now.
Nothing wrong with people making money for their work.
This is quite true, Linux is very flexible and can be stripped down for older equipment fairly easily. Indeed, there are many distros made for just such a purpose and when you are dealing with obsolete hardware it’s less of an issue if you are stuck running older software on it.
However, this fact doesn’t resolve any of the issues that make it less suitable for modern, mainstream desktop use.
Oh you must be confused. OS X only gets upgraded once every 2 years. It’s Linux that must be constantly updated, like a flowing river, to run the latest apps.
Actually, you’re wrong as usual. Linux isn’t an app-driven OS.
It’s a *FEATURE*-driven OS. People like you don’t understand the difference and why Linux works the way it does, and thus Linux confuses the hell out of you due to your (quite) limited mental abilities.
Let me explain. People don’t usually download and install Linux or the BSDs because of Firefox, OpenOffice, or any of the other reasons *CLUELESS PINHEADS* like yourself think they do.
They download and install them *BECAUSE THEY WANT TO*
Like I said this is a reason losers like yourself will *NEVER* get or understand.
Does your mother know you are on the ‘net spouting nonsense again? Your posts on this thread have, by far, been the most ignorant, pointless and childish ones here. You’re completely off the mark every time but then I imagine there are few targets you can hit.
Ahh well, I hear guys act like big blowhards when they have teeny, tiny little… distros.
As I just pointed out, your hardware remains useful much longer with Linux. And one can run a Linux install longer on a machine, without the creeping system corruption and escalating upgrade problems of OSX/Windows.
Also, I have run Linux kernel 2.2 with software for kernel 2.4. I am sure that I could do the same with some programs and version 2.6. In addition, when dual booting multiple Linux distros, I have been able to run the apps compiled for another distro on the booted distro.
Certainly, there are current programs that will run fine on earlier versions of Linux. The latest Firefox might not run on Fedora 4, but one could try, something that would be impossible with OSX.
Whether or not your computer/OSX upgrade is “natural,” you are paying for it nevertheless.
I’m sure that the underprivileged beneficiaries of the Helios project would be glad to hear that news!
Or one could make a very strong argument to the contrary, for many valid reasons, not the least of which is that you admitted that you have paid for Red Hat.
Why don’t we hold a re-installation race: you with OSX and all of your app CDs, and me with the latest release of sidux. Sidux takes about ten minutes to fully install, plus a few mouse clicks in the package manager to restore some extra software. I estimate that the entire sidux re-install will finish about ten minutes before you get your first OSX registration screen!
Or, I could just issue the sidux system update command and have a beer, while you are trying to install OSX.
Upgrading Apple’s FCP on OSX is fraught with problems, but you don’t have to take my word for it; I provide facts/links:
Large project crashing after Snow Leopard upgrade: http://www.lafcpug.org/phorum/read.php?1,248950,248967
upgrade to studio 3: http://www.lafcpug.org/phorum/read.php?1,254971,254971
upgrade OS now DSP errors: http://www.lafcpug.org/phorum/read.php?5,246626,246626
And from the Creative Cow forums:
HELP – Final Cut Studio 3 Upgrade won’t install: http://forums.creativecow.net/thread/8/1048023
Problem with Final Cut Studio install: http://forums.creativecow.net/thread/8/1090227#1090291
Final Cut Studio 3 Upgrade Installation: http://forums.creativecow.net/thread/8/1060933#1060950
It has gotten so bad with FCP and OSX, that the FCPLUGs are recommending a complete wipe of the disk, followed by a total system reinstall — just to upgrade FCP!: http://www.lafcpug.org/basic_troubleshooting_fcp.html#anchor1194942
The problem is so bad that FCP editors are afraid to even consider upgrading.
I don’t get the point here. At least with Linux, one has the opportunity to update often; with OSX/Windows, one is at the mercy of the vendor’s timetable.
Considering the countless OSX upgrade problems that one finds with a simple forum search, I’d say that the problem is not solved.
Let me know when the OSX upgrade problems cease on the boards.
Until I encountered this thread, it never occurred to me that the RDF could so strongly affect one’s perception of non-Apple items. These posts have certainly opened my eyes to that phenomenon — thanks!
You actually didn’t refute a single thing I just said. There is NO release of Linux that I’m aware of that is supported for 5 years. Even Ubuntu’s LTS releases, with 3 years of desktop support, don’t provide any special accommodations to install software NEWER than what shipped with the release.
So, even on older hardware you still have to continually update Linux at a much faster pace than OS X or Windows in order to run the latest software. In fact, it’s worse with Linux, because the great majority of the reasons WHY you must upgrade Linux so often are entirely related to policy, not to true technical reasons.
On one end of the spectrum you see Ubuntu, which must be upgraded to a new release every 6 months to continue running the latest apps. On the other end you have to run a rolling release, based on an unstable developmental repository and continually update your system with untested packages.
An Ubuntu user will have had to upgrade their OS 4 times in the time it took an OS X user to do it maybe once. However, since a great deal of software can be installed on versions of OS X that are 4 years old, a Mac user could easily choose to skip a particular release and wait another two years to upgrade, while that same Ubuntu user may have had to upgrade 7 or 8 times in that period.
I can’t speak to that. I’ve never had corruption or upgrade problems on OS X, as previously mentioned. I’ve never had to constantly upgrade it, either. It’s incredibly stable.
In fact, it’s not impossible. I haven’t been keeping track of it, but there were custom builds made specifically for older versions that were regularly updated for quite some time. And, of course, thanks to Fink, OS X users can port *NIX and Debian packages and run them under older versions of OS X. It’s actually better this way, as Fink has a strict policy of avoiding interference with the base system (OS X). So, Fink uses a separate directory, in accordance with the Mac ideal of not installing crap “all over the place.”
Yes, indeed, new computers aren’t free.
We’re talking about mainstream desktop usage here. The fact that Linux is suitable for specific tasks does not resolve its failings as a modern desktop OS for the average user.
I never claimed to be a typical computer user so I’m afraid that doesn’t support that statement at all. I service computers for a living. Back when it looked like Linux might get a foothold on the desktop I was quite interested in it, both personally and for potential support. Linux seems to have lost that chance and they’ve never corrected the most fundamental problem, for me, so it’s become less enticing.
Speed of installation, again, doesn’t negate the issues Linux has as explored in the original posts. However, I’ll bet you spend a lot more time “updating” and “upgrading” over a 2 year period than I do.
Interesting fact; Those having problems tend to have problems. Also, those having problems tend to complain. The millions of users that do NOT have problems, aren’t whining on forums and lists every day. They’re actually getting work done.
However, even if a particular release of FCP happens to be buggy, that’s not indicative of app bundles nor does it absolve Linux of upgrade issues.
Actually, no, as OS X and Windows users don’t have to wait around for the next release of the OS to run newer versions of their applications that already exist.
See, in the mainstream world, where the average desktop OS is used, a new release of the OS is made available and THEN new applications come out to run on it. Under Linux it’s the opposite: new apps are released frequently, and you must WAIT to use them until the distro’s maintainers decide to make them available (or run a potentially unstable rolling-release distro).
Yup, as I said, your argument is fraying at the edges. I’m sorry to break it to you, but I’m not an Apple fan to the extent that you are a Linux fanboi, nor do I worship at Steve Jobs’ feet. I *do* think OS X is the most suitable desktop OS currently available, but I’m not one to jump at everything Apple offers. I chose a Palm Pre over an iPhone (the Pre runs Linux, btw, and I think it actually offers a better “desktop” interface than anything seen on Linux yet), hate the iPad, and hate most iMacs. Prior to OS X I was running Windows NT, IBM’s OS/2, NeXT, DESQview/X, etc. I pick and choose the bits that work for me, personally.
If you’re going to go the lame route and choose silly talk of RDF over any real talking points, you might as well stop replying now. You’ve already devolved to the stereotypical Linux geek’s response list of how great the OS is for obscure and obsolete uses.
Completely seconded; would mod this up if I were allowed to. The cheeseburger analogy is spot on.
That’s because Windows or Mac OS X will require you to run out and spend $$$$ on the latest version of the OS, or buy a freaking new computer.
Don’t agree? Try running the latest versions of either VLC or Firefox on Windows 98SE, for instance. There’s quite frankly no reason that I can see for either of these programs to *NOT* run under Win98SE, except that the developers wanted to force people to upgrade for no real reason.
Are you kidding? Virtually everything I install on my Mac is via simple DnD. Virtually every app I run has the ability to download updates and upgrade themselves. All of this going on without having to update the core of the OS every time to do so.
That scenario is not Bullshit at all. It’s quoted verbatim from Ubuntu documentation as an example.
Oh, so the latest VLC and FireFox will install without any updates to the OS on Red Hat Linux 5.1, released in 1998?
“There’s quite frankly no reason that I can see for either of these programs to *NOT* run under Red Hat Linux 5.1, except that the developers wanted to force people to upgrade for no real reason.” LOL
I will gladly plunk down $99 every 2 years for OS X and not have to worry about installing a whole new version of the OS every 6 months just to have the latest apps. On top of that, I don’t have to worry about an unsupported app installing dependencies that screw with the stability of the system.
Then you are a frigging fool. Did you not just claim you *DID NOT* have to do this?
No, but you must be to think willingly upgrading OS X on my schedule and being able to upgrade apps ANY TIME along the way, independent of that, is somehow equal to being forced to upgrade a distro like Ubuntu every 6 months in order to get the latest apps. It’s absurd.
OS X, and even Windows, offers so much more flexibility and freedom in this area that it’s not even funny.
I can update my apps without updating the core OS (no, toolkits are NOT core OS). Yes, I’m using Fedora, which is more bleeding-edge, but you can always add PPAs on Ubuntu if the Ubuntu repositories don’t provide the newest version.
Unless something has changed, Fedora works similarly to Ubuntu in regards to its repositories. Its policy may be a bit more lax about newer releases of apps appearing quickly, but it’s still a hit-and-miss prospect. Fedora 13 might be released with app X version 1.0; a month later, X is updated to 1.1, and there’s no guarantee F13 users will have easy access to that. They may have to wait for F14. You can shop around on third-party repositories and hope for the best, and then end up in repo hell or be forced to upgrade shared libraries via a dependency…
…and libraries shipped as part of the distro are part of the core desktop OS when they are shared, and they could break other installed apps, or cause random instabilities, if upgraded.
Any Linux distro could end this insanity tomorrow if it allowed for bundled/encapsulated apps that included their own required libraries within, rather than forcing system modifications.
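As a sketch of what such bundled/encapsulated apps could look like: a directory that carries its own private libraries plus a small launcher that points the dynamic loader at them first. All the names below (MyApp, run.sh, myapp) are made up for illustration; this is roughly the idea behind OS X .app bundles, not any distro’s actual mechanism:

```shell
#!/bin/sh
# Illustrative sketch of a self-contained app bundle: the app, its
# private libraries, and a launcher all live in one directory.
BUNDLE=$(mktemp -d)/MyApp          # stands in for e.g. /opt/MyApp
mkdir -p "$BUNDLE/bin" "$BUNDLE/lib"

# Stand-in for the real executable; a real bundle would ship a binary
# here and its private .so files under lib/.
cat > "$BUNDLE/bin/myapp" <<'EOF'
#!/bin/sh
echo "myapp running, library path: $LD_LIBRARY_PATH"
EOF
chmod +x "$BUNDLE/bin/myapp"

# The launcher: put the bundle's own lib/ first on the loader path, so
# system library upgrades can neither break nor be broken by this app.
cat > "$BUNDLE/run.sh" <<'EOF'
#!/bin/sh
HERE=$(dirname "$(readlink -f "$0")")
export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$HERE/bin/myapp" "$@"
EOF
chmod +x "$BUNDLE/run.sh"

"$BUNDLE/run.sh"
# Uninstalling the whole app is then just: rm -rf "$BUNDLE"
```

The obvious trade-off, which the package-manager side of this thread keeps pointing at, is that every bundle carries duplicate copies of libraries that must then be security-patched one bundle at a time.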
Really? Both Windows and Apple have succeeded despite being copies (or “influenced,” if you like) of other UIs.
Oh Really? What popular operating system did they copy from???
Certainly they have borrowed bits and pieces from here and there but nothing as wholesale and artistically bankrupt as we’re seeing here.
Oh Really? What popular operating system did they copy from???
Copying or borrowing doesn’t need to happen with something visual; it can be in the lower-level parts. Like f.ex. Microsoft has grabbed lots of networking-related ideas and even code from IBM, Novell, BSD… And how about the whole windowing idea? You didn’t know that there were windowing systems even before Apple and Microsoft stepped onto the stage? And that they both took lots of ideas from f.ex. Plan 9?
You really need to study computer and Operating System history a tad bit more if you think Microsoft and Apple were the first ones to invent all of their ideas.
Certainly they have borrowed bits and pieces from here and there but nothing as wholesale and artistically bankrupt as we’re seeing here.
As I stated, not everything that is copied or adapted needs to even be visible at all. And yet still, go read f.ex. about Plan 9 at http://en.wikipedia.org/wiki/Plan_9_from_Bell_Labs and you might surprise yourself.
Now, having proved your point moot, I’ll go and introduce my own: why does it matter that ideas are borrowed, copied, and adapted to new systems or circumstances? Nothing in this world would be as it is now if that didn’t happen, because that’s called progress: take something you find good and worth implementing, and try to make it even better. Or do you really mean that http://en.wikipedia.org/wiki/File:Benz-velo.jpg should be the pinnacle of automobile development and that no one should ever have taken the idea and improved on it?
You didn’t prove my point moot. LOL
In fact, it seems like you didn’t read my original post at all, which was entirely focused on the UI and on the user experience of trying an OS that behaves differently than expected despite looking like a clone.
You’re going off on a tangent unrelated to anything I said, and you’re giving me advice??? lol
Why use “f.ex.” when there is already a perfectly good abbreviation “e.g.” available which is more well-known and considerably less grating than “f.ex.”?
Why use “f.ex.” when there is already a perfectly good abbreviation “e.g.” available which is more well-known and considerably less grating than “f.ex.”?
Because English isn’t my native language? I’ve just never really thought about it, and since “f.ex.” is used by several of the languages I speak, I’ve just gotten accustomed to it. But sure, if it really bothers you so much that you need to make an entire off-topic comment about it, I’ll try my best to memorize the “e.g.” abbreviation.
Yes, it’s true that Linux is easier to use than it’s ever been. But the very simple reason it’s not ready for mass consumption is that it more or less *requires* the CLI and/or editing config files by hand in order to get anything done that’s remotely advanced.
Another large-ish factor is that package managers have certain disadvantages from a consumer standpoint. App versions are months behind the newest releases, and installing new versions of anything risks breaking something else. Plus, for the few applications that do offer direct-install binaries, you risk not being able to uninstall those apps easily, since the package manager either gets confused or ignores them completely. I’m not saying package managers should be scrapped; I’m saying that direct-install binaries such as Zero Install should be treated by distros and developers as first-class citizens. Distros should include Zero Install by default, and application project pages should have big fat links to Zero Install binaries of the newest versions. This would also increase usage and beta testing of small-time apps that distros would otherwise not support and that compiler-phobic end users would never try out.
…But still, the CLI is by far the biggest reason why Linux has not “gone mainstream”.
Huh? There are numerous Linux distros in which one never needs to see a terminal, unlike the remedies required for OSX problems: http://reviews.cnet.com/8301-13727_7-20007737-263.html
I’m guessing that the average Mac chimp is going to have trouble with those terminal commands.
Huh? Try Arch, Sidux and Gentoo (just to name a few).
Perhaps you are confusing Linux with OSX. Here are just a few choice Apple upgrading problems that have popped up in the last two days:
Safari completely unresponsive after update: http://discussions.apple.com/thread.jspa?threadID=2466613&tstart=0
Photos lost upgrading library to Ap3: http://discussions.apple.com/thread.jspa?threadID=2465322&tstart=15
Latest SL update seems to screwed up Aperture: http://discussions.apple.com/thread.jspa?threadID=2464161&tstart=30
Problems after upgrade to 4.03 creating events from address book: http://discussions.apple.com/thread.jspa?threadID=2465245&tstart=15
Can’t Install Quicktime [upgrading hasn’t worked since April 11, 2010]: http://discussions.apple.com/thread.jspa;jsessionid=84756CA46355044…
Mac OS X: Issues with OS X10.6.4 [since upgrading]: http://forums.cnet.com/5208-6126_102-0.html?threadID=398175&tag=for…
Those posts came from a quick scan of the forums of only a few Apple programs. These threads barely scratch the surface of Apple upgrading problems.
I thought everything with Apple “just works!”
Please go to the forums of any major Linux distro and find this many upgrading problems existing in such a short span of time.
Of course, Windows and OSX programs are always fully uninstalled when commanded. /s
Furthermore, it is always amusing when someone tries to characterize “Linux” as one single thing. There are hundreds of different distros, many of which do unique and incredible things. For instance, how does the above notion about problems uninstalling independent packages apply to distros such as GoboLinux, where each package and library sits in its own directory?
Another thing: it is rarely necessary to install a more recent, independent binary that is not in the repos, especially with Arch, sidux, Gentoo, etc. Such distros keep packages fairly recent, and they certainly update packages more often than most OSX and Windows programs are updated.
Not sure on what such a notion is based.
My 84 year old mother installed Mepis by herself, and she doesn’t know how to use the CLI, and she has never had to use the CLI.
So pointing out problems in OS X somehow makes Linux better? Yeah… right.
Not only that, but for him to compare reported problems and have the comparison mean something, OS X and Linux would need to have the exact same number of desktop users, and they certainly don’t. lol
Uh, there seems to be some confusion as to which OSs have the package updating problems.
The OP suggested that Linux has the upgrade problems, but, obviously, the OP mistook OSX for Linux.
So, again, which OS is it that is “not ready for mass consumption?”
pointing at other OSes’ flaws doesn’t solve yours
Windows and OSX are not perfect, of course, but those two OSes are the ones the majority of users actually use despite minor annoyances like this, because they already have all the applications (users want to use applications, not the OS itself), while Linux has OpenOffice and some other horizontal applications (though there are web browsers aplenty), but when it comes to specialized, professional-grade vertical applications there are very few to none available (and even those often target specific, sometimes outdated, versions of specific distributions)
other OSes can live with this minor annoyance because they have at least one less barrier to entry than, say, Linux
this is the crux of the problem
the vast majority of people would rather settle for something that *works*, has as many of the features they need as possible, and possibly stable APIs (so that third-party developers can give them the *applications* they need), than choose among hundreds of mutually incompatible distros just to get one unique or incredible feature while sacrificing fundamental requirements such as overall functionality and stability
also, development-wise, consolidating a platform in a certain application field and adapting it to operate in other fields are not mutually exclusive (in fact, they’re orthogonal aspects)
and it certainly shouldn’t take different distributions to refine feature richness and suitability in the same application field (namely, the desktop), since such refinement may well happen upstream, where all distributions benefit from it equally; but this would make all desktop distributions (mutually compatible) clones of each other, diminishing any competitive advantage one may have over the others, so it’s not in the distributions’ interest to make that happen (although it would be in the users’ interest)
GoboLinux does away with package management and dependency hell – but it’s known for suffering from “symlink hell” otoh
simplifying the directory structure and putting some common sense into it (third-party applications each in their own directory isn’t “the windows way”, it’s just logical) is all fine and dandy, but it clashes with a plethora of unix userland software developed with unix paths in mind (or, oftentimes, hardcoded)
… so when you give her a new webcam for Christmas, will she be able to install it herself with just a few mouse clicks (setup -> next -> next -> finish), or will she have to open the CLI and type some obscure commands (or worse yet, wait for you to come and set it up for her)?
Like on OSX, it’ll most likely either work out of the box or not work at all. Actually, V4L drivers are not that bad, and the same goes for wi-fi. Not every piece of PC hardware will work, but there is a large amount of hardware that is known to work, just as, say, Sound Blaster 16-compatible cards are known to work on Win9x and USB pen drives are known to work on OSX. He’d probably choose among those when buying a webcam, and everything will be fine.
Edited 2010-06-19 17:16 UTC
First of all, I have already addressed the alleged Linux package upgrade issue elsewhere in this thread. I am not going to repeat myself just because someone doesn’t bother to read.
Secondly, the OP stated that Linux is “not ready for mass consumption,” which presumes that some other OS is ready for mass consumption. I was merely highlighting the flaw in this reasoning.
OSX and Windows are in many ways less “ready for mass consumption” than Linux.
Congratulations for posting the second longest run-on in OSNews history!
Windows and OSX are far from perfect, and neither is any better than Linux.
The majority of people use Windows because that is the OS to which they are exposed, not because Windows has applications that Linux doesn’t. The same applies to OSX, although I would argue that there may well be more Linux desktops in the world than OSX desktops.
Furthermore, I dispute that Windows and OSX have a pool of professional “vertical” applications that Linux/open-source doesn’t offer. Certainly, Windows, OSX and Linux “out-do” each other on an individual-program basis, but no one OS dominates another in terms of which has the best professional-quality applications. If you doubt this claim, perhaps you would care to list the Windows/OSX apps which have no equal or better in Linux/open-source.
No. The crux of the problem is lack of marketing and anti-Linux FUD, which is surprisingly parroted by posters on this forum who should know better.
Wow. That’s the third longest run-on in OSNews history!
The vast majority of people settle for what they have been exposed to; it has nothing to do with the quality of programs or the availability of advanced features (which most will never use).
There are plenty of applications for Linux, and I can’t think of a single, userland Linux program that cannot be ported to all Linux distros that use Xorg.
By the way, what is the point of mentioning that Linux distros are incompatible with each other? Even if such a claim were completely true, what do you have to compare it to in the Windows/OSX world? The Windows/OSX world doesn’t have incompatible distros, because they don’t have differing distros. Such amazing variety (with some distros possessing unique features) is a strength of Linux that Windows/OSX will never have.
For instance, with Linux, I can quickly and easily boot a fully functional OS from a CD, DVD, USB stick, zip drive, SDHC card, etc., and all data that I create can be saved back to the CD, DVD, USB stick, zip drive, SDHC card, etc. I can travel with all of my applications and data on my key chain, and the OS will boot on most computers with the exact same user configuration.
Another example: without knowing about the internal workings of the OS, I can use a GUI that will easily create my own specialty distro, catered to a specific purpose, such as point-of-sale kiosks, graphics/photo editing, Pro Audio/Video editing, etc. Try to do that on Windows/OSX!
Congratulations! This sentence is THE longest run-on in OSNews history! I am proud to have it as a response to one of my posts!
Certainly, Linux can be as general or as specialized as one prefers.
One problem with proprietary OSs is that they can’t specialize with any efficiency, because the general OS cruft will still run in the binary blob. So, with Windows and OSX, one still needs full hardware resources for an entire bloated graphical OS, even if one is merely trying to make a garage door opener actuated by a cell phone ( http://www.aboutdebian.com/x10.htm ) or trying to build a supercomputer to do highly advanced fusion-reaction calculations, etc.
There is no “hell” with the symlinks, unless the hidden existence of symlinks bothers you.
The reason for the symlinks is that some of the larger packages are hard-coded with the *nix directory structure. It is not a problem inherent in the concept. Again, it is all hidden and does not affect performance.
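To make the idea concrete, here is a toy sketch of that kind of layout (the paths and the package name “Foo” are made up for illustration; the real GoboLinux tree differs): each package lives in its own versioned directory, and compatibility symlinks keep hard-coded *nix paths working.

```shell
# Each package lives in its own versioned directory:
mkdir -p demo/Programs/Foo/1.0/bin demo/usr/bin
printf '#!/bin/sh\necho hello\n' > demo/Programs/Foo/1.0/bin/foo
chmod +x demo/Programs/Foo/1.0/bin/foo
# Compatibility symlink: the legacy *nix path points into the package
# directory, so software with hard-coded paths keeps working.
ln -sf ../../Programs/Foo/1.0/bin/foo demo/usr/bin/foo
demo/usr/bin/foo   # runs through the compatibility symlink
```

Uninstalling then means deleting one directory and its symlinks, which is why per-package directories sidestep the usual uninstall mess.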
As one of the other posters mentioned, I would choose a webcam that has Linux drivers. Linux cannot be blamed if the manufacturers omit the necessary drivers and don’t provide any information on how to make the drivers.
In some areas they are, in other areas they are not. In the real world the two rarely compete. Most use an OS for a specific reason, and most (outside of raging fanbois like yourself) do not use Linux as a desktop OS.
Really? That is the reason? Or, more likely, you simply cannot come up with an excuse backed by any real data. Hate to break the news to you, kid, but there are a hell of a lot of people who quite willingly choose your hated Windows over Linux any freaking day of the week.
Why bother listing? Clueless people will think that just about anything has an equivalent, whether it really does or not. I could mention AutoCAD, to which some twit will pop out some idiotic link to some half-assed amateur product thinking they are equal. God knows the clueless FOSS fanbois think there really are competitors to something like Premiere, or hell, how about Exchange? Talk about massive failure to come up with a decent competitor there. Does FOSS have anything even close to Active Directory (and all of its capabilities and supporting roles)? If you even think of saying yes, then you really are one ignorant kid.
Most use an OS for web browsing, email, cell phone, writing, etc. Is that what you mean by “specific?”
Ha! I would certainly say that what you say is so, compared to Windows, but I am not sure that the same is true when compared to OSX.
I guess the important thing is, “so what?”
I’m still waiting for you to come up with the long list that proves your point.
I suggest that you study this page very carefully: http://en.wikipedia.org/wiki/Reading_comprehension
The point is that the reason that people would choose Windows over another OS is because Windows is all they know.
So, more people would rather use Windows. Again, “so what?” That doesn’t make Windows better.
Geez! I never would have seen that one coming!
Again, one platform might have one program that is superior, but no OS is dominant in multiple areas.
I don’t know much about CAD software, so I’ll grant you that as a piece of software in which Windows has an advantage.
I included all Linux software, not just open-source.
Please explain how Adobe Premiere surpasses these high-end Linux NLE/compositors:
http://www.ifxsoftware.com/ant
http://www.ifxsoftware.com/ant3d
http://www.ifxsoftware.com/products/piranha
http://www.ifxsoftware.com/products/piranha-stereo
I don’t even know what Active Directory is or does. It sounds like a nerdlinger IT app.
Okay. I’ll give you that one, too!
So, Autocad and Active Directory are the only two programs that prove that Windows is better than Linux? Is that all you got?
Edited 2010-06-20 00:19 UTC
Funny that you mention AutoCAD; AFAIK there’s no OS X version of it either, so by your argument OS X isn’t ready for the desktop either. Or Exchange again: no OS X version. As a side note, I would argue that the number of users who use Windows because of AutoCAD is probably about as big as the number of users who use Linux because of Gnome. AutoCAD is really not a consumer product, and to say otherwise is simply dishonest. That said, if you have to run AutoCAD for work, it is probably a valid reason for using Windows (at least in a VM). As for Exchange, there are many groupware suites around that provide the same functionality, but it’s funny that you bring Exchange up in a discussion about the desktop, as it is a server application.
Wow man. This is about the problems of Linux adoption on consumers’ desktops, and you come up talking about kiosk-like customized distros, supercomputers and garage door openers.
As long as you can buy millions of products for Win/OS X that say “Requires Windows XP SP2 or above” or “Requires Mac OS X 10.5.4 or above” and for Linux it’s like no info at all or like “Requires at least Linux Kernel 2.6.32, X.org 6.9, glibc vX.X”, nothing will change.
Show me a single statistic that claims to be even semi-accurate and is assembled from a large number of sources, like lots of different websites. The ones that do exist give no hint at all that your assumption could be true. OS X has 5-6 times more users than Linux on the desktop. Why are there so many more products that people can buy for Mac compared to Linux?
Take music production, for example. There is only one decent MIDI sequencer (Rosegarden), plus Ardour, mostly for audio recording. There is nothing like Logic, Cubase, Pro Tools, Digital Performer and the like. No fully featured DAW at all. And don’t even get me started on all the wonderful plug-ins and software instruments out there.
And what about a decent video editing software for semi professionals like Final Cut Express or Adobe Premiere Elements?
There are so many examples one can point to.
Mac chimp? Do you have to be insulting and disingenuous?
Funny the last picture of a Linux developer’s conference I saw sure contained a lot of Macbooks.
So why don’t you name some of these distros that don’t require using a CLI? Ubuntu (Linux for humans) not only requires a CLI at times but in fact dumped some people to the command line after a system update broke working video drivers. The OSX example you provided was just a case of a user being unable to delete files. That user was still able to use the system and get online to find help.
But maybe you think it is ok to expect users to do this for a printer install:
hey I just got my psc 1610 working for both scanning and printing (yay me!)…
first I did
sudo apt-get install hplip gtklp xpp hpijs python-qt3-doc libqt3-mt-mysql hplip-ppds
I don’t know if you need all those packages; they were all recommended or suggested, and I hate going back and adding things later… then I went to System -> Administration -> Printing -> Add Printer, and in step 2 I selected psc 1600 from ‘HP (HPLIP)’ in the manufacturer drop-down menu (as opposed to just ‘HP’, which did make a difference for me). This got the printing part working.
Then xsane told me it didn’t find any scanning devices so I installed a package called libsane-extras from synaptic and tried xsane again and it worked.
http://ubuntuforums.org/showthread.php?t=151981
You’re right. I’m sorry. It was very insensitive of me to insult chimps by insinuating that they use Macs.
Here’s a chimp deftly using a non-Apple touch screen: http://www.youtube.com/watch?v=r5nTVF3uits Wow! He certainly works a lot faster than those iPad owners I see in the coffee houses!
I would also like to acknowledge that you are never insulting and disingenuous on this forum.
Not that there is any point to your statement, but would you care to link to that photo, or do we just have to take your word for it? If such a picture exists, I’m betting that the number of MacBooks is in the minority.
Okay. Even if Ubuntu does require a CLI sometimes, it is just one distro out of hundreds! You can’t characterize the whole of Linux by the problems of one distribution. Personally, I have never used Ubuntu much because I think it is rather bloated.
A problem with X will sometimes drop you to a command line, although some distros go into a GUI control panel instead. Either way, one has a way to work through the problem.
I don’t know how Windows and OSX respond to such a problem, but I’m guessing that one has to reboot into “safe” mode with Windows.
Mepis is very solid, and, unless there is a serious problem, one should never need to use the CLI. Any distro based on Debian stable or on Slackware (with a GUI installer and GUI control center) should be able to do everything without CLI use. There are probably quite a few others, but one would have to research.
I provided lots of OSX examples, and the ones that I linked barely even scratched the surface of the zillions contained in the official Apple “discussions” forum. There are tons of problem postings in this forum alone, astronomically more than I have seen on any Linux distro forum.
First of all, you have linked to an Ubuntu post from March 2006. That’s very telling about the lack of current Linux problems. Couldn’t you find anything more recent? All of my links to Apple problems were from the last two days.
Secondly, these days almost every distro uses CUPS, which is the same printing system that OSX uses. So, if a driver is missing on Linux, it is not the fault of Linux. Thankfully, most printers work with Linux and CUPS.
Please find recent examples from one of the more solid Linux distros.
Edited 2010-06-19 23:01 UTC