Canonical Ltd., the company behind Ubuntu Linux, estimates
that the product has over 12 million users worldwide. And why not?
Ubuntu is free and it runs more than ten thousand applications. It has a vibrant
user community, websites covering everything you might
ever need to know, good tutorials, a paid support option, and more. Yet I often hear friends and co-workers casually criticize Ubuntu.
Perhaps this is the price of success. Or is it? In this article I’ll analyze common criticisms and try to sort fact from fiction.
I should mention that I’m a big Ubuntu fan and have used it for five years. Even
so, it pains me to see the obvious ways it could improve. As I’ll
explain, I believe Canonical’s business model holds Ubuntu back from
fulfilling its potential.
Why It Matters
One obvious response to anyone who criticizes Ubuntu is: why don’t you just run another operating system? There are so many competing Linux and BSD distros out there.
True. But there is a larger
issue here. Ubuntu’s great popularity means that it represents Linux
to many people. It’s the distro vendors pre-install. It’s the
distro the mainstream media always review. It’s the one distro everybody’s tried. It’s been ranked #1 in DistroWatch’s yearly popularity ratings for the past six years (1).
Fair or not, Ubuntu reflects on the Linux community as a whole. How well Ubuntu
meets criticisms matters even
to
Linux users who don’t use it.
So what are common Ubuntu criticisms? Here are those I often hear…
It’s Bloated
To say that Ubuntu is
bloated only makes sense if comparing it to some alternative. So let’s do that.
Is Ubuntu bloated compared to Windows?
This chart compares Ubuntu’s system requirements to the last three Windows releases:
| Resources | Windows XP | Vista | Windows 7 | Ubuntu 10 and 11 |
|---|---|---|---|---|
| Processor | P-III | P-IV | P-IV | P-III |
| Memory | 128 MB / 512 MB | 1 GB / 2 GB | 1 GB / 4 GB | 512 MB / 1 GB |
| Disk | 5 GB | 40 GB | 20 GB | 5 GB |
| Cost | $199 – 299 | $239 – 399 | $199 – 319 | $0 |
| Locks to Hardware | Yes | Yes | Yes | No |
Sources: websites for Microsoft and Ubuntu, plus web articles and personal experience. The chart is simplified and details have been omitted for clarity. Microsoft offers many Windows editions; this chart addresses the most common. Microsoft prices are for full versions. In the Memory column, the first number for each system is generally considered the minimum realistic memory, while the second is the memory recommended for best performance.
By any measure Ubuntu is not bloated compared to Windows. I’m writing this article with Ubuntu 10.10 running on a seven-year-old Pentium IV with a single-core 2.4 GHz processor and 768 MB of DDR-1 memory. This computer wouldn’t even boot Vista or Windows 7. It runs Windows XP great, but that’s not current software. XP is two Windows releases back.
Is Ubuntu bloated compared to prior releases?
Ubuntu’s system requirements
indicate the product’s resource requirements have crept upwards over the years. Here are its memory requirements:
| Ubuntu Desktop Version | 6.06 | 7.04 | 8.04 | 9.04 | 10.04 / 10.10 | 11.04 |
|---|---|---|---|---|---|---|
| Memory (MB) | 256 | 256 | 256 | 384 | 512 / 1 GB | 512 / 1 GB |
Sources: Ubuntu official system requirements and various websites on efficient product use. Note that some sites do report slightly different memory requirements. 1 GB is the recommended RAM for 10.04 and above.
These RAM requirements and the recommended minimum 1 GHz processor mean that nearly any computer sold in the past seven to ten years can run Ubuntu. I’ve run 10.x on P-IVs and even P-IIIs. By this measure, one could hardly label Ubuntu “bloated.”
Is Ubuntu bloated compared to other Linux distributions?
Linux distros divide into full-size, mid-size, and lightweight. Ubuntu is full-size.
Most full-size distros come in multiple versions. Their standard product usually requires a P-IV or better with at least 512 MB to 1 GB of memory. You may be able to get by with lesser hardware, but it’s not recommended.
Mid-size distributions like the standard editions of Zenwalk and VectorLinux go a bit lower than the full-size distros. They’ll run fine on a P-III with 256 MB. Lightweight distros like Puppy or VectorLinux Light Edition will run in 128 MB or less if properly configured.
To compete with this, full-size distros usually offer pared-down versions for those with
lesser hardware. For example, Ubuntu offers Lubuntu; PCLinuxOS has
PCLinuxOS LXDE and other variants; Mint can run with lightweight GUIs like LXDE, XFCE, Fluxbox; and so on.
Compared to other full-size
Linux distros Ubuntu is not bloated. For something lighter, try Lubuntu. Lubuntu
requires half Ubuntu’s memory and only 1/3 to 1/2 of its disk
footprint. It’s also lighter on
the processor. Read my detailed review of Lubuntu here.
It Lacks Enterprise Integration
This complaint is that Ubuntu lacks the enterprise-wide integration and manageability critical to large
organizations.
System administrators require a single control point for automated
administration and monitoring of remote Ubuntu
desktops. Landscape, Canonical’s product for enterprise-wide management, fulfills this need. But it is too narrow to address the larger
integration issue. What about a single sign-on for login, email, and
web access? What about directory services? How about Kerberos network authentication and LDAP
(Lightweight Directory Access Protocol) support? How about coordinated
information management across client and server products?
Microsoft is the competitor in this space. Its full range of
client and server products seamlessly integrate. The server
products include Active Directory, Exchange Server, and SharePoint Server.
Client products like Windows desktop, the Outlook email client, and the
Office suite seamlessly integrate with the server software.
There are two ways Canonical can challenge Microsoft’s client-server headlock on the enterprise. It can either:
- Directly compete with a full range of directory, mail, and information management services
or
- Better integrate Ubuntu desktop into the Microsoft ecosystem already in place at most companies
The second option is in progress at Edubuntu but not complete. It leverages standards like Kerberos and LDAP to facilitate integration.
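As a concrete illustration of the second option, the likewise-open package in Ubuntu’s repositories at the time let a desktop join an Active Directory domain. This is a hedged sketch, not an endorsement of any one tool; the domain name and account below are placeholders:

```shell
# Sketch: joining an Ubuntu desktop to an Active Directory domain
# using the likewise-open package (available in Ubuntu's repositories).
# "example.com" and "Administrator" are placeholder values.
sudo apt-get install likewise-open
sudo domainjoin-cli join example.com Administrator
# After a restart, domain users can log in as DOMAIN\username.
```

Even with tools like this, the rest of the integration story (Exchange, directory-aware applications, central management) remains much harder.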
One system administrator summarizes the situation this way: “… Microsoft continues to win on the desktop. Not because an individual PC running Windows is easier for most people to use, but because it’s easier to set up Active Directory to work with Outlook and Exchange than it is to roll your own directory service with the tools available out of the box on Ubuntu.”
Here’s a management consultant whose clients manage between 50 and 150,000 desktops: “Until there is a true competitor to Active Directory, Exchange, Outlook, and the MANAGEMENT of the machines, Ubuntu will not succeed in the Enterprise.”
Too bad Canonical let Attachmate Corp. buy
Novell
when the company was up for grabs late last year. Novell products like
eDirectory and GroupWise could synergize with Ubuntu. Canonical’s Linux
dominance plus Novell’s directory services and deep experience
integrating into the Microsoft ecosystem might have been very competitive.
Perhaps cloud computing will ameliorate the integration issue. Organizations may shift their integration focus
from internal servers to cloud services. This is the premise underlying
Google’s Chromebook.
In any case, Canonical needs to recognize this key
source of corporate resistance to Ubuntu and make explicit their plan
to overcome it. Then they need to promote the plan in the IT community. Thus far they have failed on both counts.
It Doesn’t Install Complete
Here’s
a complaint with which we’re all
familiar. Ubuntu bundles a ton of great
software
but leaves out some essentials. Codecs, Adobe Flash Player, multimedia
players, and proprietary hardware drivers are examples. You can
easily install the
missing programs, but you have to:
- Know what is missing
- Know how to install it
- Make the effort to install it
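For experienced users, the three steps above often collapse into one command. On the Ubuntu releases discussed here, a single meta-package pulled in most of the commonly missing pieces:

```shell
# Installs Flash, common media codecs, Microsoft core fonts, Java, and
# other non-free essentials in one step (from the multiverse repository).
sudo apt-get install ubuntu-restricted-extras
```

The point of the criticism stands, though: a newcomer has no way of knowing this package exists.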
The underlying cause of this problem is the distinction between
free and
non-free software. Linux partisans have
strong beliefs about how to handle this conundrum. Canonical is
caught in the middle. They try to provide a complete user experience while
also respecting intellectual property rights. This task is complicated
by the fact that IP rights are interpreted differently in the many
countries in which Ubuntu is used.
Canonical addresses this criticism in several ways. They segregate non-free software into its own Multiverse Repository, so that it can easily be identified and installed. Medibuntu (Multimedia, Entertainment & Distractions In Ubuntu)
is “a repository of packages that cannot be included into the Ubuntu
distribution for legal reasons (copyright, license, patent, etc).”
Users can check for proprietary hardware drivers through the Startup
Applications panel or the Administration -> Hardware Drivers option.
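Adding the Medibuntu repository followed the procedure its site documented at the time. A sketch, with the commands and URL as then published (the project has since shut down, so this is historical):

```shell
# Add the Medibuntu repository for the current release, then its keyring.
sudo wget http://www.medibuntu.org/sources.list.d/$(lsb_release -cs).list \
    --output-document=/etc/apt/sources.list.d/medibuntu.list
sudo apt-get update
sudo apt-get install medibuntu-keyring
sudo apt-get update
```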
Good documentation and How To’s help Ubuntu users. But
navigating these can be difficult for the inexperienced. Not
all docs are dated or identify the release(s) to which they refer. In the
worst case, the user googles and retrieves conflicting instructions for a simple task they want to perform.
Some
distros build on top of Ubuntu to give a more complete user experience. Linux Mint, for example, states its first goal as: “It works out of the box, with full multimedia support
and is extremely easy to use.” PCLinuxOS is another competitor that emphasizes it is “a full multimedia operating system.”
I feel the “completeness criticism” is but a nit for
experienced users. They can easily install the few apps or plugins Ubuntu
doesn’t initially provide. For newbies, though, this is a
hurdle. End users don’t
know and don’t care about the debate in the Linux community over “free versus non-free.” They
just
want software that does everything they want with as little effort as possible.
Here’s how Canonical could address this problem. Add an install
panel allowing the user to select what goes into his
installation. Give him a checklist of installable products — with
each
denoted as free or proprietary. Users could
choose software conforming to the IP laws of their
country. With the customer checking
acceptance of licensing conditions, Canonical would be absolved of legal
responsibility. Users would get the most complete system permitted in their jurisdiction by a simple install panel checklist.
It Doesn’t Install Secured
Comparative studies and vendors
alike confirm that Linux has a superior track record as a secure
operating system. Ubuntu upholds this great tradition. You’d be
hard-pressed to find evidence of malware infections in the Ubuntu community.
But does Ubuntu install as secure as it could, right out of the box? Surprisingly, no.
Take the default firewall as an example. In version 10.x, the
Uncomplicated Firewall, or UFW, installs as Disabled. You’d think such
a fundamental security tool
as a firewall would default to Enabled. Or failing that, that the
installation panels would give you a
checkbox for enabling it.
UFW’s front-end management interface, Gufw, doesn’t install by default. You get the firewall without the GUI to manage it! The user must know about Gufw and install it separately.
How about configuring the firewall? Windows products like ZoneAlarm
help you “train” them. They intercept each program the first time it
communicates through the internet, and ask you to Allow or
Deny the communication. Then they automatically generate the proper
firewall rule for your decision. They also provide a checklist of
installed programs. You simply check Yes or No for each program,
indicating whether it has Incoming and/or Outgoing
Internet communication privileges.
In contrast, UFW expects the user to write its rules
with its barren, minimalist GUI. This is neither
state-of-the-art nor competitive. It’s certainly not user-friendly. As a friend complained to me: “I don’t want to manage ports, I want to manage programs!“
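To be fair, enabling the firewall and adding its GUI is only a few commands; the complaint is that the user must know to run them. A minimal sketch:

```shell
# Turn on the Uncomplicated Firewall with a sane default policy.
sudo ufw enable
sudo ufw default deny incoming
sudo ufw allow ssh          # open a port by service name...
sudo ufw allow 8080/tcp     # ...or by number (example port)
sudo ufw status verbose

# Install the graphical front end, which Ubuntu omits by default.
sudo apt-get install gufw
```

Note that even with Gufw installed, rules are still expressed in terms of ports, not programs.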
To anyone who claims that Ubuntu “doesn’t install secured,” I’d say the
product’s outstanding track record argues otherwise. This is a highly secure system. Yet ease of configuration is missing. This isn’t the only area where Ubuntu’s ease of use falls short…
Its File Manager Isn’t User Friendly
Ever taught a class of new Ubuntu users and watched them run into Nautilus? They always ask how to create a sub-folder instead of a top-level folder in a filesystem. They ask how to copy folders to their USB drive or backup disk.
Nautilus doesn’t always show that a copy
worked as expected, and if you’re overwriting an existing file, it
doesn’t display timestamps so that you know which copy is the more
recent. It doesn’t always display error messages. For
example, try to
delete a directory for which you don’t have valid permission. Or copy
into that directory. You won’t get
an error message! Users need feedback. The old Unix dictum “no news is good news” is completely inappropriate for products that target end users.
There’s an easy fix. The huge Ubuntu
software repositories contain more than a dozen competing file managers.
Ubuntu’s superior install tools — the Ubuntu Software
Center and the Synaptic Package Manager — make it easy to download them. If you don’t like Nautilus,
just click the mouse a couple
times and install another product.
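For example, any of these alternatives is one command away (package names from the standard repositories of the era):

```shell
sudo apt-get install pcmanfm   # lightweight; LXDE's file manager
sudo apt-get install thunar    # Xfce's file manager
sudo apt-get install mc        # Midnight Commander, for the console
```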
The mystery is why Ubuntu bundles Nautilus as its default. File
managers are one of the most frequently used tools in any operating system. Consumers expect to use the default file manager without having to replace it. Fixing
or replacing Nautilus should be a no-brainer.
It Won’t Run Windows Software
Those who make this accusation either aren’t familiar with Wine, or they haven’t used it lately. The Wine database
lists over 16,000 Windows programs that it runs on Linux. I’m
constantly surprised that even big, complex applications run under
Ubuntu with Wine. Examples include web site generators like Adobe
Dreamweaver and NetObjects Fusion, and office products like Microsoft
Office and Adobe InDesign.
Wine works like you’d expect. After installing it, you run Windows
programs in the exact same manner you would under Windows.
Another compatibility option is DOSBox,
an emulator designed for old DOS software.
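Both are in the standard repositories, and using them is as simple as the article claims. A quick sketch (the installer and directory names are placeholders):

```shell
sudo apt-get install wine dosbox

wine setup.exe        # run a Windows installer, just as on Windows
dosbox ~/old_games    # mounts the directory as drive C: at a DOS prompt
```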
I have a number of simple Windows 3.1 games, such as Ringo, Ludo, and Boule (free download here).
The games run fine under either Wine or DOSBox. They don’t run natively under either Vista or Windows 7 — even with their new Program
Compatibility panel. Compare Ubuntu with Wine and DOSBox to native Vista and
Windows
7, and you’ll often find that Linux is more compatible with old Windows programs
than Windows!
I’ve found an analogous relationship between Microsoft Office and
OpenOffice. Microsoft
releases new versions of Office every three years or so: Office 95,
Office 97,
Office 2000, Office 2003, Office 2007, Office 2010. (This excludes
MacIntosh versions). As far as I can
determine, the company only regression-tests back one version. The
result in my experience is that OpenOffice is often more compatible
with older
versions of Microsoft Office than is Office itself.
When critics complain that Ubuntu is not compatible with Microsoft
software, I sympathize. In spite of all that I’ve pointed out,
gaps persist. But when one considers Microsoft’s own software — rooted
in a business model of continuous releases based on planned obsolescence
— it
becomes apparent that compatibility is not an issue only for Ubuntu.
Depending on your compatibility needs, you may get a better deal from
Ubuntu than from Microsoft.
It’s Buggy
Several academic studies and papers conclude that Linux and open source software have fewer bugs than
commercial products. Ubuntu has bug identification, tracking, and resolution procedures equal to those of any large, well-run software project.
From years of participating in the Ubuntu forums, I’ve encountered
consistent anecdotal evidence. I read very few posts where a user abandons the product due to a bug. This is a huge vote of
confidence in Ubuntu. (You can’t say this about every Linux distro.)
However, it’s not unusual to see posts from first-timers who abandon
Ubuntu due to install issues. Examples are things like Ubuntu not
recognizing a sound card, or being unable to get wireless networking
going, or a display problem of some sort. While these may not be bugs,
they are cases where Ubuntu doesn’t work for the prospective user. If I were
to recommend one area for the Ubuntu team to target for a
better user experience, device recognition and configuration would be
it.
A related issue is that Ubuntu actually removes hardware detection capabilities as new versions come out. So a machine that
worked fine with an older release of the product suddenly fails when
you move to a newer release!
I’ve maintained Ubuntu instances for five years, since release
6.06, and have repeatedly run into this problem. In several cases video worked fine on one release and then failed under a newer one. Right now I’m
trying to fix wireless networking on a laptop that worked fine in 8.04
and fails under 10.04. It doesn’t work whether I do an upgrade or a
fresh 10.04 install. (Wireless works fine for this laptop with Puppy
Linux and Windows XP.)
Admittedly, device recognition and configuration is a Sisyphean task. When you try any Linux distribution for the first time, you just hold your breath and hope that the product
recognizes all your devices. This remains Linux’s biggest
challenge.
From the user perspective, though, to have a product that works fine
under one release break under a newer release… that really doesn’t look good. If
there is a single issue that tarnishes Ubuntu’s reputation, comprehensive, consistent device recognition and configuration is it.
It Changes Quickly But Doesn’t Protect Its Users
Ubuntu improves rapidly. In the last two years, the product has moved from the GRUB boot loader to GRUB 2, to continually changing networking management tools, to eliminating the xorg.conf configuration file and moving to RandR for video, to switching the user interface from GNOME to Unity, to replacing OpenOffice with LibreOffice. I’ve read about replacing GDM with LightDM, moving to more regular updates, replacing X.org with Wayland, and more.
Ubuntu’s aggressive improvements are among its greatest strengths. But this
benefit causes work for the existing user base.
The Ubuntu team could easily shield their customers from the impacts of
these changes. Often they don’t.
Here’s an example. With GRUB 2 you no longer configure the
boot menu of OS options by editing the menu.lst file.
Instead, you edit bash scripts. That’s fine for me, but an
unreasonable expectation for end users. How about a simple GUI front end for editing the boot-time menu?
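To illustrate what users now face, here is the GRUB 2 workflow in outline. It works, but it is hardly the menu.lst experience:

```shell
# GRUB 2: settings live in /etc/default/grub, plus scripts in /etc/grub.d/.
sudo nano /etc/default/grub   # e.g. change GRUB_DEFAULT or GRUB_TIMEOUT
sudo update-grub              # regenerate /boot/grub/grub.cfg from the scripts
```

Editing grub.cfg directly doesn’t work, since update-grub overwrites it.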
Another example: new releases take away the xorg.conf video display file
that generations of Linux support personnel are accustomed to editing. You can generate this file and then edit it if you
look up the commands to create it. But why should you have to? Why
doesn’t the System –> Administration menu have a button to generate a xorg.conf file for you? And automatically plop you into editing it?
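For reference, this is the command sequence currently required, which is exactly the kind of thing a menu button could wrap:

```shell
# Generate a skeleton xorg.conf (run from a text console, with X stopped).
sudo service gdm stop
sudo Xorg -configure                     # writes xorg.conf.new in the current directory
sudo cp xorg.conf.new /etc/X11/xorg.conf
sudo service gdm start
```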
A final example. Right now I’m researching how to install the
Java browser plugin under Ubuntu 10.04. Websites are providing conflicting answers. This was trivial in earlier releases. But no longer. Apparently we switched from Sun’s Java packages to OpenJDK. Beyond inadequate details in the Release Notes, no one bothered to insulate the users from this change. Why is it put on the customer to manage this change?
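For the record, after much searching the answer appears to be the OpenJDK-based plugin, with Sun’s plugin relegated to the partner repository. A hedged sketch based on what the conflicting websites agree on for 10.04:

```shell
# The OpenJDK-based browser plugin:
sudo apt-get install icedtea6-plugin

# Or Sun's plugin, now in the "partner" repository:
sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner"
sudo apt-get update
sudo apt-get install sun-java6-plugin
```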
The Ubuntu team does a superior job in adding new features. They need to protect their users from the disruption these
changes cause. This should be a top priority because it deeply impacts the product’s ease of use.
To
the average consumer little GUI “transitional aids” like those
I’ve mentioned would help tremendously. They would be trivial to
program. Why doesn’t Canonical include them? Is it simply a lack of focus
on ease of use? Here’s my theory …
Fix the Business Model
Of the above criticisms, those I feel have the greatest merit
focus on whether Ubuntu is as easy to use as it could
be. You see this in:
- Device recognition
- Configuration
- Upgrades
- Default file manager
- Security configuration
One underlying explanation ties all this together. Canonical embraces
the same philosophy of product development as Microsoft. The emphasis
is on introducing new
features. New features trump massaging the product to improve its user-friendliness. They trump
intra-release compatibility and disruption to the
existing user base. They trump device recognition and easier configuration.
Consider Microsoft’s business model. The company makes 27% of its total sales revenue from Windows and 27% from Office (2).
That’s over half Microsoft’s revenue. Without it, the company as we
know it would cease to exist. Microsoft can’t afford to stick with a
product
and polish it until it shines. Its business
model forces it
to constantly update, replace, and repackage existing code into new
product.
No Windows version achieves its full potential because Microsoft
must abandon it to introduce revenue-generating new product. New
features are critical because they are used to justify the new version to the consumer
public. The GUI is often the focus of “improvement” because it is the most visible to customers.
The history of Windows releases verifies this continual forced march to new product:
(Chart of the Windows release timeline omitted; courtesy of a Wikipedia article.)
Canonical implicitly accepts
Microsoft’s disruptive business model
as the terrain for their competition. Ubuntu directly challenges Windows in
the new features competition. And it succeeds. But other design goals get pushed to lower priority.
Here’s an example. Canonical and Microsoft sell to both consumers and corporate
customers. Both drive product change from the consumer side. This conflicts with the expectations of their corporate
customers. Corporate customers value stability, compatibility, minimal
bugs, and ease of upgrades over the headlong rush to new features.
Canonical tries to bridge this gap through differentiated policy, support,
and
pricing. For example, they
distinguish between Desktop and Server products, and between regular
and Long Term Support (LTS) releases. They offer corporate customers comprehensive support options and contracts.
Readers with long memories might recall that Red Hat also
got caught in the conflict between consumer and corporate
expectations. The company flip-flopped several times over their support
for desktops versus
servers. Ultimately Red Hat solved the conflict by spinning off desktop Linux to the Fedora
project in 2003, while it went forward with Red Hat Enterprise Linux for servers.
I believe Canonical would be better served by protecting those who
find
that rapid
change causes them work — its user base. Polish existing code to
improve
ease of use. Concentrate on easy upgrades, great device
recognition and intelligent automated configuration. Minimize bugs.
Abandon the pell-mell rush to new features. Improve the product at a
measured pace. Nurture and organically grow the base.
New users will come naturally if the product provides solid long-term
value. You needn’t hype an “all new” interface to attract them. That’s Microsoft’s game.
The best way to compete with Windows isn’t to mimic Microsoft’s
business model. You win by presenting an alternative vision grounded in
a unique competitive model.
And the Consensus Is?
Ubuntu’s popularity means that it represents Linux
to many people. How well the product meets criticisms is important even to
Linux users who don’t use it.
I’ve presented my views to
stimulate your thinking. But here’s a better idea. Why don’t we see if we can come up with a community consensus? Add your
comments to this article to address:
- What is Ubuntu’s greatest strength?
- Are any of the criticisms listed here valid?
- If you could ask the Ubuntu team to fix one thing or improve one area, what would it be?
Thanks for participating.
– – – – – – – – – – – – – – – – – – – – – –
Howard Fosdick (President, FCI) is an independent consultant who
supports
databases and operating systems. Read his other articles and download his free guide How to Tune Up Windows from here. You can reach him
at contactfci at the domain
name of sbcglobal (period) net.
Footnotes
(1) You can view historical distro popularity rankings at DistroWatch by changing the time period in the drop-down list box under the label Data span. You must press the Refresh button to see updated statistics.
(2) The 12% Letter, March 2011. The March issue of this investment newsletter analyzes Microsoft’s business from an investor’s standpoint.
Really. The traditional approach is to offer a few meta-packages that install whatever you need, such as file or web server, desktop environment (usually multiple choice). Ubuntu’s one defining feature was to do away with that and give the user the default install Canonical thought a n00b would need so that it’s usable out of the box and fairly complete. Sort of the anti-Debian.
Personally, I’ve always disliked it intensely, but to each his own.
I don’t understand this particular line of reasoning. The minimal install is a great way to install Ubuntu – it’s light, fast, and a common root for all the Ubuntu-based desktops. You get the command line and if you want to “apt-get install ubuntu-desktop” or “apt-get install fluxgui”, you can take it either way. Or, you can just install your text editor and screen and lynx.
What they do get is a heavily QA-ed setup and a straight shot to getting productive. When I hear people griping about “They took out Gnome and put in Unity” and posing as experts while doing it, my steel-toed feet start to twitch, and when they say “It’s too bloated!”, I want to kick them, just a little. Or a lot.
Synaptic’s right there. It is easy to make these customizations! Ubuntu has a default that works for the average person but they didn’t take away your ability to go off the beaten path. They can go for the full distro or a fraction of it, and compile whatever they want. Ubuntu didn’t bolt the hood shut.
If these people want to feel like experts, let them download Gentoo, put in two lines of all the emerge flags they can find, and then wind up troubleshooting a lot of bugs because they effectively rolled their own distro that few others have tried. Or they can make their minor tweaks to something well-tested.
[/soapbox]
Heavily QA-ed? No. Ubuntu has a long history of extremely buggy releases. If you want QA, you have to wait a couple of months. The users are the testers.
If you want QA, do it yourself. But why use Ubuntu then, just use the source and go debian.
Rarely have I ever had a machine that I wanted to install Debian on that I actually *could* install Debian on. I get a new motherboard, memory, drive, all the accoutrements. Generally, nothing terribly fancy. But nice enough. No fancy gamer video card, and I don’t care about wireless on my desktop. But still, I go to install the latest version of Debian, and it doesn’t support the SATA chipset, or whatever, because the kernel is too ancient.
Debian spends so much time squashing bugs that they introduce the biggest bug of all: not installing on a motherboard which came out 6 to 12 months ago. That’s why I don’t even bother with it anymore.
I suppose it’s probably OK on an old junk machine, though.
Not quite true. You can start with a minimal install of Ubuntu if you want to.
For most parts, my Ubuntu dev VM is like my Debian server. Except that Ubuntu gets updated more frequently, which means that I don’t have to mess around with getting some packages from stable, others from testing. Heck, my Ubuntu VM is as minimal as my Arch VM.
Don’t mix testing and stable, just go testing or unstable. As always: Unstable refers to the speed of change not the stability.
I’m probably the same way as you in liking meta-packages and granular dependencies if one chooses not to use them. For example: why on earth is a PIM application a core dependency of a graphic desktop?
If you want gnome-core, described as just the core requirements to display a Gnome graphic desktop, you get Evolution, an email and PIM management application.
If you remove Evolution, for lack of need of the Gnome version of Outlook, you also lose Gnome due to “missing dependencies”.
WTF?
And worse still is that Debian has inherited this madness. Thank Baud this braindamage hasn’t yet become apparent in the KDE dependencies, but if it does, you can bet I’ll be back to using Enlightenment or some other DE that actually lets me decide what application software to install alongside it.
The dependency thing is a feature that is enabled by default, when it should really be disabled by default. Or, it should at least be disabled by default with Debian. The default is to install “recommended” packages as dependencies. I can see that Ubuntu would enable this feature by default being a distro aimed towards new users, but Debian should know better.
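The behavior described above can indeed be switched off, either per command or system-wide. A sketch (the config file name is an arbitrary choice):

```shell
# One-off: install a package without its "recommended" extras.
sudo apt-get install --no-install-recommends gnome-core

# Permanent: disable the default system-wide.
echo 'APT::Install-Recommends "false";' | \
    sudo tee /etc/apt/apt.conf.d/99norecommends
```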
It is bloat in my opinion because it installs whatever Canonical wants, not what users want. There are too many tools that I wouldn’t use at all… So it could be good for entry-level or *lazy* users (but not for me). So it is bloat, though not because of memory usage or anything. By the way, every tool has its own use.
“Too bad Canonical let Attachmate Corp. buy Novell when the company was up for grabs late last year.”
Where on earth would Canonical have gotten the money to buy Novell? Even Shuttleworth doesn’t have that kind of money.
As much as I may personally like having the latest and greatest software available for Linux; latest and greatest is just wrong for most corporate settings. Red Hat does it right, with RHEL and Fedora clearly separated. I view Ubuntu’s use of the LTS designation as a way to land more-or-less half-way between RHEL and Fedora. I’m not convinced that it’s working.
Great article, I enjoyed reading it. It’s always nice to see people contributing.
I do have to point out two things, though:
1) MacIntosh: There’s an apple called a McIntosh, and a line of computers named after it, the Macintosh. Yes, the guy that named it screwed up, but blending the two doesn’t help.
2) Sisyphean: Very apt, but most people appreciate it when you don’t use five dollar words where two dollar words will do. How about “- is an uphill battle” instead of “- is a Sisyphean task”?
Just my two cents. Thanks again for the article.
With my enormous intellect (capable of using a dictionary), ‘sisyphean’ does not mean ‘an uphill battle’ exactly, but ‘endlessly laborious or futile’.
I did not know this word before, but I appreciate texts that use words outside of my vocabulary, as that is the best way to learn new ones.
I’m inclined to say that the only difference between ‘Sisyphean’ and “uphill battle” is that in the former, you’re making a comparison to trying to fight uphill against a boulder, versus the latter making a comparison to fighting uphill against an animate adversary. Both are trying to imply a hopeless condition. Seems to me to be roughly the same meaning intended, except for the fact that people actually write “uphill battle” without referring to a thesaurus first. Other things could work, but either way, lower word density is better than using an obscure word alluding to a Greek myth that’s not even one of the more common.
EDIT: I do agree that it’s always good to expand one’s vocabulary, and it’s nice to see others that do. However, where, exactly, are you going to use Sisyphean? At best, it’s only usable in writing, considering most people don’t want to use a dictionary in the middle of a conversation, and if you have to stop to define it, the cleverness is already lost, leaving you just to look pompous.
Unfortunately for Sisyphus the boulder immediately rolled back down the hill every time he reached the top. So Sisyphean really means an impossibly frustrating task not just a difficult task.
“Until there is a true competitor to Active Directory, Exchange, Outlook, and the MANAGEMENT of the machines, Ubuntu will not succeed in the Enterprise.”
This all exists on Ubuntu too (note: I don’t like Ubuntu personally, but that is mostly because I don’t like DEB). People often forget that IBM has a product that can easily replace Exchange and Outlook (to name just one of the above requirements; products covering the same scope as AD, management of machines, etc. exist too). Yes, it exists. It’s called IBM Lotus Domino/Notes, and it even has packages for Debian/Ubuntu. But the claim that such companies are open to alternatives to Exchange and Outlook is plain wrong. They would never accept another solution besides Exchange and Outlook. It’s that simple. So if a company needs/wants Exchange and Outlook, then no replacement, regardless of how good or bad it is, will succeed. It will never be Exchange/Outlook, because only Exchange/Outlook can be like Exchange/Outlook.
A lot of users and companies are fixed on products and not on the functionality the product is supposed to deliver.
I have done migrations from Windows to Ubuntu (this is mainly where my dislike for Ubuntu comes from), and even if you replace the whole OS with Ubuntu and install something like OpenOffice.org, the users can’t stop thinking about their old applications. They don’t say: “How do I sum a column in the spreadsheet application?”. They just call support and say: “I don’t know how to sum the column in the new Excel. In the old Excel I used to select a column and press the icon xyz and I got the sum. But the new Excel I got installed yesterday doesn’t have that icon any more. Where do I get that icon in the new Excel?”.
Management mostly doesn’t care much which OS you run. They mostly care about money (is it cheaper?) and that they can use all their applications. That’s all.
The IT department (if they know Linux) mostly loves something like Ubuntu. I remember an IT worker telling me that they need 17 minutes to automatically install a whole new system, including all applications, configuration of LDAP, network-shared homes, etc. With Windows it took them much longer and was (according to them) more expensive. Managing the system is easy as 1-2-3. Updating software from an internal repository is dead easy: one central place to manage all applications, and so on.
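That kind of unattended install is usually driven by a preseed file fed to the Debian/Ubuntu installer. A minimal sketch of what such a file looks like; the hostname, mirror and package list here are hypothetical placeholders, not anything from the setup described above:

```
# preseed.cfg -- answers the installer's questions so no one has to sit there
d-i debian-installer/locale string en_US.UTF-8
d-i netcfg/get_hostname string ws-0042
d-i mirror/http/hostname string archive.ubuntu.com
d-i partman-auto/method string regular
d-i passwd/user-fullname string Office User
d-i passwd/username string office
tasksel tasksel/first multiselect ubuntu-desktop
d-i pkgsel/include string openoffice.org ldap-auth-client
```

Boot the installer with `preseed/url=http://yourserver/preseed.cfg` (a PXE server makes it fully hands-off); site-specific steps like LDAP binding and network homes typically go in a late_command or a configuration-management tool afterwards.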
Just look at Munich in Germany ( http://www.muenchen.de/limux ). They decided to move to Linux and they are having success with their project. Is the project without issues? Hell no! Which project is? But if you want to move, then what should stop you?
btw: One of the above-mentioned migrations from Windows to Ubuntu had some applications that needed Windows. After a while it was clear that they could not move everything to Ubuntu. So they took one of their HP blade servers, installed VMware Server on it along with a bunch of Windows XP VMs (they had enough legitimate XP licenses), and installed terminal software from Elusiva to run those applications in the virtual environment. It’s just a handful of applications, nothing ultra-critical; still needed for the business, but not on every desktop. And for the end user it doesn’t matter where an application runs as long as they can run it. For the IT department, such a hybrid environment is still much cheaper (short and long term) than going fully with Windows.
I am not trying to say here that Ubuntu/Linux is better than Windows. This is not my point. My only point is that if you want/need to move away from Windows then you can.
I asked the CIO why Ubuntu and not a new version of Windows? His reply was: “We don’t sell more of any of our products because we use Windows. We need mail, office applications, fax and printing and our AS/400 for our business application. Our company exists since many generations and we like to plan for the future. Spending every two years for new hardware and migration to a new Windows and Office software is not what we want.”
What? “Migration to a new Windows and Office software” every two years?
Your CIO must be fired. He could save a lot of your company’s money by getting a VLK license. My company had used a VLK license for Windows XP for almost 7 years when the decision to move to Windows 7 was made a few months back. For the Office part, we’re still using the same VLK license we ‘bought’ almost 4 years ago. That’s what VLK really is for. It’s much more flexible than OEM license.
My Ubuntu 8.04 LTS installation (three years old), on the other hand, no longer receives security updates and has thus become practically obsolete. Let’s not even start on which new software packages can still be easily installed there.
yeah, and on the other hand there is a free upgrade path for you from ubuntu 8.04, unlike on windows… worst case you just have to add more ram to your machine, unless it really was horrendously old to begin with.
It’s not my CIO. It is the CIO of the customers company.
They did the move to Ubuntu long ago and so far it has been a success. A VLK might have saved the company some money regarding licensing costs but licensing costs is just a minor part of the total costs and so far the Ubuntu systems are cheaper than their old Windows systems.
Your Ubuntu 8.04 LTS installation is a standalone installation. Right? That particular customer manages their installations. You can not compare your standalone install with their fully managed installation.
You still have to pay for each license deployed by a MAK or VLK program and those licenses really aren’t cheaper than an OEM license. The benefits of volume licenses are RIS and TS/RDP.
We use Lotus Notes and everybody hates it.
Yes, there is an Ubuntu package for Lotus Notes. However, Notes won’t work out of the box. You need to manually google and download some library files. Keep those files safe, because if you upgrade/patch either Notes or Ubuntu, your Notes will be broken again.
And that’s Notes on a *supported* platform for you.
Because? Is it because it is not Outlook? Long ago, when everyone and his dog was using Windows and Outlook Express at home, everyone hated this bloated IBM Lotus Notes. Now times have changed: everyone and his dog has an iPhone, and people have started using Mac OS X at home. Can you imagine that now people are starting to hate Outlook? Now you hear stuff like: “It is not as intuitive as Mail.app. It is not as logical as the Mac. On the Mac things just work.” It’s human nature to complain. But from the business perspective this does not count for much. IBM Lotus Domino/Notes delivers the functionality (and more) that the Exchange/Outlook combo offers. That’s all that counts. If you do IT for as long as I have, you realize that users always complain. They hate SAP, they hate Oracle, they hate Office, they hate their old IE, they hate their work, they hate paying taxes, they hate traffic jams, they hate the weather, etc.
This is the job of the IT professionals in your company. They do all this stuff. And don’t think that such work is not needed on Windows too.
It is supported. I personally would make some things different but I am not IBM.
It is supported, but they can’t make an installer that installs a working client. It works on Windows and OS X.
The Windows version also has its share of problems. A number of people use Outlook (which I don’t like/use) with a Domino connector. This makes their Outlook crash three times a day and forces them to reboot their PC. They would rather do this than use Notes.
There is a lot not to like about Notes, but I only have 6000 characters left. Even if you forget everything there is to dislike about Notes, you’re still left with a bloated product. People here only use it for email and, some of them, the calendar function. Not even the to-do list. For this, Notes is overkill, and then some.
If one would use Notes what it is intended for it is probably something wonderful.
I’m now in the process of convincing our parent company to let us get rid of Notes and Domino.
Notes is the first and only thing new users complain about. When I need to send a quick mail, I sometimes revert to my private mail client because it’s a lot snappier.
Outlook 2010 isn’t a diamond either, but at least it’s a true email client. Well, intended to be one.
Well… don’t force me to go into DEB packaging issues.
LOL. A crashing Outlook forces them to reboot their OS? That speaks volumes about the quality of the OS. Sorry, but no client application should be able to knock down an OS that easily and that constantly.
So you have a wonderful application and application server that lets you easily build rich applications, and you use only the email part (which is just an application on top of Domino), and based on the problems you have there you say Notes is the problem. Funny.
By that logic, if you use Opera/Chrome/Firefox/Safari with a web application (for example Google Mail or Microsoft Hotmail) and you have problems with it, the problem is not the application but the browser.
Notes can do way, way more than Outlook. But if you just need mail, then Notes is overkill. Even Outlook is overkill. For mail you could use a gazillion other applications that are faster, better, etc. If you need calendaring and scheduling, then Outlook is fine.
We can agree on that, however Outlook is included in Microsoft Office so you paid for it anyway (if you bought Office of course). A lot of third party software/hardware solutions have plugins for Outlook. When Notes is mentioned people are not so sure it will work.
Personally I don’t like Outlook or Notes. Outlook 2010 has the worst looking interface of all Outlook versions. Hell, I’d rather use Mutt than Outlook.
Oh yay, someone brings up one of the things I hate most about Windows: enforced file locks. Locking a file is generally a bad idea unless you absolutely need the file to remain static for a certain length of time. Enforcing a lock on any file operation, as Windows does, and not keeping track of which programs have acquired which locks (also as Windows does), results in a mess. A program crashes, the locks don’t get released (with no official way to manually release them), and the system needs to be rebooted. In my experience, file locks are one of the most common reasons to reboot a Windows box, and it’s unnecessary. Resource locks of any kind should be placed in the hands of the application developers, while the OS provides a method (hopefully automatic) for removing locks that are no longer valid after an application crash. This problem might not sound like one that affects end users drastically, but it does. If Microsoft removed enforced resource locks, and removed the registry at the same time (or at least limited it to config storage with no ability to run anything), I really wouldn’t have anything to complain about in Windows 7.
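For what it’s worth, this is roughly the model Linux already uses: locks are advisory, tied to an open file description, and released by the kernel the moment the holder exits, crash or not. A quick sketch with the flock(1) utility (the lock file path is arbitrary):

```shell
lock=/tmp/flockdemo.lock
exec 9>"$lock"                 # fd 9 holds the lock for this shell
flock -n 9 && echo "lock acquired"
# another process cannot take the same lock while we hold it...
flock -n "$lock" -c true || echo "second locker refused"
# ...but the lock is advisory: the file itself stays writable
echo data >> "$lock" && echo "file still writable"
exec 9>&-                      # closing the fd releases the lock (a crash would too)
flock -n "$lock" -c true && echo "lock free after close"
rm -f "$lock"
```

No reboot is ever needed to clear a stale lock, because a lock cannot outlive the process (or fd) that took it.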
Weren’t you the one writing that you manually need to google and download some library files? Then that DEB package is not doing its job right. The DEB package is written by IBM, and it installs a binary application without its obviously needed dependencies.
I see two problems and have one question:
1) The IBM-made Outlook-Domino connector is able to make Outlook crash? I don’t think IBM is using any undocumented API calls, so Outlook is the one to blame. It could also be that the IBM connector is the problem; I just don’t know. But Outlook surely shares the blame for crashing.
2) Not releasing locks? Well… this is definitely something that should be managed inside the IBM Outlook/Domino connector.
And now my question: have you reported this issue to IBM and Microsoft? You know that software does not fix itself auto-magically?
Then just don’t build anything new and use what is already available out of the box.
You have many other things forced upon you too. So you can invest all your energy in fighting a (probably fruitless) fight, or you can accept the decision of your company or parent company and live with it.
Well… what should I say here? I have known Notes since version 1.01 and used it for many years (in recent years I am not using Notes for mail). The newer 8.5 release is pretty fast. Just give it the hardware it needs and it will run fine. You know that every piece of software is slow on slow hardware?
I am certified on Domino/Notes and the quality of Notes/Domino has increased very much in the last years. As a person that can develop/administer on Domino I can say that there are bugs in Notes/Domino. But calling the software buggy is wrong.
Just for mail (you just use mail. Right?): NO
In some aspects it is different than Outlook but not much more complicated than any other mail application that has +/- the same functionality.
Well… for mail alone: YES
I would be 100% with you for anything below release 8. But after 8.0, Notes is not that unintuitive.
NO! A PHP error is not a browser issue. But you use one application (mail) that runs on top of Domino/Notes and because that application is not the way you like it, you blame Domino/Notes for it. This is wrong. Would be pretty much the same if you would say that C/C++ and Java are the problem because the Notes client is written in C/C++ and in Java (if you use the standard client and not the basic client).
No comment.
I wish you luck with that. Imagine there is a small subsidiary of your parent company where all the workers are hardcore Mac users. Imagine them trying to convince your parent company that Windows, Exchange and Outlook are the wrong platform. What do you think your parent company will say?
This is just because you don’t know the Notes/Domino world. If your business were built around IBM solutions, then asking a business partner about plugins for Outlook would put question marks in their eyes. It’s the same everywhere.
Welcome to the club. I used to like Notes, but in the past years I have moved away from it. I know the product pretty well, and I know what the Domino server is capable of doing with or without the Notes client. I am just as fluent in the Microsoft world. And trust me, nothing is perfect. NOTHING. Maybe it’s my age? I don’t know. The older I get, the more I like lean and easy solutions. Notes just for mail is a killer.
btw: I am not trying to convince you that Domino/Notes is the best solution for you or your company (even if your company is using it right now). All I want to say is that if you need a groupware and/or messaging solution on Ubuntu, the Lotus Notes client is an option.
About 12 years ago, I worked for a company that used Lotus Notes. The users absolutely hated it. It had the most unintuitive user interface I have ever seen. The IT department got more support requests for Lotus Notes than any other software. Really, it was an awful product. Granted, I haven’t used it in 12 years but back then, it was a user interface nightmare.
A lot has changed since then. The Lotus Notes client has for some time been transformed into a product that is no longer the client you knew 12 years ago.
I could tell you that Lotus Notes has existed for ages, and that when it was first developed some design decisions were made for the client that influence Lotus Notes to this day. It was a time when there was no consensus on how to build a multi-platform client application, etc.
But that was 20 years ago, and it is not an easy task to transform an application with such a huge install base overnight. It is difficult to make changes and progress without disturbing the current install base. And IBM is trying to do exactly that with every new release: they make subtle changes, introduce new things, and slowly move the product toward more intuitive handling.
I am not trying to defend the product, in no way. I know how easy it is to declare some application hell to use. But if you were in their shoes… would you be able to do better? I myself can say that I would surely be able to do some things better. But who am I? And who knows what resistance a >150-million user base would put up if I started changing the user interface?
That’s not how I remember it.
Rather, there was a time when everyone was using Lotus Notes. Notes, however, was a notorious UI nightmare (see http://bible2.net/content/_media/download/apps/notes/screenshotr7.j…). That’s a major reason why Microsoft managed to gain market share: Outlook’s interface was much more intuitive and sane compared to the Notes UI disaster.
Recent versions of Lotus Notes are supposed to be much better, but let’s not rewrite history.
Recent versions of Lotus Notes are much better.
Nice article,
there is something wrong with the link to Distrowatch (in the footnotes at the bottom) though.
Totally off topic, but I just did the unthinkable with Ubuntu on my wife’s computer. She needed a new printer, so I waltzed into Best Buy and got a Brother multifunction laser printer on sale without stopping to consider the operating system.
Plugged it in, the CD had printer drivers, the website had scanner drivers, and 10 minutes later everything worked as expected.
When did Linux start getting along with printers so well?
i bought a samsung laser printer and i didn’t have to do anything
2004
And as a nice side bonus you won’t have to install gigabytes of crapware just to get the driver!
When Apple computers got important ( NOTE: they use CUPS)
Regards
Use? Heck, Apple outright owns CUPS these days. It’s one of the FOSS projects they acquired.
GPL…
Sure, it’s still under the GPL which means the current source code can be forked should Apple try and close it off. This doesn’t change the fact that Apple is not simply using CUPS but actually owns it as of 2007.
http://apple.slashdot.org/story/07/07/12/1342258/CUPS-Purchased-By-…
If you are having trouble with your wireless, try using Wicd instead of the GNOME Network Manager. For some reason GNM doesn’t work with a lot of wireless cards.
Some basic scripting, wpa_supplicant and iwconfig have provided network stability that I’ve never seen from network-manager or Wicd. Granted, the last time I mucked with Wicd was a while back. With NetworkManager, though, rebooting the wifi router should not cause a workstation networking crash.
A quick aptitude purge network-manager network-manager-kde and my connectivity has been rock solid. Life is good.
(not that I’d suggest cli and script network management for any old user though either.)
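For anyone curious what the scripted approach looks like, the core of it is a small wpa_supplicant config; the SSID and passphrase below are placeholders, and the interface name will differ per machine:

```
# /etc/wpa_supplicant.conf -- hypothetical home network
network={
    ssid="HomeNet"
    psk="correct horse battery staple"
}
```

Then something like `wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf` followed by `dhclient wlan0` brings the link up (`wpa_passphrase` can generate the psk entry for you), and purging network-manager as above keeps the two from fighting over the interface.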
I am a long-term Fedora user on the techie side, but I like the Ubuntu brand. A similar article focused on Fedora would be good. Perhaps it could include the comparisons from this article (i.e. Windows, Ubuntu) for reference.
Otherwise, not a bad discussion. Not much mention was made of the Unity stability issues of their last release or the kernel regressions. I think this tainted the Ubuntu brand, which is otherwise very good.
Thanks for the article!
i have a laptop that works without problems with ubuntu 10.10 and doesn’t work with 11.04
The geek always posts the most outrageous price he can quote for Windows, retail boxed.
It doesn’t matter that no one buys the retail box.
Or that no one wants to be drawn into a DIY system install without the protection of a warranty or service contract.
The upgrade-in-place, maybe. The academic upgrade, maybe.
There are enormous economies of scale in play in the production and marketing of the OEM Windows PC. The truth is that by the time product reaches retail shelves any advantage the “free” OS might have had on price is gone.
Go here for “warranty or service contract”:
http://www.ubuntu.com/business/services/desktop
The service will ensure this, and if you don’t like the service or price at Canonical, you can actually get support from others, like the local support you get in Norway from Redpill Linpro.
just my two cents
cheers
I use Debian and Scientific Linux (SL). I don’t use Ubuntu because I do find it buggy.
I am also the same class of user who would not be running Fedora either.
There is such a thing as too cutting-edge. I have found that when I do play with Ubuntu, inserting a Linux kernel from kernel.org is hard. The reason: Ubuntu includes patches in its kernel that are incompatible with mainline.
With Fedora, SL, Red Hat, Debian and most other distributions you can drop in a default kernel simply.
All studies of open source and Linux quality have been done on the mainline branches, not the Ubuntu sub-branches. I would not be shocked if those Ubuntu sub-branches are extremely bad.
One classic hardware case of mine was a wireless card that would not work with Ubuntu. Yet a stock kernel from kernel.org, same version, worked: the Ubuntu custom patches broke it. It would have been simpler to replace Ubuntu with another distribution than to go through the risky process of finding the extra patches Ubuntu applied and trying to back them out without breaking the driver.
This is only one part out of many thousands where Ubuntu is flawed.
One test of whether a distribution is anywhere near good is whether it works with kernel source downloaded from kernel.org and built into a kernel without patching. If not, the distribution is more likely to have problems.
A lot of Ubuntu’s trouble is its persistence in running a different version of AppArmor from the one in the mainline kernel; worse, a version that is not profile-compatible yet puts its profiles in the same locations. A little disk space to keep an AppArmor profile for a stock kernel alongside a profile for the custom new version they are testing is really not too much to ask.
These little things are the difference between a distribution built for quality and a distribution just built on the bleeding edge.
Ubuntu is a bleeding-edge distribution, even in its LTS versions.
Also, you don’t see Red Hat or SL claiming to be high quality because some third party said Linux has low bug counts. Red Hat and SL pull up documentation showing their quality control and everything they do to make sure their production and alterations don’t ruin the base quality, and then point to the base quality they are working from.
Ubuntu people, stop the FUD marketing, please; you make every other Linux look bad with your poor quality controls. Yes, good quality controls always include a way to install stock mainline to rule out alterations as the cause of problems.
I used to use Ubuntu as my desktop OS. Nowadays I only use it at work, as a Java developer. I do love the open-source nature, but Apple makes some REALLY cool applications that I like a lot. Yeah, I may be selling my soul, but I finally can edit a freaking video easily and without bugs (“it just works!”), I can record my sounds with amazing guitar/bass/drum/voice effects using GarageBand and Logic, and I can still use every single piece of software I used to use under Ubuntu. It is just the best of both worlds. Yeah, I do spend more money than I used to. But guess what? I don’t care. If I have to pay a few bucks for decent software, I will (happily) do so.
Oh, yes, I do know Ubuntu Studio. And it does not (to me) substitute Logic nor iMovie.
Cya! And have fun.
that is exactly it, isn’t it. the extra quality is worth the extra cost.
ubuntu can’t get any cheaper, so they’ve got to increase the quality significantly to gain share.
Logic Studio and GarageBand are not part of Mac OS X and have to be bought separately.
I don’t know enough about that field to say whether there are other commercial applications available that do the same on Ubuntu, or that run fine in WINE, or whatever.
“By any measure Ubuntu is not bloated compared to Windows.”
By any measure Ubuntu is not as bloated as Windows.
Fixed. Ubuntu is still bloated, just not as badly. Looking at raw numbers given by the vendor is also a somewhat poor way of measuring bloat… that is assuming you can trust the vendor in the first place, and if you’d try running a service pack of Windows XP, you’d find that it easily requires memory in the range of Ubuntu. Especially once you’ve installed a few programs and services.
Your comparison of the latest version of Ubuntu’s “system requirements” vs. previous versions is also stupid, as is comparing to the system requirements of other Linux distributions.
Real world performance tells a hell of a lot more than posted recommendations.
Honestly, based on the first few sections, I can already tell this article is a complete, pointless waste of time to read (and a long one at that).
So could you please post your real-world comparative performance measurements?
I’m not the one writing articles, am I?
Here’s a little hint though. Any of the Ubuntu variants, vs. one of the following distributions with the same desktop environment… chances are that all of the below will be faster than the equivalent Ubuntu variant:
– Their own parent, Debian
– Slackware
– Arch
– just about any Debian (NOT Ubuntu) derivative
– just about any Slackware derivative
Here’s a little hint. They’re all free. Why not do a fun little experiment and find out for yourself? The low number of daemons running by default in all of these distros (besides Ubuntu) really makes for a snappier feel, and Slack (and derivatives) & Arch are known for their simplicity–further improving performance. Ubuntu is known for… well, doing things at the cost of performance and resource-friendliness.
If you absolutely must have actual numbers and don’t want to find out for yourself, search the Web… articles of the various Ubuntus vs. other distros are all over the place.
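If you do want numbers of your own, a rough first pass is simply to compare what is running and resident right after boot on each install; no benchmark suite required:

```shell
# crude bloat snapshot: process count, memory use, and listening daemons
echo "processes: $(ps -e --no-headers | wc -l)"
# free shows how much memory is really in use once caches are excluded
free -m
# which daemons are listening on the network (ss or, on older boxes, netstat)
ss -tlnp 2>/dev/null || netstat -tlnp 2>/dev/null || true
```

Run the same three commands on a default Ubuntu install and on, say, a default Arch or Slackware install with the same desktop, and the difference in daemon count is visible immediately.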
OT because I don’t have stats for Ubuntu, but I did take my standard Fedora configuration and compare the amount of space it uses (excluding /root and /home) with a similarly-configured installation of Arch. I’m running an LXDE desktop in both cases. Results: Arch is less bloated by a whopping 2% (2462 MB vs. 2516 MB). I expect Ubuntu to produce similar results. Obviously, this is a big YMMV situation, but I thought the results were interesting.
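For anyone who wants to reproduce that kind of measurement, numbers like the ones above can be obtained with du; the exclude list is a guess at the setup described and may need adjusting per system:

```shell
# total installed size in MB, skipping user data and other mounted filesystems
du -xsm --exclude=/home --exclude=/root --exclude=/tmp / 2>/dev/null
```

The -x flag keeps du on one filesystem, so separately mounted /home or /var partitions are skipped automatically.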
All blow, no show.
The problem with this is that most of us Windows users would want it to work equally as well with newer Windows programs. For example, I get new versions of MS Office for about $12, due to the corporate discounts I get with my company. So, if you feed Office 2010 to Wine, how well will it work?
Last time I tried Wine was about 3 years ago. I found it worked ok for a handful of apps, other apps ran with serious glitches, and most other apps wouldn’t run at all. And for the apps that did work, sometimes it took a bit of voodoo to get them running properly. In short, Wine was a complete waste of time. I mean, I’m sure it’s ok if you just need ‘that one app’ to run, but as a general solution to run Windows apps, it wasn’t really a solution at all.
And even back then, folks were offering Wine as a solution for those who needed to run Windows apps, even though it wasn’t even close to being ready for prime time.
For this reason, even though I really want to know how far Wine has progressed since i last tried it, I don’t expect Linux evangelists (who seem eager to feed people whatever lines of bullshit they feel is necessary in order to get them to switch) to give me a genuine, honest answer.
The other side of the coin is that almost everything of interest in FOSS to the non-technical end-user is ported to Windows or begins as a native Windows app.
MS-DOS, Win 9x and Win XP programs get retired for many reasons other than compatibility (age not least among them), and while you may not be able to afford the latest and greatest version of Product X, you probably can afford to step up to something significantly newer and better than what you have.
Gog.com illustrates another side of the problem with WINE and DOSBox. Windows gamers can be easily persuaded to part with $5 to $10 for a known-good port to Win 7. Especially when it comes bundled with links to fan-made high resolution graphic upgrades, patches and other goodies.
run virtual box then, this seems like the best option to me
So your answer is to pay for Windows and then USE Windows and keep Linux around for…what? geek cred?
As a retailer that tried, from Ubuntu 6.06 through 10.04, to actually carry Ubuntu, let me tell you what is broken and why I will NEVER mess with Linux again… FIX YOUR FRICKING DRIVER MODEL ^%^$^%$&^! It is 2011 and you STILL don’t have a hardware ABI? What kind of Mickey Mouse bunch is this? Fricking OS/2 has an ABI, as do BSD, Solaris, OSX and Windows… WTF?
Here is what happens with retail PCs and Linux: 1. Install Ubuntu. 2. User takes it home, sees it needs updating, runs updates. 3. DRIVERS DIE HARD! 4. Is there an easy “update drivers” button like XP has had for a decade? Nope, it is the “fun” of forum hunts!
Protip: Home users will NOT DO forum hunts! And my time costs a minimum of $35 an hour, at that price a SINGLE forum hunt quickly evaporates any and ALL money I could have saved by going Linux!
A wise man once said “Linux is free if your time is worthless” and no truer words have ever been spoken. I have XP boxes in the field going on 9 years without needing anything more than hardware upgrades when the users wanted more power. With Linux? I never got a single one, NOT ONE, to come through the 6 month update deathmarch with 100% functioning drivers! Not once!
So if you want to know why no B&M will touch your product, why after 20 years you are at 1%, as a retailer let me tell you. it is because Linus Torvalds is an @sshole that treats the kernel as his own little toy instead of a multimillion dollar project, and when added to all the shims and crap depending on which kernel version you have you end up with a mess. Frankly you could tie $50 bills to copies of Ubuntu and I’d NEVER touch that crap again, ever!
Linux with a stable binary ABI for drivers is an invitation for proprietary drivers and the extinction of Linux. Not having a stable driver interface is surely inconvenient but I think it is probably essential to maintain Freeness. It sucks but it’s true.
Have you ever heard of “The King is dead. Long live the King.” ? Maybe Linux has to die in order to revive.
Most end users don’t give two cents about “free”. They value “works out of the box”, “free of hassle”, “stable”, and “I can download and run WoW right now” far more.
I understand the desires of most users. I am not here to promote the desires of most users. My interest lies in having a Free operating system in perpetuity.
If you want to promote user interest in this area then there are plenty of operating systems, open source ones in fact, that can be promoted for this purpose. Haiku springs prominently to mind.
Don’t be too hard on Linux for being what it is.
I disagree. Even with a stable binary ABI, F/OSS devs would keep developing free drivers just as they have all along; it simply has no effect on that. It would only affect corporations, and well, there ain’t too many corporations providing drivers even as it is. Having a stable binary ABI might entice at least a few more to provide drivers. And the ones already providing them wouldn’t change the way they develop their drivers just for change’s sake; they would only waste time and money.
All in all, I think it wouldn’t affect any existing drivers at all, and might bring a few more closed ones to the table for those people who need them.
As for actually maintaining a stable binary ABI? Well, that’s a completely different problem. I can’t say I know Linux kernel programming enough to be able to make any informed opinion.
This is one of my biggest gripes against Linux – they break drivers like it is an expected part of running Linux. When this happens, all the nerds just smile and say, “Ah, it happens. You just need to …”, and proceed to give you a long, drawn-out solution that might work. It is OH SO tiresome.
I still run Linux as my main OS, but at times I feel pretty beaten up by the experience. I do think part of the problem is Linus and his never-ending tinkering with the Kernel. For him, it is a march towards a perfect system. I do know, that he does take a dim view of devs breaking things that used to work just to fix some “bad code”. Nevertheless, the march continues.
OTOH, I think the distros are partly to blame. As this article mentions, distros like Ubuntu have traded stability for “new features”. Even in the LTS versions, a kernel update will take out hardware. That is not acceptable in an LTS version.
But, sadly, I also have to bear part of the blame. There are distros that don’t do this: Debian stable, RHEL (pricey), and Scientific Linux and CentOS (free). Even Slackware stays closer to vanilla Linux. But as a twitchy end user, I tend to always want the latest and greatest. Maybe I could live with fewer new features.
Perhaps it is time for a Vendor to step in and fill this gap. Maybe someone could create a truly-stable Linux, and market it well. I’m not sure it would work, though. Feature-itis has been around since the earliest days of computing. It is hard to resist!
Actually it works rather well. For the last 2+ years I’ve been running Office on my Linux “Corporate” workstation. True, the earlier revs were a bit buggy, but I’ve been running Office 2007 Pro and it works perfectly. The only app I use in it is Outlook; for Word and Excel docs I just use OpenOffice. This all works very nicely.
Mounting CIFS shares is simple: I use mount.cifs and it also works nicely. Now for the very “rare” occasions that I “must” use Exploder, I just log in to our corporate MS terminal server and run Exploder there. One could also use VMware; with VMware’s “Unity” it’s another smooth option.
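For readers wondering what that setup looks like in practice, a CIFS mount like the poster describes can also be made persistent via /etc/fstab; the server name, share, and credentials path below are placeholders, not details from the comment:

```
# /etc/fstab entry: mount a Windows share at boot via mount.cifs
# (hypothetical server/share names; the credentials file holds username/password)
//corpserver/projects  /mnt/projects  cifs  credentials=/home/user/.cifs-creds,iocharset=utf8  0  0
```

Keeping the password in a separate credentials file avoids exposing it in the world-readable fstab.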
Bottom line is I’ve been running Linux in a Corporate Windows environment for some time. I have hardly any issues compared to my cohorts who run a Win 7 desktop.
I run Ubuntu (actually Kubuntu). I like it because it’s very stable. Everything just works, like EFI booting, hibernate, and video drivers. I have tried other distros like SUSE and Debian but nothing runs as effortlessly as Ubuntu on my PC.
Yes it is a very full-featured distro, and I can understand why some would call it bloated. But I have a very fast PC and I want a full-featured OS, so Ubuntu is just what I need. If you have a slower PC there are other lightweight distros available. Ubuntu can’t be all things to all people.
Ubuntu is a mess; they’re changing everything all the time. That’s cool for 16-year-old geeks but not for normal people.
Stop with the “creativity” please. We don’t need a new OS every 6 months.
I think 10.10 was their best release. I have no idea what they are doing now with Unity. GNOME 3 seems like a better idea.
So I wanted to try Fedora 15 to see what GNOME 3 is like, but it didn’t work at all. In a VM (KVM on Ubuntu 10.10) it just crashed (yes, a kernel panic! I hadn’t seen one of those in 10 years or so).
Booting from a CD didn’t work either; the video driver did not give me a screen I could work with. Just Fedora blue with bits and pieces of the login screen showing through.
Even though the video driver I’m currently using for this card is just the standard open source driver that’s part of the kernel, and it has been working in Ubuntu for about a year and a half.
I think the thing about making the big changes now is so they have time to polish it for when they go mainstream. Changes have to happen; just be glad it’s happening now, not later.
Amen! The 6 month model has to go!
1) What is Ubuntu’s greatest strength?
The forums. Ask the same question fifty times and you won’t get flamed. No “RTFM” or “See the man page for xxxx(4).” Windows users are very needy and want very specific solutions to their issues… click this, check that, type this in a console even though you don’t know what it does, problem solved. The forums are very friendly to new Linux users in that regard.
2) Are any of the criticisms listed here valid?
The system requirements / bloat argument is not valid in comparison to other distros. That is a Windows-ism left over from the days of software coming in a stand-alone box that depended on nothing but the OS and listed hardware requirements on the side. Regarding system requirements, most of the perceived lag in user interface these days is a GFX card issue. A smooth UI on Linux is more than having a Pentium IV when the box only “requires” a Pentium III…it is finding that perfect combination of the right kernel, the right drivers (open vs blob), the right version of that driver, and a window manager that won’t shit itself if this configuration isn’t perfect (KWin).
Regarding bloat: unlike Windows, a distribution is just a collection of different software by different people working together… it can be as small or as bloated as you would like. This granular approach to building a system is best experienced with the *BSDs or Arch Linux… Ubuntu just provides the combination of tools Canonical believes makes the best desktop environment. You can add or remove to your heart’s content… that’s why Kubuntu/Fluxbuntu/whatever exist. Continuing that thought, I never understood the argument that all software should be on the LiveCD, or people who complain when xxx package is left off. Nobody ever sticks to what the LiveCD gives you. That is another Windows mentality. We have had working software repositories for almost twenty years. Use them.
3) If you could ask the Ubuntu team to fix one thing or improve one area, what would it be?
Buy Qt from Nokia, and truly differentiate yourself from your RedHat overlords by embracing KDE.
I can see so much potential with it, but it has no fit and finish. It changes far too quickly for any company to maintain. Slow down, be surgical in what is fixed… Create _the_ desktop Linux standard for companies to release to… then make the Ubuntu software market top notch, and watch what happens.
Linux has 20 years of core features being solidified. Rest on those, and nail the user experience.
Me too.
Stability, reliability, great ease of use… instead we get new features crammed in and a bug cycle that never quits.
Good to see that so many other posts agree with us. Maybe Canonical will read these posts and get the message.
May I inquire as to whether you also believe in Santa Claus and the tooth fairy?
You guys are so misguided.
Companies, and corporate users are going to stick with the LTS releases for the very reason you just put out there. Because the normal six-month cycle is too fast. The LTS releases have a great time table, and historically have been pretty stable for the server packages. They seem to have quit making the catastrophic desktop changes for LTS releases as well, giving you a stable desktop to work from.
We use Ubuntu Server, LTS releases, in our production environment at work, and we’ve had great success with those. If they weren’t LTS, I’d be pulling my hair out re-qualifying software every six months. Now, I only have to test a few targeted fixes in our test environment before we update the production boxes.
Somewhere around horrid harpy, or impish incubus was the apogee. Beyond that new versions broke more things without fixing old things.
Some of these may be dated, since I stopped around jaundiced judas.
Take grub 2. Please. It can’t recognize bootable DOS or Hackintosh partitions. It will find the hackintoshi, but will write some horrid long stanza that won’t boot. No, it will write two, one for 32 bit and one for 64 bit but neither will boot. This was an early beta at the time but became part of the release. Big mistake. Replace something that works and is reliable with something dodgy and half-broken, but that is Ubuntu. Fedora Rawhide has less unstable stuff.
Wireless is horrible. It won’t find the AP at my desk unless I open a terminal and type “sudo iwlist scan”. It does remember the dozens of APs on my drive into work which are no longer visible. Bluetooth won’t let me pair half or more of my devices (I can tell it the PIN, but it ignores my choice).
My old laptop has a 1024×768 screen. It probes as VESA, which has that resolution as a mode but THEY THREW AWAY THE UTILITY THAT LET ME MANUALLY SET IT. Even doing something in xorg.conf is obscure. So I’m stuck with a blurry screen. It used to have a VIA driver that worked for everything but 3d (it did work but had a few artifact problems). Throw away the working, don’t leave any manual options, give you something broken.
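For what it’s worth, forcing the resolution by hand in xorg.conf is still possible even without the removed utility, though it is indeed obscure. A minimal sketch, assuming the generic VESA driver; the section and option names are the standard xorg.conf ones, the identifier is illustrative:

```
# /etc/X11/xorg.conf fragment: pin a 1024x768 mode on a VESA-probed card
Section "Screen"
    Identifier "Default Screen"
    SubSection "Display"
        Modes   "1024x768"
    EndSubSection
EndSection
```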
Then there’s the wireless deadlock. Install Ubuntu. Try connecting to the internet. It can’t – Oh, it doesn’t have a driver for your wireless card. It will download it from the internet. Oops, it can’t because it can’t connect. Oh, it doesn’t have… It is stupid not to leave the deb files for all the drivers somewhere so it doesn’t have to download them BECAUSE IT CAN’T DOWNLOAD THE DRIVER UNTIL IT IS INSTALLED.
Then there is the Annoytification. I get this huge, intrusive box that I can’t get rid of on my netbook, but this nearly invisible, tiny box on my HD monitor. This was to replace the standard Gnome notification – the one that had buttons so you could open IM or mail.
Now they were getting rid of the system tray – and replacing it with what? Nothing? When my battery is about to die, or the CPU is about to melt down? Oh, it is too ugly to have an icon say it, so we will just not inform you.
I really wouldn’t mind if they kept the FUNCTIONALITY and replaced less than optimal elements with better versions, but they keep breaking things and eliminating functionality and manual bypasses to the broken stuff, and if they stopped today, they wouldn’t fix everything until rotten ratfink.
But by that time, it should look very pretty, probably all in 3d, but you won’t be able to open a spreadsheet.
The system tray has been replaced with indicators, which pretty much do the same thing as whatever’s in the system tray, just more organised, easier to use, and consistent. So I don’t know what you’re complaining about on that one; you just seem misinformed.
The notification system is the most non-intrusive I can think of for any OS. Have you tried Growl on OS X? It’s a stinker.
The notification area in 10.04 is broken; if you read back through the forums, IT HAS BEEN BROKEN FOR YEARS… yet, as you say, new features are added, but the small yet consistently annoying niggles don’t get fixed.
Another example is the laptop touchpad driver. Years ago it worked perfectly, now after startup, you have to wait for it to resync before you can use it, otherwise your mouse flies around the screen for a few seconds.
Hardly showstoppers, but damn frustrating when you know THEY USED TO WORK FINE.
This applies to Linux in general, not just Ubuntu. Problems like this don’t give you a feeling of confidence with a product.
I forgot to mention the key thing that made me dump it (for Fedora right now).
I customized it. It was suffering from Mono, so I removed that, the annoytification stuff, and lots of other things in a recent version. Then there was an update.
The update failed (I don’t know why) but I managed to sync everything to get it working, but it put back all the trash and junk I spent time removing. It used to take me a few hours to tweak a new ubuntu install. Now it is an all day project and has to be done upon upgrade.
Good article, Howard. I am not an Ubuntu user, but based on my limited experience, I pretty much agree with your assessments.
As for “3. If you could ask the Ubuntu team to fix one thing or improve one area, what would it be?”, I would probably choose the trainable front-end for the firewall. On my work development system (Windows), I have a firewall that I have trained over the years to be pretty restricted. It is a nice feature to be able to create rules right from the “Allow/Deny” pop-up.
I don’t know where to get started on such a project, but that would be my first choice.
Having been an Ubuntu user for about 4 years, I don’t think I have ever needed to use the firewall.
Funnily enough, I’m posting from a 2x PII @ 300 MHz with 512 MB of RAM and it’s fairly snappy. htop says 161 MB are being used, with LXDE, Opera, and Mesa compiling in the background. I just don’t see this being possible running Ubuntu; it isn’t just the packages you install, it’s all the crap running in the background too that you would have to strip out, which quite a few other distros don’t have.
Another distro I quite like besides Arch Linux is SliTaz… it’s quite impressive what it can do on machines with 64 MB of RAM, or even less when installed to the HD.
When you are on hardware this old, one of the biggest problems is having a supported graphics card. I have a low-end version of the Radeon 9800, the 9800SE, and it does 2D well and 3D OK even on the AGP 2x bus. I pity all the Mach64 users out there :/
Criticism is a funny thing for those who don’t understand the reason or the goal of it. Usually, I also say the same thing, don’t like it, use something else. But when a distro becomes more well known than many others (even if some others might be better), then every distro will be judged by how this one presents itself – mostly by those who don’t know any better, but they are still the majority.
Also, if one becomes too well known, that has the implicit danger of becoming too influential (e.g. other distros might say OK, it sucks, but it’s what people like, so let’s follow suit), which might make other distros worse in the long run. Of course “worse” is relative, but you should get my point.
In generic everyday situations Ubuntu might just become Linux (as IE was the Internet, and still is to some extent). It has happened that when the answer for complaints was to mention the option of using another distro, that answer caused real surprise – well, not the answer per se, but the possibility of using another alternative. Many people didn’t get to know “Linux” or a Linux distro, they came to know “Ubuntu”, so no surprise there.
And sometimes Ubuntu also seems to think they are Linux, which they aren’t, they are just one option (they certainly would like to be the only one, no matter), and that’s also part of why I personally don’t like them, even if I sometimes recommend the use of Ubuntu. Also, I wouldn’t want certain Ubuntu solutions to propagate to all distros just because Ubuntu’s popularity. But I’m hopeful that there will always be enough people to keep at least one other distro alive.
One thing I really don’t like about Ubuntu is the lack of choice given in the installer (the text version is a little better than the graphical one because it actually has about a dozen package categories you can turn off or on, but it’s still not fine-grained enough).
Sure, have sensibly selected defaults so that users can just click Next/Next/Next…, but allow more advanced users to turn off or on not just package categories, but be able to drill right down to individual packages and decide to include or exclude those.
sshd and ntpd are two examples that, for some bizarre reason, Ubuntu doesn’t install by default and I’m sure there are plenty of other examples. You might think it’s just petty complaints, but remember that pre-installed Ubuntu is rare, so the installer has to cater for all its potential audience and its severe lack of package customisation will turn off a portion of them.
I think, for example, Ubuntu’s default install doesn’t cater for developers well at all and I’m also mystified why the default for DVD installation can’t include a lot more than the CD version (apparently, it’s just additional language packs, which is ludicrous). It’s why, when all is said and done, Fedora is a better distro for non-novices, IMHO.
Well, maybe that’s because they’re not even trying to cater to developers by default?
I mean, it’s pointless to complain about such when you know perfectly well that Ubuntu is by default aimed squarely at non-advanced users.
+1
Why would developers care about “default install”?
That’s actually a good thing for end users. No sense in confusing them with a bunch of install options. Just give them what most people need.
Most desktop end users don’t need sshd. And having it installed and running if you don’t need it is a security hazard. Why would you want to allow remote shell logins to your system if you don’t need to be able to do that?
If you want sshd, and such by default, Ubuntu Server edition might be more to your liking? You do know they have a separate server edition right?
It’s not possible for any OS install to cater to developers. And developers know what they need to install and will just do it afterwards. How do you expect an OS installer to cater to developers? Say I’m a Java developer: I need the JDK, and probably a good IDE. But I don’t need C++ compilers or kernel headers. What if I’m a Mono developer? I don’t need Java then, do I? Or say I’m a Ruby developer? I don’t need Mono or Java, but I do need Ruby.
There are way too many different types of developers for an OS installer to be able to take into account. As I said, developers know what they need, and will install it after the OS has installed.
Why would you want it to? Most of the stuff on the CD and DVD are outdated anyway by the time you actually get the CD and DVD and install it. It’s better, in my opinion, to install as little as possible from the CD or DVD and then pull the rest of what you need from online. After all, if you install it from the CD or DVD, chances are the first time you run Update, unless the release is very new, you are going to have several hundred Megabytes of updates to download anyway because the stuff on the CD or DVD is outdated.
This is the best article I have read in a long time.
It hits almost every nail on the head.
Though I had to google “sisyphean” and “ameliorate”.
I find it better to use VirtualBox as opposed to Wine.
Wine still misbehaves; simple things go wrong, such as not adding a link to the menu after an install. And it is no use offering to install proprietary software if the user has no clue what it does. We need something like “would you like to play a DVD? Then install abcd”, not a list of unintelligible 1980s file names thrown at us.
The price of success? Really? If you think Ubuntu is a success, perhaps you’re the one living in a world of fiction.
How do you define success then? Considering that Ubuntu is one of the most widely used distros around, with a huge number of users ranging from complete novices to advanced users… wouldn’t that be success?
Success doesn’t mean being the biggest or reaping the most money in, it means reaching a goal. And gee whiz; one of Ubuntu’s goals was to become one of the most popular distros among non-advanced users, and they’ve reached that goal.
Ubuntu is the one-eyed man in the land of the blind. No one else is even trying to take Linux to the masses anymore. Showing good judgment, I might add.
See above, that wasn’t particularly difficult. Ubuntu is failing miserably at meeting the main goal that was set for it:
https://bugs.launchpad.net/ubuntu/+bug/1
Maybe the real issue is that goal was never realistic in the first place. That’s the difference between hype and hard facts.
How about Google?
Oh, come on.
Chrome OS is basically X, a Linux kernel, a userland, and Chrome… it could be running pretty much anything underneath (BSD, Solaris, Windows, it doesn’t matter)… the important thing is Chrome.
And if you run KDE 4.6 you could be running Linux BSD or Solaris underneath it really doesn’t matter?
The point is that with Chrome OS and Android the underlying OS isn’t exposed to the user unless one adds in particular apps to do it … however with KDE/Gnome/Xfce the underlying OS is quite easily revealed.
1) What is Ubuntu’s greatest strength?
* Marketing
* Installed base
* Standing on the shoulders of giants
2) Are any of the criticisms listed here valid?
Yes, Canonical experiments too much and this causes major inconvenience to users.
3) If you could ask the Ubuntu team to fix one thing or improve one area, what would it be?
Stop using your users as lab mice and breaking everything with new releases! Test upgrades on as much hardware as you can get your hands on!
Why would you need a firewall? By default, Ubuntu ships with no open ports on public interfaces. That means that in its default configuration, a port scan on an Ubuntu machine would show exactly the same result with or without a firewall. That’s a big difference to Windows, where (at least until XP) the system shipped in a vulnerable state, so it became almost a reflex to install a firewall immediately after a fresh install.
Now, whenever you install a new server program, you usually want its public ports to be reachable — that is the whole point of installing a server program. Having to configure the firewall after installation is just an additional step. If you don’t want that program to open a public port (e.g., MySQL or Apache installed locally for testing), you can just disable that in the program’s config files. I can’t think of a single server program I ever installed (except MySQL and Apache, see above) where I didn’t want its ports to be open. In contrast to many Windows programs, Linux programs usually don’t go about opening ports when it’s not absolutely necessary.
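The “no open ports by default” claim is easy to check for yourself. As an illustrative sketch (not anything Ubuntu ships), here is a small Python probe that reports whether a TCP listener answers on a given local port; the ports listed are common examples, not a definitive set:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP listener accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # connection refused or timed out: nothing is listening
        return False

if __name__ == "__main__":
    for port in (22, 80, 631):  # ssh, http, cups -- typical things to check
        state = "open" if port_open("127.0.0.1", port) else "closed"
        print(f"port {port}: {state}")
```

On a default desktop install you would expect all three to report closed, which is exactly the point the poster makes about a firewall adding nothing to a port scan.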
That said, I have to admit that a firewall might be useful for newbies who might accidentally install a server program without knowing that it will open a port.
I completely agree, exactly my thought. A firewall is hardly needed in Ubuntu.
The only thing which is installed by default and listening on the network is the Avahi-daemon.
Personally I think the Avahi daemon could be configured a bit more strictly, but that is about it (I think the default is for compatibility with old Mac OS X versions or something).
A firewall on by default could help with installing daemons. But I think if that were on by default, an install script for such a daemon would probably also open the port on the firewall during installation.
Or at least do something along those lines to make it easy to do so.
Agreed – I’ve seen users mess up personal firewalls in Windows, leaving a box that can’t connect to the Net, get DHCP, etc. As a desktop, does Ubuntu need a firewall if the ports are closed? It’s different on a server, but on the desktop it’s an unnecessary complication.
The sin of Ubuntu is that many people think Linux IS Ubuntu, or that it’s the best distribution available since it’s been No. 1 on DistroWatch for six years. And when they find Ubuntu has lots of bugs, they conclude, “See, Linux is not ready yet.”
But in fact there are a lot more distributions out there, and the bugs they blame are usually Ubuntu-specific. Ubuntu is so popular that Linux ends up bearing its sins.
The quality of free desktop distros is not as high as that of the for-pay desktop OSes, Windows and OS X. For the free distros to compete they need to change how they operate, or find a new business model to pay for quality…
…trying to convince himself he’s made the right distro choice.
It neglects to address any of the important problems discussed around Ubuntu over the last year and instead just invents a lot of not very interesting new ones.
The interesting questions over the last year have been:
1. Ubuntu does very little actual upstream work despite their popularity.
This isn’t a particularly fair comment. While Ubuntu is heavily used, Canonical earns very little money from it compared to, for instance, Red Hat. Red Hat probably does ten times as much upstream work as Ubuntu, but they also probably earn ten times as much as Canonical from Linux, so they can afford to.
2. Ubuntu just tramples over upstream with little or no interaction backwards.
When Ubuntu DOES do upstream work, they rarely discuss requirements properly with the upstream product. Instead they just implement something (possibly requiring copyright assignment from contributors) behind closed doors and then get surprised when upstream doesn’t accept it afterwards. All the other major distributions do this better. Red Hat, for one thing, had an open discussion with the community about their ideas for GNOME Shell and developed it WITH the gnome community out in the open (with no copyright assignment). Consequently, GNOME Shell is the default upstream user interface, not Unity.
3. Ubuntu rarely consults anyone about changes, but rather just “does what Mark Shuttleworth likes”.
GNOME is also a bit guilty of this. However, there is a major difference: GNOME upstream at least has the discussion in the open, where people can participate, before any decisions are made. I.e., if you care enough about the direction of GNOME, you can participate on live.gnome.org or the mailing lists. This is not true for Ubuntu, where decisions are only made public after they have been made.
Personally I like using a popular distribution, because it is well supported. But Canonical has taken some fairly massive steps to alienate me recently and so rather than upgrade to 11.04, I decided to install Fedora. I immediately missed some quality assurance, especially with regards to NVIDIA drivers and wireless, but I much prefer GNOME Shell over Unity so I’ll have to see whether I will switch back or not.
Yes, these are actually the issues I and everyone else I know have with ubuntu.
Most of the ones in this article are more general Linux Desktop issues that affect several distros.
Of course. Canonical needs to think beyond Ubuntu. What Canonical is doing is making a dangerously risky investment in a single business direction. For example, they could make a better sound component to replace ALSA and PulseAudio for any Linux distro.
If they want to be all experiment-y, here’s an idea. I think there is a good need for Canonical to make a brand new OS from scratch alongside with Ubuntu.
I installed Ubuntu on my mom’s computer system a couple years ago, and she has been very happy with it. She likes to play card games and various strategy games and such, and she especially likes the fact that she can install any game she wants from Synaptic without worrying about viruses and malware. I don’t see why anyone would say Nautilus is difficult to use. It works like almost any other file manager from what I can tell, and is not that different than Windows or OS X. My mom didn’t have any problems with it.
There are only two complaints I have about Ubuntu, really:
1. Distribution upgrades are usually problematic enough that end users can’t perform them on their own. For example, when I upgraded from Ubuntu 10.10 to Ubuntu 11.04, it left the system unable to boot into GNOME. I had to manually reconfigure video drivers from the command line, download the wifi drivers on another computer, copy them over via USB, install them, and install the kernel source before I was able to get networking back and restore full functionality. This is something my mom, of course, never would have been able to do on her own. They definitely need to do a better job of testing their upgrades on different types of hardware, and of making sure the upgrade doesn’t do stupid things, like remove a no-longer-compatible video driver but then not replace it with anything (which is what it did in my case).
And my other gripe is that Unity sucks. Sorry, but it just does. That also would have confused my mom terribly if I hadn’t set it up after the upgrade to use “Ubuntu Classic” by default. In my opinion, they should have made that the default and given users the option to try Unity on first boot, and then made it easy to go back to Classic if they didn’t like it, instead of forcing users into an entirely foreign desktop. And of course, without help, people like my mom probably would not have figured out that they can get back to the “old” desktop from the selector at the bottom of the login screen.
You never even tried your mum on it?
My mum uses Unity, mostly without a problem. If your mum is like most mums, then she’ll probably just care about getting into Firefox, and Unity makes this easy and immerses you in Firefox, which is perfect really.
Nope. Haven’t tried her on it. I also don’t consider it stable enough yet to be used as a primary desktop. For example, I’ve seen a few applications lose their menu bar completely because, for whatever reason, they don’t want to play nice with the “Mac-like” integrated menu at the top. But instead of keeping the menu in the application, it simply disappears. There are usually workarounds; you can pass a parameter from the command line when starting the application that explicitly disables the shared menu. But again, that’s not something the average end user is going to figure out.
“Several academic studies and papers conclude that Linux and open source software have fewer bugs than commercial products. Ubuntu has bug-tracking identification and resolution procedures equal to those of any large, well-run software project.
From years of participating in the Ubuntu forums, I’ve encountered consistent anecdotal evidence. I read very few posts where a user abandons the product due to a bug. This is a huge vote of confidence in Ubuntu. (You can’t say this about every Linux distro.)”
OK, bugs are why I stopped using Linux. And maybe it does have fewer bugs. But every time I installed a new version of Ubuntu I had to spend hours fixing problems with things that should just work. The problem with Ubuntu’s bugs is that they are highly user-facing: a piece of software isn’t compatible, or won’t work with other parts of the system. A workaround exists in many cases, but I got tired of dealing with that.
That’s a good point. It may be true that Linux in general is less buggy than Windows. But most of the bugs that Linux does have seem to be in the UI stack, so they end up affecting desktop end users. I guess that’s to be somewhat expected. Linux’s strong point is obviously the server market, where it has at least 50% market share. But on the desktop, it hovers right around 1%. So of course, much more developer time gets spent on the server aspects of Linux than on the desktop side.
I’ve used Ubuntu since the inaugural release, and more recently I’ve abandoned it for openSUSE, which is my daily driver. But both of these systems suffer from the same serious, fundamental problem- instability. The entire Xorg graphical stack from the video driver all the way up to the desktop environment is precariously and disgracefully unstable. Ever since the very first week that I started using Linux in 2003, I experienced frequent total crashes of the X server that resulted in lost work. And today in 2011, I experience the same problem. Several times, things as simple as disconnecting a USB device or selecting a certain menu combination in OpenOffice have caused a fatal Xorg crash that made me lose all my work. In short, Linux with Xorg crashes constantly. Windows does not.
So, there’s a lot of work left to be done. Sorry, this isn’t what people wanted to hear, and this isn’t what I want to see either. But it is what it is. And I continue to use desktop Linux. I figure it’s better to lose my desktop work to a crash than losing it to a Windows hacker…
Oh yeah. One more complaint. And that is the lack of integration of many applications. One in particular that I have found to be problematic. There’s no easy way to plugin your digital camera, select a photo from the photo manager, and say “email this”, have it automatically resize to some reasonable size for sending in email, automatically attach it to an email message, and then you just have to fill in who you want to email it to. Of course, on a Mac, this kind of thing is trivial. On Ubuntu, I haven’t found any photo management software where it is possible. It is of course, possible to have the photo manager automatically start when you plug in your digital camera and then import photos. And there are plugins for Thunderbird that can automatically resize attached images. But it still requires the extra step of importing the photos, then opening Thunderbird, and finding the image on the file system and adding it as an attachment. There’s no easy way to create an email message from within the photo manager that has the images already attached.
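The resize step in that workflow is really just aspect-ratio arithmetic that any photo manager could implement. Here is a sketch of the scaling logic; the 1024-pixel “email size” target is an assumption for illustration, not something any Ubuntu application defines:

```python
def email_size(width: int, height: int, max_edge: int = 1024) -> tuple[int, int]:
    """Shrink photo dimensions so the longest edge fits within max_edge,
    preserving aspect ratio; never upscale images that are already small."""
    longest = max(width, height)
    if longest <= max_edge:
        return (width, height)          # already small enough for email
    scale = max_edge / longest
    return (round(width * scale), round(height * scale))

# a 12-megapixel shot shrinks to 1024x768; a small image is left alone
print(email_size(4000, 3000))  # (1024, 768)
print(email_size(800, 600))    # (800, 600)
```

The missing piece the poster describes is wiring this math plus the actual image scaling into a “share via email” action in the photo manager, which none of the apps at the time provided.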
That’s something that if I ever get enough free time, I might remedy myself. But finding free time to work on FOSS software these days is difficult.
First, there are a lot of false statements around here:
Windows and Office license fees. All the Linux evangelists out there do their best to quote the highest price for Windows/Office. Almost no one buys a retail license: OEMs buy OEM licenses and corporations buy volume licenses. Both types of license are much cheaper than buying a Windows CD off the shelf.
Home users either buy a Windows PC – and benefit from the low price of the OEM license – or upgrade from a previous version of Windows, which is pretty cheap.
Another false statement is that Linux supports old Windows software better than Windows does. Windows 7 dropped direct OS support for very old software (10+ years old); instead it provides Windows XP Mode (essentially a VM running the XP kernel and runtimes). They do this because MS has to move forward. What do you expect? Support in Windows 8 for some 1990s DOS software? Yes, Linux has DOSBox and Wine. The fact is, almost nobody gives a shit about old software now.
Wine supports old Windows software – not all of it, but much – because it emulates old Windows software best. Try to run Photoshop CS5 or After Effects CS5.5 in Wine and call that a proper user experience. If they run at all, they will be slow and sluggish.
The truth is, Linux doesn’t even support old Linux software. Try to install Heroes of Might and Magic III on Ubuntu NOW.
Hardware requirements only matter if you are poor and can’t afford a P4. You can buy a used P4 for as little as 50€. If you are poor and only have a P2 or P3, then you have a reason to use Linux over Windows.
I’ve used Linux since 2000. I ran into lots of bugs and issues over the years, Ubuntu included. Nothing I can’t fix, but the biggest issue is that it takes time. Many years ago I didn’t see this as a problem – I was learning the internals of the OS, and it was somehow new and exciting – but now I find my time very valuable. If I spend a few of my working hours fixing Linux-related issues instead of working, then I consider Windows to be much cheaper than a 0€ Linux distro downloaded from the net.
My biggest complaints about Linux are the lack of software (as in software I need, find useful, and consider quality software) and the fact that it seems more like a bunch of different software thrown together than an OS.
For the latter issue I tend to blame the anarchic development of some FOSS software and the GPL license. The ecosystem that developed around Linux is best described as anarchy, and anarchy and democracy can’t be good software development models. To have some degree of success in the OS area you have to use other models: autocracy (OS X) or dictatorship (Windows). I think that if Linus had been a year or two late, we would have seen open source OSes with much better adoption. I’m thinking, of course, of the BSD family of operating systems.
As someone said earlier, Linux doesn’t have a stable driver ABI. How funny is that in 2011?
As much as I don’t like OS X, I have to admit that Apple has done a much better job than any Linux distro out there. They didn’t just throw a bunch of open source software together. Yes, they borrowed heavily from open source, but they took the time to write their own software, improve the open source bits, and provide a high level of integration between the different OS components.
I think that to achieve success, Ubuntu has to do the same. I don’t think it would take more than tens or hundreds of millions of $ or € to do it. Or you can write the entire OS from scratch, but that would be much more costly.
I didn’t attack Linux/Ubuntu/FOSS in my post, so no need to vote me down. I’ve just expressed my views. I don’t pretend that I’m 100% right.
The best value of Linux right now is on servers. The top enterprise contributors to the Linux kernel (IBM, Oracle, Intel) don’t give a shit about desktop Linux. All they care about is servers and enterprise Linux.
The second best value is on mobile phones and tablets. But Android isn’t quite the typical Linux distro.
Don’t Intel make MeeGo?
Yes, of course. But they care more about corporations buying Xeons than about netbooks with Atoms.
I don’t think Canonical could have afforded to buy Novell considering that Attachmate paid 2.2 billion.
No kernel backwards compatibility. You have to recompile (at least) and often patch any third-party driver module from one kernel release to another. This is why hardware manufacturers don’t release Linux drivers, and this is why people don’t like Ubuntu: like every other Linux distro, it lacks LOTS of drivers. The fewer people use it, the fewer experts and developers there are to support the platform and produce software for it. STOP updating the kernel this way and Linux will win over every other OS in a few years.
You are completely right!!!
It’s annoying even in enterprise environments… you upgrade the kernel and then you have to rebuild FC, multipath, and every 3rd-party driver… too much work and very error-prone.
Linux is an amateur OS in this particular matter.
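It is worth noting that for third-party modules, DKMS already automates much of the rebuilding complained about above: register the module source once with a `dkms.conf`, and it is recompiled automatically on each kernel upgrade. A minimal sketch (package name and version are hypothetical):

```
# /usr/src/example-driver-1.0/dkms.conf
PACKAGE_NAME="example-driver"
PACKAGE_VERSION="1.0"
BUILT_MODULE_NAME[0]="example"
DEST_MODULE_LOCATION[0]="/updates"
AUTOINSTALL="yes"   # rebuild automatically when a new kernel is installed
```

After a one-time `dkms add -m example-driver -v 1.0`, each kernel update triggers the rebuild without manual patching – which mitigates, though doesn’t solve, the lack of a stable module ABI.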
So, there should be a user-friendly way to generate and edit a cryptic config file that few people have a need to touch? Doesn’t this sound ridiculous? I’m fairly certain that the “generations of Linux support personnel” that are used to editing X.org config files have no problem generating a new one. This suggestion amounts to, “It needs a simple way to complicate things!”
I believe that’s called “moving out of the stone age” or something along those lines. Most of us would consider the fact that xorg settings can be automatically discovered and configured at runtime these days in the vast majority of cases to be progress. No more need to specify mode lines and mouse configurations and such in xorg.conf is hardly something to complain about if you ask me.
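And for the rare cases where an override really is needed, modern Xorg reads drop-in fragments from `xorg.conf.d`, so nobody has to regenerate a monolithic `xorg.conf` at all. A small example forcing a keyboard layout (the file name and layout value are just illustrations):

```
# /etc/X11/xorg.conf.d/00-keyboard.conf
Section "InputClass"
        Identifier "system keyboard"
        MatchIsKeyboard "on"
        Option "XkbLayout" "de"
EndSection
```

Everything not overridden in such a fragment is still auto-detected at runtime.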
Great article. It occurred to me, however, how unscientific common assumptions about Linux’s appeal are and how little or nothing (to my knowledge) has been done to test them systematically.
Commonplace #1: End users don’t care about Linux’s “freedom agenda”.
Commonplace #2: End users don’t want to put in work to get a system configured.
Commonplace #3: Unstable kernel APIs/lack of an integrated enterprise stack/[INSERT CAUSE HERE] are holding back Linux on the corporate desktop.
I don’t work in a corporate setting, so I can’t say much, other than to point out that the skills expected of a sysadmin have changed greatly over time. I read a comment on one of Hadrien’s recent posts about interrupts (not on OSNews but elsewhere) about how the UNIX security model was designed to protect the server install from its own internal users rather than from the external threats considered more dangerous today. Which requires more skill and money to cope with? If you can answer that question – and by extension, if you can definitively say that the disadvantages of Linux (or whatever OS) cost a company more than the savings created by its advantages, AFTER accounting for the economies of scale that widespread adoption of the new platform would create – then we can talk. In other words: would training and employing staff who are expert at recompiling binaries after the kernel API breaks (or whatever) be any more expensive than hiring people who are expert at handling Windows’ foibles, once the former kind of people are being churned out by the thousands? If so, then we might have a real basis for labelling an architectural difference as an architectural defect (or worse, as a showstopper).
As for end users, I think Apple has pretty clearly shown that you can sell computers on more than just technical merits or ease of use, though to be sure Apple’s products are mature and user-friendly. Apple’s real insight, however, was understanding that computers could be sexy, that Jane Q. Public could fall in love with one: in other words, that computers are no different from anything else, that computers are stories. You can only sell a good yarn. They knew this a long time ago, lost it under Sculley, and finally regained it; they aptly controlled the narrative even during the transition from PPC, when Apple diehards went bananas.
Selling that narrative will be a critical part of spreading Linux. Frankly, the cavalier arrogance of everyone who assumes that no one cares about Linux for its freedom is astounding. Let’s not forget that hippies changed the world. The free software story is incredibly romantic and is one of Linux’s greatest assets (plz don’t be annoying and nitpick about *BSD, kernel blobs, or Stallman or Ulrich Drepper or something, kthx). Sometimes I wonder whether that arrogance, which too often goes under the name of hard-nosed “experience with ‘real’ users”, is a form of self-loathing. “Well, yeah, clearly, it’s important to me, but why on earth would anyone else get it? After all, I’m really nerdy.” Or maybe just laziness, or the inability to sit down and think of a way to sell OSS culture so that people would get it without dumbing it down (i.e., the patronizing variant of the same cavalier arrogance).
A nit pick for the author: DistroWatch’s ranking doesn’t measure popularity, IIRC, but how many clicks links receive or something like that.
For the record, ever since my OpenBSD laptop bricked, I split my time between my Nokia N800 and my own custom-rolled Linux distro.
(Edit: added inflammatory remarks and OS disclaimer)
I don’t understand people thinking this was a move.
LibreOffice is the Ubuntu/SuSE branch of OpenOffice, they just finally got a different name and Google backing.
I recant the different name bit… a good name. Go-OO wasn’t a good name.
Ubuntu isn’t following MS’s model on releases. MS doesn’t arbitrarily release a new version of their OS every 6 months. Ubuntu’s model is a little silly. I understand not wanting users to sit around on an old OS and get passed up, but re-releasing the OS every six months is a bad model. And it certainly isn’t Microsoft’s.
MS also doesn’t just release new features to keep people interested. Computers change over time. Was there a significant leap in CPU speed and memory availability between 1995 and 2001? Of course, which is why new features were added to XP from Win95/98.
The only other time the author shot himself in the foot was when he attempted to dispel the “myths” about how Ubuntu might be harder to use, then ended up proving them by saying the hardware detection is screwy. I wholeheartedly agree that this is Ubuntu’s Achilles’ heel. MS has been pretty good at driver integration and making drivers backwards compatible. Ubuntu is horrible at this, and you can’t expect users to embrace an OS that stops working with their hardware on a new release, especially under the banner that it’s easier to use!
Of course, you can do what I do; wait another 6 months for the latest release, then your hardware has a good chance of working again.
It’s the only one it can have. Canonical is essentially a packager and depends almost completely on upstream. That’s why they are basically yet another distribution instead of an operating system in its own right. And without a viable business model that provides them with serious money, that’s all they can ever be.
It’s amazing how after so many years people still consistently underestimate the immense amount of resources that creating a general purpose operating system takes.
I guess it depends on how you define “viable”.
Exactly, which is why arbitrarily choosing 6 months is silly. Can they honestly say they can make all of the improvements and test them for bugs every six months? I would think you’d need more time, especially considering how much time and effort it takes just to get one or two applications working well, let alone an entire OS or distro.
Linux is like designing a house with builders who have no design team, and little interest in communicating with the people they are building the house for.
The pragmatic, technical, in-joking, legacy way of building Linux is what leads to a product that only a small margin will ever be passionate about, while the masses will look elsewhere.
Linux is high on ‘features’, but short on ‘benefits’.
“It’s terrible,” De Raadt says. “Everyone is using it, and they don’t realize how bad it is. And the Linux people will just stick with it and add to it rather than stepping back and saying, ‘This is garbage and we should fix it.'”
De Raadt says his crack 60-person team of programmers, working in a tightly focused fashion and starting with a core of tried-and-true Unix, puts out better code than the slapdash Linux movement.
“I think our code quality is higher, just because that’s really a big focus for us,” De Raadt says. “Linux has never been about quality. There are so many parts of the system that are just these cheap little hacks, and it happens to run.” As for Linus Torvalds, who created Linux and oversees development, De Raadt says, “I don’t know what his focus is at all anymore, but it isn’t quality.”
I can second him.