“The news that Dell will begin making the fast-growing Ubuntu flavor of Linux available on some of its machines should be welcomed by consumers everywhere.” Read the rest of the review here.
From the article:
Anyone else noticed this? Think I’ve been using GNOME since about late 1999 but never realized it looks so much like Windows..
Ubuntu (and by and large Gnome) has generated buzz and has brought many new users to Linux, including myself.
But the Gnome devs have a terrible pretentious “one size fits all” attitude, and it shows:
http://live.gnome.org/GnomeScreensaver/FrequentlyAskedQuestions#hea…
“Under the xscreensaver model… If they didn’t want someone to put “The CEO is a bastard” in rotating, 3D text on a publicly available computer the only option was to remove it from the system.”
So… they did.
Because you can change “2.6.xx-Linux-Generic” 3d text to “The CEO is a bastard” they removed all options from all screensavers.
Replacing gnome-screensaver with xscreensaver is an option, though it breaks some things.
With each release there are fewer options. I guess that is appealing to some, but I had trouble explaining to a new Ubuntu user *why* they can’t change their 3d text or any other screensaver options. They were hoping it would be in the next version… No, I explained, it was in the previous version.
One of the first things a new user does is change their desktop background. One of the next few things is to pick or personalize a screensaver.
A user can still use a CEO/Bastard wallpaper. By this reasoning, wallpaper choice should also be removed.
Even though it is “easier” than KDE, I’ve found many new users are more frustrated by Gnome’s lack of options as opposed to KDE’s overabundance.
I just question the choice of Gnome for new users.
*I still use and prefer Gnome.
…Why anyone would want to have a screensaver like the one you advocate is beyond me …
Screensavers tend to be one of the lamest things available on the desktop – very rarely beautiful in any form and too often they’re plain ugly even…
I always install and enable the BSOD screensaver – at least that makes me chuckle from time to time…
But I get the drift – why remove otherwise available configuration options from the GUI? -And honestly I don’t know…
Sometimes I think it’s a price to pay for Gnome being open source: it might be difficult to get the ‘right’ people to do a certain job, as in people with exactly the right education within an area, e.g. graphic design and GUI design…
…And Gnome’s not the only one suffering… KDE’s just as bad in its own ways (like e.g. a too configurable GUI and aesthetically displeasing details)…
I’d just wish for more open source developers to look in Apple’s direction when it comes to beauty and sense of detail. I don’t know about Vista’s Aero Glass interface, but I definitely prefer Apple’s Aqua over anything else I’ve worked with, and would like to see Gnome move more in that direction than the more Windows-esque direction e.g. KDE is taking…
Screensavers tend to be one of the lamest things available on the desktop – very rarely beautiful in any form and too often they’re plain ugly even…
Here’s the thing: Linux (and OSS in general) is all about choice, and the previous poster was right to suggest that GNOME, with each version, is slowly taking this right away.
Whether I want to put a stupid screensaver on my private desktop is -my- choice to make, not anyone else’s.
FYI, I use both GNOME (servers, low-end machines) and KDE (development workstations)
– Gilboa
This is a retarded comment. They didn’t remove xscreensaver because of the inability to prevent users from using a “the CEO is a bastard” screensaver text. They replaced it because the new code integrates better with GNOME technologies (DBUS, themes), etc. They separate themes from theme engines, to allow the system to be more easily administered. If you are the administrator on your machine, you’re perfectly free to create a screensaver theme that rotates through hardcore porn.
This is a troll, and a very good one at that, since so many people bought it without actually reading the page you linked to.
A new Ubuntu user is not easily able to change simple screensaver settings.
And while my quote is out of context, I linked to the source. Further, the “CEO is a bastard” remark is their words, not mine.
More than half of the explanation describes what a fantastic restriction it is, as a selling point.
Don’t get me wrong, corporate administrators will likely see this as a feature, but new Linux converts find this missing ‘feature’ rather quickly, and are quite surprised to learn it is intentional.
And again, I choose Gnome, but I have had to set up KDE more often for new users due to such options not being easily available or apparent.
GNOME is like Mac OS X! Am I the only one who thinks this? So many parts of Nautilus and other parts of GNOME are similar… The Windows shell is quite different to GNOME, at least, I think so.
I like Ubuntu a whole lot, but this article is just beyond a joke. It makes outrageous claims such as Windows being really slow and Ubuntu being much, much faster. Now, I’d love this to be the case, but Nautilus and many GNOME apps aren’t exactly lightweight and ultra-fast…
I vote for a 24-hour ban on Ubuntu articles
“Is this a new day in computing? That might be a slight exaggeration, but I hope not.”
Yes, you are exaggerating. We are a long way from Ubuntu being the best desktop operating system ever. I use it every day and I choose not to use Windows – but I still acknowledge that Windows has advantages and it’s not all black and white.
More precisely, GNOME is an evolution of MacOS Classic. It takes a lot of the fundamental principles of that platform, and evolves them in new directions (adding layout management, etc).
As for being fast, it’s really weird. GNOME was a pig on my PII-300. Windows and BeOS flew in comparison. But fast-forward seven or eight years, and GNOME isn’t much more heavyweight than it was back then, and OS X and Windows (Vista) have exploded. Moreover, some credit needs to be given to the kernel developers. The early problems with interactivity have really driven them to make a kernel well-suited to interactive use. It’s still not as snappy under low-load as Windows (which uses scheduling hacks to achieve that speed), but under moderate/heavy loads it is the absolute best kernel out there for a desktop machine (under such loads, Windows’s scheduling hacks come back to bite it in the ass).
It’s not an evolution of anything; it’s what all *western* desktops are: a *desktop* analogy. The reality is it’s an analogy we are all familiar with. Whether that analogy works well or is a good one, it is the standard.
If you read the GNOME HIG, you can clearly see that a lot of it is modeled pretty closely on the old Apple HIG.
and Windows looks like Mac OS
Alright, another day, another Ubuntu “news”.
But, at least, this one has real information
It is fast, lean and responsive, like a sleek jungle cat prowling through the South African outback. But unlike a jungle cat, Ubuntu’s not rapacious.
Lately I’m reading all of this naysaying, and it’s beginning to harsh my buzz. What we need is more cheerleading like TFA in the Linux echo chamber. We need to get back to our idealistic roots and rediscover what we do best–preaching to the choir. Read Slashdot, OSNews, and anything by SJVN. Get with the party rhetoric, and stay on message.
If this isn’t the Year of Desktop Linux, then it’s next year–at the latest. If there’s any hardware or software that Linux doesn’t support, it’s because Microsoft is a convicted monopolist and a big meanie. The OOTB experience doesn’t really matter since it’s so easy to fill in the missing pieces. Linux and all other free software is inherently more secure and more stable than proprietary software. Binary compatibility is a conspiracy perpetrated by the proprietary software industry to force people to pay for software, which is an intuitively flawed concept. Nobody has proven that Linux distributions infringe on any patents, and nobody ever will.
That’s our story, and we’re sticking to it. World domination is really close now. The last thing we need is some well-meaning blogger kindly sharing 10 reasons why Linux will never replace Windows. If you use Linux, you have a responsibility to say great things about it on the web. If you’re having problems with Linux, it’s probably your own fault, so don’t go slandering our fine product in the online media. If you’re really dumb or intoxicated, just remember: Linux = Good, Microsoft = Bad. Thanks for your cooperation.
I don’t really know what you’re talking about, but I agree )))
Interesting… You’re one of the people on this forum whom I respect a lot, even though I don’t really agree with your pro-FOSS views.
I hope you’re not becoming jaded or pro-BigCo. It’s not so fun to disagree with people who don’t think. Maybe this whole Ubuntu phenomenon is killing the spirit that makes Open Source appealing to people with an engineering mindset.
Thanks for your vote of confidence, and, yes, that comment was highly sarcastic. Just testing the filters, everybody!!
I’m all for increasing the level of debate in our society, whether it be about Linux, politics, or anything else that matters to us. For me, Windows represents a world-view that coincides with the talking heads you see on TV, while Linux and other free software efforts reflect ordinary people taking responsibility for the direction of our society via the Internet.
It’s not so fun to disagree with people who don’t think.
It’s also not so fun to disagree with people if you can’t disagree with yourself. Everybody has challenges, every idea has shortcomings. Everybody changes their mind, and I immediately ignore people who refuse to admit it.
Think, reflect, communicate, share, and be honest with yourself and others. Everything is now a culture war. You can’t win if you refuse to think.
Thanks for your vote of confidence, and, yes, that comment was highly sarcastic. Just testing the filters, everybody!!
I picked up on it fairly quickly, but what scares me are the people who were enthusiastically nodding their heads in agreement as they read your comment.
Everything is now a culture war. You can’t win if you refuse to think.
Culture is far too localized and personal for any culture war to be winnable. These battles will rage on forever because you simply have one group trying to impose its values on another who will always refuse to accept them. In the US, at least, this sort of attitude has become frighteningly predominant. For humanity as a whole, the culture war will ultimately be “won” when everyone learns to practice their own values while respecting those of others.
In the context of technology, I realize that Microsoft has a lot to learn in this regard. Lately though, it seems the Linux crowd isn’t doing too much better.
While we’re on this tangent, it reminds me of a point somebody made about Ron Paul in the Republican debate: “He’s dead wrong, but at least he stands for something.”
I think that the Linux community has to be prepared to accept the possibility that we are dead wrong. We stand for something, and that’s a part of what draws people to our community. But that doesn’t necessarily mean that we are on the right track. We might perceive a downhill battle, but we face challenges that could stall our efforts.
In that original post, I alluded to a number of these challenges: high expectations, third-party support, white-box security, binary compatibility, and intellectual property. There are more, undoubtedly including issues we haven’t yet considered. That’s why we need to start thinking and stop rationalizing. What are our desired outcomes, and do we have the solutions to take us there?
I don’t think we’re getting any worse in this regard. Actually, I think we’re getting better. As our community expands, the newcomers are increasingly driven by pragmatism. And old-timers like myself are becoming wiser with experience and age. Some of us truly thought in 2000 that Linux would have a solid 10% of the desktop market by now. It didn’t happen despite the relentless pace of development. We were wrong.
But why were we wrong? We’re seeing reasonably fair comparisons of Ubuntu and Vista that come to the conclusion that Linux ain’t no slouch on the desktop. But most of these articles touch on the OOTB comment I made in that post. The market is telling us we’re wrong. We have our reasons for not shipping proprietary software by default, and they are to a certain extent central to what we stand for. But the public doesn’t buy into this part of our strategy, and as I’ve previously stated, neither do I.
I’m calling Ubuntu out on this. They claim to represent Linux for Human Beings. Like it or not, it seems that many human beings want to play MP3s and watch YouTube. If their tag line was Linux for Free Software Enthusiasts, then I wouldn’t be complaining.
There are clearly a couple simple things we can do to dramatically improve our ability to appeal to the masses–if that’s the outcome we’re after. But until we try, we won’t know about those other challenges hiding behind our current barriers. We assume that increased marketshare can only make the free software community stronger and more influential, but it’s easy to foresee issues. Will our communities scale effectively? What happens to the community support model when we get flooded with newbies? Will distributions continue to proliferate, or will we see consolidation? How will a shift in marketshare affect that giant IP bullseye on the back of the free software community?
I’m not saying we’re wrong, but I don’t think we have all the answers. It’s great that we stand for something, but let’s not allow that to become a liability. OK?
I think one of the answers lies in catering to the end-user, like Ubuntu and Firefox do.
Make your software great, easy to use, and able to solve real-world problems. Then your user base will come.
Port more software to Windows, so it is easier to switch platforms. People use applications, not OS’es.
It is easy to get disappointed, because making great software is not something you do overnight.
Maybe we are all wrong, as you say. We should always strive to be open to changing our ways if the past teaches us that we were wrong.
I don’t think we were dead wrong. I think we underestimated the inertia created by the Windows quasi-monopoly, especially as it pertains to MS’ OEM program.
The mp3 issue has been solved with the latest version of Ubuntu. Click on an mp3 file, and Ubuntu asks you if you want to install the required codecs. It then automatically installs them for you.
Click on Desktop Effects, and it does the same thing with proprietary drivers.
Understand that these limitations are not due to a choice on the Ubuntu devs’ part – there are legal issues at hand here. You can’t distribute the drivers pre-linked to the kernel, or you’ll contravene the GPL.
The YouTube thing means installing Flash, which can be done either by going to the Macromedia website or through Synaptic. As far as I know, it’s not automated yet (maybe I’m wrong). This could be an improvement, and as such a bug could be filed about it on Launchpad. That said, these (along with pre-installed DVD playback) are things that could be included on a Dellbuntu box by default.
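(Small aside: it can also be done from a terminal with a single package, assuming the multiverse repository is enabled:

    sudo apt-get install flashplugin-nonfree

The package fetches the Flash installer from Adobe’s servers and sets it up for Firefox.)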
I think the Dell/Ubuntu announcement is going to shake things up. It won’t mean market domination in the foreseeable future, but that’s all right: market domination is *not* necessary, nor has it ever been. No one would argue that OS X has “failed” as a desktop OS, and yet its market share is not so much bigger than Linux’s.
This might be good for Linux enthusiasts, but for your average home users and those in the business world it won’t matter much. These folks require the use of many “Windows only” commercial software applications (as well as OEM hardware drivers), and frankly Linux is very lacking in this area. I don’t see this venture going anywhere but down the toilet. If the commercial software applications are not there, then it won’t penetrate that much into Microsoft’s market share.
If the commercial hardware drivers are not there, I don’t see your average consumer flocking to Ubuntu. I don’t think grandma will be happy if she has to recompile her kernel to get her FireWire or USB camera to work. In the end I’ll wager good money that Dell will dump Ubuntu after lackluster sales and demand.
In the end Linux will only be a buzzword for nerds and server admins and nothing more. It’s just the nature of the GPL and OSS movements that it’ll be nothing more than a niche OS. Don’t get me wrong, a few years ago I was all hyped up about Linux, but eventually I realized that the OS consumer market will and shall always be dominated by Microsoft. MS just has too much leverage in the commercial software industry and with the hardware manufacturers, all of whom are on their side and work in one way or another in Microsoft’s favor. You can sit there and argue the semantics of this relationship all you want, but in the end it does not take away that MS is the king of the OS world. MS shall remain the king well after you and I are long dead and gone.
I find that day to day “most” home users don’t have many commercial apps that don’t have a functional equivalent in Linux. It’s early and I’ve not had coffee but not a single one springs to mind.
Having said that I put Gamers in a different class. We do need to see more games on Linux, but then no one is selling Linux as a gaming platform. So why worry about it?
Businesses do tend to have more commercial software that can’t be easily replaced. Then there’s legacy software, which does tend to be a problem. But for large proportions of a company, the basic office suite, web browser and e-mail is all they need. Do you really claim Linux can’t compete here? Cedega and CrossoverOffice are pretty good at running legacy apps, but this is an area that needs work. Thankfully the last few years have seen development of software switch to platform-independent languages such as Java and .NET, so this is improving too.
I’ve been a Linux user for about 8 years. I’ve used nothing but Linux for about 4 years. I’ve got quite a few Linux-only apps that will not run on MS Windows; should Microsoft likewise be chastised for not making it easy for me to switch back? I’ve never heard anyone say it before!
Now for hardware, as a sheer numbers game, Linux supports more hardware devices Out Of The Box than Windows does. I think I heard a statistic recently that it supports more than XP and Vista combined (take with a pinch of salt I can’t remember the reference).
Does Apple deserve the same condemnation you give Linux because not all my hardware would work with a Mac? Why can’t people accept that Linux is simply an alternative platform to Windows. That doesn’t mean it has to compete on the same terms! My company has built and sold Linux desktops for years, hardware has rarely been a problem for us because we know what we’re doing and pick well supported chipsets. We’ve just started selling laptops too and by talking to manufacturers about what we need they have put together some beautiful machines that have near 100% support. And YES I do include the typical no-go Linux areas such as WiFi support and Hibernation.
I hear what you mean about software – but take solace in two things. Firstly, a lot of software is going to be pushed to the net and provided by banks through their ebanking websites as they take on more business transaction work, thus making Quicken and MYOB obsolete.
The other flip side is the diminishing returns being shown in each software release – look at Creative Suite and the rather mediocre features being added in version 3 – there’s a snowball’s chance in hell of pushing a suite that is really just a grand unified patch so that their software can run on Windows Vista.
Sure, don’t expect Adobe- or Quark-like software anytime soon, but office suites are pretty much reaching the dead end of their development – features being added that fewer and fewer customers actually use. Microsoft is going gangbusters trying to show off the features in their new office suite, gradually locking customers in with those new features so that in 1-2 years the migrating customers’ big whine is “oh, we can’t move, we need feature xyz” – when in reality they don’t need it; they did fine without it since moving to Microsoft Office ten years ago, but now apparently it’s critical. Hence, going off topic a wee bit, it always pisses me off when I see companies cut huge numbers of staff but no one ever takes responsibility when IT upgrades increase vendor lock-in – no one ever stands up, says “I made that decision”, and comes up with a rational reason behind it.
Pardon me for chiming in… but maybe, if this gains traction, it could offer the kind of motivation that gaming houses and other software houses need to write software for GNU/Linux. Yes, I totally agree that gamers are a different breed. But wouldn’t it be nice if Nvidia and ATI offered us better drivers with better 3D support for those gamers? I think it would benefit consumers on the whole, not just the existing Linux community. After all, isn’t that what it is all about? Linux shouldn’t be just for geeks. As a “joe sixpack” user who has gone through some of the teething problems of past distros and endured the learning curve required on distros like Slackware (which I still love), I am delighted with Dell’s decision. And before I go, this also: with gaming and other apps to be written for new Linux users, it may finally monetize Linux to the point where the naysayers (like the, ugggghh, Yankee Group) have to agree that Linux has come of age.
I think that virtualization will have an impact on Windows to Linux migration and on the way people use computers in general. If gaming and other Windows-only applications are a limiting factor, then here’s Windows running in KVM on Linux. It will really piss people off to find that they need to buy a Windows license and maintain another system image just so they can run their games in a virtual machine, and that’s the kind of sentiment that leads to change.
Intel is leading an industry push towards synergistic computing, using CPUs for what they’re good at and offloading the rest onto stream processors like GPUs. Here’s a perfect opportunity for free software to shake things up. Let’s get moving on GCC 5, take advantage of Intel’s Linux-friendly stance on graphics drivers, and bring this idea to market before Microsoft beats us to it. If Linux is the best way to take advantage of your graphics hardware, the other graphics vendors will come around (I used to say red team and green team, but they’re both fairly green-colored now).
The bottom line is that Linux has to show the computing industry that we are more adaptable as the industry evolves. Microsoft keeps the world stuck on yesterday’s technologies. Let’s call them out for being the enemy of progress that they’ve become.
I’ve been using Ubuntu now for the better part of four months. I’ve used probably three other distros for a total of about six months, since roughly 1999. I’ve always had to go crawling back to Windows because at the end of the day, I just couldn’t do what I needed to do.
Ubuntu has been the first distro that I haven’t ultimately formatted over. I’m typing this in Ubuntu right now. For all the normal tasks, such as web browsing, e-mail, IM, etc. Ubuntu has worked fantastically (thanks mostly to the fact that the respective applications have finally started to mature). For some things (working with NTFS, etc.) I’ve been able to do it, though with some issues.
However, there are still times when I have to drop to a command line to fix or install something. Honestly, I don’t mind the fix part of that, because I tinker a lot, and a normal user likely won’t be doing what I do to break things. However, a user should never, I repeat NEVER have to drop to a command line to install something. And even still, if that ever was the case, it should be as simple as “install appname”. The fact that Linux distros still force you to compile from source is absolutely ridiculous.
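To spell out the dance I mean, for anyone lucky enough never to have seen it (the package name is made up, and on Ubuntu you’d need build-essential installed first):

    tar xzf someapp-1.0.tar.gz
    cd someapp-1.0
    ./configure          # checks dependencies, generates Makefiles
    make                 # compiles everything
    sudo make install    # copies the result into /usr/local

Compare that to a double-click and you can see why I call it ridiculous.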
If someone(s) would step up and find a universal way to make all software available for installation via a double-click method, I would be ready to start getting friends and family on board. Until that point comes, I don’t want to be the one to have to support it, because I can barely support myself…
All that will be moot of course once SkyOS is ready to go.
Linspire have, it’s called CNR.
http://www.cnr.com
Look for a release announcement coming soon.
Sure, but for the stuff that CNR will bring, most of that is already available in Synaptic. What I’m talking about is beta software that isn’t in any of the repos, or when something breaks X and you have to fix it from the command line, etc.
I’m probably in the last group of users that Linux doesn’t work really well with. Novice computer users are not likely to run into scenarios like the above, and seasoned vets of Linux accepted a long time ago that the command line is a necessity and just learned their way around. I’m someone who has a lot of computer experience (I’ve used every version of Windows since 3, Mac OS 7-X, BeOS, SkyOS, a number of Linux distros), but hasn’t taken the time to learn the nuances of fixing things via the command line, because it is an annoyance.
Obviously some of that is due to Linux having a hard time being recognized by hardware companies, etc., but some of it is because of competing systems (ALSA/OSS, Gnome/KDE, different package managers, etc.). That type of stuff has to end, because as soon as it does, the hardware companies become the only impediment, and with companies like Dell on board, if they are the only obstacle left, they will fall in line pretty quickly.
I don’t see how Gnome vs. KDE poses any obstacle to Linux adoption. Both DEs can run programs made for the other, and it’s now trivial to have them use the same style. Soon, they’ll be able to use each other’s file open/save dialogs… As for ALSA vs. OSS, this one’s been over for quite a while now. Different packaging systems are also increasingly irrelevant, especially with services such as CNR.
In any case, these competitions haven’t stopped Dell…with the buzz that they’ve generated since announcing that they’ll be offering Ubuntu soon, I think we’ll see a definite surge of a couple of percentage points in Linux market share.
I’ll agree with you that Power Users are the ones that have the most difficulty with adjusting to a new OS, because all the tips and tricks they’ve learned on their favorite OS will not carry over to the next one.
That said, I do not consider that the command line is “necessary” when using Linux. I use it not because I need to, but because I want to (it’s often quicker and more powerful than the GUI). Even editing xorg.conf *doesn’t* require the use of the command line, since you can very well edit the file using a GUI text editor. I often do that, actually, because I like the color-coding that Kate/Kwrite apply to text files (makes it easier to navigate and spot mistakes).
I don’t see how Gnome vs. KDE poses any obstacle to Linux adoption.
And every single year, at the OSDL desktop architects meetings, the ISVs have been saying that on Windows they make a single installer for all versions; on Linux, supporting the number of distros times the number of desktops is a NIGHTMARE.
X breakage only tends to happen with beta software too, but the point of labelling something beta isn’t to say that it’s a new release – it’s to say “this isn’t ready, but I’d appreciate the help of knowledgeable people to find the bugs I’ve missed.” It’s not meant to be used.
It’s ridiculous that people still insist that the additional availability of software in source form is somehow a drawback.
Normal users just use pre-packaged software, independent of the platform.
Additionally advanced users on Unix/Linux have an easy option for building software on their own, for example to use development versions or to enable certain build-time features.
I think you stated the same thing as I did. You just beat me to it and said it in fewer words. Anyway, I agree with you on all points. People seem not to notice this. I guess it’s human nature: if everything goes great, even better than expected, but one tiny thing they expect to happen does not, they throw up their hands.
Which application did you have to drop to a command line and compile from source?
I think in most cases, you can simply pick an application from Synaptic and you’re done. There are occasions when I’ve dropped to a command line to install something. But it would have been the same case for Windows, too. In fact, installing all of the supporting software like a compiler or automake on Windows would have added an additional burden. I think the chances are that if you are dropping to the command line to install something, it is because you already had the command line open and did it out of convenience, or you were trying to install something for which a binary doesn’t exist on Windows either. In either case, you have no complaint. If I’m missing something, I’d like to know.
There is a scenario that really concerns me about the Dell situation. I wonder how a consumer is going to handle a situation like buying a new peripheral: they open it up and it comes with a CD, they put it in the CD tray, close the tray and sit there and wait. Chances are that the new peripheral will already be detected as soon as they plug it in. But some features may not be supported without the manufacturer’s supplied drivers. The bright side of that is that it won’t have the included spyware and nagware either. But they are going to be confused. And, knowing the typical end user, they won’t think about the absence of interrupting messages from virus scanning software popping up to tell you that it’s finished scanning, like mine did while I was typing this on Windows (I still use it on my laptop).
“Which application did you have to drop to a command line and compile from source?”
I have to do this a lot because I’m in an x64 environment. If it is in a repo, I’m usually OK, but if I have to download a deb file from somewhere to install, it never works because it was compiled for 32-bit. That means, unless I can find an x64-compatible package (which sometimes happens via ubuntuforums.org), I have to build one from source.
Off the top of my head I can name the Audacity beta release, which I had to install because the stable build would not work with my USB mic, and it was reported that the beta did. On Windows, I would just download the binary installer, check it out, and move along. Not so on Linux…
Another time, the NVIDIA proprietary drivers were updated and somehow had a mismatch with my kernel version, I think. Basically, I couldn’t boot into X anymore. I somehow figured it out (I don’t even remember what I did), but if I didn’t have a second Windows computer there to read online, I couldn’t have fixed it. Why is it that X isn’t smart enough to default back to VESA drivers on its own if the NVIDIA/ATI drivers aren’t working?
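(For anyone hitting the same wall: the manual escape hatch, assuming a stock Ubuntu /etc/X11/xorg.conf, is to switch the Device section back to the generic driver – a sketch:

    Section "Device"
        Identifier  "Configured Video Device"   # use whatever identifier your file already has
        Driver      "vesa"                      # was "nvidia"
    EndSection

Alternatively, sudo dpkg-reconfigure xserver-xorg regenerates the whole file. Either way it’s console surgery when X won’t start, which rather proves the point.)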
Linux (mostly through Ubuntu, imo) has made some great inroads, but I think there is still work left to be done before it is truly ready to go.
Did you really need to install the 64-bit version of Ubuntu, though? That’s what I did on my laptop, but because many of the commercial/non-free stuff was only available for 32-bit, I finally reinstalled the 32-bit version. You know what? I haven’t noticed *any* performance decrease, with the sole exception of encoding mp3 (which was marginally faster with the 64-bit version).
I’m just saying…if you don’t really need the 64-bit version, perhaps you should consider installing the 32-bit one.
That’s an excuse though. The answer the Linux community should give should not be “Well, do you *really* need to use that anyway, come on.” Rather they should say “Yes, we recognize that is a problem. Let’s find a way to fix it, or at least make it better.”
More importantly, even Windows has trouble with 64-bit support for third-party software (drivers and apps). This could be one area where Linux could not only equal Windows, but actually surpass it, which Linux will need to do if it wants to gain significant ground. The only reason Firefox took off is because it not only worked as well as Internet Explorer, it actually surpassed it in almost every way (security, functionality, etc).
I disagree that this is an “excuse.” When I tried it, 64-bit worked very well except for two things: proprietary Windows codecs and Flash. These were *not* available at all. You still haven’t made the case that you *needed* 64-bit, by the way. It’s not as if you can really do stuff in 64-bit that you can’t with the 32-bit version.
Anyway, please tell me how it is the Linux community’s responsibility if Windows codecs and Flash, both proprietary software, were only available in 32-bit versions?
There are many areas in which Linux already surpasses Windows, however having a better product doesn’t automatically mean a higher market share (Betamax vs. VHS is still one of the better examples of this). There are lots of other forces at play: consumer inertia and brand loyalty, ignorance of alternatives, pure marketing muscle, strongarm tactics on the part of the monopolist (Windows wouldn’t be so dominant if MS hadn’t threatened OEMs to revoke their Windows contracts if they offered any other OSes on their PCs, etc.).
The important thing is that Linux continually improves, and continues to garner new users as it becomes better. Linux may never displace Windows completely, but that’s not the point. The point is to achieve a diverse OS ecosystem, where one isn’t discriminated against because of the operating system they choose to run. I think that goal is close at hand, and the Dell/Ubuntu announcement is an important step in that direction.
I shouldn’t *need* to make my case. It is available, I want to use it, I have a 64-bit processor. All I know is that is what I have installed, and some stuff doesn’t work. Anything other than acceptance that it doesn’t work, and understanding that the system should be made to work, is an excuse.
As for need, ok, fine, need. Everything is moving to 64-bit eventually. Perhaps I don’t feel like re-installing my operating system once this happens. I bought a 64-bit processor in anticipation that this is where the industry would be heading. In less than three years, everyone will have a 64-bit OS. I don’t feel like re-installing my OS, since I have the technology today for 64-bit.
Correction: the system works fine. Some 3rd-party applications are not yet available for it. That’s a major difference. Again, it’s not the Linux community’s responsibility if Windows codecs and Flash aren’t available as native 64-bit applications. To imply otherwise is disingenuous.
That said, you *can* make it work, using a 32-bit chroot. It’s not that difficult, but really, if 3rd-party apps you need aren’t ready yet, moving to 64-bit seems counterproductive to me.
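(A minimal sketch of that chroot route, assuming Feisty on amd64; the target directory is arbitrary:

    sudo apt-get install debootstrap
    sudo mkdir /chroot32
    sudo debootstrap --arch i386 feisty /chroot32 http://archive.ubuntu.com/ubuntu

Then sudo chroot /chroot32 gives you a 32-bit environment where Flash and friends install normally. Workable, but hardly newbie territory.)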
You have the technology, but not all the 3rd-party apps are ready. It’s your choice to use 64-bit, but you also have to accept that if there is no 64-bit Flash player out you won’t be able to run it without jumping through some hoops.
As you’ve stated, the situation isn’t much better with 64-bit Windows. That tells more about the readiness of the industry as a whole to move to 64-bit than about the quality of either Windows or Linux.
I’m also a bit skeptical that *everyone* will be running a 64-bit OS in less than three years. Maybe, maybe not. We’ll see. In any case, right now it’s your choice: go for 64-bit knowing that some apps are not available (under *any* OS) or use 32-bit while waiting for those apps to be released. However, you shouldn’t use missing 64-bit apps as a basis to criticize Linux when the actual OS works fine in 64-bit…
Yes, thank you, because despite RTFM and enabling the “universe” repositories in Synaptic, Every. Single. Time. in order to get Opera up and running under *buntu, I’ve had to hit the command line.
Every. Single. Time.
So please do not tell me that Synaptic is all that and a bag of chips and that Linux software install is easier than Windows or OS X. In my experience, 50% of the time Synaptic can’t find its own arse with both hands and a roadmap. Until there’s a way to drag and drop a manually downloaded file on to the Synaptic window and then have it take care of the rest? It’s missing critically needed functionality.
Opera – well, that’s those Ubuntu users with their non-free software.
Seriously, that’s a bug, and a silly one at that. Gnome, KDE(!?) and XFCE4 all use the desktop entry standard documented at http://standards.freedesktop.org/desktop-entry-spec/latest/ and it’s a trivial fix at best. I hope that you have filed this bug in the appropriate place, and I would imagine it will be fixed immediately.
I will actually write you a little desktop file if you really want one.
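Something like this ought to do it – a minimal sketch per the spec linked above, assuming Opera’s binary is installed as “opera”:

    [Desktop Entry]
    Version=1.0
    Type=Application
    Name=Opera
    Comment=Web browser
    Exec=opera %u
    Icon=opera
    Terminal=false
    Categories=Network;WebBrowser;

Save it as ~/.local/share/applications/opera.desktop and the menu entry should appear.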
Actually, nobody will ever say that *any* package manager is perfect, or even the best way of installing software. There have been *flame* wars before, but most would agree the advantages of a package manager come from *maintaining* already installed software, with the main disadvantage being the delay between a new package being released and it becoming available on your platform.
It does offer a better way of installing software than on the Windows platform, with a greater sense of security. That said, we live in a web world. Ubuntu is *the* distribution of choice for migrating *Microsoft* users, and this…
http://www.nongnu.org/synaptic/images/0.53-main.png
is not good enough. We live in a web world and that is not delivering a Web 2.0 (sic) experience. It’s ugly and unintuitive; there are no feature listings, no ratings, no comments. I don’t even want those things, but for a download.com refugee this must be a daunting experience, especially when they are unfamiliar with the equivalents of the software they are used to.
Yes, thank you, because despite RTFM and enabling the “universe” repositories in Synaptic, Every. Single. Time. in order to get Opera up and running under *buntu, I’ve had to hit the command line.
I just installed Opera 9.20 by browsing to http://www.opera.com, clicking the green “Download Opera for Linux” button, selecting the version of Ubuntu I’m running, and finally clicking “Download Opera”. Then the browser download prompt is displayed, recommending the “GDebi Package Installer”. I click ok, enter my password when requested and Opera is installed, with a new menu item under “Applications-Internet-Opera”. AFAIK, this is similar to installing it on MS Windows. I’m not sure why you need the command line to install Opera.
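(And even if you prefer a terminal, the downloaded package is one command away – filename illustrative:

    sudo dpkg -i opera_9.20-1_i386.deb

The GDebi route above just does this, plus dependency handling, behind a GUI.)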
Even if this does work (I don’t know, I’ve never tried it, I only use Firefox), keep in mind that this is a major company, which has the resources to do this. Not every company wants to do this. And what about the small companies, or even individual programmers, that want to make this available? This is a lot of work: they have to create packages for multiple different distros, and in my experience most of them just throw the source out there because they don’t want to compile 10 different things every time they push a build out.
I don’t even blame them though, because that is a lot of work. The problem is that there are no standards in Linux for how to install software. There are multiple competing schemes, but there is no one, universal standard. If there was only one “Synaptic”, and every single distro used it, and it was easy for developers to get new builds of their software on there, that would be fine. Additionally, if there was one binary package format that all distros could use, alongside one repository scheme, then most of these problems would be solved.
They use non-repository installers, e.g. autopackage, or an installer creator like InstallJammer http://specialreports.linux.com/specialreports/07/02/21/0818230.sht…
Even free software projects like Inkscape do it this way.
Yes, but the problem is, that they don’t. I’ve found almost no applications that actually make use of these things (which I think is a shame, I’ve been a big proponent of AutoPackage for a long time).
Hmm, true, there might be such software, however I am pretty sure I haven’t built non-development software in years.
Therefore I can only guess about the developers’ motivations; maybe they want to limit their userbase to people who are advanced enough to build it themselves, e.g. for more detailed bug reports, etc.
For software that is targeted at “normal” end-users it wouldn’t make sense.
Any example of software that is targeted at the mass market, has reached its 1.0 version milestone, isn’t a development version, and is neither available through package repositories nor other pre-built installers?
It would be hard to meet all those criteria, because all software is in a constant state of flux.
My best example is with Audacity. I needed to install a beta, which supposedly fixed an issue with my USB microphone. The only option available was the source package.
I think another example is Maya, which from my memory, only had RPMs available on the disk, which needed to be converted to a Deb package to install.
There’s stuff that works great, Firefox and Thunderbird for example. Installing them was no trouble at all. Unfortunately, not all software is like that on Linux. If there was a standard desktop environment, and a standard package format, then rather than putting the source up and forcing users to compile it, the developers could simply put up a binary that everyone could install with (in addition to the source, if people wanted that instead).
I don’t think this is the reason. One of your examples in the other post, Pidgin, would qualify if it weren’t available in package repositories.
If there has been a delay between the release announcement and the packages becoming available in the repositories, maybe the developers didn’t notify the packagers soon enough before announcing the release?
This is less a matter of installation, more a matter of coordination.
It seems the developers use the option of depending on external packagers for some platforms, while they package themselves for others.
If they don’t care about users on those platforms with external packagers, maybe those users should complain to them, since they have several options to overcome this:
– coordinate external packaging efforts and release announcements (this is what most projects do)
– provide all repository compatible packages themselves (this is what virtually nobody does)
– provide a non-repository stand-alone installer (several different options available)
It is even possible to use combinations of those for different types of releases, e.g. using a non-repository approach for betas, but use externally provided but coordinated packages for stable releases.
Just because someone doesn’t make use of available options doesn’t imply they do not exist.
Standard package format and binary installer are orthogonal concepts; both have advantages and disadvantages.
Therefore it doesn’t make sense to claim a standard package format would fix all use cases, because it wouldn’t.
Just because a software is available through distributions package repositories does not mean it can’t additionally be available as a stand-alone installer or through a secondary repository system.
My guess is that those few projects simply don’t see a benefit in having both, but this could change if their users’ feedback indicates the opposite.
As you wrote yourself, an example where this has obviously happened is Mozilla’s products.
That’s inaccurate. The fact is that there are *multiple* standards for installing Linux software (just as there are for Windows: MSI packages, packages using InstallShield, self-extracting archives, etc.).
Debs and RPMs are both standard, and between themselves cover 90% of distros out there. For standalone applications, it’s very possible to package them so that they get installed to /opt with static libraries. Other graphical installers, such as the Loki installer or others (which were listed by anda_skoa above) are all available.
This is really a false problem – it’s relatively trivial to package your software so it will work on 90%+ of distros out there. As such, there is very little software that is unavailable for someone who has one of the *big* distros (Ubuntu/Debian, RedHat or SuSE).
So why is it then that I still have to compile from source at all? I should never have to do this, if what you said is true, and I can think of probably ten different applications I have had to do this with, because the developer either didn’t put it in a repo, didn’t make a binary installer, or didn’t make one that worked with my distro.
Please go ahead and name those ten applications. I can guarantee that they’ll either be quite exotic, bleeding-edge versions of software available in repositories, deprecated apps no one’s been working on for years, or have equivalent-or-better alternatives available.
Seriously, I’m an Ubuntu Power User, I like to try out lots of programs, and yet I haven’t had to compile a *single* application since I switched to Feisty. Before that, the only app I compiled was the newest version (1.3.3) of Celestia, just because I wanted to check out what’s new (version 1.3.2 is in the repos).
Okay, that’s not entirely true: I’ve had to compile a kernel module for my SD card reader, because the current driver is buggy. It took me five minutes to find the HowTo on Ubuntu Forums, and someone had actually written a script for it, so all I really had to do was download the script and run it (I did look at the script first to make sure it didn’t do anything nasty). So while I did compile, I didn’t even type the usual “configure, make, make install” – the script did it for me. I used the command line, but I could have done it all through the GUI (there were instructions in the HowTo about doing it through the graphical interface). All in all the whole thing took me 15 minutes, and the step-by-step instructions were clear enough that any newbie could have followed them.
That said, it is possible that your particular distro has fewer packages available…but this is a discussion about Ubuntu.
So who are you to judge what software people should install?
Users should be able to install all the software they want.
Let them break their systems if they want to, for god’s sake.
Okay, I want to install Amarok on Windows.
What’s the problem? You said I should be able to install the software I want? Oh, I also want to install Half-Life 2 on my Mac, I want to install Final Cut Pro on Linux, and I want Beryl on my Windows PC as well. While we’re at it, I want to play Wii Sports on my Xbox, and Guitar Hero on my DS.
I don’t want to hear excuses, I should be able to install the software I want! I can’t? Well then, it’s Microsoft/Apple/Linux/Nintendo’s fault!!
Listen, if someone doesn’t package a piece of software for your OS or your distro, you have two choices: either you don’t use it, or you compile it from source (if it’s available for your OS/architecture).
Complaining that Linux isn’t ready because some obscure/bleeding edge/deprecated software cannot be easily installed on your current distro is disingenuous, to say the least. One cannot install old Mac OS programs to run natively on Mac OS X, nor can I play all old Xbox games on the Xbox 360. Some bleeding-edge programs are not yet ready, and the developers don’t feel like taking the time to make them available for all OSes/distros/architectures. Some apps were abandoned by their developers, and are no longer maintained by anyone. That’s not a failure on Linux’s part, that’s just how the bloody software industry works!!
Users *should* be able to install the software they want, but if the ISVs don’t package it for their system, tough luck! That doesn’t mean that Linux isn’t ready for the majority of users, who’ll be more than happy to use whatever is available in the repositories.
Things I have had to use the command line for:
– Avant Window Navigator
– Audacity Beta (to fix a problem with my USB microphone)
– NVIDIA drivers (64-bit, didn’t see them in the repo)
– Beryl (didn’t work when installed from repo)
– Pidgin 2.0 (wasn’t available via repo three days after release)
– Automatix
– Fixing a problem between my kernel and NVIDIA (an automated Ubuntu update broke it somehow)
– Problem with VMWare after updating to Feisty
– Working with permissioned files in GEdit
– Installing write support for NTFS
I see at least eight things on that list that would be resolved (in the meaning of not having to use the command line) if there was a universal desktop and packaging standard in Linux.
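(For what it’s worth, the NTFS item at least is close to a two-liner nowadays – assuming Feisty with universe enabled and an existing fstab entry for the partition:

    sudo apt-get install ntfs-3g
    # then change the entry's filesystem type from ntfs to ntfs-3g, e.g.:
    # /dev/sda1  /media/windows  ntfs-3g  defaults  0  0

Still the command line, granted, which is rather the poster’s point.)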
Even if this does work (I don’t know, I’ve never tried it, I only use Firefox)
It works, in fact I just removed it using Synaptic (I use Firefox, too).
Small developers don’t need to make 10 different builds; they can focus on the “new user friendly” distros like Ubuntu and SUSE. And if they want to keep things really simple, just make a .deb version, a .rpm version and a source tarball. That would work on a large number of distros.
Having a source tarball available also allows people to test software before the developers have official test builds available. You usually don’t get that option with software running on MS Windows.
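To give an idea of how low the bar is on the .deb side, here’s a minimal hand-rolled binary package – every name here is illustrative, and real packages should use debhelper instead:

    mkdir -p someapp-1.0/DEBIAN someapp-1.0/usr/bin
    cp someapp someapp-1.0/usr/bin/
    cat > someapp-1.0/DEBIAN/control <<EOF
    Package: someapp
    Version: 1.0
    Architecture: i386
    Maintainer: Some Developer <dev@example.com>
    Description: Example application
    EOF
    dpkg-deb --build someapp-1.0

That produces someapp-1.0.deb, which installs with a double-click via GDebi.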
Relying on source makes the developers lazy. On Windows, they push out alpha and beta builds in a binary installer all the time.
You said that they can focus on certain distros, but the fact is, many don’t. Many just provide the source tarball, and their instructions say “Here is the source. Feel free to install it and try it out!”
One example is Pidgin 2.0. I wanted to install it the other day because it was listed on Digg as having been released. On Windows, I simply downloaded the Win32 installer, double-clicked, and installed. Done and done! On Ubuntu, I went to their website, and here are the options I had listed:
Windows
Fedora Core
Source
(copied straight from the website). So my options are Win32 (can’t, obviously), Fedora Core (not compatible), and source (I don’t want to, nor should I have to). I went to the Ubuntu forums, and someone had made a .deb, but it was 32-bit, so it would not work. The most common response was “Just wait, it will make it into the updates in a few weeks.” So I’m expected to wait a few weeks to use it, when I could have used it right away on Windows? Not the best selling point. Fortunately, after digging around for a half-hour or so, I found someone had compiled a 64-bit deb, which worked.
A similar example was the Avant Window Navigator. Same scenario, except I think it was only available as source. For that one, I couldn’t find anyone with a 64-bit deb file, and finally had to build one myself (which I then shared on their forums). I’ve had to do this with probably at least 5-10 other apps over the course of four months.
Once again, it is the very computer savvy people that have not used Linux that have the hardest time switching over. Many of them desperately want to, but it is simply too hard to, with not enough gain. The other fact of the matter, is that these people are generally the ones most likely to get the rest of the world to switch over, because they interact with that level of people more.
I know personally, my experience with trying to get applications installed is the only reason I don’t push everyone else I know to use Linux (Ubuntu in this case). Until I am able to find a binary installer or repo listing for every single application that I want, I’m not willing to move anyone over, because I have to help them, and quite literally, I don’t have enough time or knowledge of how to do this to support them.
It’s available for Windows? I doubt it.
Here’s a clue: what do you do when an app isn’t available or ready for your OS? You don’t use it!
Really? Where can I get the binary installer for the beta version of Photoshop CS3?
Seriously, the difference is that, usually, Windows development takes place behind closed doors, and you can only install an app when it’s ready. Some developers do provide installable betas, but that’s only a minority of them IMHO.
On Linux, nearly all development is transparent. You can try out stuff that’s not ready (like AWN), while for a similar Windows app you’d have to wait until it was.
You can get the beta from Adobe, if you are on their beta team. Nonetheless, that isn’t even the point I was trying to make, and I think you know that.
Yes, it is great that the development is transparent in the Linux community. I am very happy that I can get nightly builds. I don’t care if they don’t work, I understand the point of a nightly build. What I do care about is that in order to install them, I have to download the source and compile it myself.
The point I was trying to make is that it is needlessly complicated to install software in Linux. It has gotten better, through packages and repo’s, but if what you are trying to use is not there, then what is the point?! I understand nightly builds not being in a repo, but why isn’t it in a binary installer?
The reason is, there is no standard for a binary installer in Linux, that works across all distros. Autopackage is working towards that, but unfortunately, for some reason, no developers use this.
It is a pain in the ass to compile from source; it comes with complications (i.e. dependencies); if you are new to Linux it is hard to find out how to do it properly… the list continues. The Linux community needs to agree on one standard binary package format, and rally around that. That doesn’t mean that the repos should go away – quite the opposite.
In SkyOS, we have the Software Store (similar idea as the repository paradigm), but we also have the .pkg format. If you download a .pkg file, you can right-click on it, choose “Install”, and it gets installed. Why can’t I do this on Linux?!
Well, from my point of view, if there is a solution (in this case several, besides autopackage) available but not widely used, it doesn’t make much sense to claim it doesn’t exist and talk about how much better things would be if it did.
IMHO it would make more sense to start requesting that it be used.
Obviously there is room for improvement, but (falsely) claiming that there are no stand-alone installer options available, and suggesting that a change to the orthogonal repository system would solve it all, does not help get your point across.
It’s ironic that, if you use Wine and the Windows version on a non-Fedora distro, it’s easier to install than having to compile from source.
If only people used autopackage…
I’ve been using Ubuntu now for the better part of four months. I’ve used probably three other distros for a total of about six months, since roughly 1999. I’ve always had to go crawling back to Windows because at the end of the day, I just couldn’t do what I needed to do.
Ubuntu has been the first distro that I haven’t ultimately formatted over. I’m typing this in Ubuntu right now. For all the normal tasks, such as web browsing, e-mail, IM, etc. Ubuntu has worked fantastically (thanks mostly to the fact that the respective applications have finally started to mature). For some things (working with NTFS, etc.) I’ve been able to do it, though with some issues.
However, there are still times when I have to drop to a command line to fix or install something. Honestly, I don’t mind the fix part of that, because I tinker a lot, and a normal user likely won’t be doing what I do to break things. However, a user should never, I repeat NEVER have to drop to a command line to install something. And even still, if that ever was the case, it should be as simple as “install appname”. The fact that Linux distros still force you to compile from source is absolutely ridiculous.
If someone(s) would step up and find a universal way to make all software available for installation via a double-click method, I would be ready to start getting friends and family on board. Until that point comes, I don’t want to be the one to have to support it, because I can barely support myself…
All that will be moot of course once SkyOS is ready to go.
—————————————————
Wow, your post is so in-line with the way I think/feel I had to double-check to make sure I didn’t write it and forget.
Needless to say, you’re 100% right on. Even about the SkyOS comment. (I’m a huge Robert fan)
There was a time when everybody considered Apple too obscure to be useful. They argued that software developers would NEVER spend resources developing for a new operating system with an insignificant install base.
Now, I am not saying that Canonical will be the next Apple. But with a fair few million users out there and a fully supported hardware platform, who is to say that we won’t see it become a phenomenal success (except for some of the guys above)?
You know, it is sort of the same with less-popular game consoles. Why would one develop for a hugely popular console (with thousands of titles to compete with), when you could create a fantastic title on a less popular model and capture that entire market?
Every Dellbuntu (for the record, I coined that term here, today) user would immediately start screaming for accounting software, games, etc. Let’s just hope developers are poised to take advantage of the opportunity.
I’m not sure Canonical are interested in being the next Apple. I agree someone needs to enter the marketplace and put Ubuntu machines on a par with Apple’s offerings. I doubt Canonical are interested in getting into the hardware game, though.
I also doubt Dell could create a business model around that style that wasn’t completely incompatible with their existing business.
My personal opinion is that Dell will use this to reach a niche and generate a lot of advertising buzz online. If the sales ever started to reach a level that Microsoft could be concerned about, Microsoft is in a position to stop Dell. I’m not trying to bash Microsoft unfairly here; I think any business would (should) do the same. So Dell is in a catch-22: either the sales will be poor, or their existing sales of Windows PCs will be hit hard. My money is on poor Linux sales due to lack of effort.
I think you’d better search the archives before claiming that.
For a little bubble-burstage you may want to google “dellbuntu”. Turns out you didn’t coin the term today.
http://www.google.com/search?source=en&q=dellbuntu
Sorry.
But sadly, the difference is that Apple users demonstrated their willingness to pay for differentiation. This made them a viable market; a few million users paying a premium for a specialized platform can’t really be compared to a few million users revelling in a freely-delivered platform, as far as market viability goes. It will take more to convince the ISVs that Linux is a viable market of users willing to purchase software, and it becomes something of a chicken-and-egg scenario.
See above. Canonical’s business model revolves around being able to convince users/integrators to pay for support. I suspect there is not a line-up of Ubuntu users signing up for $250 USD annually for desktop support from Canonical, or $900 if you want 24×7. The part of Shuttleworth’s business strategy that I haven’t been able to figure out is how to convince the ISVs that Linux is a viable market for commercial software when Canonical is struggling to generate commercial support revenue from their vast userbase. So there are no givens here; it definitely remains to be proven.
Er, business 101. This isn’t about marketshare, which is a number companies often like to hide behind when trying to deflect attention away from other factors; it’s about revenue. There are fixed costs to development regardless of the platform you select, and you need to dilute them by reaching as wide a customer base as possible; otherwise you have to mark up your product to attain the same profitability with a smaller userbase. I’ll admit there’s a certain cachet to being the top dog even in a smaller market, but it’s useless if it isn’t generating enough revenue to pay the bills. All other things being equal, you pick the platform that will reach the biggest market.
Dellbuntu users screaming for accounting software and games have purchased the wrong platform, and this is what I fear the repercussions will be. Frankly, Dellbuntu users, at least based on the systems Dell has selected, will likely represent the economy-oriented or niche-specific power-user segments of Dell’s customer base, in marketing terms; that’s hardly going to carry influence with the ISVs unless the numbers are dramatic. To play devil’s advocate, it would make more business sense to invest in developing software for OS X, where you have a userbase that is accustomed to paying, often at a premium, for software, than for an unproven segment that frequently extols the virtues of “free” software.
Don’t get me wrong, I think Linux is a viable market segment that deserves ISVs’ attention, but I also think we need to quit trying to map Linux demographics to traditional old-school commercial models. It just won’t work at this point. And before you ask, I don’t know what the proper answer is either, otherwise I’d be courting VC whores right now to become the next IT maven, rather than posting about it…
Feisty 7.04 absolutely leaves XP in the dust, so I can only imagine Vista would seem like a pregnant slug on morphine, not having used it.
I have to use Windows for work, and it aggravates me no end how slow it is compared to Linux these days. I know a lot will say XP is faster, and a while ago I would’ve agreed with them, but not anymore! XP has become a chore to use on my system: E6600 Core 2 Duo, 8800GTX GPU, 4GB of DDR2 800MHz RAM, twin SATA II drives.
If vista is even slower than XP, then I can see more and more people moving over to Linux which will further encourage more hardware and software places to offer better support!
Yeah, I agree. I’m a KDE fan (I want the web browser to remember my Yahoo ID, which Firefox does not), and this is the first version of Kubuntu I’ve installed and will keep using; their KDE is *much* more responsive this time. I’ve tried Vista on someone else’s computer and it was an absolute dog; I had ghost tooltips appearing and then not disappearing for several seconds, and everything took ages. The only app I used was Internet Explorer. The system was an Athlon 64 with 512MB. It’s the only OS I’ve ever seen which could not display things properly with that amount of RAM.
Also, GNOME really does not look like Windows. If anything on Linux really copies the Windows model it’s KDE (from left to right: main menu, application launchers, OK the desktop switcher is new, application switcher, system tray, clock). It just looks radically different because of its themes.
As I’ve already said in another thread, I recently tested boot times between Vista (on the Hard Drive) and the Kubuntu LiveCD, and amazingly on a 512MB machine it was faster to boot into the LiveCD (2m10s) than Vista (2m35s)!
Fascinating insights here from Tiddlypom Sweetypie such as “Best of all, Ubuntu is fun.” Would you like that with chocolate fudge sauce?
OTOH, I think it’s possible to be contrarian and say that the Dell-Ubuntu deal could easily turn out to be a disaster. At issue is the notion that Dell is fairly desperate at the moment and grasping at straws. If this gambit doesn’t work then they’ll quickly drop it, just as they made a big song and dance about Linux a few years ago and pretty well canned the lot one year later. Dell needs better financials and their real aim is to best rivals like HP, Lenovo and Acer, not to support Linux per se.
Were Dell to backpedal for a second time, Ubuntu and Linux would take some hard knocks. Gleeful rivals would present Linux as the OS that had failed repeatedly to capture the public imagination outside of a few geeks.
Is there a chance of this happening? I think there is. Dell + Linux represents only a fairly small saving over Dell + Windows, because an OEM can offset the cost of Windows with marketing and crapware contributions. So Dell + Linux is not that compelling for a purchaser judged purely on price. What might be more compelling is the notion that Linux is a more sophisticated offering that allows you to do ten times more, much more securely. But the moment you say this, you are moving smartly away from a mass-market “save money” pitch.
Is there a trap here? Something along the lines of “It’s very hard to sell Linux as a niche product because the audience is both small and too sophisticated to pay over the odds for it. On the other hand, it is also very hard to sell Linux as a mainstream product because the audience is too unsophisticated to judge it on anything more than price.” We’ll soon see.
That was one of the more insightful posts I’ve read here.
I think you’re right, in an ideal world, Linux PCs would sell for more than MS Windows machines. After all you pay for quality everywhere else. Sadly the Linux is FREE mentality is a little too pervasive for that to work. It’s a shame that unless you’re very careful, OEM Linux sales just don’t create as large a profit as Windows.
One of Dell’s recent comments that’s been overshadowed on sites like this by the Linux news is that they seem to be considering selling their machines through brick-and-mortar partners. It’s probably something they do need to do, with the upsurge of laptop sales and people’s desire to go for looks over specs these days.
I wonder if they would put Linux on display as part of the Dell brand? 😉
“in an ideal world, Linux PCs would sell for more than MS Windows machines. After all you pay for quality everywhere else.”
In an ideal world, people would pay reasonable prices for software, and Linux prices are very reasonable. Linux will get many new users that are fleeing the software dictatorship.
Toshiba should be selling Linux computers soon:
http://www.slashgear.com/toshiba-rumored-to-install-linux-on-notebo…
Is there a chance of this happening? I think there is. Dell + Linux represents only a fairly small saving over Dell + Windows, because an OEM can offset the cost of Windows with marketing and crapware contributions. So Dell + Linux is not that compelling for a purchaser judged purely on price. What might be more compelling is the notion that Linux is a more sophisticated offering that allows you to do ten times more, much more securely. But the moment you say this, you are moving smartly away from a mass-market “save money” pitch.
Sure, it could (and probably will) fail. Dell has to build a support infrastructure for the PCs that it sells with desktop Linux preinstalled. Dell isn’t going to give away desktop Linux support for free. In all likelihood, given its deployment timeline, Dell will outsource this support (probably to India or Pakistan). Consequently, the cost of desktop Linux isn’t going to be “free” to the end-user, which diminishes any competitive advantage that it might have had. Desktop Linux advocates tend to lose sight of this fact. And here’s the ultimate irony: The people that are most likely to want desktop Linux are the ones that are least likely to PAY for it! Sorry, this spells disaster to me. Dell is desperate here. Desktop Linux ain’t the solution. It’s a niche product.
Sure, it could (and probably will) fail. Dell has to build a support infrastructure for the PCs that it sells with desktop Linux preinstalled. Dell isn’t going to give away desktop Linux support for free. In all likelihood, given its deployment timeline, Dell will outsource this support (probably to India or Pakistan).
My understanding is that while Dell will support hardware, all software support will be carried by Ubuntu. Punters can opt for free support (the Ubuntu website) or pay-for support. This isn’t going to cost Dell anything, or at most a minimal amount.
Correct. The Canonical COO is quoted here:
http://www.linuxplanet.com/linuxplanet/newss/6383/
“Dell will be announcing a partnership with Canonical to ship pre-loaded Linux models with Ubuntu,” stated Jane Silber, Canonical’s Chief Operating Officer. “They’ll be selling these models from their Web site with Ubuntu pre-installed. Canonical in turn will be working with Dell to certify those models to insure that all components are fully functional and will also provide support that will be sold through the Dell Web site.”
This isn’t going to cost Dell anything, or at most a minimal amount.
My point was that whether or not Dell supports it directly or not is irrelevant. SOMEBODY has to do so. So, regardless, the cost of support will be inevitably incorporated into the price of the operating system on the machine. Since desktop Linux advocates have been telling us that it’s so much less expensive than Windows, I just wanted to point out that the cost of support won’t be borne solely by the OEM or the OS vendor — it will be passed on to the customer. So, any price advantage for Linux will be reduced, if not eliminated.
Modding me down isn’t going to change the reality of my post, losers.
“I just wanted to point out that the cost of support won’t be borne solely by the OEM or the OS vendor — it will be passed on to the customer. So, any price advantage for Linux will be reduced, if not eliminated.”
Considering what MS charges for Vista, Linux will have a significant price and quality advantage.
Considering what MS charges for Vista, Linux will have a significant price and quality advantage.
You’re confusing retail prices with what OEMs charge. Hint: There’s a big difference between the two.
So now Microsoft are giving support calls to users for free?
Last time I checked it was £80 in the UK for users to get support on Vista through Microsoft, £49 for XP.
“You’re confusing retail prices with what OEMs charge. Hint: There’s a big difference between the two.”
Vista OEM prices can’t beat Linux:
http://blogs.zdnet.com/Bott/?p=187
Linux is superior:
http://news.com.com/2102-1044_3-6181366.html?tag=st.util.print
http://www.regdeveloper.co.uk/2007/04/29/vista_end_dream/
Edited 2007-05-05 23:59
While I do not agree with Tomcat, Dell and the other tier-1 manufacturers are very likely charged well under the generic prices available to pretty much anyone.
Support is a concern, though I honestly do not think it is all that big a one. Dell does not support MS installs either beyond basic install questions. Same for Microsoft themselves.
“Dell and the other tier-1 manufacturers are very likely charged well under the generic prices available to pretty much anyone.”
Maybe, but Linux is FREE and Vista is crippled unless you pay extra for the “premium” and “ultimate” versions. People can also find FREE support online.
Maybe, but Linux is FREE and Vista is crippled unless you pay extra for the “premium” and “ultimate” versions.
Point is … if all you’re doing is competing on price, then you’ve already lost. Given Dell’s volume discounting, Windows isn’t really all that expensive.
“Point is … if all you’re doing is competing on price, then you’ve already lost. Given Dell’s volume discounting, Windows isn’t really all that expensive.”
Linux competes with Vista on price, quality, and freedom, and Linux obviously wins on all of them.
Linux competes with Vista on price, quality, and freedom, and Linux obviously wins on all of them.
Linux has a long way to go to match the quality of the Win2K3 Server codebase — which Vista is built on top of.
“Linux has a long way to go to match the quality of the Win2K3 Server codebase — which Vista is built on top of.”
Vista = Quality? You seem to enjoy using low quality software. Are you an MS Bob fan too?
Vista = Quality?
Educate yourself by browsing over to secunia.com and looking at the security vulnerabilities reported for Win2K3 Server, Vista, Linux, and OS X — with special emphasis on Linux and OS X. Next, take a look at Firefox and IE7; Apache and IIS6. I think that you’ll be surprised by the number of bad vulnerabilities present in the open source software relative to the commercial alternative. And you know why? BECAUSE ALL SOFTWARE IS VULNERABLE AND HAS QUALITY ISSUES. Open source software isn’t some kind of magic bullet that changes the practicality of software engineering dynamics. You seem to think that branding something as “free” anoints it with some kind of quality sticker. It doesn’t work that way in reality.
Windows isn’t really all that expensive.
Running Vista, though, is.
Running Vista, though, is.
How so?
Windows support isn’t free either… I know, because *I’m* the one that people around me call when they want to fix their broken Windows PC.
The advantage of a Dell PC with Ubuntu pre-installed is that it’s likely to require much less support than a Windows PC. There will also be community support available (which, in the case of Ubuntu, is excellent) in addition to paid support provided by Canonical, so really I don’t see where the problem is.
It doesn’t surprise me that you would predict doom and gloom for the Dell-Ubuntu partnership. After all, it’s clear from your posting history that you *want* Linux to fail, and so you will never present any Linux-related issue in a positive light. However, I do believe that you are completely wrong about this: I think the Dell-Ubuntu agreement will be very profitable for both companies. I already have people around me who have never run Linux who have already expressed their interest in at least trying a “Dellbuntu” PC.
The advantage of a Dell PC with Ubuntu pre-installed is that it’s likely to require much less support than a Windows PC.
You’re dreaming. Support also includes how-do-I-do-this-in-Linux kinds of questions. And there are going to be a LOT of those…
http://ubuntuguide.org
Ubuntu also has a nice and helpful community for those kinds of questions. Users are going to enjoy Linux and have fun with it.
I don’t want Linux to fail; if anything, I would prefer that it succeed in order to provide competition to Microsoft.
I don’t want Linux to fail; if anything, I would prefer that it succeed in order to provide competition to Microsoft.
Since when did Linux need to succeed?
I’d say the desktop user experience is pretty decent for 0% market share, wouldn’t you?
Sure Linux isn’t perfect but does it warrant NULL desktop market penetration?
Is Windows so perfect to deserve 97% of the pie?
What about Windows 98 and ME? They still have greater marketshare but are they better?
I think that if Linux were bad enough to deserve such a hideous ranking, nobody would ever be using it for any purpose.
So this leaves only one thing: who has a monopoly and who does not.
I know there is some marketing going on with Dell and Linux; however, I feel that it’s only aimed at people in IT. To make the Linux deal succeed, Dell needs to market it to everyone: I want to see adverts in general IT mags, on TV and on billboards. This is a good opportunity for Dell to capture not only the IT-savvy but also people who haven’t purchased a computer before. Really, what I am trying to say is that I hope the Linux PCs don’t stay in a hidden part of the Dell website.
“as I move around the different sub-directories, lists of hundreds of files pop onto the screen instantly. It often takes 10 seconds or more for Windows to do exactly the same thing.”
This doesn’t sound like something I’ve experienced. It’s something I’d like to, though. I don’t know if things have changed in Vista, but Explorer has always been better at displaying large numbers of files relatively quickly, IMO.
If you want to see something interesting, navigate to /usr/bin (or any directory with a ton of files) with Nautilus (it’s disgusting), and it has to be about a billion times worse when you’re using list view.
Then try the same with Konqueror, you should be surprised. I think that actually does better than Explorer.
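For anyone who wants rough numbers instead of impressions, here is a small sketch of my own that times just the raw directory listing. Note it deliberately measures only the filesystem call, not the per-file stat/icon/thumbnail work where file managers reportedly lose their time:

```python
# Time the raw cost of listing a big directory (the path is just an example).
import os
import time

target = "/usr/bin"  # any directory with a ton of files

start = time.perf_counter()
entries = os.listdir(target)
elapsed = time.perf_counter() - start

print("%d entries listed in %.1f ms" % (len(entries), elapsed * 1000))
```

If this finishes in a few milliseconds while the file manager takes seconds, the bottleneck is the view layer, not the disk.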
This is true, but navigating remote shares in Windows Explorer post-Win98, compared to Nautilus, is painful. Don’t get me wrong, I’m not defending Nautilus, because I think it’s complete crap, but generally speaking it leaves Explorer for dead speed-wise just navigating through normal folders and network shares.
I’m not sure what type of machine you’re on, or its specifications, but on my machine (a four-year-old DIY system with an Athlon XP 3200+ (single Barton core, 2GHz), 1GB DDR400, and an ATI 9700 Pro) running Ubuntu 7.04, Nautilus navigates and displays files instantly or nearly so. In /bin, it finds 104 files. In /dev, it finds 700, and takes the longest time to display files, somewhere between 1 and 2 seconds. /etc, with 222 items, was as fast (or as slow, depending on your point of view). The slowest operation? Nautilus displaying /media/hda1/WINNT (or C:\WINNT) at 5 seconds and 512 items, and that’s because of the NTFS translation taking place. This machine dual-boots between WinXP and Ubuntu, and quite frankly, with all I have installed on Windows, I don’t notice any speed difference between the two operating systems when performing equivalent tasks.
I live mostly in Ubuntu because of Firefox and the development environment for C++, Python, Ruby, and Java. I boot Windows just to play a few games and to run Web content checks with IE. I can’t absolutely recommend one over the other, but I can comfortably answer those who ask that it’s now truly a matter of personal taste. Ubuntu 7.04 is, in my humble opinion, the first distribution to provide a decent alternative to Windows. And I would expect Fedora Core 7 and Open Suse 10.3 to do no less when they’re officially released, simply because all three share many of the same underlying software technologies.
I like your metaphor: I live mostly in Ubuntu.
BTW, I also use IE to check web content, but I use IEs4Linux so I don’t even have to boot Windows: http://www.tatanka.com.br/ies4linux/. The only down side is that it doesn’t do IE7.
Also, Wine keeps getting better, so maybe your games will run under Wine on Linux now. I used to dual-boot but it got to be a colossal bother.
In Ubuntu you can install IEs4Linux:
http://www.tatanka.com.br/ies4linux/page/Installation
You can also install Wine, and then install Firefox for Windows (it is on the Ubuntu CD); after installing Firefox for Windows you can install the IE Tab plugin. This should save you a reboot to Windows just for the sake of IE-only websites. I am replying to you from the above-mentioned Firefox.
With all the reviews coming out from not-so-technically-challenged people, it’s actually refreshing to get one from someone who doesn’t blabber on about which version of Gnome is included. Very nice read, obvious simplifications that ordinary folks can understand, and a good, positive attitude.
Love it!
Some will argue that Linux is “too complex” for the average user, or that it’s reserved for “geeks” and “enthusiasts”. I disagree. A significant number of computer users would be equally confused in front of a Linux machine as a Windows machine. I imagine Ubuntu’s support call center will pick up in volume as this becomes a reality. How good the support is will dictate how successful the Ubuntu/Dell relationship will be. Even if the call support is only marginal, it will surpass MS in satisfaction, as long as the language barrier is minimal.
-nX
Clearly not written for the techie, but aimed straight at Wanda Windowsuser. And of course I realize that describing Linux as “… an open-source operating system that does everything Windows does — only faster, better and much more cheaply” isn’t exactly an accurate description, but is probably close enough for the average user.
Now, what I REALLY want to know is: why did the author replace Fedora with Ubuntu? I use them both.
I think the author of the article is definitely in the Romantic Phase of using Ubuntu. He’s not entirely realistic and is very biased.
I use only Ubuntu at home (on my Macbook to boot) but his article has no negatives regarding hardware support or software availability.
You can’t simply overlook the mass amounts of commercial software that aren’t available and hardware driver support when reviewing an OS. Especially since the article is aimed at non-technical Windows users.
Just my 2 cents… while I type this from Ubuntu.
One would say he has no negative opinions because he is using a 100% compatible Linux laptop.
I think a review on a 100% compatible laptop is a waste of time, as the more important reviews are of those bits of hardware which were bought originally to run Windows, where the end user has changed their mind and now wants to run Linux or some other open-source/free operating system.
That’s where the big growth will come from.
Yes and no. The fact is that the article originally talks about Dell Ubuntu PCs, which will (obviously) be 100% Linux compatible.
I do agree that the author would probably have written a different article if his laptop had not played well with Linux, though. That said, Feisty has achieved new levels of compatibility with laptops. My laptop used to require much tweaking and tinkering to get everything working. When I installed Feisty on it, *all* hardware was detected correctly (well, maybe not the modem…to tell you the truth I’ve *never* used it on this laptop, not even in Windows).
I did have to install ATI drivers to get 3D acceleration, but that was it. I was quite impressed.
For me, everything actually works. The only thing I had to do was download the Ricoh webcam driver, make, then make install, reboot, and voilà: the webcam and everything else works out of the box.
With that being said, I’m looking at upping my C skills and porting drivers to OpenSolaris – my preferred OS.
The author writes that Ubuntu is 1/7th the bloat of XP. If I remember correctly, XP after install and patches takes about 1.5-2.0GB of disk space. Is the author saying that Ubuntu takes only 215-286MB of disk space?
The author also claims that XP is sluggish on a laptop with 2 GB of RAM, running a 2.16 GHz dual-core Intel processor and a 7800 RPM hard drive. The author needs to go into more detail when making a statement such as this.
Well, actually the author claims that an Ubuntu installation takes 1/7th the disk space a Windows installation takes to perform a very specific task. As always, YMMV.
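Anyone who wants to sanity-check the disk-space claim on their own machine, rather than argue about it, can do so in a couple of lines (a sketch; it reports whole-filesystem usage, so subtract personal data for a fair comparison):

```python
# Report how much of the root filesystem is actually in use.
import shutil

usage = shutil.disk_usage("/")
print("used %.1f GiB of %.1f GiB" % (usage.used / 2**30, usage.total / 2**30))
```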
Nothing more than a Deb junk pile of code.
No thanks I will continue to use Fedora.
I hate Jubunku with the African drums and some other garbage that doesn’t work!
I read the Ubuntu review and said to myself, “sheesh, this is nothing but unwarranted hyperbole.” After all, claiming that Ubuntu takes 1/7th the resources and is significantly faster than Windows XP is a bit much.
Then I come here and see people talking about compiling kernels and having to drop to the command line to configure things. Cripes. I don’t remember when I last compiled a kernel, and the only reason why I did it was because I’m an OS masochist. I have never compiled a kernel for hardware support. While I do drop to the command line to do a fair bit of configuring, that is only because I’m a tweaker. Most users wouldn’t even care about tweaking what I tweak, so it won’t be an issue for them. (I do the same sort of thing in Mac OS X, yet people would never criticize Mac OS X for the ability to do this.) The fact is, Dell is going to ship hardware that is ready for Ubuntu anyway, so all of these supposed rough edges are going to be even less of a concern.
At the end of the day, I think that a few caveats should have been added. A product like Ubuntu would be great for users who want to use their computer for socializing online, certain types of graphics and office work, novice or Unix software development, and many other things. But if you want Ubuntu to play anything more sophisticated than casual games, or you like buying hardware without keeping an eye on compatibility, then Ubuntu isn’t there yet.
After all, we want new and enthusiastic Linux users. We don’t want people to try it and be let down.
7800 RPM harddrive…
Where did he get it?
7200 RPM, okay
10000 RPM, sure (not in a laptop though…), but 7800 RPM?
Yeah, I know it’s a typo, but still…
KDE and Gnome only look like whatever you wish them to.
The defaults might appear similar to Windows, but some customization goes a long way.
http://www.gnome-look.org/
http://www.xfce-look.org/
http://www.kde-look.org/
Look under the Information pane for more sites.
Okay – this review was just over the top.
I like Ubuntu. I use Kubuntu 7.04 on two of my machines.
But for this reviewer to say Windows (which? XP?) crawls on his hot 2+GHz laptop with 2GB RAM… but Ubuntu just sails along… strains credulity.
I have an old Latitude C610 (P3-1.2Ghz) and Windows XP Pro works great on it. Can it run the latest hottest games? No. But browsing, Office 2003, even VMWare Workstation running Kubuntu… all are fine.
I had Windows XP Pro running quite well on an old P3-450 desktop. Which *had trouble running Ubuntu 6.06*. As long as it has enough RAM, older machines can handle WinXP. Now… if the reviewer is refering to Vista… that is a different animal.
Something is wrong with this scenario. And to say “Ubuntu does everything Windows does” is simple rubbish (unless you start getting into Wine to run Windows-only apps). Ubuntu does *almost* everything Windows does. I am on staff at a church whose people/finance software suite is Windows only – no Mac or *nix versions that I know of.
Praise Ubuntu. May it grow and thrive. May DELL offer it preinstalled. But please spare us these rather distorted(?) comparisons to Windows.
@darthkeryx
“I am on staff at a church whose people/finance software suite is Windows only – no Mac or *nix versions that I know of.”
http://www.churchdb.org/
I went to freshmeat.net and typed church in the search. I do think this type of searching needs to be part of the next generation of package managers.
Last time I looked, a fresh XP was snappier than Gnome and needed less memory. I’m not sure that’s true anymore; I know there have been efforts to reduce memory usage. But for an older system like a P3, Xfce 4.4.1 (and the .1 is important) is a better choice than Gnome.
“I am on staff at a church whose people/finance software suite is Windows only – no Mac or *nix versions that I know of.”
Is that ACS People Suite (http://www.acstechnologies.com) or ChurchWindows?
Anyway, ACS needs only TWO (2), possibly THREE (3), computers: one for the database server and one (or two) client PCs (one for the accounting office and one for the church administrator/pastor). Anybody else could use Linux (but they don’t).
And that’s very interesting; Linux penetration in church offices seems to be harder than in any other enterprise-like environment.
I administer 20+ PCs (Windows 2000 Small Business Server) with a mix of Win 98/ME/2000/XP workstations in our church, and I have found it really hard to get people to switch to Linux (unfortunately).
Only three or four people in church need direct access to the (Windows-only) ACS database: the main accountant, the pastor, the church administrator, and a volunteer (church attendance and contribution-tracking data entry). Everybody else uses their workstations only for web browsing, email, calendaring and light desktop publishing, where Linux can be a more than good replacement for Windows-based machines. But no! Lazy conformists always find a way to get Windows-only solutions. (Church members are going to pay for whatever the church admin staff find a better fit, anyway. And that’s sad.)
Fifteen out of twenty church computers could be Microsoft-free (to the best of my knowledge) but…
Thank God, I managed to force church management to outsource our email (so I’m free of Exchange 2000) and web hosting. The next step is going to be installing Linux on each and every computer that came with a Windows OS (usually donated computers with no restoration CD). It is going to take a year or two.
Missing packages for new releases for software is not just a Ubuntu problem, it is a Linux problem.
Nobody has a solution for installation of 3rd-party software which the distros fully support.
Some solutions exist, however: klik (http://klik.atekon.de/), autopackage (http://autopackage.org/) and zeroinstall (http://0install.net/).
Even if the Linux world settled on a common system, who says that software developers would make packages for it?
There are more solutions than that. You have standalone installers, including the Loki Installer and InstallJammer.
Also, by offering a .deb and a .rpm you are able to reach about 90% of the Linux install base.
The proof is in the pudding: there are very few ISVs for whom this appears to be a problem. Most commercial applications for Linux do offer installers/packages for a variety of distros. It’s not that complicated, especially if you use static libraries (which, for commercial apps, are a good idea anyway). The only real issue is desktop/menu entries, but with the standards set forth by freedesktop.org that’s not really a problem anymore either.
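For reference, the freedesktop.org part really is small. Here is a sketch of the installer side: dropping a standard Desktop Entry into the per-user XDG location so the application shows up in both GNOME’s and KDE’s menus. The application name and paths are made-up placeholders:

```python
# Write a freedesktop.org Desktop Entry for a hypothetical app so it
# appears in the desktop menus; the keys follow the Desktop Entry spec.
from pathlib import Path

desktop_entry = """[Desktop Entry]
Type=Application
Name=MyApp
Exec=/opt/myapp/bin/myapp
Icon=/opt/myapp/share/myapp.png
Categories=Utility;
"""

target = Path.home() / ".local/share/applications" / "myapp.desktop"
target.parent.mkdir(parents=True, exist_ok=True)
target.write_text(desktop_entry)
print("menu entry written to", target)
```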
The problem is when developers just put up a tar.gz for you to compile.
Until end users no longer experience this, the complaining will continue.
Who’s to blame, I don’t know.
Saying that the problem is solved is ignoring the problem or saying people should just adapt to the system.
I’m not saying the problem is solved; I’m saying that for 99% of users it’s not a real problem. Developers who only provide a tar.gz file are few and far between.
The fact is that End Users don’t “experience” this until they actively look for such packages. Look at it this way: having a .tar.gz is better than having nothing at all. So it’s a bonus, not a penalty.
The solution is simple: if an ISV only provides a .tar.gz, just don’t use their software. As I’ve said above, this is such a rare occurrence that at the end of the day it doesn’t really make a difference.
The problem is when developers just put up a tar.gz for you to compile.
Until end users no longer experience this, the complaining will continue.
Except that many of us end users want those tarballs. We want to see the code. I can and do tweak the code of some programs so that they behave like I want them to. Or I go through the code to see how they do something so that I can incorporate it into my own code.
Maybe I’m not a typical user, BUT people who don’t want to compile stuff should just ignore the tarballs and pretend they don’t exist. Those tarballs aren’t for them. Wait for the binaries to come out. It’s the same thing they do in the Windows world, after all. It’s just that in the Windows world people don’t see the time lag between when a developer finishes code and when it’s available to the masses as a binary.
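For the curious, the “time lag” work being discussed usually boils down to the classic three-step tarball dance. A sketch, scripted for repeatability; it assumes an autotools-style source tree, and the directory name is a placeholder:

```python
# Automate the usual ./configure && make && sudo make install sequence
# for an already-unpacked source tarball.
import subprocess
import sys

def build_from_source(srcdir):
    steps = [
        ["./configure", "--prefix=/usr/local"],  # probe deps, write Makefiles
        ["make"],                                # compile
        ["sudo", "make", "install"],             # copy into the prefix
    ]
    for cmd in steps:
        if subprocess.run(cmd, cwd=srcdir).returncode != 0:
            sys.exit("step failed: %s" % " ".join(cmd))

if __name__ == "__main__":
    build_from_source("./myapp-1.0")  # placeholder source directory
```

Dependency errors from ./configure are exactly the part no script can hide, which is why the binaries-for-the-masses argument stands.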
The first time I had Ubuntu running, a reporter asked to see the 264 iPod complaints for the previous year. I compiled and downloaded them into the OpenOffice word processor as fast as my fingers could move across the keys. I was at first convinced that I must have opened an old file by mistake, but no — the compilation was up to the minute.
What in God’s name is he talking about here?
In any case, I don’t agree. Ubuntu is not my distro of choice, and with that said, Dell shipping Ubuntu on their hardware will not hurt Microsoft in any way, and I’ll tell you why.
Dell will probably not use their high-end systems for the Ubuntu ones. They’ll most likely use this opportunity to get rid of their low-end machines bundled with Ubuntu. It’s a great marketing strategy for Dell: this way they’re selling twice as much hardware and making money off of last year’s machines.
I browsed through this piece, and like other commentators here, was amazed at the stupidity.
First, GNOME does not look like Windows. Even KDE doesn’t look like Windows, although the differences are not terribly great.
Second, how this idiot blames Windows for having 21GB of Web and email files on his XP system, and then compares it to the 4GB of system files on Ubuntu, is beyond me. Does this moron think he won’t have the same email and Web files on his Ubuntu machine? WTF? A Windows XP system directory holds maybe 4-6GB depending on how often you clean out the crap and how many updates you’ve installed. (I keep my installed programs on another partition to avoid overflowing the system partition for just this reason – and of course my data is elsewhere, too.)
And given his laptop hardware, I don’t see how his system is that slow with XP. Vista, yes, I could believe it, but not XP. I’m running dual-boot Kubuntu 6.10 and Windows XP on what is a comparatively ancient 1.66GHz AMD Athlon with 512MB of RAM, and both OSes are quite adequately fast. Certainly I don’t see a huge difference in speed on either side. I DO see Windows bogging down when a program screws up – and that happens more on Windows than on Kubuntu – but that’s it. The basic speed of both systems is pretty nearly the same. (Feisty may be faster, but I’m not upgrading to that until a couple of months from now, to make sure any bugs are squashed.)
This article reads like an Ubuntu fanboy trying to trash Windows. Not that I mind trashing Windows – it deserves it – but this kind of over-hyped FUD is no help to Linux when the average end user hits the reality of poor driver support and the failures of Canonical’s QA team.
It seems that Dell UK, in their Infinite Wisdom, have decided not to install Linux on PCs in their territory.
Grr.
Used to watching movies etc. on a TV screen with an nvidia card under Windows XP?
Forget about it under Ubuntu.
After you install the nvidia drivers (which is quite simple in 7.04 compared to previous releases), prepare for “battle” with the nvidia-settings applet.
I noticed it has to be used in two ways. The first is sudo’ed, when you want to set the basic parameters stored in xorg.conf (screen mode, X-screen configuration, resolution, etc.); sudo lets it save settings to /etc/X11/xorg.conf. The seemingly logical approach of just changing that file’s permissions is not enough: changes are not saved, and I don’t know why.
The second way, running nvidia-settings without sudo, is required to change settings like X-server color correction. Those are saved in the .nvidia-settings-rc file in the user’s home folder. When nvidia-settings is run sudo’ed, that file ends up owned by root and cannot be loaded after a restart (a possible workaround is sketched below).
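One possible workaround, sketched by me rather than taken from nvidia’s documentation: run a small script once with sudo to hand the rc file back to the desktop user.

```python
# Reclaim ~/.nvidia-settings-rc if a sudo'ed nvidia-settings run left it
# owned by root. Run this script itself via sudo.
import os
import pwd

# When run under sudo, SUDO_USER names the real desktop user.
login = os.environ.get("SUDO_USER") or os.environ.get("USER", "")
if not login:
    raise SystemExit("could not determine the login user")

info = pwd.getpwnam(login)
rc = os.path.join(info.pw_dir, ".nvidia-settings-rc")

if os.path.exists(rc) and os.stat(rc).st_uid != info.pw_uid:
    os.chown(rc, info.pw_uid, info.pw_gid)  # chown itself needs root
    print("reclaimed %s for %s" % (rc, login))
```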
There are other “mysteries” here. Why are the second set of settings (brightness etc.) sometimes loaded after a restart, and sometimes only after launching the nvidia-settings applet itself? (It’s very irritating, similar to the issue of two sound cards being randomly activated as primary on each Ubuntu reboot.)
Why are the brightness and similar settings for each screen sometimes swapped? I don’t know.
ps. sorry for my poor English.
I’ve been trying to learn to use Ubuntu for some time now (about eight months) as a Windows replacement (I had never used Linux before), and I’m still very sceptical. I think 7.04 is a little more polished than 6.06 and 6.10, but it’s still light years away from becoming a Windows XP replacement. Maybe 8.10 or so will be acceptable for that role, but not until then. I think the disaster called Vista could be a “driving force” for Linux/Ubuntu mass adoption, but I do not see any urgent need for XP users to “upgrade” to Ubuntu, in much the same way as to Vista. Neither is ready yet: Vista maybe after SP2, and Ubuntu maybe a year from now.
Not that setting up dual screens is easy in Linux, but to blame Ubuntu for problems you’re having with nvidia-settings is a bit much, don’t you think? Best to take your problems here: http://www.nvnews.net/vbulletin/forumdisplay.php?s=cf88bf440f39ea48…
Maybe they can help you with your problems.
Wow, I was about to make a comment until I realized this was at “consumeraffairs.com”… what a freakin’ joke. There is just so much wrong with this pathetic “review” that I don’t even know where to begin. Enough said; I just hope this sad little pathetic site does not get confused with Consumer Reports (http://www.consumerreports.org).