After the customary six months of incubation, Ubuntu 13.10 – codenamed Saucy Salamander – has hatched. The new version of the popular Linux distribution brings updated applications and several new features, including augmented search capabilities in the Unity desktop shell.
Although Saucy Salamander offers some useful improvements, it’s a relatively thin update. XMir, the most noteworthy item on the 13.10 roadmap, was ultimately deferred for inclusion in a future release. Canonical’s efforts during the Saucy development cycle were largely focused on the company’s new display server and upcoming Unity overhaul, but neither is yet ready for the desktop.
It’s also the first version available for phones. Well, for the Nexus 4.
Been a while since OSnews actually posted news about an OS. 😉
You don’t count SteamOS or Windows 8.1?
EDIT: Yes, there’s been a lot of non-OS related news lately as well, but there’s been plenty OS news too.
I prefer to call it Windows 7.1…with a SO METROSEXUAL GUI. It should be a Windows 7 service pack at the most, really.
Are you talking about Windows 8, the gutted, overhauled, UI-replaced monstrosity? If you think that only warrants a .1 over Windows 7, I’d have to respectfully disagree!
On an on-topic note, I’ve been installing almost every version of Ubuntu on my desktop since the beginning. I’ve never had a pleasant, user-friendly experience with it, and even though I’m a developer and have a Nexus 4, I’d NEVER install that on my phone.
Yet you keep installing updates???
Yet you keep installing the updates????
I’d call it Windows 7.-1
Win 8.1, SteamOS, Android… plenty of OS news of late.
That said, until Linux in general and Ubuntu in particular can fix the frankly abysmal driver situation, Linux on the desktop will remain lower than the margin for error.
For an easy-to-replicate example, take any old machine, or make a partition on your current one, and take the “Hairyfeet Challenge”. It’s really easy, it shows what a mess Linux drivers are, and it actually tips things in favor of Linux by only requiring HALF the support a standard Windows release gets.
Oh, and before anybody says “it isn’t a fair test because you don’t/can’t upgrade Windows”: show me a distro that gets a decade of patches WITHOUT upgrading or paying an insane yearly license and I’ll concede the point. Until then, the only way to keep patches current is to jump on the upgrade treadmill.
Download the version of your distro from 5 years ago and install it, making sure all the drivers are working (in the case of Ubuntu this would be 10.10, for which, just FYI, support ended in 2012), and then upgrade to current using ONLY the GUI. No CLI or “open Bash and type” allowed, as Joe Average has neither the skills nor the desire to learn the CLI, and frankly in 2013 they shouldn’t have to. If you do this, on most systems what you’ll end up with is a broken mess; even bog-standard hardware like Realtek and VIA will often end up with trashed drivers, which, again, in 2013 is simply unacceptable.
If you want to know why Linux desktop adoption is lower than the margin for error, even when MSFT puts out the most hated release since MS Bob? It’s the drivers; it’s a mess. Say what you will about MSFT, but I just recently retired my old nettop at the shop. We are talking about a circa-2003 Sempron that went from XP RTM to the last Patch Tuesday. Know how many broken drivers I had with no less than 3 SPs and countless patches? NONE.
And THAT is what you are competing against. Until I, as a system builder and retailer, can slap the latest Ubuntu on a desktop or laptop and know that the odds are better than 90% that the customer can update/upgrade for the typical 5-year lifespan without ending up with a mess, Linux will stay off my shelves. And without retail sales and, more importantly, support, Linux is going nowhere fast on the desktop.
My experience is, in Windows, it’s not the drivers breaking that you have to worry about. Especially in the XP days it was updates being applied in the wrong order, something that was fortunately fixed in Vista and above. Remember the initial IE 7.0 update applied via the Windows Update web site, and what happened when the ActiveX-based installer decided it would install the IE 7 update first and then install IE 6 security updates over top of it? You could avoid it by installing IE 7.0 yourself instead of letting Windows Update do it (something I still do for IE updates to this day), but I saw quite a few broken systems because of that and similar incidents. Of course, keeping your security patches current would prevent this as well, but we all know how well most users do that, don’t we?
Well said. The problem with desktop Linux is twofold: drivers and applications. In both cases we need a system of separation: separating drivers from kernel updates, and separating application updates from the operating system’s packages. The first bit requires some sort of stable kernel ABI, which so far the Linux community is against. Solaris got that one right years ago, and the BSDs got the other part right by separating ports from the operating system software.
darknexus,
“My experience is, in Windows, it’s not the drivers breaking that you have to worry about.”
The thing is, Windows drivers are only supported by manufacturers for a limited time before being left behind and breaking with new operating system releases.
“The problem with desktop Linux is twofold: drivers and applications. In both cases we need a system of separation: separating drivers from kernel updates, and separating application updates from the operating system’s packages. The first bit requires some sort of stable kernel ABI, which so far the Linux community is against.”
There are many (maybe even most) in the Linux community who want stable kernel ABIs, even though Linus has decided otherwise. I feel it should be stable for technical reasons; however, I also see a risk of negative repercussions. With stable ABIs, many manufacturers would be far more likely to follow their Windows model and deploy closed-source drivers under Linux as well.
The problem with closed drivers is that we’re 100% dependent upon manufacturers to update and compile the drivers for new architectures, integrate new features, fix bugs, etc. With Linux today, this isn’t usually a problem because most drivers are open source.
On Windows, I’ve seen lots of hardware stop working due to Windows upgrades, even hardware with userspace drivers that shouldn’t have been impacted by an upgrade. My parents had an HP printer/scanner/copier that was fully functional until I bought them a new Windows 7 machine; HP doesn’t provide Windows 7 drivers. (If anyone else has this problem and is so inclined, you CAN run Windows XP under a virtual machine and use the original drivers with full functionality in the VM.) I had a Pinnacle Studio video transfer adapter that I used for video editing; it doesn’t work with Windows 7, and the same goes for an Encore Bluetooth adapter. Recently Windows 8 broke a DameWare video mirror driver. In all these cases a user generally has to toss out an old working product and buy a new one solely because of driver incompatibility with a new version of Windows.
The point of this rant isn’t to pin blame on any one party, but rather to illustrate why closed-source drivers are bad for compatibility. Someone will probably retort that we shouldn’t expect things to work on a new OS that was never listed on the box, which is a fair point. However, it’s still true that closed-source drivers are largely responsible for it.
So in other words it’s EXACTLY as I have been saying for years: it’s NOT a technical issue, it’s a political one. The second you have major design choices being dictated by politics? Give it up, stick a fork, you are done.
This is why Linux on the desktop will never be more than a blip, and why companies like Google make their own GPLv2-only forks like Android: because with Linux it becomes more about “GPL purity” than having a working system.
bassbeast,
“The second you have major design choices being dictated by politics? Give it up, stick a fork, you are done.”
Haha, aren’t you a Windows user? You shouldn’t be pointing fingers.
Honestly, I don’t get your grudge against Linux or the GPL. You are welcome to stomp around proudly declaring that “Linux will never be more than a blip” if you like, but despite your opinion, it doesn’t change the fact that Linux does work pretty well for many of us. Sure, it has faults sometimes, but you’re blatantly exaggerating them while ignoring the faults of Windows. Linux is *just* an OS, a tool. Sometimes it really is the best tool for the job, other times it’s not; so what? I’ll never get why people are so adamant about treating technology like gospel.
Because we all know that in huge corporations like Microsoft, Google and Apple there are never any major decisions made solely on politics. Ever. It’s all based on technical merit and for the better of the consumer.
Well, maybe on Planet Imagination that would be the case but that’s not where we live.
Oh, I wish I had so much luck with Windows. I recently installed Windows 8 on my laptop… I had to hunt down an exact working version of the sound driver (which got automatically updated with a broken one later that day), the camera randomly stops working (and causes a BSOD once a month), the middle button of the touchpad doesn’t work as a middle button, the video driver is buggy, and another (vendor-supplied) driver refused to install after failing to find the corresponding hardware (otherwise present in the system). Needless to say, all these issues are Windows-specific: I have none of them under OpenBSD.
That’s not to mention the tons of crap that come with drivers. Even if you are careful and lucky enough to avoid installing stand-alone applications for managing wireless, Bluetooth (those are the ugliest) and what-not, you’ve still got random litter in Control Panel, “My Computer” and throughout hardware-related settings dialogs. And of course you get a bunch of autostart programs for updating, monitoring and providing OSD that you never asked for.
Oh, that sounds familiar!
I bought this HP laptop in 2010 or something, and while it worked quite okay otherwise, the sound card was fucking horrible: every now and then it just randomly decided to try to output audio to the headphone connector even when no headphones were connected, or it insisted on trying to use an external microphone when no such thing was connected. The same went vice versa, too: it insisted on outputting audio through the speakers even if you had headphones connected, or on using the built-in mic even when you tried to use an external one.
There was no way of choosing what to use in software; it was all supposed to be “automatic.” The way you normally choose input and output in Windows didn’t work, because all the devices were collapsed into one virtual device, with no way of configuring it. There were no Microsoft-supplied drivers for it, it was an HP-modified version so the reference drivers wouldn’t work on it, and the HP-modified drivers were already outdated when I bought the laptop; HP never released a single update for them.
And yet, under Linux it worked just fine with whatever drivers the OS came with.
I can’t even count the number of times these manufacturer-supplied third-party apps have been the most broken part of the whole deal, breaking something that would’ve otherwise worked, e.g. how some manufacturers insist on supplying replacement apps for managing Wi-Fi networks. Sometimes I could modify the drivers so that the third-party app didn’t install but the bare essentials did, and then the thing worked just fine; just as often, though, the third-party app was so ingrained in the drivers themselves that they would fail inexplicably without it, doing nothing, while running the app would hard-lock the system or BSOD it.
I don’t blame Windows/Microsoft for any of these problems, though. I place the blame wholly on manufacturers doing craptastic job, skipping the standard ways of accessing things and not caring in the slightest about the result as long as their logo was visible somewhere at all times.
Indeed, you get drivers from hardware vendors; Windows comes with nearly no drivers at all.
Oh yeah, remember the time before Windows had its own Bluetooth stack? That was AWESOME! Every computer and device vendor had their own shitty stack that wasn’t compatible with other vendors’, couldn’t co-exist with other stacks, and things would break randomly all the time. Wasn’t that long ago, either.
Yeah, tell us again how awesome the Windows driver situation always was and is.
My laptop is a 2011 model, and my Bluetooth driver forces its custom UI just like one of those. The Windows-supplied Bluetooth stack doesn’t include the necessary drivers for common BT device classes, so I either have to install this abomination or refrain from using BT at all.
I find the driver situation strange… I seem to have been living in a bubble over the last few years, which recently bit me in the butt.
Since 2008 I have been using Centrino-based laptops running Kubuntu exclusively, and I have not had a single driver-related issue in that time. It has always just worked.
However, I recently came to own an HP Pavilion dv6 (won in a raffle), which in all honesty is THE WORST laptop I have ever owned (another story). The driver situation is pretty much exactly what you describe. It has an AMD Radeon graphics chip (6xxxM) with a Ralink wireless card and Realtek onboard Ethernet. The Realtek seems to work well, but the Ralink has shocking Wi-Fi reception despite having “driver support”, though that doesn’t really matter when the state of AMD graphics support on Linux is so, so, so bad. When I used to be a gamer (10 years ago) the ATI card I had at the time was excellent… oh, how this experience has completely smashed any dreams built on my previous experience.
While the driver state from AMD is improving, it is not improving fast enough, and the damage has already been done to AMD’s reputation. I will NEVER buy an HP or anything with an AMD graphics chip. It’s absolutely shocking.
Au contraire: I have just installed Kubuntu 13.10 on my laptop with AMD Mobility Radeon graphics, and it worked out of the box with support for OpenGL 3.1. To enable the new dynamic power management support I had to add one boot-time parameter, “radeon.dpm=1”, to the Linux command-line options in GRUB. For some reason Ubuntu is missing a library file needed for UVD hardware video acceleration, so I had to install that library from a PPA.
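For anyone who wants to reproduce the boot-parameter step, it looks roughly like this. This is only a sketch: it assumes the stock Ubuntu default kernel command line (quiet splash) in /etc/default/grub, and the debugfs path below can vary between kernels.

sudo sed -i 's/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.dpm=1"/' /etc/default/grub
sudo update-grub      # regenerates /boot/grub/grub.cfg
sudo reboot
# after rebooting, check that dpm is actually in use:
sudo cat /sys/kernel/debug/dri/0/radeon_pm_info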
That is the total extent of the problems with Radeon graphics in the latest release, and they aren’t really problems in the sense that the graphics will work (less well, but still work) without taking these steps. The graphics performance isn’t up to full high-end gaming speed yet, but it is just fine for desktop usage; for that it works very well indeed.
Having said that, apparently earlier this week Painkiller: Hell and Damnation was released as the latest popular game title reaching Linux. Painkiller: Hell and Damnation is powered by the Unreal Engine 3 and its Linux porter has recommended an interesting choice of drivers.
http://www.phoronix.com/scan.php?page=news_item&px=MTQ5MTE
It seems to be a reasonably demanding game:
http://www.youtube.com/watch?v=IUcKpjbQI60
So there you have it, game developers apparently now recommend using the out-of-the-box open source AMD Mesa/Gallium3D driver instead of the proprietary driver.
But… that’s not how you do it. On a Windows system you have to pay real money for a new version, and user files are often intermingled on the same disk partition as the operating system, so updating the same version piecemeal, update by update, ad nauseam, across hundreds of update packages, is seen as the way to go.
That scenario just doesn’t apply to Linux. You may as well upgrade rather than keep updating an older version, because you don’t have to pay for your OS all over again.
So what you should do is:
– Download the version of your distro from 5 years ago, in the case of Ubuntu this would be 10.10,
– install it with the OS on one partition, the user home directories on another partition, and a third partition for swap, making sure all the drivers are working,
– Save the tail end of the /etc/passwd file for the details of all users (only necessary if you have a number of users; a command-line sketch of this step follows the list),
– Sometime later download the current version 13.10,
– make a bootable USB of version 13.10 (using the tools provided by the OS),
– boot version 13.10 from the USB (make sure that the newer version of drivers all work before you commit further),
– format the OS partition and install the upgrade version back into that partition, leaving the users home directories partition intact,
– reboot, then edit the /etc/passwd file and re-instate the tail end of the file to restore all of your users (only necessary if you have a number of users).
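If you do want to handle that passwd step from a terminal, here is a minimal sketch. It assumes the stock Ubuntu convention that ordinary local accounts have UIDs from 1000 upwards, and that /home is the separate partition mentioned above; adjust to taste.

# before wiping the OS partition: save the locally created account lines
awk -F: '$3 >= 1000 && $3 < 65534' /etc/passwd > ~/passwd.saved
awk -F: '$3 >= 1000 && $3 < 65534' /etc/group > ~/group.saved
# after installing the new release (with the /home partition left untouched),
# append the saved lines back, minus any account the installer already recreated:
sudo tee -a /etc/passwd < ~/passwd.saved > /dev/null
sudo tee -a /etc/group < ~/group.saved > /dev/null
# passwords live in /etc/shadow, so the simplest route is to reset them
# afterwards with "sudo passwd <username>" rather than copying shadow entries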
This sounds like a lot of effort, but it really isn’t, and it takes waaaaaaaaay less time than updating or upgrading Windows.
You gain the following benefits from following this process: (a) you save a great deal of time, (b) you get a full upgrade rather than just an update, (c) you don’t touch any user’s data, and (d) you get to test the new version before you commit to it, so there is very low risk.
I have followed this process of updating the desktop Linux OS, more or less, about fifty times across various systems. I followed it again just yesterday on my laptop for Kubuntu version 13.10, from which I am posting this very post.
I only ever had a problem on one occasion, and I simply decided not to commit to that problematic new version of the OS (it had KDE 4.0): I shut down, took out the USB and rebooted the older version still on the hard disk. I just skipped that ill-working version and waited for the next one six months later.
BTW, you can easily do all of the above without touching the command line:
http://www.kubuntu.org/getkubuntu/download
http://www.kde.org/applications/internet/kget/
http://en.wikipedia.org/wiki/KTorrent
https://apps.ubuntu.com/cat/applications/usb-creator-gtk/
http://en.wikipedia.org/wiki/Ubiquity_%28software%29
http://kate-editor.org/about-kate/
Perhaps not all of your userland applications come with the default install:
http://jontheechidna.wordpress.com/2010/07/05/introducing-qapt-and-…
Enjoy.
I’m sure that is all true… if your time is worthless. This seems to be a concept that the FOSS community can’t/won’t grasp: time has value to most folks, and it doesn’t take long at all for those “upgrade broke my drivers” scenarios to make Windows cheaper than Linux. In my case, if I have to spend just 1 hour a year fixing a driver? That Linux system cost me MORE than a Windows license.
Frankly I wish there was a way to force Linus and pals to work retail for just a month to see what their politics actually cost. The first time Linus saw all profits for the month dry up because his kernel fiddling broke major wireless drivers? I’m sure he’d change his tune.
Exactly what part of “it is way quicker to upgrade Linux than to update Windows” and “you get to test the new version before you commit to it” did you fail to understand?
http://en.wikipedia.org/wiki/Ubiquity_%28software%29
I quote: When reviewing Ubuntu 10.10, Ryan Paul from Ars Technica said “During my tests, I was able to perform a complete installation in less than 15 minutes.”
Here is a 51 second video of someone using the Muon Software Centre to search for a package, read a description and get a screenshot, have a quick look at user comments on it, download it, install it and test run it. Yes, I did say 51 seconds.
http://www.youtube.com/watch?v=gKIw8O5WEp0
To keep Linux updated and upgraded, we are talking maybe 30 minutes total time spent every six months.
One is lucky to get away with just 30 minutes wasted on Windows updates every second Tuesday; normally it takes a lot longer than that.
If your time is not worthless, then Windows is the OS to avoid, not Linux.
I come across this mantra every now and then, and I am puzzled. Why shouldn’t Joe Average have to know the CLI? The CLI is a productivity booster, and it is the only real reason to use Unix-like systems (Mac aside) over Windows.
If Joe Average goes for the CLI, he took the right direction with Linux (or BSD, for that matter). If he is in it for free software, !Windows or whatever other ideological reasons, he should go to Haiku instead. Or sit and wait for ReactOS.
CLI might only seem to be a productivity booster…
http://plan9.bell-labs.com/wiki/plan9/mouse_vs._keyboard/
About Kubuntu 13.10, it has been released and its highlights include…
– New app installer: Muon Discover
– New accounts setup: User Manager
– New Software from KDE: KDE Plasma and Applications 4.11
– Easier to get everything during install. Wireless Setup in Installer
– KDE Telepathy with Better Text Editing and Improved Notifications
– New Network Manager applet
– Easier to report what you’re exactly using with About System
– Documentation returns at http://docs.kubuntu.org
– Commercial Support is available
There are more details in http://www.kubuntu.org/news/kubuntu-13.10
Any Ringtail users upgrading in the next couple of days?
I liked Ringtail, but I like the latest Fedora KDE spin much more.
Even if I wanted to, I’d have to wait a while until Bumblebee is updated – not sure how long that takes.
I will probably upgrade this weekend.
I upgraded to the beta a couple of weeks ago. I’m not using Unity so I don’t know how things are going in there but besides that I’m happy with it.
Definitely not a lemon release, at least.
The only issue I’ve had is with WebEx no longer being able to share the desktop (the app hangs). It might be some Java-related problem; I’ll have to look into it.
If I am correct, this is the first Ubuntu version that includes the Gnome “Classic Mode”.
I would consider that the most noteworthy improvement. Gnome, after all, is still the most successful free desktop environment.
I don’t know, “was” might be the correct tense of the verb in this case.
I accidentally installed Ubuntu Server yesterday, unaware of the fact that they had just released it earlier that day.
The installation was easy and straightforward. In fact, I liked it better than Scientific Linux 6.4 (an RHEL clone), but I still use SL when security (SELinux) and stability are the top priority.
Ubuntu’s text-based installer gives more flexibility in choosing (groups of) packages. Installing daemons is also easier, as they are configured to start on boot automatically and are started immediately after installation (see the sketch below). Basically, it’s more geared toward ease of use, with more packages available in the repo, so it might be a better choice when you have a test or development system, or a server that is not exposed too much to the outside world. AppArmor delivers much the same as SELinux does, although the latter is more secure. Still, I have more trust in RHEL for exposed production systems. I remember GRUB troubles and Plymouth nightmares from earlier Ubuntu Server days.
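To illustrate the daemon point with a concrete (if simplified) example, take Apache; these are the usual commands for that era of Debian/Ubuntu versus RHEL/SL 6.x:

# Debian/Ubuntu: the package's postinst script starts the daemon and enables it at boot
sudo apt-get install apache2
# RHEL / Scientific Linux 6.x: installation leaves the service stopped and disabled
sudo yum install httpd
sudo service httpd start
sudo chkconfig httpd on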
I also have a Windows 2012 Server installation. Well… the differences between Ubuntu Server and RHEL are negligible in comparison. Let’s not talk about installation and the update system (that’s already bad enough for Windows Server), but in the table below I’ve already set win2012 low on RAM while the other two have plenty of free memory.
OS      | VM         | Memory (KiB)
win2012 | exchange   | 5242880
SL 6.4  | mailfilter | 1048576
SL 6.4  | www        | 2097152
Just a note that if you are intending to use Ubuntu Server in a production environment, I would stick with the 12.04 LTS release, which gets 5 years of updates (until 2017), whereas the new 13.10 release appears to only get a year of updates if http://www.ubuntu.com/server is to be believed (which is frankly useless for a server OS).
Nice to see that 64-bit desktop Ubuntu is finally the “recommended” bitness (8 years after I started using 64-bit Linux), although I wish the torrent download links got equal prominence with the direct download links…
The problem with using LTS releases such as 12.04 is that you will end up with packages from all over the place, e.g. (a sketch of how these typically get pulled in follows the list):
– ruby 2.0.0-p247 (LTS: 1.8.7-p352, Jul 2011) – self-compile based on https://github.com/postmodern/chruby/wiki/Ruby
– redis 2.6.16 (LTS: 2.2.12, Jul 2011) – ppa:rwky/redis
– haproxy 1.4.24 (LTS: 1.4.18, Sep 2011) – ppa:chris-lea/haproxy
– nodejs 0.10.20 (LTS: 0.6.12, Mar 2012) – ppa:chris-lea/node.js
– php 5.4.20 (LTS: 5.3.10, Feb 2012) – ppa:ondrej/php5
– apache 2.4.6 (LTS: 2.2.22, Jan 2012) – ppa:ondrej/apache2
– nginx 1.4.3 (LTS: 1.1.19, Apr 2012) – deb http://nginx.org/packages/ubuntu/ precise nginx
– postgresql 9.3.1 (LTS: 9.1.9, Apr 2013) – deb http://apt.postgresql.org/pub/repos/apt/ precise-pgdg main
– zfsonlinux 0.6.2 (LTS: none) – ppa:zfs-native/stable
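For what it’s worth, pulling a few of those in usually looks something like the following. This is just an example on a 12.04 box, using the PPAs and the nginx repository listed above (add-apt-repository comes from the python-software-properties package on 12.04, if memory serves):

sudo add-apt-repository ppa:ondrej/php5
sudo add-apt-repository ppa:chris-lea/node.js
# nginx comes from the vendor's own repository rather than a PPA:
wget -qO- http://nginx.org/keys/nginx_signing.key | sudo apt-key add -
echo "deb http://nginx.org/packages/ubuntu/ precise nginx" | sudo tee /etc/apt/sources.list.d/nginx.list
sudo apt-get update
sudo apt-get install php5 nodejs nginx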
How is that a problem, exactly?
PS: for Ruby, just use rbenv.
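For example, assuming rbenv and its ruby-build plugin are already installed as per the rbenv README:

rbenv install 2.0.0-p247    # builds the same Ruby the parent post self-compiles
rbenv global 2.0.0-p247
ruby -v                     # should now report ruby 2.0.0p247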
How do you accidentally install an operating system?
After almost 20 years, I have yet to notice any significant server stability differences between distros.
SELinux seems to exist for the sole purpose of people disabling it.
There’s really little to no practical difference between distros in this respect.
Uh, so you compare Windows running a heavy, full-featured service like Exchange to Linux running Apache/nginx/whatever and a simple mail filter. Yeah, OK…
A fairer comparison would have been with running something like Zimbra on Linux.
“setenforce permissive”: the only command you ever need when dealing with SELinux. I’ve yet to work at a single job, or with a single person, where anyone thought SELinux was worth the effort. Even AppArmor has proved to be a bit of a stretch.
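For anyone wondering, the whole “dealing with SELinux” ritual usually amounts to roughly this (RHEL-style paths assumed; a sketch, not an endorsement):

getenforce                      # Enforcing / Permissive / Disabled
sudo setenforce permissive      # takes effect immediately, lost on reboot
# to make it stick across reboots, flip the mode in the config file:
sudo sed -i 's/^SELINUX=enforcing/SELINUX=permissive/' /etc/selinux/config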
Consider switching to a Mac or whatever. Or install something without SELinux / AppArmor. Probably the environment you work in does not care that much about security anyway.
Ever heard of personal / fun / hobby use? It might be a hard concept for you.
I gave a few examples. YMMV. The grub -> grub2 update in Ubuntu took down my personal server.
So that’s your standard then?
Yes, there is. Not in your case, if you disable SELinux. Irresponsibility makes “practical differences” disappear.
Heavy, yes. Windows is heavy, mostly without good reason. Full-featured? Are you trolling?
Really?
http://www.zimbra.com/docs/os/6.0.8/administration_guide/2_Overview…
Sure but I still don’t accidentally install stuff.
Sorry, but at any given point in time most Linux distros are pretty much equal in terms of stability.
The only real reason to go with RH is that you want the support.
Actually knowing and understanding how to run a system. SELinux is not a magic bullet, and unless you configure policies yourself, most things you install on RH do not even come with one.
When I need a really secure system I just use OpenBSD.
Comparing Windows running Exchange with Linux running a mailfilter is an inaccurate comparison regardless.
No, trolling would be to think that Windows+Exchange is in some way comparable to Linux+webserver+mailfilter.
If you don’t think Exchange and Zimbra are full-featured it just means you’ve never worked with either.
Official installation images exist for two phones (Galaxy Nexus and Nexus 4) and two tablets (Nexus 7 2012 and Nexus 10).
Also Ubuntu Touch is known to work on dozens of other devices through community efforts:
https://wiki.ubuntu.com/Touch/Devices
Thom, as you apparently own a Nexus 7 2012 a report on how well Ubuntu runs on the device would be highly appreciated.
I was thinking of installing it on my Nexus 7 2012. I installed the previous version of Ubuntu on it, and it left a really bad taste in my mouth. The UI wasn’t very good: not only poor in design, but flaky in implementation. There were a lot of nonfunctional icons that were supposed to represent future functionality but did nothing when tapped, and it was dog slow compared to Android 4.2/4.3. I’m not confident that they’ve addressed those issues in the new update. I can’t imagine they’ve changed the UI (navigating from one home screen to another was a frustrating pain filled with failure).
Going by this review I wouldn’t: http://www.omgubuntu.co.uk/2013/10/4-reasons-why-you-shouldnt-insta…
That’s unless you have no actual use for your nexus 7 other than tinkering.
That review was Ubuntu on a Galaxy Nexus. The tablet experience *might* be better, as they’ve been working on it longer. But yeah, I don’t have much use for my Nexus 7 other than tinkering. The thing is, I’m not sure I have time to repeat the experiment of going to Ubuntu and then back. Plasma Active is going to hit version 1.0 soon, I hear. I think I’ll give that another try first.
Looks like someone beat you (and Thom) to installing Ubuntu Touch on their Nexus 7 and posted a small review. Generally it was positive, but a few showstoppers made him go back to Android until they are fixed.
http://mobile.slashdot.org/story/13/10/22/1344209/ubuntu-touch-on-a…
No thanks, I’ll stay with my trusty FreeBSD 9.1 setup. It rocks!
Ubuntu jumped the shark long ago. I don’t even consider it anymore. Luckily there’s Linux mint mate edition…..
Linux Mint Mate jumped the shark long ago. I don’t even consider it anymore. Luckily there’s Linux Mint Cinnamon edition…..
Linux Mint Cinnamon jumped the shark long ago. I don’t even consider it anymore. Luckily there’s Linux Mint Licorice edition…..
Linux Mint Licorice jumped the shark long ago. I don’t even consider it anymore. Luckily there’s apt-get install fvwm1…
apt-get sold out WHILE jumping the shark. Luckily there’s Windows 8.1 (everyone knows that Microsoft makes the best Windows, unlike that fake Brand-XWindows junk).
Real men use TWM.
Real men use screen. Exclusively.
Real men use screen. Exclusively.
Real men use virtual consoles with & and %.
I gave up on Ubuntu when they decided the future was mobile. I want an OS for our desktops. Sorry, Ubuntu.
Because there is. But they provide Ubuntu as well.
Once Ubuntu switches to Mir, I think many users will move to Mint Debian Edition.
Why not just use Debian itself? Remember that most users, if you mean typical desktop users (of whom very few use Linux), usually have their friends or a tech company load their OS anyway. More technical users would have no problem installing Debian themselves. Once Debian’s installed and running, I’ve yet to see a more solid Linux desktop, though that comes at the cost of not always having the latest bleeding-edge desktop environment. Still, it’s a nice compromise between extremely outdated yet stable (RHEL/CentOS) and bleeding-edge crash city (Ubuntu).
I bet Canonical is planning to make a new binary package system to replace deb.
Might not be a bad thing to replace deb, though I’d not wish them to invent a new package format. Deb reminds me of GNU Autotools; it does work but it’s massively over-engineered and needlessly complex for the task it’s designed to perform. Were I looking for a new packaging format I’d either adopt rpm or else adopt a simple archive format with metadata like *BSD or ArchLinux use. Both can be made equally simple for both users and developers. Deb is simple for the first group, but a hassle for the second.
Eh, it’s not really a hassle if you use fpm to create packages. Know your tools.
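A rough illustration of the fpm point, with a made-up package name and paths:

gem install fpm                      # fpm ships as a Ruby gem
fpm -s dir -t deb \
    -n myapp -v 1.0.0 \
    -d 'libc6 >= 2.15' \
    --prefix /opt/myapp \
    ./build/
# swap "-t deb" for "-t rpm" and the same invocation produces an RPM instead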
Thanks for making my point for me. If you need yet another tool to easily work with something that already has its own tools, it’s over-engineered, especially when you do sometimes have to get down into the nitty-gritty of the deb format even with tools like fpm.
So… since there are so many third-party installer tools for Windows, Windows MSI is obviously horribly over-engineered, and god, what a hassle this must be for developers.
You are comparing apples to oranges. All Microsoft tools are over-engineered from a Unix perspective. Microsoft takes a completely different approach from what have always been the basic principles of Unix, and similarity to a Windows counterpart is a very bad recommendation for a Unix tool.
It is a common trend for many recent Linux things to be overly complex and to have tools to deal with that complexity. The whole concept of a complex format and reliance on tooling is a major departure from the Unix spirit.
Being a BSD user and an adherent of the “suckless” ideology (or a zealot, if you like), I regard this trend as very wrong in the long run; complexity is almost always mutually exclusive with flexibility and hampers further development.