I wrote this (quite long) article for two primary reasons: one, to hopefully save someone else the time and hassle associated with trying out various Linux distributions, and two, to promote some discussion and feedback about what a modern Linux distribution should be, and to contrast that with what is currently available. I begin by exploring the offerings of MS Windows, BeOS and MacOSX, and then take on a number of well-known Linux distributions.
For a while now I’ve been an “OS Junkie”. I like to keep up to date on what’s happening in the OS world, and I spend way too much time researching and trying out alternative OS’s. I use a computer for everything from video editing and music production to accounting, web development, and graphic design, so my Dream OS has to be up to the challenge of handling most, if not all of these chores.
In short, I probably fall into the “Power User” category, and am constantly looking into how I can improve both my productivity and my user environment, either through software or OS changes. To that end, I’ve used quite a few different OS’s, on quite a few different platforms. My current desktop system is based around a 1.2GHz Athlon, 512 megs of DDR RAM, 2 video cards (an older ATI All-In-One-Pro, and a SiS 300 based card), an SB Live sound card, and about 200GB of drive space. I also have a large assortment of add-ons used with this particular box… Everything from scanners to drawing tablets, and even a digital postage scale that hooks into the serial port.
My goals are lofty: to find the ideal OS for my own personal usage, one which will allow me to work the way I want to, while not shorting me when it comes down to the tools and features that I need in order to “get the job done”, no matter what that job might be. And yes, it has to look good too! Eye candy might not be a concern for you, and there are many good arguments against it (with the number one concern being that eye candy often just equals wasted CPU cycles), but for me, I want to work in an environment that’s not only functional, but also one that looks good.
I’m not a huge gamer, but the ability to play the odd game would also be nice, as would the ability to handle all of my non-gaming activities, from Web development to Graphic design and video editing.
Before we get into the Linux distributions specifically, let’s look at some of the non-Linux alternatives. This will hopefully provide some background on what I don’t want, while giving us some hard data about what I do want. We’ll start with the biggest player, Windows, and then work our way down.
Windows:
Obviously the Windows family of OS’s is going to factor in here. There’s arguably been more work done on the Windows platform than on any other modern Operating System, and it shows! Windows owns the world, as far as consumer Operating Systems go, and it’s very well known, so without further ado, here are some things I do and don’t like about Windows (oh, and by the way, unless noted, most of the Windows references refer specifically to Windows XP. Many of the issues I will note are present in other Windows versions as well, but since XP is the latest Windows version, it’s the one we will focus on):
Windows Pluses:
- There’s a plethora of software available: Just about everything’s available for Windows. From music sequencers to CAD programs, there’s an app out there for Windows that will do the job. Similarly, there are drivers out there for virtually all hardware I’m ever likely to run on my PC. Software support is definitely not a problem when using Windows!
- Keyboard Shortcuts: Windows has a keyboard shortcut for virtually every function, and if it doesn’t have one, there are any number of add-on programs that will provide it.
Similarly, if you want to, or have to due to hardware issues, you can set up the more modern versions of Windows with an included utility which lets you substitute keyboard commands for mouse actions. This functionality is a huge bonus for Windows, as it allows one to flip between programs knowing that these shortcuts are common from one program to another.
- Customization: Sure, all OS’s allow you to customize their look and/or feel to a certain extent, but it’s the degree to which one can modify his or her environment that really interests me, and Windows is up near the top of this category both with its features, and with the ease with which you can customize them (although arguably, many of these features are either hidden away, or are accessible only with the aid of 3rd party software).
With the appropriate tools, I can change the look of Windows to virtually anything I can dream up, and that’s saying something!
- Common Look and Feel: This kind of ties in with the Keyboard shortcuts as there are a number of Windows elements that help to tie 3rd party applications together and make them operate and appear similarly to all users. Much of this can be attributed to the GUI of Windows, which provides a common, and somewhat skin-able/theme-able GUI for all apps to run under.
This allows one to jump from one Windows application to another without having to learn a new interface. This common interface is a big plus when it comes to being productive, and it’s one of the many strengths of the Windows Operating System.
- Quick and Responsive: Well it should be, as most companies tend to optimize everything for Windows! Windows, at least when just using one or two apps, is very snappy on my machine. Granted, when I have multiple apps open (which is virtually every time I’m on the computer, mind you), the system can bog down somewhat, but for the most part Windows is “the” desktop against which most people judge responsiveness.
Windows Minuses:
- It can be slow: As mentioned above, Windows can bog down when many apps are open simultaneously. True, a dual CPU machine would help immensely with this (XP is SMP-capable!), but we’re looking specifically at the operating system itself here. Multiple CPUs will also help to speed up my Linux box when and if they’re added (although Linux would of course require that I build an SMP-capable kernel). Ideally, I’d like an OS that allows me to switch my focus between apps without noticeably slowing down, or taking a while for the CPU’s focus to shift gears (as can happen when burning a CD-ROM and typing a letter, for instance).
- Built-in GUI capabilities are limiting: As mentioned above, I can customize virtually every aspect of Windows I want to, from the GUI itself, to how keyboard shortcuts are received and handled. However, much of this customization is handled only by 3rd party apps. Windows has come a long way as far as what it allows the user to do, but to get the most from a Windows system, you typically still have to use a 3rd party program. This often requires additional $$, and it often adds another application to those already running in the system tray. This in turn results in more CPU cycles and memory going towards a feature which I for one feel should be included with the OS itself. Virtual desktops under Windows are an excellent example of this: if you want virtual desktops similar to Linux and BeOS, you must run a 3rd party application to provide this functionality.
- Stability: Windows XP goes a long way towards making Windows a stable platform. It’s much more stable than Windows 98, and arguably more stable than NT ever was, but it is by no means perfect. I’ve gotten everything from the dreaded “Blue Screen of Death” to popups saying that Windows can no longer find a valid license on my machine thanks to the bugs in XP!
Don’t get me wrong, XP is a nice OS, and it’s fairly reliable. But for someone like myself who’s always “pushing the limits” of what a machine can do, XP is a long way from perfect. The ultimate example of the problems this can cause is how XP reconfigures itself without asking. Allow me to explain…
I do a fair amount of music composition and recording on my PC, and as such, I strongly feel that a PC should be seen and not heard. I don’t need my PC going “beep” in the middle of a recording session just to let me know that an update’s available (or for any other reason for that matter!). So I typically turn all the system sounds off on my PC and save that scheme as my default.
So you can imagine how upset I get when I boot up Windows and hear Windows’ default opening sound play. This is my cue that something has changed within Windows, and that Windows in its infinite wisdom has reconfigured itself again without asking me. This problem is a great example of why I want off the Windows platform: I want an OS that works great and one that remains configured as I set it. I don’t need or want an OS guessing at what’s best for me, and then making changes without my approval. A first-time computer buyer might appreciate such features, but I for one loathe them.
- Security: It’s Windows… It’s full of holes, and more are discovered every day. Enough said.
- Can’t optimize core system: By this I’m referring to the fact that Windows is a closed-source system. The users don’t have the access necessary to do such things as optimize the kernel for a particular platform. This isn’t a major problem, but when you consider all of the legacy applications that Windows supports, you naturally have to wonder how fast and responsive it could be if it was optimized and targeted at modern PC’s. Rather than have a kernel that runs on everything, it would be nice to strip out the things that don’t apply to my setup and optimize it for the hardware I do have. On the other hand, this is one of Linux’s strong points!
BeOS:
The BeOS system is still around in several forms, but it is no longer actively maintained (as much as I like them, hobby releases of BeOS, and promises of Zeta, do not have the same effect as a company collectively making all of the decisions relevant to building a cohesive OS). I used it for a while, and still do on occasion, but the lack of support for modern technologies has kept me from staying in BeOS for too long. BeOS still has some features I’d love to see in my dream OS though.
BeOS Pluses:
- It’s fast! Very fast!! In fact, BeOS is still the speed benchmark against which I judge other systems. I figure that if an unsupported OS on unsupported hardware can be this fast, there really isn’t an excuse for other GUI’s to crawl on this same system.
- Meta-Data: BeOS’s meta-data is still talked about in geek circles. I never used it to the extent that others have, but it was always nice to know that I could tag any file with any meta-data I wanted to, knowing that I could quickly query for the file later via BeOS’s built in search/query capabilities.
- The ability to quickly drill down to files: With the BeOS, you could simply right click any folder, whether it was on your desktop or in Tracker, and a popup menu would appear allowing you to both browse and drill down into the folder’s contents. No other OS has achieved this level of integration yet, and this is one of the things I miss most about the OS.
- Resolutions specific to virtual desktops: This is one of those features which I loved. As a Web Developer, I need to not only test my code on different machines and operating systems, but also at different resolutions on those machines. In Windows, this requires a trip into settings to change my display mode, and every time I do this, I end up resizing my desktop, and the icon placements on that desktop.
With most OS’s, it’s a similar process: Switch resolutions system-wide, and then deal with the ramifications.
With BeOS though, one can assign resolutions to specific desktops. So when clicking “Virtual Desktop #3”, for instance, I can work on an 800×600 screen, whereas “Virtual Desktop #2” might give me a 1280×960 resolution, and “Virtual Desktop #1” might provide me with a 1024×768 setup. It couldn’t be easier, and I really wish this feature were supported on more OS’s.
BeOS Minuses:
- Lack of Programs: There’s no company backing BeOS any longer. It has only recently received a JavaScript-capable browser, via the BeZilla project, and such items as JavaScript, Java, and CSS are all important to me as a Web Developer. These things are starting to trickle in, and there are always those who are trying to resurrect the platform, but an OS with no support and a dwindling number of users is not a viable choice for my dream OS.
- Lack of modern features: This kind of ties in with my previous point about web support. Quite simply, PC hardware is still moving forward at quite a fast pace. However the BeOS is stuck in 1997, give or take a year. There are 3rd party drivers and apps popping up here and there, but in general, as you upgrade your PC, you’re more and more likely to be adding a piece of equipment which the BeOS simply won’t recognize.
I am glad to see that some independent developers are adding new drivers, but this is by no means something that one can rely upon. I don’t really care if there’s no parent company behind an OS, but building my dream system around an outdated, unsupported OS would be kind of stupid on my part. I plan on this system growing with my PC as new hardware’s added and old components are upgraded. I’d hate to be in a situation in a year where I souped up my computing capabilities, only to find out that I can no longer use my OS due to the upgrade, and unfortunately that’s what would happen with the BeOS in its current incarnation. I really hope that some of the open source attempts at building a BeOS clone succeed, but even then, I have to wonder if such efforts will always be playing “catch up”, as far as hardware and drivers are concerned.
MacOSX:
I’ve played with OSX a bit, and I must say that it has a lot of impressive features. Unfortunately, the first thing that comes to mind when I think of OSX is “Mmmm… Eye Candy”, rather than “What an outstanding OS. I can see this growing and maturing with me and my development environment”.
I’m sure that a lot of this impression is due to me not owning or having 24-hour access to OSX; hence my knowledge of it comes more from reading, and the odd “doodling” while at my local Mac reseller’s. With that said, I would love to have an OSX box around, and based on my limited experiences, I feel that OSX would have the potential to become my main machine. However… I don’t have such access, and I certainly can’t afford one currently. Let’s delve into the pluses and minuses I’ve perceived, as they apply to my dream system.
OSX Pluses:
- Eye Candy: I’ve touched on this before, but eye candy is important to me. OSX has lots of it! Behind the superficial though is a robust, very stable Unix backend. I’ve not spent any time on dedicated Unix boxes, but I have spent considerable amounts of time on Linux systems, so I can appreciate many of the features that a Unix based OS would provide me with. In short, OSX looks and feels “cool”, it’s fun to use, and it really reminds me of how a computer can look and feel when it’s entirely produced according to one company’s plans and recommendations.
- Easy to use: Apple doesn’t appear to have “dumbed down” OSX to the extreme that Microsoft has its Windows XP release, but OSX is undeniably easy and intuitive to use. Its software uses common controls and looks to create a truly unified, “all-in-one” computer system that’s easy to use, while being capable of everything from audio and video editing, to such simple pleasures as browsing the web and checking email.
- Excellent multimedia capabilities. OSX is an excellent candidate for graphical work, as well as sound and video editing. Its low latency and high quality inputs show that it was made with multimedia uses in mind.
- Unix backend: Unlike previous versions of Apple operating systems, OSX is built on a Unix foundation (Darwin, which borrows much of its underpinnings from FreeBSD), and as such it’s as rock solid and powerful as you’d expect. Similar to Linux, I’m told it’s virtually impossible to kill the underlying OSX system. This reliability, in addition to the fact that it makes it much easier to use Linux and Unix apps under OSX, is a very big plus for this operating system. I like the fact that it’s both pretty and easy to use, in addition to being built on a rock-solid base.
OSX Minuses:
- Cost: Let’s face it, Apples are very expensive, and for the money they’re underpowered. I won’t get into the usual arguments about Apple’s quality, nor about how PPC systems can outperform similar x86 systems, but the fact is that you can purchase a very powerful x86 based PC for the same money that it would cost you to purchase a modest Apple machine. I can appreciate that quality costs money, but I can’t justify Apple’s prices no matter how “pretty” their machines look. In my opinion, you can build an equally capable, and sharp looking, Linux machine for a fraction of the cost of an Apple.
Additionally, unless I want to run my Windows apps under Virtual PC (which also costs $$), I have to purchase all new versions of my software in order to run them natively under OSX. When you add the price of the PC together with the cost associated with purchasing all new software, it gets very expensive very quickly.
- That damn mouse: Here’s an argument that’s been around for as long as the Mac itself has. It goes something like this:
x86 User: I can’t stand the fact that the Mac only has one mouse button. I need at least two to be really productive!
Mac User: You can do everything just as easily, if not more easily, on a Mac with one button as you can with two. Anyway, if you really need a mouse with more buttons, you can buy one and add it to your system.
x86 User: But that’s adding even more cost to an already expensive system. For the money I’d spend on a Mac, it should come “ready to use” out of the box!
And so on.
For my needs, I recently purchased a 7 button MS Intellimouse, and I’d find it very difficult to use anything else now. The ability to assign different tasks to the extra buttons, dependent upon the program you’re using, is a huge plus to using both Windows and Linux. Yes, you can probably get an Intellimouse up and running under OSX, but I agree that for the money you’re spending, having to immediately replace your expensive mouse is indeed a pain in the you-know-what! Even such simple things as surfing the web are made that much easier if you can just hit the side buttons of your mouse to move forward or backwards. Similarly, coding is much easier when you can use the mouse’s buttons to copy and paste. Suffice it to say that I need that Intellimouse to work with my dream system!
- Future’s unclear: Not to turn myself into a swami, but if you follow the news, Apple’s future is very much in flux right now. No, I’m not one of those people who think that Apple’s going to go away, but they’re definitely going to be changing something as far as their architecture goes.
While much of this is speculation, it’s fairly well known that Apple is about fed up with Motorola’s inability to produce faster chips in a timely manner. Just the other day Adobe released a report stating that an x86 PC is preferable over an Apple box for graphic design work simply because the raw computing power in an Apple is so far behind its x86 counterparts. This is pretty powerful stuff considering that Apple’s crown jewel has always been Photoshop, and its users are generally artistic-type people who swear by Adobe.
What remains to be seen is how Apple will deal with this setback. Rumors indicate that it can go one of 4 ways right now:
- Apple switches over to x86 chips.
- Apple switches over to AMD’s forthcoming 64bit x86 compatible chip (The infamous Opteron we’ve all been hearing about)
- Apple switches over to IBM’s forthcoming PPC chip, which is miles ahead of Motorola’s offerings (supposedly… these aren’t even in production yet).
- Or Apple chooses to stay with Motorola and tries to increase their market share through other methods besides raw computing power.
If anything other than number 4 happens, Apple will have to re-tool OSX for use with the new CPU, and this will more than likely mean that all existing OSX software will have to be re-coded to work with the new systems.
I don’t know about you, but I’d hate to spend $5000.00 on new Apple hardware and OSX software only to find out six months later that my system and software are now obsolete, and that if I want updates and such, I’ll have to upgrade to the new platform. This is unacceptable to me, and it’s really unfortunate that Apple is in such a position. They’re still trying to woo developers over to OSX, and just as they’re starting to make some headway, they must make a decision which could essentially send them back to square one in the game.
Linux:
So now that we’ve identified the 3 biggies in the OS world (Well, 2 biggies, and one “also ran”), let’s look at what this article’s supposed to be about: Linux!
To be fair, I’m going to do a quick plus/minus review of Linux also before we delve into the various Linux distributions. As much as I like it, Linux isn’t any more perfect than the other OS’s we’ve discussed (Well… Maybe it’s “more perfect” than some of these, but it’s not perfect itself!).
Linux is such a generic thing these days. When you say “Windows”, you naturally think “Microsoft”. When you think “OSX”, you think “Apple”. But when you think of Linux, what do you think of?
It could be Redhat, Suse, Mandrake, Gentoo, Sorcerer, Slackware, Free, Open Source, or just about any other common Linux company or term (how many of you thought of Linus Torvalds when you thought about Linux?). Linux is simply a kernel when you get right down to it. It’s the applications that run on top of that kernel, and how they’re configured, which make all of the difference! With that in mind, here are some of the pluses and minuses I see, as far as creating a proficient and enjoyable Linux experience:
Linux Pluses:
- Cost: One of the first things that comes to mind as a big plus for Linux is its cost. This is kind of a double-edged sword for those who want to make money, but Linux is for the most part a free operating system. Yes, you need money to build or buy the hardware on which you’ll run Linux, but the chances are that you already own hardware which will do the job, be it a PPC based machine, or an x86 one.
Let me put forth this friendly piece of advice though: Support whichever distribution you choose to use. Without support, the companies that are really pushing Linux to the desktop will disappear, and we really don’t want that, do we? Whether you pay for your distribution of choice is totally up to you, but I would recommend giving something back in some form, be it through programming, volunteer work, or simply by paying for your software and/or distribution. Many independent developers now also take donations via PayPal. Put simply, the economic model for Linux is really aimed at giving the consumer the benefit, since you can try and use most things for free. It’s only once you’ve evaluated and settled on a package or distribution that you should then consider what you’ve got, and what that’s worth to you.
Ok… My brief “Come on guys” speech is over. Back to the analysis!
- Flexibility: This is a big one when it comes to Linux! While it is true that Linux is known for its reliability as a server OS, Linux on the desktop is just starting to take off, and I expect to see a lot of action in this area over the next several years. By “flexible”, I’m referring to the fact that you can make Linux into virtually anything you want!
For example, it’s a given that Linux is an efficient, proven, and reliable platform for servers and server development. The desktop is a different field entirely though, in that you want the user’s interaction (the mouse movements and keyboard entries) to take precedence over any services and programs running in the background. If you had to wait for your quickly typed verbiage to appear in a word processor simply because Linux was busy FTPing something in the background, you’d probably quickly tire of it as a desktop OS.
But Linux is a beast of many faces, and as such you can “tune” it to be anything you want. If you’re using it for a server, you want those background processes to take precedence, as they’re serving out content (or whatever) to your users. A server’s highest priority should be to serve its users equally, and as fast as possible.
For a desktop machine though, we want it to only serve one person: You! As such, the kernel has to be tweaked and told to not dedicate so much attention to those background processes, and to pay more attention to things like video and input speed. You can do this with Linux. You can’t do it so easily with OSX, Windows, or many of the other desktop OS’s (You can use virtually any OS as a basic server, but the performance will not be anywhere near that of a dedicated server system).
If you’re “in the know”, you’ll be aware that Windows offers different versions of its OS for server usage. Similarly, OSX has a dedicated server version. And while server-specific versions of Linux are springing up here and there, the core systems are still more or less just like the desktop versions of Linux. Their kernels are just tweaked differently, and the apps that are included are usually different and pertinent to the task at hand. You probably don’t need an advanced network packet sniffer on a desktop OS (but you might… I certainly don’t!), but in the same sense, you don’t need a fast and flashy GUI for a server OS. A desktop system’s for using every day, while a server system’s generally meant to be set up, and then just left alone to serve out content as needed.
And it doesn’t end there! There are all kinds of patches and tweaks available to tune your kernel for real-time responsiveness, pre-emptive multi-tasking, and more. It’s truly “your system”, and as such it’s up to you to decide what “your system” is.
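You don’t even need to rebuild a kernel to get a feel for this kind of prioritization. Here’s a minimal sketch using standard commands, with made-up paths and PIDs, of keeping a background job from stepping on the desktop’s toes:

    # Kick off a big archive job at the lowest CPU priority so the
    # desktop stays responsive (the paths here are just examples):
    nice -n 19 tar czf /backup/home.tar.gz /home

    # Or lower the priority of something that's already running,
    # using its process ID (12345 is a placeholder):
    renice +19 -p 12345

Kernel-level tweaks like the pre-emption patches push the same idea further down into the kernel itself, deciding at a much finer grain when the foreground gets the CPU.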
- Support: Linux is widely supported, both by its users, and also by the companies that have chosen to market it. You can find dozens, and perhaps hundreds of books on Linux these days, and you can find even more websites dedicated to the usage and administration of Linux.
In fact when I install a new version of Linux these days, I usually opt to not install any of the included documentation as I can quickly “Google” for the details I need as I need them. Usenet’s another great source for Linux support and help. In short, there are more sources for Linux information than I’ll ever need, and it’s only growing by the day as more people make the switch.
- Lots of software: I have a love-hate relationship with the amount of software that’s available for Linux. You can go to freshmeat.com or sourceforge.com and find literally hundreds of programs, and that’s just the tip of the iceberg, so to speak.
My only problem with all of this software is that much of it’s a work in progress. This is largely due to Linux being an “up and coming” OS, and while I can often find a new and unique tool to do the task at hand, I also am often questioning why something looks or acts the way that it does.
I guess the best way to sum it up is this: With Linux you’ll spend more time finding the software that you’re looking for, but what you’ll end up with is likely to be “just what you need”, as opposed to being something that’ll get the job done, but leaves a lot to be desired as many Windows programs do.
The time spent evaluating all of the packages available can be fun, but it can also get rather tedious at times. It really depends on how you look at it, but to me, there are a lot of tools available that function at a professional level. The fact that you must wade through a number of “wannabe” apps to find the gem you need is just something that comes with the territory right now. As Linux matures, and coding practices and tools standardize, I expect this problem to decrease.
- Stable and fast: These two items are grouped together simply because the two don’t always go hand-in-hand when we’re referring to other operating systems. In this case they are a major plus when it comes to Linux.
Don’t get me wrong, you can crash individual applications and such, but seldom have I seen the entire system freeze up. And the speed is really amazing! The overhead requirements of Linux are very modest, and once you start increasing CPU speeds and memory, you can really build yourself a system that just flies! In fact, there are several companies looking at tweaking Linux for usage as a real time operating system (RTOS), so speed and stability are definitely not an issue!
Similarly, Linux disk access, both for serving and reading, is as fast as anything Redmond’s released. Suffice it to say that Linux is ready for competition!
- One word, “Wine”: Wine, which stands for “Wine Is Not an Emulator”, allows Linux users to run many Windows apps directly within the Linux GUI (not via an emulated computer a la VMWare!). Wine has come a long way since I first tried it several years ago, and nowadays it’s easily one of the biggest strengths in the Linux arena, as far as getting existing Windows users to convert.
I won’t go as far as to say that Wine can run any and all Windows apps, as it can’t. But for most of the applications one would run under Linux, it works acceptably well, and in some cases the programs actually run faster than they do under Windows!
I personally use the Codeweavers distribution of Wine, and I have to say that it’s one of the best investments I’ve ever made, from a software perspective at least. I certainly wouldn’t run all Windows applications under Linux; if all you want is Windows software, stick with Windows. But for those specialized applications that just aren’t available in Linux yet, Wine is a godsend!
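For those who have never seen it in action, using Wine is less exotic than it sounds. A rough sketch (the paths and program names below are made up; substitute your own):

    # Launch a Windows installer straight from a mounted CD:
    wine /mnt/cdrom/setup.exe

    # Once installed, programs can be started with a Windows-style path:
    wine "C:\Program Files\SomeApp\someapp.exe"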
- Linux finally looks good: This is not so much a strength of Linux per se, but it’s worth noting, as until recently Linux UI’s didn’t look that nice. We take such things as anti-aliased fonts and true color displays for granted in today’s Windows world, but until recently Linux users had to make do with regular, “bitmappish” looking fonts. Similarly, video performance was not always as snappy as it is these days.
But as the number of Linux users increases, and as more companies put their dollars behind Linux development, Linux is looking very nice indeed. KDE 3.1 is, in my opinion, going to be the UI that finally pushes Linux into the desktop arena, and to me, it offers everything Windows does, plus some!
Linux Minuses:
Like all things, there is a downside. Linux’s minuses aren’t too many from my point of view, but I’ll cover some here in order to show that I don’t consider Linux to be absolutely perfect (despite what some of the above might indicate!).
- Steep Learning Curve: Linux, like Unix, is inherently more difficult to learn than say Windows or OSX. Many people might argue that this is a good thing for one reason or another (“It’s more configurable” is a common reply to this point), but either way you go, if you’re coming from a Windows world to a Unix or Linux world, you’re going to have to basically toss out everything you’ve learned and start over.
Miss your C: and D: drives? Tough! Your C: drive is more commonly known as /dev/hda1 now, and D: might be /dev/hda2, or it could just as easily be /dev/hdb1.
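To give you a rough idea of how that maps out, here’s a simplified /etc/fstab from a hypothetical dual-disk machine (device names, filesystems and mount points will vary):

    /dev/hda1   /           ext3    defaults   1 1   # roughly where "C:" used to live
    /dev/hda2   /home       ext3    defaults   1 2   # second partition, same disk
    /dev/hdb1   /mnt/data   ext3    defaults   1 2   # first partition, second disk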
How about the registry? Well, if you liked the registry, then you’ll hate the hundreds of little configuration files laying around your Linux system waiting to be tweaked.
Did you hate the command prompt? Well, then you’ll hate the Linux console! But you’d better get used to it because as far as Linux’s GUI configuration tools have come, you’ll still end up in a console a lot of the time.
These points aren’t meant to indicate that one shouldn’t switch to Linux by any means. It just means that if you have a “Windows or bust” mentality, then you should probably stick with Windows. The command line interface that is the terminal, or “console window”, isn’t still around because old timers love it; it’s there because you can often be far more productive in a console than you can with a GUI interface.
To contrast this statement, I certainly wouldn’t say that a console based image viewer would be easier than a GUI one, but for more file-oriented and data-based work, a console can often do a lot more with a lot less effort than a GUI based system. If you walk into Linux with this fact in the back of your mind, you’ll probably pick it up a lot faster than someone who thinks that command line interfaces are outdated and purposefully difficult.
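As a small, hypothetical example of the kind of file-oriented task I mean: collecting every HTML file I’ve touched in the last week into one folder is a single line in a console, versus a lot of clicking and dragging in a file manager (the paths are made up):

    # Copy every .html file modified in the last 7 days into ./to_upload/
    find ~/websites -name '*.html' -mtime -7 -exec cp {} ./to_upload/ \;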
- You define your world: No, this isn’t some Zen saying… What I mean by this is that Linux doesn’t dumb everything down the way Microsoft does. It assumes that you, the user, know what’s best for you, and Linux doesn’t try and guess what you want as Windows tends to do.
This puts a little more pressure on you, the user, as you have to be aware of how you want things to function, and you have to learn how to make that happen. The biggest difference between Windows and Linux (and I’m focusing on Windows here as that’s where most people considering a switch are coming from) is that there’s not a GUI for everything.
This is changing rapidly, but for the foreseeable future, you will have to delve into text-based configuration files on occasion. It’s not a big deal, and it’s really a lot easier than it sounds, but be prepared to have some patience as you learn to use the console, and learn what files are used for specific purposes. It gets a lot easier once you realize that when you are using a GUI to configure Linux, it’s just changing these text-based files for you.
- Vendors will not talk Linux with you: Ok… So some will, but generally speaking, unless the company in question specifically states “We support Linux!”, you’re not going to get much help from them. In fact you’re more likely to have them refuse to help you as soon as you tell them you’re using Linux and not Windows!
I’ll give you an example:
I have a DSL connection to the Internet, and I share that connection with the rest of the PC’s in my house via a LAN I’ve set up. Whenever I’ve had to call my DSL provider for details on their network (DNS addresses, news server addresses, etc.), I quickly found that if I mentioned anything other than Windows, they’d stop being helpful, and immediately switch to their “I’m sorry, but we don’t support that” mode.
From a support standpoint, I understand this. It’s hard enough to train everyone how to troubleshoot Windows questions, but once you throw Linux in the mix, terms and programs are no longer the same. You need new concepts and new verbiage to describe the same problem from a Linux perspective, and quite frankly, most companies aren’t taking that extra step right now. They will… Linux is young, and as more people start using it, these companies will be forced to support it, but right now, Linux is not widely supported by most hardware manufacturers.
From a user’s point of view, my opinion is “It doesn’t matter what OS I’m using; just answer my questions!”. Linux uses the same information that Windows does in most cases; it’s just entered and handled differently. For example, in Windows your DNS entries go into a nice little GUI-based entry form under “DNS Server addresses”. Under Linux however, such entries are made in the “resolv.conf” file, which is located in the “/etc/” folder. My point is that they both use a DNS IP address in these areas, you just don’t enter this data the same way you would in Windows.
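For the curious, the whole file usually amounts to a couple of lines. The addresses below are placeholders; you’d use whatever your ISP hands out:

    # /etc/resolv.conf -- the Linux equivalent of Windows' DNS dialog
    nameserver 192.0.2.1
    nameserver 192.0.2.2
    search example.com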
My advice is this: When speaking with Vendors and/or troubleshooting items with service reps, remember to talk in Windows terms when talking to “Windows people”.
- Programs still don’t share data equally: By this I’m referring to the fact that Linux apps are still maturing, and as such, they don’t always play nicely together. A great example of this is copying and pasting text between apps. Sometimes you can (more often than not these days!), but every now and then you’ll run across an app that just won’t “talk” to the other apps on your system.
In fact I’ve run into this recently with Adobe’s Linux version of their Acrobat Reader program. Like its Windows counterpart, you can select and copy text out of PDF’s with the viewer, but if you then try to paste that verbiage into, say, a KDE-based editor, the copied verbiage will not appear.
This problem is getting better as the UI developers begin to work towards a set of common standards, but the fact that you still will run across applications that don’t want to talk to one another is evidence of the fact that Linux is still a maturing OS. It’s not yet quite as refined as the Windows line of products in some ways, and it can be frustrating when you come across an example of this fact. As I pointed out earlier, this problem is decreasing rapidly, but it’s still likely to be a problem, albeit a small one, for the next couple of years.
Goals: My goal when I set out was to end up with the best Linux distribution I could find, and then customize it to complement my personal work style and habits. Once done, I should end up with a highly usable system that looks cool, and is rock-solid from a stability point of view.
As such I began reviewing Linux distributions about 3-4 months ago to come up with this “Windows killer” system. For me this meant that I wanted the most modern versions of programs I could get, while maintaining compatibility with my existing Linux apps, and of course I required that KDE 3.1 (or the newer 3.1.1) be present as my GUI interface.
I also have a short list of “must have” items that I wanted to make sure were either included with the distribution or were easily added on at my discretion. Among the list of items that were on this “must have list” are:
- Ximian’s Evolution email client
- Mozilla or Phoenix (both are web browsers)
- Wine capable
- An XFree86 system capable of driving my dual monitor displays
- Anti-aliased fonts
It’s worth noting at this point that none of the Linux distributions I tried were successful at setting up my dual-headed system. It seems that every installer naturally assumes you only have one display (even though XFree86 has been dual-head capable for quite some time now!), and thus only configures one card (the card it configures is apparently based on which card your BIOS is set to boot from). In most cases, this resulted in me booting into a one-monitor display the first time, and then manually configuring my XF86Config-4 file in order to get both displays working. I quickly found that it pays to keep a working copy of this file around which can then be copied over and modified as needed, rather than starting from scratch with each distribution.
This isn’t to scare you away from having a dual headed system, but rather to let you know that if you do run such a system, you’d better be prepared to edit some text! Hopefully future Linux distros will have the ability to correctly sense and set up a dual headed display in much the same manner Windows currently handles it. For now though, it’s more of a manual process to set this up.
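To give you an idea of what that hand-editing involves, here’s the interesting part of a dual-head XF86Config-4. The identifiers, drivers, and BusIDs are just examples for a setup like mine (lspci will tell you your own BusIDs), and each screen still needs its usual Monitor and Screen sections:

    Section "ServerLayout"
        Identifier  "DualHead"
        Screen  0   "ATI Screen"
        Screen  1   "SiS Screen"  RightOf  "ATI Screen"
    EndSection

    Section "Device"
        Identifier  "ATI Card"
        Driver      "ati"
        BusID       "PCI:1:0:0"    # check lspci for the real value
    EndSection

    Section "Device"
        Identifier  "SiS Card"
        Driver      "sis"
        BusID       "PCI:0:9:0"    # check lspci for the real value
    EndSection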
Ok, with that out of the way, let’s start with the first distribution that was considered for use as my desktop OS, Yoper.
Yoper:
Yoper is a relatively new Linux distribution coming out of New Zealand. It’s one of a number of new distributions that seem to build off a Slackware base, and then optimize the system both for newer processors (Yoper’s optimized for i686 CPUs), and for KDE 3.1 as its only interface. Yoper has only recently hit version 1.0 of its software, and as such, all of my work and my impression of Yoper are based off several release candidates (RC’s) that they released.
Yoper aims to have a small footprint (the amount of space necessary for the installation of the OS and all of its necessary files), to consist of highly stable, integrated apps, and to be fast. And for the most part Yoper succeeds in all areas!
Yoper looks great, with smooth anti-aliased fonts, it’s fast, and thanks to its Slackware heritage, it’s fairly intuitive to navigate and find the items you’re looking for. Their target market is businesses that want to migrate off competitors’ products (i.e., Windows), and they apparently have a rather successful business converting companies over to Yoper-based systems in their homeland of New Zealand.
The only real problems I experienced with Yoper were with its implementation of the XFree86 display drivers. The aforementioned XF86Config-4 file had to be changed and tweaked for each release, and it never was as stable or easy to get running as any of the other Linux distributions I’ve tried. With the exception of RC4 however, once I had managed to get the displays working, Yoper performed admirably. RC4, the last release before their official release, was unstable no matter what I did. It looked and performed great, but on occasion it would just crash hard, dropping the user out of the XFree86 environment and leaving them with a console prompt.
This was where my real problems with Yoper began. Without getting into all the details (this is a comparison, not a complaint summary), it quickly became apparent that Yoper was only in the game for Yoper’s benefit. They did manage to build an impressive distribution (RC4 being the exception), but their support left a lot to be desired. They’ve deleted their knowledge base and message forums no less than 3 times in the last couple of months, and when I did run into a problem, they refused to respond or answer my questions (and with the message forums deleted, I didn’t have the option of turning to those as a resource). Similar complaints began to pop up both on their forums and on other sites, to which the company responded with insults and diatribes, referring to anyone who voiced an opinion as a “Slashdotter” and refusing to acknowledge anyone’s issues.
Perhaps this situation has changed, but just recently they posted a long message on their site claiming that anyone who complained about the quality of the product or the treatment they received was simply a Slashdotter (among other things), and that they didn’t want to service us “geeks”. Instead they claimed that they were very successful in the New Zealand market, and that they didn’t need or want users like us.
Easy enough… I deleted the Yoper releases, pocketed the money I’d set aside to purchase Yoper with, and moved on. At this point, I hope they’re out of business soon. They’re a very misleading company, and rather than providing a service, or acknowledging any problems, they insult their users, appropriate any hints or workarounds their users have offered, and then delete any references to problems rather than deal with the issues. Enough said.
Summary: Stay away from Yoper. There are many other companies with equal or better releases who want to work with their users to develop a better product rather than just take their money and run.
Additionally, Yoper is extremely high priced when compared to other Linux distributions. They’ll charge you just under $100 to get a Yoper install CD, and they’ve already said that additional money will be needed if and when you upgrade to a newer version. Evidently that $100.00 is only good for the initial purchase plus a year of support. I myself don’t see anyone spending $100.00 a year just to ensure that they can get software and support from an arrogant company who has already stated that they don’t need my business.
Redhat:
Redhat’s latest offering, which was just released, is very impressive! My review is based off the release candidate, Phoebe, which is supposedly very similar to the final 9.0 release version.
In addition to getting all of the experience and support that comes with the world’s most successful Linux distributor, you also end up with a very fast, well-built Linux system. Phoebe looks great, and KDE 3.1 simply smokes on it. It’s a very fast system! Some of this speed might be due to Redhat’s new threading implementation (due to be included with the forthcoming 2.6 kernel, thanks to Linus Torvalds’ appreciation for the technology!), which impacts how programs talk to the kernel. Either way, Redhat’s performance is top notch, and certainly among the best I’ve seen, particularly when you consider that Redhat is i386 compatible and not highly optimized for the Athlon CPU that is running my system.
Phoebe also offers one of the better-looking desktops thanks to their beautifully anti-aliased fonts, and the aforementioned KDE 3.1 UI. I played around with Gnome a bit, but it was too slow for my tastes. In addition, I couldn’t tweak out Gnome as easily as I can KDE. I expect some of this will be fixed with the final release, but for my needs, it doesn’t matter; I’m a KDE fan myself.
If the software you want isn’t included with Phoebe, there’s a ton of sites on the net that offer Redhat 8-specific packages. There’s also a lot of Redhat specific support available online in order to help you through any questions you might have.
I really only had a few negative impressions of Phoebe, one of those being its implementation of KDE. Redhat is notorious in some circles for “crippling” KDE in order to make it work better with Redhat’s favorite UI, Gnome. Although some people will tell you Redhat’s made it very hard to work with 3rd party KDE apps due to how they’ve changed KDE’s structures, for the most part, you won’t notice these changes during your day-to-day work.
They’ve done things like moving the KDE menus around, and installing KDE in a non-standard way, rather than using KDE’s defaults, and they’re totally in the right by doing so due to the way that KDE’s distributed (You can really do about anything you want with it). It does however make it harder to install certain applications since they’ll try to add their shortcuts and such to the default KDE locations. You’ll then have to manually move such items to the correct areas on your hard drive if you want them to work with your Redhat setup.
The other problem with Redhat isn’t really a Redhat specific problem, but rather one which is plaguing many of the “cutting edge” Linux distros: they’re including the latest glibc libraries (v2.3.2), and many existing apps have not been updated yet to work with this version. On top of that, Phoebe also sports an entirely new method of handling program threads, and while this new method does offer some dramatic speed increases for certain operations, like the updated glibc libraries, it also breaks compatibility with a number of existing applications.
Hence important utilities like Codeweavers’ Wine don’t run under Phoebe yet. Updates are in the works both for the aforementioned Codeweavers product and for many other apps which aren’t currently compatible with this new release, but I want a system that works today. If I wait, it could be anywhere from a couple of weeks to several months before most applications are patched, which isn’t too encouraging.
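If you’re wondering whether a given distribution has crossed this line, checking the installed glibc is quick. On an RPM-based system like Redhat or Mandrake, either of these will tell you (a sketch; the output will obviously vary):

    # Ask the package database which glibc is installed:
    rpm -q glibc

    # Or ask the library itself -- running it prints its own version:
    /lib/libc.so.6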
Redhat includes a ton of software with their distros, and Phoebe is no exception. It’s packed full of software to help you with everything from surfing to coding, and I like the fact that I don’t have to go hunting for what I need immediately upon getting my OS installed. On the other hand, this is specifically why Redhat’s known for adding bloat to their distributions! While I appreciate so many tools being available, I wish they weren’t all installed by default. I like a lean system, and knowing that space and CPU cycles aren’t being wasted on unnecessary services and programs. While Phoebe is extremely fast, even with all of these additional software packages and services installed, I would prefer knowing specifically what’s installed and why, without being forced to manually choose every single component, as this is quite time consuming.
Summary: Redhat is a worthy purchase for any Linux “newbie”, and arguably a good candidate for most people who use Linux. The minuses for me are the bloat, the modified KDE installation, and the fact that I cannot run Codeweavers under it yet.
I can’t stress enough how easy this distribution was to set up and get running, but again, I’m not a newbie any longer (at least in my opinion. Others might argue), and Redhat’s handholding and “all-in-one” installation methods remind me just a bit of Microsoft’s ease-of-use efforts. It’s nice, and it’s powerful, but for my purposes, I want something a little more compatible with 3rd party apps, and I’d also like a system I can currently run Codeweavers under. KDE 3.1 smokes under Phoebe, but this is also the case with many Linux distributions that are currently shipping it.
Additionally, the new threading model that Redhat sports is a nice perk, but it soon won’t be Redhat specific, so this is only a temporary plus. The upcoming 2.6 kernel will also contain the updated threading methods, so soon all distributions will benefit from it. While this is nice, it’s not a “must have” reason to go with Redhat for my Linux system.
The software and support that’s available for Redhat is great, and I wouldn’t hesitate to recommend this for anyone just starting out on Linux, or anyone who just wants a system to work, and who’s not that interested in tailoring it too much.
Mandrake:
Ok, I realize that as of this writing, Mandrake just released the final version of 9.1, but I haven’t been able to snag a copy off of the already over-congested mirrors. Hence my review’s based off of version 9.1 RC2, which I understand is virtually the same as the final release, with a few bug fixes.
Historically, Mandrake has always been my “backup Linux”, meaning that I keep a partition on my drive loaded with Mandrake in case something goes wrong on one of the other partitions (which include a couple of Windows installs, in addition to some other OS’s). Mandrake’s always been reliable, fast, and very well rounded, as far as the packages included with it go. Mandrake also drives my server/firewall setup, and has done so reliably for the last couple of years!
So it was a little disappointing when, after installing Mandrake 9.1 RC2, I realized that it ran noticeably slower than Redhat’s latest offering. At this point, I had done a basic install of Mandrake, and so I had a lot of extra programs and software installed that I didn’t necessarily need, similar to my Redhat experiences. After playing around with some of Mandrake’s settings, I decided that perhaps these additional “pre-selected” applications were slowing my system down, so I decided to try a minimal install.
What I did was tell Mandrake that I wanted to custom pick all of my packages, and I then un-selected every choice in the install dialogue. I then chose just what I knew I wanted, and let Mandrake choose additional packages based on the dependencies in the packages that I chose.
This resulted in a much leaner installation, and one that runs notably faster than the older “all-in-one” install. My assumption is that the additional programs and services being installed were what dragged my system’s performance down so much.
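Even after the fact, it’s easy to see (and trim) what a distribution has decided to start for you. On Redhat and Mandrake style systems, something along these lines does the trick (httpd is just an example service; check what each one does before turning it off):

    # List everything set to start automatically:
    chkconfig --list | grep ':on'

    # Keep a service I don't need on a desktop from starting at boot,
    # and stop the running copy:
    chkconfig httpd off
    service httpd stop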
So how does Mandrake run now that I have it installed? Pretty damn nice! It’s noticeably quicker than Mandrake 9, and KDE 3.1 runs as fast as Redhat’s KDE 3.1 does. In fact, for the most part Mandrake runs and acts virtually the same as Redhat. There really isn’t much of a difference between the two from an operational point-of-view.
But Mandrake 9.1 also includes the new glibc libraries. In theory, this should mean that my Wine apps won’t run under it, as with Phoebe, but I found that this wasn’t entirely true! Certain Wine apps will run under Mandrake, and certain ones won’t. It’s very hit or miss, and quite frankly, I’ve no idea why it isn’t treated the same as my Redhat installation if indeed the glibc libraries are the cause of this problem (and I’m quite certain, at least in the case of Redhat, that glibc is the problem!).
I haven’t had a need to try Mandrake’s more esoteric features, such as NTFS partition resizing. I appreciate that it’s there, but I think I’ll still probably feel safer rebooting into Windows and Partition Magic if and when I need to play with my hard drive’s partitions. Not that I like to reboot, but since Linux in general doesn’t officially support writing to an NTFS partition, I’m a little leery about possibly trashing my drives by resizing them with it.
Another area where Mandrake reminds me of Redhat is how they’ve also messed with their KDE installation. Similar to Redhat, they’ve installed it and its menus in non-standard locations. Hence 3rd party applications may or may not install correctly. Since I want to build my workplace around a KDE 3.1 center, it’s very important to me that I can expect any KDE add-ons to install and be usable with a minimum of help from me. It looks like Mandrake, like Redhat, will require me to manually move files around. Not cool.
One more area in which Mandrake rivals Redhat is the support and software being made available for it. As with Redhat, there are a ton of sites and programs that deal specifically with Mandrake. One that comes to mind is the PLF, which provides software essential for the desktop experience. My concern is more about non-specific software that I might compile from source.
Here’s a brief example of why: Mosfet is a fairly well known KDE developer, and has provided some fun bits of software for the community in the past. He’s also one of the more vocal opponents of Linux distributors who hack KDE up, and he’s written a fairly well rounded piece on his site explaining what they’ve done, and how this affects developers. While I’m not saying his word is the final word on the matter, I have tried to get some 3rd party KDE apps to work with both Mandrake and Redhat, and often Mosfet’s right in that I end up manually moving files and links around in order to get the application to work correctly.
Just because I know how to work around the problems doesn’t mean I necessarily want to though. I’d rather spend my time developing, or enhancing my system than I would tweaking new installs to work with my broken KDE system. In the past, this wasn’t so much a problem, but as I move forward and try to come up with “my desktop”, I don’t want to deal with this problem that much. This is a strike against Mandrake!
But my minimal install of Mandrake was enough for me to replace Phoebe with it! I was really impressed with Redhat’s speed, which again, I largely attribute to their new threading technologies, but my familiarity and fondness for Mandrake did, in the long run, win me over (but I’m still considering playing around some more with Redhat once I can get my hands on the final version!). And while I can appreciate the solidness of Mandrake 9.1 as a “backup system”, I really wanted something that’s more standards compliant for my main Linux desktop.
For now Mandrake’s back where it always has been on my system: A good back up system with a lot of tools in case I have problems elsewhere. As my primary desktop system, it falls in with Redhat’s Phoebe. It runs nice, it looks great, and as long as I don’t want to go outside the box they’ve built for me, it works great.
Mandrake as a company does leave a bit to be desired. It seems like they’re constantly releasing press releases indicating that if people don’t support them, they’ll be forced to go out of business. I hope this changes with the release of 9.1, but it is a concern right now. No one can say for certain that they’ll be here in a year to support their product, but I for one hope they are. Whereas Redhat’s just testing the waters of cutting edge/desktop distributions, Mandrake’s been doing this for some time, and it would be nice if they could keep doing it for a while. Only time will tell though if 9.1 will be the release to push them away from the brink of bankruptcy. Let’s hope it is.
Summary: Mandrake’s latest offering is a nice advancement over their previous one. As with Redhat’s Phoebe release, KDE 3.1 is what really makes the system for me, but like Redhat, Mandrake’s implementation of KDE leaves me wanting as far as trying out new apps and compiling 3rd party add-ons goes. Mandrake’s reliable enough, and friendly enough, to be a great backup or rescue system, but for my new desktop system, I’m going to continue my quest. I want a Linux desktop that both works out of the box, and plays nice with other people’s software. Mandrake and Redhat both excel at the first piece (working great out of the box), but they both have issues that could impact me down the road as far as compiling and using 3rd party applications.
Ark Linux:
Ark Linux is a project that’s been started by an ex-Redhat bigwig who goes by the name of Bernhard “Bero” Rosenkraenzer. While I admittedly don’t know a lot about Mr. Rosenkraenzer, I can appreciate his goal of making Linux easier for the masses, and so I decided to see what new ideas he was bringing to Linux with his Ark distribution. After downloading and burning the Ark Linux image, I booted up to find myself presented with a rather intimidating menu.
This menu gave me three options: Option 1 would allow Ark to basically “take over” my hard drive, destroying all data and formatting it for use with Ark Linux.
Option 2 indicated that Ark would locate and use all non-used space on my hard drive.
And Option 3 indicated that Ark would be installed parallel to an existing Windows partition.
Now let me say up front that this menu scared me! In the PC I’m using for my desktop, I have 3 hard drives with about 200GB between them, and one CD burner. Data files take up much of my hard drive space, and the remainder is taken up by a couple of Windows partitions (Win98 & XP), a couple of Linux partitions, and a BeOS partition. To be blunt, I didn’t want Ark messing with the wrong partitions as I literally had a lot to lose if it did!
So the lack of choice with these 3 menu items really made me nervous. I certainly don’t want Linux deleting my drives to make way for itself, and I didn’t have any unused/un-partitioned space for the second option. For that matter, the fact that I couldn’t choose which drive to use for the first two options was really annoying. I probably wouldn’t have been so nervous if I’d at least known which drive Ark was going to try and install itself to. It seemed like the 3rd option (to install parallel to existing Windows partitions) was what I wanted, but whenever I’d select this option, Ark’s install process would immediately lock up.
I chalked this rather un-friendly installer up to the fact that Ark’s only in alpha stage at this point. After investigating the problem online and doing some experimenting, I found that if I deleted the partition that I wanted Linux to be installed on and chose option 2, Ark would find and use the un-partitioned space no matter which drive it was on. So finally I was off and installing Ark.
I quickly found out that Ark Linux’s installer doesn’t ask you anything after that initial 3-choice menu! I was expecting that, as with most desktop-friendly distributions, the install process would ask for and set up both my root password and a user account, but Ark never did. I assumed that this meant that, as with some of the more geek-oriented distributions, I’d boot into Linux as root without a password and would then have to set up the root password and my regular user manually, so I was a bit surprised when I booted into X-Windows as a user called, of all things, “arklinux”.
I also soon found out that in the Ark Linux world, there really is no root password! Ark’s take is a unique one, but also a scary one for anyone who’s used Linux in the past. They presume that users don’t want to have to worry about different users, or having to keep track of passwords, and so they do away with both!
Specifically, this means that Ark sets up each machine as if it were a one-user machine, and they remove all need for su passwords and such. If you want to do a task as root, you can just open up a terminal, type “su”, and voila, you’re suddenly root! This is accomplished by a questionable user-rights management system in which you can define whether a user has su privileges or not, and by default those who do don’t have to authenticate themselves.
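Ark doesn’t document exactly how it wires this up, but for illustration, the usual way a distribution grants one trusted group passwordless su is through PAM’s pam_wheel module; here’s a minimal sketch (the group and user names are only examples, not necessarily what Ark actually uses):
  # Sketch only: one common way to allow passwordless `su` for a trusted group.
  # Put the default user into the trusted group:
  gpasswd -a arklinux wheel
  # Then, in /etc/pam.d/su, a line like the following lets members of "wheel"
  # become root without typing a password:
  #   auth  sufficient  pam_wheel.so trust use_uid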
This is the complete opposite of any other Linux distribution! While I can appreciate what they’re trying to do (remove the complexity of a *nix system vs. a Windows system), I disagree with their methods. They have a blurb up on their website which indicates that to their knowledge, this doesn’t constitute a breach of security, largely because Ark’s not meant to be a server OS, and thus most security issues about not having a root password are irrelevant.
If you don’t like the convenience of having users set up this way (or if you have any common sense about PC security, and/or any experience with other Linux distros), you do have the option of setting up Ark with normal security and passwords. This does, in essence, break a lot of what sets Ark apart from other Linux distributions, but it is an option if you really want to use Ark and still have a somewhat secure way of controlling who your users are and what they’re allowed to do. Why you’d set up Ark this way instead of simply going with a different distribution is beyond me, though! If you want security and functionality, simply choose a different distribution. Why waste all that time trying to use Ark in a manner different from the one it was designed for?
To be fair, Ark also includes a GUI configuration tool if you want to set up new users with the same privileges as those granted to the default “arklinux” user. So if one non-secured user’s too limiting for you, you can have entire groups of them! For my purposes, however, even though I can see the convenience of not having to remember passwords, or even that there is such a thing as a root/su user in the Unix/Linux world, a password-secured system is preferable.
Ark also sports some very cool-looking screens for configuring your system. They’re not so much custom tools as they are custom GUI environments for existing KDE configuration components, but they do set Ark apart from most other distributions, and it shows that they are moving towards a simpler, more Windows-like system than most Linux distributions are.
The other features of Ark are more or less standard for a desktop oriented distribution. You get a fairly lean install without all the bloat of say a Redhat system, and you get a rather basic KDE desktop on which to build your “dream system”.
Ark also lost points with its add-on CD, which lets you install a lot of popular software that isn’t included with the base distribution itself (such niceties as a media player and so on). Rather than using a package manager or an apt-get style method of installing these packages, as you would with most other Linux distributions, you have to pick what software you want to install from a menu on the aforementioned add-on CD. It works. That’s about all I can say about it though. Anyone who’s ever installed software on a Linux system before will probably hate blindly installing stuff the way Ark Linux does. Perhaps the newbies and the desktop crowd that Ark’s aiming at want such simplicity, but to me it was actually a step or two beyond the handholding that Microsoft is so infamous for. If Ark really wants to make Linux this simple for the end user, they should at least offer an option for more experienced Linux users that doesn’t dumb things down so much.
Another item Ark has to improve on is documentation. There simply isn’t any, and their website’s a joke. Both will likely be upgraded as Ark matures beyond its current “alpha” stage, but I’d focus on both of those before worrying about dumbing down more Linux features. As it is, Ark just makes me nervous. It doesn’t act like other Linux distributions (which could be a good or a bad thing, depending upon your point of view), and it doesn’t tell you what it’s going to do before it does it.
For someone with a brand new one-drive system, this might be ok, but for someone with multiple partitions, multiple operating systems, and a ton of data to potentially lose, Ark is just too simplified for me to feel confident using it (I didn’t even mention how nice it was for Ark to overwrite my boot manager without asking, prompting me to have to fix this before I could see my other OS’s again!).
On the plus side, Ark looked nice once I got it set up, and it was fairly fast (not Redhat 8 fast, but very usable). Also, it comes optimized for i586 & i686 PCs (how it’s optimized for both is unexplained, though), which is always a nice thing to know. However, it doesn’t appear to offer me any benefits over other distributions, unless you consider the dumbing down of security and package installs to be features, which I do not.
Summary: Ark has some potential, but at this stage it’s a big waste of time to me. Its security model is questionable at best, and I don’t think it’s too wise for them to post on their website that it’s not really a concern because Ark isn’t intended for use as a server. That’s just not an excuse for handling security in this manner, and I feel sorry for any newbie who chooses Ark as their first Linux distribution, as once they use a real Linux distro they’re going to be very confused by actual user and root management.
Similarly, Ark needs to accept the fact that their OS probably won’t be the first thing on someone’s hard drive, and they need to fix their install scripts to allow for this. Perhaps the 3rd install option would have handled this correctly, but since it locks up when chosen, I’ll never know. It shouldn’t even be presented as an option if it’s not going to work, as having the installer lock up first thing doesn’t exactly inspire confidence in the OS you’re installing.
To me, Ark Linux is a Linux distribution made by someone who doesn’t really understand the basics of a *nix system, and who is thus trying to circumvent common methods and procedures in order to make it more Windows- or BeOS-like (in its install and package management methods at least). My recommendation is not to waste your time on it. Perhaps future versions will reinstate some of the missing security, but for now I look at Ark as a poorly implemented concept.
Vector Linux is based on Slackware, and is 100% compatible with Slackware 8.1 or earlier, I’m told. This is a good thing, as Slackware’s notoriously stable and well thought out; however, Slackware’s also notorious for being difficult to set up. Vector Linux gets around all of the problems of setting up a Slackware system, while providing some very well thought out and appreciated extras. Vector Linux’s website advertises a 100% Slackware-compatible system within 15 minutes, and it delivers on that by providing the user with a simple yet informative install process.
Vector’s forte is that it offers a rock-solid and fast Linux environment without any bloat. Their theory is that the user should design and build their own system, and I agree 110% with that thought process. Vector installs all of the basics you’ll need, and a few that you don’t, and then lets the user decide what to add on, and when.
And for me at least, Vector’s default offerings are more or less what I’m looking for in a desktop environment! They’ve included my favorite email program, Evolution, as well as my favorite browser, Phoenix (which is a stripped-down version of Mozilla, for those of you not aware). They’ve also included such necessities as Gaim (a multi-protocol instant messenger) and OpenOffice, not to mention KOffice, which is also installed.
Sure, there are a few things installed that I don’t need, but in general, Vector Linux out of the box provides me with most of the programs that I want. It doesn’t include much bloat, and in fact the installed system uses less than 500 Megs worth of drive space. Very nice indeed!
Let’s see what else we get with Vector…
As mentioned, Vector Linux is based on Slackware, and in fact is supposedly 100% compatible with older versions of Slackware (though not the newer 9.0 version, which was just released). This keeps with the trend of a lot of the newer Linux distros, which are built around a Slackware base (Yoper and College Linux are two distributions that also come to mind), so there must be something to Slackware’s reputation for being inherently stable and standards compliant.
This is good because we can also use Slackware packages and documentation to help maintain the system, rather than going with a totally new distribution that has little to no existing documentation or software available for it. This is a big plus for anyone putting together a desktop system, and it was one of the main pluses for both the Mandrake and Redhat releases reviewed above.
Vector Linux also includes three WM’s to choose from, and none of them is Gnome (although many of the back-end Gnome libraries are included, so Gnome-based apps like Evolution run without complaint). To me this is a good move. I do like some of Gnome’s features, but in general KDE is just so much more advanced than Gnome at this time. It’s good to know that I’m not the only one who likes Gnome-based apps, but not Gnome itself!
The included WM’s are XFCE, IceWM, and KDE 3.1, and all three have been outfitted with gorgeous anti-aliased fonts, in addition to well-fleshed-out menus containing all the apps included with the distribution. While KDE 3.1 is still my preference, I’ve always had a soft spot for XFCE, and I quite often find myself using XFCE for my root desktop and KDE 3.1 for my user environment. Let me just say that as nice as KDE 3.1 is, all three of the included window managers are set up more professionally and logically than any of the WM’s I dealt with in any of the other distributions!
Additionally, all three WM’s are set up as their creators intended, so 3rd party applications and add-ons act just as they should. This is so nice to deal with after Redhat’s and Mandrake’s customized menus and software installations. As I’ve stated, I understand why Mandrake and Redhat are doing what they do, but it sure is refreshing to see the software installed the way it was intended to be, and then to realize that it’s every bit as functional and logical (if not more so) than the Mandrake and Redhat hybridized menus.
In case you can’t tell, Vector impressed me very much! I was a bit leery about trying it, as historically Vector Linux has been known for its backwards compatibility and its ability to “play nice” on older hardware. I had envisioned a distribution full of older, outdated software that wouldn’t take advantage of my newer hardware. I’m glad to say I couldn’t have been more wrong!
Vector is very cutting edge, and yet you get the feeling that the system was put together by someone who truly understands their users. Whereas most other distributions don’t appear to understand their target audience, and thus install a ton of software in the hope that there will be something for everyone, Vector appears to “get it”. They put the best software in the install, made everything play nice together, and have come up with a desktop distro for the masses, IMHO.
But I can’t just end the review gushing now, can I? As a good reviewer I have to come up with some negatives, and Vector does have a couple of minor ones. Nothing that I would consider real problems, but certainly there are a few areas that could be improved.
The first thing I’d change is to update their website to reflect the power and “modern-ness” of their new releases. The “Hints and Tips” section of their website, for instance, refers mainly to versions 1.0 to 2.0, and is admittedly outdated. Similarly, the “How To” section of their website refers specifically to version 0.4, and is dated early 2000! These are exactly the sorts of things that gave me my initial impression that Vector was not going to be a cutting-edge desktop distro, and I’m sure such items are misleading to other people researching what Vector has to offer.
Another area I’d like to see improved is an upgrade to XFree 4.3. XFree 4.2.1 is included, as is 3.3.6 for older hardware compatibility, and while 4.2 meets my needs, I can see some people wanting the benefits of a 4.3 display system. I’m also aware that I can upgrade this component myself, but I would like to see it come as part of the default distro.
I’d also like to see the VASM tool upgraded to be a little easier to use. VASM is Vector Linux’s system management tool, and is used for everything from changing hardware and display settings to user and package management. It’s very capable, but it’s not too intuitive for those of us not familiar with either Vector Linux or its parent, Slackware.
For instance, I’m coming to terms with Slackware packages vs. RPMs, urpmi, and apt-get, but VASM’s two package management options still confuse me. Why two choices instead of one more comprehensive one?
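To make the comparison concrete, here’s a rough sketch of roughly equivalent operations with Slackware’s pkgtools (which Vector inherits) and the other tools mentioned above; the package names and version strings are just examples:
  installpkg gaim-0.59.8-i386-1.tgz   # Slackware/Vector: install a package file
  upgradepkg gaim-0.59.9-i386-1.tgz   # Slackware/Vector: upgrade it in place
  rpm -Uvh gaim-0.59.9-1.i386.rpm     # plain RPM equivalent
  urpmi gaim                          # Mandrake: fetches the package plus its dependencies
  apt-get install gaim                # Debian (or an apt-rpm setup): same idea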
One last item I’d love to see is the option to install an i686-optimized base instead of the i386 one. Not that it’d be much of a change (Vector absolutely smokes speed-wise, and is easily comparable to Yoper, which is i686 optimized!), but it would provide that geeky feeling of satisfaction you get when you know your hardware’s being utilized to its maximum potential. I realize this is a niche request, but it would be a nice option to choose during the installation process. For anyone wondering what “i686 optimized” means in practice, see the sketch below.
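Here’s a hypothetical rebuild of a single package from source with i686 flags; the package name is a placeholder, and the flags assume gcc with a standard autoconf-style build:
  tar xzf someapp-1.0.tar.gz
  cd someapp-1.0
  CFLAGS="-O2 -march=i686" ./configure   # tell gcc to target i686
  make
  su -c "make install"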
But such items are minor compared to the strengths of Vector. It truly is one of the most impressive distributions I’ve had the pleasure of using.
Summary: Vector is an amazing distribution. It integrates some of the best software available into its base install, and it runs flawlessly. Whereas some of the other distros I’ve used tend to come off as either thrown together or overly bloated, Vector comes through with flying colors, being neither. It’s based on one of the oldest, and arguably most stable, Linux distributions, Slackware, while providing the speed and software that today’s users demand from their systems.
There are some things that could be improved, but none of them really has any impact on the system’s functionality or its integration. I’d also like to point out that, with the aforementioned exception of XFree 4.2.1, Vector includes some of the most up-to-date software available, and manages to integrate it all into one well-thought-out package. Vector Linux is truly nice!
At first glance, a source-based Linux system seems like it would be ideal for my needs. I could have everything optimized for my hardware, and I would only have the software that I specifically wanted on my system. Those are both very enticing reasons to go with Gentoo or one of the other source based systems as I’m very anal about both issues; I want an optimized system, and I don’t want a lot of programs that I’ll never use taking up space on my hard drive!
The biggest drawback to going with a source-based system, though, is the time associated with it. The fact that it’ll take me 24 hours or more to compile and set up a basic KDE 3.1 system with all of the necessary components (XFree, sound, etc.) definitely makes me leery of going this route for my primary desktop system.
I’m someone who likes to try a lot of different programs in order to come up with something that best meets my needs, and the fact that I’d be faced with hours of compiling, just to try a program, is daunting to say the least. The reward would be a system totally defined by me, but the cost is the time necessary to set this up, and of course while my system’s compiling, I can’t use it!
And although Gentoo is by far the most popular source-based system, it’s also fairly intimidating in that I’d have to do virtually everything manually. This is changing somewhat as Gentoo readies their 1.4 release, but generally speaking, with Gentoo you have to set up everything manually, from the boot process to the programs themselves. I wish Gentoo had an installer similar to that of Sorcerer Linux, another source-based distribution.
I used Sorcerer a year or so ago, right before the notorious fallout between its users and its developer, Kyle Sallee. I won’t get into what happened (It’s documented elsewhere if you want to Google for it), but suffice to say that Sorcerer is still with us, although they did lose a lot of users and momentum during the fallout from back then.
Sorcerer, unlike Gentoo, uses a scripted, text-based menu for installation. It’s amazing how powerful this simple menu is when you compare it to doing everything manually, as with Gentoo. It’s one of Sorcerer’s main strengths, and it’s the main reason why I just can’t bring myself to use Gentoo.
I understand that Gentoo’s intimidating, but does it have to be so manual? Well, no… Sorcerer’s proved to me it doesn’t. So why don’t I just install Sorcerer, then, and use it for my dream system? Well, besides the time associated with compiling everything, which is the same as with Gentoo, my fear is that I’ll again build up a nice robust source-based distribution, only to have Kyle and his crew suffer another fallout and leave me stranded with a source-based system that’s not easily updated.
I’ve waited for over a year, and while Sorcerer’s still around, it has never attained the level of confidence I feel I need for my primary system. Their website is… well, amateurish at best, and their documentation leaves a lot to be desired. Additionally, their use of such geek terms as “cast”, “dispel”, and “the Grimoire” doesn’t exactly suggest a distribution that’s going to be around for the long haul. But they might surprise me yet, and I’m still keeping an eye on them as a possible side project for the future.
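For what it’s worth, day-to-day use of that terminology looks roughly like this; the command names are the ones Sorcerer itself uses, but treat the exact invocations as a sketch from memory rather than gospel:
  cast gimp      # download, compile and install the "spell" for gimp
  dispel gimp    # remove it again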
There are also other source-based distributions, including Lunar Linux (similar to Sorcerer), Source Mage (a fork of Sorcerer that was started during the aforementioned fallout with Kyle), and Rock, which I’ve no experience with.
The biggest problem the source-based distributions face, in addition to the time associated with compiling everything and the need to manually configure your system, is the fact that you’re more or less reliant on its creators in order to keep the list of available software up-to-date. With the popularity of Gentoo, I don’t see that being as big an issue for them as it would be for Sorcerer or one of the other smaller Linux systems, but again, Gentoo isn’t as friendly as Sorcerer showed me that source-based Linux distributions can be.
When Sorcerer had its troubles, one of the first things that happened was that the “Grimoire” (which is basically a collection of pointers to the sources that Sorcerer recognizes and can automatically download and compile for you) fell out of sync. This meant that we users at the time either had to learn how to update our Grimoires manually, or had to move to a different distribution if we wanted new software, and this is largely the reason for the fork to Source Mage.
Yes there were workarounds, but for a primary desktop system, we shouldn’t have to deal with workarounds.
I guess my ideal source-based Linux system would tie the support and user base of Gentoo to the ease of use that Sorcerer brings to the table. Sorcerer itself has the potential to be a Gentoo, but its lack of professionalism, coupled with the stigma Sorcerer has carried since its “problems”, means that this isn’t likely to happen anytime soon. For now, I’ll stick with pre-made bases, and then compile my way to the perfect system as needed.
Summary: Source based systems are great if you have an extra box lying around to use as your desktop until your compiled system is ready to go. If you’re using a source based system as your primary desktop, I’d recommend having a backup system to use just in case you do run into problems, or in case you need to do actual work while the compilation’s taking place.
Also, be prepared to learn a lot about Linux if you go the source-based route! This is one of the main arguments source-based advocates make when referring someone to such systems. And while I agree that everyone should know more about how their OS works, I can’t agree with forcing people to learn every myriad detail of a whole new architecture just to have a responsive, customized Linux system as their main OS.
I’ve used SUSE in the past and was always impressed with their configuration tools, but I did not include SUSE in this review. Why? Primarily because I wanted to evaluate different Linux distributions before spending my hard-earned cash, and SUSE doesn’t offer a free version of their software to try.
Well… not entirely, that is. If you dig around enough, you’ll find that SUSE does allow people to download and install directly from their FTP site, but I read several reports from people who have either done this or tried to, and all of them pointed to slow download speeds and missing functionality as problems with installing via this method.
Again, I’m not saying either claim is true, as I didn’t actually try installing this way, but the issues I read about are indicative of how SUSE presents itself: they’re here to sell you what is arguably one of the better Linux distributions available. They don’t advertise the fact that they offer downloadable versions of their software, even though they apparently do, and they evidently don’t go out of their way to make such downloads as easy to get or use as their retail versions.
Which is fine! One of the great things about Linux is that you can more or less do with it what you want. As I said, I’ve both purchased and used SUSE in the past, and while my memories revolve primarily around their SaX configuration tools, which even a couple of years ago were of very high quality, their releases were always good, well-thought-out distributions. For this project, though, I definitely want to try before I buy, and after reading about other people’s SUSE download experiences, I decided to skip their latest release.
This isn’t to say that I’ve written Suse off, but merely to point out that their current marketing strategies didn’t make them good candidates for this evaluation.
Final Words
So where does this leave me as far as my original goal goes? What system will meet most of my goals while providing a nice base on which to build for the future?
After a few months of experimenting and evaluating, in addition to irritating the hell out of my girlfriend (“I just don’t understand why you’re trying all of these different versions if they’re all Linux!”), I’d have to say that Vector Linux takes the cake!
Yoper probably would have won if they’d retained a Slackware-compliant backend and hired some people who understood their user base, but they did neither, and as a result they alienated a lot of people with their tactics. Additionally, Yoper’s high price and subscription model both make a good argument for looking elsewhere.
Vector Linux is rock-solid and mega-fast, especially when you consider that, like Redhat, it’s an i386-optimized distribution! Vector gives me most of the programs I would normally install anyway, and it gives them to me without complaining at all about dependencies. Add to that the fact that Vector looks good with its anti-aliased fonts and system-wide integration, and you’ve got a winner of a distribution!
I used to wonder why my Wine-enabled apps ran so much faster under Yoper than Redhat, and I’ve now come to the conclusion that it has something to do with the Slackware backend, as Vector is easily as fast as Yoper with both Wine and regular apps. It just smokes!
Additionally you get great support from the Vector forums, and a huge archive of software with which to play thanks to its Slackware heritage. But don’t take my word for it: Get out there and download a copy and see for yourself.
About the Author:
I live in Michigan and am currently employed as a Web Developer. I’ve been involved in computing for the last 20 years, give or take, with my first experience having been an upgraded Atari 400 my father purchased for us for Christmas one year. My first PC was an Amiga 500, and I’m currently planning my next box, which will be a dual-CPU AMD system. In addition to the countless hours I waste in front of a PC, I’m also a musician (primarily guitar as of late), I do graphic artwork (primarily computer graphics, but I’ve done more traditional artwork as well), and in addition to plants and horticulture, video editing is a growing hobby of mine.
I see only 7 distros listed in this article. IMHO, you’d need to review at least twice that many to do an article such as this, especially if you plan on recommending one over the others. I notice that Debian, Xandros, Lindows, Slackware, etc are missing from the list – that’s kind of a large omission.
what about debian. you need to add a distro with binary package capabilities such as deb/apt-get, the article includes mandrake and redhat, which are both rpm based, and gentoo which is source based, so add debian to make it complete!
First, Mac OS X is not based on FreeBSD. FreeBSD is the Unix base of Mac OS X’s closest cousin, but Mac OS X is based on Darwin, a BSD operating system based on BSD Lite 4.4 and the Mach kernel. Originally, Darwin also had some NetBSD and OpenBSD in it. Nowadays, Darwin stays in sync with FreeBSD userland commands, but the two are distinct, different beasts.
What is the thing about the Mac’s one-button mouse? You can get everything done with one button on the Mac OS. It was designed that way. Two, three, whatever buttons is purely optional. Any smart buyer will research to see if their mouse works with a new OS before plunging in.
Apple will not go x86. Forget it. Apple’s future is sound. They are a healthy, vibrant company. Not like Gateway which is bleeding to death. Apple will almost certainly go with the PowerPC 970, IBM’s new 64 bit chip. It is sampling now and will go into production in August.
Mac OS X can run unmodified right now on the PowerPC 970, though that would defeat the purpose of a 64-bit chip. Mac OS X will have to have some tweaks, but all apps will run unmodified on the new chip since it will handle 32-bit PPC instructions as well. Your hardware won’t be obsolete in 6 months.
And that Adobe FUD? C’mon. Even Adobe later expressed regret at posting errant information. They misinterpreted a study that was showing Photoshop was faster on Mac OS X. They didn’t read the fine print.
Obviously this reviewer is biased toward Linux. If you think you have to repurchase apps for Mac OS X, wait to see how many won’t run on Linux. Period.
Well, one can’t try all of them; it is a HUGE job to do so. Linux is Linux, no matter if the flavors are a bit different.
For the most part, Xandros, Lycoris and Lindows have a common goal (the desktop), so their offerings are similar and they were represented in this article by Ark Linux. Slackware and Debian are the geek/dev person’s Linux, and were represented by the similar Gentoo (even if they are not source-based exactly).
Overall, I believe that the article had at least one representative of each “Linux category”.
I might add I love Linux (Libranet Debian Linux!) and BeOS so don’t think I am some blind Mac fanatic.
It seems as though you were looking for the perfect precompiled system – and I’m glad you have found something close to it with Vector Linux.
Had you truly been looking for the perfect Linux system, there should have been only one stop: Linux From Scratch. You could have built yourself the perfect GNU/Linux OS, but then again you did say that you didn’t want to get your hands dirty (so to speak), which is an interesting contrast to the fact that you prefer the extreme (asinine?) configurability of KDE over the elegant simplicity of Gnome. Go figure.
Oh well.
freshmeat.com should say freshmeat.net
This guy didn’t mention anything about hardware support and getting it to work on their OS easily. How good is this in his Linux distros? I’d bet it would be horrible. Or did he suddenly decide to throw away the hardware support and just depend on the software?
Maybe his hardware was compatible, how do you know?
Show me a site or article where Adobe regrets posting errant information; the only thing they probably regret is that they got thousands of e-mails from Mac users claiming that Adobe is stupid, doesn’t know anything, etc. This was a good article; it’s a shame though that because of “marketing reasons” the author did not include SuSE. IMHO SuSE is undoubtedly the leader in user-friendly Linux. I have tried them all: Xandros, Lindows, Lycoris. All of them, mind you, have their little quirks and annoyances, but for me SuSE only has one. SuSE is definitely more polished, and I think overall it probably would have served the author’s purpose better than any of the others out there. Pay the $79.00; it is well worth the cost, and it is cheaper than most professional versions of Linux out there today.
There is no perfect Linux system. What’s perfect for the desktop can be a nightmare for servers. What’s perfect for servers can be a blocker for the desktop. What’s good for both can be bad for embedded devices, what’s perfect for embedded devices may make no sense for the desktop, etc. etc.
Example: high-throughput (for servers) vs low-latency (for desktops). A high-throughput Linux kernel makes the GUI less responsive. A low-latency kernel makes the GUI more responsive, but if you run a benchmark tool you’ll see that the system can process less data than a system running on a high-throughput kernel.
Too many things are mutually exclusive. Again: there is no perfect Linux system.
All of his arguments about the disadvantages were incorrect. Let’s start in order.
Point #1 – Being slow.
True, a dual CPU machine would help immensely with this (XP is SMP-capable!), but we’re looking specifically at the operating side here. Multiple CPU’s will also help to speed up my Linux box when and if they’re added (although Linux would of course require that I build an SMP-capable kernel).
Isn’t this actually an argument FOR Windows XP, not AGAINST it!
Point #2 – 3rd party apps
Windows has came a long ways as far as what it allows the user to do, but to still get the most from a Windows system, you have to typically use a 3rd party program. This often requires additional $$, and it often adds another application to those already running in the system tray. This in turn results in more CPU cycles and memory going towards a feature which I for one feel should be included with the OS itself. Virtual Desktops under Windows is an excellent example of this, in that if you want Virtual Desktops, similar to Linux and BeOS, you must run a 3rd party application to provide this functionality.
AHH NO – Microsoft provides virtual desktop capability and it works GREAT; go to the link below to download it:
http://www.microsoft.com/windowsxp/pro/downloads/powertoys.asp
Anyone who even wants to claim they know anything about windows knows about all the Power Toys Microsoft provides!
Point #3 – Annoying Event Sounds
I do a fair amount of music composition and recording on my PC, and as such, I strongly feel that a PC should be seen and not heard. I don’t need my PC going “beep” in the middle of a recording session just to let me know that an update’s available (or for any other reason for that matter!). So I typically turn all the system sounds off on my PC and save that scheme as my default.
Do I really need to say more than just TURN THEM OFF if you don’t like them!
Point #4 – Security
It’s Windows… It’s full of holes, and more are discovered every day. Enough said
And every day they fix them. I can’t count how many times I have read weekly posts about how new security holes have been found in OpenSSH, Apache or other Open Source software. Microsoft is the largest software company in the world – of course everyone is going to look for a way to break their software… it just so happens that people haven’t put that same effort into breaking Linux.
Point #5
Can’t optimize core system: By this I’m referring to the fact that Windows is a closed-source system. The users don’t have the access necessary to do such things as optimize the kernel for a particular platform. This isn’t a major problem, but when you consider all of the legacy applications that Windows supports, you naturally have to wonder how fast and responsive it could be if it was optimized and targeted at modern PC’s. Rather than have a kernel that runs on everything, it would be nice to strip out the things that don’t apply to my setup and optimize it for the hardware I do have. On the other hand, this is one of Linux’s strong points
You’re joking, RIGHT! How many other OS’s out there have backward compatibility of up to 8 YEARS (Windows 95)? Major Linux vendors break compatibility every 4-6 MONTHS. Come on, get real.
—-
I’m sorry, but after the first page of his article I had to stop reading. Now I’m not some Microsoft zealot, but you must give credit where credit is due!
He says that with Windows you can’t customize it enough, so he doesn’t like it. And what he loves about Linux is that you can customize everything to make it work exactly as you want. So you start to believe he would want a source-based distro so that he can optimize it for his system. Which is a fair argument in a sense, because he wants to compile explicitly for his hardware to get every last bit out.
BUT then he goes to say:
The biggest problems the source-based distributions face, in addition to the time associated with compiling everything, and the need to manually configure your system, is the fact that you’re more or less reliant on its creators in order to keep the list of available software up-to-date.
WHAT DOES THIS GUY WANT THEN? A distro that just happens to be completely compiled explicitly for HIS and only HIS system, without him having to do ANYTHING. He says he loves how you can configure Linux to the extreme but hates to configure it.
This guy DOESN’T KNOW WHAT HE WANTS from an OS. He just wants the OS to do anything and everything in the world, but if you asked him what that anything and everything was, he couldn’t tell you. Or his answer would probably be: “I want it to run fast, on all hardware, and to do everything I could ever want”. Which is the most vague answer in the world!
…which is arguably more alive and more functional on several levels than some solutions he did consider (BeOS).
linux sucks
use windows xp
yadda yadda
great points!!!
THANK YOU Anonymous!!!
I’m so glad not to be the only person who thinks it shouldn’t be necessary to optimise the code for the platform, and who believes an app shouldn’t need to be recompiled with every subsequent development of the kernel.
One of the biggest bad things in Linux is that you can’t reliably build an app to run on any distribution.
BeOS is alive and well through YellowTAB’s Zeta product that comes out soon. Get your facts right.
In fact, BeOS is **FAR** easier to install (and cheaper) than eCS 1.1, which I tried to install a few days ago with some unfortunate results.
Review please? ( when it’s installed properly :-p )
When and if it gets installed properly (I am still waiting for a newer ISO build that fixes bugs in the PCMCIA code, while eCS 1.0 was working fine on that machine before), yes, it is in our plans to write a review.
The author seems to be very pleased with the speed of Slackware-based distributions. I remember when I used slackware a while ago and it seemed a little faster to me too (compared to Redhat and the others). Is there a special reason for this? Maybe less bloat installed? With all the effort spent by Redhat on kernel development, I would expect it to be the fastest distribution around…
I don’t know how to put this in euphemistic terms.
Sorry, pal, but most of what the guy wrote was misunderstood by you.
The author says XP should be fast without SMP and you say this is an argument pro XP!
He says XP is loaded with legacy problems and you say it’s good for this reason.
Of course, you’re not obliged to agree with him, but you’re acting as if you didn’t understand the ideas.
Excuse me for being blunt, but do try to read more… you may get HUGE improvements in your skills. Heck, I’m not even a native English speaker!
Again, pardon me, my intention is not to be offensive at all.
Oddly I have not had the problems that some people report with backward compatibility. I have RH6.x RPMs of Xanim and RealPlayer for 7.x on my updated RedHat 8 system and things seem to work just fine.
Doesn’t the .so library system allow any number of major versions of any library to coexist and be available to the programs that use the specific version? The kernel is also very consistent within each major version. I know someone who regularly swaps kernels from 2.0 to 2.5 and everything in between without affecting userland software.
Is it the package managers (RPM, APT) that insist on creating these compatibility issues? I have at times had to go around a stink raised by RPM when installing an older package.
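To illustrate the coexistence described above: each binary records the soname it was linked against, so several major versions of the same library can live side by side (the library, paths, and version numbers here are hypothetical):
  ls -l /usr/lib/libpng.so*
  # libpng.so.2 -> libpng.so.2.1.0.12   (older apps resolve this soname)
  # libpng.so.3 -> libpng.so.3.1.2.5    (newer apps resolve this one)
  ldd /usr/bin/xanim | grep libpng      # shows which version a given binary actually uses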
I don’t know what to say except was that a review or a short story about someones dip into some flavours of GNU/Linux? Too many of the reviews I’ve been reading recently have been too shallow and not going far enough into the testing to give me any usable information.
Anon said: “I can’t count how many times I have read weekly posts about how new security holes have been found in OpenSSH, Apache or other Open Source software.”
Quality and security are measures which only mean something when compared relative to one another.
There is no such thing as absolutely secure; therefore you must expect that, once a vulnerability is made known to the vendor, the vendor will do their utmost to close the Window of Exposure ( http://www.counterpane.com/window.html ) as soon as possible.
For example, with the latest SAMBA vulnerability, once notified, the SAMBA developer owned up to the mistake and the SAMBA project released a patch within 48 hours. Redhat has already backported the patch into their distribution’s RPMs.
Meanwhile there are currently 14 KNOWN unpatched vulnerabilities in Microsoft’s Internet Explorer ( http://www.pivx.com/larholm/unpatched/ )
Some DANGEROUSLY EXPLOITABLE ones have not been fixed in over a year ( http://security.greymagic.com/adv/gm002-ie/ ).
Other inherent vulnerabilities, such as the Shatter attack ( http://security.tombom.co.uk/moreshatter.html ), Microsoft has known about since 1994!
Even if the API/call flaw is inherently unfixable, that is plenty of time for Microsoft to implement a safer method/system call/API, adapt its own applications to use the safer method, and deprecate the unsafe API.
It also appears that Microsoft’s own implementation of SMB is vulnerable and Microsoft has known about it for over eight years ( http://developers.slashdot.org/comments.pl?sid=59960&cid=5681769 ), but Microsoft either chose not to, or cannot, fix the problem themselves.
Microsoft is clearly not closing the vulnerabilities they are aware that exist in their products and services.
Even after Bill Gates’ email, Microsoft, by choice, remains neither secure nor trustworthy.
Microsoft’s attitude towards the security of its products, services and customers is abysmal.
From Jason Coombs’ A response to Bruce Schneier on MS patch management and Sapphire ( http://www.securityfocus.com/archive/1/315158 )
Microsoft Baseline Security Analyzer (MBSA) and Microsoft’s version of HFNetChk both failed to detect the presence of the well-known vulnerability in SQL Server exploited by Sapphire, which is one of the reasons so many admins (both inside and outside MS) had failed to install the necessary hotfix. MBSA and HFNetChk are Microsoft’s official patch status verification tools meant to be used by all owners of Windows server boxes.
…
In addition to designing MBSA to avoid scanning for SQL Server vulnerabilities, failing to update mssecure.xml reliably and in a timely manner, deprecating HFNetChk by pushing the MBSA GUI as its preferred replacement, and hiding the details of the technical limitations and internal security assumptions made by design in Microsoft’s security analysis tools, Microsoft pushes Windows Update (windowsupdate.com) as a safe and reliable way to keep Windows boxes up-to-date. Unfortunately, Windows Update isn’t designed to supply or verify the presence of SQL Server hotfixes, either.
None of Microsoft’s own hotfix/patch status scanning tools designed to prove “baseline security” were able to help administrators avoid Sapphire. This entire scenario, this comedy of errors, illustrates the security risk created by any organization that pushes security around from department to department, passing the buck and hoping that somebody else will know how to deal with the problem. The result is a system so flawed that it borders on the absurd.
Because of this continued inherent attitude to security, Microsoft’s products and services should be considered INSECURE by default.
Eugenia – “For the most part, Xandros, Lycoris and Lindows have a common goal (the desktop), so their offerings are similar and they were represented in this article by Ark Linux”
Oh, so Xandros, which has been repeatedly reviewed to be one of the best linux distros for desktop plus hardware/software ease-of-use (Debian base), is represented by a piece of ALPHA code?
It is represented in its IDEAS and its GOALS, not in the code. Stop being defensive, please.
eugenia: the reviewer did not even install gentoo linux, and i don’t think gentoo can be considered a representative of debian. yes, they both have a package management system which takes care of dependencies, but gentoo takes days to install and get right, debian takes a few hours, and apt-get does not need a whole lot of changing around to install new programs like gentoo’s USE variables, which are a blessing and a headache at the same time.
also the reviewer made a big mistake with YOPER, which is based on LFS !!! not slackware.
The article mentions the lack of virtual desktops in XP…..
XP Powertoys.
http://www.microsoft.com/windowsxp/pro/downloads/powertoys.asp
Virtual Desktop Manager
Manage up to four desktops from the Windows taskbar with this PowerToy.
too bad this isn’t available for windows 2000.
personally, i prefer vern, oneguycoding.com
this is a very powerful multiple desktop program, and you can control as many desktops as you want. it even has an autohide feature, and supports keyboard shortcuts to move between desktops.
Linux has become a pervasive operating system for developing and deploying enterprise applications. But there are many different Linux distributions, available through many different vendors, and so a key milestone for Linux is binary compatibility for its application stack. That is, ensuring that compliant applications will run on conforming distros — without recompiling.
The LSB was chartered to help ensure just that, and Red Hat 8.0 and UnitedLinux 1.0 are now both examples of conformant distributions.
http://www-1.ibm.com/linux/news/binary.shtml
See the Linux Standard Base website for more details.
http://www.linuxbase.org/
Because of the free licensing of the libraries used in GNOME, KDE, etc., it is easy to statically link or bundle a copy of those and other dependent libraries with binaries.
It will be possible to build an entire LSB-portable subsystem using the LSB Sample Implementation ( http://www.linuxbase.org/impl/ ) on any modern Linux distribution.
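As a concrete sketch of the bundling idea mentioned above (file and library names are placeholders): an ISV can either link statically at build time, or ship the shared libraries next to the binary and launch it through a tiny wrapper script:
  # Option 1: static linking at build time
  gcc -static -o myapp myapp.c -lpng -lz

  # Option 2: a wrapper script shipped next to a bundled ./lib directory
  #!/bin/sh
  HERE=`dirname "$0"`
  LD_LIBRARY_PATH="$HERE/lib:$LD_LIBRARY_PATH" exec "$HERE/myapp.bin" "$@"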
computers suck (ALL OSs) go back to using pencil & paper…
From the anonymous post: “Point #3 – Annoying Event Sounds. . .Do I really need to say more than just TURN THEM OFF if you don’t like them!”
From the original article: “So I typically turn all the system sounds off on my PC and save that scheme as my default [emphasis added]. . . .So you can imagine how upset I get when I boot up Windows and hear Windows default opening sounds play. This is my cue that something has changed within Windows, and that Windows in its infinite wisdom has reconfigured itself again without asking me.”
Since it was a lengthy article, I guess you missed this part.
Read the EULA (End User License Agreement) that comes with Microsoft’s software and XP’s Virtual Desktop. Those licenses are quite a limiting factor in comparison to the flexibility of the X11 client-server model using open source (and even some proprietary Linux-based) licensed applications.
I can view and run my X11/GNOME/KDE applications on any remote PC, X-terminal, notepad, or PDA that has either an X11 server or a VNC client and a network or wireless connection. Because of the licensing, I can do so without fear that the BSA will knock down my company’s door and demand lots of money. I can host client applications on many remote application servers, to boost security, performance or ease of maintenance, without having to renegotiate or buy more per-server, per-user, per-client licenses.
In the future enterprise, such flexibility is key.
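As a rough illustration of that flexibility (hostnames and the application are placeholders, and X11 forwarding is assumed to be enabled on the server):
  ssh -X user@appserver gimp     # run the app on the server, displayed locally
  DISPLAY=mydesktop:0 gimp &     # or point a client straight at a local X server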
Allow me to quickly cover a few points that have been made:
Yes, I didn’t review a true Debian distribution. Regarding the apt-get points, however, you can install such a system (apt-get, that is) on a number of Linux distributions, including Redhat, and I must admit that it does make maintaining a Linux installation much easier. Why Redhat and others don’t include it in addition to their RPM setups is beyond me.
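Roughly, the day-to-day workflow I’m referring to looks like this once an apt port (such as apt-rpm) is installed on an RPM-based system; “gaim” is just an example package:
  apt-get update            # refresh the package lists
  apt-get install gaim      # pull in the package plus any dependencies
  apt-get upgrade           # bring the whole system up to date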
Regarding the missing distributions, Xandros and Lindows both cost $$ to play with, and thus they fall into my “Suse category” in that they don’t provide a free, downloadable version to try. Additionally, I didn’t care about these as neither provides a KDE 3.1 WM, which is what the article states I was looking for.
Hardware analysis was more or less ignored as all distributions treated my hardware equally (with the exception of the X issues that were noted). None of them saw my scanner (a Visioneer), and none set up my 7-button mouse out of the box (they all see it as a 3-button wheel mouse). Other than these two issues, one of which is easy to fix (the mouse!), I didn’t feel that hardware was an issue worthy of making the article any longer than it already was.
As for Anonymous’s Windows comments… In brief, the virtual desktop add-on you’re referring to is an add-on! If you’d read the article, my point regarding virtual desktops is that they should be built into the system. Your solution is just another item that should be there when I purchase my copy of Windows, as far as I’m concerned.
Similarly, as for the sounds, my point was that I do turn them off, but Windows “resets” them on occasion, turning the defaults back on (I honestly don’t care if you read the article, but at least read what I’m saying before taking a sentence or two out of context. Sheesh…). 8)=
I haven’t had a chance to use eComStation yet, but I continue to hear of problems with it, so I’m not in any hurry to try it out either. Eugenia’s install problems seem to back that fact.
As for Yoper being Slack based… It really is. Kinda…
Check out some sources on the web: they did resort to an LFS mentality, but they also based much of their distribution on Slackware (hence why some packages such as Dropline Gnome initially supported Yoper, but have since stopped due to Yoper’s diversion from the Slack mentality/file arrangements). My opinion’s largely based on postings on the Yoper site (the original one that was deleted a couple of times), as well as discussions with other Slackware users. If it’s currently incorrect, that just goes to show how much they really have diverged with the final release, which I admittedly haven’t used. As of RC4, however, this was still more or less true, and it was a noteworthy point, I feel.
By the way, if you like Gnome and are using a Slack-based distro, you should check out the aforementioned Dropline ( http://www.dropline.net/ ). It’s very nice, and is a better release than Redhat’s version IMHO! 8)=
– Tony
This article was great for the most part. Anthony Hicks seems to be a good writer, and (amazingly) I spotted no grammatical or spelling errors in the parts I read!!!
His comments were accurate concerning Windows, Linux, and OSX. I enjoyed it a lot and I think he and I would agree on most everything if we were discussing these issues in person.
Then, for some reason, at the end he decided to go through reviewing Linux distros. This was an awful idea. As Darius pointed out, it is incomplete. Also, he “reviews” distros he hasn’t tried, such as Gentoo, and as opposed to writing a real Sorcerer review he simply mentions that he used it at one time and that he thinks the terminology they use is dumb (it is).
My recommendation to anyone who hasn’t read this yet: read up until the Redhat chapter, then stop.
It’s a frequent claim that “XP is stable”.
In my experience XP is less stable than W2K and about on par with NT 4 (yes, even when using XP drivers only on “Designed for XP” hardware) when fresh out of the box and Service Pack’d. W2K is easier to install or change hardware on than either NT or XP; by easier I mean likely to succeed without a lot of screwing around.
From my experience, W2K was the pinnacle achievement of the Windows program running on top of the David Cutler hacked OS/2-derived code that Microsoft ever sold. Performance-wise, XP introduces some severe deficiencies when compared to W2K (even after stripping out all the XP fluff).
Similarly Windows 98SE was the pinnacle of Windows with DOS underpinnings, being a slight improvement over Windows 95 OSR 2.1.
Since W2K and 98SE Microsoft’s emphasis has been on drawing more attention to Windows itself as opposed to actually improving it. Simplification of the user interface has made it much harder to configure it.
This has opened the door for truly innovative Operating Systems to make a play for the desktop. I have close personal experience with an OS X eMac that handily outperforms an OEM XP box that has nearly twice the clock speed. Albeit that little raster shift defect took two warranty repairs to correct.
I use and install Windows (all flavors including 95 and later) on a daily and weekly basis and use QNX (6.1) every day. I have installed and use well over thirty different Linux and *BSD distros since OpenLinux 1.2/RH 5.0 (and prefer SuSE 7.3 so far) as well as LinuxPPC (2K), UnixWare (7), NeXTStep (3.1), BeOS (4, 5), Solaris X86 (2.5, 7, 8) as a desktop and/or server and use OpenServer (8), Netware (4, 5) and Linux (RH 7.2) daily as a Server and Windows (NT, 2K) daily as a “Server”.
My experience with desktops goes back to daily use of CP/M Z-80 and Apple DOS around 1981 then on to DOS (2 – 5) and Desqview, then DOS 6/Windows 3.11 and then Solaris 2.5 X86, Windows 95, MacOS 8, etc..
Windows doesn’t come close to any Server when used as a “Server” but that’s getting off-topic.
When using an Operating Environment (as opposed to an Operating System, thereby including Windows) I want to be able to tell it what to do, not the other way around. Using Windows, this is inherently more difficult, with less predictable results.
The DOS based Windows allow for a virtual “smack on the side of the head” to force it into submission, but NT type Windows do not recover as well from brute force. While initially a little stronger, they collapse rather quickly into a steaming pile of P00 when such tactics are employed.
*nix, especially Linux, gives the experienced user the ability to intervene when more knowledgeable than the machine itself, and has stability to go along with the package. I don’t have to guess what answer to the trick question some wizard is asking me to get the desired results. If I know what I want, I get it. Linux hardware support is BETTER than XP’s, and Linux has enough applications (many free) to do most or all of what most people need or want to do with a computer.
The author complains about having to compile stuff; this is an option with Linux to gain performance, and it’s not an option with Windows. You can’t optimize Windows for your hardware; you have to use a precompiled system that may be supporting a lot of hardware that does not exist on your system.
While BeOS is wonderful, the shortage of apps (although less so than QNX) is certainly a hindrance in being able to achieve the daily burden of menial tasks. I wish its IP had been transferred to a more benevolent entity.
As was mentioned, two of the more prominent challengers in the desktop Linux arena were omitted from the review (Xandros and Lycoris). Perhaps even Lindows should be considered as well, despite their questionable ethics.
eComStation/Warp 4.51 are lacking only one or two things: Crossover Office/Wine and/or X. Or, if project ODIN were to somehow surge ahead in development, it could be a major contender.
With any computer, making sure the hardware and software are well matched is a crucial plan if ease of installation and a stable system is desired (although finding this combination with XP seems to be less elusive than NeXTStep but certainly close to OS/2).
eComStation seems a little expensive for my blood, but I’ve always had a soft spot in my heart for OS/2, so I may break down and get it.
I find XP to be remarkably stable, I cannot wait for Windows Server 2003 to be released. I have heard from several testers that server 2003 is da bomb.
Eugenia wrote:
“For the most part, Xandros, Lycoris and Lindows have a common goal (the desktop), so their offerings are similar and they were represented in this article by Ark Linux. Slackware and Debian are the geek/ dev person’s Linux, and were represented by the similar Gentoo (even if they are not source-based exactly).
Overall, I believe that the article had at least one representative of each “Linux category”.”
Not a chance.
They were all RPM- or source-based distros. Packaging (software installation) is a major beef among many users. There are 13 distros that use a Debian-based system, including Xandros, Lindows, Libranet, Debian itself, and an entire region of Spain.
Now perhaps one could argue that Mandrake has urpmi available, but I saw no mention of it explicitly in the article.
So to not have one Debian-based distro in the review is a major omission and, sorry, just not representative of Linux.
Overall I thought this was a pretty good article. I really think his whole section on Gentoo was terribly incomplete, though. It almost seemed as if he didn’t bother to create a whole system. How can anyone write anything about Gentoo without mentioning Portage? Portage alone sets Gentoo far ahead of all other distros in my opinion. Once the system is built it just ownz. 24 hrs is a small amount of time to invest in the big picture of things. When you’re done you have a highly optimized system with ULTRA easy install, uninstall, and upgrade capabilities for everything… and the added bonus of never having to worry about RPMs or dependencies… EVAR… no… seriously. 🙂
Bay:
The fact that you say that Linux has better hardware support just shows your bias and the fact that no one should even bother reading your comment.
To the people who were complaining about this: I would like to point out that if you want a more customizable system, you can always install a shell other than Explorer.
Take a look at http://www.desktopian.org/ (shell homes section) and you can see that there are almost as many different Win32 shells as there are window managers for Linux :-P. LiteStep, Object Desktop, Aston and GeoShell are probably the ones to watch out for; 3DTop (a 3D OpenGL shell) also looks interesting.
http://kde-cygwin.sourceforge.net/
Now that was funny!
But seriously, of the Linux distros I like Mandrake for no other reason than it found all of my hardware (ATi All in Wonder Card, Epson Stylus C42 printer, dLink net card, Kyro Video Card, Firewire Card, SCSI card and Soundblaster Card) and even configured them correctly without me having to do anything. Redhat could not even figure out the video card without me having to manually configure it.
To me, this one step alone in the setup is worth my loyalty.
They are coming, slow but steady.
http://www.desktoplinux.com/news/NS7578819963.html
http://www.infoworld.com/article/03/04/05/14casecurity_1.html
Big proprietary software companies porting their wares to the Linux platform. Who’d have figured?
For example, with the latest Samba vulnerability, once notified, the Samba developers owned up to the mistake and the Samba project released a patch within 48 hours. Redhat has already backported the patch into their distribution’s RPMs.
Really? Did you just pull that number out of your ass? The patch was written within an hour of the hole being found, and then released four days later.
http://developers.slashdot.org/comments.pl?sid=59960&cid=5681812
is the fact that you’re more or less reliant on its creators in order to keep the list of available software up-to-date.
>>>>>>
With Gentoo, this really isn’t necessary. Gentoo uses build scripts to download and configure software. The build scripts are very well documented, very short (usually less than 50 lines, most of it boilerplate), and very adaptable. As a result, it’s quite easy to write your own scripts when a new version of a package comes out. I don’t know bash scripting (what these scripts are written in) and I’ve still been able to adapt some for new programs. Since it’s so easy, the Gentoo forums are full of user made ebuilds for all sorts of software, even for CVS versions. The result is that, although it has a much smaller userbase, Gentoo’s software repository is very large and very up to date.
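Just to show how little is in one of these build scripts, here is roughly what an ebuild looks like (the package name, URL and exact variable set are made up from memory, so treat it as illustrative rather than a working example):

  # foo-1.0.ebuild : hypothetical package, mostly boilerplate
  DESCRIPTION="Example package showing the shape of an ebuild"
  HOMEPAGE="http://example.org/foo/"
  SRC_URI="http://example.org/foo/${P}.tar.gz"
  LICENSE="GPL-2"
  SLOT="0"
  KEYWORDS="x86"
  DEPEND="dev-libs/libxml2"

  src_compile() {
      econf || die "configure failed"     # runs ./configure with the right prefixes
      emake || die "make failed"
  }

  src_install() {
      make DESTDIR="${D}" install || die  # install into the sandbox image directory
  }

Bumping a package to a new upstream version is often just a matter of copying the old ebuild to a file with the new version number and re-digesting it.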
If the guy was leery of spending 24 hours just to get the system up, gentoo also has precompiled binaries. I think that the fact that he didn’t even realize this and give it a try shows that he didn’t put any really serious effort into finding out about Gentoo.
The fact that you say that Linux has better hardware support just shows your bias and the fact that no one should even bother reading your comment
No, he’s right. Pound for pound, linux has support for more hardware than Windows XP. Windows XP probably has more support for new hardware specifically, but if you want to do a piece by piece count, there is no doubt in my mind that linux would win.
That is easy enough to see simply by looking at the number of platforms linux runs on.
I find it funny how people are willing to defend Microsoft on security. Microsoft has come out and stated that its products weren’t designed for security. This was less than a year ago. A tenet of computer science is that a system must be designed for security from the very beginning, or it doesn’t work. To believe that Microsoft has turned around a 37-million-line codebase (which doesn’t include non-Windows MS products like SQL Server) is just silly. Microsoft announced some security initiatives last year. If it can stop the feature kick* and concentrate on security and stability, in a few years you might be able to trust them on security issues. Until then, it’s just pointy-haired marketing-speak (like the word “initiative”).
* I honestly thought MS had changed when Win2K came out. Win2K was a remarkable advance in stability and performance. Then XP came out and they backslid again. It’s okay on the desktop, but WinXP isn’t really suitable for a high-load, mission-critical server. Now, talk of shoving SQL Server (WinFS) and the .NET CLI into the kernel leaves me with little faith that MS will ever resolve its security/stability issues.
you forgot the best of all… Debian.
How many viruses and worms have actually penetrated your Windows box? Personally, I’ve had 0, and I’ve been on every version of Windows since 3.1. I got a simple free virus scanner, and it detected one virus from my roommate’s computer. How important is security to you? It’s only important if you have important stuff to HIDE or if you run it in a high-risk place like a web server. You guys really shouldn’t be talking about security unless you’re doing risky stuff, like going on IRC to get your pirated MS software.
Please. I used them for about 5 minutes before uninstalling them – the PowerToys virtual desktops are awful, little more than an automatic minimize/maximize utility. Compared to any X implementation it felt clunky and unnatural.
Anyway, rant over – I thought this article was fair; nice to see somebody acknowledge the importance of Wine [I hack on it].
Just because you’ve been smart and lucky, not getting any viruses and worms, doesn’t mean MS is secure.
The recent slowdown of the internet has proven that Microsoft’s most recent full releases of software are NOT secure. The Slapper worm, as well as the recent RPC insecurities, proves as much.
Yes, I agree that it’s important to note that there have been fixes available, but apparently they aren’t trusted by many, many IT managers and home users. The SQL Server fix was basically undone by later MS updates; why is this? I install an MS-sanctioned fix on a critical server and then apply updates later, only to find the vulnerability in SQL back. This is ridiculous.
Until MS can produce easily applied updates that don’t reintroduce prior security holes, I will deem them too insecure for the server room and leave them behind a firewall for necessary workstations only.
Sorry, it wasn’t slapper, it was called Slammer.
My apologies.
Played out; let’s think of something new to write about.
To the author:
However much of this customization is handled only by 3rd party apps. Windows has came a long ways as far as what it allows the user to do, but to still get the most from a Windows system, you have to typically use a 3rd party program.
Well, since Linux pundits like to claim that Linux is just the kernel whenever you mention that your desktop environment has crashed, isn’t it then technically true that everything in Linux is a 3rd party program?
This often requires additional $$, and it often adds another application to those already running in the system tray.
http://download.com.com/3000-2094-1539340.html?tag=lst-0-1
This in turn results in more CPU cycles and memory going towards a feature which I for one feel should be included with the OS itself
I’d like to know which built-in Linux feature (such as virtual desktops) does not use CPU cycles and memory.
Can’t optimize core system: By this I’m referring to the fact that Windows is a closed-source system. The users don’t have the access necessary to do such things as optimize the kernel for a particular platform.
Why is it that when people who are used to Windows try looking for ‘setup.exe’ in Linux to double-click on to install something, they are chastised for not ‘thinking differently’ and are told that you don’t do things in Linux like you do in Windows? However, when using Windows, what do Linux users do? They bitch and moan that they can’t compile/optimize the kernel. HELLO??? Haven’t you learned anything yet? There are PLENTY of ways to optimize Windows – it’s just not done the same way as in Linux. Yes, I know Linux has capabilities that Windows doesn’t – this, however, is not one of them.
PS – What is the name of your 7-button mouse? I’d like to check one out
Rayiner Hashem
I find it funny how people are willing to defend Microsoft on security.
I don’t defend them and say that it is as secure as Linux is, but I think the problem is blown way out of proportion by some people and made to seem worse than it actually is.
Most of the holes that are found are variants of other holes, most of which have been patched for months. Just because some people refuse to take pro-active measures and continue to double click on any kind of attachment that comes down the pipe, well … that’s basically the same thing as leaving your BMW unlocked with the windows down in a bad part of town, and then complaining that your car got stolen.
No Debian ??? lol
the review >> to the trash…
You’ve done a great job with this review! The only drawback is the mysterious absence of Debian GNU/Linux. Otherwise, this was a great read.
Good job, power to you!
–Mike.
How can anyone take a reviewer seriously who doesn’t even test the products in the comparison? His remarks about doodling with OS X make me believe he hasn’t sat in front of an OS X machine for more than a few hours, ever. Get real, this review is a waste of time… back your “review” up with some real data of your own instead of culling a bunch of misconceptions from the web and typing them up as being original and informative.
Nicely written rebuttal, however I must please beg everybody to STOP with the car comparisons! As shown by the recent statement by Gates (I think) about the car industry and development, cars are NOT computers. Car security cannot be compared to operating system security. A computer is infinitely more complex than a car in its modes of entry and uses. A car does one thing, with no 3rd-party developers: it drives. A computer can be used to do many, many things, must be fairly open as a platform (yes, even MS Windows) so that it is available to 3rd-party developers, and is ever changing. A car is a car is a car. It can’t blow up while you’re driving it, but that’s about it. The developers don’t have to worry about being backwards compatible, or anything other than driving down the highway. Please stop it with the analogies.
Oh, and it’s easy to lock your car; it’s not exactly easy, or very well advertised, that you have to use Windows Update (wow, I really wish they would pop up windows making people do it, and power users couldn’t complain because they’re the ones who want computers to be secure). Even an idiot knows enough to lock their car.
Ok, maybe the car is a bad analogy, but …
Oh, and it’s easy to lock your car; it’s not exactly easy, or very well advertised, that you have to use Windows Update
I don’t know about this. Though I have never seen anyone get hacked, I have seen some people get viruses – and most don’t install anti-virus software until after the fact, even though they knew better.
Even so, that’s not the point. The point is that when an anti-MS zealot tells someone new to computers that ‘Windows is prone to viruses and full of security holes’, I doubt very much that they’d add in ‘However, if you use Windows Update regularly and install anti-virus software, you will probably diminish your chance of getting hacked or getting a virus by at least 90%.’ That is what I mean about people misrepresenting the security issue. A little proactivity goes a long way.
Just look at
ftp://ftp.suse.com/pub/suse/i386/current/README.FTP
and install it with minimal downloading (no need to get all those ISOs) FOR FREE!!
Cheers Udo
I use Win2K, have supported WinNT 4 and Win95 desktops, and have a Mac OS X box at home with marginally better performance than I get from my Win2K box. Surprising, since the OS X box is a 5-year-old G3 400 MHz (upgraded slightly) and the PC is a P3 733. The PC actually has more RAM, and I give it some credit because my network at work is probably creating some additional load that I wouldn’t have at home.
My actual point is that the test of Mac OS X didn’t really happen. Though I appreciate the writer acknowledging this, I’m not sure most people read thoroughly enough to notice every detail; some skip to the pluses/minuses and move on with life.
OS X is still maturing, I like it quite a bit and think it could have a chance at a larger market in 1-2 years when the economy and hardware issues are resolved and in the past.
Well, I think a desktop Linux distro shouldn’t just have a nice desktop; it has to be “nice” from the ground up.
There needs to be a system (command-line based, with 3rd-party GUI tools) for installing EVERYTHING. By this I mean kernel modules (binary) and RPMs (dpkg isn’t as good as RPM, and apt isn’t a package format, it’s a package manager).
So I can download an RPM called logitech_latest_drivers.rpm, and after I install it, it will let me use any of Logitech’s latest mice on my system.
As for the file system, the current hierarchy is a MESS! There shouldn’t be /opt, /usr, /dev, /root, /var, /etc, etc., etc. There should be /hardware (which has /dev in there), /users (which has /home), and /system, which holds all other dirs, like /etc and /usr (/usr/local should be abolished anyway!).
Linux distributions need to have a standardised registry system, for reasons of consistency. I should open Mozilla for the first time, and it should ask me if I want it to be the default browser. If I say yes, it should set a value/key saying so, and all other apps should look at this key if they need to open a browser window. At the moment, some apps look for known browsers in $PATH, but an app should not have to know about a particular browser; it should read an arbitrary list of browsers (that have appended their details to the registry) from /system/preferences/applications/browsers.xml and offer them as a choice to the user.
This should be the same situation for all kinds of apps: DVD players, email clients, etc. For example, everyone wants Mozilla to be able to use Evolution as its mail client; for this to happen, the Mozilla developers HAVE to implement the backend code and the GUI for opening the email client. If there was a registry, Mozilla would just have to read /system/preferences/applications/email.xml and launch the default mail client. No default? OK, open Moz mail. When THAT opens, ask me if Moz mail should be the default. I say “yes”, email.xml is updated, and Mozilla and any other app which needs to access email from now on opens Mozilla mail to send a message. Easy. Nice. Integrated feeling.
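To sketch what I mean (none of this exists today; the path, the file and its format are all just my proposal), an app wanting the default browser would need little more than:

  # look up the default browser in the hypothetical registry file
  BROWSER=$(grep '<default>' /system/preferences/applications/browsers.xml \
            | sed 's|.*<default>\(.*\)</default>.*|\1|')
  exec "${BROWSER:-mozilla}" "$1"   # fall back to mozilla if no default is set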
I use Linux every day (Red Hat 8) and don’t have Windows on ANY of my systems, but man… sometimes I think it just sucks! We NEED to have a registry… whether this is like the Windows registry, or GConf, or XML files accessed with libxml, I DON’T CARE, but we need one NOW! freedesktop.org submission, anyone?
You can plug an effin 18-button mouse in if you want one! Kensington has one near that which is Mac only…
When will people see you can PLUG IN ANY MOUSE, not just the cruddy one-button one? Jeebus.
I don’t really understand why people are so worried about Apple’s future. Let me review their CPU options again:
1) Stick with Motorola – No compatibility issues here.
2) Move to IBM PowerPC – No compatibility issues here because it’s still a PowerPC chip. Programs built for older PowerPCs would still run on it, albeit unable to make full use of the new chip’s 64-bit features.
3) Move to x86 or derivative (Opteron) – Only limited compatibility issues here. GUI applications on Mac OS X are, in fact, directories in which are contained separate files for code and resources. It is already possible for a single “bundle” to contain both Classic Mac and Mac OS X code. It would be fairly simple to extend this mechanism to support having an x86/Opteron version of the code as well. Thus to an end user the transition would be completely transparent. Double-clickable applications would work on either platform seamlessly.
Mandrake 9.1 is better than Redhat 9 as far as ease of installation and finding and configuring everything goes, and WinXP is great just for all the damn gamez, appz and other warez out there for it… am I the only one that thinks Macromedia tools are the shit?
Interesting read and, for the most part, at least with the proprietary OSes, true, but:
I would not count the descendants of BeOS out of the OS race. Sure, it has been a while since anything new has come along after BeOS 5.03, but with the tumultuous time that Be Inc., Palm, and everyone else affected have had, what would you expect?
As an OS it has user functions that I really miss and will appreciate when Zeta is released. I also believe that interest from Steinberg and some other audio app vendors will be rekindled with YellowTAB and OpenBeOS. At the moment I am stuck with XP, but I know that I will not have to deal with this nightmare for too long.
BTW, before the WinXP defenders come rushing in to flame me for my comments, I ask you this: have you tried to set up XP and Win2K for professional audio work? If you can honestly say yes to that question then you have the right to comment, but if you haven’t (and I mean 24/96 recording with multi-channel audio capability and running multiple VST instruments), then save your breath.
I run Cubase SX on an SMP system with Hoontech DSP24 C-Port sound hardware, and it took quite a long time to get it running OK. Even then I still get crashes that are just not kosher. Most are to do with the sound driver in some form or other, but this should not be happening, and it is not just my sound hardware; as notice boards will attest, many users with similar sound hardware get the same problems. Have a look at http://www.cubase.net and read some of the forums to get an idea. It is one of many audio forum sites where users are having fun and bleeding to get functionality out of their hardware.
So you want to destroy 30 years of a proven and workable filesystem hierarchy and bring in something that is completely new? /hardware wouldn’t make sense because my /dev/null is NOT hardware, and neither are various ttys; it would make less sense than /dev, which is short for DEVICES.
What about already written UNIX software that uses the standard /etc directory for things like configuration files?
This also breaks compatibility for other software that is written with POSIX in mind. There are much better ways of making Linux easier to use; changing the filesystem willy-nilly is not it.
I’d support a registry style of system only if it can be changed without a GUI, and is built with that in mind. I would not support anything built with the GUI in mind and then given some half-assed command-line implementation. Build for the command line, then build the GUI around it, or none at all. Too many system admins rely on shell scripts for administration to just remove that. And where Linux will shine is the business desktop, where machines are maintained by sysadmins, not users.
I would rather see linux falter on the user desktop and stay with the ease of administration than see it shine and have it like Windows, all gui, no real administrative gains.
I am not one to go clicking around, I like to get my work done quickly, editing text files is far more efficient than clicking through screens or menus, especially for administration of large numbers of systems with tiny differences in each workstation…
I would support a darwin/next/os x style of configuration system though, it’s easily administered from the command line.
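For what it’s worth, that NeXT/OS X style of configuration boils down to plain property-list files that the standard defaults tool can read and write from a shell; roughly something like this (the domain and key here are invented purely for illustration):

  defaults write com.example.MyEditor DefaultBrowser -string "Mozilla"   # set a key
  defaults read  com.example.MyEditor DefaultBrowser                     # query it back

Because it is all driven from the command line, it scripts and remote-administers just as easily as editing text files does.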
Let’s see: I’m using FireWire hard drives, CD-RW drives, and USB mice/keyboards and cameras just fine on Redhat and Slackware systems, all autodetected with no driver installation required.
The driver problem on Linux only happens when you buy bad hardware, like winmodems, without ever thinking you might need to use it on an open OS like Linux. Remember, with Linux the source code is available, so any company that wants to keep its drivers closed will not be able to work with our OS.
@Stephan: That actually wasn’t a car analogy in the traditional sense. He merely mentioned that if you don’t take care to protect your expensive object, it can easily be stolen. It just happened that the expensive object was a car.
@Idealist: The current directory hierarchy is the way it is for very specific reasons. It makes a great deal of sense, if you get out of your “the desktop is everything” mindset. Just remember that UNIX was designed for a networked, corporate environment, not a home-user’s desktop. Everybody keeps suggesting these simplistic Windows/MacOS style directory structures, without realizing that they absolutely don’t work for complex networked environments (think: corporate desktop) where access to resources must be transparent and security is paramount. I have yet to see an alternative suggestion that maintains the same level of flexibility or does anything more than change some funny-sounding directory names.
– /etc could very well be named /config. It stores configuration information.
– /boot is for the kernel and boot configuration. This directory is supposed to not even be accessible except during kernel update, because the kernel is such a critical component.
– /bin and /lib are for binaries that need to be present before network filesystems are mounted.
– /sbin is for absolutely critical system utilities that must be separated and protected from the more often updated utilities in /bin and /lib. It’s also for programs that only the system administrator needs.
– /usr/bin and /usr/lib are for the bulk of apps. There needs to be a separate directory for this, because in a great number of environments, /usr exists on a central network directory. There is really no point duplicating the same applications on every workstation in some sort of /programs directory.
– /dev is just your /hardware with a different name.
– /tmp is for temporary files. Often, temporary files are put on a separate fast drive to optimize efficiency.
– /var is for logs and other information. Having this be a separate directory is very important, because some applications throw a large amount of data into /var, and it might need to be on a separate/fast drive to isolate this data from other files.
– As for the /usr/local, it’s hardly useless. What if a user compiles an app and wants to run it? Are you going to give that user write access to a shared /usr directory?
– /usr/share is for non-program shared files like documentation and pictures.
– /home is for user’s home directories. Even MS has this, and it doesn’t have a stupid long name like “Documents and Settings.”
– /opt sucks. I agree. It’s there to support self-contained applications that use directory hierarchies from external OSs. These are usually ports to UNIX. Taking a look at my /opt directory (in Gentoo), of the 8 entries in /opt, 6 are ports.
You make a few decent points. Configuration needs to be cleaned up. I really don’t like the idea of putting all user-specific configuration into dot-directories in the user’s home. A central XML (binary would be worse than the current mechanism) configuration mechanism would be nice. /etc needs more subdirectories. The /usr directory has some cruft in it. /usr/X11R6 should be distributed into /usr/lib and /usr/bin. /usr/man should be in /usr/share/doc. GCC needs to be less psychotic about where it puts its headers (in /usr/lib!). There is a lot of weirdness in several places, but the core layout is fundamentally sound.
Of course, this whole argument about directory layout is largely pointless. In UNIX, directory layout doesn’t really matter to the end-user, just the system admin. People stay in their home directory and don’t wander elsewhere. This makes sense on the network, and makes even more sense for the home user. The only issue is that on most desktops, users are also administrators. This just means that the basic configuration tools should abstract the filesystem as well, not that the filesystem itself should be changed.
If anyone wants to keep their systems secure they just won’t be running windows. They won’t use it for email, they won’t use it for web browsing or downloading viruses. Use Linux and all your virus problems go away.
Security?
One of my linux boxes was rooted a few years ago. I caught the intrusion less than a day after it happened and immediately isolated the box. Since then I have not put another poorly configured Linux box on the internet, and everything runs NATed behind a firewall. I have not gotten a single Windows virus (except Windows itself) or ever been hacked since. That’s security and I doubt you can get that with M$ products. But don’t let me stop you from trying.
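For the curious, that NAT-behind-a-firewall setup amounts to only a few iptables rules on a 2.4 kernel; a minimal sketch, assuming eth0 is the internet-facing interface:

  echo 1 > /proc/sys/net/ipv4/ip_forward                  # let the box forward packets
  iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE    # NAT everything going out eth0
  iptables -A FORWARD -i eth0 -m state --state ESTABLISHED,RELATED -j ACCEPT
  iptables -A FORWARD -i eth0 -j DROP                     # drop unsolicited inbound traffic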
“Allow me to quickly cover a few points that have been made:
Yes, I didn’t review a true Debian distribution. Regarding the apt-get points however, you can install such a system (apt-get, that is) on a number of Linux distributions, including Redhat, and I must admit that it does make maintaining a Linux installation much easier. Why Redhat and others don’t include it in addition to their RPM setups is beyond me”
I totally agree with this; as someone who has used almost every Linux in the top 20 or so at one time or another, this would be cool.
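For reference, apt on an RPM-based distro looks just like apt on Debian once it is set up; a quick sketch (this assumes the third-party apt-rpm port and a repository such as freshrpms have already been installed and configured):

  apt-get update                 # refresh the package lists from the repository
  apt-cache search ogg           # find packages by keyword
  apt-get install vorbis-tools   # install, pulling in any needed dependencies
  apt-get upgrade                # bring the whole system up to date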
One last thing: I would urge the author to try Gentoo some time. I’m not one of the Gentoo junkies out there, but I have been using it for a couple of months and I love it. “emerge” rules and is by far the best package management, IMHO. Anyway, it would certainly be worth a try when you’re after a new project.
Use Linux and all your virus problems go away.
What virus problems? You mean the ones I don’t have? It’s really funny how these zealots make up some imaginary problem you’re having (for example, constant blue screens, even though I’ve been running without a hitch since upgrading to Win2K back in 2001), and then suggest that you switch to Linux to get rid of an issue that doesn’t even exist. So to all of you pundits out there, don’t try to tell me how insecure or unstable my box is – I’m really not impressed.
One of my linux boxes was rooted a few years ago. I caught the intrusion less than a day after it happened and immediately isolated the box. Since then I have not put another poorly configured Linux box on the internet, and everything runs NATed behind a firewall.
If you have to NAT a Linux box and put it behind a firewall to keep from being hacked, what kind of security is that? Sure, it may be a little more secure than Windows, but the NATting and firewalling goes a long way for any OS.
That’s security and I doubt you can get that with M$ products. But don’t let me stop you from trying.
Well, so far, it looks like you have been ‘rooted’ once, so the score right now (assuming the one with the lowest points wins) is You: 1 Me: 0
I didn’t find this article very good. It was probably because it was mainly all release candidates or betas that were being compared. I think that was a “dumb” mistake. Also, some info I found either wrong or misleading; not much, but there was some…
I realize a review is someone’s experience with a particular product, and therefore I won’t comment on his experiences, although I definitely would have included more distros, or bigger ones: SuSE, Xandros, Lindows, Slackware, Debian, etc. Just my 2 cents.
am I the only reader of this article who noticed the extensive use of exclamation points throughout? It! reminded me! of! an episode! of Seinfeld!
lol, ur right. 🙂
Overall a good article, probably the fairest in this genre I’ve read so far. It’s only too bad you didn’t take the time to try out Gentoo, because judging from most of your comments it’s exactly what you’re looking for.
I know the installation process looks intimidating. I remember contemplating installing it several times, only to refrain because, judging from the installation documents, it seemed too hard and tedious a job. But the documentation is very well written and you do seem to have your share of experience, so it shouldn’t be too much of a problem. So…
Go rent a couple of videos, print out the necessary documentation and download the image. Do the basic setup, start the bootstrap script and go watch a film. Same goes for system and xfree; your system should be able to set this much up in 4 to 5 hours (depending on how fast you can configure a kernel). Run the KDE installation during the night, take your girlfriend (or your dog if you’re too geeky for a girlfriend) somewhere the next day and you’ll be set when you get back! If you need to install other stuff, run it in the background and give it a higher nice value; your system will be more than usable (see the command sketch after the list below). And the good stuff:
– no more dependency problems, really!
– as fast and bleeding edge as it gets
– all the software you’ll ever need in a single, constantly updated repository
– very, very tweakable
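Command-wise, the whole stretch described above boils down to something like this (paths and package names are from memory of the 1.4-era install guide, so check the current docs before copying anything):

  cd /usr/portage
  scripts/bootstrap.sh      # build the toolchain; go watch that film
  emerge system             # build the base system
  emerge xfree              # build X
  nice -n 19 emerge kde &   # start the big KDE build at low priority, overnight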
Also, try out our new thermal undies; they’re very comfy
He included Gentoo in his synopsis but it seems he bombed out of the installation. Don’t get me wrong, Gentoo is not easy to install, but if you know computers pretty well and are reasonably familiar with how software functions, installation is pretty straightforward. Gentoo’s package management system is awesome and makes every Gentoo user fall in love with the distro upon their first emerge. Gentoo is rocket fast… faster than Slack-based distros IMO (nothing against Slack, btw; Slack is great too). And new packages are available for Gentoo faster than for any other distro, bar none. Gentoo also has one of the widest, if not the widest, offerings of packages.
I’m not saying this should have changed his conclusion, but if you didn’t install and use Gentoo, then don’t comment on it and look like a dick; just say you decided not to bother with the long install and leave it at that.
How could you review Yoper, Ark, and Vector and leave out Lycoris? All three distros are inferior to Lycoris in most every way, especially solidity, support, ease of use and ease of installation. No excuse. Stop reviewing junk and start reviewing polished distros: Lycoris, Lindows, Libranet, Conectiva, and the final release of Mandrake 9.1.
Well, so far, it looks like you have been ‘rooted’ once, so the score right now (assuming the one with the lowest points wins) is You: 1 Me: 0
Well, since with Windows all processes run with system permissions, an argument could be made that you were rooted the first time you opened IE.
I’m really getting tired of the “I’ve run Win2K and my computer hasn’t been <bad_stuff>’ed yet so Win2K is <good_thing>.” Your personal experience doesn’t mean a whole lot. You have to look at studies and attitudes among people using these machines in the field to get a good idea of where things actually stand.
Honestly, a desktop isn’t a heavy load. Done right (i.e. don’t install any non-Microsoft software), even Win95 made a decently stable desktop OS. But a heavily loaded network server? That’s a different story entirely. The fact that UNIX machines can handle heavy loads and keep running for months at a time is why UNIX is so respected for its stability.
I was just wondering why you didn’t include Slackware 9.0 in your test systems, considering that the distro you picked (Vector) is Slackware-based.
No one’s picked up on how Yoper advertises here and what response they will have on reading the scathing review of Yoper’s service.
Are they going to stop advertising?
I am missing Debian in this list. It’s in my opinion the best and most stable Linux system out there. apt-get rewlez!
Believe me, I have tried several distributions as well, and I found that Gentoo was the last on my ever-continuing list. Last, but not least; nowhere near it, in fact.
My first distro was SuSE, then I tried RedHat, Mandrake, then Libranet and Debian. All of them seemed like they were missing something. After using apt, I could never go back to the RPM world, even though apt-rpm was available. Perhaps it was because of the outdated RPMs. Well, when it came to installing the NVidia drivers for Libranet/Debian, I would have to patch my kernel. Yay…
With Gentoo, it was as easy as:
emerge nvidia-kernel; emerge nvidia-glx;
I came back a while later, and automagically I had NVidia drivers! Whee! Now, just about everything is as easy as that. No more hunting down RPMs only to find that I was missing 14, 17, 20 packages. Apt was nice, and is nice, but it wasn’t enough for me. It was what addicted me to automagic. 🙂
Now, now, not to mock you or anything, but had you tried the Gentoo installation and read the METICULOUSLY explicit (although at some points annoying) installation guide, maybe you would’ve realized that it’s not as hard as you think. Plus, you end up knowing exactly where everything is. Although the time required is somewhat of a drawback, everything is compiled with your specific system in mind. Many things run faster, as you stated, when compiled. If patience is applied while using Gentoo, you will certainly be rewarded.
Also, the installation guide is meant to be read FIRST, not while installing. Often, notes, directions, and important information are included afterwards.
In my opinion, Gentoo is worth it.
I don’t recall Adobe apologising for the PC Preferred page – although it has since been removed…
However, the site that actually made the comparisons (www.digitalvideoediting.com) posted another article suggesting that After Effects wasn’t optimised for a dual OS X Mac (http://www.digitalvideoediting.com/2002/05_may/features/g4benchmark…). Surely this is basic computer common sense anyway – the speed of a program isn’t necessarily a reflection of hardware performance.
Also, does MacOS X not allow you to browse folders with a right mouse button click, as BeOS does? You could do this in OS 9.
Andrew
I can’t help but notice the overwhelming number of people bashing this article. I would have to say this is one of the most comprehensive and accurate reviews of OSes I have seen in a very long time. I say A+ to you, mate.
I do agree every article, including this one, could always include more info. In this case I might add a little more emphasis on multimedia apps; it seems Anthony doesn’t watch many movies, as I think most of us do. However, I stand firm saying this article was very well done.
In a comment above, someone said he found /opt quite useless. It indeed is, at the moment. But in my opinion it would be a good idea to use it more.
Use /bin and /lib just as usual, use /usr/lib for standard libraries like libpng, libxml and such, place X11 in /usr/bin and /usr/lib.
Then, put all extra applications like KDE, Gnome, Lyx/Tex, Mozilla, OpenOffice, Apache and such in their own /opt directory, maybe with, for example, /opt/app/lib for application-specific libraries.
In a network it then is no problem to share /opt in addition to /usr, making all applications available to the whole network.
And with this, all applications are in a logical place, can be installed by just copying them to /opt, without needing to worry about dependencies, which is great for “the desktop”, and you do not lose the power of the old system. For compiling programs yourself /usr/local still exists.
The article complains about automatic downloading interrupting his recording session and how Windows decides things for him, and yet he has Automatic Downloading turned on in the first place? That’s telling Windows to decide things for you.
The review is very comprehensively done and presented with balance. For a newbie wanting to transition to Linux, the advice contained in the summaries alone was worth the time spent reading the article.
Thank you for an excellent service to people like me.
🙂
The author claims he is looking for a “perfect” OS for his desktop, so let’s stick to OSes that promote themselves as such.
1) BeOS – fine, 2) MacOS X – sure, but in the Linux world it should be:
3) Xandros, Lycoris, Lindows, and a few others – and pretty much in that order.
4) SuSE does make a nice desktop offering of late, but Xandros still nudges it out for overall polish and simplification of the GUI. Also remember Xandros is “only” a 1.0… imagine what 1.1 or 1.2 will be!!
– The reader who claims that Windows security holes are found and plugged every day obviously has not lived through the stresses of the major virus outbreaks and web-app vulnerabilities that, once aired on TV/radio, make every NT/Windows sysadmin shudder, fall to their knees and scream: please Lord, not my site, spare me, pleeeaaase!
(remember SQL Server circa January 2003, 20,000 servers affected worldwide …)
There is no perfect desktop/server OS – just what is more tolerable to you. In my case I will live with a more limited array of supported hardware; I can’t stand the viruses, crashes and waste of time that I have lived with at home on M$.
Cheers,
S.
> BeOS is alive and well through YellowTAB’s Zeta product that comes out soon.
And it runs … what?
I used all of the non-beta versions of BeOS released by the folks at Be, Inc., and it’s a wonderful base, but if there isn’t any software, the OS isn’t useful.
I agree with most of Hicks’ comments about the merits of Vector. I have recently jumped the Redhat ship because of the increasing congestion of the demo update servers (and the exorbitant prices of a subscription). The first alternative Linux solution that I found was Vector 3.2. I was so impressed that I ordered a CD to support the cause.
I have become increasingly annoyed, however, by the challenges involved in adding new software to the system. Vector has no dependency checking and no convenient, automated way of installing packages (wget and the package tool were the best I could come up with). My taste was further soured by the fact that I have not received my CD over a month after ordering it.
My new favorite distro wasn’t reviewed in this article. It’s Alt Linux Junior. Alt blows vector out of the water with regards to security out of the box, hardware recognition, ease of installation, ease of use, and package management. The more you use it, the more surprised you are that you weren’t using Alt earlier. Check out this review at: http://www.virtualsky.net/altlinuxreview.htm and check out the distro at: http://www.altlinux.com/ . Additionally, the download and updates for Alt Junior are completely free.
Having used VectorLinux for the last 2 months, I’m glad to see such a positive review.
You’re absolutely right: it’s fast, it’s stable, it’s brilliant!
Well done, Robert S. Lange and the VectorLinux crew! You did make the perfect Linux distro!
Bruno
Amsterdam
Although the PowerToys give the illusion of multiple desktops à la *nix, MS Office apps falter by losing their menu bars consistently.
Nvidia has desktop management software that does the job better.
At the start the article mentions various peripherals, but it talks about none of them.
I agree that the MS virtual desktops are not great. With them I can have up to 4; right now I can have up to 32, and I could probably modify it myself to allow 1000s.
I am probably biased for Mandrake, but I think the reviewer missed a few key things.
The Mandrake Control Center is a good configuration tool, but the review failed to touch on it. I’m not saying that it’s some revolutionary thing, but it is definitely worth mentioning.
The integrated menu is pretty good; a problem with RedHat is that only GNOME apps show on the menu in GNOME and only KDE apps in KDE.
The process bloat can be fixed; in 9.1, at the end of the install there is a menu to disable/enable services.
I believe that Debian, Slackware and some other distros should have been evaluated, though other users have already said this.
One thing that constantly annoys me is the bias for KDE. I have nothing against KDE, though I myself use GNOME. My friend uses WindowMaker. Some of the distros don’t include GNOME or WindowMaker; for me or my friend this would be a hassle. Just because it works for some doesn’t mean it works for all.
The abnormal installation is more of an annoyance too.
And for fun, you didn’t deduct from Windows, Mac OS X, or BeOS for the abnormal KDE installation. (It’s actually possible on Windows (Cygwin) and MacOS X (BSD layer); not sure about BeOS, but someone could port the source.)