I can say this with 100% certainty. I’ve come to this conclusion simply by reviewing server administration times over the past three years. Admittedly it is not the most scientific approach, but it is all the information I need. Remember, experience is the best teacher. Editorial Notice: All opinions are those of the author and not necessarily those of osnews.com
I own a small business in southwest Florida. The business primarily performs IT administration for companies either too small to warrant the expense of an in house IT staff or companies that don’t feel they need a full time IT staff. There are literally thousands of companies that outsource their IT administration in Florida alone.
I have been working with networks since the early ’90s. Artisoft’s LANtastic v4.0 was my first NOS. I really liked LANtastic until Windows 95 came along. Windows 95 was easy to network (read: free), and the OS was new and very GUI (as far as OSes go). Windows NT 4.0 came out soon after, and I made the commitment to MS, selling well over $100,000 worth of software with the MS name on it.
Having 25 businesses all running on MS servers required a visit at least once a month to fix or patch something on the server (usually a reboot would do it). We also ran MS at our office and were used to resetting the server about every month to keep things humming along. No big deal, or so I thought.
My main system was a dual-boot station that ran both Windows 98SE and Windows 2000 Pro. My system was very “stable” (an error or lockup about once a week), but I had a practice of wiping the system every three months and reinstalling, just to keep it stable and fast. I used Norton Ghost for this and all seemed perfect, until I bought a wireless adapter. Then all hell broke loose. The offending unit was a wireless PCMCIA card and PCI adapter combo. The immature drivers caused both 98 and 2000 to lock hard before the desktop would load. After days of trying to make it work, it was disabled and sat there unused.
While shopping at Wal-Mart I spotted Mandrake Linux 8.1: $29.00 for the OS and a barrage of software. I took it home, popped it in and walked through a fairly simple setup. The software found the wireless card! I didn’t have to do anything except enter my SSID and address. This “new” Linux thing was cool.
I tried not to boot into Windows on my now triple-boot system. As time went on I found that Linux was the environment I preferred to work in. I had encountered no viruses, was never forced to reboot and only needed to purchase an accounting package to complete the transition. It took three months to make the switch to 100% Linux applications.
I started reading more about my new distribution and found it could work as a Windows file server using Samba, among other things. A $29.00 software package could do what a $1,000.00 package from MS (a 10-user Windows 2000 Server license) could do. I was interested, to say the least.
After learning the basics of Samba, it replaced Windows 2000 on the file server. I ran this for a few months and never had to reboot the server. This was amazing! Performance was equivalent to the MS Windows file server, but the stability was so much better. One of the most interesting aspects of setting up Samba (and just about anything under Linux) is the configuration files. Having files that can be viewed and manipulated in a text editor is so gratifying. A parameter can be modified and fully commented as to what was changed and why. Try that in Windows!
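To give a concrete (and entirely hypothetical) illustration of what that looks like, here is a minimal sketch of a commented Samba share in /etc/samba/smb.conf; the workgroup, path and group names are made up:

  # /etc/samba/smb.conf -- hypothetical example
  [global]
     workgroup = OFFICE
     # 2003-04-02: changed from WORKGROUP to match the existing NT domain name
     server string = File Server

  [shared]
     # company file share; moved from /home/shared after the disk upgrade
     path = /srv/shared
     read only = no
     valid users = @staff

Every change, and the reason for it, lives right next to the parameter it affects, and the whole file can be backed up or restored with a single copy.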
Having Samba and Mandrake running perfectly for months gave me the confidence to pitch Linux as an alternative to MS servers. The very next network integration saw Linux/Samba on the file server and Windows on the desktops. The only change was to Red Hat Linux, because the client hadn’t heard of Mandrake.
The install of Red Hat went smoothly, and it detected the Adaptec RAID controller with no further driver loading (I’ve set up so many Windows servers and never encountered one that didn’t require loading drivers). All shares were set up and no print queue was required (thankfully, since I wasn’t competent with Red Hat Linux printing yet!). A Belkin UPS was used because it came with Linux shutdown software. That particular server has never been down since it came online over 14 months ago.
Now almost all of my clients run Linux/Samba/Postfix servers. Not once have any of these servers gone down due to software errors. All of them have been up since installation, except for two units that suffered hardware failures, one that was taken down for a hardware upgrade, one on a faulty UPS, and one that was being rebooted periodically by an uninformed (and unauthorized) employee whenever the mail went down (ironically, that company was using an outsourced mail server).
Here is the rub: after switching all these servers to Linux, my company has lost income! We lost approximately $220.00 a month per Windows server in maintenance fees, approximately $5,000.00 a month in total! Linux was the best choice for my clients’ businesses, lowering the cost of IT administration considerably. Microsoft shouts about a study (which it funded) that says Windows has a lower TCO. That study is completely flawed, and I can prove it with one word, in capitals: VIRUSES. Windows is afflicted with over 82,000 viruses; Linux/Unix has around 30. And the few that affect Linux can’t infect the whole system unless you manually give them root access!
My conclusion is that Linux is a much better OS for businesses that value reliable servers. The uptime is incredibly high, the flexibility is awesome, the price is incredibly low (I have my clients purchase Mandrake’s full software package for $199.00), and the feature list that comes with Mandrake and most other distros is staggering. Linux and Open Source are the best choice for business.
Now on to the desktop!
About the Author:
My name is Alex Chejlyk. I own and operate a small business (10 years) that performs IT tasks for other small businesses in the area. I’ve been computing since the early ’80s. I started out with CP/M, then moved on to DOS, LANtastic, Windows 2.x, Apple, OS/2, Windows 3.x, Be, Windows NT/9x/2K/XP, Unix, and Linux. I’m currently working on a drag-and-drop function for CD-RWs in Linux (like packet writing – kinda…).
I’ve been comparing the lists of drivers for both Linux and FreeBSD, and there aren’t too many things that Linux has that FreeBSD doesn’t.
Two issues I personally think need to be addressed with FreeBSD at the moment regarding the ‘ready for the desktop’ thing are better hardware detection by the installer (not that ‘kldload snd’ is hard, but new users shouldn’t be expected to know it) and a working set of OpenOffice packages.
Don’t let anyone claiming that there are adequate OpenOffice packages for FreeBSD fool you; they (the packages, and quite possibly the people making the claim) are half-baked and flaky as all hell.
Apart from those two issues, I have been running FreeBSD on my desktop machine for about a year now with no real problems.
‘There are over 150 native Linux viruses currently.’
I’d appreciate a credible source or two.
Here’s some:
http://www.sophos.com/virusinfo/analyses/index_linux.html
http://www.sophos.com/virusinfo/analyses/index_linuxworm.html
http://securtyresponse.symantec.com/avcenter/venc/auto/index/indexL…..
Note that they are all identified as Linux.something, so I might have missed some. BTW, I want the beer!
http://linuxtoday.com/security/2001032800520SC
I took a look at the links you provided and none of them specifically stated GNU/Linux has been afflicted by 150 virii. In fact, the last link you provided contradicts that notion; I quote from it:
“David Millard, technical manager of Command Software (a separate anti-virus firm to Central Command), said there were fewer than 10 viruses that infect Linux systems, and he said the bug should be treated as a “proof of concept” rather than anything more serious.”
This further validates my point that viruses on *nixes have never really been an issue. I have never been infected by a virus throughout my *nix computing years. Sadly, this is not the case with Windows.
Beer, you said? Maybe some other time. 😉
Regards,
Mystilleef
Why is it ironic that the mail was outsourced? It’s certainly amusing, but doesn’t seem ironic to me.
Also, it is not necessarily true that Linux viruses have to be given root access. There are existing flaws in older versions of the software (and as-yet-undiscovered flaws) that allow an intruder to obtain root access. There are ways to guard against this, but if you leave your system unpatched then it can still be compromised.
Having said that, I’ve never experienced a Linux virus, but I have enjoyed its uptime – once the nvidia drivers became stable – and always on my server. The idea of rebooting to solve a problem is a foreign concept to me, so I get a bit confused when Windows users talk about rebooting their machine because it’s been running a while, or other weirdness.
‘Linux is far from being ready for desktop, but IMO, BSD is even farther.’
That depends on your definition of desktop-ready. I use GNU/Linux very comfortably as a desktop. In my opinion, that’s reason enough to stamp a ‘ready for the desktop’ seal on GNU/Linux.
Primary desktop usage.
1). Word processing
2). File editing
3). File management
4). Web browsing
5). Emailing
6). Instant messaging/chatting via ICQ or IRC
7). Multimedia
i). Listening to music
ii). Ripping/burning MP3s/Oggs to compact discs and vice versa
iii). Watching DVDs or video files on your file system
iv). Creating movies
v). Playing games
I bet anyone can do all of the above on GNU/Linux, and a ton more. Agreed, navigating your way around GNU/Linux is a different experience compared to Mac or Windows. But that’s the point: it’s neither of them. And I hope they all remain innovative and different.
Regards,
Mystilleef
You have a list of three viruses from Sophos, and none of them has a method of propagation between Linux machines. So basically that makes them concept viruses. Any fool can make a concept virus. If you emailed me a file containing one of those viruses, do you think I would get infected? Maybe you don’t understand the difference between Outlook and a normal email program.
As for the worms, yes, they exist. People should patch against them.
When people say viruses are a reason to switch from Windows, they are talking about the Outlook virus propagation engine. Linux does not have it.
Let’s see: I can have a Linux server running Red Hat up and running with Postfix, Apache, PostgreSQL, etc., in about one hour, with RAID. It takes the same amount of time to set up a Windows 2K server, and then getting SQL Server up and running properly takes another hour or so.
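For what it’s worth, here is a rough sketch of the boot-time part of that hour on a stock Red Hat box (assuming the packages are already installed from the distribution CDs; Red Hat ships sendmail by default, so Postfix may need installing first, and init script names can vary slightly between releases):

  # enable the services in the default runlevels and start them now
  chkconfig postfix on    && service postfix start
  chkconfig httpd on      && service httpd start
  chkconfig postgresql on && service postgresql start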
You forget that it takes time to learn new versions of Windows as well; if you think you can simply jump into Windows Server 2003 and have zero problems, you are sadly mistaken.
Now here is another example: I can install Firebird SQL in a matter of minutes and have working DBs that will do just about everything MS SQL Server will do within a couple of hours. Now consider the cost of MS SQL Server: if you buy per seat it’s around $150 per user, and the unlimited server license is around $5,000. Either way, the few hours I spent setting it up, even at $200 per hour, cost way less than MS SQL Server, and my clients don’t ever have to worry about buying more per-seat licenses. And if they need another server, no problem, no licenses 🙂
The whole thing about TCO is a bunch of crap.
I have knowledge of both Windows and Linux, and I would recommend Linux over M$ in tons of situations; it takes the same amount of time to set either one up.
Linux is not rocket science, and if you have people who can’t figure it out then something is wrong. Maybe you need to get better-qualified people 🙂
“Agree, and that’s sad, as the average user DON’T want to be a part of an elitist community… Geeks fightings can scare them.”
Never joined a club, have you?
Thanks, I had a look at the project homepage and it sounds interesting. I’ll definitely give it a try.
“There are over 150 native Linux viruses currently.”
I’ve heard smaller numbers. In any case, these are known viruses, but only a fraction of them have ever been seen in the wild.
As far as propagation is concerned, viruses may propagate over a Samba network, but only because of Windows apps, i.e. Linux apps will not propagate the viruses. I don’t know about MS Office macro viruses, but those don’t seem to be causing many problems these days. The biggies are mostly Outlook viruses (and MS SQL ones, IIRC).
So in fact viruses are a very good reason to switch to Linux, until of course enough people have switched that virus writers target Linux more than Windows… we’ll worry about that problem when we get there! 🙂
Active Directory. It makes it much easier to manage things.
Samba 3.0 supports Active Directory.
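For anyone curious what that looks like, here is a minimal sketch (assuming Kerberos is already configured for the domain, and using the made-up realm EXAMPLE.COM): the relevant smb.conf settings plus a one-time domain join.

  [global]
     workgroup = EXAMPLE
     # the AD realm; requires a matching /etc/krb5.conf
     realm = EXAMPLE.COM
     security = ads

  # join the machine to the domain (prompts for the domain admin password)
  net ads join -U Administrator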
Working for an outsourcing company, most of our clients are Windows shops. Properly set up, monitored and left alone, even NT4 servers can be stable. A number of NT4 file & print / Exchange / SQL servers at clients have been up for over a year. However, these are usually ‘edge’ servers at branches that don’t experience a particularly dynamic environment (i.e. they get set up and left alone, because their requirements don’t change).
Most of the Linux servers I use are back-end servers for monitoring, webpage serving, etc., and they are typically very reliable and better able to support change without affecting the entire server (e.g. adding a virtual interface on Linux is a two-second job, as the one-liner below shows, whereas NT4 requires a reboot). I’ve never used Linux as a front-end Samba server, as providing the file/directory ACLs we generally need isn’t easily possible without using XFS. Also, being a mainly Windows shop, the politics of trying to make this happen isn’t worth it.
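For the curious, that two-second job is literally one command (the address here is made up):

  # add a second IP address as an alias on eth0 -- no reboot, no service restart
  ifconfig eth0:0 192.168.1.50 netmask 255.255.255.0 up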
A GUI front-end to an application, be it on Windows or Linux (or whatever else), usually makes for a more intuitive setup/configuration, especially if you aren’t particularly familiar with THAT application. A text-based configuration requires you to do more research and learn more about how to set it up and the implications of changes (a good thing, IMHO). Once you have that knowledge, it’s usually quicker to set it up on subsequent servers than it would be with a GUI.
Typically, I find that open source apps are usually less feature-rich, BUT the features that ARE available usually WORK, without serious problems and without the little annoying bugs that seem to continually crop up in closed-source s/w.
As a note, experience-wise, I’ve been in the industry for about 12 years now and started with DOS 3.x and Windows/286 etc. a few years before that. I’ve worked with various flavours of Unix, done a LOT of Netware 2.x/3.x/4.x, worked with NT since v3.51 and have played with Linux off and on since Debian Hamm.
At home I run a Gentoo server and a dual-boot XP/Gentoo workstation. I’m typically booted into Gentoo, whereas the better half and flatmate typically use XP, although there was a comment the other day from the better half about how much faster Linux (Konqueror) seemed when she needed to do some web browsing.
I enjoy tinkering and find that Gentoo is a great environment for this. I do find that I end up needing to access the forums and docs a lot when setting up a Gentoo workstation or server from scratch, as I don’t do enough of it to imprint the commands/syntax in memory, whereas setting up XP is pretty much follow-your-nose.
Basically, I find that most things work fairly well, and some things are better suited to one OS than another. I leave the religious arguments for others. I just use what is most suitable for the task at hand, or what the customer wants.
When my server was running Windows XP Pro, I could copy large MPEG files to it at 3.5 megs per second on a 10Mb LAN. The client was also Windows XP Pro.
When I changed my server to Debian, I was able to sustain 4.5 megs per second for hours.
When I changed my server from Debian to FreeBSD, I was able to sustain 8.5 megabytes per second.
When I changed my client to FreeBSD, I am now able to sustain 11 megabytes per second.
Changing from XP to FreeBSD made my systems about 3x faster. That works for me.
🙂
You are able to sustain 11 megabytes per second on a 10Mb LAN?? Even assuming you mean 11 megabits per second, it seems quite impossible. Are you sure you didn’t upgrade to a 100Mb LAN somewhere in the process?
Aren’t (almost) all network cards 100 Mbit nowadays?
I asked in the osnews forum a week ago how to auto-boot into a user account and auto-start proftpd – I didn’t receive an answer.
Auto-login is rather easy if you use X+KDM; you can even do it with point-and-click: Settings -> System -> Login Manager -> Ease(?) -> activate auto-login and select the user.
I don’t know the English translation, as my KDE system appears entirely in Dutch.
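For anyone without the GUI handy, the same setting can be made by editing KDM’s config file directly. This is only a sketch: the file lives in different places depending on the distro (often /etc/kde3/kdm/kdmrc or $KDEDIR/share/config/kdm/kdmrc), the section name may differ on older KDE releases, and ‘alex’ is a placeholder user.

  [X-:0-Core]
  AutoLoginEnable=true
  AutoLoginUser=alex

As for auto-starting proftpd, that is normally handled by the distro’s init scripts rather than by the user session (on Red Hat or Mandrake something like ‘chkconfig proftpd on’, assuming the package installed an init script).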
And I am so happy Linux and BSD have a text-mode console for simple administrative tasks. For example, if I want to add a user with Samba support, I can log in quickly, do a “useradd -m user” and “smbpasswd -a user”, fill in a password, and within ten seconds it’s done.
In Windows I need to log in as Administrator, wait for all those stupid services like QuickTime, the virus scanner, the WinZip tray icon and MSN Messenger to appear (as they are all-user), open the MS Management Console, dive my way into it, right-click, Add User, fill in the properties, OK, Close, dive my way into “Documents and Settings” to find his documents, Right-click -> Properties -> Sharing, enable, OK and log out.
But wait, the login step is not needed; you just need to know the trick. You know there is an option “Run as another user”? Well, you probably know this option won’t work if you create a shortcut to the Control Panel or to c:\. But what is possible is to make a shortcut to “c:\program files\internet explorer\iexplore c:\”, then enable the option, and when you launch the shortcut and provide the password you can browse files (and use the Control Panel) as Administrator. Gee, isn’t that user-friendly?
When I use the scp command, it displays the speed it is transmitting at. It oscillates between 10.5 and 11.
Both NICs and my (Netgear) hub are advertised as 10Mbit, but I suspect they are capable of a little more. I do have cables rated at 100Mbit.
When I first saw the speed exceed 10, I thought FreeBSD must be pushing the hardware to its uppermost limit. The time to copy a 700MB MPEG file went from 3 minutes to 1 minute when I changed from Windows to FreeBSD.
As a test, I just copied 3 of my DVD movies, and this is how FreeBSD performed (I don’t know how to paste yet, so I’ll recreate scp’s output):
12_angry_men.divx 100% 598MB 10.8MB/s 00:55
Ali.divx 100% 964MB 10.6MB/s 01:30
Beautiful_Mind 100% 825MB 10.6MB/s 01:17
There used to be a 4.5 where there is now a 10.8. The amount of data copied increments by about 10 megabytes per second, so I assumed MB/s meant “megabytes per second”. A (roughly) 600-meg file in (roughly) 60 seconds would be 10 megabytes per second, too.
10.8MBps (or even 10Mbps) is not possible on a 10Mb LAN. Best case, you will see 8Mbps, or 1MBps, on a 10Mb segment. 10.8MBps is barely obtainable even on a full-duplex 100Mb LAN.
Yeah, you are either wrong or lying. In the “real” world it is impossible to get 10 out of a 10Mb LAN. There is some analysis that will help you out with this in various networking books.
Yeah, something is awry here.
Dude, you don’t even know how to paste, so I doubt you know what you’re talking about. On top of this, scp is extra slow, with lots of overhead.
One interesting point, and I don’t know if this has been raised before (I’m too lazy to read all 100 comments), is that being the cheapest can often be a disadvantage in the crazy world of managers and corporates.
These middle-manager types often want to be in control of as much money as possible, and hence pick the big expensive MS SQL Server (or Oracle) when something smaller and simpler would actually be better.
Hi,
I agree with the article – I switched over (not to Linux but to its older ‘cousin’ FreeBSD) from Windows 95 for stability reasons. Writing code (with Borland C Builder) and testing the applications caused the system to crash constantly, and I really got fed up. FreeBSD is stable as a rock – I have never had a kernel panic (in 5 years).
At work I see a lot of people who illegally use MS Windows software because they don’t want to pay the price. When I point out that they should either pay for it or look for cheaper alternatives, they don’t have an answer. Basically, Microsoft shoots itself in the foot by asking such absurd prices for its products. Sooner or later everybody will do this. I am glad to see that Open Source is actively eating away at the Microsoft monopoly while not suffering from the same things as Microsoft (viruses, instability and piracy).
Re: FTP server software for Linux. I use proftpd; it’s pretty much one of the best, if not the best, out there.
Re: BSD… yes, I too am growing tired of these guys. BSD is a great OS, no doubt, but look at its dev pace over the last 30 years and compare that with Linux’s pace year to year. Linux is just blowing straight past BSD and is already further ahead in many respects. When the 2.6 kernel is out and stable there will really be no comparison. When you consider that they are both free, and that Linux has much larger industry backing, much faster development, and much better hardware support, Linux is the obvious choice.
Re: Novell? I agree, Novell eDirectory is pretty much a work of art for large enterprises. Combine that with your choice of OS (eDir runs on most anything) and ZENworks and you’ve got a really sweet setup… and yes, despite its declining market share, NetWare 6.x does kick serious bootea
Here I’ve got a monster RS/6000 running AIX for our proprietary credit union software, several Windows 2000 servers for proprietary Windows-only apps, several NetWare 6 servers for eDirectory, file, print, and non-proprietary apps, and Linux servers for DHCP, DNS, web, FTP, email and all firewalling and intrusion detection. All of the servers are stable and about 99% crash-free. However, a lot of time is spent patching, updating virus definitions, and doing the resulting reboots on all the Windows 2000 boxen.
Mine isn’t so funny, but I can’t seem to get an uptime longer than 45 days on my cheap Linux fileserver at work. Damned removable IDE drives. Nice having 300+ GB available to store all those Red Hat beta CDs.
I recently rebuilt my fileserver at home: 300 GB software RAID 5. First I moved all the disks and IDE controllers to a new system. It booted up without needing any additional software or configuration, mounted the RAID even though all the disks were on different controllers than before, and proceeded to boot to multiuser mode and configure itself for my LAN. But since I was only running on 3 of the 4 drives, I pulled the disks out and replaced ’em with new 200GB drives.
Unfortunately the kernel has not been going through proper QA and regression tests. They seem to have messed up the Promise IDE controller driver somewhere between 2.4.7 and 2.4.10. The driver just spat out error messages on boot; it did detect the disks, but then it could only write to them, no read access.
So I swapped out the controllers for some siimage chipsets. Only now Linux wanted to boot the PCI controllers before the onboard chipset. Great job Marcello, sheesh.
Anyway, I eventually got through the mess with an ide=reverse kernel option and now have a 500+ GB encrypted RAID5 array that is independent of the hardware and the current system configuration it sits on.
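In case it saves anyone else the head-scratching: the option just goes on the kernel command line. With LILO that is an append line in /etc/lilo.conf (a sketch only; the kernel image, label and root device here are made up), followed by re-running /sbin/lilo. With GRUB you would simply add ide=reverse to the kernel line.

  image=/boot/vmlinuz-2.4.20
      label=linux
      root=/dev/md0
      # make the onboard IDE chipset come before the PCI controllers again
      append="ide=reverse"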
External SCSI might be nice, but there’s no way you could do that for under $1000. That’s the TCO I’m talkin about.
Nice setup hmmm
My Linux SMB fileserver is pretty nice (I think), though nothing compared to yours.
My uptime is currently at 130 days on the 2.4.20 kernel.
BTW, for the person who is looking for an FTPd, try vsFTPd (Very Secure FTPd); it’s supposed to be the most secure, and the fastest (just look at the sites running it: http://ftp.redhat.com, http://ftp.debian.org, http://ftp.suse.com and http://ftp.openbsd.org). That’s quite an impressive list.
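A bare-bones /etc/vsftpd.conf for local-user logins looks something like this (a sketch only; depending on the version and distro, vsftpd may be run from xinetd instead, in which case drop the listen line):

  # run standalone, allow local users to log in and upload, no anonymous access
  listen=YES
  anonymous_enable=NO
  local_enable=YES
  write_enable=YES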
Both systems have similar functionality (actually Linux has more) and about an equivalent security track record.
Not freaking likely.
Ever heard of OpenBSD? I think its security track record completely blows any Linux distribution’s out of the water… and I am correct in thinking that.
“…stated GNU/Linux has been afflicted by 150 virii.”
The plural of virus is not virii. Actually, there isn’t a plural…
The closest thing is viruses, so use that.
@Kevin
‘Viri’ is the Latin plural of the Latin word virus. In English, ‘viruses’ is the correct form. In my opinion both are valid.
Smartass 😉
” ‘Viri’ is the Latin plural of the Latin word virus. In English, ‘viruses’ is the correct form. In my opinion both are valid.
Smartass 😉 ”
‘Viri’ is the plural of the Latin word ‘vir’, which means man. So when you say ‘viri’ you are really saying ‘men’. There is no correct Latin plural for virus.
Vire, viri, virum, viros, virora, and virii (which isn’t even a word) are all incorrect. Because there is no Latin plural of virus, it’s best to use the English form, viruses.
Consulting my Latin dictionary, the plural of virus is viri, and viri is also given as the plural of vir.
Strange thing…
Your Latin dictionary is wrong. ‘Vir’ means man, thus ‘viri’ means men. I don’t know what kind of Latin dictionary you are using, but it is incorrect. Would you like me to send you one of mine?
Or, to save time, check out this page:
http://www.perl.com/language/misc/virus.html
“Right now when you try to do certain common tasks in Linux such as Adjust the Date/Time it asks you to sign in as root.”
“Linux keeps better time than Windows, so why would they even need to change it?”
My system clock is adjusted against a secondary time standard every day by cron through the rdate command, so I don’t need to become root for this. I have the habit of typing ‘exit’ as soon as I’m done whenever I do need to ‘su’ to root. I suppose it is harder to do these things under Windows.
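The crontab entry for that is a one-liner; this is just a sketch (the time server name is an example, and rdate needs a host that still answers on the old time port):

  # /etc/crontab -- set the clock from a time server once a day at 04:00
  0 4 * * *   root    /usr/bin/rdate -s time.nist.gov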
“1. Change in location, may need to adjust the time zone…”
Done in .bashrc, or e.g. via the KDE and GNOME setup tools, on a per-user basis. Just think of a machine living in time zone A being logged into by a user living in time zone B. What you set in the system is the system (default) time zone.
“2. System time may not be correct… ”
NTP: xntpd, or ntpdate run either on dial-up or from cron.
“3. Individual may want to adjust the appearance of the date && time.”
Locales, via KDE or GNOME setup, on a per-user basis.
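Concretely, the per-user part is just a couple of environment variables in ~/.bashrc (the values here are examples):

  # show times in my own time zone and date format, whatever the system default is
  export TZ="Europe/Amsterdam"
  export LC_TIME="nl_NL"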
Sorry, but the idea that users need root for most simple desktop tasks is nonsense.
Use sudo – users can start internet connections, change the time, add a new printer, etc. by entering their own password.
Easy; problem solved.
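A sketch of what that looks like in /etc/sudoers (edit it with visudo; the group name and command paths are examples and vary by distro):

  # members of the "users" group may run these commands as root,
  # authenticating with their own password (sudo's default behaviour)
  %users  ALL = /usr/sbin/ntpdate, /sbin/ifup, /sbin/ifdown, /usr/sbin/lpadmin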
I administer 50+ W2K and NT servers plus about 100 W2K and XP workstations (web development group), and unless you have IIS and many other services completely disabled you are asking for trouble by not installing the required hotfixes and updates. Installing these updates usually requires rebooting. If your server has 1 year of uptime, either it is 100% locked down (which is questionable anyway with Windows) or you are not doing required maintenance (Slammer, anyone?).
I have just spent a solid week rebuilding systems and installing a very expensive anti-virus software package (good for the economy!) to combat a worm that spread via legitimate file shares. Now that this is mostly done I at least feel better about things, but there goes one week of my life.
Whoever thinks that administering Windows is a: simple b: not time-consuming or c: a task that doesn’t need “expensive experts” is smoking some really good stuff.
1. Change in location, may need to adjust the time zone…
You set your system clock to GMT, then you use the locale to display it according to your current time zone. Changing the locale doesn’t require a root login.
2. System time may not be correct…
This could happen, but if you use a time server to get the correct time (most likely from some atomic clock), an incorrect time setting would be very unlikely.
3. Individual may want to adjust the appearance of the date && time.
This is done through environment variables on a per-user basis and doesn’t require a root login.
You mention that the cost of training someone properly (in either environment) is dwarfed by the cost of the employee over the length of his service.
Actually, this isn’t quite true. If you learned Unix 20 years ago, you can probably still use that knowledge as a Linux admin. If you learned Windows 15 years ago, you will not have much use for that knowledge administering the Windows XP of today.
If you run Windows systems, major changes in administration occur with every new version Microsoft releases.
As a result you can expect Linux training costs to be much lower over time.
At least later versions of Red Hat contain GUI configuration for FTP. I would guess that most other distros do as well.
> This particular server has never been down since it came online over 14 months ago.
Yikes; I hope that server, and the others he mentions with long uptimes, are behind *good* firewalls. There have been several kernel security patches that are a good reason to reboot a system.