Steve Mallett published Part II of W. McDonald Buck’s essay on Linux TCO. In it he looks at the scenario of a company having already moved to Linux in the server room and also to OSS on Windows desktops, but “…now wants to know, how much extra can be saved by the final step of changing the operating system itself? And, what are the other costs, risks and benefits of doing that. To keep the scenario simple, we’re assuming too that this will be done at a time when the desktop equipment is also being replaced. The news is good, but not as good as we like to believe.” Here’s Part I.
But whenever someone complains about Linux, it’s because they want it to improve. That’s good. Indifference is bad (which is what we usually show Microsoft, lol).
Anyway, after reading the Martin Taylor interview, this article had better be good.
Let’s see: you buy a cheap PC with no OS, then download Linux, OpenOffice and other software for FREE.
The initial cost: the cost of the cheap PC.
Like my pappy used to say: “he’s out of it”
The article is about the corporate desktop - don’t you get it?
The main problem with Windows (from a support point of view) seems to be printers.
Questions like “why can’t I print,” “where’s my document,” and the like make up a large chunk of the helpdesk calls. That, and “it’s slow or hung.”
Moving to Linux will make printing worse, not better.
That’s fun… And you got it partially right, from my experience.
At work, someone redirected LPT1 to a network printer while he still had a local printer attached to his computer. Both worked just fine despite the redirection. The local printer was an HP. Then they swapped the local printer for a Lexmark (it was difficult to remember this one), and after that only one of the two would work at a time, or something like that (on Windows… users pissed off.)
Another one was a report in landscape that used almost all the paper. Sometimes it would print fine and other times it wouldn’t. It’s still a pain. The application was developed in Delphi with QuickReport (on Windows… users pissed off.)
Can Linux be worse than that, aside from unsupported drivers?
Moving to Linux will make printing worse, not better.
Yep, setting up printers in the Unix world isn’t as easy as it is in the Windows world. Not to mention the lack of full-featured drivers and the like.
I figure for every job there’s a tool. Linux so far isn’t the one-size-fits-all tool. It’s fine in the server room. It’s fine on the desks of techies. At this point, I doubt we’ll (we as in the company I work for) ever switch to Linux on the desktop.
The comments were fairly insightful as well. It’s like…what slashdot dreams of being…
*adds osdir to his bookmarks*
Many who argue this point in the Linux camp are of the type who will custom-order parts and build their own box, thus cutting hardware costs in addition to the savings open source software provides. The problem with this is that most companies will not want hundreds of desktop computers custom-built. They want to be able to deal with one company for support, whether it be Dell, HP or IBM. They want a standard hardware solution that they can order and deploy.
We, who support Linux, could argue uptime and reliability and ease of use all day long and it would mean nothing. Companies will want an easy corporate solution that will work for them.
Buying a total system from a vendor is an easier solution.
“Linux is only free if your time is worthless.”
Windows is never free. If your time is worth anything, it’s even more expensive.
Hey, isn’t making broad, unsubstantiated statements fun?
The conclusion of the article is that there’s still money to be saved – and that in itself is a positive thing. I wonder if all the anti-Linux posters will still say that this guy is right and that Linux enthusiasts just refuse to see the truth…
“Linux is only free if your time is worthless.”
Windows is never free. If your time is worth anything, it’s even more expensive.
———————–
Ever have to investigate getting a REALLY tricky malware program off a windows box? Hooeee.
“Linux is only free if your time is worthless.”
By AC (IP: —.dsl.pltn13.pacbell.net)
____________________________________________________________
Yes, that saying is old indeed, and very silly.
By Anonymous Penguin (IP: —.pool8249.interbusiness.it)
____________________________________________________________
To an extent I have to agree with both posters. Perhaps when distros were a tad younger, you had to jump through some hoops to get certain tools working. Now, in the last couple of years, tools, applications and so forth are getting configured automagically.
So, to AC, no, this is no longer relevant.
To Anonymous Penguin, I have to agree.
And my own personal response (on commercial software):
1) How much time do IT shops spend removing spyware from PCs?
2) How much time is spent on forced upgrades of office suites and OSes?
I guess the time of an IT professional is worth nothing.
“Linux is only free if your time is worthless.”
So where does that put you as an administrator if you spend most of your time cleaning adware/spyware and viruses off Windows machines?
Does it increase the value of Windows because you spend more time keeping them going?
Basically, it comes down to this: no corporate desktop is going to be free. Even if the software is totally free, the admin’s time certainly is not, but it does help if the system in place is more secure.
So where does that put you as an administrator if you spend most of your time cleaning adware/spyware and viruses off Windows machines?
Why are you letting it *in* in the first place?
Why are you letting it *in* in the first place?
That still requires an investment of time and energy, just not as much. You either pony up the time to keep it out, or you pony up the time to clean it up.
Admin costs aren’t such a bad thing. But downtime and data loss are. Companies are learning this now, finally.
That still requires an investment of time and energy, just not as much. You either pony up the time to keep it out, or you pony up the time to clean it up.
Main differences being:
a) You only pony up the prevention time once
b) You should be doing it *anyway* to keep your desktops secure and maintainable (regardless of platform)
… nice, but generally irrelevant.
The purpose of having computers (at a business) is not just “owning” them per se, but to meet business demands.
Whichever OS is better at that is the winner. (Note: it is not the OS that is useful, but the applications.)
As for the costs, well, they are the costs of doing business. Any manager, of course, would like lower costs, but… it is really the last question.
One thing I’ve found endlessly funny is how all these arguments play into people’s thinking. If you spent $500,000, just remember: you spent more… than if you spent $200,000.
Where I work, we buy disks that are so EXPENSIVE that we cannot afford to purchase them anymore… we buy them because they are so reliable that we are SURE we are saving money.
Do you really save money when the disks are full and people have to stop using them? Do you really save on TCO when it’s so expensive you can’t afford them?
But this is how businesspeople think… they are so sold on the logic. If you just bought the stupid Xserve at 1/10th the price and backed it up every once in a while against that failure that might occur every 5 years… you’d not only spend less, you could afford enough storage to actually use the thing.
It’s madness, I tell you. Before you realize it’s madness, though, I guess you have to run out of money.
But believe me, if it’s madness when you finally run out of money, it was probably madness before. You shouldn’t waste money just because you aren’t currently having financial problems… but companies do it every time. Without hard times… Linux isn’t considered.
Businesspeople…BAH….STUPID STUPID STUPID.
You only pony up the prevention time once
If you want to lock down the network security to the point that you might as well not even have a network, that’s certainly true. However, part of prevention is staying up-to-date on security patches, and that is an ongoing task.
“Ever have to investigate getting a REALLY tricky malware program off a windows box? Hooeee.”
I work on a campus that is a closed MicroSilly shop. One of the student workstudy machines, an XtraP box, was so mucked up that they finally called me in to fix it.
Three hours later it was back to its normal running state except no pop-ups, porn links, viruses, etc. This box was rebuilt only two months ago by the administrative IT people.
Amazing how much time is “wasted” by us *nix heads to repair the Win platform. Oh well. I get paid to do it.
“Ever have to investigate getting a REALLY tricky malware program off a windows box? Hooeee.”
Nope. My work is a Windows office (400 users).
1: We don’t let users run with elevated privileges. Simple fix.
2: We only let them out through our proxy server. It doesn’t restrict everything, but it cleans out a lot of rubbish. Simple fix.
3: We package applications, and users can only install software we push out to them. Needs a server and some time and money invested.
4: If a machine breaks, we unplug the old machine and give them a new one (10 minutes). If it’s a software error, we rebuild it with our automated SOE build (20 minutes). Needs hard drive space, plus a new build created for each new desktop type we get in, which takes about 4 hours per type.
A university campus is probably not as worried about uptime as a bank, so they don’t invest the time or money in an equivalent setup.
As for the point above about spending money on expensive hardware: if you need uptime for servers, you pay for it. I presume your business should be working out which is cheaper, downtime for half a day or a server that won’t go down (within reason). That $500,000 over 5 years is tax-deductible and depreciates, so it’s not just $500,000. If our business misses deadlines, we get sued. We’ve just spent AU$30,000 on upgrading a server for 40 users, because if they’d had this upgrade last year they would have saved more by not missing deadlines.
However, part of prevention is staying up-to-date on security patches, and that is an ongoing task.
It’s an automatable ongoing task. And, again, one that you should be performing regardless of platform.
Bottom line.
Linux is free. It’s really, swear to god, free.
That is… for me… it’s free.
If you pay me lots of money, I’ll make Linux free for you too. (And that would be a lot less than you pay Bill.)
Plain and simple: if you don’t know it, it will cost you… big time.
The real key is not TCO.
There has to be an overwhelming reason for companies to move over to linux.
There is no single feature, app or reason for them to do that right now. It’s not just TCO, people. The hassle and adjustment of moving to a new platform alone is enough to keep most folks away. I lived through the move to Win 95 back in the day. This alone is significant and cannot be discounted.
Corporate America will not and probably should not switch over entirely from Windows. However, there is a very viable niche market that few Linux distros have even bothered to pursue.
Say you run a shop where you program for a Unix platform, or maybe you have a large group of systems and network engineers that spend all day on the Unix command line.
They log onto their Windows NT machines and proceed to spend 90 percent of their day stuck inside an Exceed session or at a telnet or SSH client command line. Maybe the situation is even worse where you work, with corporate IT paying for a group of engineers or developers to have both a Windows NT machine and a Unix workstation, running two boxes out of the same cube.
If there were no market for these people, Exceed and other companies doing the same thing would go out of business. In addition, commercial Unix manufacturers would stop making workstation configurations altogether. They have not.
The real market for the Linux desktop in corporate America is these people stuck spending 90 percent of their day in Unix on top of Windows, or with two machines.
So, what is the solution?
A fast Intel box running Linux (SuSE is a great end-user distro) with KDE 3.0 (a really sweet desktop environment) and CrossOver Office for running Winword when the project manager sends them that complicated template with embedded graphs and images that even StarOffice chokes on.
The real key for Linux distro makers is to stop thinking of the operating system as a commercial Unix product or, worse, as an end-user, Windows-style product.
Mac OS X, in its now-BSD glory, gives Linux a clue as to where its head should be in viewing itself:
* Base layer–kernel, file system, and command line innards.
* Compatibility layer–all the Wine tools needed to run MS-Office or other programs right out of the box.
* Interface layer–an integrated desktop environment where everything, including all system tools, has the same look and feel. (SuSE almost has this right, with even its YaST2 tools available from the KDE Control Panel.)
When Linux companies stop thinking of the OS as simply the base layer, with all the rest of the pieces being mere add-ons for the adventurous, then and only then will there be a true Linux desktop choice for the masses.
Until then, it would benefit many companies that make extensive use of Unix in their IT environment to start looking at Linux for their Unix developers and sysadmins tired of living in two conflicting worlds.
I have been using Linux for about 4 years and Windows for about 5.
Which one of them can I get work done faster on? Windows
I know, I know, before you point out all the great features of the command line: I use the command line for ~50% of my work.
But nothing stops me from ssh’ing from my Windows desktop to a Linux system or jump box.
All the best parts of Linux are on the command line, and I can have that and still keep my Windows GUI.
By Jim (IP: —.bflony.adelphia.net) – Posted on 2005-02-16 06:43:36
>All the best parts of Linux are on the command line, and I can have that and still keep my Windows GUI.
Still, that is you. I wrote a whole spiel in my Slashdot journal about the things I missed about Linux when forced to use a Windows 2000 and now an XP laptop.
I finally got so tired of it I put the blastwave.org packages on this old Ultra 5 that had 256MB and added a bigger disk.
So much for the “can’t get real work done in Linux” spiel.
You see, not all people switched because they disliked MS or the costs.
Some of us switched not just because we were learning Unix.
Some of us switched and started using Linux as our desktop because we simply did NOT like the Windows GUI and way of doing things.
Am I saying Linux or Solaris or any *nix is perfect? Oh hell, not at all.
I just get work done, from scripting to sysadmin work to using Nautilus scripts from my desktop, and I realized, as I sat at my XP laptop, that I was using many of the same apps in Windows as I did in Linux. Why? Because I like Firefox, Gaim, Gnumeric, AbiWord (a go-between for Notepad and Word; realize its limits, thank you) and GIMP.
But I still miss my Linux desktop. I miss all the programs I use in Linux that are not available for Windows. I like the clean, simple look and feel of GNOME. Even on my Solaris workstation, I miss the simple, nice look of Red Hat’s System Settings config tools. Oh yeah, I miss the command line, and the *nix command line tools on Windows are NOT the same. But I miss all of it.
Isn’t the best part of an OSS desktop its stability? I mean, set it and forget it. I’m a FreeBSD person, but that doesn’t matter much, does it? So basically you set permissions for your average office worker so that they don’t screw with the settings on their system and can’t install any software either (because they generally don’t know how), so all they have is… OpenOffice, an email client, a web browser, and ICQ if they need it. So it’s really a time-saver! You won’t get problems appearing out of nowhere, as in the Windows world. Same with servers – just set them up *properly* the first time and all you’ll have to do afterwards is monitor their status. But if someone doesn’t know Linux or FreeBSD well enough, then the TCO may be higher – but the same thing goes for Windows.
1) How much time do IT shops spend removing spyware from PCs?
2) How much time is spent on forced upgrades of office suites and OSes?
A lot. Some of the IT staff at work told me what it’s like to run Windows in this corporate environment. It made running Linux sound like heaven, even to me, who doesn’t know much. They’re saying the same in my city’s IT department (Bergen, Norway), which recently switched.
In fact, with Windows, you have to upgrade all of the apps on each machine manually. At least all the non-MS ones. There’s no “world update” like in Debian, Gentoo and others. This causes MUCH extra work for the admins.
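To make that concrete, here is a minimal sketch (in Python, just wrapping the standard apt-get commands; it assumes a Debian-style box and root privileges) of what a “world update” amounts to: one mechanism that refreshes the package lists and then upgrades the OS and every installed application in a single pass.

#!/usr/bin/env python
"""Rough sketch of a Debian-style 'world update'.

Just a thin wrapper around the stock apt-get commands; run it as root.
The point is that a single mechanism upgrades the OS and all apps at once.
"""
import subprocess

def world_update():
    # Refresh the package lists from the configured repositories.
    subprocess.run(["apt-get", "update"], check=True)
    # Upgrade every installed package -- OS components and applications alike.
    subprocess.run(["apt-get", "-y", "upgrade"], check=True)

if __name__ == "__main__":
    world_update()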
“Moving to Linux will make printing worse, not better”
Maybe that’s not such a bad thing. In my experience (real world), people print WAY too much. Things that could easily be stored as electronic documents are printed, hole-punched, and stowed in a binder. Then they are never looked at again.
With tools like Google Desktop Search, storing documents electronically on your HD is much more efficient. You just ask GDS “what documents mention XYZ”, and BAM! you’ve got them, in context. No more remembering WHERE your documents are, let alone what each of them contains…
> Let’s see: you buy a cheap PC with no OS, then download Linux,
> OpenOffice and other software for FREE.
> The initial cost: the cost of the cheap PC.
I did exactly that, and Linux refused to work with my hardware. When I once brought that up for discussion right here in this forum, the answer was that Linux does NOT just work with any hardware, and that I should be careful to buy Linux-compatible hardware next time. So simply buying a “cheap pc” as you suggested is going to cause you a lot of headaches.
>Printing problems…
>By Kick The Donkey (IP: —.aic.bls.com) – Posted on 2005-02-16 13:08:43
>”Moving to Linux will make printing worse, not better”
Really?
With CUPS I have found printing to be very easy, almost a brain-dead operation.
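For what it’s worth, here is a minimal sketch (Python wrapping the standard CUPS command-line tools; the queue name and file path are made-up placeholders) of how little it takes to list queues and submit a job once CUPS is running:

#!/usr/bin/env python
"""Minimal sketch of driving CUPS from a script.

Uses the standard CUPS CLI tools (lpstat/lp); the queue name and the file
below are hypothetical placeholders.
"""
import subprocess

def list_queues():
    # 'lpstat -p' lists every printer queue CUPS knows about.
    return subprocess.run(["lpstat", "-p"], capture_output=True, text=True).stdout

def print_file(queue, path):
    # 'lp -d <queue> <file>' submits the file to the named queue.
    subprocess.run(["lp", "-d", queue, path], check=True)

if __name__ == "__main__":
    print(list_queues())
    print_file("office_laser", "/tmp/report.pdf")  # hypothetical queue and file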
Ok, last place I worked was a software development division for a large government/security contractor.
We did Unix (Solaris/Linux) development.
So my boss says we are going to save money, dump Exceed, stop buying one-off Ultra 5s, and just have every engineer in our group use Linux.
I thought to myself: we just nearly doubled our supported base of boxes and opened ourselves up to support hell and fussing from the engineers.
I expected some resistance, so I knew I had to get these boxes set up right.
We did some research into desktop layout and friendly corporate Linux versions, and at the time my boss went with SuSE.
Behind the firewall and using TCP wrappers, we went old-school with centralized authentication via NIS and home dirs mounted over NFS.
The key was to set up the desktop right from the start. The sad part is that most people will never see Linux at its best, as set up by someone who cares about the desktop. Most people see Linux through the eyes of the distro maker, who is always confined in one way or another, by licenses and other issues, in what they can give the end user.
Notice this was a whiiiiile back.
All users got KDE by default. At the time, GNOME was in flux in the early stages of GNOME 2.0.
Run OpenOffice once so they don’t have to see the setup wizard, and set the default pref dir to (home)/.OpenOffice so as not to clutter their home dir.
Set up Mozilla with all the plugins at their latest versions: Java, MPlayer, Acrobat, RealMedia, etc. Make sure they also work in Konqueror.
Set up Evolution as the default mail client for POP3 and import the standard contact list.
Set up LinNeighborhood with the major Windows shares and the proper settings for network browsing. Even Konqueror did not have the range of options and flexibility of LinNeighborhood when set up right.
Set up the desktop area with all the launchers needed for easy access.
Show them how to get to stuff like Kopete and other apps.
Oh yeah, and set up the default printers to print via IP to the network printer.
We even figured out where most of the config lived, and could replicate it by simply running a recursive search-and-replace (a Perl one-liner) over all the text files, converting the default test home dir and settings to the new user’s settings.
After we got this going it was pretty damn easy to set up the users.
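For anyone curious, here is roughly what that replication step looked like, sketched in Python rather than the original Perl one-liner; the template home directory, the marker string and the example username are all hypothetical, and you would still chown the result to the new user afterwards.

#!/usr/bin/env python
"""Sketch of cloning a template user's desktop config for a new user.

Assumes a prepared, known-good template home dir (e.g. /home/template) whose
text config files contain the literal marker 'template'; path, marker and
example username are hypothetical. Run as root; chown the result afterwards.
"""
import os
import shutil

TEMPLATE_HOME = "/home/template"  # the hand-tuned reference desktop
MARKER = "template"               # string to swap for the new username

def clone_home(new_user):
    new_home = os.path.join("/home", new_user)
    shutil.copytree(TEMPLATE_HOME, new_home, symlinks=True)
    for root, _dirs, files in os.walk(new_home):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "r", encoding="utf-8") as fh:
                    text = fh.read()
            except (UnicodeDecodeError, OSError):
                continue  # skip binaries and anything unreadable
            if MARKER in text:
                with open(path, "w", encoding="utf-8") as fh:
                    fh.write(text.replace(MARKER, new_user))

if __name__ == "__main__":
    clone_home("jsmith")  # hypothetical new user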
I was surprised at how positive the response was, considering most of the programmers had only ever worked from a Windows workstation, using Exceed and terminal apps to get to the Unix/Linux development boxes.
What did it take? A little planning and a few scripts. After it was set up, it was pretty damn easy to duplicate out to new users.
Do I suggest such a setup for the project folks, secretaries and managers?
No. But for a group of Unix programmers, system administrators and network folks, it just made sense for us at the time.
[i]Why are you letting it *in* in the first place?[/i]
Stop. Blame should not solely be placed on the admin. An admin is only as good as the tools, resources and politics provided for him/her.
That still requires an investment of time and energy, just not as much. You either pony up the time to keep it out, or you pony up the time to clean it up.
Main differences being:
a) You only pony up the prevention time once
b) You should be doing it *anyway* to keep your desktops secure and maintainable (regardless of platform)
Prevention on a Windows platform is a constant exercise in “Chasing Spyware/AdWare”. It is constant; no one practice captures all of it, even with “appropriate security measures” in place.
The one universal idea that any good admin should learn is that no network is the same. Small companies run their networks much differently than medium-sized companies do, and both differ from enterprise-level operations. Enterprise-level tools and policies may restrict or interfere with the use of applications on a small or medium business network.
Example: My engineering team requires their systems to be left open (as in, Admin rights) so they can install and uninstall our software, as well as other software they use on a daily basis. It doesn’t run correctly under “Run As”, and the manufacturer’s documentation even states to run the software as Admin. So your “one size fits all” method doesn’t always work for everyone.
It is true that you should administer both Windows and Linux systems. However, Windows requires more maintenance at present than Linux does. Adware and spyware seem to be absent on Linux systems. Why?
Easy… Linux doesn’t run VB scripts, ActiveX, or executable binaries on download, on receipt of mail, or from web pages.
That simple.
Most malware is based on one of those three, and since Linux doesn’t run them, it is, as a result, a bit safer.
An object at rest tends to stay at rest and an object in motion tends to stay in motion.
The notion of TCO is an old canard. The problem with TCO as it relates to software is that only a small portion of it actually depends on the software itself. TCO is highly situational, dependent on the skill sets of the implementors, the user base, how much the company is willing to pay (too high, and you get costly personnel who feel that budgets are just wishful thinking; too low, and you get a pool of headless chickens), et cetera.
A business already using a product will stick with it. It will do so as long as the product is available and there’s nothing about doing so that might represent actionable malfeasance or a make-or-break decision for the company as a whole. This is true even if the current solution is sub-optimal or more expensive. Why? Because whatever you have, you already know how to account for it, how to pass the costs on to the customer, you know you have the resources on hand to use it, etc. Additionally, businesses may have custom or proprietary applications that induce platform “lock-in”. An object at rest tends to stay at rest.
So long as you stick to a desktop-oriented distribution, there’s little about Linux that makes it unsuitable for the corporate environment right now. It supports a wider array of hardware than current versions of Windows (particularly the cheap stuff), the desktop environments are intuitive and easily customizable, and the platform is pretty secure and easy to manage. Optimal deployment takes different skills than Windows administration, but that’s about it.
So why switch? Application impetus: particularly in certain sectors, Linux is becoming the preferred workstation environment and the one for which applications are developed (I work in one such sector). Liability: the BSA reminds you that if you don’t manage your licenses properly, you are susceptible to lawsuits; and there’s the public perception that Windows is prima facie insecure, and thus any security breach that compromises consumer data is much more likely actionable as criminal negligence. Cost: it is possible that in some situations cost may be a compelling factor. Flexibility: certain aspects of the architecture make customization of the environment and user experience much simpler and more maintainable on Linux. Stability and scalability: depending on the applications involved, Linux frequently has an advantage here. Lock-in: some businesses have been burnt very badly by vendor or application lock-in and will choose Linux as part of a greater strategy of diversification and embracing industry standards.
My personal experience is that few people, even those who should be in a position to be aware of such things, have a feel for Linux on the desktop. They simply accept it as awkward or not ready without ever having seen or used it. The silly bit about printers being difficult to get working is a great example. The response I get from most people who see me give a presentation or demonstration from my Linux box is surprise at the GUI and applications. “That’s Linux? Wow.” is pretty typical – even among IT and some informatics people.
We have a number of applications that members of my group wanted access to that were Linux-only (computational biology), and we set them each up with a Linux workstation next to their Windows workstation. Those with both now work predominantly in the Linux environment (after having had it for about 3 weeks), even when not using the particular application. Why? They find the KDE desktop much better suited to their particular needs, particularly the seamless VFS support, user-created context menus, and richer environment for doing the various types of tasks required in their daily routine. Were it not for the Lotus Notes clients, I think some of them would switch to Linux on the desktop entirely.
Because the PHB won’t live with anything less than administrator privileges…:D
I did exactly that, and Linux refused to work with my hardware.
What PC did you buy?
The only reason printing is a mess is that printers, more than anything else (if one is talking about cheap home and small-office printers), use proprietary drivers. If you want printing for those under Linux, you either bother the maker long enough for them to give out specs or ready-made drivers (often they just add you to their spam filter), or you write one yourself.
If only PostScript printers were cheaper and more widely available, the printer problem would be long gone.
Mind you, some are worse than others, and from personal experience I would say that Lexmark is the worst out there…
[i]… we set them each up with a Linux workstation next to their Windows workstation. … Were it not for the Lotus Notes clients, I think some of them would switch to Linux on the desktop entirely.[/i]
One word: rdesktop (http://www.rdesktop.org/)
I have the same setup: a Windows box for Windows-specific testing (and Office, for the few docs OOo can’t manage) and a Linux box for development and everything else. With rdesktop I never have to leave my sweet KDE desktop.
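If it helps anyone, the “jump to the Windows box” step is basically a one-liner; here is a trivial launcher sketch (Python shelling out to the rdesktop client; the host name, username and window geometry are placeholders):

#!/usr/bin/env python
"""Tiny launcher for hopping onto the Windows box from the Linux desktop.

Assumes the rdesktop client is installed and on the PATH; the host name,
username and window geometry below are placeholders.
"""
import subprocess

def open_windows_session(host="winbox", user="devuser", geometry="1280x1024"):
    # Opens an RDP session to the Windows machine in its own window.
    subprocess.call(["rdesktop", "-u", user, "-g", geometry, host])

if __name__ == "__main__":
    open_windows_session()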
And, when it comes to printing… the Windows box has never been anything but a PITA. On Linux it Just Works(tm).
Moving to Linux will make printing worse, not better.
Yep, setting up printers in the Unix world isn’t as easy as it is in the Windows world. Not to mention the lack of full-featured drivers and the like.
I think you would have been spot on a couple of years ago, but if you use KDE and CUPS, printing is not harder on Unix than on Windows. Provided you have drivers, that is.
However, that is a small problem; most large printer manufacturers have good Linux drivers nowadays. A good example of this is HP. In fact, my HP 670C prints at a higher resolution on Linux than on Windows.
And the hardware that runs Windows/OS X is getting cheaper with every product cycle. Software is only TCO-superior when it gets to a lower price point on the hardware side with fewer restrictions on use. Linux has clearly done that on the server and in specialized applications. The consumer desktop requires the Linux vendor to integrate a lot of hardware functions with software, not to mention deal with the patent minefield in the multimedia/driver area. The Mac mini is the prototype solution for the Linux desktop, but it hasn’t happened yet.
Prevention on a Windows platform is a constant exercise in “Chasing Spyware/AdWare”.
No, it’s a matter of stopping it before it gets in (mail/proxy filtering) and making it impotent when it does manage to get in (not running as Admin all the time, firewalls).
It is constant; no one practice captures all of it, even with “appropriate security measures” in place.
No malware has managed to get into any of the networks I’ve been responsible for.
This is not to say it will never happen, or to pretend that it couldn’t happen, but a competently managed network is a long, long way from a constant, inevitable cleanup scenario.
Example: My engineering team requires their systems to be left open (as in, Admin rights) so they can install and uninstall our software, as well as other software they use on a daily basis.
The ability to install and uninstall applications does *not* require (nor justify) running with admin rights all the time.
It doesn’t run correctly under “Run As”, and the manufacturer’s documentation even states to run the software as Admin. So your “one size fits all” method doesn’t always work for everyone.
Have you tried to figure out why (and reported it to the developer as a significant bug), or have you just accepted it?
I’ve yet to find any software that *absolutely requires* Admin privileges. I’ve run across a couple of things that needed some fairly in-depth investigation and fiddling of filesystem and registry key permissions – once – but not anything that won’t run at all.
*Even then*, if you are somehow put in a situation where users must run as an Admin all the time, this doesn’t justify not running common vectors like browsers and email programs under reduced privileges (either via Run As or tools like dropmyrights).
Adware and spyware seem to be absent on Linux systems. Why?
Probably mostly because no-one is interested in targeting them yet.
Easy… Linux doesn’t run VB scripts, ActiveX, or executable binaries on download, on receipt of mail, or from web pages.
Neither does any remotely up-to-date and properly configured (or even default-configured) Windows machine.
Most malware is based on one of those three, and since Linux doesn’t run them, it is, as a result, a bit safer.
Linux is certainly made safer by its rarity, but that will change over time. I wouldn’t treat a Linux system as a silver bullet that requires no defensive measures.
Linux is certainly made safer by its rarity, but that will change over time. I wouldn’t treat a Linux system as a silver bullet that requires no defensive measures.
Well, that brings up an interesting paradox. If Linux is more secure because of its scarcity (which, it must be said, is much less of an issue in server space) and people don’t switch to it because “it shouldn’t be treated as a silver bullet”, then it will remain rarer, and thus more secure.
Truth be told, I do not think that it is only scarcity that explains why there’s so little malware that affects Linux. One thing seems clear, though – after reading the “How to Secure a Windows Box” articles that have been published on this site in the last few days, it does appear that keeping Windows boxes secure is in fact more work than it is for Linux computers.
That in itself represents a lot of savings for a Linux-based solution. And since we’re still quite a few years away from when Linux will have a larger market share than Windows, it’s safe to say that Linux will be a less costly system to secure than Windows for a long enough time to justify switching. Don’t you agree?
Well, that brings up an interesting paradox. If Linux is more secure because of its scarcity (which, it must be said, is much less of an issue in server space) and people don’t switch to it because “it shouldn’t be treated as a silver bullet”, then it will remain rarer, and thus more secure.
First, I have to nitpick here because I believe there’s an important distinction between being “secure” because of technical reasons and being “secure” because of exposure reasons (“security by obscurity”, to a degree).
I don’t think “more secure” is an appropriate way of describing something when it really means “less attacked” or “less exposed”.
Truth be told, I do not think that it is only scarcity that explains why there’s so little malware that affects Linux. One thing seems clear, though – after reading the “How to Secure a Windows Box” articles that have been published on this site in the last few days, it does appear that keeping Windows boxes secure is in fact more work than it is for Linux computers.
Most of those guides (at least the ones I’ve read) are very much over the top (actively disabling services, heavily customised installation CDs, unbinding network services) or unnecessary (using third party firewalls).
Additionally, by their nature, they also tend to ignore the fact that most of the “securing” can be automated, either at system install time (by scripting the installation) or at system run time (using Active Directory and GPOs).
For example, many of our desktops need exceptions to the Windows firewall for certain applications to function correctly. However, we don’t configure each of these machines manually, we just configure it once in a GPO and that configuration is then applied to the relevant machines.
It’s really not difficult to “secure” Windows machines quickly and easily – and most of the things you need to do (eg: running as a regular user) you should be doing on *any* platform.
That in itself represents a lot of savings for a Linux-based solution. And since we’re still quite a few years away from when Linux will have a larger market share than Windows, it’s safe to say that Linux will be a less costly system to secure than Windows for a long enough time to justify switching. Don’t you agree?
I don’t, because it’s not that hard to “secure” a Windows machine, and it’s really quite trivial in a corporate environment (i.e. where the cost argument is actually paid attention to).
Added to that, I’m not aware of any real equivalent to Active Directory and GPOs for centrally managing and configuring lots of Linux desktops, so the configuration costs of lots of Linux desktops are probably going to be higher.
First, I have to nitpick here because I believe there’s an important distinction between being “secure” because of technical reasons and being “secure” because of exposure reasons (“security by obscurity”, to a degree).
Indeed, you are nitpicking. Being less frequently attacked may not mean that the system is intrinsically more secure; however, the bottom line is that less cost is incurred on security. That’s what counts for businesses.
It’s really not difficult to “secure” Windows machines quickly and easily – and most of the things you need to do (eg: running as a regular user) you should be doing on *any* platform.
I agree that in a business environment there should be strong security protocols enforced to make things easier. The guides that were published here were mostly for home users. Now, you can’t argue that it’s easier to secure a home Windows desktop than a home Linux desktop, however. There is literally nothing to do to secure a home Linux PC – the default install on most recent distros has all services off and the firewall turned on by default. Meanwhile, Windows still requires you to install anti-virus software and to change IE and OE for safer equivalents. And if you don’t happen to have a copy of Windows XP SP2, you’re SOL.
Yes, it is possible to make home Windows secure, but it’s more work for the common mortal.
I don’t, because it’s not that hard to “secure” a Windows machine, and it’s really quite trivial in a corporate environment (i.e. where the cost argument is actually paid attention to).
That begs the question, though: if it’s so trivial, how come malware costs last year were estimated at between 150 and 200 billion dollars? How come worms like Nimda, MyDoom or Code Red can bring the Internet to a standstill? It’s not just a hoax – MS has a dismal security record. Why do you think Gates and Ballmer have been hammering on security for the past two years?
Obviously, you think it’s pretty easy to secure Windows. Perhaps you should consider that you’re a lot more knowledgeable about Windows security than a great many Windows admins out there, and that what may seem easy to you maybe isn’t for most of them… otherwise – at the risk of repeating myself – how do you account for such high costs due to malware?
Added to that, I’m not aware of any real equivalent to Active Directory and GPOs for centrally managing and configuring lots of Linux desktops, so the configuration costs of lots of Linux desktops are probably going to be higher.
I’m pretty sure the equivalent tool exists, though it probably is just a matter of writing the appropriate script. As far as security is concerned, centrally managing Linux desktops shouldn’t be harder than centrally managing Linux servers, and there’s a whole lot of tools to do that.
The biggest problem with Linux, or any OS outside of Windows/Mac OS X, is hardware support. Maybe the government needs to create competition by going after hardware manufacturers and requiring them to release the programming internals for their hardware, so competing OSes could work with everything. Obviously some companies like Nvidia may be in Microsoft’s back pocket, because unless Nvidia releases a driver for your OS you’re out of luck on getting specs, which seems odd considering most companies would want to sell more hardware, unless they had other motives. Maybe there needs to be an open driver language/spec that could be natively compiled for an alternative OS.
“There is literally nothing to do to secure a home Linux PC – the default install on most recent distros has all services off and the firewall turned on by default.”
In Red Hat ES 3.0, which was the most recent Red Hat distro until yesterday (they released 4.0), I double-clicked on an .RPM file from the GUI and got that RPM installed.
You sure that there is literally nothing to do to secure a home Linux PC?
In Red Hat ES 3.0, which was the most recent Red Hat distro until yesterday (they released 4.0), I double-clicked on an .RPM file from the GUI and got that RPM installed.
Why were you running as root?
You sure that there is literally nothing to do to secure a home Linux PC?
No, there isn’t. The Honeynet Project left default installs of Linux connected to the Internet for weeks and they didn’t get hacked.
As for malware, well, since you don’t have ActiveX and files made executable through their extension, that solves a lot of issues. Not that this is really necessary, since there are no Linux viruses currently in the wild.
Of course, you can continue defending the monopoly by acting as if malware was not a problem on Windows. After all, 200 billion dollars is chump change, right?
“if it’s so trivial, how come malware costs last year were estimated at between 150 and 200 billion dollars? How come worms like Nimda, MyDoom or Code Red can bring the Internet to a standstill?”
My personal experience: I got hired by a company with over 5,000 desktops. Each user was given administrator rights to the desktop, there was no antivirus on the desktop, Windows Updates were optional, and no firewall was enabled on the desktop.
I was amazed. Couldn’t believe it.
6-12 months later, an Internet worm hit us, MyDoom or something like it. Someone had plugged an infected laptop into the LAN. My desktop was not infected.
A few weeks later, an email worm hit us. My desktop was not infected.
Fast forward to 2004: users are not given administrative rights on their desktops unless they can justify it, each desktop is equipped with Symantec Corporate Antivirus which can’t be turned off by the user, Windows Update is on and gets updates pushed from our corporate server, the desktop firewall is enabled, and the MS Exchange server does not allow emails with executable attachments.
What do you think it cost our corporate IT to clean up the mess they made themselves, and to set up these trivial security measures? That is why corporations report such high costs for dealing with malware.
The Honeynet Project left default installs of Linux connected to the Internet for weeks and they didn’t get hacked.
The USA Today left default installs of Windows XP SP2 connected to the Internet for weeks and they didn’t get hacked.
By your logic, it means there is literally nothing to do to secure a home Windows XP SP2 PC.
Of course, your logic is faulty, but I bet you won’t admit it.
Indeed, you are nitpicking. Being less frequently attacked may not mean that the system is intrinsically more secure; however, the bottom line is that less cost is incurred on security. That’s what counts for businesses.
True, but a problem comes from the attitude it enforces. A lot of people think – and advocate – that just replacing Windows machines with Linux or OS X machines is all that needs to be done, but really that will only hold true until those platforms start becoming widespread enough to attract noticeable amounts of malware. Then you’re in the position of having spent all that money on a migration but still spending the same amount you would have been spending on maintenance anyway.
Now, you can’t argue that it’s easier to secure a home Windows desktop than a home Linux desktop, however. There is literally nothing to do to secure a home Linux PC – the default install on most recent distros has all services off and the firewall turned on by default. Meanwhile, Windows still requires you to install anti-virus software and to change IE and OE for safer equivalents. And if you don’t happen to have a copy of Windows XP SP2, you’re SOL.
Well, it’s worth pointing out that on _current_ Windows machines, the firewall is also on by default. If you want to start comparing older versions of Windows it’s really only fair to compare to older versions of Linux.
Yes, it is possible to make home Windows secure, but it’s more work for the common mortal.
Well, it becomes a matter of where you want to spend your time – some initial configuration changes on Windows, or possible ongoing difficulties using Linux on the desktop.
I agree that Linux and OS X are more locked down “out of the box” though.
That begs the question, though: if it’s so trivial, how come malware costs last year were estimated at between 150 and 200 billion dollars?
Well, there’s a hell of a lot of Windows machines out there in businesses processing a hell of a lot of money. Even a relatively minor, isolated incident (eg: a department’s worth of machines not having been updated properly) can have substantial financial repercussions when really it was caused by something easily predictable and rectifiable.
There’s also no shortage of inept Windows admins, either. One good thing about Unix is that it has a much higher bar of basic capability and – more importantly – interest in the actual field needed to get started. There are a lot of stupid MCSEs out there who only got into it for the money and don’t really have any interest in the field (and hence no pride in their work or desire to do a good job).
Has anyone got some costs involved with exploits/defacements/intrusions/etc. on other platforms so they can be compared?
How come worms like Nimda, MyDoom or Code Red can bring the Internet to a standstill?
Volume. I’m not going to look it up now, but I’m pretty sure the vulnerabilities all three of those exploited were patched well before they were out in the wild.
I’m pretty sure the equivalent tool exists, though it probably is just a matter of writing the appropriate script. As far as security is concerned, centrally managing Linux desktops shouldn’t be harder than centrally managing Linux servers, and there’s a whole lot of tools to do that.
I really don’t think a bunch of home-grown scripts (presumably doing interesting things over SSH with command-line tools) is a fair comparison. I’m pretty sure Novell is working towards an AD+GPO competitor for the Linux desktop (and if they’re not, they should be), but it’ll be relatively new and untested for a few years.
There’s a rather large contingent of the Linux community that thinks “centralised remote administration” is equivalent to SSH and a bunch of shell scripts. It’s not. It’s somewhat workable for small numbers of machines, but would rapidly become unmanageable with large numbers of machines being used for different purposes. This applies equally to servers as well as desktops.
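To illustrate the point, this is roughly what people mean by “SSH and a bunch of scripts” (a hedged sketch only: the host names and the maintenance command are placeholders, and key-based SSH authentication to every box is assumed). It works for a handful of machines, but there is no inventory, targeting, reporting or policy model behind it, which is exactly where it falls over at scale.

#!/usr/bin/env python
"""Sketch of the 'SSH and a bunch of scripts' approach to fleet management.

Host names and the command are placeholders; key-based SSH authentication
to every machine is assumed. There is deliberately no inventory, targeting
or reporting here -- that is the point being made above.
"""
import subprocess

HOSTS = ["desktop01", "desktop02", "desktop03"]  # hypothetical inventory
COMMAND = "uptime"                               # stand-in maintenance task

def run_everywhere(command):
    failures = []
    for host in HOSTS:
        result = subprocess.run(["ssh", host, command],
                                capture_output=True, text=True)
        if result.returncode != 0:
            failures.append((host, result.stderr.strip()))
    return failures

if __name__ == "__main__":
    for host, err in run_everywhere(COMMAND):
        print("FAILED on %s: %s" % (host, err))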
I’d also expect Apple are either working on, or soon to start, a competitor for AD. I’m also rather surprised they haven’t been working on an equivalent to Windows Terminal Services as well, although I suspect the Mac Mini might be the beginnings of such a strategy with its relatively low cost.
If you’re looking for a laptop with Linux pre-installed, here is a good site: http://mcelrath.org/laptops.html
I bought my laptop from http://www.xtremenotebooks.com/ – they sell notebooks with NO OS, for savings of $79.00 to $140.00.
Administrators don’t clean up spyware; desktop support monkeys like me do. But seriously, the cost of installing a boring standard PC from a single vendor with a support contract has got to be cheaper than employing three times as many desktop support people to look after a myriad of botched-together custom PCs. I cost 15-20 times more than a new PC in one year.
(Prevention on a Windows platform is a constant exercise in “Chasing Spyware/AdWare”.
No, it’s a matter of stopping it before it gets in (mail/proxy filtering) and making it impotent when it does manage to get in (not running as Admin all the time, firewalls).)
No filter is foolproof, especially against a vulnerability it’s not prepared for.
See below on the second part.
(It is constant; no one practice captures all of it, even with “appropriate security measures” in place.
No malware has managed to get into any of the networks I’ve been responsible for.
This is not to say it will never happen, or to pretend that it couldn’t happen, but a competently managed network is a long, long way from a constant, inevitable cleanup scenario.)
If I didn’t know better, I would have taken that comment to mean that I was incompetent. I’m sure you weren’t intending to be that insulting.
Example: My engineering team requires their systems to be left open (as in, Admin rights) so they can install and uninstall our software, as well as other software they use on a daily basis.
The ability to install and uninstall applications does *not* require (nor justify) running with admin rights all the time.
It is a requirement made by both the engineering managers and the president of my company: full access to their machines.
Oh, I forgot to mention that my company, like many others, is not a tech company and has different requirements than others.
It doesn’t run correctly under “Run As”, and the manufacturer’s documentation even states to run the software as Admin. So your “one size fits all” method doesn’t always work for everyone.
Have you tried to figure out why (and reported it to the developer as a significant bug), or have you just accepted it?
They don’t consider it a bug. The software was written to be used by Admin only, and they aren’t about to change it just for me.
I’ve yet to find any software that *absolutely requires* Admin privileges. I’ve run across a couple of things that needed some fairly in-depth investigation and fiddling of filesystem and registry key permissions – once – but not anything that won’t run at all.
*Even then*, if you are somehow put in a situation where users must run as an Admin all the time, this doesn’t justify not running common vectors like browsers and email programs under reduced privileges (either via Run As or tools like dropmyrights).
Once again, my engineers require direct access; no impediments.
(Adware and spyware seem to be absent on Linux systems. Why?)
Probably mostly because no-one is interested in targeting them yet.
Even if they target Linux, it’s much harder to destabilize Linux than Windows.
No registry, no insecure plugins, and my reasons down below.
Easy… Linux doesn’t run VB scripts, ActiveX, or executable binaries on download, on receipt of mail, or from web pages.
Neither does any remotely up-to-date and properly configured (or even default-configured) Windows machine.
Pre-SP2, not by default. SP2 does tighten up a few things, but it’s not as good as having it that way “out of the box”.
A good piece of software shouldn’t have me creating extraneous scripts or running through a long, extended lock-down process, and shouldn’t have me constantly cramming anti-virus, spyware blockers, and security patches onto 90-100 workstations.
1 admin, 100 workstations and users. How much help do you have in your organization?
Oh, and as for any thought of you saying “Well, then set up an automated server to do that”… try explaining that to a company that isn’t tech-oriented; it’s not a priority for them. As I said, politics also shape IT policy, not just technical practicality.
(Most malware is based on one of those three, and since Linux doesn’t run them, it is, as a result, a bit safer.
Linux is certainly made safer by its rarity, but that will change over time. I wouldn’t treat a Linux system as a silver bullet that requires no defensive measures.)
To be honest, I’m not sure how you would treat a Linux system. Your comments seem to lean towards a pro-Microsoft position, so Linux isn’t a real concern for you.
My personal experience
Anecdotal evidence, especially from someone with a history of bias, is not admissible. Sorry.
By your logic, it means there is literally nothing to do to secure a home Windows XP SP2 PC.
Of course, your logic is faulty, but I bet you won’t admit it.
Wait a minute…isn’t that what you were arguing in just the previous post? I.e. that you didn’t do anything special with your machine and yet you were safe?
You should take a look at drsmithy’s answer – at least he doesn’t contradict himself like you do…
True, but a problem comes from the attitude it enforces. A lot of people think – and advocate – that just replacing Windows machines with Linux or OS X machines is all that needs to be done, but really that will only hold true until those platforms start becoming widespread enough to attract noticeable amounts of malware.
Then again, if no OS had more than a 30% market share, it would make life harder for malware writers, as they would have to split their efforts. Nevertheless, while I agree that security through scarcity is not an acceptable policy, right now it is safer to use Linux, OS X, the BSDs or any *nix rather than Windows (or rather, it’s less work).
I’m all in favor of this changing due to increased market share for non-Windows OS. In other words, we’ll cross that bridge when we get there… 😉
Well, it’s worth pointing out that on _current_ Windows machines, the firewall is also on by default. If you want to start comparing older versions of Windows it’s really only fair to compare to older versions of Linux.
Yes, MS did the right thing by enabling the software firewall by default. That’s why I didn’t mention it in my comparison, but rather said: “Meanwhile, Windows still requires you to install anti-virus software and to change IE and OE for safer equivalents.”
On the other hand, if we were to compare older versions of Windows to older versions of Linux, the advantage still goes to Linux: 20 minutes average before WinXP is compromised vs. 72 hours for Red Hat…
I agree that Linux and OS X are more locked down “out of the box” though.
Fair enough, I’ll agree that MS has in fact increased the security of a default install. They really had no choice, mind you: when people get hacked before they get the chance to download security updates, you’ve got a serious problem.
I really don’t think a bunch of home-grown scripts (presumably doing interesting things over SSH with command-line tools) is a fair comparison.
Well, it would be a lot less versatile, but if it was strictly to enforce security policies it’d probably do the job. I do agree that, to my (limited) knowledge, there is no direct equivalent of Active Directory+GPO on Linux. Novell would indeed do well to propose one.
Well, that’s it for me on this topic. Thanks for a rational, non-flammable discussion!
No filter is foolproof, especially against a vulnerability it’s not prepared for.
I would never suggest it is. It will, however, stop a very large proportion of the nasties and keep you away from that “constant cleaning”.
If I didn’t know better, I would have taken that comment to mean that I was incompetent. I’m sure you weren’t intending to be that insulting.
Not directly, no. But if your network is properly run, exploits and virus/malware outbreaks should be extremely rare and limited to things that are either as-yet unpatched vulnerabilities or as-yet unknown malware. I’ve managed to get through the last two years with nothing happening to my environment.
If they’re happening *regularly*, you’ve got fundamental problems.
It is a requirement made by both the engineering managers and the president of my company: full access to their machines.
There is, as I said, a substantial difference between having “full access” and running as an admin all the time.
They don’t consider it a bug. The software was written to be used by Admin only, and they aren’t about to change it just for me.
They should. Personally, in that position, I’d be filing regular bug reports and harassing the hell out of their support line. I assume this is an application you pay a lot of money for?
Also, as I said, I’ve yet to see an application that couldn’t be persuaded to run with appropriate fiddling of file and registry permissions – have you tried?
Once again, my engineers require direct access; no impediments.
That’s about as much of an “impediment” as locking the office doors at night. I assume that happens?
Even if they target Linux, it’s much harder to destabilize Linux than Windows.
No registry, no insecure plugins, and my reasons down below.
There are more than enough ways to “destabilise” a Linux system.
Pre-SP2, not by default. SP2 does tighten up a few things, but it’s not as good as having it that way “out of the box”.
It should be, if you’re running your environment properly.
A good piece of software shouldn’t have me creating extraneous scripts or running through a long, extended lock-down process, and shouldn’t have me constantly cramming anti-virus, spyware blockers, and security patches onto 90-100 workstations.
Nor should you be doing that on Windows.
1 admin, 100 workstations and users. How much help do you have in your organization?
The company I’ve just left (which was in no way a “tech company”, either) had 2 IT staff – myself and a DBA/IT Manager – for ~220 desktops (spread around ~30 offices all over Australia). Before I arrived it was a freakin’ disaster and there were 3 permanent IT staff who were in the same sort of mess you are. Now, if the DBA/IT Manager had a bit more than a user’s knowledge of Windows, he could probably do the whole thing himself (with a contractor brought in for occasional major events).
Also, before I started there (~2 years ago) I had _zero_ practical experience with Windows Administration – my background was Solaris and FreeBSD. I’ve learnt a lot in the last 2 years and found out you can do some damn impressive stuff with a properly setup Windows environment.
FWIW, I left because they were getting too Windows-centric. I like to try and keep my skill base broad, and they were phasing out all their unix infrastructure in favour of a 100% Microsoft shop (mainly driven by the parent organisation).
Oh, and as for any thoughts of saying “Well, then set up an automated server to do that”… try explaining that to a company that isn’t tech-oriented; it’s not a priority to them. As I said, politics also shapes IT policy, not just technical practicality.
I’m well aware. However, as the IT department, it’s your job to determine what needs to be done to make the environment manageable and sell that to your boss(es). If for no other reason than that, when someone complains about it currently *not* working, you can point to your plan and show them how you could have avoided the problem.
I know it can be difficult to do this sort of thing one machine at a time, but a lot of the basics (Group Policy, automated updates via SUS, etc) can be done behind the scenes.
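As a concrete (and simplified) example of the “behind the scenes” part – the server name below is a placeholder – pointing Automatic Updates at an internal SUS server comes down to a handful of documented policy registry values, which is exactly the sort of thing a GPO sets for you on every machine at once:

    # Sketch only: write the Automatic Updates policy values that point a
    # machine at an internal SUS server. In practice a Group Policy Object
    # pushes these out; this just shows what ends up in the registry.
    import winreg

    SUS_SERVER = "http://sus01.internal.example"  # hypothetical server name

    BASE = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"
    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, BASE) as wu:
        winreg.SetValueEx(wu, "WUServer", 0, winreg.REG_SZ, SUS_SERVER)
        winreg.SetValueEx(wu, "WUStatusServer", 0, winreg.REG_SZ, SUS_SERVER)

    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, BASE + r"\AU") as au:
        winreg.SetValueEx(au, "UseWUServer", 0, winreg.REG_DWORD, 1)
        winreg.SetValueEx(au, "NoAutoUpdate", 0, winreg.REG_DWORD, 0)
        # 4 = automatically download updates and install on a schedule
        winreg.SetValueEx(au, "AUOptions", 0, winreg.REG_DWORD, 4)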
It is unfortunate that you usually need an “event” to spur companies into investing properly into IT infrastructure, but you need to be able to have a plan ready if/when it occurs. When your whole company is crippled for a few days and has lost $00,000s, you can make a very strong case if you have a plan that would have prevented (or greatly restricted) the damage had it been implemented.
The other area where you can sell improved IT infrastructure is future growth. Obviously, if your company has no expansion intentions then that argument is dead in the water, but if they are growing, decent infrastructure makes adding new branches, rolling out new machines, etc. a *lot* easier.
To be honest, I’m not sure how you would treat a Linux system. Your comments seem to lean towards a pro-Microsoft position, so Linux isn’t a real concern for you.
You do the same fundamental procedures for all platforms. Limit the damage your users can do (90% of your problems will be caused by them), limit the vectors for attacks (automated or deliberate), keep your systems up to date, try to educate your users about running strange code, systems behaving strangely, illegitimate emails, etc.
The principles are the same, it’s the semantics that change.
Believe me, there’s nothing special about Linux that’s going to stop some malicious code causing havoc to your environment if some ignorant user chmod +x’s it and runs it.
The problems you are having aren’t the fault of Windows or Microsoft, they’re the fault of your environment. You might not be able to influence that as much as you might like to, but don’t blame Windows or Microsoft because you’re not using the functionality and features they offer.
Then again, if no OS had more than a 30% market share, it would make life harder for malware writers, as they would have to split their efforts.
I’m not sure if I agree with that. Most of those “malwares” aren’t doing anything particularly tricky and are just variations on a theme. I suspect (with nothing more than intuition to back it up, for obvious reasons) that if you had, say, 3 major platforms dividing the market equally, they’d all suffer much the same levels of malware, and those levels would be “higher” than 1/3 of the current “level” Windows has.
On the other hand, if we were to compare older versions of Windows to older versions of Linux, the advantage still goes to Linux: 20 minutes average before WinXP is compromised vs. 72 hours for Red Hat…
Those numbers sound a lot less impressive if you present them as a proportion of the machine’s lifetime (say, 4 years).
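Back-of-the-envelope, just to illustrate what I mean by proportion (the 4-year lifetime is an assumption):

    # Rough arithmetic only: express both unpatched-survival times as a
    # fraction of an assumed 4-year machine lifetime.
    LIFETIME_HOURS = 4 * 365 * 24      # ~35,040 hours

    winxp_hours = 20 / 60              # 20 minutes
    redhat_hours = 72                  # 72 hours

    print(f"WinXP:   {winxp_hours / LIFETIME_HOURS:.4%} of lifetime")
    print(f"Red Hat: {redhat_hours / LIFETIME_HOURS:.4%} of lifetime")
    # Both work out to a small fraction of a percent, which is why the raw
    # "20 minutes vs 72 hours" comparison sounds more dramatic than it is.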
Well, again, this is more due to prevalence than anything else – if 99 out of every 100 machines were running those old versions of Red Hat I imagine that 72 hours figure would be a lot lower.
I do agree that, to my (limited) knowledge, there is no direct equivalent of Active Directory+GPO on Linux. Novell would indeed do well to propose one.
Well, I’m just about to move into an environment with a much greater proportion of unix machines, so I’ll be looking for myself in the not too distant future, I imagine. We shall see.
I decided not to go line by line in my reply to your last response.
While some of the information you provided does make sense to me, and I appreciate the fact that there is someone “human” in there instead of rote answers to complex questions, some of your responses to the issues I mentioned didn’t seem to grasp the point quite right.
First, my engineering group requires Admin access, and the president of the company backs that requirement. No amount of coercing on my part has changed that. Because we are a non-tech company, the IS department is still not regarded as having influential authority on policy or procedure. I can warn, advise, beg, plead, and coerce, but none of that works when the management of your company is as dead set in their ideas as I am in mine.
So as a result of policy and procedure, I then have to look at other ways to protect my networks, my users and my servers.
Linux serves that purpose on the server end for me right now for 3 reasons:
1) Reuse of existing hardware: Only 25% of my servers are “true servers” with server-class power supplies, drives, processors, and form factors. The rest are recycled workstations and old server hardware. Windows would be too heavy a system to run on some of these “task-oriented” machines, while Linux lets me do some amazing things with hardware that Windows wouldn’t run effectively on.
2) Cost: As I mentioned before, 75% of my machines are recycled because my company is a non-tech organization; a company that (until recently) has not regarded IT as high on its priority list and does not budget large sums for upgrading everything to server-class equipment.
Linux is an easier sell because its use is free, or at least cheaper than Windows, and it’s freely available for me to use. Microsoft’s rotating upgrade policy is too cumbersome for existing hardware, and frankly, walking into my boss’s office and explaining that being a 100% Microsoft shop requires $20,000 US isn’t going to be taken seriously.
3) Policy: IT policy is maintained by the President, the VP of Operations (my administrative head) and the CFO.
Their policy allows users to bring in flash disks, to bring in CDs to use, and to hook into this network without AV-scanning their systems first.
Much as I have tried to influence this policy, I cannot change it.
Creating a plan which shows how this wouldn’t have happened if you “had done things my way” isn’t received well or at all. “I told you so” logic doesn’t wash well, and in a few cases has been enough to get people fired.
Linux provides me with an “out of the box” solution that doesn’t involve too many questions and just works. Windows requires longer configuration time, explanations to users and management, and more headaches than a one-person IT department needs to handle.
Strangely enough, I have heard of this quite a lot in companies in both the small and medium-sized space in the US (I consider us a medium-sized company), especially from the posts here and on /.
I’m not the only one with these problems. Others have them too. Instigation doesn’t really help me; understanding my plight kind of does.
Sometimes that could be why we go round and round. Your comments, while seeming to debunk information you don’t find easy to swallow, come across as more argumentative and instigating. It’s simply not helpful to me.
The earlier post was aimed at drsmithy…sorry for the confusion.
Firstly, I have to say that if I was in your position I’d be concentrating on finding another job ASAP. It may or may not be deliberate, but you’re being set up as a fall guy for any IT disasters that might happen.
Fundamentally, the problem in your environment is, as you state, policy. However, Linux isn’t going to change the policy and, hence, isn’t going to solve your problems. The *technology* to solve your problems on Windows is available, but no amount of technology will fix fundamental policy problems.
In short, since you’ll be running your Linux boxes the same way you’re running your Windows boxes, you’re going to end up with the same problems in the end, because those problems aren’t failings in Windows.
Also, the trick with “I told you so” plans is in the presentation. Obviously, if you go in with a chip on your shoulder and say “you should have listened to me”, it won’t get a favourable response. Diplomacy is the key here.
The main reason my comments aren’t helpful to you, I feel, is that fundamentally you’re not really trying to solve the problems I’m commenting on – and you’re placing the blame in the wrong area.