There is an enormous amount of information available now about evaluating and examining Linux for the desktop. Almost every vendor and distribution is making a pitch for the desktop. The quality of the software has improved, and continues to improve. In my own tests, though, there are still some missing elements that I thought I would convey to you, the reader. Some of my points may already have answers and solutions available; I may simply not be aware of them. Be aware of this, and I look forward to your responses in the comments area below the article.
Most of what I have seen in terms of Linux on the desktop is a rough
guide to removing an OS with a view to replacing it with Linux or another
OS, or dual-booting a box and running the system. There is really no
coverage of the areas I am about to mention, and perhaps they are
overlooked by many. But they are key to how successful Linux will
actually be in the longer term.
In a moment I will provide a rough overview of how companies handle
Windows, and the installation and rollout of Windows, both to users and
customers. This is a vital element which I think has been overlooked, and
from a business point of view I think it's one of the reasons people set
aside the plus points of Linux and stay with Windows.
Let's start with the suppliers. Let's see how the industry works currently.
You have the dominant player in the market:
Microsoft.
You have the people who provide the systems:
Dell, HP, OEMs, and others.
Microsoft uses a carrot-and-stick method of getting the synergy to work
in its favour. It offers the system builder the chance to use a product
they want to sell, and in addition it offers extensive assistance with
handling those products: the tools and software that allow system
builders to work with the OS, special builds, drivers, specialist
support, and other areas such as joint marketing and product development.
In return you will see aggressive tactics, such as not allowing the OEM
to ship boxes without an OS. One of the key aspects is imaging of the
software and the replication of an OS. The OS is built for the consumer.
(This applies to desktop Linux distributions too.)
It's easy for Dell, HP, OEMs, and others to create a range of computers.
The computers are built and tested, then the OS is built and tested.
It's a framework that works easily for those who produce thousands of
systems: they simply make one or more software images that fit the need.
This might include various OS options, office productivity tools, and
other items.
I suspect these companies use Norton Ghost or a similar tool for imaging,
along with the excellent SYSPREP utility that Microsoft provides. SYSPREP
lets you build or install the OS and tools you desire, then reset the
machine to a factory state; on reboot you can run a plug-and-play check
for new hardware and regenerate the system SID if you wish. In addition,
Microsoft provides a bootable OS called WinPE, which is basically a
32-bit bootable Windows system with a shedload of network and other
drivers. Alongside things like PXE-enabled network cards, BOOTP, and
other standards, this gives very easy-to-use packages and solutions to
the two parties who need these tools the most: system builders and
businesses.
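For readers who have not used it, SYSPREP is typically paired with an answer file (sysprep.inf) so the factory-reset machine rebuilds itself on first boot without prompts. The sketch below shows only the general shape: the section names are real, but every value is an invented placeholder rather than a recommended configuration.

```ini
; Illustrative sysprep.inf sketch -- all values are placeholders.
[Unattended]
OemSkipEula=Yes           ; suppress the EULA screen during mini-setup

[GuiUnattended]
OemSkipWelcome=1          ; skip the welcome screens
TimeZone=85               ; numeric time-zone index (85 = GMT)
AdminPassword=*           ; prompt for the local admin password

[UserData]
FullName="Standard User"
OrgName="Example Corp"    ; made-up organisation name

[Identification]
JoinDomain=EXAMPLEDOM     ; machine joins this (hypothetical) domain
```

With a file like this on the imaged disk, every cloned box answers its own setup questions, which is a large part of what makes thousand-machine rollouts manageable.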
So the OEMs, Dell, HP, and the rest can build their OS and replicate it
in a simple, easy-to-manage way. This also lets them build the rescue
CDs, and update the images themselves, in an easily manageable way.
As for business, many Windows specialists simply work on the same basis.
Say I have 100 users on site and I wish to make a change to their
systems; let's say I want to replace three-year-old machines. I talk to a
supplier, Dell, HP, whoever that may be. I talk to their business support
teams. Within a few minutes I have the spec of machine I want, with the
OS I want, with the specific build I want; let's say 1 GHz / 40 GB /
256 MB RAM / graphics / sound / LAN, plus Windows XP and Office XP. I
talk to Microsoft, and within a few days I get a site license, which
includes system images of Windows XP and Office XP that have no need to
be activated. I supply the information to my supplier and they build the
boxes. Whether I let Dell, HP, or whoever do the imaging, or make a new
image for my company which we apply to our systems, does not actually
matter. What does matter is that in business I will be licensed, and I
have a manageable solution. With the tools available, I can build a
system image, or update an existing one, and have it ready in minutes. If
I do the extra legwork myself, I can have the image log on to the domain
with all its programs and all the domain's printers set up from the
get-go. Further, I can carry through a standard registry. If I want to
lock down security, desktop settings, internet settings, software
settings, and domain settings, it all goes in a central build. I can kill
off MSN Messenger. I can lock the user down to the corporate level of
agreed services and systems. I can go as far as making each user's 'My
Documents' folder reside on a server, or group of servers, instead of the
local machine.
Let's take this further. With 'My Documents' and other folders stored off
the local machine, and an LDAP mail or MS Exchange system which also
resides off the local machine, both I and the user benefit IF that local
PC ever fails. By reloading an image onto a new PC, or the same PC with
replacement parts, I can get the user back to working condition in very
short time. You can go further still, with roaming profiles; the list
goes on and on.
So looking after my hundred-plus users is straightforward. Yes, Windows does have some issues, as all OSes do, but that is a side issue in the real world. The average user working in an office knows Office, and knows
Windows. As the IT person supporting those people, it is my duty, and my professional reputation, that hangs in the balance; not providing a good solution is simply not an option. Add to this the various tools and weapons Microsoft provides to me, my business, and my suppliers, and it is an excellent package. It's also the very reason I am sitting here saying 'Sorry, I can't do it... at least not yet.' Could it also be a
reason why most OEMs and suppliers simply don't offer Linux?
Maybe I am wrong. Maybe Linux has all the ghosting, imaging, business, and software tools to do the same as Windows offers. But it's not visible. It looks like it's missing in action; to an outsider like me, it looks like it doesn't exist. And while it doesn't exist, at least in my sphere, it's hard for me to understand just how Linux will make it fully onto the desktop, beyond the enthusiast or hobbyist, beyond the odd server or workstation. So you have my comment. Now I hand over to you for a moment and ask some questions:
How would a company like Dell/HP/OEM/Other support a Linux build, its updates and support at least as well as those it gets when it works with Microsoft?
How would companies who then buy those systems from the supplier be able to do the same kind of images and updates in a similar painless way?
Why does there seem to be no Linux equivalent of the RIPrep/SYSPREP tools, available to both builders and businesses/users, with strong support from the Linux distributions, to provide this level of support?
In addition, Windows XP is remarkable in its level of recovery during replication. You can change the hardware to a greater degree than anything I have seen, and by and large it works, or can be corrected. For me, it would
take days to prepare a Linux system to that level. Worse still, there would seem to be no easy way of offering the positive benefits of running Windows, a Windows domain, and the tools my users wish to use, along with the problem/system recovery I get with my current setup. Do you know of a way I am not aware of?
My main caveat with the Linux desktop idea is that I am not dealing with one particular computer. And neither are the hundreds and thousands of IT veterans and specialists who work with Windows, often not because it's the best
technically (even though it's very good), but because it is the most manageable. IT managers, IT directors, company bosses, and boards have to deal with reality. They have to comply with the law. They want simple,
straightforward solutions, and business solutions and providers that deliver them. They want solutions and provisions that work. I have not seen a Linux
desktop that would be acceptable for me to even try to roll out in any area of the desktop. It's not that any of the software is not good enough, or not
capable, or can't do specific targeted work; it can probably do all that. But nothing I have seen even hints at the tools, management, recovery, centrally held data storage, user control,
and integration that I can get with Microsoft Windows.
Everything Microsoft have done ties together, from the desktop through to domain and server. It ties in and it works. That is why people use Windows. That's why Dell, HP, and the OEMs sell Windows. That is why most
corporations round the globe choose Windows. Until that is addressed, I do not see Linux making inroads on the desktop. If it were easy, it would be occurring by now; the lack of uptake seems, at least to me, an indicator
as to why. Perhaps we need a new distribution. 'Lintigration' might be a good name. And what would it be? An integrated Linux solution that ties in to a
server and offers bootable network installation, package and management solutions, user handling, and data placement: everything a professional would need from a server, desktop, printer, network, and integration perspective.
Linux replacements for areas like:
Central server or domain creation.
The ability to create images that can be delivered to any PC on the network: a SYSPREP-style automated install/update across the network.
Specialised routines for locking down systems, desktops, tools, and data storage on the 'domain'; system failure and recovery.
Tools like those for building and working with Windows desktops would be of great benefit. It's time Linux stopped looking at the desktop as an individual issue, and looked at a far more complete solution. If you can get that, the
desktop comes in range.
My fellow staff and I continue to evaluate Linux and its various distributions. We have had to swallow Microsoft's license changes and are very unhappy about it; I think that is repeated in companies around
the globe. But Linux doesn't give me the tools I need to handle the 'desktop', tied and operational INTO our and other people's business systems. It's falling short, and that's why I thought I would post this article.
About the author:
Darren Stewart is the network manager at the Gray Cancer Research Trust. He has worked in IT for the past 10 years with systems ranging from AS/400 and Unix to Microsoft server and desktop OSes, and for notable
companies such as AXA Equity and Law, Old Mutual, MCTWorld, Circle International, and Finance and IT Expertise Ltd. He is married with an 18-month-old daughter.
First, sorry for my rant last night; it was late.
But seriously, I think you should look at Linux terminal services. You have one or more servers which host all the applications, files, and other things users might need, such as printer connections. Then you can reuse your workstations and have them boot off the network, or use real X terminals (I have read about SunRays before, and I think IBM sells them too). If you need to update software, you just update it on your server and, boom, it works on all the workstation "terminals".
Have a look at http://www.ltsp.org/ and http://termserv.berlios.de/ for tools.
http://www.ncd.com/products/hardware/ncs/ for hardware X/terminals.
The thing with terminals is that they are "cheaper" than any workstation (yes, I know the servers needed are more expensive), and they won't need an upgrade unless they break down over time. Just put some spare terminals in a closet in every department that uses computers, and they can reorder new terminals themselves when they have used them all; it should be just like a phone or a TV. What you deliver is an infrastructure: you can auto-configure the terminals with BOOTP, DHCP, and the like, and then you only have to look after your Linux servers. Happy camper. This is what I would do if I had a network with 100 clients.
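To make this concrete: an LTSP-style thin-client setup is driven mostly by one DHCP stanza on the server, which tells each diskless terminal where its kernel and root filesystem live. A sketch of the relevant dhcpd.conf fragment follows; every address and path here is invented for the example, so check the LTSP documentation for the file names your version actually uses.

```
# Illustrative dhcpd.conf fragment for netbooting thin-client terminals.
# All addresses and paths below are made-up examples.
subnet 192.168.0.0 netmask 255.255.255.0 {
    range 192.168.0.100 192.168.0.200;                # pool for terminals
    option routers 192.168.0.1;
    next-server 192.168.0.2;                          # TFTP server with the kernel
    filename "/lts/vmlinuz.ltsp";                     # kernel image the terminal boots
    option root-path "192.168.0.2:/opt/ltsp/i386";    # NFS root for the clients
}
```

Once this is in place, plugging in a new terminal requires no per-machine setup at all, which is the "just like a phone" property described above.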
Also read this NewsForge article about the city of Largo; they have 400 terminal clients running on Red Hat servers. http://newsforge.com/article.pl?sid=01/08/10/1441239 A very nice read, I think.
Good luck in your research.
Quazion
No sane company deals with this area of procurement on cost alone. Many of the suggestions would work wonderfully IF I were creating a new network, or setting up a system for a company that was just starting out.
I cannot fully explain to the layman the kinds of detail required when taking a full Win32 shop and the users over to Unix. There are multitudes of problems that have to be considered, and just looking at individual items, mail servers, NIS, NFS, Samba is only half the story.
I can tell you for a fact that the company here will not move from MS Office, so in this instance I would have to use some VM option, or Wine, and make sure it worked across the enterprise. I have no doubt it's possible to configure. But I am reticent to say here that I think it's a viable, suitable, simple solution.
In addition, the programmers here use CVI, which is a bespoke Win32 compiler package. That in turn is distributed to many client machines, microscopes, scientific machinery, etc.
The mechanics of moving a business like this to Linux are neither simple nor straightforward, especially once you add in other items such as developments in serial/USB/FireWire, and the use of much equipment that tends to have only Win32/Win16 drivers.
I had already looked at nearly every available Linux distro, and in addition I have spoken with Caldera about Volution, and with Samsung Contact about mail here. I'd happily move all the desktops to a mail server like that and run LDAP; it's pencilled in for after April, budget allowing.
There are sections of the business that can be moved like that, and they will be, but that doesn't get away from my lack of knowledge, and the lack of advertised methods, for Linux.
Briefly, here is what I consider would be useful for a Linux Distro for someone like me:
1. Server Install, using the standard installer or some amended one I guess.
2. Services wizard, with additional items that tie into later wizards.
3. Linux DOMAIN/ENTERPRISE data storage settings, where you would select the data storage areas for users: /home, /mail, user settings, whatever else.
4. I suppose this has to go somewhere, I’ll put it here. Linux workstation bootdisk wizard
5. Linux workstation Image wizard, taking items from the above server wizard, domain/enterprise wizard to configure mail/NIS/NFS/Samba/other, and the storage of such image(s) for rollout via bootdisk.
6. Linux users and groups wizard, allows the domain user and usergroups to be built. I guess you’d have to store or collate the data from domain/enterprise settings wizard as well on this step.
7. The next step would be to unpack the chosen standard PC, fire up the boot disk, and download the image. This also covers PCs that fail in the future.
Now I have very much oversimplified the issues, but perhaps you can see what I am getting at. Something like the above, shipped with every distro, would aid people and businesses like me a great deal. If you look at ANY Linux distribution NOW, the machine-install issues are basically gone: it's a nice GUI, and it works.
Can't Linux, and those behind Linux, do the same for the enterprise, for new users and businesses? If the tools are there, it's a case of someone pulling them together and doing a GPL distro of their own, I would have thought.
DS
You do NOT have to use Linux. Stick with Windows if you don’t want to take some time to learn Linux. You spent time learning Windows and so did the end users.
Linux is not trying to be like Windows so some things are done differently. Accept that and learn how to do it in Linux.
The posts above describe different ways of doing it, and what each writer believes is the best option. Take one and go with it, or research to find what will work best for you.
“You are making a mistake that someone has the time to go to great depths in these areas. Sorry, real world means most departments have pressures that work against this, and thus, we are back at simple, easy to administer system and integration.”
With Linux, spend some time now and save lots of time and money later. That's what it seems to boil down to for most companies: TIME to learn and support = COST. In the end Linux would be cheaper to run.
As for the end users, Linux has done a great job (although there is still a little way to go) of making things easy for the inexperienced. Most distros have something similar to the Start menu, and with applications like StarOffice, which is compatible with Microsoft's formats, there shouldn't be any problems. Email applications are much of a muchness, but if you are worried about that, use Ximian Evolution, which looks similar to Outlook.
Sorry for ranting:~)
Well, I’m over my head posting in this thread, but thought I’d make one comment from the point of view of the average worker, sitting at a corporate workstation. First, thank you Darren for an article that produced so much response and ideas!
There is the usual conventional wisdom about Linux on the desktop in the office. Just from that point of view – of the worker at the workstation, this is obviously the easiest end of things. I know many have said Mandrake has been good for this, but I don’t know about it as I’m not as familiar with it as other distros. But, I know that Red Hat is ready on the client side. No more “almost there” – it is here. What they have done has surpassed everything and taken this to a new level for Linux. I cannot think of anything that would stop a worker from using it. They would have to learn a few new tricks, but nothing of any major consequence. There would be things that would happen, of course – users do dumb things sometimes or things they shouldn’t be doing. XP is so much better now than previous versions of Windows (well, 2000 too), but there is nothing I can see in Red Hat 8 that would prevent the client side from working well. It’s here!
And SuSE has announced its intentions for the workstation now. It will be very interesting to see what they come up with. Knowing SuSE, it will be very good. I say the client side is ready – now!
>1. Server Install, using the standard installer or some amended one I guess.
Red Hat has Kickstart servers, which allow you to set up an install from an image resting on a server, very much like the Solaris JumpStart system.
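For a flavour of what that looks like, a Kickstart install is driven by a plain-text answer file. The stripped-down ks.cfg below is only a sketch: the directives are real Kickstart syntax of that era, but the server address, partition sizes, and package group are invented examples.

```
# Illustrative Red Hat Kickstart file (ks.cfg) -- values are examples only.
install
nfs --server=192.168.0.2 --dir=/exports/redhat   # install tree on the network
lang en_US
keyboard us
rootpw --iscrypted $1$examplehash                # placeholder password hash
timezone Europe/London
clearpart --all                                  # wipe existing partitions
part / --fstype ext3 --size 4000
part swap --size 512

%packages
@ Workstation Common                             # example package group

%post
# site-specific tweaks run here after the install finishes
```

One file like this on the server defines the whole build, which is roughly the role SYSPREP answer files and Ghost images play on the Windows side.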
>2. Services wizard, with additional items that tie into later wizards.
You can elect which services are installed and run on startup. I fully admit that distros have to make this easier; some do and some don't. All of the commercial distros have tools to specify which services are installed, through package management, and unfortunately another tool to control which are actually started on bootup.
>3. Linux DOMAIN/ENTERPRISE data storage settings, where you would select the data storage areas for users /Home,/Mail, usersettings/whatever else
This is my favorite. Most network shares for storage should be handled through Samba and the Linux tools for smbmounting, such as LinNeighborhood, which is really good once configured with the user's name, password, and file manager. However, all users should have an NFS-mounted home directory, which means all your users' home-directory data is in one place. Management should be handled through NIS combined with TCP wrappers. I know that both SuSE and Red Hat have graphical tools for configuring both services, on your servers and on your clients. Think outside the NT-domain box.
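As a concrete sketch, an NFS-mounted /home is a one-line affair on each client; the server name and export path below are invented for the example.

```
# Illustrative /etc/fstab entry: every user's home comes from the server.
# "fileserver" and /export/home are placeholders for your own site.
fileserver:/export/home   /home   nfs   rw,hard,intr   0 0
```

Combined with NIS for the account database, the same login then works at any workstation, and backing up one server directory backs up everybody's data.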
>4. I suppose this has to go somewhere, I’ll put it here. Linux workstation bootdisk wizard.
Graphical tools for creating boot disks are included with every major distro. Quite honestly, I forget how this is handled with the Kickstart configuration; Google for Red Hat and Kickstart, or download some of their PDF docs on this.
>5. Linux workstation Image wizard, taking items from the above server wizard, domain/enterprise wizard to configure mail/NIS/NFS/Samba/other, and the storage of such image(s) for rollout via bootdisk.
In Kickstart the image is contained on the server, and the install picks it up along with the settings for NIS/NFS and other information. A graphical tool for configuring Kickstart is included with Red Hat 8.0.
>6. Linux users and groups wizard, allows the domain user and usergroups to be built. I guess you’d have to store or collate the data from domain/enterprise settings wizard as well on this step.
The best all-round tool for configuring and maintaining server settings for NIS, NFS, DNS, and more is not distro-specific: it is called Webmin. Check it out. I set up a good deal of this initially with distro tools, but for day-to-day work Webmin, with its web-based interface, is great.
>7. The next step would be to unpack the chosen standard PC, and fire up the boot disk, and download the image. This covers pc’s that fail in the future.
Unpack the machine, fire it up from the boot disk, and it connects to the Kickstart server and installs the OS from the settings held there.
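That step amounts to one line at the installer's boot prompt; the server address and path here are illustrative placeholders.

```
# At the "boot:" prompt of the Red Hat install disk, point the installer
# at a Kickstart file held on an (example) NFS server:
boot: linux ks=nfs:192.168.0.2:/exports/ks.cfg
```

From there the install runs unattended, which covers both the initial rollout and rebuilding a PC that fails later.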
The key is that IT folks are absolutely locked into the Windows way of doing things, as opposed to the age-old Unix techniques for handling the same tasks. If Windows works for you, this is not a problem; I am opposed to switching platforms for purely "religious" reasons of hating MS. However, most of the facilities you talk about did not originate in Redmond:
Network storage (NFS)
Domain setup (NIS)
Unified profiles (NFS mounted home dirs)
Unified installs of servers and workstations from templates or images (Jumpstart)
Unified remote configuration tools across even different types of Unix-based platforms (Webmin and other tools)
Let me remind you, I wrote my comment based on the idea that LINUX is now pitching for this AREA. I have no problem with that. I like Linux. But what seems to be missing here is the basic understanding that whatever you may think, companies are not likely to go and retrain their entire technical teams, and all their users based on your comments.
They aren't going to have to retrain their entire technical team, but of course they would have to retrain their desktop support staff. Linux is a different operating system; supporting Linux boxes from an OS perspective is entirely different from supporting Windows from an OS perspective. At some point the Unixes might create a clone admin system, but the work is so different that it's highly likely the tools will need to be different.
If Linux was pitching for server space and its normal areas I would have no problem. But its recent foray into desktop territory is worthy of discussion.
When Linux was pitching for the server area you heard the same types of arguments; I suggest you look back to discussions from 1995–98 on Linux servers vs. AIX, Solaris… Back then people argued the Linux server configuration tools were too different from the AIX tools and nobody wanted to retrain their staff…
What is also clear, and I mean no disrespect to you guys who favour Linux, is that Windows compares even better against your suggestions than when I first made my comment. Looking at the suggested pages, tools, and options, it's a disappointing mish-mash of variable tools, all created without a vision or any unification in mind.
That’s the issue of organic vs. created technologies. In general there is no question that created technologies tend to be easier to learn. Conversely organic technologies tend to handle a wider variety of issues better in practice.
Also, again you ignored the key point that problems you have with Windows you simply don't have with Linux, because the Unixes have been supporting LAN/multiuser systems from the start. I, along with several other posters, have noted huge numbers of different paradigms for achieving the same effect, which you have ignored.
Finally, "Linux" isn't unified; Red Hat, Mandrake, and Debian are each unified. You shouldn't expect Red Hat and Debian to be any more unified than Corel Office and Microsoft Office.
Many of you even attack the end user, the very people who would be your customers. That in itself is a cardinal sin.
This is free software, there are no customers, there are members of the community. Part of Linux is getting away from the customer / provider model.
No one has really dealt with the issue of retraining the users.
Because that's an entirely different issue. In general, most studies have shown that Unix/mainframe setups require less user training, because the systems are more configurable for the administrators. But yes, depending on their level of skill, users are going to require some retraining. You do that retraining the same way you do any training.
Very few people covered the integration of office tools, and interoperability with other companies who would still be working on MS Office-based solutions.
First of all, there are multiple office paradigms in Linux, so it's a broad question. As for interoperability, it's at this point so-so.
Another comment: "2. Protection of system files: Mr. or Mrs. Desktop Luser has an account and password to log into the machine. But not root access… so they cannot mess up their system. And if they do, just restore the backup of their home dir that you did last night during the scheduled cron job, possibly even residing on a CD-RW right in their own desktop machine. You did backup, didn't you?"
Basically, I know it's a shock-horror, amazing idea, but admin access is usually replaced with 'power user', which acts the same way in the Windows environment. I then have the suggestion that I run local cron jobs on users' machines, backing up their /home directories to a local CD-RW, followed by the sarcastic question of whether I or the user backed up.
So let me get this right: either I or the user is going to walk round the entire enterprise each night, mounting and unmounting CD-RW drives and doing local /home backups, even when I have said that storing data on a local machine, in today's age of one-year hard-drive warranties, is the most braindead, stupid, dumb-assed, lame, LAME, LAME way of handling this, something clearly stated in my original comments. At this point my regard for 'Mr or Mrs Desktop Luser' is higher than my regard for this comment.
You don’t have to store anything on the local machines. Heck you don’t even need them to have harddrives at all. As for cron, cron is automated you don’t walk around doing anything.
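To make the "cron is automated" point concrete, a nightly server-side backup of an NFS-hosted /home is a few lines of shell plus one crontab entry; nobody walks anywhere. This is a minimal sketch: the `backup_home` function name, the paths, and the schedule are all illustrative assumptions, not anyone's actual setup.

```shell
#!/bin/sh
# Minimal sketch of an automated nightly /home backup.
# Function name, paths, and schedule are illustrative assumptions.
#
# A matching crontab entry (01:30 every night) might be:
#   30 1 * * * /usr/local/sbin/backup-home.sh

backup_home() {
    src="$1"    # tree to back up, e.g. /home on the file server
    dst="$2"    # destination, e.g. a mount from the backup server
    stamp=$(date +%Y%m%d)
    mkdir -p "$dst" || return 1
    # archive the whole tree into a dated tarball at the destination
    tar -czf "$dst/home-$stamp.tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")" || return 1
    echo "$dst/home-$stamp.tar.gz"
}

# Self-contained demo against throwaway directories
demo_src=$(mktemp -d)/home
mkdir -p "$demo_src"
echo "quarterly report" > "$demo_src/report.txt"
backup_home "$demo_src" "$(mktemp -d)"
```

Run from cron on the server that holds the home directories, this replaces the walk-around-with-CD-RWs scenario entirely; restoring a user is then a matter of untarring one archive.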
Many of you have suggested a remote NFS /home, which I would have on a server running a nightly backup.
I am not sure how to say this, but it's important to remember you have to convince, and operate, a system that starts and ends with the user and the company. Some of you have stated good ideas, tools, and methods. But no one as yet has supplied me with an all-round picture of how an all-Windows business could successfully carry it off.
Why do they want to? That's the main question. You seem to be happy with Windows costs, Windows configuration, Windows administration, and the Windows paradigm. In that case, stick with Windows. An all-Windows business that is satisfied should probably stick with Windows.
Most businesses, however, are not all Windows. The core apps reside on a mainframe system. Most network services are provided by a mish-mash of NT Server, Unix boxes, and custom OSes. The desktops and laptops are a mish-mash of Windows versions, with each user having some collection of about a dozen apps out of the several hundred the company provides and has to license.
Those sorts of businesses can transition to Linux. They have Unix administrators to set up desktops. Many of their apps are really mainframe apps, and the interfaces could be much more easily provided over X and web interfaces than as client-server VB apps. Their users are already trained on using terminals within the NT environment. So they start transitioning more and more of their apps over to X interfaces, and then they switch the desktops over.
That's the market Linux is aiming at, at least with the desktop sales pitch. Whatever the solution is, it's got to be unified, workable, and downright simple, both for technical staff and the end user, and with no, or at least limited, loss of functionality. It's not the Linux market, guys. It's a whole new ball game.
The corporate market was terminal-based for decades before the move to PCs became core to the business. The PC transition has been an expensive, very labor-intensive nightmare. What the PC offered that mainframes didn't was user freedom. The lockdown, simple-image solutions you are talking about take away the main advantage of PCs. What most companies have in practice is a default starter image that gets totally customized, at least at the department level, and is thus impossible to support at a reasonable cost.
Most Windows shops have specific software, bespoke to Windows, with many tools, utilities, licenses, and assets tied into what they have. When you have made your grand switch to Linux, and the users can't use the system you have put in, and none of their software runs even if they could use it, you'll have to find answers and solutions.
Sorry to rant, but some of you simply are not being realistic. You are going to have to have a solution that is better, simpler, cleaner, easier, and lower cost, before people will even consider it.
See above.
I agree with some of your comments, in particular that of the already mixed shop with terminals and mainframe systems.
Your comment below:
You don’t have to store anything on the local machines. Heck you don’t even need them to have harddrives at all. As for cron, cron is automated you don’t walk around doing anything.
That was seemingly a mistake: I was ranting at the Linux user who suggested it (because it was unspeakably stupid), not suggesting local storage and CD-RW backup myself.
DS
Linux’s path to the desktop is blocked by 5000 angry geeks who demand that things remain unfriendly.
‘It’s the *nix way!’
‘RTFM!’
‘It’s a feature, not a bug.’
‘If we make it easy to use, people who don’t know what they are doing will use it.’
‘I like it this way, make your own.’
‘You’re just trying to make it like Windows.’
‘I don’t have time to make a readme/help file, I’m adding more [broken] features.’
‘Works for me.’
Your article is nothing more than thinly disguised FUD. It's a clever, well-written article, but underneath the oh-so-smooth arguments it's FUD. Many posters have already offered up various methods, and still you come back to your original premise that Windows (take your word for it) is better. LARGE HORNED ANIMAL KAKA!
I also deal with a large network that consists mostly of WinXP systems. I am right now using Ghost to clone an XP box. This is the second attempt, because, as so often happens with XP, the clone fails after WinXP detects the hardware on a slightly different system from the one the clone was created on.
Your claim that it is less time-consuming to use Ghost with all the various whiz-bang Microsoft tools than to load Linux from scratch is bogus, and was a loaded statement in favor of Microsoft products. It's based on the presumption that it takes longer to load Linux than Windows using Ghost. Sorry, no sale here. A basic Linux workstation is going to take in the range of 10 to 15 minutes from start to finish, and requires less intervention during the install process than any version of Windows.
Oh, BTW, my Ghost install just finished for the second time and XP just hosed itself for the second time on normal industry-standard hardware. So much for that claimed time saving with Ghost and XP. (I'm working as I type.) The weak link in your plan is not Ghost, it's XP. Another thing you neglected to mention is that Ghost is compatible with ext2 (and most likely ext3) according to their manual.
“No sane company deals with this area of procurement on cost alone. Many of the suggestions would work wonderfully IF I were to create a new network, or setup a system for a company who was just starting out.”
No sane company continues to use products from a vendor that constantly raises prices beyond the market norm. They wouldn't stay a company long if they didn't bite the bullet and dump the costly vendor at some point. That is what is starting to happen to Microsoft. Too damn expensive!
“I cannot fully explain to the layman the kinds of detail required when taking a full Win32 shop and the users over to Unix. There are multitudes of problems that have to be considered, and just looking at individual items, mail servers, NIS, NFS, Samba is only half the story. ”
The obvious attitude in the above says it all. I would wager that most of the posters in this thread are just as experienced as you. Your article and posts are nothing more than cleverly disguised FUD.
“””‘It’s a feature, not a bug.’
‘I like it this way, make your own.’
‘I don’t have time to make a readme/help file, I’m adding more [broken] features.'”””
Who have you heard say this?
“””‘RTFM!'”””
This predates the free unix clones, and is good advice if you are having problems regardless of the platform.
“””‘Works for me.'”””
Perhaps it works for them and they can't think of what's wrong offhand? Computers are vastly complicated pieces of machinery, and everything from hardware problems to version differences can cause issues. Again, this is pretty standard regardless of platform.
“””‘It’s the *nix way!’
‘You’re just trying to make it like Windows.'”””
You don’t use/admin NT or a Mac the same way you use/admin *nix; it’s a different paradigm. Forcing one on the other isn’t necessarily a good idea.
I have recently installed RedHat 8 Personal onto a second hard disk on my home PC. The first disk runs Windows 98SE. I tried RH 7.3 and Mandrake 9 before my purchased copy of RH 8 arrived. While RH 8 is MUCH better than the other distributions, it will not replace my copy of Windows 98 just yet. There are just too many 'techie' issues to deal with. Maybe after I have read my copy of the Linux Administration Handbook I will change my mind. I should also say, I won't uninstall RH 8; I will give it a go.
Ativo: with regards to your comments that nfs isn’t that bad, security wise, I must point out that security is not just against access from the ‘net. I don’t have my entire LAN on totally secure premises, so it is assumed to be untrusted to a degree. Anybody with a laptop could theoretically wander in and download, say, an accounts database shared over nfs or in the case of r/w homedirs insert a trojan. I don’t like that.
CIFS provides some measures for encrypted password-based authentication, at least. It's bad, but nowhere near as bad.
Tunneling NFS is an option, and one I haven't explored well enough to comment on, really. However, I'd like to note that it uses RPC services etc., so it could be difficult to tunnel, and it's also fairly inefficient already.
To Sla7er and others: regarding XFree86 and its memory usage. Sure, it LOOKS like it eats a lot of RAM. But XFree86 counts your AGP aperture and other mappings toward the amount of RAM it appears to eat. In reality, the majority of that RAM is NOT, I repeat, is NOT used. It might look like XFree86 uses 150 megs of RAM, but in reality it uses just a fraction of that amount.
It’s fairly plain to me by this point that the author just isn’t interested in an answer. Either the author is willfully ignoring the suggestions proffered, or his lack of knowledge about OS’s outside Windows is tripping him up and he’s not interested in learning.
I work with both *nix and Windows. This isn’t hard. But you can’t want “Windows under another name.” Linux is not Windows. Forget about how you do it with Windows, and put forth some effort to learn how to do it with Linux.
The author wants a remote-installable desktop with presets that is able to migrate across platforms. Sysprep was mentioned. A number of solutions were presented. First is the network install option a la Winnt.exe or RIS. Forget Sysprep, it's crap. Linux installs can be performed from floppy, CD, or BOOTP boot to an over-the-network NFS or FTP install from a central server. Various Linux distributions have various means of providing the customization, such as kickstart. Mandrake has the handy option to save package selections and whatnot to a floppy, like an unattend.txt file. You can also use 3rd-party imaging software to create distribution images stored on a central server for pre-positioned re-imaging. This has been possible with tools like Ghost for a long time, and you don't have to worry about requirements that HALs and disk controllers be identical.
The author was concerned about central management of security, applications, data, etc. This has been possible on Unix for years. The author doesn't seem to understand that everything not necessary to boot the workstation can be located and administered remotely from the workstation using NFS. Prior to the PC "revolution," users connected to their home directories with their files and apps on a mainframe. The admin controlled access to applications, files, and updating thereof from the mainframe. Insert "remote Linux server" where you see "mainframe." Use LDAP (Active Directory is an LDAP directory service with DNS and ACL integration – you can even configure Linux workstations to authenticate to W2k AD domain controllers), NIS, truly standard authentication protocols such as, ahem, KERBEROS, hell even Novell NDS for universal naming, authentication, and resource management.
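To make the directory-service point concrete: on a Linux workstation, account and group lookups are routed through the name-service switch, so pointing a desktop at NIS or LDAP is a one-file change. This is an illustrative sketch only; the actual NIS domain and LDAP server details would live in their own client configs, which are assumed here.

```
# /etc/nsswitch.conf (sketch) -- consult local files first, then NIS,
# then LDAP, for account and group lookups
passwd:  files nis ldap
shadow:  files nis ldap
group:   files nis ldap
hosts:   files dns
```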
I could go on but really this post is too long as it is. Linux can do what you want it to do, but it requires putting forth some effort in learning the OS (just as it requires some effort to learn Windows administration). The knowledge of how to do it isn’t going to fall on your head like a gift from on High. If you want a single vendor point of contact to put it all together for you, and hold your hand when you break it, talk to a company that is serious about Linux. Like IBM or, wonder of wonders!, one of the bigger Linux distros like RedHat. That’s precisely the kind of business RedHat WANTS.
Derek
The city of Largo saved millions by doing this – all clients are the same, all setups are done on the server – and for 400 terminals, the server they are using is a dual Pentium 933 with 3GB RAM and SCSI drives.
Simple admin, simple rollout, save millions – why do anything else?
see the story on
http://newsforge.com/article.pl?sid=01/08/10/1441239
I have managed PCs in the past: Windows, and Linux too. Linux can be customized. The domain is _not_ an M$-unique idea; yp and NIS existed long ago.
I think you are missing turn-key enterprise solutions for Linux. It is much easier to provide them for a single product or a small set of products, like Windows. That may be caused by the diversity of the distributions and the speed at which they evolve. I don't expect support from the big software companies. A general solution could be providing application generators, with whose help you would be able to customize closed-source applications for local requirements. There are other attempts too, e.g. the Red Hat technical desktop; they are planning two years between general updates.
You bitch about not having the tools you need.
You bitch about not having been informed about them.
You bitch about Linux being hard.
Stop with the bitching, already.
Remember that Linux has a strong Unix heritage, and has been doing the things you want long before Windows ever tried to bolt on network support, much less the poor implementation of multiple user support.
You want roaming profiles? It’s called nfs mounts with the aid of the kernel automounter.
You want server-based mail? pop3. You want it stored on the server? imap4.
You want a file server? Use NFS where appropriate. Use Samba for even better compatibility across operating systems.
You want highly automated software rollouts? Use one of the previously mentioned tools to create a custom Debian install CD and learn the magic of apt-get.
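The roaming-profile answer above (NFS home directories via the kernel automounter) can be sketched as an autofs configuration. The server name and export path here are placeholders:

```
# /etc/auto.master (sketch) -- hand the /home tree to the automounter
/home  /etc/auto.home  --timeout=60

# /etc/auto.home -- mount each user's home on demand from the file
# server; the "&" expands to the lookup key, i.e. the username
*  -rw,hard,intr  fileserver.example.com:/export/home/&
```

With this in place, any user logging in on any workstation gets the same home directory, which is most of what a "roaming profile" means.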
So let’s think about what you get in exchange for actually going off and educating yourself instead of sitting on here and whining:
1) Reliable software that tends to concern itself less with controlling the user and more with getting the job done.
2) The tradeoff of point and drool interfaces with software packages that work predictably, as in they don’t try to second guess what you want.
3) The idea that robustness sometimes means “Can’t mount home directory” is a valid error message, and not “Can’t find roaming profile. Using remote, which will overwrite everything on the server, including all mail and settings to a revision level that was never used to begin with”
4) Most of your mission critical software is supplied by people that are concerned with making the best tool, and not “how many ways can we get the user to prove that the software is not pirated, even to the point that we’ll regularly break it for the valid user in fear a single invalid user might get a copy”
5) Software that is designed on the old Unix philosophy “One tool, one task”, not “If we integrate the web browser into everything, we’ll have an excuse that it’s irreversible by the time anyone realizes we just did it to maintain our own monopoly”
6) An overall reliable system that requires fewer people LIKE YOU to administer. As little as one-quarter the manpower by the last estimate I read.
A word of advice: No argument about Linux on the desktop is so weak as one followed up with “I didn’t mean the desktop, I meant the server!”
-craploader
Methinks you didn't bother to read the comments.
You are probably the first of the next dozen people who will start telling us about NFS.
We all know that you can replace office suites and default network services. But how are you going to migrate if your main application is a native Win32 application, with customized code to print special reports or labels to specific print devices? I never hear anyone talking about those details, which in the end are maybe most often the reason why people can't migrate away from Microsoft.
Maybe we should stop discussing all the little details of solutions which could help, but take some time to make some conclusions? I’ll start with giving mine.
The author was not bitching about Linux. Neither was it FUD. It was a good explanation of the average view Windows SAs have of Linux. We all know you can use NFS and Samba and authenticate Linux users on Active Directory, etc. etc.
The only important point that – IMO – the author wanted to make is about the lack of integration of Unix products, and the lack of a Linux distribution presenting a nice boxed, customized product that provides all this by default, without you having to download an extra rpm or compile an extra library.
Unices are technically better and outdo Windows in this respect. Period, end of line.
Windows is better at providing an "easy to set up" (although not idiot-proof) product, one that presents (and this is important, IMHO) a _fully_integrated_solution.
Following on from this, the next point to take is realizing that if such a Linux product were made, it really could help to have it deployed and replacing Microsoft products. Is it a logical way? Is it the right way? I don't know, but it probably just is.
The company I work for does, that is.
We use systemimager [www.systemimager.com]
These are clusters of Linux nodes used for parallel computing, typically arranged in clumps of 32 nodes [64 CPUs]. The process would work just as well for desktops.
You get one node right then duplicate it.
Linux handles different hardware very well in my experience; the RedHat Kudzu and X11 configuration utilities work out of the box for most hardware.
There is nothing missing from the current offerings of office applications (KOffice, OpenOffice, AbiWord, etc.)
except that they are not Microsoft, and so they are different and the problems you will get will be different. In the end, though, the software is open source, so more people can have a crack at supporting it, and you are not tied to the marketing whims of a single vendor.
LUI – http://oss.software.ibm.com/developerworks/projects/lui/ – The Linux Utility for cluster Installation (LUI) is an open source utility for installing Linux workstations remotely, over an ethernet network.
SystemImager – http://systemimager.org/ – SystemImager automates the installation of Linux to masses of similar machines. It is most useful in environments with large numbers of identical machines. Some typical environments include: Internet server farms, high performance clusters, computer labs, and corporate desktop environments where all workstations have the same basic hardware configuration.
GNU Parted – http://www.gnu.org/software/parted/ – GNU Parted is a program for creating, destroying, resizing, checking and copying partitions, and the file systems on them. This is useful for creating space for new operating systems, reorganizing disk usage, copying data between hard disks and disk imaging.
Partition Image – http://www.partimage.org/ – Partition Image is a Linux/UNIX utility which saves partitions in many formats to an image file. The image file can be compressed in the GZIP/BZIP2 formats to save disk space, and split into multiple files to be copied on removable floppies (ZIP for example), …. The partition can be saved across the network since version 0.6.0 .
Learn to use http://freshmeat.net/; there is a lot of software out there.
If the author is serious about even liking the possibility of running his shop on Linux, I would like him to contact me for a little tour of our shop. We are currently running 60 Linux desktops and intend to roll out hundreds more. The only support calls we have ever had on these 60 machines are two for blatant hardware failure. A successful corporate Linux rollout first requires a little research and planning.

I feel the author is trapped into the MS way of running the enterprise, and I can tell you it is just plain wrong. The MS way of doing things optimizes the sale of more licenses, not the way things are run. The only corporate solution that is in any way viable is to run the machines like a terminal server. Linux by its very nature is intended to be run this way. This will give you a very supportable and stable solution. All of our machines run a desktop that we can fully change on the fly, and software is only installed on a single server. Imagine, in the Windows world, being able to deploy pre-tested software to every desktop in less than a second. Users are unable to munge up the machine because they run in a totally controlled sandbox. I am willing to fully share my knowledge with you; the question is, will you take the next step?
Sounds to me like you want Windows, not Linux. Sounds like you grew up on Windows and can’t shake the FAT CLIENT mentality. Linux is based on the UNIX model. All the things you describe are available in UNIX and have been for years – before MS was doing it. Well, most things – some of what you describe – like the windows registry – don’t translate to the Linux/UNIX world. If you want Windows or a system that acts like Windows, buy and use Windows. If you want a system that is cheaper, stable, includes productivity apps, and has an interface that a Windows user could quickly learn – educate yourself on Linux and the desktop options available. Don’t expect Linux to be a drop in replacement for everything you learned on Windows.
Desktop Linux has the ability to integrate with a windows network – that doesn’t mean it’s going to do everything like Windows.
Server Linux can act like a Windows PDC. That doesn’t mean it replicates every nuance and includes Windows utilities.
If you are truly interested in a Linux-based network (desktop and server) then you MUST learn the file structure for Linux, things like Kickstart (if you don't know what it is – you haven't tried too hard to learn Linux), LDAP, NFS (again – if you don't know this – you haven't learned much about Linux), SAMBA. You could check out the Linux Terminal Server Project for some ideas and hints.
I understand your complaints and frustrations. But before you write an editorial claiming "Linux doesn't have this Windows utility" or "Linux doesn't have a wizard like Windows" – you should gather these questions/issues (and they are good ones) in a list and find out how you can do the equivalent in Linux. Because you won't be able to do it exactly the same as Windows. If Linux worked exactly the same as Windows – it wouldn't be Linux – it'd be Windows!
Centralized settings for desktops is not a real problem.
If you have centralized home directories, you have
centralized settings. Want all users to have the same background image? Easy, just deny the users write permissions to the kdesktoprc file in their home dir and then change the wallpaper network-wide with a simple shell script. You don’t even need a centralized home dir for that, just looping over a network range with an ssh command is enough. Accomplishing the same with pre-built disk images is even easier, since you have absolute control over what is in the home directories. The same goes for pretty much any other setting. Need to run a program when a user logs in? /home/user/.profile, or .bash_profile, .xinitrc –
These all get run at some point or other, you can add your stuff to there.
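The locked-down-setting idea above can be sketched in a few lines of shell. This demo uses a throwaway directory as a stand-in for a user's home; the kdesktoprc path and the Wallpaper key are illustrative and vary between KDE versions.

```shell
#!/bin/sh
# Sketch: enforce a desktop setting by writing the config file as the
# admin and then revoking the user's write permission on it.
HOMEDIR=$(mktemp -d)              # stand-in for /home/someuser
mkdir -p "$HOMEDIR/.kde/share/config"
RC="$HOMEDIR/.kde/share/config/kdesktoprc"

# Admin pushes the mandated setting...
printf '[Desktop0]\nWallpaper=corporate.png\n' > "$RC"
# ...then removes all write bits so it sticks for ordinary users.
chmod a-w "$RC"

# A non-root user's attempt to change it now fails at open time.
if printf 'Wallpaper=cats.png\n' >> "$RC" 2>/dev/null; then
    echo "overwrite succeeded (probably running as root)"
else
    echo "setting locked"
fi
```

Network-wide, you would run the same printf/chmod pair over each host with ssh, as the comment describes.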
The point is, this is a nonexistent problem, solved years ago. Bootable disks with network tools have been around forever. Want to clone a system over the network? netcat and tar are all you need. Want to clone disks and burn them to CD? Just use mkisofs to make an image file, use tar to clone the system, make a boot floppy with some tools and a script that auto-installs the desktop, and you're all set.
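The netcat-and-tar cloning mentioned above boils down to one pipe. This sketch clones a small stand-in tree locally; the commented nc lines show where the network would be spliced in (the hostname and port are made up):

```shell
#!/bin/sh
# Clone a directory tree: tar-create piped straight into tar-extract.
SRC=$(mktemp -d); DST=$(mktemp -d)

# Build a tiny stand-in "system" to clone.
mkdir -p "$SRC/etc" "$SRC/home/user"
echo "hostname=master" > "$SRC/etc/hostname"
echo "user data"       > "$SRC/home/user/file.txt"

# Local clone. Over the network the same pipe splits across netcat:
#   receiver: nc -l -p 9000 | tar -x -C /mnt/newdisk
#   sender:   tar -c -C / . | nc receiver 9000
tar -c -C "$SRC" . | tar -x -C "$DST"

diff -r "$SRC" "$DST" && echo "clone matches"
```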
Need the system to be robust and handle hardware changes easily? Sheesh, linux has been like that for years. Especially with modern PCI hardware. The only problem I can see is in multiple X configurations, and for that sort of thing there are daemons like kudzu.
Take a look at things like Tom’s Root Boot (http://www.toms.net/rb/), I highly recommend it.
In truth, I don’t really see why the author can claim to be informed about what Linux can and cannot do in this area. Maybe there are no point-and-click tools for it, but truthfully if you’re an admin for this kind of network you shouldn’t be afraid of the command line.
I love linux vs windows debates.
Some comments refer to a lack of a single Linux distribution, yet I don't have a clue what version of MS Windows you are referring to! I very much doubt that your comments pertain to just any version, and second, they are probably not exactly the same across versions.
However, it is extremely clear from many of the posters that the Linux tools are widely used, have been available for ages, and work with little effort on other Unix-based OSes.
When MS provides a full featured Office package at no additional cost, I might be more open to MS. Until then, no way is MS windows a consumer OS!
The feedback has been great. Some has been harsh, but my original article may have been received as harsh on Linux (it wasn't meant to be, guys), and my lack of senior skills with Linux was the background to the question.
Conclusions
1. Many skilled Linux people misunderstand why people ask stupid questions.
2. Many people who ask questions or have queries regarding Linux are not stupid, unpleasant or anything else.
3. Sometimes Linux IS going to be more like Windows, whether you like it or not.
4. Linux sometimes benefits from Windows-like tools and configuration.
5. Many of my queries about what Linux can do have been answered or take me in the direction someone would need to go to do the work.
6. Linux CAN do what people want the end product to be.
7. Linux is highly adaptable, but not the easy solution in some target areas.
8. Windows does offer an integrated solution that SEEMS more integrated and offers the simpler option to the layman/inexperienced/business person.
9. Linux as a whole HAS the tools to do the work, but it seems not often advertised or known to outsiders.
Thank you all for your feedback. In particular I would like to thank those who came back with constructive critique, ideas, possibilities and answers, and to those who replied to my own email address.
DS
Maybe you have never heard of any other operating system besides Windows, but Microsoft is still playing catch-up to what was readily available in Unix in the 1980s. Disk imaging? It's built in. Type 'man dd', not that you'll ever need it. Norton Ghost was built as a workaround to the primitive DRM (copy-prevention mechanism) Microsoft attempted to use. A straightforward copy can create an identical (working) system image except for a few basic file-like structures that can be manually created, like /dev and /proc (type 'mkdir /dev /proc'). What's more, you don't have to copy the whole image. You can create separate partitions however you like (typically /boot, /home, /var, /usr, /usr/local) and use the same base image for all systems, with possible variations as separately mounted partitions for, for example, developers vs. sales reps.

You can have *very* fine-grained access control. You can use bootp or any of a number of other remote-boot scripts. You can have networked home directories or application directories, or remotely running applications (without paying for Citrix or Windows Terminal Server). You can use SNMP to monitor the network. You don't need Primary and Secondary Domain Controllers; those are artifacts from the days when Microsoft didn't understand DNS. You can share any directory much more easily and more securely than in Windows. And you can share with Windows (visit samba.org) if you have to. Chances are, some of the "shared" directories that you use right now are Samba on Unix and have been for years. But NFS is much better than NETBEUI, believe me.

And it isn't hard. 90% of this stuff works automatically. Just check the little box and click "install." When you have a working system, just propagate it over the network. I could train your IT staff (say 5 techs and an admin) in a week for half the price of your 100 Windows client licenses, or do it myself for the same.
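The 'man dd' point can be demonstrated with a file standing in for the disk, since dd reads any file or device the same way. On real hardware the input would be a block device; the /dev/hda path in the comment is illustrative only.

```shell
#!/bin/sh
# Image a "disk" with dd and verify the copy is byte-identical.
# Real-world form:  dd if=/dev/hda of=/srv/images/master.img bs=4M
DISK=$(mktemp); IMAGE=$(mktemp)

# Fabricate a 1 MiB disk of random bytes.
dd if=/dev/urandom of="$DISK" bs=1024 count=1024 2>/dev/null

# Take the image, then compare byte for byte.
dd if="$DISK" of="$IMAGE" bs=64k 2>/dev/null
cmp -s "$DISK" "$IMAGE" && echo "image verified"
```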
Send me an email and we can negotiate a minimum quality of service arrangement and I won’t require any maintenance “contract”, but will offer 24/7 on site service at reasonable per incident rates.
There are a lot of things to note that seem to be wrong with the reasoning behind this article. Most of the things he is asking for are already available for Linux or aren't necessary. First off, most modern complete Linux distributions are able to detect most standard hardware configurations, especially on an x86-based system. While a few sound cards or software modems (which normally wouldn't be used in a business environment) are still on the undetected list, almost all other hardware made in the last 4 or so years is supported in Linux. The need to create master images based on system configurations is unnecessary. The need for "central server or domain creation" exists, and thankfully you can just install a Linux-based system off the network the same way you do a Windows install; with some distributions you can even write up a small text-based script that pre-sets the installation parameters and picks the necessary packages, so you can let the system install itself over the network.
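The "small text-based script" described above is, on Red Hat, a kickstart file. The following is a rough sketch only: the NFS server name, partition sizes, password, and package groups are placeholders, and real group names vary by distribution version.

```
# ks.cfg (sketch) -- hands-off network install
install
nfs --server install.example.com --dir /exports/redhat
lang en_US
keyboard us
network --bootproto dhcp
rootpw changeme
timezone America/New_York
bootloader --location=mbr
clearpart --all --initlabel
part / --fstype ext3 --size 4096
part swap --size 512

%packages
@ GNOME Desktop Environment
openssh-server

%post
echo "built by kickstart" > /etc/motd
```

Boot the installer with `ks=` pointing at this file (floppy or network) and the machine builds itself with no prompts.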
With regards to his comment about LDAP and roaming profiles: LDAP is supported in Linux, and using NFS one can easily have a user's profile stored on the network, allowing for a roaming profile just as in Windows. All of their files and settings will go with them.
With regards to your other need, "Specialised routines for locking down systems, desktops, tools, data storage on the 'domain'. System failure/recovery.": locking down systems isn't very difficult; most graphical managers have a "lock" option no different from the one in Windows, and you can also set a screen saver to lock once it turns on. There are a number of user-friendly desktops, many of which look and act very similar to Windows; in my experience, end users have had little or no problem migrating from Windows 9X to the KDE graphical environment. Data storage on the domain is not a problem. There are two ways to set up domains, but since you are already running a Windows-based network I assume you would want to use Samba, which allows Unix machines to act as Windows network clients and servers; you can join workgroups, log onto Windows domains with the client, and create shares, Windows domains, and domain controllers with the server. Recovery tools aren't usually an issue since Linux environments simply don't really need them, due to their stability. This is just a little bit of my complaint with this article. I really don't think the author looked around hard enough for the solutions he claims to want out of Linux.
The most irksome thing about your article is not your lack of knowledge. It is your unwillingness to concede the need for knowledge acquisition in your current environment.
You were not born knowing the things that you know. You had to learn them. When Microsoft trotted out Active Directory, VPN and however many other things, you did not already know them. You had to learn. That’s the way it is, the way it always has been, and the way it always will be. New things require learning — unless, of course, you are satisfied to take your chances and open your backside to whatever the world may see fit to fling at you.
To learn the Linux way of doing things requires a learning commitment, just like the Microsoft way of doing things.
Of course — there is a buyback. My experience is that the Linux way of doing things will significantly cut firefighting efforts, thus leaving time to learn new and useful things.
As others have pointed out, you apparently have no real interest in anything but Microsoft. I personally rolled out over 280 servers using SystemImager from http://www.systemimager.org/. I create a single machine configured the way I want it and take a snapshot of the system. I can then replicate that setup on as many machines as I choose. I was able to roll out the 280 servers I mentioned at a rate of 1 every 3 minutes. Each machine, when it comes up, is configured with its own network address and hostname. If a machine loses a hard drive, I can replace the drive and reimage the machine. If I need to update all of the systems, I go to my image server, change the image, and then via an automated script I can have all servers updated. These machines are not local to me either. I work from my office in Texas with machines in the DC area and in San Jose. I could just as easily do the same thing for desktop systems. In combination with technology already discussed, like LDAP, NIS, NFS, and Samba, I can do everything you do with your Windows boxes at a fraction of the cost and have a more stable and reliable system on top of that. Other people at my company have used this same method to install over 1000 systems in just the past months.
‘duel’ booting is what I do when I close my eyes and fire up the Windoze machine I have to use here at work.
I think a very small amount of effort on your part could find these solutions in Linux. Go to http://sdb.suse.de or http://www.redhat.com/apps/support/ and do some reading. Type some of your issues into Google and do some reading.
I also like the earlier comment about talking to Linux Vendors. Copy your FUD piece and email it to SuSE, they could offer solutions to all points, I am sure.
TrT
All the problems mentioned are problems specific to the shortcomings of Windows (ghosting keys/computer names, resetting the complete hardware list, having difficulty managing people's remote desktops, etc.). These problems do not exist in the Unix world. It is as simple as that.
My company runs 1000 Sun Unix desktops – they require less support than their windows equivalent.
All the apps live on the server ( They are installed ONCE ).
The application is cached using NFS CacheFS (i.e. the application is only ever read down from the server when it changes).
All the user data lives on the server.
Any user can use any machine – and it will always look the same when they login.
The operating system is installed by typing “boot net – install” – nothing else.
Most of this functionality is there in Linux. A lot of this functionality is not there in Windows.
Unix makes a damn fine desktop; management is not one of its issues.
I am at a university where they have such a system, where computers are reimaged on boot. The fact is, whether you want to install Linux or not, it is not very advisable to deploy machines with varying hardware, as this brings unnecessary complications. Have a vendor supply hardware which is identical, or close at least, and remove this problem of needing to support different hardware. Then you can reimage the computers without needing them to reconfigure themselves each time they reimage.
I think this makes more sense than trying to develop a system that tries to cater for too many differences. I am sure that a Linux install could be scripted anyway, which could mean you have an actual reinstall instead of an image. And users could keep their files on a server using ftp. Linux does seem to provide more options here. The tools might not be as easy as in the Windows environment, but for an admin, this should not be too much of a problem.
Windows is by now the OS of choice for the Desktop because of what the user is able to see and work with.
The administrative side of the question is not really the point, because:
Installing many computers at a time in a corporation is a task for the real administrator. Administrators do not fear real tools, even if they require more research/training, since the only thing that matters is the result.
For the home user it may be important that tools like Ghost are well known to the masses. A Linux admin, however, should be aware of the many tools that exist for the task:
NFS, LDAP, ALICE (SuSE), cloning and rescue tools, even bash scripting, perl….
Linux administration can be really beautiful and reliable if you are ready to learn how. Without political or fanatical undertones. Just for the technology in it.
The reason why Linux is still years away from Windows on the desktop is the software that is still missing for the "normal" user:
– Microsoft Office (Yes it is the best one around)
- A weather-frog applet ("Wetterfrosch") with speech output
– Media player
- Good and EASY-to-use CD-burning software
– CAPI drivers
- An understandable and powerful printing subsystem
The real problems are the applications, not the tools for the background work; there, Linux is the real winner.
This may never change, because people are not interested in changing the way they are used to working and learning.
First a few comments, then a potential solution.
When I first read this article, I thought it was clear and outlined your desire on how things you are familiar with in the Windows world could be translated into the Linux world.
I have worked on Windows, Novell and now Linux systems over the last 15 years and believe they all have their place. When I knew only Windows, that was the ticket. When I switched to Novell 4, it was apparent, at least to me, that in a large distributed network, Novell was superior product.
Having recently started using Linux, about 1 1/2 years ago, I have a great appreciation for its strengths and abilities but cannot speak to the same level of expertise that most of the others here have. However, one thing I have discovered is that having an open mind, even if it closes when the next new thing comes along, goes a long way in solving problems in this industry.
Comment over, here is the possible solution:
Novell’s NDS, or eDirectory.
The best thing about this product is its platform independence. It is standards based (X.500) and allows for centralized administration of the entire network.
Here is a link to the second part of an article; it has a link back to the first part and includes links to other references.
http://www.linuxworld.com/linuxworld/lw-2000-05/lw-05-nds2.html
I know how daunting moving from a Windows environment to a unix environment can be. I did it about 5 years ago. I’ve never looked back, and now when Windows users have a problem I have to refer them to the help desk because I just can’t figure Windows out anymore. It and I have changed too much I guess.
Anyway, here are some pointers:
1: There’s more than one way to skin a cat in Linux. What do I mean? Well, let’s look at system configuration. On Red Hat there’s a tool called setup. That runs in a nice GUI and is easy to use, but it might miss a few things. There’s another tool I can install called webmin. It lets me configure my box by pointing a web browser at the machine on port 10000 (i.e. http://127.0.0.1:10000) and is very easy to use as well. I can use ssh to “telnet” into my box and edit the configuration files by hand. I can use linuxconf in a text mode, a GUI mode, or even a web-browser-based mode.
Each one of the admin tools has a different feel. Which is better? Hard to say. You just have to crank up one after the other til you find one you like or that works for you.
While this kind of plethora of choice is down right scary to a Windows user, it’s the norm in Linux land.
2: Knowledge gained in Unix tends to stay useful for a longer period of time. That’s because things only tend to change in ways they need to change, and the software has a natural tendency to stay static once it matures. Think about how hard it can be to find your way around on a Windows 95/98 box after a day spent administering NT or W2K. In unix, no matter how old or how new, the /etc/fstab file is still where we store info on the drives mounted on the box we’re on.
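For anyone coming from Windows, a typical /etc/fstab looks like the sketch below; the device names and mount points are illustrative, not taken from any particular distribution:

```
# device       mount point   type      options          dump  fsck
/dev/hda1      /             ext3      defaults         1     1
/dev/hda2      swap          swap      defaults         0     0
/dev/cdrom     /mnt/cdrom    iso9660   noauto,owner,ro  0     0
```

Every Unix, old or new, keeps this same table in the same place, which is exactly the longevity-of-knowledge point being made here.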
When someone made an updated version of the inetd package and called it xinetd, the directory for its configuration went from living in the inetd directory to xinetd. I could find it pretty easily.
3: Unix in general has a steeper learning curve when it comes to administration. It is especially steep right at the beginning. Unix and its administration were set in place long before GUI tools to administer Unix were thought of, so all the GUI tools run on top of the text files that Unix uses for configuration. This is a good thing. It means that once you know where Unix keeps its configuration info, you can always fix it, even if the box is barely booting. If you’ve ever had to reinstall an entire Win2K or NT server because of ONE simple mistake, you’ll come to appreciate this.
4: Integration in Unix means something entirely different than it does in Microsoft. In Microsoft, integration is achieved by closely tying the parts of the OS together, often blurring the lines of functionality of different areas, like putting a security DLL into the web browser so you can’t uninstall the web browser. While much of Windows works smoothly together, the integration makes upgrades a truly dangerous thing. In the last two years Microsoft has come out with two different hot fixes that would scram a machine so hard you had to reinstall. They couldn’t be fixed. This was due to poorly thought out integration.
In Unix, integration is more like cooperation. We have two pieces of software that we need to integrate. We define the interfaces very well, making sure each package can survive should the other stop speaking to it. Then you implement it and test it to see how each package behaves when the other one either gives it the wrong data or stops talking.
5: No one will fish for you. Everyone will be willing to help you learn to fish however. If you can’t figure out how to use a terminal server type setup, or aren’t sure what to do, asking someone for a pointer to a web page or program that can help is seen as a good thing.
Asking for someone to tell you step by step how to do it in a forum is considered bad form. The reasons for this are varied, but the main one is that people don’t want to duplicate effort. Someone has probably already done it, and you just haven’t found their HOWTO or directions.
6: Escalate logically. In windows fixing a problem is different than in Linux. In linux, you first go to the home page of the package you’re looking at. You read the HOWTO, the Manual, you try to figure it out a bit yourself. Then you google for a fix (google rocks, really). Then you try the archives for the general support mailing list for the product if you can’t find the answer in google. Then you try the developer’s list. If you are asking a developer’s list for help, it is considered bad form to have not tried all the above mentioned first.
7: Commercial support IS available. Write a check for $10,000 to the guys who program Postgresql, and you can get them to answer the phone at Sunday at 3:00am and walk you through a fix that takes 4 hours. No problem.
Don’t expect that treatment if you aren’t paying anything to them. Their software is free, but their time isn’t.
8: Time spent planning and learning is NOT wasted. It will always be better to spend a week planning a rollout so it takes a week, rather than just jumping in and taking a month to roll it out, and ending up with an unmanageable mess.
With Windows, you can often get away with what I call “cowboy adminning”: you just jump in the saddle and go. Hell, you’ll have to rebuild it in 6 months to a year anyway, right? In Unixland this is a bad way of doing things.
9: You’ll eventually get all the time you put into the front end back when it’s all running. Once you have everyone running Linux, the little niggling problems you have each and everyday with Windows will be gone.
The phrase that will disappear from your users’ vocabulary is this: “I was just typing along when suddenly…”
Good luck, and do your homework, it’s well worth it.
There are a couple of huge mistakes being made by the author.
The first is that Linux is not an Operating System but a system kernel; GNU is the OS. Linux manages the hardware, just as there exists a Windows kernel which manages hardware (what do you think system.ini is?). The Linux KERNEL is by far superior to that of Windows. It natively supports much more technology than the Windows kernel; its only problem is that some hardware vendors don’t provide the information about their hardware for the Community to make drivers.
The second is exactly that: GNU is the OS. There are projects to make GNU run under Windows (so, a GNU/Windows system), as there are projects to make GNU run on FreeBSD (GNU/FreeBSD), the most popular form of the GNU OS being the one on Linux (GNU/Linux).
Most things that the author mentions have been possible in the GNU OS for a long time. All you have to do is ask the Community of users, and I’m sure you’ll find out how to make images, integrate with Windows machines, etc., etc., etc.
The author of this article alleges a triumph of Microsoft design.
We have indeed seen a triumph of MS design, but it is not the same one the author mentioned.
The triumph is MCSE certification. This, more than anything else, is what gives windows admins the attitude displayed by this author. Another poster mentioned something akin to this, but it was in the midst of a terrible rant, and was probably ignored…but truly, how many hours/days/years did you spend learning to do the things you do in windows? Really? Consider it for a moment. Now compare that to the time you spent trying to learn Linux. Or, perhaps more accurately, poking around at Linux and giving up because it “just doesn’t have the tools.”
I’m willing to bet that you didn’t even spend 1% of your windows learning curve trying to learn Linux. Is it really any wonder that you can’t do the things you can in Windows?
But this is not really the triumph. The MS triumph here is that MCSE and the like have *convinced* you that the tools MS provides are *the way you administer a computer* when in reality, they are just one set of *tools* to administer a computer *with.* For the skeptical, look over the preceding comments, and the author’s reactions. Over and over very bright people (some of them undoubtedly serious Linux administrators who probably really do this stuff) have listed off literally dozens of tools and methods that can be used to accomplish the goals the author set out. But he has continually ignored them all. Where is the answer to the fellow who suggested asking google what they do? I’ll bet their boxes (what, 15000 last I heard?) aren’t all the same hardware, and I’ll bet they didn’t install them all from scratch… But the author only sees that SYSPREP isn’t there…that the tools he *knows* aren’t anywhere to be found.
MCSE has been one of the most successful MS endeavors ever. It has made thousands of people believe they are computer experts, when in reality they are Microsoft software experts. This is not to cheapen anyone who has this cert; it is certainly a Good Thing to be an expert with MS products…they are in very widespread use. But don’t confuse that with anything it isn’t. Because I guarantee, you go out and spend the amount of money and time on Linux training that you have spent on Windows training…….and figuring out how to deploy 200 desktops isn’t going to work up much of a sweat.
Matter of fact…once you did that, you probably wouldn’t deploy the 200 desktops anyway; the truth is, that’s really inefficient. You would spend half the money on a monster server and some cheap clients, and not have to worry about how you were going to deploy a zillion full os’s.
So there you have it.
A heterogeneous computing environment is complex. Some design decisions can make migration easier.
Let me list the design issues:
1. Automated system installation
2. Common network-wide user profile (roaming)
3. Authentication scheme (logon)
4. User data store
5. Software update
6. Backup/restore
7. Applications tied to Windows
8. Remote administration
Some solutions: kickstart (1); Samba (2, 3, 4); cfengine (5, 8); partimage (6); terminal services such as rdesktop, vncviewer and PXES (7); ssh and webmin (8).
Careful design is necessary for authentication method and data storage before rollout.
Moral of the story: test thoroughly.
1. Easy availability of MS technical data/solutions is a concern with MS OSes as well. How does an admin migrating from NT4 to 2K learn about Active Directory? Almost every NT admin I know had to retake a few MCSE classes and pore over admin books to familiarize themselves with new features, functionality and core changes in Win2K. Most NT/2K admin information is not covered anywhere except in admin guides, classes, or on forums. Linux is the same.
2. Over the years, I have noticed a general decline in the technical knowledge and competency of MS admins. This is not to say that they are not intelligent, but having the software do everything for you without having to dig deeply into its inner workings brings an admin closer to the usefulness of a regular user. In my company we have a mixed bag of 2k, Unix, and Linux and one thing that’s abundantly clear there is that the *nix admins know their $#!t, often easily troubleshooting 2K related problems when the 2K admins can’t. Many 2k admins even have problems creating batch files, yet I haven’t known a *nix admin that wasn’t VERY adept at shell scripting.
3. Tools for automated installs of *nix boxes exist and they are as well documented as the Windows world’s solutions. However, you won’t know about them if you don’t look for them, study them or ask knowledgeable people about them. The same is true for the MS world. You learn the MS solutions in MCSE class or by reading the admin guides. One cannot learn about either OS’ solutions by reading a few articles in PC Magazine.
4. For good information on Win to Linux migrations, check out the success stories on http://www.ximian.com , http://www.redhat.com , http://www.desktoplinux.com , and http://www.suse.com .
It’s easy for the Windows world to criticize Linux’s perceived desktop difficulty. After 10+ years dominating the desktop, it’s easy to hold Windows up as the standard that any desktop OS should attempt to match (just as Apple vs. Windows was back in the old days). In all honesty, when I began using Linux, I found it very difficult also, but as I investigated deeper, I found that it wasn’t difficult, just different. I almost gave up; instead, I realized that the easier path is usually more costly and it would be in my best interest to learn a *nix. Now, a couple of years later, Linux has become my desktop and server OS of choice and I see the limits at work of having to use 2K. They were always there; I just didn’t know because I didn’t know to look. Many software packages that I had to pay a ton of cash for now don’t break the bank: Gimp over Photoshop, Ghostscript over Acrobat, OpenOffice over Office. Those software examples can save you $1000.00 per desktop. There’s a serious incentive for learning the Linux alternative.
gosh,
Linux Terminal Server Project (LTSP): there is also a patch to it so that with one Linux server and one Windows Terminal Server you can boot Windows boxes over the network, because MS can’t do it on its own :”)
You have cfengine to deploy, configure, monitor, etc. thousands of machines…
You have NFS, AFS, InterMezzo, etc. to share user data on central servers…
You have OpenMosix to share processor power…
Windows breaks if you just move your HDD to another channel; with Linux all I have to do is edit /etc/fstab and grub’s menu.lst file. I have done it many times.
Gosh, MS invented some limited form of the UNIX “symlink” a few years ago… I can’t find the link to show how much MS cares for its customers :”)
You make a correct assumption: there is NO unique “desktop computer” and while most “linux-on-the-desktop” articles seem to concentrate on the SOHO PC, you point out that you need a valid solution for large-scale corporate deployment.
The truth is, the solutions to the questions you ask are right in front of you.
1) Mass installation/configuration: others already mentioned kickstart, and you could keep the computers fine-tuned with custom init scripts in that image checking for HW or anything. Kudzu could be helpful too. A nice modular kernel is also a must.
2) Protection against user errors: they can’t touch it because they are NOT root! (Or should not be.) As a matter of fact your IT support will face a lot fewer “help me, my PC is dead!” requests and will be able to concentrate on more constructive activities.
3) User handling: common dirs with NFS, AFS, Coda, whatever; there is almost too much choice here, and YOU keep the files, not a server at the other end of the world. You can also have custom “skeleton” presets for each class of user/desktop and keep them as user groups (executive, technical, help desk, whatever). Also, a carefully thought-out firewall can be a lot more protective against user misbehaviour than the MSN services… And if you think that Average Joe can tunnel through or pierce a careful firewall, you shouldn’t even mention MSN!
4) Technical support: I’m sure you will find a lot of capable people to enroll in internal IT, or *local* third-party companies to provide you with that. It’s *your* choice for the best bidder – no strings attached! Ever!
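The “skeleton presets per class of user” idea from point 3 can be sketched in plain shell. The directory and user names below are invented for the example; useradd’s -m -k options do the same job at account creation, but this version runs without root so it can be tried anywhere:

```shell
#!/bin/sh
# Keep one template tree per class of user, and copy it into each new
# home directory. skel-technical and jdoe are made-up example names.
mkdir -p skel-technical
printf 'alias ll="ls -l"\n' > skel-technical/.bashrc
mkdir -p home/jdoe
cp -r skel-technical/. home/jdoe/   # same idea as: useradd -m -k skel-technical jdoe
ls -a home/jdoe
```

One skeleton tree per class (executive, technical, help desk) gives every new account in that class the same starting environment.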
I’m afraid that your entire thesis is simply a demonstration of why people keep using windows in places where no special windows-only apps and features are required: INERTIA!
hint: http://www.tldp.org <- it’s all in there!
It all boils down to this:
If you don’t speak French, you can never learn to cook :)
I am self-taught. I cut my teeth on a WinNT4 network build (migrated from Novell 3.1), built out of a reference book, 5 years ago, when I was a copier engineer; now I regularly install my company’s equipment on other people’s LANs and consult on network design and implementation.
Now, after playing with Linux, I am slowly migrating my company’s entire network to it.
I wouldn’t even consider hiring a Microsoft Co-opted Servile Entity. All of those I have interviewed in the last 12 months show a serious lack of creative thought. One even admitted taking the exam 7 times “until he had learned the questions”.
The main problem seems to be that “it hasn’t got X, or it isn’t Windows”.
I drive a very complex, expensive, high performance diesel car.
My contemporaries’ criticism seems to consist of “aren’t diesels dirty and noisy?”
My vehicle is quieter than most petrol equivalents, is less polluting than most petrol vehicles with an engine size 30% smaller, and is faster than my fellow director’s petrol Mercedes.
It also has higher MPG.
Unfortunately, it runs a version of Windows as an operating system for the SatNav, which is the least reliable function.
My fellow director has just ordered a new Petrol Mercedes.
Plus ça change…
If I lived in your area (I don’t – not even the same country) I would print this article, collect some notes and sources, and use it in a presentation to your senior management about why they should give me your job.
You have dismissed a lower-cost option without research. The only thing that you seem able to say with any actual merit is “We are tied to Office” or “Retraining costs would be prohibitive”, both of which could be overcome with sufficient work. No other technical argument holds water, and the amount of research required to hole each of them is startlingly small.
You appear to be an IT manager who is closed minded about technology solutions. Darwin works on these people.
You really need to get somebody to kickstart your learning processes. I admin hundreds of machines via the internet. No trouble! Try going to a local LUG (“Linux user group”) and connect with some competent folks to get started. That, or beat yourself on the head with a clue-by-four. I am so fed up with incompetent winderz halfwits that are too lazy to get off their respective buttz to learn anything. Just keep believing what you want. I am preparing to serve your weeney winderz head on a platter to the bean counterz. I can provide your organization with a complete OSS solution for pennies on the dollar of the cruft you peddle, and manage it from halfway around the planet.
Wake up
> Second: You don’t even need to install a graphical environment to get a linux server.
Come on, you give a general windows admin a command line with no GUI, they haven’t got a clue…
The problem is most windows admins want linux to react and run like windows, and are always forgetting that *they* had to learn windows, but refuse to learn Linux….
Are you clear about how much you are paying for this level of service? (Clue: Walmart sells a ‘raw’ PC for 199 USD.)
Do you get any warranty? I.e., does your supplier take any responsibility for consequential loss in case of any defects in the supplied software (e.g. allowing virus attacks)?
Does the service you get empower you for the future, or does it just leave you needing to buy more of the service to continue in business ?
Just wanted to point out that Ghost allows you to copy not only a partition but the disk itself, which allows you to copy a Linux installation the same way that you would copy a Windows installation.
Just another minor point: although Linux has most of the tools that the author assumes are missing, it does lack a common user interface to modify all those options. A good approach for Linux (I’m not aware if one exists already) would be to create a program similar to MMC which accepts plug-ins; although MMC is not great, it does save you from using a different interface every time you use a program.
Maybe Linux should look at Mac OS X for a successful implementation of efficient design, easy UI and robustness.
Windows provides a great development platform and many great ideas can be borrowed from it. Linux developers should avoid bashing Microsoft and Mac; instead they should build on the things these OSes got right, and improve Linux to compete with them on all levels, not just on the server side. Linux has all it needs to be the standard OS; it just needs to be polished a bit more to be more consistent, easier to use, simpler to deploy and more visually pleasant. It’s interesting to note that most of the major Linux distributions base their business model on support, so ease of use might not be a priority for them.
I’m hoping for the day where everything will be DLL- and blue-screen-free, and Klez and Bugbear will all be safely locked up in a museum…
Just one thing: the word is “dual” (as in dual booting). Not “duel”. That’s when two things fight. You’ve used “duel” not only in the article, but also in responses. Typos are one thing, but if you want to be a writer, words are good tools to use correctly.
In an earlier post somebody mentioned that NFS-mounted directories are not secure, and to some extent he is right. However, as Linux lets you bind services to specified TCP/IP ports, you can easily tunnel NFS traffic through some encryption layer, e.g. ssh.
And you could use PAM and some kind of smartcard technology, e.g. Java rings, to ensure that users are who they claim to be.
By using iptables you could also limit access to the NFS servers to clients whose ethernet cards have known MAC addresses.
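A hedged sketch of that MAC-based restriction, written in iptables-restore format; the port, MAC address and policy are illustrative only. (Note that MAC addresses can be spoofed, so treat this as one extra layer, not a substitute for real authentication.)

```
*filter
# Allow NFS over TCP (port 2049) only from one known MAC; drop the rest.
-A INPUT -p tcp --dport 2049 -m mac --mac-source 00:0A:1B:2C:3D:4E -j ACCEPT
-A INPUT -p tcp --dport 2049 -j DROP
COMMIT
```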
Setting up a secure system is largely about knowing the weaknesses and getting around them.
This actually has advantages, as in Linux you can combine modules as you see fit, while in Windows you often have one monolithic solution. E.g. in Linux you can choose between ssh, IPsec or SSL to encrypt your network traffic, while in Windows you do it the Windows way.
Like any monoculture, Windows becomes more vulnerable to script kiddies: there is one way Windows security works, and that’s it. In the Linux world such a script would have to figure out how the system is configured before it could do any harm, and in doing so chances are it would have revealed itself and left traces in the logs.
OK, OK, OK…
I am the Windows/Linux/Solaris Intel guy who works for a very large SUN reseller. As a result, I am pretty much neutral when it comes to the “religious wars” between the Wintel and UNIX/Linux communities.
In fact, I think that BOTH communities are very silly in how they badger, bait, and berate each other.
I do Windows and I do Linux and the MAIN problem that I see in the Linux sphere right now is NOT if the technology is there, is NOT if the documentation is there… it’s there!
However the problem is:
=======================
1) Finding it,
2) Understanding it once you find it, and
3) Getting it to work as advertised (directly related to #2)
In the Microsoft world all 3 (three) are provided quickly and easily by the Borg… I mean Microsoft.
UNIX/Linux people look at this and “cop an attitude” about how “lazy” the “Windoze” integrators are. In return, the Windows people (justifiably) get angry and defensive.
And THAT is the root problem. It’s not an issue of wrong v. right, or black v. white, it’s an issue of cultures clashing and refusing to understand where the other fella is coming from.
In my mind the answer is for someone to get all the Linux solutions and answers together in a single portal – a la Microsoft Technet and all the other resources that Microsoft throws at their community.
Red Hat is doing a pretty good job of this and is being soundly criticized for it by the UNIX/Linux people because it is such “Microsoft-like” behavior.
Exactly – and that is a good thing IMO! If acting, “Microsoft-like” gets the Windows folks to join the Linux party then I’m all for it.
Ditto for the proprietary Intellectual Property that they’ve layered into Red Hat v8.0. Maybe I’m stupid and naive but I really like how easy to install, integrate, and customize RH8 is! Yes, it is indeed “Microsoft-like”… and your point is…?
In the end, and to summarize, until…
1) The “how to’s” for Linux are as easy and accessible as Windows -and-
2) The CULTURAL lines between the Windows and UNIX communities begin to merge and blur.
…we will see long, lengthy, heated discussions like this one.
Thanks for listening to my mad ravings and happy integrating people!
/fwa
Samba is a very robust “LAN Manager”-type server/client implementation. Not only can you emulate roaming profiles, PDC control, logon scripts, and most of the other things you blathered on about, but you can also customize what the server does when a client connects to a Samba resource (you could use this to automatically back up files sent to the server, to parse files sent to the server, or for a PLETHORA of other things).
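A sketch of that “run something when a client connects” idea, using Samba’s preexec/postexec hooks in smb.conf. The share name and script paths are invented for the example:

```
[dropbox]
   path = /srv/samba/dropbox
   writable = yes
   ; run as root when a client connects / disconnects
   root preexec = /usr/local/bin/log-connect.sh %u %m
   root postexec = /usr/local/bin/backup-share.sh %u
```

The %u and %m substitutions hand the connecting user and machine name to the hook scripts, which is what makes per-connection backups or parsing possible.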
Kickstart installs can be used to roll out a large number of machines with predefined software, ready to use. You can customize these processes yourself to include more specific configuration (like specific XFree86 settings).
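A minimal sketch of what such a kickstart file (ks.cfg) might look like; the server name, partition sizes and %post step are placeholders, not a tested profile:

```
install
nfs --server=installsrv --dir=/exports/redhat
lang en_US
keyboard us
rootpw changeme
clearpart --all
part / --fstype ext3 --size 4096 --grow
part swap --size 512
%packages
@ X Window System
@ GNOME
%post
# site-specific tweaks, e.g. drop in a prepared XFree86 config
cp /tmp/site/XF86Config /etc/X11/XF86Config
```

The %post section is where per-site customization goes; combined with network booting it gives an unattended, repeatable install per machine class.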
Something that you didn’t mention, but I thought I would, is the PRICE of 100 Linux desktops compared to 100 XP desktops…
probably worth a few days learning curve at LEAST
So there we have it.
A turnkey solution for Linux? Nope, apparently it doesn’t exist. You have to go and learn how Unix works.
Ahem. To the Linux guys out there: come back when you are ready. You are not going to get businesses over to your Unix way of dealing with these issues.
Linux is going to have to move itself to the end-user side of things to make bigger impressions.
And no, Linux is not a bad product at all. But it’s selling itself to NON-Unix people now. So you’re going to have to accept that what works for a Linux/Unix sysadmin will not work in that market the way you believe.
In the same way that the Linux desktop computer now has a nice graphical installer, the turnkey server/network/services solution must follow if it’s to make the same progress elsewhere.
DS
You didn’t have to learn how Windows worked? The first day on your Windows computer you were able to create “turnkey solutions”?
somehow i doubt it.
I can repackage and distribute any software, changes, anything I like, without visiting 600+ desktops.
We do the same using the Linux Terminal Server on 100+ desktops, and we do not have a person spending full time on it.
The diskless terminals have no software to maintain.
The Red Hat 7.3 server does it all.
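For the curious, the server side of such a diskless setup boils down to a DHCP entry per terminal pointing at a network kernel and an NFS root; the MAC address, IPs and paths below are illustrative, not from any real deployment:

```
# Hypothetical dhcpd.conf fragment for one LTSP diskless terminal.
host ws001 {
    hardware ethernet 00:E0:18:AA:BB:CC;
    fixed-address 192.168.0.101;
    filename "/lts/vmlinuz.ltsp";
    option root-path "192.168.0.1:/opt/ltsp/i386";
}
```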
You may start learning Linux.
ajith
Of course you have to go and learn about how Linux/Unix works, just like you had to learn how Windows worked. Maybe you were smart enough to know how to administer an NT/2K network without learning anything beforehand, but the rest of us mere mortals learn by studying, reading, and investigating.
As for your statement that “what works for a Linux/Unix sysadmin will not work in that market as you believe”, Linux is NOW selling itself to home and office end users and the numbers are increasing daily. Read reports from IDG, Gartner, etc.
You’ve already admitted in previous posts to have a lack of knowledge of Linux in many of the key areas discussed in this forum. The only thing you have proven is that your lack of knowledge hasn’t fueled you to investigate or learn, but instead to attack what you don’t know. You haven’t proven any shortcomings in Linux, you’ve only proven your own.
he’s looking for a windows _clone_
some people can not be helped.
I agree with that statement, but our community (the Linux community) needs to stop acknowledging people like AdmV0rl0n. These people act like they want to learn but attack every answer. Linux advocates know what Linux can do. Windows advocates know Linux is a threat, which is why they resort to tactics like that. Instead, ignore them. They are losing ground in the server market, they see that they will lose ground in the desktop market (this is already beginning to happen rapidly in the international market), and apparently they enjoy losing money both at home and in the corporate arena. There’s a reason that Linux has become such a strong competitor and has gained the ground it has with almost no advertising.