While we often focus on the desktop offerings from Microsoft, the company of course also plays a role in countless other markets. The most prominent of those is probably the server market, where Windows 2000 Server and Windows Server 2003 are now facing a number of support changes – important stuff if you manage Windows servers. The biggest news? There will be no third service pack for Windows Server 2003.
Let’s start with 2000. Windows 2000 Server will reach the end of its Extended Support phase on July 13, 2010. This means that Windows 2000 Server will no longer be publicly supported, and you’ll have to resort to ‘Self-Help Online Support’ for your support needs.
Windows Server 2003 and Windows Server 2003 R2 (at a supported service pack level) will move from the Mainstream Support phase to the Extended Support phase on July 13, 2010. In this phase, Microsoft will continue to provide security updates and paid support, and customers will still have access to Self-Help Online Support options; however, non-security hotfixes developed during the Extended Support phase will only be provided to customers who enroll in Extended Hotfix Support.
When it comes to a possible third service pack for Windows Server 2003, Microsoft is quite clear. “We have received inquiries from our customers and partners on whether or not there will be a need for a Service Pack 3 for Windows Server 2003,” Microsoft writes, “Microsoft will not have a SP3 release for Windows Server 2003.”
If you manage Windows servers, be sure to take these changes into account.
First they kill Virtual PC, now they are killing 2003.
Oh brave new Microsoft, you suck.
Microsoft has forever pushed the perpetual upgrade cycle by cutting off support and updates, but the fact is that people are using Windows for business-critical applications these days, and that means they stay on the Windows versions they know work – for a very long time. It’s rather like how COBOL hasn’t disappeared.
Run around any IT environment in any organisation in the world and you will still see NT 4 systems, or even Windows 3.1, if they had applications written in the 90s that they are still using. If they bought into Windows 2000 you will still see it. Many have only upgraded to Windows XP in the last few years. You will still see classic VB as well as Visual C++ being used because they have several dozen applications written with them since about 1995. If applications have tie-ins to Microsoft Office for things like mail merging then they will still be using the version of Office that they were designed to work with – something like Office 97.
Microsoft has traditionally ‘moved’ things on via older systems and applications not being able to run on new hardware, but virtualisation has largely scuppered that. You can replace or upgrade older hardware and still run older operating systems and applications on it. It matters little that updates have ceased as long as everything runs – as it has done for the past ten or twenty years. If Microsoft won’t accept money for that, then those customers will quite gladly pay VMware instead, or get someone else to run it on Xen or KVM.
Microsoft still doesn’t realise that it’s not 1990 and the world doesn’t work that way any more. The PC market is no longer doubling every year, so you can’t just push out a new product version and have an instant installed base overnight. You have to support what’s out there. I just see this more and more these days:
http://www.joelonsoftware.com/articles/APIWar.html
That’s why I’m largely surprised that they have decided to remove things like the parallel port & floppy disk from the new Virtual PC.
Some of us actually still ‘printed’ reports from legacy systems and used the output files to feed databases…
But it’s true, Microsoft has decided they don’t want me to keep supporting these legacy systems on their solutions, but on somebody else’s…
How does this stop you from doing the same thing with USB-based print output (which is overwhelmingly what’s in use today)?
It could be argued that Joel’s article is irrelevant now, given how commonplace VMs are for running legacy applications and the diminished role of the desktop as a platform.
It’s more relevant than ever, particularly with virtualisation. You have to pay attention to backwards compatibility and your existing installed base if you are expecting your users to upgrade as you want them to. That means supporting existing systems with service packs that include new features and runtimes that can be run on newer systems, and that gives your newer products an installed base.
Microsoft doesn’t see it like that however, because they think they’re giving away new features for free at the expense of new products. They don’t realise that there is increasingly no installed base or market for their new features.
With virtualisation, the base OS is still required. If you’re moving a Win2K app into a VM you still need to have Win2K in place on that VM. This means that your VM becomes more vulnerable as more issues are discovered within Win2K. Stability can also become a problem if future hardware and VM developments start triggering Win2K crash bugs. Worse still, your Win2K box gets breached and the attacker manages to break out of the VM through a memory hole or similar.. now your entire VM back end and the various hosted OSes are compromised because Win2K provided the weak link.
A VM will allow you to continue running older OSes, but it won’t magically protect you from vulnerabilities within those OSes.
Just because there’s not going to be a set of updates known as “Service Pack 3”, does not mean that you won’t still be getting updates.
No, it doesn’t. It just means that if I have to do a fresh install, I will have to wade through hundreds of single updates instead of one big one. But hey, why make their OS any easier to use? Just move to Linux.
Speaking of which… I just did a clean install of Fedora 11 and there was just over a gig of updates waiting to be downloaded and installed.
Linux? For people who do not like updating? You must be kidding?
Jesus. When it comes to Linux, anything goes.
Please at least name one specific distribution that doesn’t pursue these ridiculously rapid upgrade cycles and constant package updates between upgrades, with so little quality assurance.
I think you might be talking about Debian.
Amazing how my father and mother have Arch Linux set up on their respective computers and hell hasn’t broken loose. I don’t understand what you’re using, but if you stick to a decent distribution and do your job as an administrator – there should be no reason for things to go pear-shaped.
Have you too fallen into the rhetorical “my grandma” trap?
In this context, “decent” distributions would probably be something like CentOS or Debian. Nevertheless, each and every Linux upgrade cycle (from one major version to another) contains a well-understood risk, which, as you note, can be reduced by proper administration.
(Maybe I am just frustrated at the moment because I just discovered two Fedora bugs to which the developers promptly replied that these will be fixed only in the next release; perfect example of the six-month-hell.)
But take this from the parent comment:
“Wading through hundreds of single updates” sure brings Linux to my mind.
http://www.osnews.com/user/strcpy
Date Joined: 2009-05-20
Status: Active
Bio: One of my hobbies is to troll here and make fun at the expense of Linux fanboys.
………..
And what a fun hobby that is!
… I wouldn’t quit his day job if I were him.
– Gilboa
It’s cute the way you think posting that reflects badly on strcpy. What it really shows is that you don’t have the wit to avoid being trolled even when the person doing it explicitly tells you that you’re being trolled.
What’s your problem with citing the successes of an OS with the computer illiterate?
In my opinion it’s as good a usability benchmark as any.
<Contradiction alert!>
Seriously though – how few of the “hundreds of single updates” are actually to do with Linux itself? Or Linux plus X & DE?
Most of the updates are 3rd party applications. So you could just as easily say that Windows has the same hundreds of 3rd party application updates on top of its hundreds of system patches. The only difference is the user doesn’t even know said applications need updating, as Windows doesn’t handle app installs very well / at all.
I’m not about to argue that Linux > Windows because, at the end of the day, OS choices mainly come down to user preferences rather than what’s “better”. But I will say that at least Linux’s (read: ‘Linux’ the complete package rather than ‘Linux’ the kernel) “hundreds of single updates” are managed via a simple interface which (depending on the distro you use) is generally a GUI requiring little more than 2 or 3 mouse clicks.
It’s not a good usability benchmark because it’s anecdotal evidence, and anyone could make the same comment about Windows or OS X. You could just as easily tell a story about how you got your parents off computers and now they’re happier and stress-free.
You don’t even have to use a human.
Just look at this blog post where I switched my cat to Linux:
http://www.jfplayhouse.com/2009/08/how-i-switched-my-cat-to-linux.h…
I appreciate what you’re saying, but most usability reviews I’ve read have been anecdotal or based entirely on the reporter’s opinion.
You rarely see a scientifically studied or unbiased usability review.
None at all?
ArchLinux update breaks Nvidia driver http://bbs.archlinux.org/viewtopic.php?id=77131
I suggest you stop being so lazy and read:
http://bbs.archlinux.org/viewtopic.php?id=77131&p=2
Where is this ‘across the board epic fail’ when the results are anything but that? Some are having problems whilst others are sailing along fine. My father has a laptop with an Nvidia 8400 GPU running the latest Xorg and Nvidia drivers, and it doesn’t have these problems.
How exactly am I being lazy? I never claimed that it happened to everyone.
You originally said:
“if you stick to a decent distribution and do your job as an administrator – there should be no reason for things to go pear-shaped.”
And I showed how there can be a reason for things to go wrong.
Here’s another one:
http://bbs.archlinux.org/viewtopic.php?id=66980
It’s a joke that you think Linux is better for updates when there are so many cases of working sound, video and wireless being broken by an update.
Linux development is too decentralized; the Linux devs send changes downstream without much care if they break anything. They aren’t designing a consumer OS to begin with and it shows. The hacked-together sound stack and the demand that hardware companies open their drivers doesn’t help either.
It all amounts to Linux being more likely to break working hardware with an update, especially if it is a consumer device that uses a proprietary driver.
RHEL?
SLES?
Debian?
… Heck, Slackware?
Not for people who don’t like updating.. for people who like easy updating.. example:
Boot Windows. Visit Windows Update.. then Adobe.com for Flash.. then the Firefox update checker.. then Java in the control panel.. then QuickTime in the control panel.. then XYZ website for another update check.. then Adobe for Acrobat and the PDF reader.. and on.. and on.. let’s not forget the reboots in between various stages.
Boot Debian, Mandriva, Suse, Red Hat or another Linux-based platform (using Debian for the example); type “aptitude update && aptitude full-upgrade”, review the list of updates that will be installed, hit Y to continue. Done.. maybe there was a kernel update, so reboot once.. done. If you’re a GUI person, check the task bar icon that says “updates available”, click “download and install”.. done. All updates vetted by the distribution provider and made available in one place through a centralized update/add/remove manager.
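To spell that out, a minimal Debian-style update session looks something like this (a sketch – assuming the stock aptitude tool; exact prompts and package lists will vary by release):

    # refresh the package lists from every configured repository
    aptitude update

    # review and apply all pending updates - system and applications alike - in one pass
    aptitude full-upgrade

    # reboot once, only if a new kernel was among the updates

One command pair covers the OS, the desktop and the third-party applications, because they all come from the same vetted repositories.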
Linux would have the same problem as Windows if there were more commercial apps for the platform.
We’ll see. Hopefully people will do it right and work with distribution providers more closely. We’re already seeing it done right with hardware: AMD provides development specs, and Creative gave ALSA the X-Fi development specs. Kernel and Xorg developers will take any dev specs a hardware maker provides, and support in either of those bits of software means support across all distributions of the platform.
Nvidia chooses to go it alone, but its installer is lovely to work with. Distributions also choose to include Nvidia’s hardware support; Mandriva has it, Debian has it in the “non-free” repositories, and Debian 6 Testing took Nvidia’s downloaded binary installer seamlessly.
VMware keeps its installer and updates outside the distribution, which is sad. Hopefully the growing popularity of the platform will motivate such major application vendors to provide better update support.
Actually, with Webmin as an example, all vendors need to do is provide a self-managed repository that the distribution can tap along with its own repositories, and updates/installs are included in the centralized software manager. I don’t go to “Debian updates” then “Webmin updates”; they are both handled by the centralized tool even though they come from separate sources.
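For the curious, tapping a vendor repository on a Debian-style system is only a couple of steps. A sketch (the Webmin repository URL and key file here follow the vendor’s published instructions of the time, but treat them as illustrative and check the vendor’s own docs):

    # add the vendor's repository alongside the distribution's own
    echo "deb http://download.webmin.com/download/repository sarge contrib" >> /etc/apt/sources.list

    # import the key the vendor signs its packages with
    wget -qO- http://www.webmin.com/jcameron-key.asc | apt-key add -

    # from now on the vendor's packages show up in the normal update run
    aptitude update

After that, Webmin installs and updates come through the same centralized tool as everything else.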
Sadly, it will take customers pushing for this better update management and the computer market is anything but customer driven.
Adobe, Windows, Java and Firefox all update themselves when updates are available….
Auto-updates have their own problems. They allow the vendor to choose when your computer sends out network traffic. A broken patch may be auto-installed, or a simply unwanted patch may get installed. This does help home users who are not going to take the time to update at least once a month with a few manual clicks, but it’s far from ideal and far from the ease of a proper centralized system. You’re still dealing with an updater dialling home for each separate application installed.
In a business network it gets much worse: you can’t have users installing anything that pops up, and you shouldn’t have auto-updaters sending out network traffic at will even when the user lacks the rights to install the update. Centralized management like WSUS and appliances helps, but it still leaves software out. The IT staff still have to hunt down updates from many individual sources.
Driver updates make both issues even worse. Lenovo provides a ThinkPad driver update manager, but the drivers and add-on software are poor (I spent yesterday fighting with an X200 – executive model my ass). Asus leaves the user to download each separate component add-on by hand; no update manager for that brand of hardware. HP and Dell think I need a ton of graphics-heavy software just to print a document, rather than a simple printer driver. ATI, unless it’s improved, required the hardware drivers to be uninstalled back to generic VGA before “updates” could be installed on the system. It’s a mess.
The recent IE8 issues have demonstrated, yet again, that Windows auto update is just a bad idea all around.
If you decide to use a ‘bleeding edge’ distro like Fedora then you shouldn’t complain about the amount of updates; that is the nature of the beast. You could have looked for a ‘re-spin’ of the distro, which would include all the updates.
If you want stability (And a more direct comparison to Server 2003) go for something like RHEL or SLES.
I just installed RHEL 5.4 and the size of the updates I needed was 68 MB.
Back on topic: I’d like an SP3 which is just a roll-up of all the patches to Server 2003. Simpler and quicker to apply, as opposed to the hours you spend downloading individual ones from Microsoft. Pah. Yep, I could get them all, but I only use 2-3 S2K3 systems as opposed to 20+ RHEL/CentOS/Fedora systems.
Exactly. I’d hope people would tout these so-called long-term support distributions more, especially for newcomers. For me, they are the state of the art in Linux.
But instead the press and the public pushes the masses to the next shiny thing in Ubuntu or Fedora.
A couple of points on that: a lot of people use Windows Server to manage Windows desktops. Linux cannot take the place of Windows Server for that situation (yet). And yes, sometimes you are stuck with Windows desktops.
The second point is that the long-term support variants of Linux are rarely that long-term. Those 10 years from Microsoft are hard to beat, even if the later years bring just security patches. It is also worth considering that long-term support variants of Linux usually lock you into older applications. Even if you were willing to build an updated application from source, building that application and all of its required libraries is excessive work.
I just rolled my notebook back from Debian 6 testing to Debian 5 stable (Lenny) and there were just over five updates to do.. but then, I also did a network install against the latest repository versions. Were it a more secure environment, I’d have those network repositories on a trusted local machine. Doesn’t Suse provide a network install? (I honestly don’t know.. one of the few major distros I haven’t tested.)
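If anyone wants to try the trusted-local-machine idea, one approach (a sketch – assuming the apt-mirror package and a host name of ‘mirror.internal’, both of which are just examples) is to mirror the repositories internally once and point every client at that mirror:

    # on the mirror host, /etc/apt/mirror.list selects what to mirror
    set base_path /var/spool/apt-mirror
    deb http://ftp.debian.org/debian lenny main contrib non-free
    deb http://security.debian.org lenny/updates main

    # run the mirror job (typically from cron), then serve the result over HTTP
    apt-mirror

    # on each client, /etc/apt/sources.list points at the local mirror instead
    deb http://mirror.internal/debian lenny main contrib non-free

Clients then update as usual, but only the mirror host ever talks to the outside world.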
Linux isn’t a drop-in replacement for Server 2003.
I know the frustration, but there is always nLite. nLite plus an update pack will create a Windows install disk that is up to date. The update packs are great and save a tremendous amount of time.
Our email is still being run on an Exchange 2000 backend. The mail still gets delivered, despite there not having been a service pack for over six years.
But it has to be replaced soon anyway, mostly because the shiny new MCSE admin can’t/doesn’t want to use such ‘crusty’ software.
I guess a lot of 2003 installs will be replaced for the same reason: not for lack of updates or hardware support, but because training will be directed at the latest-and-greatest and nobody will have a clue with the old versions.
… I’m still using Windows for a couple of mission-critical applications (the rest were moved to Linux).
Nevertheless, there’s an inherent risk in using an old Windows release – especially if the machine in question is connected (by any means) to an insecure network.
… I’d consider converting your Exchange server to a VM and running it in a secure environment. (E.g. ESX, RHEL 5.4/KVM, etc.)
– Gilboa
Not my job fortunately! I understand we have been stuck with them because the lease (!) has not expired on the machines, which are dual-processor Pentium IIs, built like tanks.
The hardware is strong, but the software, weak.
Actually, IMHO, Windows 2K was MS’ best OS.
– Gilboa
What was worse about 2k3?
While not a bad OS by itself, we found that Win2K3 eats far more resources (compared to Win2K) – even if you slim it down – making it less than ideal for semi-embedded (as in application server) usage.
– Gilboa