“After many months of waiting, Microsoft finally released the Beta 3 version of Windows Server 2008 (previously codenamed ‘Longhorn’), a major milestone pre-release version of the next version of Windows Server (more recently, a CTP, or Community Technical Preview, version was distributed to beta testers in June 2008). Windows Server 2008 has evolved quite a bit over time, and though the project hasn’t suffered from the many feature drops and problems that dogged Windows Vista, there are certainly a few surprises in Beta 3 and the June CTP. Here’s what you need to know about Windows Server 2008.”
How has Active Directory managed to grab such an installed base while missing something as simple as a read-only replica?
To answer my own question: obviously, the applications that sit on top of AD are what drive AD uptake. Still, I'm surprised that something so obvious is missing. Isn't it difficult to promote another server to the PDC?
AD doesn't have the concept of a PDC. It is a fully multi-master replicated directory service. There are several specific roles that domain controllers can hold, one being the PDC emulator, but that exists for legacy authentication and a couple of other things. It is a simple one-click operation to move that role to a surviving domain controller.
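For anyone curious, the same transfer looks roughly like this from the command line with ntdsutil (just a sketch; "DC2" is a stand-in for whichever surviving DC should take the role, and you can do the whole thing from the Active Directory Users and Computers GUI instead):

    ntdsutil
    roles                     (enters FSMO maintenance)
    connections
    connect to server DC2     (bind to the DC that should take the role)
    quit
    transfer pdc              (moves the PDC emulator role to the connected DC)
    quit
    quit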
Thanks for the heads up. When my summer of project hell finishes up, I really need to crack open my MCSE AD book and read up on it. My Windows admin knowledge is lacking in some areas. I’m far more used to NetWare/Suse/eDirectory administration.
This is shaping up to be a nice release. On my FreeBSD servers I am spoiled by the command-line so I am glad to see the inclusion of PowerShell allowing me to have similar ease-of-use on a Windows server. They were crazy to not have this in there originally! RemoteApp will also make it more cost effective since you don’t have to buy TS licenses + Citrix.
That it has already been released: “(more recently, a CTP, or Community Technical Preview, version was distributed to beta testers in June 2008)” which seems to indicate Time Machine is part of it
It's tradition for Microsoft to hype their products with celebrities. Why not have Doc and Marty McFly for this one? Doc's latest project is to trick the DeLorean out with Server 2008. :p
Wow, so Core has taken everything out. I think they even mentioned Notepad. That's amazing. I mean, with Notepad and Solitaire gone… what's left?
Nope, notepad is there:
Server Core comes up with a blank desktop and a single command line window. There’s no shell, Internet Explorer, Windows Media Player, or any other pointless graphical applications. Indeed, even Notepad–which is available in Server Core–had to be hacked so that it could present an ancient version of the Open Save dialog.
I’d prefer vi.
I'm sure it wouldn't take much to port. In fact, there might already be a ported version (I know you can already get Windows CLI versions of a few *nix apps like lynx and unrar).
True. You can get versions of grep etc. for Win32 too, but the problem is that they're not standard, so you can't really rely on them if you're moving between client environments.
I’d prefer ice cream.
Windows Server Core. Now you, too, can pretend the last 15 years of operating system development didn’t happen and go back to the days when MS-DOS and the command line were king.
Unless they make Windows Server Core freeware, I don’t see how they’ll get any adoption at all.
Yup. Because Linux, Solaris, and BSD all stay away from that ancient, undeveloped command line… and succeed because of it!
If you look at paragraph 2, you will see
RTFA. One of the big improvements will be what looks like the most advanced command line on the planet.
PowerShell is one of those ecosystem things. If lots of admins start using it and sharing scripts, then it could become very useful over time. But if they don’t, then it’s not going to be very useful for the average admin.
The UNIX-style shells didn’t become so useful because of their elegant design and rich scripting environments. They became useful because useful scripts emerged, and eventually more powerful scripting environments came along. The command-line shell was essentially the only game in town at the time, so there was a large incentive to innovate.
It’s not obvious that point-and-click Windows admins will reach for a sophisticated scripting environment to get the job done. Besides, if you’ve seen the syntax, it doesn’t look all that friendly. Users are going to have to dedicate some time to learn it, and it might prove to be too complicated to be useful.
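To illustrate what I mean about the syntax, here is roughly what a typical one-liner looks like (my own made-up example, not something from the article):

    # list processes using more than 100 MB of memory, largest first
    Get-Process | Where-Object { $_.WorkingSet -gt 100MB } | Sort-Object WorkingSet -Descending | Select-Object Name, Id, WorkingSet

Verb-noun cmdlets piped into each other. Readable enough once you have learned it, but not something a point-and-click admin will stumble into on his own.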
Besides, it’s not like there aren’t high-level scripting environments for *nix. There are at least two interactive shells that allow you to mix-and-match Python with your usual shell utilities. They just haven’t caught on in any significant way.
It’s hard to say whether this is because the existing shells are good enough, because the powerful shells are not so compelling, or because of a lack of awareness. I mean, *nix systems already provide powerful CLI tools for most system administration tasks. There’s no Windows equivalent of apt-get, and PowerShell won’t change that.
Ever heard of msiexec? MSI packages? They remind me of apt-get and .debs.
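For example, a silent install or uninstall is a one-liner (the package name is just a placeholder):

    msiexec /i SomePackage.msi /quiet /norestart
    msiexec /x SomePackage.msi /quiet

Add /l*v install.log if you want a verbose log of what it did.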
TBH it's not that hard to write command-line scripts for Windows using WSH and batch files. It's something I do a lot, if only because I'm forced to use XP at work but find a powerful CLI essential.
I've actually written Windows CLI versions of a few *nix tools like grep, as well as my own custom ones.
Oh, and contrary to popular belief, Windows DOES support piping of commands in its CLI.
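For example (the process name is just illustrative):

    tasklist | findstr /i "sqlservr"
    netstat -an | find "LISTENING"

The first pipes the process list through findstr, the second filters netstat's output through find, all in plain cmd.exe.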
I disagree completely. Any Windows admin worth his salt already reaches for the command line. You would be surprised: almost anything you can do by clicking in Windows can be done with the command line and registry keys. The big difference between UNIX and Windows when it comes to such things is that Perl has been standard in UNIX environments for decades. Now Windows will have something at least that good.
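For instance, reading and setting registry values straight from the command line with reg.exe (the key and value in the second line are made-up examples):

    reg query "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters" /v Hostname
    reg add "HKCU\Software\ExampleApp" /v SomeSetting /t REG_DWORD /d 1 /f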
Anyways, I wasn't debating the relative merits of UNIX environments. The original poster said that MS was ignoring the command line, which means he couldn't have gotten far into the article, because paragraph 2 mentions PowerShell. I also find it funny that I get modded down for mentioning that, but his obviously uninformed comment gets modded up three times.
Server Core does not include .NET, and PowerShell runs on .NET…
It doesn't matter that your server has no GUI, though. Windows servers are designed for remote administration through the MMC, which communicates with them over RPC. This is one thing the UNIX people don't seem to understand about the lack of a Windows command line or SSH environment: you don't need to log onto a machine or run a script on it, because you can control groups of machines through the GUIs or through WMI scripting, which talks to the RPC interface. Administering a remote server is effectively the same as administering the local one, and you can apply changes to groups.
(from near the bottom)
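To put the WMI scripting point in concrete terms, a query against a group of machines is a one-liner run from an admin workstation rather than on the servers themselves (the server names are made up):

    # list auto-start services that aren't running on two remote boxes
    Get-WmiObject Win32_Service -ComputerName server01, server02 | Where-Object { $_.StartMode -eq 'Auto' -and $_.State -ne 'Running' } | Select-Object __SERVER, Name, State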
The article says that .NET is obviously required in Core, but that this is a big problem because .NET has graphical dependencies that Core does not have, and that they will probably go for a stripped-down version. It doesn't say anything about a lack of either .NET or PowerShell.
And as I said to butters in another post, any Windows admin worth his salt already uses the command line extensively.
RTFP (read the post)
He was dissing the PowerShell feature because he felt command lines were somehow inherently primitive. You just wholeheartedly disagreed with his statement.
Windows Home Server and now Windows Server 2008. Both looking solid. How come they can't get it right on the desktop?
“Windows Home Server and now Windows Server 2008. Both looking solid. How come they can't get it right on the desktop?”
A couple of answers to that (I'll pretend I agree with you for the sake of argument):
1. Less hardware to support on the server. Much, much, much less. Almost all Windows instability can be traced back to some sort of faulty hardware or driver. Getting hardware onto the Windows Server WHQL list is a very stringent process, since servers are more mission-critical.
2. Servers usually have a specific task to do, so they do that one task, and do it very well. Start throwing other tasks their way, and things start to get more complicated. Desktop machines need to be able to handle pretty much anything you can throw at them, plus support many different types of usage scenarios, applications, etc.
The good news is that MS incorporates their server technologies in subsequent releases of Windows client OS’s (2000 WS to XP, Windows 2003 to Vista), and they are releasing Vista SP1 around the same time as WS2k8, so hopefully we’ll get some of that stability coming our way around then.
Competition. Microsoft really only shines when they are competing with something, which is why I feel XP and Vista are the way they are. I think Windows 7 will be a more dramatic improvement due to the competition from Apple and Linux. Personally, the last great desktop OS from Microsoft was Win2000.
As for the servers, I have been very impressed with Windows 2003 (x64 as well). Very stable and very fast.
Still trialing Win2008, but first impressions are good.
Consider Windows Firewall: When you install or configure a role like Application Server, the firewall is automatically configured so that that role will function correctly. But you can still go into the Windows Firewall GUI and manually override those settings.
That’s such a retarded comment I don’t know what to make of it…
If your admin is not capable of figuring out how to configure a firewall manually, then maybe (s)he should look for a different job…
What next? “the really annoying thing is if someone hacks your server, there’s no button to automatically disconnect the punk and return everything to the way it was”
bah!
Yeah, obviously it’s not hardcore enough.
I installed Beta 3 and tried it out. Quite good stuff, really impressed.
The downside is that almost every server app out there needs to be updated or patched because of Windows Firewall.
For instance, I installed SQL Server 2005 with SP2, which of course didn't even touch the firewall. I had to dig through the manuals and reconfigure SQL Server and the firewall so that only the necessary ports would be open.
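For anyone else hitting this, opening the default ports by hand looks something like this from an elevated prompt (assuming a default instance; a named instance may listen on a different port, and the rule names are just labels I made up):

    netsh advfirewall firewall add rule name="SQL Server (TCP 1433)" dir=in action=allow protocol=TCP localport=1433
    netsh advfirewall firewall add rule name="SQL Browser (UDP 1434)" dir=in action=allow protocol=UDP localport=1434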
All in all WS2008 is superb, I just wish Microsoft would get its licensing act together.
I mean something is just not right if even qualified licensing salesmen can’t answer my question immediately and they have to ask around for an entire week before getting back to me…
I’ll give an example:
I'm going to set up a WS2003 Standard server, install SQL Server 2005 Standard, and have 20 remote machines synchronize with SQL via HTTPS (merge replication). I'm going to have a single SQL user and a single WS2k3 user, and no AD. What licenses do I need?
It turns out I need 20 Windows XP licenses, 1 server license, 20 Windows Server device CALs, 1 SQL Server license, 20 SQL Server device CALs, and an Internet connector license, because the synchronization goes over HTTPS, which is handled by IIS.
Insane I tell you!!!
If your example is a real world example, I’d suggest rethinking your setup…
How about adding a proxy between your sql server and the 20 machines? Do these 20 machines really need direct access to the server?
A proxy server doesn't matter, because you still need licenses for all 20 machines that connect to SQL Server and Windows Server.
MS suggests you use two servers, one for IIS and a second for SQL, but I really don't see the point in that, plus paying for more licenses.
It looks like the suggestion MS proposed is about the same as what I already suggested.
Let’s say your SQL server is called A, and the IIS is called B, and your 20 machines are c1, c2, …, c20.
A <-> B
A single connection from the IIS to the SQL server.
B <-> c1
B <-> c2
etc.
With that setup, you only need one CAL for the SQL server, and the client machines only need a regular license (assuming they are Windows clients).
Your original scenario was to have all clients connect directly to the SQL server, but with this setup you only get a single connection to it.
Your MS contact should be able to explain it all better.
In my setup, the clients connect to IIS, which in turn connects to SQL. IIS uses a predefined SQL account. When the clients connect to IIS, they also supply a predefined Windows Server user so that the DLL hosted in IIS can run. The DLL handles all the SQL traffic.
So as you can see, the clients don't connect to the SQL server directly. The only difference between your suggested setup and mine is that in my setup A and B are the same machine.
Putting “proxies” between clients and a server doesn't reduce the number of CALs required!!
So I still need 20 SQL CALs, because I need CALs for all the machines that connect to the SQL server.
I also need 20 Windows Server CALs because the client machines connect to IIS.
Is this all what your MS salesman has told you, or what you have come up with yourself?
And I'm not sure what you are doing with IIS that would require you to have server CALs to connect to it.
Many websites run IIS as their web server and they certainly don't have CALs for every single visitor.
And in cases where that web server has SQL Server on the backend, they certainly don't have SQL CALs for every single visitor.
If you insist on what you are saying, I can only suggest you have a long talk with your MS salesman to clear up all the questions regarding your setup. Is this salesman someone from MS or from some company that resells licenses btw?
In either case, I’d get a second opinion from someone that’s independent of the first, and if possible, make sure at least one of them is actually from MS. Obviously you shouldn’t tell either that you talk to the other.