Microsoft’s efforts to bolster security in Windows XP will likely delay the release of a widespread test version of its forthcoming OS, Longhorn, until next year, Microsoft’s executives told CNET News.com.
That alone shows Microsoft isn’t doing well. They reassigned lots of Longhorn programmers to XP SP2. They don’t have enough horsepower to drive all their projects, so some features will be dropped from Longhorn (according to that link). It just confirms a fact: Microsoft has been coasting in the sky too long, and now they’ve realized they have competition and that their OS is shit. So they’re trying to rewrite half of it with Longhorn, but now they’re learning that it’s not that simple… OSS is already producing _lots_ of software (the kernel, KDE, GNOME, Mozilla, OpenOffice), and there are thousands of other projects appearing everywhere each day.
It would seem, rather, that deploying people to handle SERIOUS security updates (i.e., SP2) has caused a shift of focus. Even the 900 lb. gorilla only has so many people to throw at a situation. Perhaps they want to squash the bugs in what they already have before turning loose another beta that they’d then have to track as well.
Well, though this is disappointing to a point, is it really that surprising? Lots of MS releases have been delayed, and Longhorn is no exception. Besides, you have to give them credit: they are refocusing on the security of their system while developing what is one of their best OSes to date. Also, though many will argue this, there is a vast variety of things that make Windows as decent, or at least as simple, as it is, so these things can take a lot of time. Plus, with a limited number of people to do the development, this should be expected. I’m fine with it myself; I mainly use Linux on my one computer, and XP holds up fine too. They might as well take their time and do it right.
One of the secondary effects that is going to crush Microsoft is that the other application vendors haven’t caught their applications up with the security improvements.
For example, on my multi-function printer (contradiction noted), I can’t use the scanner with an unprivileged account. I don’t scan frequently, but I have to use an administrator account to do it, because the tool simply doesn’t work otherwise.
I guess I could try making my working account an administrator account long enough to re-install it and see if that cures what ails me, but my point is that the installed application base has some catching up to do in the configurability department, and I think this won’t help Mr. Softy.
I always love it when the OSS guys claim that it’s taking off, then you show them that graph and they claim google isn’t a good source for data, even though Windows is the least likely to have its browser use that search engine.
if MS had focused on good code that was secure from the start…then this would not have been such a big deal.
right now… it is like living in a house and never cleaning anything up at all. Then one day you look around and say, “crap, I’d better clean.” It takes about 100,000 times more out of your week to clean everything up at that point than it would have taken had you done it regularly.
same applies to MS and windows.
Interesting. I’m not surprised about the time slip, but I am surprised that they have publicly announced they will be stripping features from Longhorn. Surely that’s not good publicity? If they haven’t said feature X is going to be in Longhorn, why tell the public now that feature X isn’t going to be in it?
Also, I’m surprised they said “If we don’t really know how to do something by now, it’s probably time that we think about not putting it into the product.” I’m not sure if this is good or bad. The good side is that people will think, “wow, they’re really taking their time and making sure everything works.” The bad side is, “so we’re going to have 2004 ideas and technologies in a 2006+ operating system?”
Good article, which is surprising for CNET.
Matt
Right, and what would BeOS have given Microsoft?
Microsoft has done the right thing. I would rather have Longhorn released late than have it released half-finished and buggy.
BeOS would have given Microsoft a solid basis for an operating system. Of course, they already have that; they just can’t get the “building decent stuff on top of it” part right.
Is it really that surprising? Every version of Windows has been late. This time it is going to be particularly late because of the complete rewrite of most of the system. Microsoft has also been known to promote vaporware, so it is not much of a surprise to see them pull some features out of Longhorn. As usual, the initial release will be in beta form and take at least a couple of service packs to become useful at all. OS X was the same way; it really didn’t become usable until 10.3. The difference with open source is that betas and tests are labeled as such. For instance, by the time 2.6 was marked stable, it had become remarkably stable. I haven’t had an issue with any of the stable 2.6 releases.
A solid basis? By solid do you mean barely networkable, not multi-user at all, and non-portable? I think Microsoft is much better off with the NT kernel.
about security in the past?
Obviously not too much as this article proves.
“If they haven’t said feature X is going to be in Longhorn, why tell the public that now feature X isn’t going to be in it.”
You got the first part right: they really haven’t been very specific on the details of Longhorn besides Aero, Indigo, WinFS, and Avalon, and even those are very vague. So they haven’t said feature X will be in Longhorn. They just as vaguely (actually even more vaguely) said that some things would be left out. So they actually haven’t told us that feature X will be dropped; they said that some features would be, nothing specific. I understand what you are getting at, though: why say you’re dropping features when we don’t even know what features are going into it? Maybe they just want more publicity? Maybe the guy just wasn’t thinking clearly because he had just been April Fooled by some of his coworkers? Who knows?
Personally, I’m getting sick of reading about Longhorn. Yeah, as of right now, I am not going to read any more Longhorn stories until release. I can pretty much do anything I need in Linux right now, except play some of my games, which is why I dual-boot with XP. Until some company releases a game which I absolutely must have that requires Longhorn (or whatever it’s called by then), I’m not upgrading.
“It really didn’t become usable until 10.3.”
Come on, Mac OS X was already what you call “usable” as of 10.2.
And moreover, Microsoft has far more resources and money than Apple. I simply think that Microsoft is trying to put too many things into Longhorn, and they are realizing that it’s not possible and that it takes time. It also depends on how far the actual Windows XP code base can be extended with new technologies without breaking everything. I think they realized that it will take them a lot of work and time.
And I am really surprised that Microsoft needs to put so many of their developers to work on Windows XP SP2 to enhance the security of their system, thereby slowing down the development of Longhorn. It seems that Microsoft has a lot of things to fix, and fixing them seems to take a lot of human resources.
“if MS had focused on good code that was secure from the start…then this would not have been such a big deal.”
NT was focused on security from the start, from a design point of view.
This “secure from the start” stuff is BS anyway: do you really think UNIX was implemented with security in mind in the ’70s? Come on! Same for Linux. These OSes were never conceived with security in mind at the beginning; even OpenBSD is derived from BSD, which was not focused on security. The Unix model is awful where security is concerned.
And when the 2.4 kernel was marked stable, was it really stable? And wasn’t it at least one year late?
These kinds of schedule slips aren’t caused by having closed or open source, but by having a feature-boxed schedule instead of a saner time-boxed one.
Yes, UNIX was implemented with security in mind in the 70’s. It began as a multi-user system which, by default, implies security and scalability, very much unlike NT. By the way, BSD was derived from UNIX so the beginning for it was when UNIX was begun, in the 70’s.
If the UNIX model is so awful, what is your idea of a good model?
“Come on, Mac OS X was already what you call ‘usable’ as of 10.2.”
I guess that depends on who you talk to.
“And when the 2.4 kernel was marked stable, was it really stable? And wasn’t it at least one year late?”
I never said open source projects were always on time. They are often late, just like commercial software projects. The problem with commercial software is that it is often shipped late and incomplete. NT was unusable when it was released, and so was 98. ME was garbage no matter what you did to it. XP is only now getting needed security updates; how long has it been? At least Linux has been improving at providing stable, featured releases. 2.4 wasn’t the most stable release, but how long ago was that? It’s been quite a while, and it doesn’t represent Linux development on the whole. The 2.4 fiasco was more of an aberration in the Linux development cycle.
“BeOS would have given Microsoft a solid basis for an operating system. Of course, they have this, they just can’t get the building decent stuff on top of it part right.”
Whatever…
Even Apple didn’t use BeOS in the end.
And MS didn’t want to move to a new kernel, as Apple did with OS X. MS wanted to carry the NT/XP line of kernels forward and add various features on top of that for Longhorn.
I haven’t read all the posts in this thread, so my post might sound a bit off-topic.
I prefer Linux and all the other *nixes to MS, to the point that at the moment that is all I have on my PC.
However, I can give credit where credit is due: Win2K and Win2003 are very good operating systems, IMHO.
Why, oh why, can’t they make a desktop edition of Win2003?
If they did, and the price were reasonable, I’d buy one tomorrow.
Apple took a big risk by hanging their hopes on an open source kernel, and it paid off big. Apple has made a 180-degree turnaround in the market. The advantage they have is a proprietary system based on an open source project, which gives them the benefits of both open and closed source. You are starting to see this happen with Linux now, but Linux runs on cheap PCs, not $2000 PowerPC machines. Unless Microsoft changes its attitude toward open source and the OSS community, this delay will cost them dearly. In the meantime, the many thousands of OSS developers and their minions are starting to produce polished software at an astounding pace. With backing from such companies as IBM and Sun, open source is about to achieve critical mass. By the way, has anyone tried the Sun Java Desktop? I’m thinking of giving it a whirl, as it is only $50.
“Yes, UNIX was implemented with security in mind in the 70’s. It began as a multi-user system which, by default, implies security and scalability, very much unlike NT. By the way, BSD was derived from UNIX so the beginning for it was when UNIX was begun, in the 70’s.”
I remember back in the mid-’90s (before hacking Windows became a professional sport) when all the hacking/cracking tutorials were for breaking into Unix systems. Working for an ISP, I saw our Unix box get rooted several times. It may be more secure than Windows, but is that really saying much? Like anything else, without a knowledgeable user, no OS is really secure.
Most of today’s security problems involve internet worms that are trojan horse programs that provide a platform from which to launch a denial of service attack from many hundreds of computers. In Linux such a program could not obtain the permissions it needed without user intervention. I know of NO Linux user that would give the root password for an unknown process to run. Hacking individual machines one at a time is merely small-time vandalism compared to hacking thousands at a time. With your attitude we should just say it is hopeless and drop the firewalls and install Windows Me on all the machines, then let the vandals have a field day.
z1xq,
If more “average” users started using Linux, you could be very sure that they would give whatever was necessary for an unknown process to run. Just like they do currently with Windows. Even users who are running personal firewalls usually just say YES to whatever wants to access the network or run as a server. In fact, I would imagine that most people would run their systems as root anyways. If they aren’t using generic user accounts on Windows now, why would they in Linux?
“If more ‘average’ users started using Linux, you could be very sure that they would give whatever was necessary for an unknown process to run. Just like they do currently with Windows.”
Exactly! If most Windows users were as computer literate as most Linux users are, you’d probably see about a 95% drop in the security problems on Windows. I assure you – any Joe Sixpack running Linux would gladly give up the root password for anything promising them nude pics of Britney or Christina. So, how secure is the OS once some rogue process gets root privileges?
The only way an OS could be truly secure is if the user were allowed to run anything they wanted without damaging the system. In other words, scripts or programs would never have the ability to delete or modify files, along with about a thousand other things. As a result, no OS can ever really be secure in the hands of the wrong person.
Well, I am deeply convinced that even if you run Linux as root, a virus or a trojan can never cause as much damage as it does in Windows. That is because of the entirely different ways the two operating systems were conceived and work.
“Yes, UNIX was implemented with security in mind in the 70’s. It began as a multi-user system which, by default, implies security and scalability, very much unlike NT. By the way, BSD was derived from UNIX so the beginning for it was when UNIX was begun, in the 70’s”
Multi-user is in no way a security model by itself. In the ’70s, everything was open to everybody. I remember one of my LISP teachers saying that there were no login passwords at his school in the late ’70s!
UNIX was implemented to be portable thanks to C, and to be multi-user, very simple, and very open! Neither Ritchie nor Thompson thought they were designing one of the most used OSes in the world. Like C, Unix is a nightmare, but better and simpler than what came after. At the beginning, Unix spread because of its portability, not because it was superior. In fact, it was pretty bad at the time.
In the ’80s, there were buffer overflows everywhere, because they were very easy to exploit (it is more difficult on Windows, I think, but I don’t know the details).
Remember, at the beginning of the ’70s there was no Ethernet and of course no Internet, just terminals for programmers. MULTICS was more secure by design. The real benefit of Unix is its simplicity and the idea of one command for one thing: don’t do too much, but do it well.
When NT was designed, there was, at the beginning, security in mind, portability, and so on.
But nobody cares! Linux’s success is exactly the same as Unix’s: not the best, not the smartest, but used everywhere. Why? Not because it was designed. Heck, even Linus and Alan Cox say that Linux is NOT designed, but follows an evolutionary process. It succeeds not despite that fact, but BECAUSE of it.
Here is a very interesting thread about that:
http://kerneltrap.org/node/view/11
“> I’m very interested too, though I’ll have to agree with Larry
> that Linux really isn’t going anywhere in particular and seems
> to be making progress through sheer luck.

Hey, that’s not a bug, that’s a FEATURE! You know what the most complex piece of engineering known to man in the whole solar system is? Guess what – it’s not Linux, it’s not Solaris, and it’s not your car. It’s you. And me. And think about how you and me actually came about – not through any complex design. Right. “sheer luck”.

Well, sheer luck, AND:
– free availability and _crosspollination_ through sharing of “source code”, although biologists call it DNA.
– a rather unforgiving user environment, that happily replaces bad versions of us with better working versions and thus culls the herd (biologists often call this “survival of the fittest”)
– massive undirected parallel development (“trial and error”)

I’m deadly serious: we humans have _never_ been able to replicate something more complicated than what we ourselves are, yet natural selection did it without even thinking. Don’t underestimate the power of survival of the fittest. And don’t EVER make the mistake that you can design something better than what you get from ruthless massively parallel trial-and-error with a feedback cycle. That’s giving your intelligence _much_ too much credit.

Quite frankly, Sun is doomed. And it has nothing to do with their engineering practices or their coding style.

Linus”
To complete what I wanted to say (it is really a pain in the A** not being able to edit one’s post):
I am still quoting Linus Torvalds:
“And I will go further and claim that _no_ major software project that has been successful in a general marketplace (as opposed to niches) has ever gone through those nice lifecycles they tell you about in CompSci classes.

Have you _ever_ heard of a project that actually started off with trying to figure out what it should do, a rigorous design phase, and a implementation phase?

Dream on.

Software evolves. It isn’t designed. The only question is how strictly you _control_ the evolution, and how open you are to external sources of mutations.

And too much control of the evolution will kill you. Inevitably, and without fail. Always. In biology, and in software.

Amen.

Linus”
If you read what I said carefully, you would see that I never said Windows was better with regard to security. It may be better, it may be worse, but that has no strong link to its architecture. Look at all the recent flaws in Windows: they are not major kernel flaws, “just” buffer overflows everywhere. Is that an implementation problem, or a design flaw in part of the API? Either way, it is not in the design of the NT kernel itself.
I am using Linux almost 100% of the time now. It is not designed. It is a real mess. But I don’t care too much, because there is (well, almost) always a way to solve a problem.
Linux has also been “about to be ready for the desktop next year” every year since 1999.
And it took 4 years for Mozilla to get usable.
So, it’s ok. We have all the time in the world.
Obviously we have a deep interest in Longhorn, which is why we are babbling about its release date. I think Microsoft is on the right path with Longhorn: take your time and don’t rush. I think it’s refreshing to see Microsoft taking a real interest in the security of their current products, such as Windows XP and the Server line, along with the other solutions the company develops.
Yes, development on Longhorn has slowed, but I can guarantee that by early 2005 we are going to start hearing a whole lot from Microsoft about Longhorn and its betas. Right now Microsoft is making the right move with developers, companies, and partners by giving them a basic understanding of what Longhorn is all about and how to prepare for it with alpha builds; PDC was the starting point, and WinHEC will continue that tradition.
By the way, why are we in such a hurry to upgrade to Longhorn? It’s just two and a half years since the release of XP. We are in a big hurry to reformat, upgrade, and clean-install a completely new version of Windows, but a decent, effective Service Pack 2 is somehow a major problem.
Strange
Just have patience people, you will see.
“Well, I am deeply convinced that even if you run Linux as root, a virus or a trojan can never cause as much damage as it does in Windows. That is because of the entirely different ways the two operating systems were conceived and work.”
That conviction would be wrong. A process executing as root on Linux can do just as much damage as a process running as Administrator on Windows. Linux is not invulnerable. Neither is UNIX invulnerable “by design” (IRIX was historically very insecure by default), nor even stable (Solaris 2.0–2.4).
@ Craig
“If more ‘average’ users started using Linux, you could be very sure that they would give whatever was necessary for an unknown process to run. Just like they do currently with Windows. Even users who are running personal firewalls usually just say YES to whatever wants to access the network or run as a server. In fact, I would imagine that most people would run their systems as root anyways. If they aren’t using generic user accounts on Windows now, why would they in Linux?”
That’s such bull. Any distro nowadays creates a regular user account for the average Joe anyway. By default they will not be using root, which means no, you will not be looking at the chaos that Windows has caused. Personally, I really don’t care. I can’t wait till Longhorn comes out! You CANNOT get rid of a DRM-signed trojan, period. Once I infect you, you’re finished. 😀 Great stuff!
That comment is like saying that if Linux were as popular as Windows, it too would be just as exploited, which is just as uneducated. Yes, user knowledge does help in securing an OS, but the OS itself also has to be secure. If you guys want to look at history: every Windows OS, including today’s, gets infected with the same trojans over and over, while Linux keeps improving. The only difference with Longhorn is that the trojans aren’t going to go away. They will remain in the system.
But again, I like the fact that there are believers in the M$ world. For if there weren’t, and people started to run Linux, I wouldn’t be having quite as much fun!