Sources at the company told Paul Thurrott this week that Microsoft will soon delay the release of Windows Vista Beta 2 from December 7, 2005 to sometime in January or February 2006. However, because the Vista development schedule is extremely time constrained, the company will try to make up the lost time by eliminating one of the release candidate (RC) milestones planned for later in the process.
awesome, one less release candidate.
hello more bugs in the final!
sweet strategy.
Actually, this might sorta work. It’ll allow them to get SP1 out the door faster; numerous companies will refrain from switching to Vista until at least SP1. At the same time as allowing them to roll out SP1 faster, it’ll have the somewhat ironic effect of giving them more time to develop the fixes. And another side benefit is the massive testing base the OS will have.
I can see your irony, but aggravating your users by shipping prematurely can hardly be seen as a benefit. I do hope Microsoft keeps up with their “poor” work, as it’ll make people switch to alternative solutions.
Indeed, but what did you expect from a Windows XP/2003 with a facelift? Vista builds on the same old code with the same old broken design.
Windows 2003? I wish you were right, because believe me, said by an anonymous penguin, that is a good OS.
Unfortunately they didn’t listen to the (many) people asking for a Workstation edition of 2003.
Didn’t you know that Windows Vista is built off of the Windows 2003 kernel? The entire 4000 series of builds were a kludge off of XP SP2. It had been patched so many times that the developers could not get any work done. It was decided to scrap all of Longhorn and start again with the clean Windows 2003 kernel. With Vista’s hardware and presentation underpinnings, it is now classed as NT6 and not NT5.3.
Well, I am honestly glad to read that.
Actually, they would’ve classified Vista as NT6 no matter what. If nothing else at least for the fact that it looks better to release it as a new major version if you’re touting/hyping Vista as a huge leap forward…
I’ll be surprised if they do ship in late 2006, particularly with this latest batch of delays. However, I do also wonder what would happen if they were able to release on that date?
Apple (I think) really wants to release OS X 10.5 about the same time as Vista in order to show how they’re still ahead of Microsoft. Admittedly, that gives them about a year and a half since the last update, but still, it seems a bit tight. Especially considering they said they were planning on slowing down the release cycle now.
On the other hand, Apple may aim for early 2007, that way the hype over Vista will have mostly subsided and Apple can say, “Wow, Vista can do that? Well look what Leopard can do!”
Oh well, it’ll certainly be an interesting time!
And I wonder what Ubuntu 7.4, Mepis Spring 2007, Mandriva 2007, Fedora 6/7, and other Linux releases will have to truly put pressure on Vista.
Hmm …
Absolutely nothing. It will be the same old crap, just with prettier graphics and more bloat. You know it, I know it.
Agreed.
BTW I don’t think linux is poo. But you are correct except for the prettier graphics.
This is called “wishful thinking”…
IMO the time for Linux On The Desktop has come and gone; people are now a bit more cynical regarding the capability of the “community” to do such a job.
For the average user, Mac OS is the “next cool thing to try”, not Linux anymore.
But then again this is all based on my own perceptions. I could be wrong (and, as a Linux enthusiast, I hope I am).
>>For the average user Mac OS is the “next cool thing to try”<<
Yes, at the moment it is very fashionable, even I am very tempted.
But it remains to be seen what really happens.
If they keep insisting on selling their hardware for double the price, I can’t imagine “the average user” very tempted. Maybe more the geek user.
Well, for many this “high price” argument is more of a myth… and there’s a news item here at OSNews (probably in last week’s archives) that about a million switched in 2005, including my sister, who is everything but a geek. Her argument? She just wants “peace of mind”, and that’s worth the extra grand.
“for many this “high price” argument is more of a myth..”
Is it a myth when you pay *these* kinds of prices:
http://store.apple.com/Apple/WebObjects/italystore.woa/90813/wo/fK2…
for such low spec hardware?
Hardware is just one variable in the equation, and for many users not the most important one.
What about software, ease of use, security, etc.?
I agree. The rise of Mac OS X destroyed a couple of myths:
1) that the so-called “community” (seen as a whole) was able to deliver something as big as a Linux ready for the desktop.
2) that the task was “very difficult”. Apple showed that if you have proper knowledge, if you are committed and if your organization is good, you can do it. Though it is difficult for OS X to grow too much, they showed that it is possible.
The hype looks gone now and Linux is facing the controversy of being unable to catch up with and match the other players. However, Linux has the licensing-cost advantage over its competitors. That’s not a secondary thing. While I prefer Windows, if someone asks me about setting up an Internet point (for ex.) with 50 PCs to let people connect to the Internet, write and print documents and so on, I’d say “Go with Linux and save your money”. However, we cannot say that system is innovative, in the sense that OS X or Vista are.
However, that’s not all the community’s fault. The community myth is gone, but the real problem is that its place has been taken by conflicting corporations which have different goals and are competing with each other. That’s not the right ecosystem for developing a common platform.
And, as MS and Apple showed, controlling the platform is the most important thing here.
Nice post. But there is one flaw in your statements…
See -> Linux is ready for the desktop.
Look at the Windows desktop. It has hardly developed since 1995, without fixing one single bug in the desktop. Neither Mac, nor Windows can handle drag’n’drop on printers properly (the same goes for Gnome and KDE).
And besides that, Apple did not create a new desktop environment, nor a new OS. It merely redesigned an existing OS (NEXTSTEP) and its desktop. The dock is taken directly from NEXTSTEP (btw. a very good decision).
Gnome and KDE began from scratch which is a lot more difficult. Even for a company, especially without the needed know-how.
Despite that, KDE and Gnome have managed to include functionality and usability in a way that is on par with Windows and Mac OS X.
“Gnome and KDE began from scratch which is a lot more difficult. Even for a company, especially without the needed know-how.”
Gnome and KDE are both thin layers on top of a rather large stack of existing software.
“Gnome and KDE are both thin layers on top of a rather large stack of existing software.”
Care to elaborate on the LARGE stack of software that you are referring to?
Last I checked, it sits on top of X. Hmmm… that appears to be one software package. Granted, the X package is about 80 MB compressed and takes about an hour to compile, but besides a few font packages, there are more software layers in Gnome and KDE than there are in the underlying graphics foundation, X.
Why not cut out all betas and RCs just to make sure you deliver on time?
After all, MS’s reputation for producing buggy, insecure software will still be intact!
If they keep delaying it like that, then:
*Vista will come out after the Mactels
*Linux desktop environments will be able to catch up to the features that will come in Vista
*The open source kernels will have everything Vista promises and more.
From this eXPerience, the only thing I think can be concluded is that the pace at which Windows releases are made has caused more damage than good. I mean, to keep up with the latest, what Microsoft should have done was keep making new releases on a regular schedule. Think about it. The release cycle of Windows over the last ten years has been anything but regular.
*Windows 4 was released in 1995 (NT 4 and Win95)
*Windows 4.10 was released in 1998 (three years)
*Windows 5 was released in summer 2000 (five years after NT 4)
*Windows 4.90 was released in winter 2000 (two years after 4.10)
*Windows 5.10 (XP) was released in October 2001 (one year after the previous releases)
*Windows Server 2003 was released in 2003 (two years after XP)
Now, with Vista coming in 2007, it means that the Windows desktop operating system won’t have had any major update in 6 years (3 for the server). Now that’s unacceptable in my book.
Aye? What space are you occupying? Microsoft is releasing Windows 2003 R2, which is an update to Windows 2003 in the form of a new release; I’d hardly say that is ‘terrible’.
Considering that end users don’t like upgrading with the same ferocity as the geeks here do, as long as Microsoft keeps those service packs coming, who cares if there is a long gap between releases? If they started releasing them frequently, I’d put money on it, you’d be here whining about their ‘forced upgrades’ and ‘gouging money from customers’.
My point is not about forcing upgrades. It’s about continuous development, and making releases more often so the transition from one release to the next is less painful. Now, if MS made predictable new releases every two years, for example, supported two releases (with major features) at a time, and kept API compatibility across at least four releases, users would only need to upgrade every 8 years (those who like to take their time). And by then they would have to get new hardware anyway. Oh, and to make my point clearer, I’m talking about the desktop versions of Windows. Windows Server and company can be released at a slower pace, like one new release every 4 years, with a support period 50%-100% longer than the desktop edition.
Maybe we can put some hope into Singularity?
Or they could fork an OSS kernel and put a virtualized Win32 platform on it that would run the old apps.
Edited 2005-11-11 05:03
Also a good idea. I suggested something like that a short time ago.
A while ago while reading a Longhorn related article on Winsupersite, it was mentioned briefly that there was at one time plans to have an entirely new kernel for Vista.
It’s not gonna happen now by the looks of things (and I’ve found nothing beyond that article saying that there was going to be a new kernel), but despite the fact that Singularity is a research project, it (or something derived from it) may well become the new Windows kernel some years out, though probably no sooner than Blackcomb (or whatever the first major post-Vista release is called).
“though probably no sooner than Blackcomb”
I’d find that reasonable. One can’t expect that Microsoft, one or two years after Vista, releases something which cannibalizes it (it is a commercial company after all 🙂 )
But it is very important that they understand that they *must* break with the past.
*Linux desktop environments will be able to catch up to the features that will come in Vista
*The open source kernels will have everything Vista promises and more.
What makes you think this is true? As I mentioned to “Linux is Poo”, I like GNU/Linux.
But for me as a user, nothing has changed since Redhat 9 -> Fedora 4. How many years is that?
The UI hasn’t advanced, the sound architecture hasn’t advanced (let’s forget about proprietary codecs, etc).
Where is the ability to rip a DVD to uber-advanced Ogg/Theora, built into the distro? This could be legal; they just need to make it work like Nero does on Windows, which states that the content needs to be unencrypted.
John Blink
But for me as a user, nothing has changed since Redhat 9 -> Fedora 4. How many years is that?
The UI hasn’t advanced, the sound architecture hasn’t advanced (let’s forget about proprietary codecs, etc).
3 years from RH9 to FC4.
But the fact is that Fedora appears to be stuck in an infinite loop when it comes to usability and eye candy (the only visual improvements being the splash screen and Gnome theme). In the meantime, both desktops have seen great improvements in usability and eye candy when used in distros like SuSE and (K)Ubuntu. And the proprietary drivers/codecs remain an issue in non-commercial distros because of the legal tangle, as you yourself mentioned (which doesn’t mean that commercial desktop distros lack that support).
CPUGuy
Also, Microsoft used to have a twelve- to eighteen-month release cycle, and EVERYONE said that it was too much too quickly, they (they being everyone) couldn’t keep up, etc… I guess they felt obligated to upgrade to every version of Windows that came out. This was such a hotly debated issue.
Yeah, one year may be too short a time. But my suggestion was about releasing them every two full years (not 1 nor 1.5). And besides, Apple releases Mac OS at a rate that’s faster than what I suggested for Windows (they’re also commercial, you know). Now, Microsoft did screw that up when they kept releasing their server OS faster than they were releasing their desktop OS.
edit: CPUGuy, I’m modding down your second post and modding up the first; nothing personal, it’s just that I can’t take duplicates.
Edited 2005-11-11 20:28
Seeing as you have mismatched versions of Windows….
Also, Microsoft used to have a twelve- to eighteen-month release cycle, and EVERYONE said that it was too much too quickly, they (they being everyone) couldn’t keep up, etc… I guess they felt obligated to upgrade to every version of Windows that came out. This was such a hotly debated issue.
Now you are saying to go back to it?
Seeing as you have mismatched versions of Windows….
Also, Microsoft used to have a twelve- to eighteen-month release cycle, and EVERYONE said that it was too much too quickly, they (they being everyone) couldn’t keep up, etc… I guess they felt obligated to upgrade to every version of Windows that came out. This was such a hotly debated issue.
Now you are saying to go back to it?
——————-
BTW, I will repost ALL posts that get modded down when they are not flames.
I do not have a good feeling about this…
One RC down?
And we all know how software made under a tight schedule ends up…
Does anybody here think this is good?
Short answer: no. 🙂
Microsoft missing yet another Longhorn deadline?
This sucker was supposed to ship two years ago.
Seriously, how is this news?
A two to three year late product slipping another couple of months late?
Yawn…
Microsoft has more resources to throw at its own products than almost any other company on the planet; why, or rather, how do they manage to titanically fumble things so often?
At the rate they’re going, MS will be 4 years late in getting Vista out the door (with no less than 3 or 4 “revisionings”), with major features like WinFS planned and subsequently dropped.
Now they’re dropping a serious bug-hunting release because of what? Incompetent developers? Corporate bureaucracy? Their nine (!) different editions… designed to please “everyone”?
Most of the new features they’re adding are already stable and available in other OSs.
Well, anyway, let’s hope Vista is better than the forecast.
Vista “final” will be a beta anyway. It took Windows XP at least until SP1 before it was reasonably stable.
>Indeed, but what did you expect from a Windows
>XP/2003 with a facelift? Vista builds on the
>same old code with the same old broken design.
Dave Cutler created VMS and then he created Windows NT’s kernel. You do know that XP/2003/Vista run on his kernel right? It’s much better than what is in Linux.
This isn’t windows 95/98/ME anymore, boy! This is run by a superior kernel.
{You do know that XP/2003/Vista run on his kernel right? It’s much better than what is in Linux.
This isn’t windows 95/98/ME anymore, boy! This is run by a superior kernel.}
ROFL.
You do know that XP/2003/Vista retains binary compatibility with 95/98/ME right? That means that binaries designed to operate on single-user OSes windows 95/98/ME (95 is not even networked) will still run on XP/2003/Vista.
That in turn means that inherently XP/2003/Vista is borked by design.
Superior kernel my arse!
Superior to what, exactly? MSDOS?
ROFL.
http://www.eweek.com/article2/0,1895,1884318,00.asp
“From the ground up, Linux was designed to be a multi-user, networked operating system. Even now, Windows shows its creaky history as the descendent of a single-user, stand-alone PC operating system.”
Edited 2005-11-11 11:01
“Windows shows its creaky history as the descendent of a single-user, stand-alone PC operating system”
You my good man have never heard of NT4 I see. An ugly, but functional OS that’s had full vertualization, multi user support and a solid kernal since 1996 still used by businesses and you can even install Visual Stuido on it despite the OS being nearly 10 years old.
Shall we discuss the state of Linux in 1996, or are you done with your trolling?
1. I had to suffer using NT4 for many years at work. Shudder.
2. What is vertualization?
Answers.com asks:
“Spell Check
Did You Mean:
virtualization (technology)
vernalization
verbalization
fertilization (process)
verbalize
ritualization
ritualize
vitalize
brutalize
virilization
fistulation
cartelize
Fertilisation
formalize
formulize
vandalize
feudalize”
I’m guessing you meant the first, but it is clear that you can’t really know what you are talking about if you can’t even spell it.
3. What is a kernal?
Answers.com says:
“KERNAL
Wikipedia
The KERNAL is Commodore’s name for the ROM resident operating system core in its 8-bit home computers; from the original PET of 1977, via the extended, but strongly related, versions used in its successors; the VIC-20, C64, Plus/4, C16, and C128. The KERNAL consisted of the OS routines (in contrast to the BASIC interpreter routines, also located in ROM) and was user callable via a jump table whose central (oldest) part, for reasons of backwards compatibility, remained largely identical throughout the whole 8-bit series.”
An 8-bit OS? Basic? Even Windows is better than that, surely?
But the real howler of your post is this one:
NT4 has … multi user support … since 1996.
NT4 runs binary programs designed for Windows 95. Windows 95 is a single user OS. The Windows 95 filesystem had absolutely no support for user or owner or execute permission to be associated with a file. Since NT4 and later derivatives would run binaries designed for Windows 95, the so-called “multi-user” nature of the OS is a thin veneer at best. I repeat, NT4 will attempt to run a file based simply on the fact that its file name happens to have one of a set of extensions. It has to do this in order to support backwards compatibility with Windows 95 – which was a single user OS. Files have no “owner” – the filesystem doesn’t support it.
Therefore, the so-called “multi-user” nature of Windows NT4 and all its derivatives is a very thin pseudo-layer at best.
The very design of Windows is borked from the outset.
So, are you done with your lame (and wildly incorrect) defense of a lame, proprietary, expensive, insecure and buggy OS?
He didn’t troll. He was a bit offensive here and there, but that doesn’t qualify as flaming or trolling.
He is right, you know
“You my good man have never heard of NT4 I see. An ugly, but functional OS that’s had full vertualization, multi user support and a solid kernal since 1996 still used by businesses and you can even install Visual Stuido on it despite the OS being nearly 10 years old.
Shall we discuss the state of Linux in 1996, or are you done with your trolling?”
Yes, let’s discuss that. Cutler did a GREAT thing when he developed the kernel for NT4, but, if you remember your history, the whole thing was borked up because of Bill Gates’s desire to remain backwards compatible with old versions. Therefore, and this is documented, Cutler could not guarantee that the kernel would be stable; guess what, 10 or more years later, Cutler has been proven correct. The Windows kernel is STILL not stable.
Linux in 1996 was already running enterprise applications (namely Apache) and was light years ahead of the Windows operating system from a security and stability standpoint; and it was only at version 1.3, a development kernel, at that time.
Linux came out the door with security in mind. Unix, on which Linux is mainly based, was far superior to Windows in every aspect; it still is today. The only thing that made Windows so popular was strong-arm tactics and a “kiss your @$$” interface.
Edited 2005-11-11 15:32
Because it can run older binaries, it is borked by design?
I suggest you figure out how exactly computers/OSes work.
Your lame arguments based on typos are not just unfriendly but also childish; please stop. The Win32 API was designed with access control and permissions in mind. All functions that provide access to an object take an lpSecurityAttributes parameter. In Win95 this was always set to NULL, which translates to default user access on NT4 and above (it’s been a while, it might be a bit different). This is called FORWARD COMPATIBILITY, repeat 3x. Also, FAT never supported ownership; it does not on Linux or anywhere else. NTFS does. I’m not an MS fanboy and I never was. Please next time give details if you want to argue, and stop playing with words.
“Your lame arguments based on typos are not just unfriendly but also childish; please stop. The Win32 API was designed with access control and permissions in mind. All functions that provide access to an object take an lpSecurityAttributes parameter. In Win95 this was always set to NULL, which translates to default user access on NT4 and above (it’s been a while, it might be a bit different). This is called FORWARD COMPATIBILITY, repeat 3x. Also, FAT never supported ownership; it does not on Linux or anywhere else. NTFS does. I’m not an MS fanboy and I never was. Please next time give details if you want to argue, and stop playing with words.”
Oh ye of little understanding, here is an experiment for you to perform:
On your “multi-user” (snicker) OS – NT or anything above it, it matters not – copy a file with a .exe extension (Notepad.exe will do just fine) onto another volume which is formatted FAT32 or FAT16 – a USB stick or even a floppy disk will do.
OK, now right-click on the copy of the file (say, on the USB stick) and tell me who owns it. Do any of the “users” of your so-claimed “multi-user OS” own that file? Has the FAT32 filesystem lost track of who owns it? Perchance?
OK, now double-click the file.
Does it still execute? Hmmmm. How did the OS decide whether it should execute, if the file does not have an owner or execute permissions set?
Multi-user OS? Pffft.
PS: Just to make it perfectly clear – Linux also cannot set owner or execute permission attributes on a file on a FAT32 volume – as you say, that filesystem doesn’t support them. But then again, no way in hell will Linux execute a file without execute permissions – so one can’t run a file from a FAT32 volume.
Comprendez?
Edited 2005-11-11 13:27
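The experiment above has a mirror image on the Linux side: the kernel simply refuses to run a file whose execute bit is not set, no matter what its name or extension is. A minimal sketch (the file name and path here are arbitrary; any POSIX shell will do):

```shell
# Create a trivial script, strip its execute bit, and try to run it:
# without the x bit, the kernel denies the exec() outright.
printf '#!/bin/sh\necho hello\n' > /tmp/execdemo.sh
chmod 644 /tmp/execdemo.sh              # rw-r--r-- : no execute bit
/tmp/execdemo.sh 2>/dev/null && echo ran || echo refused
chmod 755 /tmp/execdemo.sh              # grant execute permission
/tmp/execdemo.sh                        # now it runs
```

The point being demonstrated: execution is gated by a permission bit the administrator controls, not by the file's extension.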
Well, it all depends on how your mount points are set up. It is nice that you can select a user who will own files on a FAT partition. I really see this as only a minor problem; there are problems on the other side too: if you had a USB stick with an ext3 fs
on which a setuid application had all the correct bits set, a bad mount point (missing noexec) could really screw things up! So in the end, be careful not to launch anything from a medium you can’t trust, no matter what OS you are using.
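The "missing noexec" scenario is exactly what the noexec/nosuid mount options are for. A hedged sketch of an /etc/fstab entry for removable media; the device name and mount point here are hypothetical:

```
# /etc/fstab - hypothetical removable-media entry:
# noexec blocks direct execution, nosuid ignores setuid/setgid bits,
# nodev ignores device nodes on the stick
/dev/sdb1  /media/usb  auto  noauto,user,noexec,nosuid,nodev  0  0
```

With a line like this, even a setuid root binary carried in on an ext3-formatted stick is inert until someone deliberately remounts the volume without those options.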
{I really see this only as a minor problem, … So in the end be carefull not to launch anything from a medium you can’t trust, no matter what OS you are using.}
This is not just a “minor problem” – it is a fundamental flaw. The Windows OS (even NT and above) will happily execute a file without requiring it to be identified as belonging to the system or to any particular owner, and without any concern for whether any administrator or user on the local system has granted that file permission to execute. Windows will happily trust a file from who knows where and go right ahead and execute it, no questions asked.
That is fundamentally borked. The file could easily be a malicious trojan or virus planted on the system by an external hacker. Windows doesn’t care.
The Windows OS is not a true multi-user OS design. It has fundamental shortcomings in this area.
{there are problems on the other side too: if you had a usb stick with ext3 fs on which a setuid application had all correct bits set a bad mount point (missing noexec) could really screw things up! }
Say what? ext3 fs fully supports ownership and permissions.
Well, owner and execute permission attributes *could* be managed on a FAT32 volume, but Linux would have to use a technique similar to the one OS/2 used to store EAs on FAT16 volumes – create one or more dedicated hidden files on each FAT32 partition in which to store the additional info on a file-by-file basis.
You can set ownership and execution rights on files in FAT32 volumes if you mount the partition with the appropriate mount options (ownership and permission masks are supplied at mount time).
Now, the only reason I can think of for anyone to do this would be to share a partition with Windows in a dual-boot setup and run applications in a sandbox. Other than that, why use FAT32?
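Since FAT itself stores no owner or permission bits on disk, Linux synthesizes them at mount time and applies them uniformly to every file on the volume. A hedged sketch of an /etc/fstab entry for such a mount (device name, mount point and uid/gid values are all hypothetical):

```
# /etc/fstab - hypothetical vfat entry: every file on the volume
# appears owned by uid/gid 1000, with permissions derived from the
# fmask/dmask option bits rather than from anything stored on disk
/dev/sdb1  /mnt/usb  vfat  uid=1000,gid=1000,fmask=0133,dmask=0022  0  0
```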
I meant the above ^^^^^ under linux !
Since a release candidate is the “completed” OS in which bugs are hunted down, it does not make sense for Microsoft to drop a release candidate. If anything, they should have more, not fewer, so the public gets a solid OS when it comes out.