“640K ought to be enough for anybody.” Bill Gates, 1981. “64 bit is coming to desktops, there is no doubt about that. But apart from Photoshop, I can’t think of desktop applications where you would need more than 4 gigabytes of physical memory, which is what you have to have in order to benefit from this technology.” It seems to me that by the time it ships, Longhorn will need 4 gigs of RAM. Editorial Notice: All opinions are those of the author and not necessarily those of osnews.com
As mentioned in an earlier post on OSNews today, the Register has an article about remarks Bill Gates gave yesterday in Holland. He said some interesting things worth examining. First, that 2006 could be optimistic for Longhorn. Second, that Gates has not yet understood the significance of the Athlon 64. And third, that Gates himself may be out of the loop on some important MS issues.
LoooooongHorn
Concerning the ship date for Longhorn, Gates said, “Longhorn could be 2005 or 2006… This release is going to be driven by technology, not by a release date. Which probably means it is going to be late.” Mary Jo Foley quoted Sanjay Parthasarathy (corporate VP, platform strategy and partner group) last Friday as saying, “Three years out is the Longhorn wave.” So the target date is clear as mud. Is it 2005 (which everyone expects to slip into 2006)? Or is it 2006 (which everyone will expect to slip out into 2007)? If it’s really going to ship at the end of 2007, there are a few problems. As discussed in an earlier article, the sales people will go crazy without an OS upgrade of some sort next year. They sold a lot of Software Assurance licenses on the expectation that there would be a new OS before the 3-year term was up. OK, MS never promised such a thing, but a lot of customers rationalized that that’s what they were buying. Bad things happen. The sales people won’t like going in to sell the next 3-year round of Software Assurance if there’s been no upgrade. But they are all big boys and girls. They don’t have to like it as long as they meet their numbers. Still, selling the next 3-year Software Assurance license with no new OS in the cards is going to be a very tough job. Very tough. And it’s going to be tougher unless MS details new upcoming versions of Office or something equally valuable to the clients. The mantra has been “It’s in Longhorn” for quite a while now. That seems to militate against major upgrades beyond the current Office 11.
All of this assumes MS sticks to its guns and does not release an interim Windows to fill the gap. There is talk floating around that the WinXP Service Pack 2 due in the spring could be tarted up to fulfill such a role. I don’t think so. Nobody who bought Software Assurance with the expectation of a Windows iteration will be satisfied with that. But it’s hard to see what else could be done. The timeline of previous Windows releases looks like this:
Windows 1.0 | 1985 |
Windows 2.0 | 1987 |
Windows 3.0 | 1990 |
Windows 3.1 | 1992 |
Windows 3.11 | 1993 |
Windows NT 3.1 | 1993 |
Windows NT 3.5 | 1994 |
Windows NT 3.51 | 1995 |
Windows 95 | 1995 |
Windows NT 4.0 | 1996 |
Windows NT Server | 1997 |
Windows 98 | 1998 |
Windows 2000 | 2000 |
Windows ME | 2000 |
Windows XP | 2001 |
Windows XP Tablet | 2002 |
Windows XP Media Center | 2002 |
Windows 2003 Server | 2003 |
The proposition that MS can go three-plus years without an iteration to Windows is inconsistent with their history. The only time it’s happened before was between 2.0 and 3.0. And I’m not going to take a 64-bit version of XP as a major upgrade.
The 64 bit Question
The second interesting thing from the remarks was the part about 64-bit desktop computing. It’s clear that Intel missed the boat, and that they are scrambling to put something, anything on the market to counter AMD. AMD seems to have won the lottery with the Athlon 64. It wasn’t at all clear until a few weeks ago that a 64-bit chip with 32-bit capabilities in the same price range as current chips would be the magic formula. But it is. And Intel knows it now. They are responding with the P4E. If they had decided to create a strategy to piss off the greatest number of people, they couldn’t have done better than the P4E. It’s too expensive for enthusiasts and OEMs. But being a cut-down Xeon, it will be cheap enough to anger customers who bought the full-price version. And it’s not 64-bit. 64-bit “extensions” are about as compelling as 3dfx’s 16-bit colour when ATI and NVidia were shipping 32-bit chips.
Intel’s problems are obviously not Microsoft’s problems. But a swift uptake of 64-bit desktop systems will be problematic unless MS moves quickly. There is no shipping 64-bit XP for the Athlon. But there are multiple shipping Linux distros with full support. Normally, we could expect MS to move fast enough to cut off the competitive opportunity for the Linux distros. However, the chief software architect has announced, “It (64 bit) appears more magical than it really is,” and seemed to emphasize the importance of security and patching over a 64-bit OS. Could this be a mistake as significant as the failure to note the importance of the Internet in 1993 and 1994? We’ll see, but betas of a 64-bit XP are all they’ve promised for the short term. It’s going to be hard to match the pace of OS X and Linux development if MS is tied to an ever-receding Longhorn and has no major plans for a 64-bit desktop.
Out and About
Finally, there are some tantalizing hints that Gates is not entirely in the loop. He said that he was unaware of plans to ally with Phoenix for BIOS-level security measures, saying, “The BIOS will always be separated from the operating system. Actually, it’s gotten out of date. If you run Windows XP, it calls very little of the BIOS.” Discussing recent security issues, he said, “None of the security problems recently affected people who had their software up to date.” And he added that MS would solve the patch problem by applying patches automatically. AFAIK, Gates seems to be out of touch with actual events in each case. “Trusted Computing” of the kind promised with Palladium will require some serious hardware interaction. The BIOS is the easiest place to do this. Phoenix is a crucial partner in enabling user compliance. Gates did not seem to understand this. Also, I believe that some of the recent security “issues” involved known vulnerabilities for which MS had not issued patches. And the idea that Ballmer is going to OK automatically patching user machines seems far-fetched to me. The liability issues could be staggering. If MS is going to assume responsibility for securing my machine, then they’ll have to pay if it gets hacked. Let’s not read too much into this. Gates is a very smart man and he’s worked with Ballmer for a long time. But these episodes are worth noting.
In all, a surprisingly revealing session. I’ve frequently predicted Microsoft’s doom, and usually been right about the cause but wrong about the effect. They are a smart bunch of people. But we are on the edge of big shakeups in the existing order. Rather than articulating Microsoft’s role in the new order, they are reacting to events. Gates sounded almost plaintive when he said, “We invented personal computing. It is the best tool of empowerment there has ever been. If there is anything that clouds that picture, we need to fix it.” If you can, Mr. Gates. If you can.
http://www.urbanlegends.com/celebrities/bill.gates/gates_memory.htm…
Hm.
Let me help you!
Many content creation tools will benefit from 4 GB+ of physical memory, including software from Adobe, Macromedia, and the likes of Steinberg and Emagic.
Hope that helps. :)
I guess if you stop programming you lose touch with reality. Bill Gates hasn’t done any programming publicly in years. He has been the marketing man.
10 years and Linus is still in daily charge of source code. 10 years and Bill Gates just doesn’t even remember how computers interact with hardware.
He also told reporters that windows patches faster than linux.
Linux usually has patches in hours; the best from Microsoft is days. One should feel sorry for him. His greed and need for control are showing just how frail his thinking truly is. Just remember: the truth is a three-edged sword; there is your side, their side, and the truth. Apparently marketing has clouded Microsoft’s side with a bigger reality distortion field than Jobs’s. (Maybe.)
Although I’m not entirely sure, I get the feeling that the next year will be quite interesting. 64-bit computing at the desktop isn’t the cause by itself. However, I do feel that it was the needed catalyst to get the innovation boulder rolling. Maybe now is the time that MS WILL change, at their own speed or at the speed of others’ developments. And it’s about time, IMHO.
Yes it is… And looking at your link
“QUESTION: I read in a newspaper that in 1981 you said, “640K of memory should be enough for anybody.” What did you mean when you said this?
ANSWER: I’ve said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amount of memory is enough for all time.”
So even if he didn’t say it, he’s still contradicting himself.
It was a while ago, and I can’t seem to find the link. But I remember reading on a microsoft site the minimum requirements for longhorn… wasn’t it something like a minimum of 1GB?
No, it wasn’t.
“He also told reporters that windows patches faster than linux.
Linux usually has patches in hours”
With no regression or integration testing, and often being updated several times within the time of the initial update.
Also, the patches take more time to be integrated into the major distributions which often don’t include all released updates in new versions.
———————
From the article:
“The proposition that MS can go three-plus years without an iteration to Windows is inconsistent with their history. The only time it’s happened before was between 2.0 and 3.0.”
This also occurred from NT 4 to Windows 2000 as shown in the table in the article.
Oh god, I feel an anti-MS wave coming up… grab your chances, people, let it all out!
Okay, on to the sane people:
I don’t think any major changes will happen soon — at least, not any permanent changes. We might see some fluctuations in marketshare, especially when 64-bit reaches the desktop for real (whenever that may be). Some operating systems will be ready for this, and some won’t. And, obviously, the ones that are will have a head start. But if Windows is not one of them, they surely will recapture their lost marketshare soon after; they have enough financial resources to do so, whether with real innovations or with something else.
I find the alternative sector of the OS market way more interesting, with projects such as Zeta, OpenBeOS, SkyOS etc. making (huge amounts of) progress.
LOL.
I’m not a programmer.
But 1GB just to get the OS booted is too much. 64MB ought to be enough for every OS to boot in. If you want speed, then 128MB. After that point the resources should go to the apps.
Just my £0.02
I have been following and analyzing Microsoft’s corporate moves and product announcements for a long time. If it is currently 2003 and MS says Longhorn will ship in 2006, that means 2007 at the earliest, and by that time I can see 4GB (or more) of RAM being common in the desktop systems of power users. Bill Gates is a brilliant (and ruthless) businessman, but he is often short-sighted in his visions of the future. In 1994 he wasn’t talking about the power of the Internet at all; he was all about CD-ROM in his speeches. In 1997 he was talking up DVD-ROM as being the future of home computing, and that never took off at all.
That said, however, I think his comment “I can’t think of desktop applications where you would need more than 4 gigabytes of physical memory” is in reference to TODAY’S desktop applications. No, you don’t need 4GB of RAM to run Office, download music, watch movies, burn CDs, or send/receive emails and IMs. He’s right. But that will obviously change in time; anybody who can’t spot the trend toward more features, more bloat in every single application (especially MS’s own stuff) isn’t paying much attention.
Start over, is my advice. They are behind OS X and its bundled apps. BEHIND. They don’t offer much more than Linux (except the apps, but customers understandably don’t think they should just pay MS rent because Windows has the apps).
Start over, give people a real product. Christ, MS, you have so much money. Why can’t you start over and produce something good?
By the time ‘Longhorn’ is released, Free Software will have such a significant reputation and presence in the corporate world that ‘Longhorn’ may not be enough for the Behemoth to retain its monopoly.
There’s a point at which Free Software will simply explode onto the scene, and it’s close to that point: when there’ll be a bigger and more skilled workforce developing the software than even Microsoft can match.
Microsoft may well be heading (in 10 years time or so) for more niche markets like tablet PCs, gaming consoles, and other more embedded markets. It’s almost impossible to justify buying Microsoft Office over OpenOffice.org 1.1 – imagine how it will be several years in the future!
Yeah, well, Windows should rethink its memory management.
Rather than trying to minimize the amount of RAM a program takes up, it should allow a program to take as much as it might need when the memory is available.
If the machine is just sitting there with no programs running, I want the OS to use as much memory as it needs so when I open Explorer or some other OS component, it just opens, ready for use.
When I am running applications, I want them to be able to take as much RAM as they need to make the program run great as well. Of course, you need to take memory away when the RAM fills up and you want to run more programs, but so what? Up until that point you have minimized swapping very effectively.
If you try to make programs use as little memory as possible, you end up playing games where the program swaps out a lot and might even choke.
Why have a gig of memory if all you are going to do is manage it like I have no memory?
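The policy described above — use idle RAM greedily, reclaim it only under actual pressure — is essentially how a cache works. Here is a minimal sketch in Python of that idea; this is a hypothetical toy (the entry-count “budget” stands in for free RAM), not how any real Windows memory manager is implemented:

```python
from collections import OrderedDict

class GreedyCache:
    """Keep everything in memory while there is room; evict the least
    recently used entries only once the budget is actually exceeded."""

    def __init__(self, budget):
        self.budget = budget
        self.entries = OrderedDict()

    def get(self, key):
        if key in self.entries:
            self.entries.move_to_end(key)  # mark as recently used
            return self.entries[key]
        return None

    def put(self, key, value):
        self.entries[key] = value
        self.entries.move_to_end(key)
        # Evict only under pressure, never preemptively.
        while len(self.entries) > self.budget:
            self.entries.popitem(last=False)  # drop least recently used

cache = GreedyCache(budget=3)
for k in "abcd":
    cache.put(k, k.upper())
print(sorted(cache.entries))  # → ['b', 'c', 'd']  ('a' evicted under pressure)
```

Until the budget fills, nothing is ever thrown away — which is the commenter’s point: unused RAM is wasted RAM.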
>if you try to make programs use as little memory as possible, you end up playing games where the program swaps out a lot and might even choke.
The minimum req of 1GB to run the OS (not apps) is over the top. So a PC with 512MB should not even bother trying to boot Longhorn?
…but of Windows memory management in general. It is antiquated at best.
Oh, stupid me, and I thought IBM did that. Perhaps their PS series (Personal System) weren’t really computers. Perhaps it really was Microsoft that delivered hardware for decades. Yes, yes, it all makes sense now. Pardon my obvious sarcasm.
“640 petabytes of memory should be enough for anybody.”
There’s a point at which Free Software will simply explode onto the scene and it’s close to that point; when there’ll be bigger and more skilled workforce developing the software than even Microsoft can match.
Well, Windows may be in a stalemate for the next several years (Service Packs notwithstanding), but the apps definitely won’t be. And as everyone knows, it’s the apps themselves that are the big draw to Windows anyway.
So, even though Free Software (and I’m speaking of the desktop variety here) will continue to evolve, so too will Win32 commercial software continue to evolve right along with it.
Free software seems to have the ‘bread and butter’ apps covered pretty well, but there are some gaping holes when you get into the ‘industrial strength’ (*cough* Dreamweaver *cough*) and ‘specialty’ categories. So until these holes are filled, Windows ain’t going anywhere anytime soon. As for things like OpenOffice and Firebird, those same apps will run in Windows anyway.
“10 years and Linus is still in daily charge of source code. ”
So I believe is Dan Dodge at QNX.
“Start Over, is my advice. They are behind OS X/bundled OS X apps. BEHIND. They don’t offer much more than linux (except the apps – but customers understandbly don’t think they should just pay MS rent because windows has the apps. ”
I really don’t see where OS X is far better than Windows. Frankly, the GUI is shiny, but I was quite disappointed by the whole OS X thing. It is a good OS, sure, but not really a revolution. What’s more, a lot of Mac users hate OS X. I was surprised by that fact, but most music users I know who like the Mac hate OS X. The apps are not here, or buggy (Digital Performer 4). Almost nobody uses Mac OS X for pro audio! Apple’s path is not possible for Microsoft, because of their marketshare. They have basically the same problem as Intel with their x86 line: outdated, very difficult to enhance, for compatibility reasons.
About the start over: NT has a GOOD basis. But there are some horrible flaws, like a horrible registry, poor security (in implementation, not in design), etc. A late release date for Longhorn can have two reasons, for me: the DRM thing (yuck), or a totally new super-set of APIs/systems on the relatively good kernel.
My dream? An NT kernel, a well-documented and non-binary registry, no native Win32 anymore, just software emulation when you need it, all system calls available to anyone, dual licensing of the SDK for open source development like Qt and so on! Possible? Hardly, but it is just a dream.
If I recall, the differences between Win 3.1 and 3.11 were minimal, almost in the nature of a service pack; the next upgrade to that Windows line was Windows 95.
If you don’t treat 3.11 as a separate Windows release, then a three-year release cycle isn’t unprecedented.
If you think BillG is technically out of the loop, talk to someone who has been through one of the (regular) “Bill Reviews” people at MS suffer/benefit from. His role is more-or-less entirely ‘technical’ since day-to-day business became steveb’s pigeon.
As to the wider point the article is making – I disagree with his premise. Most of those ‘OS releases’ are really minor iterations, and once you take those out and look at the revenue generators, the picture becomes much more uniform.
The first Windows OS that contributed significantly to MS was 3.1 in 1992. Then the Win95 / NT 3.5 wave in 1994/1995. 98/ME were very minor revs of 95 (the OEM95 versions don’t seem to have made this list, and were no more or less significant to customers than 98/ME), so the next wave to look at is the 2000/XP wave in 2000/2001, along with the short delay before we got 2k3 server (2003).
My product timeline, then goes:
1992 – [3] – 1995 – [5] – 2000 – [3] – 2003 – [4] – 2006
So we wait until 2006 / 7, for the next major release.
Seems pretty average, if we pull out all the ‘point releases elevated to product’ stuff.
And, lest you argue that you can’t just pull that out: well, we’ll be getting such a product later this year / early next year (2004, give or take a few weeks): XP SP2. A host of bug fixes, but new features too. At least as significant as the Win95-to-Win98 step, and way more so than WinME. And I wouldn’t be surprised to see another before Longhorn.
Plus – the OS is more modular these days. You can get new DirectX, Media Player, hardware support etc. etc. without an OS rev. At the developer end, .NET framework, GDI+, P2P etc. all arrived without an OS release.
An ancillary question might be: why no big marketing splash and an “XP Second Edition” moniker?
I think the answer is that we’re living in a more mature market place now, and many customers value stability over ‘the latest thing’. New products take even longer to adopt than a service pack!
Those users who love Classic are mostly the old codger group who HATE change. Forget that OS X has better multitasking, a better memory management system, is 10,000 times more stable, and runs just as well as OS 9 on G3 systems (talking of Panther here).
OS X is as big an improvement over classic Mac OS as XP/2000 is over Win 3.1/9x/NT 4.
The apps are not there yet… well, perhaps they should put more pressure on the companies to make them! I mean, these companies have had over 3 years to port their applications to OS X; Apple even gave them a nice easy route with Carbon until they could make a Cocoa version.
If you think that 98 was a minor revision, then you are definitely not talking to MS. Microsoft talked about Windows 98 and ME as if they were new OSs.
You cannot say, just because the features and look are similar, that MS did not sell it like a new OS.
MS sold them as major revisions; therefore they belong in there. As it stands, the only thing people have to fill the 6 TO 7 YEAR GAP between XP and Longhorn is free service packs.
Exactly, which is why QNX software is a good buy.
There is no consumer version in 2003; Win2k3 is a business server version only. So it will be 6 years for the consumer.
Why no XP SE? Simply because of Win98 SE. It happened to be a joke.
“He also told reporters that windows patches faster than linux. Linux usually has patches in hours”
With no regression or integration testing, and often being updated several times within the time of the initial update.
Yes! Finally someone else with a decent understanding of software development willing to stand up to the open source fanaticism that permeates these comments.
The best, most recent and pertinent example of this is the most recent vulnerabilities in OpenSSH. The initial release of OpenSSH 3.7 did not fully address the security vulnerabilities that had been discovered, and consequently OpenSSH 3.7.1 was released. OpenSSH 3.7.1p1 was released using a largely untested reimplementation of the PAM code, as they patched the portable development branch with the security updates rather than the stable branch. Consequently a number of vulnerabilities were found in the new PAM implementation, and OpenSSH 3.7.1p2 was released to address those. So, an administrator attempting to protect his systems from one newly discovered security vulnerability could have ended up having to reinstall OpenSSH three times.
That isn’t to say that Microsoft hasn’t released patches that don’t fully address an issue, or introduce new vulnerabilities, it’s simply to say that the same problems occur in both open source and commercial software, and that applying a patch that addresses a security vulnerability immediately after it is discovered is probably not a good idea, as that patch most likely hasn’t been tested and may break something else or introduce new vulnerabilities.
“those users that HATE change”. That’s most users. Not most users who post here, or have blogs or are our techie friends. Those are ‘some users’. Most users are either corporate desktops (which change only when there is a compelling reason so to do), or individual home users, who tend to be very technologically conservative, and have limited budgets for computers.
And in reply to debman: my point about the 98 and ME transitions was to illustrate how MS perceive that the market has changed since they had a need to sell those minor technology updates (mainly aggregating previously released components into the OS) as ‘major OS releases’. We’re living in different times: new PC sales slowed and are still slow, the market is heavily saturated, there is less emphasis on technology for technology’s sake; the average person and the average business are the end users, *not* the technology early adopters. Another 3-year span before Longhorn will pass in the blink of an eye.
And there’s another point to make here, too. Corporate IT infrastructure runs in 7- or 8-year cycles. The last major spend was in 1999-2001 (with Y2K as the driver), so 2006-2008 is going to be a crucial time for the industry. A new OS appearing now would have a tough time.
When MS releases a Windows ‘iteration’ is irrelevant. When something gets released is determined by marketing needs, and software specifications adapt.
Remember Cairo? What was that? When was that?
Who’s to say that Longhorn isn’t another Cairo, and that MS won’t release a non-Longhorn Windows earlier, when marketing deems necessary, and say that Longhorn hasn’t arrived yet, but you won’t believe how good it’ll be when it does.
Remember Cairo? What was that? When was that?
Cairo was NT 5.0, which was released as Windows 2000 in 1999.
Re: “640k ought to be…”
I actually researched this one before posting. I found some very old references in industry news. I also looked it up on Snopes.com, the bible of urban legends: nada. If it is a legend, it came into being very early.
Re: timeline
OK, the timeline wasn’t the best idea. But the main point is still good. A gap of 5 years (XP 2001, LH 2006) is a long, long time in the computer business.
Re: “don’t see where OS X is far better..”
I don’t want to start or participate in flame wars. But I have to say that OS X is unique. Apple has put Unix on the Joe Sixpack end-user desktop. That it doesn’t look like a big leap is kind of the point.
Actually, it was MS’s original idea for a database filesystem. It has currently turned into WinFS, which is still in very early development.
Good points about the tech cycles.
But I still think there is little to hate in OS X if you are moving from OS 9.
So does it mean that if I deny that I said something it automatically becomes a myth – regardless of whether I actually said it or not?
LOL
OK, the timeline wasn’t the best idea. But the main point is still good. A gap of 5 years (XP 2001, LH 2006) is a long, long time in the computer business.
Development of Windows 2000 began in 1995. It was released in December 1999… approximately 5 years as well. This is nothing new for Microsoft.
Considering that AMD’s entire processor line will soon be 64-bit following the release of the Athlon 64, there’s no real reason not to support 64-bits on the desktop as users will be getting 64-bit support “for free”. A 64-bit kernel will have significantly better performance in many areas, most notably NTFS performance as 64-bit filesystem operations won’t have to be “emulated” using 32-bit integers. Having 64-bits of address space to work with eases load upon the VMM, and allows large files to be memory mapped without worry.
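The point about 64-bit filesystem arithmetic is easy to demonstrate. Here is a minimal sketch (plain Python, purely illustrative and not tied to NTFS internals): a byte offset past the 4 GiB mark simply does not fit in a 32-bit integer, which is why a 32-bit kernel has to emulate such offsets with pairs of 32-bit words:

```python
# Toy illustration: what happens to a file offset past 4 GiB
# when it is forced through a 32-bit integer.
GIB = 1024 ** 3

offset = 4 * GIB + 512           # a position just past the 4 GiB boundary
truncated = offset & 0xFFFFFFFF  # the low 32 bits are all that survive

print(offset, truncated)  # → 4294967808 512
```

The truncated value points at entirely the wrong part of the file; a native 64-bit integer carries the full offset in a single register, with no high/low-word juggling.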
<< I really don’t see where OS X is far better than Windows. Frankly, the GUI is shiny, but I was quite dispointed by all that OS X thing. It is a good OS, sure, but not really a revolution. More, a lot of mac users hate OS X. I was surprised by that fact, but most music users I know who like mac hate OS X. The apps are not here, or buggy ( Digital perfourmer 4). Almost nobody use Mac OS X for pro audio ! Apple path is not possible for microsoft, beacause of their marketshare. They have basically the same problem than Intel with their x386 line : outdated, very difficult to enhance, for compatibility reasons. >>
You are right! I guess BT, Seal, U2, The Stones and all those other producers are lying about loving ProTools and Logic on OSX.
There is always going to be an aversion to change. I also know OS 9 users who hated OS X… at first. Given a little time to adjust, they can no longer conceive of going back to 9.
MS’s marketshare is exactly why they CAN make massive sweeping changes. They own the industry. If anyone can force change, it should be them.
>”He also told reporters that windows patches faster than linux. Linux usually has patches in hours”
>
>With no regression or integration testing, and often being updated several times within the time of the initial update.
>Also, the patches take more time to be integrated into the major distributions which often don’t include all released updates in new versions.
True, nacer, but it has also happened that (a) Microsoft patches make some other product crash, and (b) Microsoft’s fix doesn’t really close the vulnerability, which leaves you in a worse situation than before: you think you are patched when in fact you aren’t.
So apparently the longer time Microsoft takes to release its patches is no guarantee of quality either.
I really don’t see where OS X is far better than Windows.
Well, as you go on to talk about audio, let’s talk about audio. CoreAudio is the lowest latency sound API available (surpassing even ALSA), and it is the standard system sound API for OS X, meaning every OS X application can take advantage of it “for free”.
Contrast this with the Windows side, where the standard system sound interfaces are all very much high latency. A low-latency API exists, ASIO, but it is supported by only a small fraction of the drivers for Windows audio hardware.
The apps are not here, or buggy ( Digital perfourmer 4).
What the hell, Digital Performer 4? How about… Cubase, Logic, or ProTools? Claiming that the “apps are not here” for pro audio applications is completely ludicrous.
Almost nobody use Mac OS X for pro audio !
I believe this was addressed quite well in Ian Eisenberg’s post.
“Those users that love classic are mostly the old codger group who HATE change. forget that OS X has a better multitasking ability…a better memory management system, is 10,000 times more stable, and runs just as well as OS 9 on g3 systems (talking of panther here).”
No. The guys who love OS X are the tech guys. There is no question for me between Mac OS 9 and Mac OS X. The first doesn’t even deserve the name of OS; I was amused by applications crashing because of an interrupt during a dynamic memory allocation.
“You are right! I guess BT, Seal, U2, The Stones and all those other producers are lying about loving ProTools and Logic on OSX. ”
I happen to speak with professional guys in France. You know what? They all hate OS X. I spoke with Mac zealots at the Apple Expo in Paris: they hate Mac OS X.
So the marketing about the big guys, I really don’t care about at all. Every pro sound engineer I speak with says that Mac OS X is buggy (for them, a buggy OS means app crashes, driver crashes, etc.; they don’t care whether it is Mac OS X’s fault or not, they just see that it works on Mac OS 9) and unresponsive. Almost no one uses Mac OS X for professional music. It was surprising to me: a BSD-like system with true multitasking and a good graphics system should be far better. But from a consumer point of view, it isn’t at all.
Speaking about Logic and ProTools is a joke: Logic is Apple, and ProTools 6 (the only Mac OS X version) is not really common in studios.
Nothing to see here, folks.
Intel’s problems are obviously not Microsoft’s problems. But a swift uptake of 64 bit desktop systems will be problematic unless MS moves quickly. There is no shipping 64 bit XP for the Athlon. But there are multiple shipping Linux distros with full support.
….
Could this be a mistake as significant as the failure to note the importance of the Internet in 1993 and 1994? We’ll see, but betas of a 64 bit XP are all they’ve promised for the short-term.
(emphasis mine)
Anyone seriously think Microsoft would be blindsided by something like that? They have their fingers in so many pies precisely for that reason – so they can’t be caught flatfooted by some Next Big Thing(tm).
Remember: in a poorly regulated corporatocracy, brute strength and ruthlessness always come out on top.
“With no regression or integration testing, and often being updated several times within the time of the initial update.”
Hehe… Microsoft has many of the same issues. I have many a time experienced decreased performance or additional problems after using Windows Update to get an update that Microsoft took days and days to post. This is a documented fact (although I couldn’t find an article on it right now).
You later say that major distros often don’t include the patch. Well… it’s a free country, I guess… Linux is always available and updateable.
“What the hell, Digital Performer 4”
The best sequencing tool in the world? Really, if there were only one reason for me to switch, it would be for DP. And DP4 sucks, for now. It is the only OS X version.
“Almost nobody use Mac OS X for pro audio !
I believe this was addressed quite well in Ian Eisenberg’s post.”
You mean repeating what the marketing guys are saying? Come on, go to a professional studio, a real one, and see: Mac OS 9 or Mac OS X? You will be surprised (as I was myself).
“Well, as you go on to talk about audio, let’s talk about audio. CoreAudio is the lowest-latency sound API available (surpassing even ALSA), and it is the standard system sound API for OS X, meaning every OS X application can take advantage of it ‘for free’.”
Yes, but:
1: ASIO is available for all “real” sound cards: RME, MOTU, M-Audio, Echo, etc…
2: ALSA on a patched Linux kernel can get down to around 1 ms of latency.
3: The latency of W2K and Mac OS X is nearly the same.
http://gigue.peabody.jhu.edu/~mdboom/latency-icmc2001.pdf
The conclusion is that under heavy load, OS X, Linux and Windows are comparable, at 1 or 2 ms. Driver quality matters a lot. The report is 2 years old, so Linux should be much better now. I don’t know about OS X and W2K. But frankly, it is not a problem once you buy a decent soundcard.
I happen to speak with professional guys in France. You know what? They all hate OS X. I spoke with Mac zealots at the Apple Expo in Paris: they hate Mac OS X.
Well, I don’t know about France, but here in the states OS X has been a wild success.
My uncle, a professional audio engineer and recording studio owner, has had an OS X rig in his studio since 10.0, and following the releases of Cubase, ProTools, and Reason for OS X converted his entire shop.
I have very few demands of an OS, regardless of which OS, but they are all about ease of use & reliability:
1. Security is a must on many levels.
2. Fast work, and desktop organization of it, is paramount with a dozen apps open, and OS crashes are NOT acceptable.
3. Consistency of a clear interface is needed.
4. Explanations in Help & Dialog boxes must be written clearly and unambiguously with enough information to be specific, by a person who writes in his native language (& very well).
5. Crash recovery, if a crash does occur, must be quick and as painless as possible (from CD/DVD and/or backup HDs) with adequate explanations.
I use Mac and Windows, and I want the same things from both OSs.
Mac wins it hands down right now for me! Internet problems almost do not exist. No registry, no confused ‘Remove Application: can’t find….’, no config.sys, et al., no BIOS updates.
It is all about how easy it is to keep your mind concentrated solely on the work at hand, so the OS doesn’t get in the way of productivity.
Bo
The best sequencing tool in the world? Really, if there were only one reason for me to switch, it would be for DP. And DP4 sucks, for now. It is the only OS X version.
Have you ever heard of Cubase?
An often-cited benefit of the 64-bit architecture is support for huge amounts of RAM. For desktop computers, however, the more immediate benefit is that the x86-64 architecture is more efficient than x86: x86-64 doubles the number of registers, which allows the processor to operate more efficiently. As Athlon 64 and Opteron computers gather a large user base, software developers will be prompted to recompile their software for x86-64 to make it run faster on those computers. Once that happens, I believe Intel will be compelled to support x86-64 to stay competitive with AMD.
>Could this be a mistake as significant as the failure to note
>the importance of the Internet in 1993 and 1994? We’ll see,
>but betas of a 64 bit XP are all they’ve promised for the
>short-term.
(emphasis mine)
Does anyone seriously think Microsoft would be blindsided by something like that? They have their fingers in so many pies precisely for that reason – so they can’t be caught flatfooted by some Next Big Thing(tm).
I’m just wondering how much work it’s going to take them to port XP from Itanium (Intel 64-bit chips) to the 64-bit AMD chips, because that thing’s been fully functional and shipping for quite some time now (although only in English).
Now we’re in the time of stupid interpreted languages, and no one worries about optimizing their code a bit.
I’m not surprised that a user would need 4GB. That’s because of too much bloat in Windows.
I’ve got 512MB of RAM in my PC, and I didn’t even have to use swap in WinXP while running games, compiling apps, watching movies, etc.
I think there will come a time when putting a character on screen will require 1GB of RAM and a 10GHz CPU, and doing it the simple way (e.g. coding in asm) will be for lamers and crazy men.
XP 64-bit for AMD64 (Windows XP 64-Bit Edition for 64-Bit Extended Systems) is already available on MSDN. I suspect it is kind of a hybrid between the IA64 version and the x86 version. It uses a modified WOW64 layer for running 32-bit apps. The IA64 architecture uses EFI firmware, whereas AMD64 uses the traditional BIOS and plug-n-play.
The NT kernel is written in mostly portable C code, with the platform-specific bits kept in the hardware abstraction layer. The experience of creating the IA64 version probably made making the AMD64 version a lot more straightforward.
The release should be sometime in the first half of next year.
In some ways it’s a shame the OS couldn’t be ready at the same time the chip was released. I can’t imagine many OEMs want to ship 64-bit systems without OS support. Do they give you a voucher to upgrade to 64-bit XP when it is released?
Won’t it be nice to have the whole of Windows compiled for the newest processor rather than the lowest-common-denominator x86 version? (i586 at the moment, I think)
The biggest problem initially will probably be driver support. I’m sure nVidia and ATI will be up to the task, but how long will it take Creative and the other smaller hardware vendors to create 64-bit drivers? I don’t expect my digital TV card to keep on working.
Finally! Windows is seen for what it really is. Those of us from “alternative” operating systems that have been saying this for years about windows, are finally proven to be right! Do yourself a favor, grow beyond windows, you can do it. Get an operating system that is able to grow with you, not stunt your growth as a computer user.
XP 64-bit for AMD64 (Windows XP 64-Bit Edition for 64-Bit Extended Systems) is already available on MSDN. I suspect it is kind of a hybrid between the IA64 version and the x86 version. It uses a modified WOW64 layer for running 32-bit apps. The IA64 architecture uses EFI firmware, whereas AMD64 uses the traditional BIOS and plug-n-play.
That is probably the one thing that really let down the Opteron: the continual hauling around of a sad excuse for firmware (the BIOS) and configuration technology (PnP). IMHO, they would have been better off either adopting EFI or, if they wanted something open-standards, adopting OpenBoot, which would free them from the limitations that the BIOS imposes on operating system developers.
[i]The NT kernel is written in mostly portable C code, with the platform-specific bits kept in the hardware abstraction layer. The experience of creating the IA64 version probably made making the AMD64 version a lot more straightforward.[/i]
IIRC, there is also a small amount of assembly that can be traced right back to the first version. It was used as a way of speeding things up. BeOS did the same thing as well: there was a decent amount of assembly with C/C++ floating on top.
The release should be sometime in the first half of next year.
In some ways it’s a shame the OS couldn’t be ready at the same time the chip was released. I can’t imagine many OEMs want to ship 64-bit systems without OS support. Do they give you a voucher to upgrade to 64-bit XP when it is released?
Well, IMHO, it is better for Microsoft to take their time rather than rushing it and having AMD end up copping a lot of flak. Why would they get flak? Just look at the surveys done: as Windows improved, the satisfaction customers had with the big-brand computers increased, as if to say that somehow the vendors’ computers had increased in quality, which isn’t necessarily true.
Let us also remember that Microsoft has to optimise their compilers for x86-64. Yes, I know, the most logical thing to do would be to work with GCC and use that instead; however, as I said, that would be the logical thing to do.
Won’t it be nice to have the whole of Windows compiled for the newest processor rather than the lowest-common-denominator x86 version? (i586 at the moment, I think)
The biggest problem initially will probably be driver support. I’m sure nVidia and ATI will be up to the task, but how long will it take Creative and the other smaller hardware vendors to create 64-bit drivers? I don’t expect my digital TV card to keep on working.
Well, nVidia is on board, same with Matrox. I haven’t heard about ATI yet, but I assume all the major players are working with AMD to get their video cards on par with the AMD processor.
You gotta be kidding me!
Longhorn’s release is only 2 years away, and I am not making such drastic upgrades to my computer for it.
I think the very reason Microsoft can be caught off-guard is that they’re involved in so many ventures. The company is simply spread too thinly, without the ability to focus.
I think the very reason Microsoft can be caught off-guard is that they’re involved in so many ventures. The company is simply spread too thinly, without the ability to focus.
That would be true if Microsoft had a monolithic business model; however, Microsoft has a model similar to IBM’s, where each part is like a mini-business and those parts work with other parts in so-called “alliances”.
What the concern should more likely be is whether the other parts of the organisation are up to speed on what they will do once Longhorn is released.
I’ve heard little about the next version of Office. Is it eventually going to go fully .NET, so that we no longer have the security issues? Also, with such a long release cycle, Microsoft is really going to set the market up for a real letdown if they don’t release something majorly revolutionary.
Old Bill Gates has done the hype tour, but really, do people, deep down in their hearts, think that suddenly Microsoft is going to sacrifice backwards compatibility for security and stability? Do people really think they’re going to give Windows the major overhaul required? I mean, they’re going to have a release gap of 4 YEARS! Why not go to all the software vendors and pay for all software to be ported to .NET? They had 4 years to get software vendors on board. Had they done that, they could drop Win32/Win16/OS2 and every other half-baked backwards-compatibility API that plagues Windows.
Maybe I’m the only one, but I think we’ve actually lost out here.
Back in the days of yore, a great deal of innovation happened BECAUSE of the CPU and memory limitations of the machines the programmers were dealing with. As a result, we ended up with applications that actually stretched what it was possible to do with the machine, and in doing so extended programming knowledge.
Wind forward 20 years and those limitations are gone. Programmers can get away with writing sloppy, poorly designed, bloated code, secure in the knowledge that Moore’s Law and the cheap memory market will mask their inadequacies.
I realise we can’t go back to the days of the real programmers (Apologies to embedded folks, you still have ’em), but I’m not sure we should all be so eager to rush towards the day where the actual hardware a program will run on is viewed as an irrelevance when considering how to design a program (Java might be an example).
So maybe 640K isn’t enough for everyone, but am I the only one who thinks the skills of programmers would be significantly more advanced than they are if that limit had been imposed?
I realise we can’t go back to the days of the real programmers (Apologies to embedded folks, you still have ’em), but I’m not sure we should all be so eager to rush towards the day where the actual hardware a program will run on is viewed as an irrelevance when considering how to design a program (Java might be an example).
I think you have confused bloat with features. Bloat refers to a feature-to-size comparison, where the size of the code is unjustifiably large for what the application does. Sure, there is some bloat out there, but by and large, most code is fairly tight.
As for the comment regarding Java, the fact is that Java will enable larger and more complex applications to be developed without the downside of this complexity resulting in buggy code. And no, hiring more programmers doesn’t fix a problem; in some cases it makes it worse.
Java will also remove many of the security and memory-related issues that plague applications today. If you remove the need to manually manage memory, you reduce the chance of bugs entering the code. If you let something like a virtual machine sort that out, the virtual machine will also provide a much more robust protected environment for applications to run in, so that issues such as viruses, memory leaks, buffer overruns and so forth become a thing of the past.
One group has proven this via their implementation of the Doom/Quake engine in Managed C++, as shown by the article: http://www.microsoft.com/resources/casestudies/CaseStudy.asp?CaseSt…
Once running, the performance penalty is so small, one can place it in the pathetic region of performance loss:
“A large body of C code can be ported to run in the CLR without a lot of work,” says Arvesen. “The performance was sufficient: 85% of the speed of the unmanaged code, based on benchmarking. The testers didn’t report noticing the performance difference.
“We are still free to use natives like STL in managed code. Our C++ skill set carries forward when creating managed applications with Managed Extensions for C++.
“The experience overall was very good and we felt productive porting and extending the code. It’s nice to mix native and managed code, to have control over memory management, and to use existing libraries as well as .NET Framework classes, all in the same application.”
This is a game using a tonne of mathematical work for its graphical effects, something that managed code is not extremely strong at. Considering that something like Word, Excel, PowerPoint or Access would never use those sorts of functions, the likelihood of performance at 95% of unmanaged code is very high.
Here comes a bloated OS that requires 10 gigs of RAM and 100 gigs of HD to install, but has problems with threading, gaping holes in security, and bugs that are called features. Looks like .Not will take a whole lot of RAM and CPU power to run, with bugs flying all over.
Ssh, M$ as always is all marketing but sucks at every one of its products.
Have you magically got some “insider” information that no one has yet heard about?
Incomprehensible English, located in Texas, and comments about Windows in an uneducated fashion. Any relation to GWB?
I think you’ve overlooked something important in the key points for your article.
Windows 3.x/9x/ME was a completely separate code base from NT/2000/XP.
So when you look at the release dates, you see it is more like a 4-year gap between major releases. XP, however, can’t really be included, as it’s essentially 2000 with a prettier interface (hence version no. 5.1).
<< You mean repeating what the marketing guys are saying? Come on, go to a professional studio, a real one, and see: Mac OS 9 or Mac OS X? You will be surprised (as I was myself). >>
I have been to professional studios. There are actually quite a few here in Miami and Fort Lauderdale. In fact quite a number of the current playlists for top 40 radio were recorded either here or in LA.
Guess what… they were all running OS X. You would be amazed at the number of TiBooks with OS X, ProTools and Reason in the professional music business. Except for France, apparently.
By Darius (IP: —.dmotorworks.com) – Posted on 2003-10-17 19:16:53
“Free software seems to have the ‘bread and butter’ apps covered pretty well, but there are some gaping holes when you get into the ‘industrial strength’ (*cough* Dreamweaver *cough*) and ‘specialty’ categories.”
To call Dreamweaver an “industrial strength” application strikes me as overstating Dreamweaver’s “strengths”. Industrial strength in Dreamweaver’s problem domain is a powerful Unix-based server, Apache or AOLServer, a flexible development environment, and one or more good high-level programming languages.
Regards,
Mark Wilson
http://www.theregister.co.uk/content/4/33397.html
“What is going to be important, Gates told reporters yesterday, is security. Microsoft invested over $100 million to refocus on building products that strive to be secure by design, by default and by deployment. In the Windows Division development work was put on hold while Microsoft conducted security training, threat modeling, source-code review and penetration testing.”
Is there anyone else here who feels a little underwhelmed by the fact that, out of $50 billion, they’re scraping together a piddly $100 million for apparent “security enhancements”?
I am French, and what you are saying is wrong!!
A lot of French pro audio workers are moving to Mac OS X. That was not the case when Mac OS X first appeared on the market, that’s true, but since Apple completed CoreAudio and the major developers could work with a stable platform for their applications, the situation has completely changed.
A lot of studios moved to Mac OS X, because this system provides impressive stability, performance, and coherence for their work. CoreAudio is a breakthrough, the MIDI architecture is unique, and it really gives audio developers a coherent whole.
Mac OS X is today the best system for audio creation; I think you are not really aware of what you are talking about!!!!
“A lot of French pro audio workers are moving to Mac OS X. That was not the case when Mac OS X first appeared on the market, that’s true, but since Apple completed CoreAudio and the major developers could work with a stable platform for their applications, the situation has completely changed.”
I really hear/see the contrary here, sorry. I don’t want to get into a flamewar, that’s not useful, but I was really amazed by the number of people who hate Mac OS X vs. Mac OS 9. No need to convince me that Mac OS X is better than Mac OS 9; it’s like saying WinXP is better than DOS, nobody denies that fact.
But people don’t care about the OS. They care about the apps: a lot of Mac lovers are fond of DP, for example, and it is still not really here. Pro Tools, in pro environments, is used with expensive DSP cards. A few people told me they hate Mac OS X because it crashes! (I mean, Mac OS 9 was terrible, when you had to tweak the memory for each app yourself and reboot because RAM was held by the OS.) I’d suspect buggy drivers, etc… But again, these people don’t care about the reasons. They go back to Classic. Some people even sold their Macs for PCs because of X! Which I can’t understand myself…
At the last Apple Expo, really, I was amazed by all the bad things I heard from pro audio guys about OS X. As with the G5 thing, I hope Panther has real support, because the number of app crashes I saw on G4s running Jaguar was amazing (Cubase SX 1.05, Logic 6, etc…).
Working with a laptop is pretty rare, I’d say. The laptop thing is quite different. Apple laptops are very good, for a fair price.
But people don’t care about the OS. They care about the apps: a lot of Mac lovers are fond of DP, for example, and it is still not really here. Pro Tools, in pro environments, is used with expensive DSP cards. A few people told me they hate Mac OS X because it crashes! (I mean, Mac OS 9 was terrible, when you had to tweak the memory for each app yourself and reboot because RAM was held by the OS.) I’d suspect buggy drivers, etc… But again, these people don’t care about the reasons. They go back to Classic. Some people even sold their Macs for PCs because of X! Which I can’t understand myself…
At the last Apple Expo, really, I was amazed by all the bad things I heard from pro audio guys about OS X. As with the G5 thing, I hope Panther has real support, because the number of app crashes I saw on G4s running Jaguar was amazing (Cubase SX 1.05, Logic 6, etc…).
So actually, what you are moaning about is crap software quality. How is that any different from me complaining that Outlook is shoddy (which IMHO it isn’t)? What do I do, blame the operating system or the software vendor? And if I blame the software vendor, what is stopping me from moving to a solution that actually works?
Something I would like to see is Apple continuing to buy up audio and video vendors, especially those who fail to put out a decent Mac product and give Mac OS X a bad name.
“It was a while ago, and I can’t seem to find the link. But I remember reading on a Microsoft site the minimum requirements for Longhorn… wasn’t it something like a minimum of 1GB?”
“No, it wasn’t.”
Actually, you are wrong. The MS site does say there will be a 1GB minimum for Longhorn.
Sure, they sound like huge requirements now, but Longhorn is still 3 years away. I’m already running 768MB of RAM and a 256MB DirectX 9-capable graphics card. It really won’t be an issue when it is released. The way I upgrade, I’ll probably have a complete new machine every couple of years, and I don’t buy bleeding-edge equipment.
The Xbox specs looked impressive when it was announced; now it’s just a low-powered PC. I expect by the end of next year most PCs will ship with at least 1GB in them.
So in 3 years’ time, according to Moore’s Law, computers will have 1 or 2 GB of memory. So then all new PCs will fit the minimum requirements of Longhorn, and that OS will work fine on new computers.
But what I wonder is: what does that 1 GB of memory give me? By taking 16 times the memory (compared to Win 2000), what can I do with it? Nice animated 3D windows flying away when you close them? Each letter in another bitmapped font? Each button with another UI theme? Windows Explorer taking 128 MB to pre-generate bitmapped animations from SVG icons? Arranging all windows on the walls of a maze? Now that increases my productivity!
I mean, with my old 386 I could use Word 2.0 to write my letters. I didn’t need anything more. Then I got a 486DX4-100 with 512 MB of disk space; I could use the entire Office suite, browse the internet, use Delphi 5…
So what more can I do with Longhorn?
That’s always been the question: what would we be able to do with a 1-gigahertz Pentium? What could we do with 128 MB of video RAM?
Just remember that it hasn’t been the OSes that pushed computer power to the fullest: it has been, it is, and it will be games that propel computers’ performance gains.
The operating system’s GUI improvements are derived from that.
The proposition that MS can go three-plus years without an iteration of Windows is inconsistent with their past history. The only time it’s happened before was between 2.0 and 3.0. And I’m not going to take a 64-bit version of XP as a major upgrade.
If the 64-Bit Edition isn’t relevant at all, why are things like Media Center Edition and Tablet PC Edition mentioned? They have roughly the same scale of changes.
Longhorn will need 4 gigs of RAM.
How did the zn & .com get in it? Take that out.
The memory also needs to be released back to the other apps when a program closes. MS still has some improvement to do there too, although XP is a lot better than ME in that regard. Of course anything is better than ME.
So Bill G obviously has an inside track at 3drealms. Someone must have told him the release date and hardware requirements for Duke Nukem Forever…
Lmao, yeah you could be on to something with that.
Re the 1-gig limit: I cannot see how you can justify needing that much memory just to run an OS, unless of course it ships with just about every service available to man running at the same time out of the box. But no one would be that stupid, would they?
You’ll always need a fair amount of RAM to do anything useful, but really, 64MB of RAM with a fairly basic UI and no services running should be enough; even Windows 2000 can get along quite nicely with just that amount (though if you want to do anything useful at speed with it, 256MB really is needed).
You can build micro-OSes so small that they literally fit in a modern CPU’s L1 cache. I just can’t see the necessity of it all…
Servers, on the other hand, are a different kettle of fish: they need as much as they can get, and the more mission-critical your server is, the more resources you should pour into it. But desktops are not so demanding.
People downplay bloat in software; that’s wrong. Nearly every week on this website I see links to OSes and software packages that prove this wrong again and again. If you can code a program to need only 5MB of RAM, then do so. Just because the resources are there doesn’t mean you have to swallow them up at the first given opportunity!!!
I personally feel that Windows ME was a fairly significant upgrade.
WDM was vastly improved, the VM works more stably (early copies had issues, I hear), and kernel processes were slimmed down. Not to mention that network integration was improved quite noticeably.
The really awesome thing about WinME is that it is so similar to its predecessors. I don’t feel that’s a bad thing; customers want consistency and predictability.
The one major item of dislike with Windows ME was the castrating of certain command-line tools. Doing sys a: or sys c: never performed as desired (or at all, usually), and getting to use them was sometimes impossible.
Of course, if you have a major registry issue and the machine won’t even boot into Safe Mode, you could have some major trouble getting a backed-up registry in place.
But I have managed to get WinME to boot in 10 seconds… an impossibility with prior Windows versions. I’m glad to see that Windows XP also boots in a reasonable time frame :)
Of course, as the Zeta zealot that I am, I don’t care for Windows much AT ALL, so I am a bit biased. But I do strongly recommend to my customers Windows ME over Windows 98, and for the time being Windows ME over XP for existing systems (upgrades or whatnot), especially when scanners are involved. And yes, I have managed to get customers running BeOS… FULL TIME… and they love it (mostly parents who want their kids to have a lower-distraction computer… since I can customize nearly any aspect of the OS :-))
Party on..
–The loon (leaving off his mindless droning… be nice with the flames… my skin is sensitive to heat)
BTW, just because Bill denies it now doesn’t mean he never said it.