“I have been testing Microsoft operating systems since Windows 95, and this is the buggiest OS I’ve seen this late in development,” says Joe Wilcox, an analyst with Jupiter Research. “Look at the older operating systems, and by Beta 2 there is a stable foundation on which the [independent software vendors] can build. Right now, Vista is like a ship on stormy seas.”
I’d be willing to bet that most of those BSODs are associated with hardware issues, which are the fault of the vendors. I don’t necessarily agree with the way Windows handles drivers, giving them enough power to crash the system, but I don’t blame them fully for it either.
I expect the situation to get better in the next few builds leading up to the release (and, judging by 5456 and 5472, it seems to be heading that way).
Hardware vendors write drivers that rely on undocumented behavior, which then breaks with new OS releases? Color me shocked.
Undocumented behavior? I’d call it more of an overly complex driver model. If you haven’t seen the Channel9 video on kernel-mode drivers, it shows that they’re working on improving it to handle a lot of the mess that could lead to problems.
Too many hoops to jump through? I could believe that (without pointing any fingers).
I’d be willing to bet that most of those BSODs are associated with hardware issues, which are the fault of the vendors.
That’s an excuse which Microsoft has always used, and used a lot for the instability of NT 4.
Can you disprove it somehow? No?
Can you disprove it somehow? No?
Disprove what, exactly?
It is interesting to note that far fewer problems were experienced in the run-up to, and release of, Windows 2000, XP and 2003 than with something like NT 4 (which Vista is starting to resemble).
Can you prove that these failures are the fault of third parties? If so, what are these third parties doing wrong, or is it a result of poor documentation and tools from Microsoft? That’s where the onus is.
Just LOOK at the details of most of the common BSODs; the stop screen even says which driver is at fault.
You’re also missing my point. I’m stating that the fault is not solely Microsoft’s. They bear some of it, but there is much more to factor in than just the design flaws of the driver kit.
How many driver models does it take for Microsoft to get it right? No wonder Linus says your code had better be good to get into the kernel.
AFAIK the Vista driver model is their first change since the NT kernel was introduced; it’s more of a driver framework update that’s needed, and as I’ve stated, it’s already in the works.
AFAIK the Vista driver model is their first change since the NT kernel was introduced; it’s more of a driver framework update that’s needed, and as I’ve stated, it’s already in the works.
The framework (KMDF 1.0) has actually been around since December 2005. The user mode equivalent is still in development (beta) however.
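To give a sense of what that framework handles for you, here is a minimal sketch of a KMDF driver entry point based on the publicly documented WDF calls; the callback name is just a placeholder, so treat it as an illustration rather than anyone’s actual driver. The framework, not the vendor’s code, owns the PnP and power state machine, which is where much of the crash-prone boilerplate in old WDM drivers used to live:

#include <ntddk.h>
#include <wdf.h>

DRIVER_INITIALIZE DriverEntry;
EVT_WDF_DRIVER_DEVICE_ADD SampleEvtDeviceAdd;   /* placeholder callback name */

NTSTATUS DriverEntry(PDRIVER_OBJECT DriverObject, PUNICODE_STRING RegistryPath)
{
    WDF_DRIVER_CONFIG config;

    /* Tell the framework which callback to run when a device arrives;
       KMDF drives the rest of the PnP/power state machine itself. */
    WDF_DRIVER_CONFIG_INIT(&config, SampleEvtDeviceAdd);

    return WdfDriverCreate(DriverObject, RegistryPath,
                           WDF_NO_OBJECT_ATTRIBUTES, &config, WDF_NO_HANDLE);
}

NTSTATUS SampleEvtDeviceAdd(WDFDRIVER Driver, PWDFDEVICE_INIT DeviceInit)
{
    WDFDEVICE device;

    UNREFERENCED_PARAMETER(Driver);

    /* Create a default device object; the framework cleans it up on removal. */
    return WdfDeviceCreate(&DeviceInit, WDF_NO_OBJECT_ATTRIBUTES, &device);
}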
The complaint that the third parties are doing everything wrong simply sounds illogical. Microsoft has admitted that it has made life easier for developers. That means that some or most of the complexity of the drivers is handled by Microsoft itself. So what do you have here? The vendors did a pretty good job when the task of writing drivers was more complex, and failed when it became easier for them. Is that what you are trying to say? Or is it rather that Microsoft put the burden of complexity on its own shoulders and thus overstretched itself? The vendors know better than Microsoft how to handle their own hardware-specific issues, and that is better than the ‘one size fits all’ approach enforced by Microsoft. Just to mention: the basic Unix principle is “less is better”; here we see “more is better”. We will see how it will end.
Like nearly half a year old?
According to the article, Beta 2 was publicly released in June.
Only if you’re counting the series of leaked builds and CTPs that appeared shortly after Beta 1. Those listed Beta 2 as their version, but were not the officially sanctioned release that’s being referred to in the article.
The official Beta 2 is only a couple months old, as noted earlier.
A few things I want to remember about Windows Vista:
– in development since 2001…
– Ballmer said: “Quality, Quality, Quality!”
– the release of Vista should be in 4 months
Of course the drivers! Yes, of course! Damn, who thinks the drivers are crashing the OS!
Joe Wilcox (or the author of the text) didn’t say a single word about drivers from NVIDIA, ATI, Intel or whoever! I think he used the Microsoft Vista drivers.
So open your eyes: Vista will be a new low point in software development by Microsoft.
You hear that?
That’s the sound of marketing wearing off and people realizing they were sold fantastic tales. Let’s try to separate the hype from the real product.
It was particularly good hype, because I couldn’t even tell what he was talking about or what he thought; all I got from it was “Vista.”
Joe Wilcox (or the author of the text) didn’t say a single word about drivers from NVIDIA, ATI, Intel or whoever! I think he used the Microsoft Vista drivers.
So open your eyes: Vista will be a new low point in software development by Microsoft.
Microsoft only makes generalized drivers for certain classes of hardware such as HID and storage. The bulk of the drivers that ship with Vista and any other version of Windows are provided by IHVs. Whether included in the box or obtained via Windows Update or the IHVs’ own website, the drivers are written by NVIDIA, ATI, Intel, et al.
Why, then, do the drivers from the NVIDIA website dramatically increase performance when compared to the drivers that ship with Windows XP?
Surely the IHV (in this case, NVIDIA) wouldn’t send Microsoft a really bad version, and have available for download a much better, faster version?
(And yes – the current NVIDIA drivers on their website are WHQL certified, so they _could_ include them in Windows XP if they wanted).
Because they are just the bare drivers, without all the extras that NVIDIA includes in its unified driver release. Extras like contextual menus, all manner of tweaks and settings, etc. These items are absent from the driver included with Windows.
Don’t forget the lack of an OpenGL ICD as well. Microsoft doesn’t provide its own; the IHVs provide that.
Because the one with Windows is probably a generic VGA driver.
They probably don’t include them simply because there are a variety of support issues with different drivers and different chipsets, and Windows is completely dependent on having a working display. So they stick with a method that’s guaranteed to work, at least well enough to download a working driver.
XP usually uses a VESA driver for unsupported cards. Rarely does it resort to VGA.
All I heard Ballmer say is “Developers, Developers, Developers, Developers”; it’s my favorite composition of his. Betas are always unstable; only in a beta can a company try out technologies and ideas that would otherwise be formidable.
Why is Wilcox just now writing about Beta 2, which is, what, 4 months old? On the various beta sites I’ve seen, testers say that the builds since Beta 2 blow it away in terms of speed, stability and usability. Wilcox’s hit piece is woefully out of date, and largely irrelevant.
My personal experience of the private release since BETA2 is that it is unusable (as was BETA2). I tried using it as an everyday OS, really I did. It was just too flaky.
Yes, it does depend on the hardware. The two machines I used were an Opteron (Vista has had flaky 64-bit drivers for ages) and a new Core Duo laptop (laptops are also notoriously flaky), but if it only works on a small subset of standard desktop hardware, it is really not anywhere near ready for release!
So yeah, if you have a nice new Pentium desktop with tons of RAM and a kick-ass video card it may work for you.
Why is Wilcox just now writing about Beta 2, which is, what, 4 months old? On the various beta sites I’ve seen, testers say that the builds since Beta 2 blow it away in terms of speed, stability and usability. Wilcox’s hit piece is woefully out of date, and largely irrelevant.
I’d say 4 months is a reasonable test period. After all, not all bugs will show up all at once in the first week of testing. Plus you’d want to evaluate OS stability over time, as well as degradation of performance over time (a past Windows affliction).
In my view this is a step up from “reviews” done after a week (or even less) of testing.
I’d say 4 months is a reasonable test period. After all, not all bugs will show up all at once in the first week of testing. Plus you’d want to evaluate OS stability over time, as well as degradation of performance over time (a past Windows affliction).
Agreed.
Most “reviews” of something released just that week/month tend to just be a parade of screenshots.
Well, let’s just wait for the final release; you can’t expect a beta to be rock solid when even the drivers are beta.
If it has the stability of XP when it’s released I’m satisfied, and I’m sure it will.
I’m a Linux fan, but you never hear me complain about the stability of Windows since the release of XP.
I never have crashes, BSODs or anything. If you get them it is usually because of crappy hardware; use that hardware under Linux and you’ll get the same results: instability.
http://www.longhornblogs.com/robert/archive/2006/08/19/Windows_Vist…
“Whew. Brandon was definitely not thrilled with me after this conversation. If you’ve seen my Windows Live Messenger personal message the last couple days, I said “The next build of Vista will rock your world.” It led to a very interesting discussion with Brandon about “expectations”, and what RC1 will shape up to be. Brandon and I both have seen the progress that has been made since 5472, and up until a couple weeks ago, I was pissed at the level of (in)stability Vista still exhibited so close to RC1. My public statement to that fact led to more traffic on this site than I had seen in quite some time.
So what could make me change my mind so quickly? I got a sneak peek at the build TechBeta testers will receive (hopefully) soon. I won’t go into details now, but suffice it to say, it will have been worth the wait. “
At least you won’t get fired for trying Vista, unlike Apple employees trying out early versions of Leopard that they had to download from torrent sites.
If so, I don’t think current Microsoft customers are going to take a huge gamble on upgrading or buying a computer with Vista pre-installed until it surpasses the stability of Windows XP.
Vista: longest MS OS in development ever, buggiest MS RC2 in development ever, most demanding sys reqs ever, and these people STILL think the end result is going to be “the best Windows OS ever” (a hell of a backhanded compliment in itself). What are these people ON?
I would say: propaganda. Or maybe it’s that classy GUI which takes 80% of a PC’s resources to run…
Was it called Avalon, or did that die out along with WinFS?
Correct me if I am wrong, but this has essentially already been posted. Toptechnews.com seems to just be re-hashing an earlier article. The statement “I have been testing Microsoft operating systems since Windows 95, and this is the buggiest OS I’ve seen this late in development,” by Joe Wilcox caught my eye, as it is a re-hash of comments made back in July. If this is the case, then OSNews.com needs to print a retraction and do a better job at source checking, something that to date is sorely lacking on this website.
What would be news, though, is the current state of the beta: are these bugs still present, and has Microsoft addressed them? Now that is news…
Around the time SP3 gets delivered, if the current beta is anything to go by.
If Microsoft still has to make major modifications in the driver area, then it can in no way be considered stable. This is even more critical given that 64-bit drivers HAVE to be approved and signed by MS themselves. If that part of Vista is still very buggy, then I would not like to be one of the third-party driver writers at the moment.
Time will tell whether all that midnight oil burnt in Redmond and at third-party suppliers will make it stable by the time corporates get their hands on it.
Just imagine the scenes in the Microsoft PR department if one of the big early adopters says “No, we are sticking with XP” or, even worse, “we are moving to Mac or Linux”!
I tried Beta 2 on a Dell P4 Xeon server and it worked reasonably well, but due to a lack of real applications I couldn’t give it a really hard test.
I expect that things will improve by FCS, but the $64K question is by how much.
As they say, Watch this space…
During the Beta’s of XP the same thing happened. 3rd party vendors did not write proper drivers until after it was released, and I expect the same to be true for Vista. It took Nvidia alone a month after XP was released to provide XP drivers. Of course Windows XP has basic drivers, just not hardware specific drivers built in. When XP was first released a lot of hardware broke, some to never be supported again. Is that Microsofts fault? Not at all, rather the hardware vendors used a new OS as the impetus to stop supporting hardware.
You know what? I may dislike Microsoft a lot as a monopolist and all, and I won’t easily allow an MS product into my house…
but I suddenly feel sorry for its developers. After all, all they want to do is write great software, and they wouldn’t be getting paid what they’re paid if they weren’t good at it in some way. And yet all these testers finding too many bugs in an OS that has been worked on since 2001, eating 700MB of RAM (OMG!), and soon to be sold as “a great product”… it must be a pain to ’em.
When these MS developers discover, which I don’t doubt they already have, that the open source model is the best model for operating system development, I wonder if they won’t feel “hey guys, we’re not on the winning team,” and I wonder what OS they’ll be using at home.
Yes, transparent windows are the best thing since gravity, but I wonder how much fun they’ll be having up there in Redmond, knowing that a guy named Steve will be tearing them apart if their spaghetti code isn’t ready for shipping soon.
When these MS developers discover, which I don’t doubt they already have, that the open source model is the best model for operating system development, I wonder if they won’t feel “hey guys, we’re not on the winning team,” and I wonder what OS they’ll be using at home.
Oh, puh-lease. You can update all of us on the virtues of open source as the “best model for operating system development” after you prove your case. So far, there’s no evidence that either closed or open source produces better code, more stable code, fewer bugs, etc. There is similarly no evidence that the ability to focus more eyes on source code means that people are actually looking at the code. Just look at the list of recent vulnerabilities for Firefox, Linux distros, etc. There has been an acceleration of bugs found, not a reduction.
Trust me, people do look. And they even send in patches out of the blue. And eyes do watch commit messages. I’ve had emails / IM messages coming in minutes after I’ve committed code that someone has found an issue with.
Sure, people look at the most popular projects (i.e. the Linux kernel, Apache, Firefox). But what about the thousands of lesser-known projects? Who’s watching them? I would submit that few people actually do.
(…) So far, there’s no evidence that either closed or open source produces better code, more stable code, fewer bugs, etc. (…) Just look at the list of recent vulnerabilities for Firefox, Linux distros, etc. There has been an acceleration of bugs found (…)
Firefox is not an operating system, which is what I was referring to. So you need evidence that open source is the best way of designing an operating system? I thought Microsoft itself admitted that too; its objections to open source lie in the fact that it can’t make the huge amounts of money selling open source software that it can make selling closed source software.
But a better case would be the fact that the very stable OSX is based on the very open source Darwin BSD. Jobs wouldn’t have done that if it wasn’t rock solid, which it is. Interestingly, too, the most successful brand of Linux distro families is the Debian family, a very open source OS family.
Well, so are CentOS, Fedora and openSUSE, and I always thought they were employed as servers for, among other things, versatility, stability and security reasons. The fact that they now penetrate the desktop market is due not just to the OS but also to the DE, for which I might argue the open source model is even better. Both KDE and GNOME have been superior to the Windows DE for a few years.
I’m not saying Vista couldn’t become a good OS, I just think it’s harder for Microsoft to achieve it than for open source OSes. And yet they have the budget.
I don’t know if the above is evidence to you. I agree the number of open source bugs is increasing. People should stop employing untested bloatware in distribution releases, but the fact that this is now publicly debated, and that a bugfix-only Linux kernel, for example, might be released, proves that the fast feedback and open discussion of the open source world is good for something like operating system development, which may simply be too complex to be handled by a single company, with all the pre-internet-age tools like NDAs and the like that you’ll need all the time to keep things secret.
I agree that the open source model is better, but Microsoft’s problems stem from more than being a closed-source company. They stem from a combination of a criminally incompetent OS division and a marketing department that’s on opiates.
It all hinges on how people define “operating system”. If you ask Linux advocates, they tell you that it’s “just the Linux kernel”; conversely, when you ask them what the Windows OS is, they’ll lump together not only the Windows kernel but IE, IIS, Outlook Express, and every other application that happens to be on the Windows disc. But just try to point out flaws in Apache and other applications bundled with most Linux distros — and they’ll scream, “… hey, that isn’t Linux!”
See how this game is played? I have no problem comparing apples and apples. But many people around here want an un-level playing field. If you want to talk about OPERATING SYSTEMS, look at it from a functional perspective; that is, how a user would look at it. And, by that standard, an OPERATING SYSTEM comprises not only the kernel but all of the applications which come distributed with it.
When you include all of the vulnerabilities in open source apps distributed with Linux, it becomes obvious that the open source development model is no better than the closed source development model. Doubt it? Go out on secunia.org and look at all of the bugs found in Linux apps distributed with all of the common distros. Similarly, look at the vulnerabilities found in the Linux kernel. It ain’t pretty.
It all hinges on how people define “operating system”. If you ask Linux advocates, they tell you that it’s “just the Linux kernel”; conversely, when you ask them what the Windows OS is, they’ll lump together not only the Windows kernel but IE, IIS, Outlook Express, and every other application that happens to be on the Windows disc. But just try to point out flaws in Apache and other applications bundled with most Linux distros — and they’ll scream, “… hey, that isn’t Linux!”
See how this game is played? I have no problem comparing apples and apples. But many people around here want an un-level playing field. If you want to talk about OPERATING SYSTEMS, look at it from a functional perspective; that is, how a user would look at it. And, by that standard, an OPERATING SYSTEM comprises not only the kernel but all of the applications which come distributed with it.
When you include all of the vulnerabilities in open source apps distributed with Linux, it becomes obvious that the open source development model is no better than the closed source development model. Doubt it? Go out on secunia.org and look at all of the bugs found in Linux apps distributed with all of the common distros. Similarly, look at the vulnerabilities found in the Linux kernel. It ain’t pretty.
That’s why most people who are anal about it call it GNU/Linux: because it is the Linux kernel plus the GNU utilities. The reason that Windows is lumped all together as the OS is because it’s all installed by default. Can you install Windows (without using hacks and workarounds after installation) without IE, Outlook Express, etc.? Nope, didn’t think so. On the other hand, if I install, for instance, Debian, I can leave it with just the kernel and the GNU utilities (which is mostly why OpenBSD is so secure, because that’s about all a default install comes with!). The set of services, etc., that Windows installs and enables by default is enormous compared to what a minimal Linux distribution installs.
The reason that open source has fewer vulnerabilities is because they are patched far more quickly than Windows ones. Sure, the code may originally have a vulnerability in it, but since it is open, any number of developers can find the holes and then submit patches to fix them. With closed source, there could be a known vulnerability for months, and the company controlling the source would have to devote the resources to fix it. That is why open source development is better.
It all hinges on how people define “operating system”. If you ask Linux advocates, they tell you that it’s “just the Linux kernel”; conversely, when you ask them what the Windows OS is, they’ll lump together not only the Windows kernel but (…) every other application that happens to be on the Windows disc.”
That doesn’t look much like a fair judgment. Everybody knows that an operating system is not just the kernel. It’s filesystems, libraries, debuggers, X, what not.
The word Linux is often synonymous with, for example, an entire GNOME desktop. Although a GNOME bug on, say, a Fedora system is not strictly a Linux bug, it is in common speech, so I doubt that many Linux users would be as childish as you suggest. Let’s keep this real.
I do have two simple questions though.
Can you entirely remove the Windows GUI and install KDE on it?
Can you entirely remove all KDE libraries on a FreeBSD system and install Gnome on it?
And no, the operating system is most certainly not all of the applications that come with it.
That makes no sense.
you can update all of us on the virtues of open source as the “best model for operating system development” after you prove your case.
Exhibit no. 1: rate of improvement of MacOS X, *BSD, Linux and OpenOffice vs. rate of improvement of Windows, MS Office and whatever-Corel-are-calling WordPerfect Office.
Case closed.
Exhibit no. 2: Opera
Go back into your whole idiot.
Opera isn’t open source, and it’s NOT better than Firefox.
Learn to spell, cretin.
My point was to counter your argument, and it IS better than Firefox on many points: efficiency, memory usage, known security, speed in many cases.
Popularity != better
p.s. yes I misspelled “hole”. It’s called a typo.
“Exhibit no. 1: rate of improvement of MacOS X, *BSD, Linux and OpenOffice vs. rate of improvement of Windows, MS Office and whatever-Corel-are-calling WordPerfect Office.”
Huh?
Mac OSX is closed-source (yah, there’s an “open” core, but that core isn’t what makes OSX OSX, rather it’s the closed-source goodies on top, and the “improvements” you refer to are closed source).
OO.o started as the German made StarOffice in the 90’s. MS Office has seen much more improvement since then than has OO.o. OO.o barely has the functionality of Office97. And Office 2007 makes OO.o look like utter garbage.
BSD has hardly changed at all in 10 years.
Linux has hardly changed during that same period of time, besides ripping off the Windows UI.
Lastly, the rate of improvement from Windows 3.0 to Windows XP SP2, and of Mac OS6 to Mac OSX 10.4 (both closed-source systems), is orders of magnitude greater than the rate of improvement in Linux during that same time period.
Mac OSX is closed-source (yah, there’s an “open” core, but that core isn’t what makes OSX OSX, rather it’s the closed-source goodies on top, and the “improvements” you refer to are closed source).
I doubt all the improvements in performance can be related to only the GUI.
OO.o started as the German made StarOffice in the 90’s.
Yeah, and Linux started as a Finnish product in the 90’s, too. Your point?
MS Office has seen much more improvement since then than has OO.o.
Newsflash: “incorporating yet more features nobody needs” != “improving”.
OO.o barely has the functionality of Office97. And Office 2007 makes OO.o look like utter garbage.
I suspect most people would be happy with the “functionality of Office 97”. And unless I’m mistaken, Office 2007 has yet to be released, so that’s irrelevant. Unless the UI stays the same, AND yet more changes to the User Interface and an even MORE scrumptious look are indispensable features, which they aren’t, no matter how much Apple and Microsoft and you MacOS/Windows fanatics say they are.
BSD has hardly changed at all in 10 years.
It has if you look under the hood (driver support, etc.) As a Windows weenie, you are not expected to understand this.
Linux has hardly changed during that same period of time, besides ripping off the Windows UI.
Logical volume management, SMP, threading, preemptible kernel, MS-Office replacement, hardware detection, dynamic device driver configuration, graphics card detection, Amarok, zsh, ogg, stability improvements, composite, compiz, OpenGL, Evolution, Thunderbird, Firefox, WINE.
No, hardly any change at all. And if you don’t want us to “rip off the Windows GUI,” stop that pathetic whining about how Linux is hard to use.
Lastly, the rate of improvement from Windows 3.0 to Windows XP SP2, and of Mac OS6 to Mac OSX 10.4 (both closed-source systems), is orders of magnitude greater than the rate of improvement in Linux during that same time period.
Mac OS improved in one fell swoop by the simple expedient of buying an OS that was much better than the one it had been using. Of which the greater part was open source, by the way. So, close but no cigar.
As for your Windows comments, in your dreams, pal, in your dreams.
Linux has hardly changed during that same period of time
Yeah, sure. Let’s forget that KDE and GNOME didn’t even exist 10 years ago.
Please just compare NLD9 and 10 (SLED10) and maybe I’ll stop laughing at that comment.
Rate of improvement is a worthless measure here, because you can’t find any steady base to start the rate measurements from.
A great deal of copying of concepts occurs, which favors “rate of improvement” for the one who does the copying.
Rate of improvement is a worthless measure here, because you can’t find any steady base to start the rate measurements from.
Yes you can: Linux, BSD etc. on a given date vs. Windows, MacOS on a given date.
A great deal of copying of concepts occurs, which favors “rate of improvement” for the one who does the copying.
Nevertheless, imagine the following scenario: take two groups of people developing software. One group develops products using one development method, and the other (identically-sized) group develops products using another. Whichever group’s products show the greatest rate of improvement is almost guaranteed to be the one using the best methodology.
Of course, MS apologists will say that open source developers are not professionals. It’s true that they aren’t NECESSARILY professionals, but the top dogs in successful open source products (Linus, Maddog, for example) usually are. And given MS DOS 4, Windows 98, and the first versions of Word Perfect for Windows, professionals’ track record doesn’t seem to be all it’s cracked up to be, either.
Or perhaps closed-source fans would like to point to the size of the developer base in open source, and say that gives them an “unfair advantage” – or perhaps that it makes the development process “too complex”. Well, sports fans, MS is probably the biggest software development organization out there (bar IBM, *maybe*), so if MS is better than everyone else based on size, it’s clear that big is best. So open source, if bigger, can’t be “too complex”. And if that is true then the “size of developer base = unfair advantage” point can’t be much of a point at all, either.
“Rate of improvement is a worthless measure here, because you can’t find any steady base to start the rate measurements from.”
Well, a Linux fanboy brought it up.
Anyway, let’s just start with Linux’ birth and go from there. When did Linux first arrive, 1991/1992?
OK. So since then, Windows went from Win3.0 to XP SP2. That is, Windows went from a 16-bit, single-user, cooperative-multitasking OS with shared memory for the OS and all apps, to a 32-bit, multiuser, pre-emptive-multitasking OS with a separate address space for the OS and each app. Windows also saw vast improvement in the file system, vast improvement in graphics/sound capabilities, added an object model allowing objects of apps to be embedded into documents of other apps, the .NET framework, etc.
In the meantime, Linux went from Linux to Linux + GNOME/KDE (UIs totally ripped off from Windows, BTW).
Which has seen the greater improvement? It’s pretty obvious.
OK. So since then, Windows went from Win3.0 to XP SP2. That is, Windows went from a 16-bit, single-user, cooperative-multitasking OS with shared memory for the OS and all apps, to a 32-bit, multiuser, pre-emptive-multitasking OS with a separate address space for the OS and each app. Windows also saw vast improvement in the file system, vast improvement in graphics/sound capabilities, added an object model allowing objects of apps to be embedded into documents of other apps, the .NET framework, etc.
Win *needed* big improvements because it was even crappier than it is now in the first place.
Linux went from:
nothing
to:
an OS comparable to XP in ease of use (and incomparably more stable);
running natively in 64 bits;
seeing vast improvements in the file system, graphics, sound, networking, and autodetection capabilities;
adding TWO object models allowing objects of apps to be embedded into documents of other apps;
.Mono framework;
etc
in the same time that Windows went from:
Windows 3,
to
being a *rewritten* version of VMS;
having something approaching stability
And all the while, Linux disproved by turns all the rubbish about how crap it was, despite its users and developers constantly being distracted by whining Windows weenies spouting the usual retarded shite.
Anyone who compared the progress between RH 5 and 6 (and was actually LOOKING) could see as plain as the nose on my face it was going to blow Windows out of the water some day.
Which has seen the greater improvement? Yes, it’s pretty obviously not Windows, except to the few remaining learning-impaired Windows fanboys who are still on Microsoft-prescription drugs.
————— Twenex
OO.o started as the German made StarOffice in the 90’s.
Yeah, and Linux started as a Finnish product in the 90’s, too. Your point?
Myself: His point should have been obvious… OO.o was developed off of StarOffice, which was a commercial product for many years. Interestingly enough, Mozilla/Firefox was also developed from a commercial base, namely Netscape.
—————
MS Office has seen much more improvement since then than has OO.o.
Newsflash: “incorporating yet more features nobody needs” != “improving”.
OO.o barely has the functionality of Office97. And Office 2007 makes OO.o look like utter garbage.
I suspect most people would be happy with the “functionality of Office 97”. And unless I’m mistaken, Office 2007 has yet to be released, so that’s irrelevant. Unless the UI stays the same, AND yet more changes to the User Interface and an even MORE scrumptious look are indispensable features, which they aren’t, no matter how much Apple and Microsoft and you MacOS/Windows fanatics say they are.
Myself: Many people, including myself, would really like an outline mode, though I agree that OpenOffice satisfies most needs. I also like the comparative size of OpenOffice (around 40 MB). Sun sure has been nice to give us such a great product.
—————
BSD has hardly changed at all in 10 years.
It has if you look under the hood (driver support, etc.) As a Windows weenie, you are not expected to understand this.
Myself: First off, you’re quite obviously a Linux weenie, so you’re not expected to understand that your head is in your… well, you can guess. You’ve nothing intelligent to say to this extremely simple statement, which is sad.
—————
Linux has hardly changed during that same period of time, besides ripping off the Windows UI.
Logical volume management, SMP, threading, preemptible kernel, MS-Office replacement, hardware detection, dynamic device driver configuration, graphics card detection, Amarok, zsh, ogg, stability improvements, composite, compiz, OpenGL, Evolution, Thunderbird, Firefox, WINE.
No, hardly any change at all. And if you don’t want us to “rip off the Windows GUI,” stop that pathetic whining about how Linux is hard to use.
Myself: For someone who doesn’t spend all their time in front of a computer, it is hard to use. There’s no centralized API like DirectX, which means you need a million and one dependencies to satisfy the requirements of the programs you actually want to use. There’s no centralized configuration for anything. The tools they give you are so incompetent that they neither let you sufficiently tell whatever you’re configuring what to do, nor appropriately reflect the changes you’ve made when you edit the text files yourself. Programs are harder to install and there aren’t as many of them. Often you’re left searching the web trying to find a specific version of something to install so you can get another application working. In short, if both the dependencies and the app you’d like are not packaged by your Linux vendor, you’ve a large, painful ordeal on your hands (and I’ve yet to find a distro that doesn’t miss something), and there’s no uninstallation for programs that have to be compiled or don’t have an RPM. A far cry from a simple click and it’s installed, don’t you think?
——- This last one is a long spout from our favorite fanboy ——-
Lastly, the rate of improvement from Windows 3.0 to Windows XP SP2, and of Mac OS6 to Mac OSX 10.4 (both closed-source systems), is orders of magnitude greater than the rate of improvement in Linux during that same time period.
Mac OS improved in one fell swoop by the simple expedient of buying an OS that was much better than the one it had been using. Of which the greater part was open source, by the way. So, close but no cigar.
As for your Windows comments, in your dreams, pal, in your dreams.
OK. So since then, Windows went from Win3.0 to XP SP2. That is, Windows went from a 16-bit, single-user, cooperative-multitasking OS with shared memory for the OS and all apps, to a 32-bit, multiuser, pre-emptive-multitasking OS with a separate address space for the OS and each app. Windows also saw vast improvement in the file system, vast improvement in graphics/sound capabilities, added an object model allowing objects of apps to be embedded into documents of other apps, the .NET framework, etc.
Win *needed* big improvements because it was even crappier than it is now in the first place.
Linux went from:
nothing
to:
an OS comparable to XP in ease of use (and incomparably more stable);
running natively in 64 bits;
seeing vast improvements in the file system, graphics, sound, networking, and autodetection capabilities;
Myself: To bring the Graphics/Sound/Autodetection capabilities up to Windows level.
twenex again: adding TWO object models allowing objects of apps to be embedded into documents of other apps;
.Mono framework;
etc
in the same time that Windows went from:
Windows 3,
to
being a *rewritten* version of VMS;
having something approaching stability
And all the while, Linux disproved by turns all the rubbish about how crap it was, despite its users and developers constantly being distracted by whining Windows weenies spouting the usual retarded shite.
Anyone who compared the progress between RH 5 and 6 (and was actually LOOKING) could see as plain as the nose on my face it was going to blow Windows out of the water some day.
Which has seen the greater improvement? Yes, it’s pretty obviously not Windows, except to the few remaining learning-impaired Windows fanboys who are still on Microsoft-prescription drugs.
Myself: Where’s the DirectX equivalent for Linux? Preferably using OpenGL and OpenAL. Mono is a rewrite of something started by Microsoft (you might have heard of .NET). I’m not saying I think Vista will be great… I’ve not been impressed with it by any means. Linux has the same problem Mac OS X has: its users are infallible. Ask a Windows user if he likes his OS and he’ll list a slew of issues that could use improvement. A Linux/Mac acolyte will simply tell you his OS is better than everything else, and if it needs improvement in a certain area, it’s either not a problem or it’s better than Windows (even if it’s not).
Myself: For someone who doesn’t spend all their time in front of a computer, it is hard to use. There’s no centralized API like DirectX, which means you need a million and one dependencies to satisfy the requirements of the programs you actually want to use. There’s no centralized configuration for anything. The tools they give you are so incompetent that they neither let you sufficiently tell whatever you’re configuring what to do, nor appropriately reflect the changes you’ve made when you edit the text files yourself. Programs are harder to install and there aren’t as many of them. Often you’re left searching the web trying to find a specific version of something to install so you can get another application working. In short, if both the dependencies and the app you’d like are not packaged by your Linux vendor, you’ve a large, painful ordeal on your hands (and I’ve yet to find a distro that doesn’t miss something), and there’s no uninstallation for programs that have to be compiled or don’t have an RPM. A far cry from a simple click and it’s installed, don’t you think?
Your original point was not that it has not become easier to use (also wrong, no surprise there), but that it has not improved. Since you lost that argument, you now move the goalposts.
Sorry, “his original point”, “he lost..”
It sure was nice of twenex to decide what the original point of my post was. Even though I made many points… I’ll summarize:
Commercial products have enhanced software quite a bit. In fact, many of the most successful applications have roots in commercial applications: Mozilla/Firefox from Netscape code, OpenOffice from StarOffice code.
Many people, including myself, would really like to see more features. It’s often been said that in the early days open source’s downfall was that it’s easier to learn how to spell correctly than to code a spellchecker. It took a long time for the Linux community to get an office product as easy to use as OpenOffice. All software hits what’s considered a ‘good enough’ stage where it’s very hard to improve much further.
MS Office hit that a long time ago, and now Linux has something as well, though derived from a source that was originally commercial. This is a direct counterexample to your ‘open source is better’ methodology, so you, Mr. twenex, being an evangelist, are conveniently ignoring the many times both sides have triumphed or failed.
For someone who doesn’t spend all their time in front of a computer, Linux can be hard to use. There’s no centralized API like DirectX, which means you need a million dependencies to satisfy the requirements of the programs you use. There’s no centralized configuration for anything, and the GUI tools they give you neither let you sufficiently configure whatever you’re configuring nor appropriately reflect the changes you’ve made when you edit the text files yourself. Often when you’re trying to install a program you’re left searching the web for a specific version of something to satisfy a dependency, and I’ve not found a distribution yet that had everything I needed to run my various programs. If both the dependencies and the app you’d like are not packaged by your Linux vendor, you’ve a large, painful ordeal on your hands, and there’s no uninstallation for programs that have to be compiled or don’t have a package. This is a far cry from a simple click and it’s installed.
Where’s the DirectX equivalent for Linux? Preferably using OpenGL and OpenAL. Mono is a rewrite of something started by Microsoft (you might have heard of .NET). Linux and OS X fanboys are a big problem; a Linux/Mac acolyte will simply tell you his OS is better than everything else, and if it needs improvement in a certain area, it’s either not a problem or it’s better than Windows (even if it’s not).
I could be mistaken here, but is the Simple DirectMedia Layer actually part of the OS, or just an API like Qt/GLFW/etc.? If it is part of the OS then my apologies… if not, then SDL is in no way anything like DirectX, which provides file/networking/video/sound/input for the operating system.
Commercial products have enhanced software quite a bit. In fact, many of the most successful applications have roots in commercial applications: Mozilla/Firefox from Netscape code, OpenOffice from StarOffice code.
Those applications have entered their successful phase (or another successful phase) only *after* being open-sourced.
Many people, including myself, would really like to see more features.
Fair enough, however just because a program has 100 more features than another doesn’t mean it’s better. If it crashes 200 times more often than the other one, it’s worse.
It’s often been said that in the early days open source’s downfall was that it’s easier to learn how to spell correctly than to code a spellchecker. It took a long time for the Linux community to get an office product as easy to use as OpenOffice.
About five years. I may be wrong in thinking MS Office took that long to be better than Word Perfect.
All software hits what’s considered a ‘good enough’ stage where it’s very hard to improve much further.
Agreed
MS Office hit that a long time ago, and now Linux has something as well, though derived from a source that was originally commercial. This is a direct counterexample to your ‘open source is better’ methodology, so you, Mr. twenex, being an evangelist, are conveniently ignoring the many times both sides have triumphed or failed.
There are many reasons why I think open source is better. My favourite is not speed of improvement, but the fact that it prevents vendor lock-in if done properly.
For someone who doesn’t spend all their time in front of a computer, Linux can be hard to use.
So can Windows. I’ve never said Linux didn’t need improving; however, it’s not my opinion that it’s worse for ease of use than Windows now, and it IS my opinion that the need to rely on the command line, and the difficulty of doing that, is exaggerated. And as I’ve stated in another item, Ubuntu’s recent problems with X are inexcusable from a testing standpoint, and unacceptable in a distro which markets itself to Windows users who don’t want to change anything about how they use computers.
There’s no centralized API like DirectX, which means you need a million dependencies to satisfy the requirements of the programs you use.
10 years ago there was no standard desktop. The rip-off desktops were “ripped off” because people like you think that if it’s different from Windows, that means it’s hard. You can’t have it both ways. Now there are two, and no indication there’ll be a third. If people start porting games to Linux natively en masse, the demand for a standard gaming API will increase. And be met.
There’s no centralized configuration for anything;
Depends on distro. Where and how you configure stuff in Windows depends on version.
and the GUI tools they give you neither let you sufficiently configure whatever you’re configuring nor appropriately reflect the changes you’ve made when you edit the text files yourself.
I’ve never seen a program that ignored text file configuration, unless you’re not doing something correctly. Give me text files over the Registry anyday – a program can easily convert text files to binary itself, a person cannot easily do the reverse. Text files are also easier to use and navigate than the registry.
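To make that concrete, here is a rough sketch, not taken from any real program and with a made-up file name, of how little code it takes for a program to turn a plain key=value text file into whatever internal form it wants, while the copy on disk stays readable and editable by hand:

#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *f = fopen("app.conf", "r");   /* hypothetical config file name */
    char line[256];

    if (f == NULL) {
        perror("app.conf");
        return 1;
    }

    while (fgets(line, sizeof(line), f) != NULL) {
        char *eq;

        if (line[0] == '#' || line[0] == '\n')   /* skip comments and blanks */
            continue;
        line[strcspn(line, "\n")] = '\0';        /* strip the trailing newline */
        eq = strchr(line, '=');
        if (eq != NULL) {
            *eq = '\0';
            printf("key=%s value=%s\n", line, eq + 1);
        }
    }

    fclose(f);
    return 0;
}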
Often when you’re trying to install a program you’re left searching the web for a specific version of something to satisfy a dependency, and I’ve not found a distribution yet that had everything I needed to run my various programs. If both the dependencies and the app you’d like are not packaged by your Linux vendor, you’ve a large, painful ordeal on your hands
unless you install software exclusively from manufacturers’ disks and cover disks, you can spend as much time on Windows looking for a decent program that does what you want.
and there’s no uninstallation for programs that have to be compiled or don’t have a package.
“make uninstall”? If you can’t type that, why do you need a word processor?
This is a far cry from a simple click and it’s installed.
a lot of people prefer the package installation system to the “google it, download it, install it” system. At worst they are equivalent. Windows has a larger selection of apps than Linux because it is used more. I’ve never denied that it’s the most fantastic collection of software available for computers today – but that is not a reflection on the quality of the operating system. People use an application platform, not an OS, but if you can make do with a smaller application platform, you might as well use the best OS you can.
Where’s the DirectX equivalent for Linux? Preferably using OpenGL and OpenAL.
Was that you answering your own question?
Mono is a rewrite of something started by Microsoft (you might have heard of .NET).
So is .doc compatibility in OO.org. Microsoft technology is not always as much of a problem as Microsoft myopia and lock-in. Apart from Beagle, who uses Mono? Who uses .NET?
Linux and OS X fanboys are a big problem;
So are Windows fanboys. Bigger, actually, since Microsoft is so dominant.
a Linux/Mac acolyte will simply tell you his OS is better than everything else, and if it needs improvement in a certain area, it’s either not a problem or it’s better than Windows (even if it’s not).
Which is different from a Windows “acolyte” how?
I could be mistaken here, but is the Simple DirectMedia Layer actually part of the OS, or just an API like Qt/GLFW/etc.?
I’ve no idea. What difference does it make?
If it is part of the OS then my apologies… if not, then SDL is in no way anything like DirectX, which provides file/networking/video/sound/input for the operating system.
I think it’s pretty clear to most people that stuffing more and more functionality into one product is going to lead to problems. There’s a reason why the font of all human knowledge and literary creativity is not collated into one book.
Myself: There’s no centralized API like DirectX, which means you need a million dependencies to satisfy the requirements of the programs you use.
Twenex: 10 years ago there was no standard desktop. The rip-off desktops were “ripped off” because people like you think that if it’s different from Windows, that means it’s hard. You can’t have it both ways. Now there are two, and no indication there’ll be a third. If people start porting games to Linux natively en masse, the demand for a standard gaming API will increase. And be met.
Myself: First off, I don’t believe that anyone should be locked into any shell… hell, a multi-tabbed CLI that was capable of graphics, with one app per screen and good multiple-monitor support, has long been my personal dream.
On the topic of a centralized API: it would make things much easier for programmers; as it stands, programs often have to choose between ALSA, OSS and KDE’s sound system, as an example. Programs that use different libraries to do the same things often cause problems with each other.
—
Myself: … the GUI tools they give you neither let you sufficiently configure whatever you’re configuring nor appropriately reflect the changes you’ve made when you edit the text files yourself.
Twenex: I’ve never seen a program that ignored text file configuration, unless you’re not doing something correctly. Give me text files over the Registry anyday – a program can easily convert text files to binary itself, a person cannot easily do the reverse. Text files are also easier to use and navigate than the registry.
Myself: Try editing xorg.conf; most distros’ configuration tools put a line at the top saying don’t edit this, and if you do, neither KDE’s tools, Xorg’s, nor many others I know of play well with the file you’ve modified (unless the change was relatively minor, like changing a value already there). Also, I believe I expressed my dislike of the registry in a previous post; however, Windows provides a way for just about everything to be configured through the Control Panel without needing to edit the registry, which no Linux distro to date has, so it’s a moot point. I do have a feeling that by ‘centralized configuration’ we were talking about two different things.
——
Myself: Often when you’re trying to install a program you’re left searching the web for a specific version of something to satisfy a dependency, and I’ve not found a distribution yet that had everything I needed to run my various programs. If both the dependencies and the app you’d like are not packaged by your Linux vendor, you’ve a large, painful ordeal on your hands
Twenex: unless you install software exclusively from manufacturers’ disks and cover disks, you can spend as much time on Windows looking for a decent program that does what you want.
Myself: People usually don’t have to look for software with Windows. There’s a huge library with millions of very well-written programs. Whether downloaded or installed from disk, it’s as easy as double-clicking and it’s installed. When you want to get rid of it (with a well-written program) you just go to Add/Remove Programs and with a click it’s gone.
With Linux, the dependencies had better be included on your distro’s package server (most but not all are), and with uninstallation (with a well-written program) you still have dependencies you must decide what to do with.
——
Myself: and there’s no uninstallation for programs that have to be compiled or don’t have a package.
Twenex: “make uninstall”? If you can’t type that, why do you need a word processor?
Myself: Then you must keep the source for all your programs… hopefully different versions do the same.
——
Myself: This is a far cry from a simple click and it’s installed.
Twenex: a lot of people prefer the package installation system to the “google it, download it, install it” system. At worst they are equivalent. Windows has a larger selection of apps than Linux because it is used more. I’ve never denied that it’s the most fantastic collection of software available for computers today – but that is not a reflection on the quality of the operating system. People use an application platform, not an OS, but if you can make do with a smaller application platform, you might as well use the best OS you can.
Myself: It’s also not a reflection on the quality of the operating system whether the distro’s package server happens to have all the dependencies you need to install your software. It’s also my bet that more people aren’t fond of the package/dependency system than are.
——
Myself: Where’s the DirectX equivalent for Linux? Preferably using OpenGL and OpenAL.
Twenex: Was that you answering your own question?
Myself: That’s the video and audio, but that doesn’t take care of input/networking/filesystem; that was rather more of a suggestion for a starting point something could be built around.
——
Myself: Linux and OS X fanboys are a big problem;
Twenex: So are Windows fanboys. Bigger, actually, since Microsoft is so dominant.
Myself: I don’t believe they are bigger, since I believe they are what keeps Linux from really shining.
Linux fanboys stop things from happening by falsely insisting that overly complicated is better… unfortunately, he is also the one developing the applications and trying to make them user-friendly and adaptable. In any case, fanboys on either side are annoying… I dislike people who can see no fault in something that is imperfect; however, I don’t feel at the moment that I’m addressing a Windows fanboy.
——
Myself: a Linux/Mac acolyte will simply tell you his OS is better than everything else, and if it needs improvement in a certain area, it’s either not a problem or it’s better than Windows (even if it’s not).
Twenex: Which is different from a Windows “acolyte” how?
Myself: Actually this is pretty much the same as above, but my point is that any of them will swear a problem isn’t a problem, which goes for Windows as well. Honestly, I don’t believe there are many Windows fanboy acolytes, as most Windows users have no love for Microsoft and will gladly take issue and discuss.
I’ve only a few times seen someone go up to a Mac or Linux user and suggest their OS/hardware is a poor choice for them. The opposite happens quite often, it seems, usually much to the annoyance of the person they’re talking to, even if it’s harder or may not even work for them. There have been many discussions of the Linux/Mac cult due to this type of behavior.
——
Myself: I could be mistaken here, but is the Simple DirectMedia Layer actually part of the OS, or just an API like Qt/GLFW/etc.?
Twenex: I’ve no idea. What difference does it make?
Myself: This was in response to another post, but I believe I covered this earlier (see ‘There’s no centralized API like DirectX’); however, there’s a big difference… they’re not even the same thing: one’s a cross-platform API while the other is what drivers optimize for.
—–
Myself: If it is part of the OS then my apologies… if not, then SDL is in no way anything like DirectX, which provides file/networking/video/sound/input for the operating system.
Twenex: I think it’s pretty clear to most people that stuffing more and more functionality into one product is going to lead to problems. There’s a reason why the font of all human knowledge and literary creativity is not collated into one book.
Myself: Again, this was in response to the aforementioned other post, however… Call me silly, but I believe that all the documented functions of an operating system should serve simple purposes and be easily documented in a single book. We’re not talking about the whole world; we’re talking about how to create windows, network connections, handle files, etc… Why is it that nowadays people think more complicated = better?
I could be mistaken here, but is the Simple DirectMedia Layer actually part of the OS, or just an API like Qt/GLFW/etc.?
This was in response to another post, but I believe I covered this earlier (see ‘There’s no centralized API like DirectX’); however, there’s a big difference… they’re not even the same thing: one’s a cross-platform API while the other is what drivers optimize for.
If it is part of the OS then my apologies… if not, then SDL is in no way anything like DirectX, which provides file/networking/video/sound/input for the operating system.
This is probably just my opinion, carved out of personal experience. But wouldn’t you prefer an API that talks to the hardware through the operating system rather than talking directly to the hardware itself? I think DirectX tries to do the latter, which means that if there is a bug in a game, it could quite possibly take down the entire OS. I’ve never had that happen with SDL, and yet there is really no performance difference between a game written with DirectX and one written using SDL for Linux. In fact, last I heard, the guy who created SDL and was working at Loki now works at Blizzard.
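As a rough illustration of that ‘through the OS’ point, this is approximately what setup looks like with the SDL 1.2 API of that era; it is a bare sketch, not taken from any real game, and the caption string is made up. SDL asks the OS (X11, DirectDraw, and so on) for a surface instead of poking the hardware directly, so a buggy game generally just dies on its own:

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *screen;

    (void)argc;
    (void)argv;

    /* Initialise video and audio through whatever backend the OS provides. */
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Request a 640x480, 32-bit software surface; SDL negotiates it with the OS. */
    screen = SDL_SetVideoMode(640, 480, 32, SDL_SWSURFACE);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_WM_SetCaption("sdl-sketch", NULL);   /* made-up window caption */
    SDL_Delay(2000);                         /* keep the window up briefly */
    SDL_Quit();                              /* SDL restores the display on exit */
    return 0;
}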
On the topic of a centralized API: it would make things much easier for programmers; as it stands, programs often have to choose between ALSA, OSS and KDE’s sound system, as an example. Programs that use different libraries to do the same things often cause problems with each other.
You’re wrong, and as it’s pretty obvious you don’t know anything about Linux, you’ll obviously be wrong often, as you have nothing to base your argument on.
KDE’s sound system is already a wrapper around OSS/ALSA. ALSA and OSS provide compatibility with each other. Problems were caused before because of some features that one system did not support. There is actually no problem now. A centralized API does not mean anything, BTW. What you are talking about is a monopoly on the API for each subsystem, which is impossible to do on Linux, as most APIs on Linux are meant to be portable.
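For anyone wondering what those wrappers sit on top of, here is a minimal playback sketch against the plain alsa-lib API; the parameters are arbitrary and snd_pcm_set_params is a convenience call from later alsa-lib releases, so take it as illustrative rather than definitive. Sound servers like aRts ultimately push their mixed audio down to something like this:

#include <stdio.h>
#include <string.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_pcm_t *pcm;
    short buf[4800];                 /* 100 ms of 48 kHz mono silence */
    int err;

    memset(buf, 0, sizeof(buf));

    /* "default" routes through whatever the distro has configured (dmix, etc.). */
    if ((err = snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0)) < 0) {
        fprintf(stderr, "snd_pcm_open: %s\n", snd_strerror(err));
        return 1;
    }

    /* Format, access mode, channels, rate, resampling and latency in one call. */
    err = snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
                             SND_PCM_ACCESS_RW_INTERLEAVED,
                             1, 48000, 1, 100000);
    if (err < 0) {
        fprintf(stderr, "snd_pcm_set_params: %s\n", snd_strerror(err));
        snd_pcm_close(pcm);
        return 1;
    }

    snd_pcm_writei(pcm, buf, sizeof(buf) / sizeof(buf[0]));   /* frames (mono) */
    snd_pcm_drain(pcm);
    snd_pcm_close(pcm);
    return 0;
}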
Try editing the xorg.conf, most distro’s configuration tools put a line at the top saying don’t edit this, if you do neither kde’s, xorg’s, or in many cases i know play well with the file you’ve modified (unless it was relatively miner like changing a value already there)
Which is not a problem, as if you modify the file, that means you know what you are doing. Even if you didn’t, wiping the file (yes, even Xorg’s) and then launching your distro’s config tool (or even rebooting, it works in Mandriva) will put a good file back. BTW, I’ve done exactly what you talk about for someone on Mandriva, and the config tool was never bothered by the things it didn’t understand in the file. So no problem adding some configuration specific to the NVidia closed driver. So you’re wrong anyway.
Windows provides a way for just about everything to be configured through the Control Panel, without needing to edit the registry, which no Linux distro to date has, so it’s a moot point.
Again you’re wrong: at least Mandriva has one, and I know SUSE had one too.
People usually don’t have to look for software with Windows. Huge library with millions of very well-written programs.
Enlighten us: where is this huge library of millions of well-written Windows programs that people don’t have to look for? Man, you’re living in complete denial, what do you smoke?
Whether downloaded or installed from disk, it’s as easy as double-clicking and it’s installed.
I’ve rarely seen programs installed in Windows with a simple double-click. You’re clearly lying there, sorry. But it’s obvious why: that’s because in Linux, two clicks and it’s installed, and you have nothing like that on Windows.
When you want to get rid of it (with a well-written program) you just go to Add/Remove Programs and with a click it’s gone.
Yeah of course. I’ve yet to find a Windows user for which it has worked every time. Again, you have to say this to counter the Linux way that works way better.
I’ve got the feeling that you try really hard to put Windows on top of everything Linux does better.
With Linux, the dependencies had better be included on your distro’s package server (most but not all are), and with uninstallation (of a well-written program), you still have dependencies you must decide what to do with.
You don’t have to do anything about the dependencies that stay when you uninstall some app. You try to find problems where there aren’t any.
Then you must keep source for all your programs … hopefully different versions do the same.
I compile everything from source. There is automatic package management and installation for source tarballs, you know!
You really believed people installing from source type ./configure; make; make install every time? Only Windows trolls believe that.
That’s the video and audio, but that doesn’t take care of input/networking/filesystem. It was rather more of a suggestion for a starting point that something could be built around.
BS, people already told you there is SDL. SDL is specifically for games, like DirectX, and it does everything you asked for.
I don’t believe they are bigger, since I believe they are what keep Linux from really shining.
Now I recognise the troll. No they are not what kept Linux from really shining, MS is responsible for that.
Linux fanboys stop things from happening by falsely insisting that overly complicated is better.
Talk about strawman. You’re really clueless. Linux fanboys don’t stop anything, they’re just vocal.
And the mantra of Linux has never been “overly complicated is better”, it’s the contrary.
I’ve only a few times seen someone go up to a Mac or Linux user and suggest their OS/hardware is a poor choice for them. The opposite happens quite often, it seems, usually much to the annoyance of the person they’re talking to, even if it’s harder or may not even work for them.
And having no thinking power, you never realised that people on Windows often come to the Linux guy when they have a problem, and they come OFTEN.
While the Linux guy will come to other Linux guys, not to Windows users. Try to think about what this means.
This was in response to another post, but I believe I covered this earlier (see ‘There’s no centralized API like DirectX’). However, there’s a big difference … they’re not even the same thing; one’s a cross-platform API while the other is what drivers optimize for.
You don’t even know what DirectX is … Pretty sad. SDL is the same thing as DirectX. I don’t know what you mean by drivers optimizing for it. Drivers provide hooks for the API, that’s all. DirectX is a wrapper providing a common API around drivers. A “centralised API” doesn’t mean anything. And DirectX is not cross-platform, which SDL is.
SDL. Simple DirectMedia Layer. http://www.libsdl.org/index.php Unlike DirectX it’s also cross platform, will talk to the hardware through OpenGL and OpenAL, etc.
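To make the comparison above a bit more concrete, here is a minimal, hypothetical sketch of my own (not from any poster), assuming the SDL 1.2 C API that was current at the time. The program only ever talks to SDL; SDL then picks a video backend (X11, DirectDraw, …) and an audio backend (ALSA, OSS, DirectSound, …) exposed by the OS, rather than touching the hardware or a specific driver API directly:

    /* Hedged sketch: SDL 1.2-era C. Initialise video and audio through SDL
     * and let it negotiate with whatever backend the OS provides. */
    #include <SDL/SDL.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        SDL_Surface *screen;

        if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) != 0) {
            fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
            return 1;
        }

        /* Ask for a 640x480, 32-bit window; which backend provides it
         * is SDL's problem, not the game's. */
        screen = SDL_SetVideoMode(640, 480, 32, SDL_SWSURFACE);
        if (screen == NULL) {
            fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
            SDL_Quit();
            return 1;
        }

        SDL_Delay(2000);   /* keep the window up for a couple of seconds */
        SDL_Quit();        /* hand the video and audio devices back to the OS */
        return 0;
    }

The same source builds unchanged on Linux and Windows, and which sound system ends up being used is decided by SDL at runtime, which is the sense in which a common layer spares applications from choosing between competing subsystem APIs.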
I still don’t see how Gnome has been ‘ripping off’ Windows. The interface is as completely different as you can get in a GUI (sure, maybe the minimize, maximize close buttons are in the same place.)
What really angers me about Microsoft’s Operating Systems is how they are so greedy. For example, something like theming. I was waiting for MS to put out an OS that had good themes. Even back before Windows 98 came out, there were color gradients in one of the betas. It disappeared for the full release only to come back in Win2k (I never used ME, thank the gods). For XP, when the screenshots first started to appear, I was thinking “Sweet, it’s about time they supported themes” What did we get? They charged for the “Plus” pack, you had to use a hack to use free themes, and the themes pretty much sucked. Most of the good themes for it are actually just copies from Gnome/KDE themes.
I will say that Linux is not perfect. No operating system out there really is. But it does all that I need, and more efficiently and stably than Windows XP does.
Oh, and perhaps this is a little off topic, but if OEMs are going to start including Vista when it comes out, they’d better install the proper version of it to take advantage of the hardware. I’ve been seeing laptops and desktops with the Intel Core Duo in them, but having XP Home edition installed! What’s the point in having dual core processors if they’re not using XP Pro?
BSD has hardly changed at all in 10 years.
It has if you look under the hood (driver support, etc.) As a Windows weenie, you are not expected to understand this.
Myself: First off, you’re quite obviously a Linux weenie, so you’re not expected to understand that your head is in your … well, you can guess. You’ve nothing intelligent to say to this extremely simple statement, which is sad.
Wow. In soccer they call that an own goal, and it’s a big one.
OK. So since then, Windows went from Win3.0 to XP SP2. That is, Windows went from a 16-bit, single-user, cooperative-multitasking OS with shared memory for the OS and all apps, to a 32-bit, multiuser, pre-emptive-multitasking OS with separate address spaces for the OS and each app. Windows also saw vast improvement in the file system, vast improvement in graphics/sound capabilities, added an object model allowing objects of apps to be embedded into documents of other apps, the .NET framework, etc.
In the meantime, Linux went from Linux to Linux + GNOME/KDE (UI’s totally ripped off from Windows, BTW).
Tell me you can’t think right or you’re clueless, please.
You mean Linux + GNOME/KDE means: SMP, works on Sparc/ARM/PPC/Cell, …, went to 64 bits, hot-swap CPU/PCI/…, changing the kernel without a reboot, support for buggy ACPI, embedded features, cluster and NUMA features, support for several filesystems, swappable IO and process schedulers, true plug-and-play USB (the half PnP support being the one that forces IHVs to put red tape on their hardware so that you don’t plug it in right away), advanced security features, faster IO, better process schedulers, good scaling to 1024 CPUs and to 64 CPUs by default, …
Which has seen the greater improvement? It’s pretty obvious
Yeah, pretty obvious.
Anyway, let’s just start with Linux’ birth and go from there. When did Linux first arrive, 1991/1992?
Yes, and it was barely functional. It lacked the capacity to compile itself. It was a single-user, command-line pet project of Linus. Look at Ubuntu/SLED/Fedora now, and compare.
Then look at Windows 3.0 to now.
Which has seen the greater improvement? It’s pretty obvious.
Indeed it is, if you open your eyes.
And since Linux is still not up to par on usability and other areas, my point about rate of improvement being a useless metric stands.
Anyway, let’s just start with Linux’ birth and go from there. When did Linux first arrive, 1991/1992?
No. Linux is not a commercial product and as such its arrival is simply a source code release of something that ran a shell, not a ready to use OS. Linux itself is not a competitor to Microsoft Windows, because they’re completely different things. And in 1991 Linux was still developed by a student, the community came later.
For a more direct comparison to Windows you can look at recent developments in products like RHEL or NLD/SLED. In less than 18 months since NLD, SLED 10 was the first desktop OS to ship with a .NET implementation; it also introduced a completely OpenGL-accelerated desktop, desktop-wide search, and brand new photo and music management applications. And that’s not to mention other major improvements in updates like GNOME 2.6 -> 2.12, XFree86 -> Xorg 6.9, Linux 2.6.8 -> 2.6.16, OpenOffice.org 1.1 -> 2.0.
This all in less than 18 months; go ahead, download (for free) NLD9 and SLED10 and compare it yourself. Also throw in SUSE Linux 7.3, which was the latest version at the time Windows XP was released.
Now please tell me with a straight face Windows XP -> Vista is a bigger improvement than SUSE Linux 7.3 -> SLED 10. And Vista is not even out yet.
Even Microsoft people like Paul Thurrott can’t find enough reasons to make the expensive Vista upgrade worthwhile. Now find someone running SUSE Linux 7.3 on their desktop.
You’re mostly right about Windows. Unfortunately what you missed on Linux is that it went from a bootable telnet system, to a bootable telnet system with a disk driver, to a tiny OS released on minix lists, to a worldwide project, to a functional Unix.
In other words, 0 lines of code, to how many now? You can’t pretend Linux has not evolved since its inception and not sound like you have no clue what you’re talking about.
Gnome, KDE, Windows, Mac, and all other GUI’s are ripped off of UI research, UI tradition, and the designers UI ideas. Microsoft did not invent the GUI, nor did it invent the way we do the modern GUI. It was certainly involved though, heavily.
It doesn’t even matter which has improved more. But the least you could do is show some respect for people’s work.
We’re still getting over the “open source can work to design an operating system” part.
Proving it’s better is a whole ‘nother thing!
“the open source model is the best model for operating system development”
Then why are there 100+ serious security holes in the 2.6 kernel? Shouldn’t those have been caught years ago? Sounds to me like buggy beta type software.
http://secunia.com/product/2719/
(Yes, Secunia only says 94, but a bunch are “multiples”.)
As for “ordinary” bugs …
http://www.osnews.com/comment.php?news_id=14537
“Andrew Morton, the lead maintainer of the Linux production kernel, is worried that an increasing number of defects are appearing in the 2.6 kernel and is considering drastic action to resolve it. “I believe the 2.6 kernel is slowly getting buggier. It seems we’re adding bugs at a higher rate than we’re fixing them,” Morton said, in a talk at the LinuxTag conference in Wiesbaden, Germany, on Friday.”
“the open source model is the best model for operating system development”
Then why are there 100+ serious security holes in the 2.6 kernel?
The post you replied to and your own are not really related; being a better model doesn’t mean it produces software without bugs.
No matter how good or bad people think Windows Vista will be it’s clear to anyone that the model used to produce it is flawed, and they will have to change it if they want to get Vista+1 out before 2016. Microsoft realizes this of course, and they already started with the changes; only fanboys would argue on this point.
Yes, Microsoft’s model for Vista’s development was screwed up (the main problem was simultaneously developing multiple modules with dependencies on each other, resulting in checking in code that couldn’t even be tested because it depended on some other unfinished modules, and then trying to test/fix bugs at the last minute). This is why MS did what they called their “reset”, in which they essentially started over using Windows Server 2003 as the starting point rather than XP.
But, that has NOTHING to do with “open source model” vs “closed source model”.
But, that has NOTHING to do with “open source model” vs “closed source model”.
Nothing, as in, nada?
I wonder what open source project would choose such a development model.
“I wonder what open source project would choose such a development model.”
Well, there are plenty of dead/moribund/failed/unfinished/abandoned projects at sourceforge (the vast majority of the projects there are moribund). I wouldn’t be surprised if they stumbled into the model MS found itself in. But the fact that those many sourceforge projects failed, shows that whatever model they chose, they chose wrong, and it’s nothing to do with closed vs open source (except to the extent that the open source model is subject to projects failing out of boredom, which causes devs to just move on since they aren’t getting paid).
>Then why are there 100+ serious security holes in the
>2.6 kernel? Shouldn’t those have been caught years ago?
>Sounds to me like buggy beta type software.
The Linux kernel is always in BETA stage… as it only seldom has freeze moments.
Bugs always differ in impact, user experience and real-world problems.
Of course the Linux kernel contains bugs; no software is without bugs, even software I wrote. But I think it is also much easier to find bugs in open-source software than in closed-source software, since the open-source software cannot hide behind “No, it is the drivers from X” and “No, it is application X” and “hardware X is faulty”.
When a bug exists in the Linux kernel one can find it, point to it and fix it. When a bug exists in closed-source software one can only guess.
“the open source model is the best model for operating system development”
Then why are there 100+ serious security holes in the 2.6 kernel? Shouldn’t those have been caught years ago? Sounds to me like buggy beta type software.
Because there aren’t 100+ serious security holes in the 2.6 kernel?
And also because you’re a poor troll?
http://secunia.com/product/2719/
(Yes, Secunia only says 94, but a bunch are “multiples”.)
Actually, they say specifically these bugs are “since 2003”, and that “17 out of 94 are unpatched”.
You’re so much a stupid troll you can’t read right.
To add to your stupidity, absolutely none of those are critical, as it says right on top that at most, these bugs are “Moderately Critical”.
Actually, only one is moderately critical and is supposedly “not patched”, but when you look at the problem, Secunia says upgrade to a superior version of Linux, which means it’s actually patched already. Nearly all other unpatched bugs are not critical at all.
So your link actually shows Linux is in a pretty good shape.
“Andrew Morton, the lead maintainer of the Linux production kernel, is worried that an increasing number of defects are appearing in the 2.6 kernel and is considering drastic action to resolve it.”
Of course, you didn’t go further and find the rebuttal he himself gave to stupid people claiming he said the Linux kernel was too unstable. He was specifically talking about old drivers that are not supported anymore, BTW, drivers being the most buggy area of the Linux kernel.
Shocking, a beta product has bugs in it… never!!! Come on everyone, we all like to throw stones at the big guy because our niche OSes never ever crash, and if they ever do we take pleasure in it. When you support the most hardware configs, with millions of users using the system in billions of different ways on different time scales and metaframes, I think MS are doing a good job… I just hope the final release is ready rather than rushed because the anti-MS crowd wants to see them fall… come on MS, don’t release Vista until it’s ready.
The point isn’t that the beta of Vista has bugs in it. The point is that Vista has significantly more bugs than any previous release at their second beta, which would suggest that at release Vista could have more bugs than any previous release. And if anyone claims that XP wasn’t buggy when it was released they are in serious denial.
The point isn’t that the beta of Vista has bugs in it. The point is that Vista has significantly more bugs than any previous release at their second beta, which would suggest that at release Vista could have more bugs than any previous release.
This isn’t even provable without MS’ tracking info, and as others have said, many of the bugs attributed to Vista are IHV bugs. For example, most display drivers released with Beta 2 were of alpha quality, plus Vista stresses more code paths in the drivers than previous versions of Windows. I’ve had the driver of a certain GPU vendor hang several times in Beta 2 and other builds, requiring the OS to reload the driver. The kernel-mode mini driver for that GPU has also had problems, which in that case of course requires restarting the system. Vista’s weakest link is its increased reliance on display drivers, meaning IHVs need to get their code right now more than ever, but it’s a necessary dependence to give users a better experience overall. Predicting doom based on months-old code is totally naive though.
The point isn’t that the beta of Vista has bugs in it. The point is that Vista has significantly more bugs than any previous release at their second beta, which would suggest that at release Vista could have more bugs than any previous release.
This isn’t even provable without MS’ tracking info, and as others have said, many of the bugs attributed to Vista are IHV bugs.
How about (dis)proving it by seeing if it’s buggy on a larger/smaller proportion of machines than XP?
MS always blames someone else for Windows bugginess. What is more likely, that a significant proportion of the Windows ISV’s and hardware vendors (and not necessarily always the same ones) are frigging useless, or that the one company providing the OS is?
Whatever the arguments for or against closed source, MS’s big competitive advantage using it is that no-one else outside the company can legally look at it and proclaim “Good God, what is this shite?”
Honestly, I think it’s more likely Microsoft has their methods straight than your average ISV and hardware vendor.
Many parts of Windows, and other Microsoft products, are just a nightmare, but many other parts are reportedly extremely well designed and implemented.
But I partially blame Microsoft for the ISVs. I think they cater to developers far too much. .NET, for example, is great, thanks MS, but maybe that investment was more needed in the OS at the time. But the worst part is the backwards compatibility: break apps, and if ISVs can’t fix them they deserve to be out of business.
If you’re still selling a binary from 1999 something is wrong!
Honestly, I think it’s more likely Microsoft has their methods straight than your average ISV and hardware vendor.
Honestly, I don’t. Not because I hate how crap Microsoft are, and I do, but because statistically, the chances of one company being significantly better than all the hundreds of others is surely very small indeed.
Actually, considering the failure rate for software projects, it’s more likely that one company is significantly more functional than most other companies.
Software is very hard to get right. And, I know it’s hard to believe, but since Microsoft is shipping dozens of, mostly, working software titles I’d say they have it mostly right. For each of those titles there is probably a company going out of business because they couldn’t get a project out the door and ran out of money (this would have to be a startup, a large company would just close the division).
It’s also a question of degrees. Microsoft has had some lofty goals in the past. In Win95 they attempted to release a graphical desktop OS that had mainframe-level features, like memory management. It was buggy: what do you expect? They wanted to get what they had out, because something released, albeit buggy, was what it took to get the industry onboard with the changes in development (32-bit, for example).
Apple did something similar with OS X. I think they did it more reasonably, but their customer base is also smarter; they said: here’s OS X, it’s neat, it’s new, but we’re not gonna lie about it and say you should all switch. Many switched, and it was awful. But software vendors had a product to write code for, to adapt their old code to. And by the time Apple had 10.2 (the first usable OS X) they had software running on it and customers excited to see a high-quality OS X.
Apple, on the other hand, has been smart and only supported OS 9 in limited ways. Microsoft is still trying to support DOS applications: That’s two generations of their desktop OS behind, three if you count 64bit as its own generation.
I’m not going to say anymore, so when you respond you have the last word. I really don’t like defending Microsoft.
Actually, considering the failure rate for software projects, it’s more likely that one company is significantly more functional than most other companies.
That would be true if Microsoft and all the other companies competed on technical merit.
Software is very hard to get right. And, I know it’s hard to believe, but since Microsoft is shipping dozens of, mostly, working software titles I’d say they have it mostly right. For each of those titles there is probably a company going out of business because they couldn’t get a project out the door and ran out of money (this would have to be a startup, a large company would just close the division).
The combination of marketing power, underhand tactics AND the attraction of being by far the most complete apps platform goes a long way.
It’s also a question of degrees. Microsoft has had some lofty goals in the past. In Win95 they attempted to release a graphical desktop OS that had mainframe-level features, like memory management.
Sorry, but the i386 and 68030 (or 68020 + MMU) had memory management, and they are definitely NOT mainframe class. Unless you mean 1960s mainframe class. And 60s mainframes supported multiple users willingly, not one when they felt like it. AmigaOS crashed less than Win98 *without* memory management or kernel-space protection, for heaven’s sake.
Microsoft’s “lofty goals” are the problem. The gap between the reality of Windows and the hype is so large that if I hyped myself up that much, I’d think of myself as God.
I Do Not Trust That Company.
They wanted to get what they had out, because something released, albeit buggy, was what it took to get the industry onboard with the changes in development (32-bit, for example).
There were already two perfectly serviceable 32-bit OSes out *on the PC* (to say nothing of other hardware). Making CDE more user-friendly wouldn’t have been as hard as developing Win98, and by then IBM had given in and released OS/2 on the (generic) PC. Or they could have cross-licensed NeXT…
But since Microsoft has a “will to dominate all” computers, no! no! It had to be their own OS!
Microsoft is still trying to support DOS applications: That’s two generations of their desktop OS behind, three if you count 64bit as its own generation.
As I understand it, 64-bit Windows will say goodbye to DOS support.
Windows 2000 was being demoed at a Microsoft event. They claimed back then that the instability of NT 4 was caused by bad 3rd-party hardware drivers. They claimed that only by using Windows Certified drivers could you get the stability of Unix systems.
Stability of UNIX systems? Hahah. Hahahah.
You are very wrong actually. Unix is by far simpler than Windows and far older, which means at least more stable. New code could never be as stable as old code. The fiasco with the new Vista network stack simply proves this. In Unix you have shared complexity: it is spread among the driver, kernel and software developers. Microsoft tries to do all the complex work by themselves and thus puts more and more complexity into the OS. That’s overambitious in my opinion, as it implies that the Microsoft developers should be the best experts in all areas. This simply does not make sense, as for instance the hardware vendors know their stuff better than MS and are competent enough to make implementation decisions which are now taken from them.
I was not deriding the stability of UNIX systems. I was deriding the idea that Microsoft could produce UNIX-quality code. On second thoughts, delete “UNIX-“.
I still use Win2k. It has been by far the most stable Windows for me to date. Not that I had stability problems with XP. My main problem with XP was that its kernel scheduling policy changes from 2k made it inferior to 2k in heavy gaming. I will admit that XP in some corner cases was better on bad hardware than 2k was. But I stress that was known crappy hardware that had never been “stable” on any OS tried. Perhaps Vista will bring the best of XP with the best of Win2k? Well, we can hope, but I will tell you one thing: beta testers of Win2k were in near total agreement from about RC1 on that Win2k was “Ready For PrimeTime”. I wonder if Vista will be the same? As for me, I tried Vista on a 2nd machine, and when it did not have drivers for my 3Com 3c905tx network card… I re-installed 2k and called it a day. How on earth could ANY company neglect such a well-known, high-quality, common piece of business hardware??? Wow… most truly color me shocked!
MS, if you are listening, guys, get your basic drivers in order already!!! There just is no excuse for not having my 3Com NIC driver!!
Happy computing,
“Microsoft Plans to Offer Vista Discount…”
http://www.bloomberg.com/apps/news?pid=20601087&sid=aX5yb_YjdHiM
I am mostly a Linux user, but I keep XP around to run TurboTax and other programs I can’t run through Wine. XP is not that bad really. It could have used an updated browser, some increased security and maybe some transparency support for a cooler theme. Microsoft should have introduced an XP II or something like that before Vista. Vista is new technology and it probably won’t be stable for quite a while. An interim release or upgrade to XP would have helped fill the void. These are really bad management decisions. It is not a question of whether Vista is stable. Microsoft mismanaged their product management on the Vista delays by focusing too much on Vista instead of addressing the needs of current customers.
XP is not that bad really. It could have used an updated browser, some increased security and maybe some transparency support for a cooler theme. Microsoft should have introduced an XP II or something like that before Vista
Sounds like a good idea to me. MS could simply release an edition based on Server 2003 SP1, bundled .NET and a new theme. Let’s call it “Windows XP Me”
The new theme alone would translate to “new Windows” for the majority of users (who are clueless anyway).
Actually, that would have been a bad idea. Microsoft would have been severely criticized for simply adding new chrome to Server 2003 SP1. Plus, Server 2003 doesn’t have nearly the app compatibility required of Vista.
I thought they basically did. It’s called Windows XP Media Center Edition. Well, at least the part about the new theme. Actually, Windows x64 is based on the 2003 kernel though.
Amen.
Yes sure and there was also word out in ye olde days that Windows 95 was as good as a Mac.
Yeah, Maybe a Mac running System 1 or so…
Windows 95 was a *lot* better than MacOS at the time: preemptive multitasking, memory protection that sorta worked, etc. I have a laptop running 95 and a PowerMac 8100/120 running Mac OS 8, and the Mac crashes constantly while I have yet to crash my laptop.
It’s been about ten years since NT4’s initial buggy release, but things have changed dramatically since then. Between web-based apps, FOSS applications and platforms, and Apple’s marketshare, Microsoft can’t afford to continually delay this release. Their marketshare will continually erode while Apple, Google, Sun, Novell, RedHat, CodeWeavers, Citrix, etc. reap the benefits by introducing really viable alternatives. But I’m sure it’s a calculated risk.
Vista will ultimately fail to be accepted by most IT organizations for 12-18 months, particularly if ISVs and OEMs are struggling to produce stable code. I would venture to guess that this will slow acceptance at home as well. That’s a long wait from Wall Street’s point of view.
Microsoft is paying the price of writing a new system all at once. They have the capital and the revenue to take these chances. A smaller company would have gone the transitional route by swapping out a layer at a time, deprecating the old APIs and introducing new ones. An overhaul like this is really dangerous and only a company like Microsoft could take such a huge gamble without killing the goose.
Windows 95 never has been stable…
neither in beta nor in production, and neither ever was DOS, or any Windows version based on DOS.
As far as I understand, WinVista will be a DOS-based OS too.
If only we could look at the source code.
But wait a minute .. we can look at bug fixes. It’s a miracle if a bug is only in one Windows version. As I recall, there was at least one bug present all the way from Win3.11 to WinVista.
.. hmm .. strange, isn’t it .. or is it.
Dude, I never miss a chance to diss Windows (maybe someday the idiots will wake up…some hope), but that post made NO sense.
lol. That did not make any sense at all. And NO, Vista is *not* ‘based on dos’. Windows ME was the last version of Windows to have a DOS underpinning.
95? Try 98. Or rather don’t, unless you’re a masochist and particularly enjoy rebooting. Every. five. sodding. minutes.
wow, you really do like dissing windows
A friend of mine has a Windows 98 laptop.. course it’s running the unofficial SP stuff and lots of other tweaks, but it’s perfectly stable. I have an old P75 laptop running Windows 95 that I use for wifi testing and such, and it never crashes.
They are sucky OSes, but the ‘crashing every 5 minutes’ is horribly exaggerated, unless your system is infested with malware and viruses.
On a semi-related note. I once saw a machine that took, basically, 2 hours (not exaggerating) to startup because of all the adware that started when you logged in, and because the machine had 64MB of RAM.
I’m sure that somewhere someone has installed such bad software on a Win98 machine that they had to reboot every “5 sodding minutes.”
But these horror stories are purely anecdotal, of course. They’re funny though.
Ouch, I’ve had machines that took ~30 minutes to finish loading. That’s insane…
wow, you really do like dissing windows
Yes. Why? Because increasing awareness of other OSes is the only way to ensure better hardware support for them, and a more competitive software industry. Plus it sucks, of course!
A friend of mine has a Windows 98 laptop.. course it’s running the unofficial SP stuff and lots of other tweaks, but it’s perfectly stable. I have an old P75 laptop running Windows 95 that I use for wifi testing and such, and it never crashes.
I’m happy for you. No, I really am. Not for Microsoft, but for you. My Windows 98 desktop crashed constantly.
They are sucky OSes, but the ‘crashing every 5 minutes’ is horribly exaggerated, unless your system is infested with malware and viruses.
What pisses me off about 98 specifically is that:
(a) all the stuff they put in to support “supervisor mode” and “memory management” (et. al) was for nothing. Those features DO work if you do them properly, but my Amiga crashed less than Win98, and it had neither feature. The PC may have been infected by a virus, it may have had bad hardware – but either way, on that machine Mandrake Linux never blinked.
(b) If Microsoft OSes were a tenth as good as they say they are, I’d not have room to grumble or use Linux – well, not in 1998 anyway.
OK, I have tried this beast in both its native forms: 64-bit running on my AMD64 3200, and 32-bit running on my Presario 5100 notebook.
I give the 64-bit desktop version on MY hardware (nForce 4 chipset based, with an nForce 6800GS PCIe) a thumbs down due to the fact that a lot of I/O operations took a LONG time to complete, which also tends to bottleneck the system.. however this was something I more than expected to see fixed in later versions.. which it may well have already been; Beta 2 is the only release I currently have.
The 32-bit version, running on my Presario notebook with a Sempron 3300 chip, ATI Xpress 200M graphics and 1 gig of RAM.. all I can say is ‘flawless’.. I really have experienced no issues that concern me.. and I run it as the only OS on that machine, which I use on a day-to-day basis.
Overall, I am happy with Vista.. I have never had it crash on ‘either’ machine, and the horror stories I hear I just cannot relate to with personal experience… so Hardware must be a very major factor.
Right now, I don’t have any fears about buying Vista upon its release for either of my two machines.
Apparently Wilcox doesn’t remember Win98 and WinME – which were buggier than Vista in their RELEASE forms… There’s a reason 98SE came out within, what, 6 months, and ME was the nail in the coffin of the hybrid Win16/32 line.
Given they’ve started over from scratch, I’d not say they’re doing TOO bad – I’m reminded of the XP beta in fact.
But then I see statements like “I tried using it as my daily OS”… All I can think is “You do KNOW what BETA means, right? This isn’t a release candidate!”
“Given they’ve started over from scratch, I’d not say they’re doing TOO bad – I’m reminded of the XP beta in fact.”
Ahem. MS tried to do it from scratch, failed miserably and then reverted to the Windows 2003 code base. Vista is NOT a new rewrite if you thought so.
“You do KNOW what BETA means, right? This isn’t a release candidate!”
Does it really matter? If Vista is to be shipped out later this year, it should be significantly more stable at this point to warrant adoption by anyone this year.
It’s not possible…
I can’t believe that Vista is the buggiest OS.
You made me laugh a lot. +1 for you
Thanx dude…
One ISV that asked not to be named says a private beta it is working with that shipped after Vista Beta 2 is more stable but is still a memory hog. “The memory consumption has been reduced from a gig to 700MB, which is about three times what XP requires.”
700MB? Holy shit! Nice to see them moving backwards.
Twenex: There’s a reason why the font of all human knowledge and literary creativity is not collated into one book.
On a completely unrelated side note, I’d sure as heck love to own that everything book.
For the record, I use Linux and Windows (Windows for the desktop since it’s easier for me to configure the system and manage programs, and it has a larger application selection, though this is no reflection on the quality of the operating system).
My Footnote:
I actually made my posts due to counter-attacks that were made against a few people expressing their opinions (some of which were wrong), but many of which were very well founded. All of the opinions in my post are my own, but they were voiced in response to rebuttals towards people who voiced their opinions and whom I felt were unduly ridiculed. It would be much appreciated if everyone were more respectful of others’ views and tried to see both sides a little better. Spouting attacks on Windows and their users just because someone likes Linux does not make their opinion any more valid or these people’s opinions any less.
This was a very long post, and if I messed up somewhere, screw this, I’m tired and going to bed. Bottom line is you should choose to learn and be more respectful of other people’s opinions and not spout theory as fact, especially (though not excusably) when there is supporting evidence against some of those theories. Some of the insults that have been hurled today by someone over simple opinions are quite hard to excuse.
On a completely unrelated side note, I’d sure as heck love to own that everything book.
No, you wouldn’t, mate, it’d be as big as the Statue of Liberty, at least. Imagine reading that in bed!
All of the opinions in my post are my own, but they were voiced in response to rebuttals towards people who voiced their opinions and whom I felt were unduly ridiculed.
An opinion to which you are perfectly entitled.
It would be much appreciated if everyone were more respectful of others’ views and tried to see both sides a little better.
1. I feel little need to respect someone who’s giving me no respect. Wrong of me perhaps, but I’m not going to apologise for it. 2. Disagreeing with someone on a point is not the same as not seeing both sides.
Spouting attacks on Windows and their users just because someone likes Linux does not make their opinion any more valid or these people’s opinions any less.
Please, try pointing out the reverse to certain Windows users.
Bottom line is you should choose to learn and be more respectful of other people’s opinions and not spout theory as fact, especially (though not excusably) when there is supporting evidence against some of those theories.
Again, you could direct your comments elsewhere with equal or greater cause. And I’d honestly be VERY interested to know what theories *I’ve* spouted as facts.