Many have gotten antsy over the past few months about the Conficker worm, and with good reason. Though the worm hasn't done much of anything (yet) except spread like the plague, it readily infects any Windows system that isn't current with the latest security updates. The worm is set to activate on April 1st, and the computer world is holding its breath to see whether a disaster comparable to the hyped-up Y2K doomsday will ensue, or whether it's just someone's idea of a sick April Fools' Day joke.
Infections
An estimated twelve million computers worldwide have been infected, mostly outside of the United States, where more pirated copies of Windows are in circulation, most of them unable to receive the needed security updates from Microsoft due to their pirated nature. Though security firms have been vigilant in publishing prevention and removal methods, and Microsoft itself has issued alerts and removal instructions, the worm has still infected many millions of computers worldwide, and it's likely that most people don't even know their computers have been compromised. Score another win for the Macintosh and Linux crowds: Windows is the only vulnerable system. Remember, though, that just because a system can't be infected doesn't mean it can't help pass a virus along.
Theories
The Conficker worm's purpose, like the identity of its creators, is still largely unknown, but based on several of its characteristics and on previous worm exploits of a similar nature, experts have several theories as to what will happen on April 1st when the worm "detonates," in a manner of speaking.
What is known is that once the worm is triggered on April 1st, it will generate a list of 50,000 domain names and try to contact 500 of them each day to download additional files and instructions. The worm also already shows signs of a peer-to-peer infrastructure, so one need only use his or her imagination as to what may happen.
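To make the mechanism concrete, here is a toy sketch of a date-seeded domain-generation algorithm. It is emphatically not Conficker's real algorithm; the seed string, hash, TLD list, and counts are all invented purely to show the idea of deriving a daily rendezvous list from the date:

import hashlib
from datetime import date

# Toy illustration of a date-seeded domain-generation algorithm (DGA).
# NOT Conficker's real algorithm: the seed, hash, TLD list, and counts
# are invented purely to show the concept.
TLDS = [".com", ".net", ".org", ".info", ".biz"]

def generate_domains(day, count=500):
    """Derive `count` pseudo-random domain names from the given date."""
    domains = []
    for i in range(count):
        seed = ("%s-%d" % (day.isoformat(), i)).encode()
        digest = hashlib.md5(seed).hexdigest()
        # Keep only letters so the host name looks plausible.
        name = "".join(c for c in digest if c.isalpha())[:10] or digest[:8]
        domains.append(name + TLDS[i % len(TLDS)])
    return domains

if __name__ == "__main__":
    pool = generate_domains(date(2009, 4, 1))
    print(len(pool), "candidate domains for the day, e.g.", pool[:3])

Because the list is derived entirely from the date, anyone who reverse-engineers the generator can compute the same domains ahead of time, which is the main defensive angle against this kind of update channel.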
It is possible that the whole thing is an April Fools' Day joke that will leave everyone a little more irritable and a little more protective of their systems, with nobody laughing. We've seen mighty hoaxes in the past such as the Y2K Bug or even Google's TiSP.
Optimism aside, with the peer-to-peer technology already in place, the worm most likely has a far more malicious purpose. Officials are pretty sure the intent isn't lighthearted, either:
Perhaps the most obvious frightening aspect of Conficker C is its clear potential to do harm. Perhaps in the best case, Conficker may be used as a sustained and profitable platform for massive Internet fraud and theft. In the worst case, Conficker could be turned into a powerful offensive weapon for performing concerted information warfare attacks that could disrupt not just countries, but the Internet itself.
-Phillip Porras, research director at SRI International
It’s worth noting that these are folks who are taking this seriously and not making many mistakes. They’re going for broke.
-Jose Nazario, researcher at Arbor Networks
According to SRI International's study of the Conficker worm, it contains code through which "infected computers can act both as clients and servers and share files in both directions. The peer-to-peer design is also highly distributed, making it more difficult for security teams to defeat the system by disabling so-called super-nodes."
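To illustrate what "acting as both client and server" means in practice, here is a minimal, generic sketch of a symmetric peer. The port number and the one-message "protocol" are invented for illustration; this is not Conficker's actual P2P protocol:

import socket
import threading

# Minimal sketch of a symmetric peer: every node both serves requests and
# makes requests to other nodes, so there is no central server to take down.
# The port number and one-message "protocol" are invented for illustration.
PORT = 9999

def serve(payload):
    """Server half: answer any peer that connects with our current payload."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", PORT))
    srv.listen(5)
    while True:
        conn, _addr = srv.accept()
        conn.sendall(payload)
        conn.close()

def fetch(peer_ip):
    """Client half: pull whatever payload another peer is offering."""
    with socket.create_connection((peer_ip, PORT), timeout=5) as conn:
        return conn.recv(65536)

if __name__ == "__main__":
    # Each node runs the server half in the background and uses the client
    # half on demand, so updates spread node to node with no super-node.
    threading.Thread(target=serve, args=(b"update-v1",), daemon=True).start()
    # fetch("192.0.2.10")  # example: pull an update from a known peer

The point is simply that every node can hand updates to every other node, so there is no central server, and no single super-node, whose removal stops the spread.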
It could very well be that the worm will be used to hijack innocent computers and "rent" them out to shady industries, particularly spamming and the further spreading of other malware. It's also been suggested that Conficker could be used to create a darker, unwilling version of something akin to Freenet.
The most horrifying scheme of all is that it may be used to create what researchers call a "Dark Google": a network of infected computers searchable for information at the whim of the criminal underground, with the authors of the Conficker worm selling the answers to criminal queries put into the system. A thoroughly terrifying idea, to be sure, but only to those infected.
In order to stomp out the infection, many security officials have been strenuously working on prevention and removal, and Microsoft has even offered a $250,000 bounty for information leading to the arrest and conviction of the author(s) of the virus.
Symptoms and Prevention
Infected systems can be spotted in several ways, or may not be spotted at all unless they're equipped with an antivirus program. Users may notice that account lockout policies are being reset; that Automatic Updates, the Background Intelligent Transfer Service, Windows Defender, and the Error Reporting service are disabled; that domain controllers respond slowly to client requests; that the local network seems slow and congested; and that websites related to computer security are blocked.
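If you want to spot-check a few of those symptoms yourself, here is a minimal sketch, assuming a Windows machine and Python's standard winreg module. The service names used (wuauserv, BITS, WinDefend, ERSvc, WerSvc) are the usual ones but vary between Windows versions:

import winreg

# Minimal sketch (Windows only): flag security-related services that have
# been set to "disabled" (Start value 4), one of the symptoms listed above.
# Service names vary across Windows versions: wuauserv = Automatic Updates,
# BITS = Background Intelligent Transfer Service, WinDefend = Windows
# Defender, ERSvc/WerSvc = Error Reporting.  Adjust the list as needed.
SERVICES = ["wuauserv", "BITS", "WinDefend", "ERSvc", "WerSvc"]

def start_type(service):
    key_path = r"SYSTEM\CurrentControlSet\Services" + "\\" + service
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            value, _ = winreg.QueryValueEx(key, "Start")
            return value  # 2 = automatic, 3 = manual, 4 = disabled
    except OSError:
        return None  # service not present on this version of Windows

if __name__ == "__main__":
    for svc in SERVICES:
        mode = start_type(svc)
        if mode == 4:
            print("WARNING: %s is disabled" % svc)
        elif mode is None:
            print("%s: not installed here" % svc)
        else:
            print("%s: start type %d" % (svc, mode))

A disabled service on its own proves nothing, of course; it's the combination of several of these symptoms that warrants a scan with a proper removal tool.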
Anyone who has been receiving Microsoft's automatic updates is safe from the worm's initial spreading vector, and antivirus software can take care of infections from other sources (as described in the section below). However, once the worm successfully infects a machine undetected, it can disable antivirus software as well as Automatic Updates. Users must be sure to set strong passwords on their network shares to prevent the worm from spreading to their computers over the network, and they should also be wary of the AutoPlay function.
Strains and Spreading
There are apparently four strains of the virus out there already, and the fourth, Conficker.D, will be downloaded as an update by the previous strains and will then trigger its payload on April 1st. The worm spreads through a vulnerability in Windows that has already been patched, so those who have successfully run automatic updates since the patch's release should be safe from this form of spreading. The .B and .C variants also infect machines through network shares by breaking weak passwords, through mapped network drives, and through removable drives (usually USB sticks); the worm starts itself on remote machines via a scheduled task or an AutoPlay entry that it adds. Conficker also connects daily to various generated domains to receive updates meant to counteract efforts to thwart it. Read more about the different strains in detail here.
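Since breaking weak share passwords is one of the spreading vectors just described, a trivial self-check along these lines illustrates the point. This is a toy sketch: the sample word list is invented, and the worm's real dictionary is far longer, so passing this check is necessary but not sufficient:

# Toy self-check against the kind of short dictionary a worm brute-forces
# network shares with.  The sample list is invented; real dictionaries are
# far longer, so passing this check is necessary but not sufficient.
COMMON_PASSWORDS = {
    "password", "123456", "admin", "letmein", "qwerty",
    "abc123", "111111", "iloveyou", "welcome", "monkey",
}

def is_weak(password):
    p = password.lower()
    if p in COMMON_PASSWORDS:
        return True
    if len(password) < 8:
        return True
    if password.isdigit() or password.isalpha():
        return True  # all digits or all letters: trivially guessable
    return False

if __name__ == "__main__":
    for candidate in ("admin", "Tr0ub4dor&3"):
        print(candidate, "->", "weak" if is_weak(candidate) else "reasonable")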
Roast Them Alive
Don't take this the wrong way, but I'm one to admire the genius it takes to initiate such a scare. However, I am of the opinion that the creators of the Conficker worm ought to be roasted alive and fed, while barely kept conscious by various medical means, to insects such as the ants seen in Indiana Jones' newest escapade. It always irks me when people take the great and marvelous computer technology we have today and turn it into something so criminal, dark, and terrible that users have to live in fear of their computers being compromised. We will see all too soon whether this rampant worm is truly a curse of the worst kind or simply a joke gone foul.
… to switch to Linux or one of the BSDs.
Or it’s a reason to keep your system up-to-date.
Most infected systems are those of people who keep their systems up to date with Microsoft's security updates and patches, and more importantly who only use Microsoft software… and who falsely believe it's safe and secure.
Microsoft's OS and its security don't work; that's why there is a billion-dollar industry, bigger than their own OS sales, working on "that problem".
Cut your nonsense. Conficker was already patched BEFORE it got out in the wild. Users who have kept their Windows systems up-to-date have nothing to fear.
Please, no more FUD.
You should know by now that I don't write nonsense at all.
http://www.google.com/Top/Computers/Security/Malicious_Software/Vir…
Here's the link to the people you should be talking to. Ask them: "Are viruses, trojans, and other Windows nuisances only targeting unpatched Windows systems?"
The answer is no; they exploit known security holes in Microsoft Windows, and they upgrade based on the patches Microsoft delivers.
No, it got patched after someone found it in the wild, and it had also been patched at least once before under many other aliases.
Actually, you have to run a detection program to be sure.
Fear
Uncertainty
Doubt.
"The term originated to describe disinformation tactics in the computer hardware industry and has since been used more broadly. FUD is a manifestation of the appeal to fear"
I am not appealing to fear; I am using knowledge and real information to counter the Microsoft FUD that other OSes are unjustified, and I am using Microsoft's real problem in order to attack it.
I am not crying wolf here. I am telling the wolf and its people that it can't get at me or my chickens with its sheep disguise.
If you were right, this could be used as a detection method to remove illegal Windows users from the internet: disconnect them for fraud and illegal usage of one company's products and bring them to justice, with proof of their illegality. It just so happens that those infected are usually the legal users.
I kinda have to agree with Thom here.
Conflicker.c was released February 20, 2009? (the virus we’re talking about) MS08-067 was published: October 23, 2008? Conflicker.a was released in late November. http://www.microsoft.com/protect/computer/viruses/worms/conficker.m…
Also here http://blogs.securiteam.com/index.php/archives/1150
and here
http://www.cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2008-4250
Leads me to think that there was a two-week overlap between when this vulnerability was discovered and viruses like Gimmiv.A were written, and when it was patched. Conflicker did not exist in October period. I guess you could be one of the 3,695 people that got Gimmiv.A before it was patched. http://blog.threatexpert.com/2008/10/gimmiva-exploits-zero-day-vuln… But I doubt it.
If the cited sources are wrong let me know.
-Bounty
“Conflicker did not exist in October period.”
You seem to bold "Conflicker" so as to make a point that it's spelled "Conflicker" and not "Conficker." I'm sort of a Nazi when it comes to these things, so I'd just like to point out (not just to be a smart aleck, but for the benefit of all) that the true name of the virus, whether it's a genuine threat or just a load of baloney, is Conficker. No 'L' in it.
If that wasn’t your intent, or it was merely a typo, well… thank you for allowing me the opportunity to point out that it’s Conficker, anyway. There are many legitimate websites (not just personal blogs stating that “Conflicker is cuming!!!1”) that have named it incorrectly, and I obviously get my feathers ruffled a bit over it.
Sorry for the typo, but no, I made that bold to point out that I was talking about the virus called Conficker (fixed!), not some other previous virus that uses the same flaw to propagate.
And yet, we still have a major security problem here. Could it be that Daniel Bernstein has a point, and that releasing patches for insecure software is no substitute for not having released the faulty code in the first place? In the worlds of Sleeping Beauty and Cinderella, everyone probably applies patches. But in our world, they obviously do not.
We’ve gotten so used to thinking in terms of that old cop out “All software has bugs; It’s how fast the project patches that counts” that we’ve lost sight of reality. We’ve come to think that “It’s perfectly all right to release software with security bugs because you can always patch it and get a big pat on the back when you do”.
I watch as, over and over again, Firefox releases insecure software… and gets a big pat on the back when they release a fix quickly after the problem has become public knowledge, when what they really need is a big black eye for having put everyone at risk in the first place.
So Microsoft put bad code out there and then released a patch. It does not change the fact that many, many machines are at risk due to the original release of the bad code.
Bug-free code is impossible. Plain and simple. All you can do is be as quick as possible with fixes. In this case, Microsoft was more than adequate. Even BEFORE the hole became known and Conficker got out, the patch had already been pushed through Windows Update for all versions of Windows. The patch has been out for SIX MONTHS.
If you STILL get infected, then it’s your own damn fault. I feel no pity whatsoever for those who have been infected.
If you want bug-free code, be prepared to pay at least 10000 EUR per copy of your operating system. That’s the reality of things.
I would agree with this.
I would disagree with this. “Bug-free code is impossible” does not inevitably lead to “All you can do is be as quick as possible with fixes”, however. Because while “bug-free code, every time” is impossible… code with fewer bugs *is* possible. Modern programs suffer from too many projects which take advantage of the lax view that users take of those projects’ accountability for security problems. And we on OSNews are far more critical than the general public, which by and large simply accepts “computer viruses” as being a fact of life.
At any rate, to the extent that what others do, or do not do, has direct effects upon us, we cannot assume that everyone is going to patch.
Yes, because Linux and BSD have such a small market share that no sane worm writer could be bothered.
Seriously, you and people like you should get it in your heads: for most, Linux just doesn’t cut it. If it did, we’d be using it. Stop bothering us.
Maybe so, but I'm still gonna laugh my butt off at Windows users, especially those that can't even be bothered to pay the Microsoft tax.
Security updates are distributed even to non-genuine users.
And laughing? Well, I’m not really laughing, I’m just pointing and telling my friends: that’s not how to do it.
He does have a point, but I don't know if he realises it or not. Assuming Linux is just as insecure as Windows, you're still at less risk because it has a smaller market share and no one writes rooting worms with payloads for it (rather, most are for spam/CPU botnets or for targeting other Windows installations). It's a good argument against software monopolies, the same way low genetic diversity is considered a bad thing (bananas).
Imagine a Windows super-bug destroyed most Windows installations next week… we'd be screwed. At least if there were a mix of Linux, BSD, OS X, and Windows, all running various different sets of software (KDE, GNOME, etc.), the damage would be far more contained.
Probably not, actually. You see, most security breaches do not appear as low-level as the kernel – they appear higher up in the stack. And since a lot of packages are common throughout Mac OS X/BSD/Linux distributions…
Still, a mixed environment would be so much nicer for the world. One can dream.
I understand that, but the stack on OS X/Linux/BSD is at least more varied than on Windows…
I think the more diversity the better. Even if Samba is vulnerable, and that is shared across all three UNIX-alike platforms, a large number of those machines won't be running it, or will be running NFS instead. Of those that are running it, the ability to gain root-level access might also depend on a kernel bug, limiting the potential to spread beyond the Samba accounts/users.
If Windows' RPC is vulnerable… we get Blaster and Sasser making huge parts of the internet infectious. It'd be a lot slower at propagating if there were a quarter as many computers to infect. There's room for some non-UNIX-alike systems in the mix too.
It’s not a panacea, nothing really is… it just helps reduce overall risk.
Agreed! That was exactly my point….
If there were a much more mixed Internet (as far as OSs are concerned), worms like this would have a much harder time of things. As it is, because the net is so Windows-dominated, they can cut a swathe across it with no problems at all.
Putting it another way – if your forest is all Dutch elms, you get hammered by Dutch elm disease.
Agreed….
As for the argument put by a previous poster that “if Linux cut it, we’d all be using it” – not necessarily. Windows doesn’t “cut it” (security-wise) and yet many still use it.
I guess some of us are just much faster to spot a good thing when we see it…..
Agreed. Windows domination has nothing to do with being the best, and everything to do with being the quickest with market penetration and marketing.
Don’t get me wrong; I like XP and 7, and do use them. But I wouldn’t necessarily say they’re the “best”.
You're right – I remember saying over 10 years ago that the ideal situation would be a world with six different operating systems, where cross-platform support is achieved by adhering to agreed standards or through something like Java or .NET.
The sad reality: the market demanded one operating system for 'cheapness' – and well, they've got the cheapness, and it has come at a cost.
If the best software / hardware always prevailed, then the computers in use wouldn’t look anything like the ones we currently use.
Fact of the matter is, Windows is on every desktop because Microsoft are the best businessmen, not the best OS developers.
But anyway, this whole other debate is somewhat off topic
It's the Windows OS that gets infected, and it's not its market share that makes it the target, it's its failed security… GNU/Linux is the #1 desktop by usage globally.
Actually, if you look at real security reports, attackers try a lot harder to break into GNU/Linux systems; they just fail because the OS is secure. Can't say the same for BSD, as it's not a mainstream OS.
Serious people don't defend Microsoft's security record, practices, or OS.
Had you said that most people don't know GNU/Linux exists, I could have agreed.
A million dollars says you're using it right now and just don't know it.
Stop using infected Windows systems and attacking everyone else's.
Very valid point. Typical sentences I hear from “Windows” users (who usually call themselves “professionals”):
3. “I don’t have a virus / trojan / malware …”
(Nothing checked, just wishful assumption.)
2. “My system is safe.”
(No protection programs, false assumption.)
1. “I don’t care.”
(Honest, okay for themselves, but not for others.)
Because people like car analogies, here’s one: You are not alone in the traffic. If you drive like an idiot, you can die, but you can kill others, too, no matter how carefully they drive.
String majorPlatform = "XYZ";
String[] minorPlatforms = {"ABC", "DEF"};
if(costOfMultiplePlatformsMaintenance + costOfMultiplatformApplicationsDevelopment > costOfLossesThroughSecurityBreaches)
{
selectMajorPlatform(majorPlatform);
spreadMajorPlatform(majorPlatform);
startNetworkEffects(majorPlatform);
pushIntoNiches(minorPlatforms);
}
If I can lose 1 million dollars through security breaches, but I have to spend 2 million to make cross-platform applications and 1 extra million for maintaining a network of various platforms (as compared to a network made up of machines with the same platform), why would I ever choose the multi-platform option?
Perhaps because then you know that you're spending 2M for infrastructure and 1M for maintenance, that's 3M in total. The damage of a security breach can be 1M, but it can also be 100M; you don't know that in advance.
Nobody will hold his breath for the "maybes". However, extra spending "for sure" will make a lot of people gasp.
Most applications can be built cross-platform without too much expense (particularly if you use cross-platform toolkits like Qt or a byte-code language like Java).
Use open standards and it shouldn’t take much more effort to have a multi-platform network.
I, myself, run a network comprised of Solaris, Linux, XP and Vista machines and each of them work perfectly on the network.
In fact, most networks should consist of a range of platforms, as you wouldn't want a secure network server running the same OS and application suites as the users' desktops. Even if it's just different platforms within Microsoft's own product range.
BSDs don't have the hardware compatibility, and dependencies on desktop Linux just aren't worth their problems. A little wasted space and some reboots aren't that bad, and security-wise it's like putting in a $1000 system for a problem that doesn't exist. If the community admitted that dependencies will never allow for a system that "just works," Linux might have an actual chance on the desktop.
P.S. I’m sure this is the same reason kernel developers are preferring Macs.
Dependencies on Linux aren't really a problem like they once were.
In fact, Ubuntu does "just work". This is largely down to the excellent Debian package manager: apt-get (and thus Synaptic, or any other GUI tool that plugs into apt) is great at sorting out the dependencies so the users don't have to.
I'm also a big fan of Arch Linux's pacman package manager for how painlessly it handles dependencies for me – and that's not even aimed at the n00bie the way Ubuntu is.
In fact, I have more of a headache on Windows these days, as some applications require specific (or later) service packs installed, which have to be manually downloaded and installed,
and there are hundreds of unused "shared" DLLs left over from uninstalled applications… and so on.
Plus the whole pain of having to go online, google the software/drivers you want, scour through the results to find the website, and then through the website to find the downloads. Then download each one individually and manually, and then install each one manually and, again, individually. Plus the pain of usually having to reboot after each driver install, and after a fair number of application installs too (most of which I doubt really need a reboot).
Then there's the lost hour spent tidying up the bloody start menu, desktop, and quick launch bar, because no sodding program installs its icons in organised, grouped locations (instead favouring cluttering up every inch of screen real estate).
So no, Windows has next to no package management or dependency management.
It's a complete headache to keep organised, and that's precisely the reason why so many Windows users just give up, reformat, and start again every n months.
With Linux, you don’t have to do that. You can remove unused dependencies. You don’t have to hunt around the internet for applications (just type in a name or description and the GUI does EVERYTHING else for you). You don’t have to spend half your life organising the start menu as applications install in themed groups.
Linux package managers “just work”.
Except all those arguments fall apart horribly when compared to the actual situation. Ubuntu only maintains a small baseline of packages; the package the user wants is often out of date, and they need to hunt down the latest version while hoping it comes in the right package format. Cleaning up dependencies often means running a special command instead of it taking care of itself.
Saying Windows has problems with dependencies is a little more than ridiculous. The good programs that need things like DirectX updates check for their presence themselves and send the user to the download site if it's not found. The vast majority of applications ship with everything they're dependent on, so it's never a problem. The Add/Remove Programs feature is practically flawless and gets rid of anything that's not configuration files in user folders, last time I checked. It's very doubtful you've used Windows recently. And like I said, a little wasted disk space just isn't that important. Any problems from DLLs are really from bad programmers, and the latest ext4 fiasco proves there are a lot more of those on Linux if that could slip by so many people.
Linux isn’t getting anywhere because of people like you who won’t admit flaws and take objective viewpoints of the actual situation. Go ask the 98% of people who don’t like Linux why, and you’ll find all their reasons stem from dependencies.
False. One of the things that I really, really like about Ubuntu is the 26,684 packages in the Main, Universe, and Multiverse repository triumvirate, which comes set up by default.
Sure, distros like Fedora, RHEL, and CentOS can seem rather limiting, even with Extras, EPEL, RPMForge, etc. But in those rare instances in which I can’t just apt-get what I want, I’m always a bit startled.
For it to have any value, you are going to need to back that unsupported assertion up with some actual evidence.
Go ask them. Of course you’ll have to find ones that have used Linux, but you’ll find the defection rate is incredibly high. It’s literally information you can get by going around and asking people, so stop being lazy.
Which translates to: “shadoweva09 doesn’t actually have any evidence and is pulling this claim out of his posterior orifice”.
My security professors agree it’s not good, my classmates agree, you have to GET OFF THE INTERNET AND TALK WITH REAL PEOPLE, NOT A CONCENTRATION OF LINUX FANS.
As a consequence of my conscious decision to maintain a broad perspective, I consider OSNews.com to be my main home base. And I administer around 80 mostly nontechnical Linux desktop users, as well as some still using Windows for certain tasks. Does that count?
Paying attention to perspectives presented by OSNews participants keeps me apprised of the views of people working in many different environments. While my work as an administrator of Linux desktops keeps me very well aware of where Linux does well, and where it is a challenge, on the desktop. And dependency management is simply not anywhere near the top of my list of things that have been problematic in the real world. Dependency management has actually been more of a problem for us on the Windows boxes.
Out of curiosity, exactly what are your qualifications for making such bold claims as you have made?
Five years of home PC repair. Sure, I can agree Linux is better in an office environment, but home users have vastly different demands of their computers. In that setting the internet isn't always on, update notifications once a day are annoying and will start being partially ignored, and so on. If Linux doesn't meet home users' demands, it really doesn't have any hope of success as a desktop. Home users don't want to have to be connected to the internet to manage software, find it annoying that there are three different downloads for Linux applications (tarballs, rpm, deb), and don't want to learn extra stuff like how to use the command line. If you're on Ubuntu, go look at the Blender package in the repository. It was horribly out of date last time I checked; then look at blender.org and see that you need to use tarballs and manually resolve dependencies yourself if you want the latest version.
One step at a time, please. Let’s conquer the business desktop first, before we go after Ma and Pa Kettle. (Although I do have business desktop users who have decided to use Ubuntu at home, as well.)
Regarding Blender, I’m running the current Ubuntu Beta. (Final release planned in 23 days.) The latest version from the Blender site is 2.48a. The latest version shown in my synaptic (click, click, click) is 2.48a. So no dice for your claims on that.
Earlier you came across as ignorant.
Now I think you're just a closed-minded fanboy.
Yeah, Linux has its faults. But attacking the one thing many distros actually do right is absurd.
And if that wasn't bad enough, you then go on to make up statistics (calling people lazy because they ask you to back up your own figures), and continuing to make blind allegations is just insulting to our intelligence.
I'm all for intelligent and informative discussions about the pros and cons of each platform. But I have better things to do than to engage in a childish flame-war.
So, unless you’ve got some facts to back up your blind assertions (which you’ve thus far failed to do), then I’m ducking out of this pointless argument.
Since we’re all pulling stats out of our asses, and we all know how wonderful anecdotal evidence is, here’s something to counter your claim.
Back in 2001, we (the local school district IT dept) started migrating elementary school labs to Linux using LTSP (thin-client setup initially, then a hybrid diskless/thin-client, now a fully diskless setup). My sister was in grade 7 when we started, and absolutely loved the Linux system. She asked several times why we didn’t have it at home.
Then she moved on to high school, which still had Windows XP everywhere. She and her classmates were quite upset, and never missed an opportunity to ask why they couldn't access Linux in school anymore. Just her luck, we switched her high school over to a diskless Linux setup the summer after she graduated.
We’ve now moved 7 high schools (ranging from under 100 to almost 500 computers per school), and all elementary schools (~40 with 30-90 computers each) to diskless setups, with NX Client access to their accounts from anywhere in the world. That’s just over 15,000 students using Linux everyday, with very few complaints, and just under 2,000 staff using Linux everyday, also with few complaints (now, there were a lot before they used the system for more than 30 minutes).
We hand out Knoppix CDs in the schools, provide access to downloadable ISOs for Debian, Ubuntu, and Kubuntu, and hear stories all the time about students running Linux at home, whether it be via dual-boot, LiveCD, or replacing Windows.
We’re also just the tip of the iceberg moving through the province. Or perhaps the initial snowball that leads to an avalanche. Several other school districts are experimenting with thin-client and diskless installs in computer labs.
So, around here, it would seem to be the exact opposite of what you are claiming.
Out of curiosity, where is this? It’s definitely a more forward-thinking area than where I’m living .
School District 73 – Kamloops/Thompson, in sunny Kamloops, BC, Canada.
Note: that’s 26,000+ packages, and not 26,000+ programs or applications. Many of those packages are meta-packages (linux-image*, for example), or library packages, or programs split into multiple sub-packages (all the kde packages, for example). So while it looks cool to say “Ubuntu has over 26,000 packages available”, it’s nowhere near the same as saying “Ubuntu has over 26,000 applications available”.
I’m talking about the actual situation.
I’m talking about actual real life experiences from not just myself but other Ubuntu users I have spoken to.
I’m not hypothesising here – I am talking about the real actual situation.
Rubbish. Everything I've ever needed to install has been available from the Ubuntu repositories. And those few specific applications that aren't can easily be found by adding additional repositories to apt's sources file (again, all of which can be done via a GUI).
Yeah, but the command does take care of it.
Windows doesn’t even have a command.
Pot calling the kettle black
But then it has to be manually installed.
Linux manages that for you.
That’s only 1 aspect of a much larger picture though.
It doesn't manage dependencies or shared DLLs.
It doesn't download application updates or Windows hotfixes / service packs.
It doesn’t manage installing new applications.
Actually I do use it often. Sure Windows is getting better, but it’s still got a long way to go.
But while we’re making assumptions: it’s doubtful you’ve used Linux recently
It is when it starts to slow your system down because of the excessively large system directory and registry, due to the lack of package management tools.
Actually, having shared DLLs is a good programming practice, not bad.
The problem is that Windows isn't good at logging which applications use which shared DLLs, so you end up keeping dozens of them just in case removing one breaks other applications.
I’m very objective. I just think you’re talking BS.
The very reason I use Linux is because of the package managers. If it wasn’t for that, then I’d be using FreeBSD on my laptop.
Coincidentally, 98% of all statistics are made up on the spot.
But if we must play the “lets make up statistics” game, then here’s my input:
98% of people don’t like Linux because it’s not Windows and doesn’t run Windows programs (Photoshop, MS Office, etc)
Most of your argument falls in on itself. Like the claim that Windows doesn't have a command line for uninstalling software: that would mean you blame Windows for being designed to be graphical instead of based on a command line. And wow, manually installing up to three programs that you qualify as dependencies over the lifetime of the machine is a problem how? Are you sure it doesn't manage shared DLLs, or are you looking at the Microsoft shared DLLs that came with the system? I'm sure it has more to do with bad programmers again.
and here’s a good stats site: http://www.atinternet-institute.com/en-us/internet-users-equipment/…
Also, why don't you try PC-BSD? I hear it manages the problem well, in theory.
I NEVER SAID THAT!!!
I said Windows doesn’t have a package manager – just a tool to uninstall programs
And Ubuntu’s package manager does have plenty of GUI tools so you don’t even need to hit the command line.
Go back, read my posts again. Please.
Because you're either completely missing the point or deliberately dodging the facts because they disprove your original argument.
I'm just saying it's more of a problem on Windows than on Linux, despite you harping on about how poor Linux is at managing dependencies.
I’m 100% sure I meant what I posted. I know it from experience having managed various Windows systems for a good number of years.
So yes, I did mean what I posted.
It doesn’t!
You can’t blame bad programmers for shared DLLs. That’s absolutely absurd!
Using shared DLLs is good programming practice. Trust me, I used to be a Windows programmer (before I migrated platforms full time)
The problem isn't a programming problem (thus not the developers' fault) but a problem with the lack of management of the files by the OS itself (thus Windows' fault).
Web stats don't prove your arguments about package management.
Try again.
I have
Actually, it does. All the Add/Remove Programs applet does is call a separate uninstall app that's usually stored in the app's installation folder. Most have simple names like "uninst.exe". You can run these manually from a command.com prompt, a cmd.exe prompt, a 4nt.exe prompt, any other command prompt, and even the Run… dialog. Most of these even come with handy /? output that tells you how to run them.
Of course, what that has to do with anything written in the post you are replying to … I’m not really sure. You must be reading a different post than I. Or reading between the lines or something.
And your proof, citations, or evidence of any kind would be … ?
I’m sure you’re wrong, I have 3 reasons:
1: I can’t open my documents (all in the latest MS Office format)
2: I can't use my favorite Windows-only program and I don't want to look for a good (or better) alternative.
3: My hardware doesn’t work out of the box
Not even one person said anything about dependencies, if I only count people I've talked to in real life.
This is true. However…
1. You would first need the latest MS Office in order to save documents in that format (you claimed they were YOUR documents). Since MS Office 2007 doesn't run on Linux, this problem is moot. Earlier versions of Office (up to 2003) will run perfectly under Wine. Works for me. If you must open MS Office 2007 documents, the best solution is still MS Office on Windows. There are converters for OpenOffice, but I've yet to see one that works 100%.
2. Fair enough – however, I have the same trouble when trying to run Linux-only programs on Windows. It's a real pain.
3. I'm finding this hard to believe. What you may have meant is that your hardware does not have ALL of its features supported 100% on Linux. This isn't a problem for most PC users. As long as they have a desktop and can surf the net and use email, they're fine – and Linux does this superbly.
I am a home user and I've used Linux for 7+ years now. Yes, sometimes it is a pain in the neck, but I switched because I thought the same of Windows. I'm not suggesting Linux is perfect, but I do believe it is technically better than all other mainstream OSes. For myself, I prefer it for the customizability and freedom it gives me. I hardly use any office software – mostly I do software development (Qt), use the internet and email, and watch TV. Incidentally, my TV card doesn't work on Windows, no matter how hard I try.
I agree, if you include the lack of games in reason 2 then that covers the most common reasons (by far) that I hear for not using Linux. Those 3 certainly cover 95% of why I stick with Windows.
I do still hit software installation and upgrade issues in Linux now and then, but things have improved massively in this area. I wouldn’t say that the dependency issues I’ve experienced in modern Linux distributions are any more annoying than the unnecessary restarts when installing software in Windows.
I sometimes work from home, so being able to open documents from other people is essential. There’s no point in using OpenOffice at all when most of the documents I’m working with have to be in Microsoft’s formats. Booting into Windows to open a form someone’s sent me, or trying to run MS Office under Linux, are a waste of my time.
Then there's the quality of the available software, and the hassle of learning to use a different app. I'm not saying that GIMP, for example, isn't a decent product, but I have several years of experience with Photoshop, I already own a copy of it, I have loads of images saved in its format, and the thought of switching to something else isn't appealing when there's no reason to do so.
I also use my PC for ripping audio CDs and playing music, and the tools available for Linux just aren’t on a par with the equivalent Windows software. No Linux ripping software compares with Exact Audio Copy for features and automation, and Foobar2000 is more mature and feature rich than any Linux audio player. It wouldn’t be the end of the world if I had to downgrade to the software available for Linux, but it would outweigh any advantages that Linux offers.
Hardware compatibility is a bit less of an issue these days, but I’d still have to replace my E-mu USB soundcard and a couple of other components if I was to switch. That would erase any cost advantage when comparing Windows to Linux.
OK, on this one I'm calling BS. The Windows Add/Remove tool does one thing and one thing only: it stores information on currently installed programs, and among that information is a path to their uninstaller. Their uninstaller. Windows itself has nothing to do with the removal process, other than to fire up the uninstaller. Some of these uninstallers are coded badly and leave not only registry keys but files all over the place. Anyone try to remove Norton Antivirus lately? How about Microsoft Office 2003? Windows does absolutely nothing when it comes to uninstalling programs, just like it has nothing to do with installing and managing them, and when you've got an incomplete uninstaller, it is you, the user, who pays the price for that.
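For what it's worth, you can see exactly what Add/Remove Programs knows about by reading the uninstall entries from the registry yourself. A minimal sketch, assuming Python's standard winreg module; it only reads the standard machine-wide Uninstall key and ignores per-user and 32-bit-on-64-bit variants:

import winreg

# Minimal sketch (Windows only): list what Add/Remove Programs actually
# stores -- a display name plus the command line of each application's OWN
# uninstaller.  Windows merely launches that command; the uninstaller does
# the real work.
UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def list_uninstallers():
    entries = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as root:
        subkey_count = winreg.QueryInfoKey(root)[0]
        for i in range(subkey_count):
            name = winreg.EnumKey(root, i)
            with winreg.OpenKey(root, name) as key:
                try:
                    display, _ = winreg.QueryValueEx(key, "DisplayName")
                    command, _ = winreg.QueryValueEx(key, "UninstallString")
                    entries.append((display, command))
                except OSError:
                    pass  # entry without a name or uninstall command
    return entries

if __name__ == "__main__":
    for display, command in list_uninstallers():
        print("%s -> %s" % (display, command))

Each entry is just a display name and the command line of the application's own uninstaller, which is exactly why a badly written uninstaller leaves junk behind: Windows only launches the command, it doesn't track the files.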
And yes, I have personal experience on this matter, not only have I dealt with badly coded removal programs, but I’ve also created a fair number of installers and uninstallers myself and know exactly what the add/remove feature in Windows does and does not do as a result. Before you say things such as your above comment, please look into the matter more thoroughly (i.e. do your research).
Actually, Linux is MORE compatible with hardware than Windows is. Windows is terrible at hardware and driver management. The truth is that Windows isn't compatible with any hardware without a pile of drivers and other programs like DirectX to fill in the gaps between the makeshift OS and the hardware.
From the article: “We’ve seen mighty hoaxes in the past such as the Y2K Bug”
A hoax? I think you should clarify what you mean here… are you referring to the doomsday-type scaremongering? There were many systems that would have exhibited buggy behavior in various date-related calculations if they hadn't been patched back then.
One of my first professional programming jobs was in 1996, repairing a library database that thought the year 2000 was 1900. Really mucked up its search results.
Ah nostalgia.
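For anyone too young to remember, the underlying defect was usually as mundane as storing years in two digits. A toy sketch of that failure mode (not that library database's actual code):

# Toy sketch of the classic two-digit-year bug: "00" parsed naively becomes
# 1900, so anything dated 2000 sorts almost a century before 1996 and falls
# outside every date-range query.
def parse_year_naive(two_digit_year):
    return 1900 + int(two_digit_year)        # the buggy assumption

def parse_year_windowed(two_digit_year):
    y = int(two_digit_year)
    return 2000 + y if y < 70 else 1900 + y  # a typical post-Y2K "window" fix

if __name__ == "__main__":
    for label, yy in (("checked out", "96"), ("due back", "00")):
        print(label, "naive:", parse_year_naive(yy),
              "windowed:", parse_year_windowed(yy))
    # naive output: 1996 and 1900 -- the due date appears 96 years in the past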
April Fools' Day history reminder: the fool was the person who made the joke, not the one who fell for it.
I can’t wait for the havoc to begin. My Linux box is safe.
I’ll bring the popcorn. 🙂
Ha! Yeah, it should be great… as millions of zombie Windows boxes start generating enough internet traffic to bring all the major trunks to their knees.
If you can only speculate about what the thing will do, saying the creators should be roasted alive is a bit premature, imho.
Clearly the creators have a profound understanding of the Windows platform, its usage, the internet, and P2P communication. Probably knowledge we can learn from and benefit from.
If, for example, the worm were to demonstrate that such a thing can be built and would then erase itself, there'd be no problem. I agree that this is highly unlikely, but you have to agree that "roasting" people with such understanding is a loss.
Good intentions or bad by the creators, this is an opportunity to learn.
I think it is more about how this demonstrates to others how to make an ever-present, ever-evolving worm. Especially if it changes the vulnerability it exploits on April 1st.
As mentioned by others, a mix is the best solution to problems like this. I would love to see a lot of different OSes and a lot of different programs, all using the same standards for communication and for saving documents. Users could then choose the OS or program they like without having to fear that they can't communicate with others or open documents from them.
That doesn't need to be bad for Microsoft, if it manages to reinvent itself, as others have done in the past.
I guess it's due to the excellent marketing department at MS that most viruses are written for MS products. Unfortunately their OS isn't that excellent (in my opinion).
By the way I don’t use windows, so I won’t be affected by this one.
I refer you to a link [1] that doesn’t try to scare or summarize rumors. Enjoy some real information, lean back, relax and – although I do like bashing Windows like the next guy – stop creating “The end is nigh” scenarios.
If this link doesn't work for you, you _might_ be infected. This site is targeted by Conficker and is blocked if your machine is already compromised.
1: http://www.secureworks.com/research/blog/index.php/2009/03/27/confi…
"Don't take this the wrong way, but I'm one to admire the genius it takes to initiate such a scare. However, I am of the opinion that the creators of the Conficker worm ought to be roasted alive and fed, while barely kept conscious by various medical means, to insects such as the ants seen in Indiana Jones' newest escapade."
Sorry, but no free pass on this stupid and cruel statement. When will people get it through their egotistical thick heads that THIS is the attitude that makes humanity weak? This is why people like the creators of the Conficker worm (if there is such a threat) create their malicious works. Grow up. Evolve a little.
Hello Australia,
do you still have internet, or are you doomed? As you already have tomorrow, please let us know.
Well…New Zealand still seems to be here (8:50am local time)
Australia is still here.
I'm more worried about Telstra's incompetence, the indecisive South Australian State Government, and daylight saving. It's not so much fun when your mobile phone tells you daylight saving time ended when it actually didn't, and you spend half the weekend being an hour late…
-Brendan
Well, can you?
Apparently, the mainstream media reporting on the Conficker worm simply cannot.
http://blog.linuxtoday.com/blog/2009/03/53-pages-10-mon.html
So, here is your opportunity to practice. Say it slowly, after me …
“Wwwwww …”
try again
“Wwwwwin …”
Getting there. Keep trying good people.
I registered on this site (having been a lurker for literally years now) to note that the article is crap. The effect of the April 1st date change was analyzed and published. I posted one of probably lots of links that point out that news coverage of this topic is subpar. Uhm – actually it's just FUD.
I do have to admit, though, that reading the comments so far reminds me of YouTube… I suggest taking xkcd's advice: http://xkcd.com/481/
The implications of this virus are indeed scary. I also believe the authors should be exterminated, even though they are brilliant; if Microsoft had this kind of talent, the worm would not be successful. The article speculates about several possible purposes of the worm, but the actual intended function of the virus remains unknown, and the real impact on computing is, at this point, no more damaging than Windows Vista, which slows down networks with its daily updates and slows down computers with its bloated programming. Unfortunately, until the virus actually does something other than spread and update itself, we will not know what its purpose is.