One of the biggest reasons for many people to switch from Windows to a UNIX desktop is security. It is fairly common knowledge that UNIX-like systems are more secure than Windows. Whether this is true or not will not be up for debate in this short editorial; I will simply assume UNIX-like systems are more secure, for the sake of argument. However, how much is that increased security really worth to an average home user, when you break it down? In my opinion, fairly little. Here’s why.
UNIX is geared towards server use, and so is its security system. As we all know, ‘normal’ users do not have permanent root access (well, they shouldn’t have it, in any case). As such, all important system files are protected from whatever stupid things the user might do. The user does not have full access rights to all files; he or she only has full access rights to his or her own personal files.
And that is where the problem lies.
I believe that desktop Linux/OS X/etc. users all over the world have a false sense of security, and are actively promoting that false sense of security on the internet, in magazines, and at conferences all over the world. No, they are not doing this on purpose. However, that does not negate the fact that it does happen.
What am I blabbering about?
A hypothetical virus or other piece of malware on a UNIX-like system can only, when it is activated by a normal user, wreak havoc inside that user’s /home directory (or whatever other files the user has access rights to). Say it deletes all those files. That sucks, but: UNIX rocks, the system keeps on running, the server-oriented security has done its work, no system files were affected, uptime is not affected. Great, hallelujah, triumph for UNIX.
But what is more important to a home user? His or her own personal files, or a bunch of system files? I can answer that question for you: the pictures of little Johnny’s first day of school mean a whole lot more to a user than the system files that keep the system running. Of course, they should make backups– but wasn’t Linux supposed to be secure? So why should they back up? Isn’t Linux immune to viruses and what not? Isn’t that what the Linux world has been telling them?
This is the false sense of security I am talking about. UNIX might be more secure than Windows, but that only goes for the system itself. The actual content that matters to normal people is not a single bit safer on any UNIX-like system than it is on any Windows system. In the end, the result of a devastating virus or other malware program can be just as devastating on a UNIX-like system as it can be on a Windows system– without the creator having to circumvent any extra (UNIX-specific) security measures.
To blatantly copy Oasis: don’t believe the truth. Yes, UNIX-like systems might be more secure than Windows systems, but not in the places where it matters to average home users.
–Thom Holwerda
If you would like to see your thoughts or experiences with technology published, please consider writing an article for OSNews.
1. How would a *nix user get the virus?
2. How would it execute?
Hypothetically, a dumb user would save the file, make it executable, and then run it on purpose. I don’t know about anyone else, but that sounds like a lot more trouble than it’s worth for free porn of Miss Lebanon 2006 (alluding to the Kama Sutra worm, for those who don’t know) when porn sites are much easier to get to.
In Windows the process is much easier: get the e-mail with its promise of free porn, click the attachment, choose the ‘open’ option, and you’re infected.
Really, the lowest common denominator in malware is dumb users. The worst security hole is simplifying things to accommodate them, because in the end you’re just making it easier for them to get their computers infected.
1. How would a *nix user get the virus?
2. How would it execute?
Probably through some application (especially a media program) that has a buffer overflow exploit that allows remote execution of code. Either that, or social engineering.
That second one is the one to be most concerned about. In order for Linux to work for the masses, it has to be easy to install and run applications. And once that is possible (it probably is already) and you put Joe Sixpack on it, all I gotta do is send him an email promising him nude pics of J-Lo, and all he has to do is execute this file. And presto… you’ve got an epidemic on your hands.
Of course, Unix/Linux is more secure than Windows. But keeping a Windows box secure isn’t that complicated, as I’ve said before. If I could spend about 30 minutes with each and every Windows user and install Firefox or Opera for them, Windows security issues would be pretty much non-existent.
Clearly you have not got a clue what you are talking about.
You can send Joe Sixpack an email with malware attached. It will not run. It has been said over and over: there are no execution rights on attachments.
Social engineering would work, of course; you can get some dopeydick to type rm -f * as root with silly promises, but that is not a fault of the system…
There is a fundamental difference between Windows and Linux.
Around 1994, Bill Gates & Co. decided that making everything automatic was good for the user. This is the root cause of all the problems with Windows.
You can spend all the time in the world with Windows users and convert them to Opera and Firefox all you want; however, the problem is not just with the browser.
Windows has problems with email, messengers, browsers and media players, all having access to the system areas and all being able to interoperate with each other.
People do live with their heads in the sand. Windows can never be secure if Microsoft is trying to make it as easy to use as possible. Linux and Unix are ultimately more secure because they have been designed with security in mind from the ground up.
1. http://www.reallycoolsoftware.com/shinythings.rpm
2. double click the rpm
AdamW,
“1. http://www.reallycoolsoftware.com/shinythings.rpm
2. double click the rpm”
You seem to have missed some other security points related to package management in Linux distributions. After doing what you said, the Linux distribution would first request the root (administrator) password before opening a package manager such as YaST. YaST would then verify that the package’s digital signature is valid, check for any dependencies/conflicts, and install the software. This is unlike Windows, where the user double-clicks “packagename.exe” and it installs to the system. By default, Microsoft’s installation setup gives all Windows users Administrator (root) access and leaves it up to the end user to create Limited User accounts. This is unlike Linux distributions, where users are set up from the beginning with limited user rights.
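For the curious, that signature check is roughly what the following commands do on an RPM-based distribution; the key path and package name are only the hypothetical ones from this thread:

# import the distributor's public key once (the path is illustrative)
rpm --import /media/cdrom/RPM-GPG-KEY
# verify the package's digital signature before anything is installed
rpm --checksig shinythings.rpm
# installation itself still requires root
su -c 'rpm -Uvh shinythings.rpm'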
It is typically a Windows system administrator headache to get applications to run properly on Limited User accounts without opening up too many security holes. Hopefully, when Windows Vista is released, Microsoft will hold true to its promise to make the system more secure than it is today: everyone runs as a limited user by default, except the Administrator account, which is hidden and only accessible with the Administrator password.
You still need to be root or have root access as your user.
You’ve obviously got no idea of how an rpm is installed. A user would have to have administrator-level privileges to install anything. Also, it’s a pretty strange *nix system that would actually work by simply clicking on the file… Every setup I have ever seen required the user to follow through with some type of package manager (YaST, apt4rpm, SOMETHING), and that exponentially reduces the chances of this happening. Besides, again we are talking about user abuse of the system, something NO OS, not even Linux, BSD and the rest of the *nixes, can ever prevent and still be a useful tool or toy for the user.
Virus writers are a bit more clever than you’re giving them credit for. They don’t need to execute their code from an rpm at all. The best route is through Firefox or another browser. They can trigger a buffer overflow and then use that launching pad to trigger additional buffer overflows in the Linux kernel (they have to find them first, but nobody should be daft enough to suggest that they don’t — or can’t — exist), since one was found as recently as November 2005 (http://secunia.com/advisories/17384/).
How about this one:
1. A really cool desktop looky thingy that looks and behaves just like Windows XP!!! Oh, boy!
2. Double click the shell script.
3. Oops!
Shell scripts aren’t executable by default.
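To spell that out (the filename is made up), a freshly saved script has no execute bit, so simply clicking it gets you nowhere until you deliberately change that:

ls -l shinythings.sh      # shows -rw-r--r--, i.e. no execute bit
chmod +x shinythings.sh   # the user has to take this step on purpose
./shinythings.sh          # only now will it actually run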
Actually, it’s fairly simple.
1) Use any Java version released before 11/29/05 and visit a page hosting a malicious applet.
2) The Java vulnerabilities allow the applet to bypass the Java sandbox and execute anything it likes in the context of the Java process (usually the user’s).
There are numerous examples of Windows Java malware utilizing runtime.exec() which could work just as easily on *NIX.
…it is black, I tell you! Just because a lot of people say so doesn’t make it true, does it?
<sarcasm off>
And while I was reading this, I could have been cleaning my computers of viruses, adware and that junk… wait a minute, I don’t have to. Ohhh, I DON’T use Windows.
I’m just tired of stuff like this. Yes, we all want fame, we all want to make big claims like this one and earn a moment of fame… but some things simply are as they are. You can flame away, claiming that Windows is the most secure operating system ever, but it is not. It is totally absurd to say things like that, according to common knowledge.
No virus can install itself on my OS X and Linux/BSD boxes. None. And therein lies the point of the story.
You can flame away, claiming that Windows is the most secure operating system ever, but it is not. It is totally absurd to say things like that, according to common knowledge.
Where did I mention ANYTHING about Windows being secure? I even SPECIFICALLY mentioned that stuff in the intro!
Look, this article is about the false sense of security; not about X vs. Y. Not everyone gets stuck in that stage.
It was a good article about *nix users’ false sense of security.
I am one of them users….
It is true that if you downloaded malware, gave it execution rights, and ran it, it could damage all the files in your /home, but you should always have backups, even if the system is secure, just in case of a hardware failure.
Another thing… do not let ANY program run from your /home.
Another thing… do not let ANY program run from your /home.
imho you are overrating the value of that restriction, and don’t seem to appreciate the merit of installing things in a homedir.
Though not allowing executables in homedirs is a possible layer of defense, it’s not a very strong one. If an exploit can write a malicious executable file, it can probably also append something to your .bashrc, for instance.
On the other hand, being able to install stuff in your homedir prevents it from contaminating the rest of your system – who says this installation procedure will restrict itself to /usr/local as advertised?
Uh… if it’s a deb or RPM I can expand it and check, and if it is a script, I can look at the lines of the script.
I can also set up a *different* home dir/user for a suspect program to run from, or even do it in a chroot where there is no possibility of it affecting the system.
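As a rough sketch of those first two options (the package name and paths are invented):

# list what the package would install, without installing anything
rpm -qlp suspect-1.0.rpm
# or unpack it into the current directory for a closer look
rpm2cpio suspect-1.0.rpm | cpio -idmv
# run the thing as a disposable account rather than your own
su - sandboxuser -c '/home/sandboxuser/suspect/bin/suspect'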
What else can you want?
What else can you want?
Err, all those things are for experts only… This editorial was clearly talking about normal users.
Is it really that hard to understand?
“Err, all those things are for experts only… This editorial was clearly talking about normal users.
Is it really that hard to understand?”
Err, is it really hard to understand that so far you have identified no way at all for a malicious black hat to get around the following two policies:
(1) a /home partition with the ‘noexec’ attribute set, and
(2) a simple policy of ‘install ONLY from the distribution repository’.
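Policy (1), for instance, usually comes down to one line in /etc/fstab (the device and filesystem here are only illustrative):

# /etc/fstab -- nothing on /home may be executed, setuid or treated as a device node
/dev/hda3   /home   ext3   defaults,noexec,nosuid,nodev   1   2
mount -o remount,noexec /home    # apply the new option without a reboot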
Here is an example of a Linux distribution designed for ‘normal users’ where such policies are not only easy, they are the default:
http://www.pclinuxonline.com/wiki/QuickStartSynaptic
http://www.pclinuxonline.com/wiki/PcSecurity
http://www.pclinuxos.com/
“No spyware, Virus Free, Adware Free” – with a full suite of desktop **APPLICATIONS** already installed (not just the bare OS) out-of-the-box, minimising the need for “normal users” to want to install anything extra anyway!
Err, all those things are for experts only… This editorial was clearly talking about normal users.
Is it really that hard to understand?
There aren’t many “normal” Linux users that I know of.
“There aren’t many “normal” Linux users that I know of.”
Now this person has a point!
A significant part of that security, Thom, is from the simple fact that at this time there are no viruses for Linux. Or practically none; I do not know of any. There’s little or no malware targeted at Linux OR OS X.
But I am curious: how do you suggest fixing this security hole? Demand a password for every file in your own home you access? Prevent programs from accessing your home?
I am not sure…
Where did I mention ANYTHING about Windows being secure? I even SPECIFICALLY mentioned that stuff in the intro!
It still really does look like you’re trying to defend a decision to use an insecure operating system by inventing “misconceptions” that “deluded” sheeple subscribe to. I mean look at this paragraph directly from the article:
“It doesn’t matter that I’m using windows because user files aren’t any more secure on Unix systems, and Unix isn’t 100% secure, and besides the only reason that people don’t target Unix OSes is because there are only 2 users in the entire universe, and I’m leet and right I know everything, haha ur pwnzed n00b”
Oh look, I can dream up insulting generalisations too.
Nobody except your own imagination and the stupidest of village idiots is claiming that Unixy systems like Linux and OS X are magically 100% invulnerable to any form of malicious intent, just that the security of most Unixy systems makes it more difficult to create self-propagating worms and such.
There must be some value in the idea of running with de-escalated privileges, since Microsoft look to be moving to the same security set-up for Windows Vista.
Look, this article is about the false sense of security; not about X vs. Y. Not everyone gets stuck in that stage.
You’re free to use whatever operating system you want, but advocating (even subversively) an operating system with such a terrible security history as being equivalent to OS X and Linux in respect to the security of your personal files is misleading and irresponsible. Windows’ track record on security is utterly indefensible, and no amount of “false” security on other systems makes any difference to that.
I mean look at this paragraph directly from the article:
I don’t know what you are trying to do here, but that paragraph obviously does not come from my article. Try not to make up things I did not say.
You are warned.
Yes of course, because it’s perfectly fine to make things up when you want to make a flamebait article, but not when I want to prove a point.
You’ve completely failed to show any source for your alleged “false sense of security”, you seem to react pretty badly when someone makes up things that you don’t approve of.
Consider yourself warned as well… that people are going to start noticing that you’re a rabble rouser and a bullshitter if you continue this way.
I could not agree with you more. This article is a completely misleading troll filled with poor arguments written by someone with an agenda, rather than a motive to actually convey meaningful information. I wish I could mod you up more, man.
YEAH BOYZZ LINUX RULLZZ SUPREME!!!1one
SUPPORT LINUX KTHXBYE!!!1
… UNIX security means viruses, adware and that sort of rubbish cannot automatically install and spread themselves.
THAT’S what it’s about, not a bunch of personal files you should have had a backup of anyway.
UNIX security means viruses, adware and that sort of rubbish cannot automatically install and spread themselves.
There will always be people clicking on “GET FREE PRON HERE FOREVER”.
Don’t be naive.
That is a Trojan, in which case the security is the same on ANY system, because the weakest link is the user.
You all seem to forget that NO UNIX program saves files in an executable state.
You have to go and set the exec bit yourself. Pretty dumb thing to do, eh?
And being dumb is not something you can work around with any level of security.
So what happens if you get a .tar.gz file, which opens in, say, file-roller (a ‘WinZip’-type program), and has all the execute permissions preserved?
You still need to manually execute it. As has been mentioned, this then becomes a trojan and not a virus, something that any OS is susceptible to.
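Even before that point, a wary user can look at what is inside the archive; the verbose listing shows any preserved execute bits before a single file touches the disk (the archive name is made up):

tar -tvzf freestuff.tar.gz    # -t only lists the contents, -v shows the permission bits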
Thom, shouldn’t you be posting this as a normal user? Your posts cannot be positively or negatively rated just because you are “OSN Staff”.
Only posts pertaining to article corrections/modifications/etc should be unrateable.
Except clicking WON’T install those files.
You’re incredibly naive (in fact I don’t think you are; this is really a bash-anything-but-Windows article, nothing more) if you think that you can fix that via software: what you’re asking for is not a secure operating system (there are already quite a few, if you care to look beyond Linux/BSD), but a cure for stupidity.
There is no way you can prevent people from harming themselves in a zillion different ways; even installing a full AI telling the user “Look, that is not a pron movie, it’s a virus!” wouldn’t work, because how many users would choose not to believe it?
Unless, of course, you really want users treated like idiots, kept in their sandbox where only approved software can run (approved by big brother Microsoft, who else?) thanks to TPM chips and an operating system that will decide for you what you can and cannot do: is that the “future of computing” you’d like, Thom? It sure sounds like you can’t wait to have your hand held by Microsoft…
rehdon
There will always be people clicking on “GET FREE PRON HERE FOREVER”.
Even then, these won’t execute unless the user specifically tells the OS to (by setting the execute bit). That extra step is enough for most people to go “wait a minute…”
The fact that on Windows you can make a file executable simply by giving it the appropriate file extension makes a HUGE difference as far as home users go.
Also, realize that most viruses don’t wipe out directories anymore – they are used to set up spam bots and network relays, etc. These require admin rights, and are much more likely to affect Windows-based systems than UNIX-based systems.
So while you make some interesting points, ultimately your conclusions are wrong. The UNIX model is in fact better for home users, though they should of course always back up their personal files, whatever the OS (and not only because of malware – disk drives fail, that’s an inescapable fact of life…)
The point is that, from a practical perspective, for home users whether or not their computer is spreading malware is irrelevant compared to their personal folders.
And backing up personal data is like writing perfect C++: everyone talks about it and knows they should, but in reality no one does, so it’s not worth counting on.
And backing up personal data is like writing perfect C++: everyone talks about it and knows they should, but in reality no one does, so it’s not worth counting on.
Then they WILL lose their data eventually. It’s stupid to claim that backing up data is not a solution. Hardware, especially hard drives, will die eventually. Malware is always a possibility, but hardware failure is guaranteed.
I set up a script to back up my data every night. I never even have to think about it, other than checking on my backed-up data every once in a while to make sure it, and the drive it is on, is still good.
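For anyone wanting to copy that habit, here is a minimal sketch of such a nightly backup; the paths and the destination disk are whatever fits your own setup:

#!/bin/sh
# /home/joe/bin/backup.sh -- mirror the home directory to a second disk
rsync -a /home/joe/ /mnt/backupdisk/joe/

# crontab entry (crontab -e): run the script every night at 03:00
0 3 * * * /home/joe/bin/backup.sh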
I’ve been saying this for a while now. Most people don’t care about the system, just THEIR files.
Now, this isn’t to say the UNIX model is BAD — it’s not. In fact it’s good, especially for what it’s designed for, and that’s multi-user systems. If one user runs a bad program, it can only hurt them and not the other users on the system (exploits aside), and this is great, especially for businesses. At work, no one can delete my files, even accidentally — awesome.
The UNIX security model isn’t intended to make the user immune to viruses, but the system. It does make it harder for users to run a bad program (execution bit) though.
It’s not that the Windows security model (the MODEL; exploits are a different story) is bad either, but that it’s not enforced like it is on just about any UNIX system. In fact, it’s more granular than UNIX, and that is very useful. Of course, they couldn’t enforce the model without pissing off a lot of people and breaking so much stuff — not in XP, at least. The time necessary to make the changes needed for a smoother transition made it not feasible to do for XP.
Then how about the average family with wife and kids? Daddy has his own account, and the son has his own too. Sounds pretty average to me. Son gets a virus, son’s MP3s are deleted, dad’s important report isn’t deleted.
Yep, then it works great. I don’t think that is the norm, though. A lot of families have multiple computers, or they don’t use multiple accounts.
Not the norm? A few years ago, when I read computer magazines, many of them praised Windows XP for having support for multiple accounts, because that’s what family users had been waiting for for years.
Well it may be the norm for families, but I meant the norm overall. Get me?
No, I don’t get it. There are many, many, many, MANY families. Even if there are only 50% (or even 40%) family users, that’s still a very significant amount. I don’t think you’re trying to tell me that only 5% of the average users are family users, so no, I don’t get it.
If 50% were family users, and let’s say 90% (though I think this is being generous) of those used multiple accounts, that’s still only 45%, and not the norm. But this isn’t really that important anyway.
You know what they say about statistics. And this is the wrong way to calculate what is “the norm”. By your reasoning, if 49% of country X is black and 51% is white, then nobody has to keep black people in mind because being white is “the norm”.
45% is still a very significant amount. Significant, that is what matters.
Yep, then it works great. I don’t think that is the norm, though. A lot of families have multiple computers, or they don’t use multiple accounts.
Perhaps that’s the norm in the USA. I know very few people here in Brazil who maintain (and can afford) several computers for each member of the family. The former is more common here.
You’re right in that home users require multi-user, to some extent anyway. Generally, you won’t have multiple users logged in at the same time. However, you might have someone who is in the process of learning about computers who is doing a lot of messing around. No matter what OS you use, that could be a problem, since someone who is learning could mess it up.
Fortunately… for home users there are other options besides relying strictly on the OS. There are hardware options as well that are explicitly designed for home use. I can’t recall what they are called, and I haven’t seen them used often, unfortunately.
But what they do is split the computer up hardware-wise, so that the OS itself sees a different computer for each separate user. That way, even if “the son” (you mentioned) runs the buggiest OS (or other software) in the world and somehow gets infected with a virus (or some other malware), it will not impact his father or other family members. Even if the son decides to fiddle with various OSs or other programs that affect the system at a low level, he can’t destroy everyone else’s data.
In fact, I would recommend this to families rather than depending on only the OS. Much more secure than even UNIX could ever be.
The only problem is that such options are both uncommon and a little more expensive.
The execute bit in Unix™ and Unix-like systems (e.g. GNU/Linux) enables deeper protection. If email clients are configured to strip the execute bit, and if the umask is set right, then Unix-like systems can avoid many hassles better than MS Windows™. If those valuable files are read-only, they are slightly safer — till the disk fails.
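As a minimal sketch of what “the umask set right” and read-only valuable files look like in practice (the directory name is just an example):

umask 077                      # in ~/.profile: new files are readable by their owner only
chmod -R a-w ~/pictures/2005   # finished material that nothing should be rewriting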
But realistically, no OS provides a bulletproof casual/amateur user model. This is sad but true. OSes are designed by coders for coders, and George and Martha had best beware.
But again, the Unix model allows for a safe shell to exist. There might even be one, somewhere. But bash isn’t it.
Unix has potential advantages. But, hard drives fail. Last week, I had my third hard drive failure. Nothing was lost. The OS? Windows™ 2000, actually. But I have backups, a firewall, AntiVirus, and paranoia. All of these are recommended to stay safe in the modern world, alas.
Way to miss the point of his article guys.
Way to miss the point of his article guys.
Another self-proclaimed security expert who doesn’t know what security covers.
Way to miss the point of what security means. The point he was describing is filed under ignorance and lack of knowledge. You can secure against this on any new OS (Vista, OS X and *nix; the only one doing really badly here is XP and earlier versions). The term “security” covers completely different topics.
P.S. The article is “UNIX Security: Don’t believe the truth”, not “A false sense of ‘safe’ for your home on UNIX”.
Security
1. Freedom from risk or danger; safety.
2. Freedom from doubt, anxiety, or fear; confidence.
What were you saying?
Look up the term “computer security”, the technical explanation. Some of us have jobs related to that.
Well seeing as how a dictionary doesn’t define terms like that, I checked:
Wikipedia:
“Computer security is a field of computer science concerned with the control of risks related to computer use.”
And answers.com:
“The protection of data, networks and computing power. The protection of data (information security) is the most important.”
Sounds right to me.
Oh, and sorry to burst your bubble buddy, but I’ve done security work.
You could just say “Your personal files aren’t secure in any operating system, so do backups” and skip the rest.
And the primary reason for taking backups of valuable (in this case, perhaps, treasured) data is not only that systems are not secure, but that systems can break (at either the hardware or software level).
Interesting viewpoint; however, not one I can fully agree with. While I can agree to a certain extent that protecting the content of a specific user is important, UNIX and UNIX-like systems (such as Linux, which is not UNIX) do provide more security in this area by making it harder to compromise the system to begin with.
So, a UNIX or UNIX-like system does provide more security than Windows, despite the fact that many of these systems do not provide automated backup or recovery systems. These systems also definitely provide more protection in a multi-user environment. A compromised Windows PC is likely to ruin all files on the system, whereas a UNIX or UNIX-like system in the situation you describe is likely only to ruin the files of the individual user that is compromised, quite possibly leaving the files of other users on the system untouched.
Additionally, UNIX systems such as Solaris combined with ZFS have the ability to provide instant filesystem snapshots, which, combined with the right software and tools, could provide a reasonable solution to the situation you describe. In fact, I think it would be a great idea for someone in the Solaris community to write an automatic ZFS backup tool similar to Windows System Restore; only it should be designed to automatically back up important data and provide an easy way to restore it in the event of life’s small disasters, such as the one described here, instead of just system files.
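For reference, the snapshot half of that already exists as plain ZFS commands today; only the friendly front end is missing (the dataset name is hypothetical):

zfs snapshot tank/home/joe@nightly    # instant, practically free snapshot of the home dataset
zfs list -t snapshot                  # see which snapshots are available
zfs rollback tank/home/joe@nightly    # after the ‘small disaster’, wind the dataset back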
That is one area that UNIX and UNIX-like operating systems could focus on if they really wanted to add significant value to their products: making it easier for the user to maintain the integrity of their personal data in an easy-to-use fashion.
From the perspective that you shouldn’t rely solely on the security of a UNIX or UNIX-like system, I can agree with you. Of course, I could say that for any operating system.
In the end, no matter what system you’re running, regardless of whether it’s UNIX, UNIX-like, or Windows, you should be backing up data that is important to you on a regular basis. Hardware failures, accidents and other events in life can be just as, if not far more, dangerous than any virus that might ever infect your system.
Interesting viewpoint; however, not one I can fully agree with. While I can agree to a certain extent that protecting the content of a specific user is important, UNIX and UNIX-like systems (such as Linux, which is not UNIX) do provide more security in this area by making it harder to compromise the system to begin with.
Rubbish. Firefox and hundreds of other applications provide an easy platform for executing malicious code. It’s no more difficult to trigger a buffer overflow in ‘nix — and thus execute code that blows away, modifies, or compromises user data — than it is on any other operating system.
Additionally, there have been numerous buffer overflows in the Linux kernel (here’s a recent one: http://secunia.com/advisories/17384/) that would, in theory, allow an attacker to compromise the system and elevate privilege. So, don’t pretend that ‘nix is more secure than any other operating system. It ain’t. And that’s the POINT of the article.
I can see this will turn into a flamebait article, which was probably the intention…
Of course, they should make backups– but wasn’t Linux supposed to be secure?
Don’t confuse security with user clumsiness.
Isn’t Linux immune to viruses and what not? Isn’t that what the Linux world has been telling them?
No one said Linux was immune to viruses. Linux just won’t allow a virus to propagate efficiently, due to the way it handles permissions, which makes it much better at containing the damage that a virus can cause than, say, Windows. This also applies to Mac OS X.
Your system is only as secure as the person who maintains it. Just because a person loses personal data through their own fault, doesn’t mean the blame should be placed on Linux.
I can see this will turn into a flamebait article, which was probably the intention…
I don’t believe that for a moment, and it was rather rude of you to wildly speculate such. I think Thom is just trying to point out that the media and “fanatics” of the UNIX and UNIX-like communities tend to overdramatize their favourite operating system as some “impenetrable fortress” that their data will always be safe within. I can’t agree with his assertions that there is little difference in data safety between those operating systems, however I can agree with the basic premise that users shouldn’t rely on an operating system’s security alone to protect their data.
Your system is only as secure as the person who maintains it. Just because a person loses personal data through their own fault, doesn’t mean the blame should be placed on Linux.
I think you’ve missed Thom’s point: that users shouldn’t rely on the “word of mouth reputation” of an operating system to protect their data. While I think he could have spun it a bit more positively, the basic premise at heart is alright.
I don’t believe that for a moment, and it was rather rude of you to wildly speculate such. I think Thom is just trying to point out that the media and “fanatics” of the UNIX and UNIX-like communities tend to overdramatize their favourite operating system as some “impenetrable fortress” that their data will always be safe within. I can’t agree with his assertions that there is little difference in data safety between those operating systems, however I can agree with the basic premise that users shouldn’t rely on an operating system’s security alone to protect their data.
Maybe “flamebait” was too harsh a word for sensitive eyes. I would like to change it to “argumentative discussion”.
I think you’ve missed Thom’s point: that users shouldn’t rely on the “word of mouth reputation” of an operating system to protect their data. While I think he could have spun it a bit more positively, the basic premise at heart is alright.
Yeah. I know what I read, and my point was not to disagree with the “word of mouth” aspect, but to dispute the notion that Linux is somehow at fault for user clumsiness.
Did anyone ever claim UNIX/Linux is at fault? Nope.
Did anyone ever claim UNIX/Linux is at fault? Nope.
Yes. The author did. Did you even read the article? He says, and I quote: “Of course, they should make backups– but wasn’t Linux supposed to be secure?”.
Sigh.
He’s not trying to say Linux is at fault for — and these are your words — “user clumsiness”, but rather that the media/proponents are giving users a false sense of security.
And no, it’s not the same thing.
Yes. The author did. Did you even read the article? He says, and I quote: “Of course, they should make backups– but wasn’t Linux supposed to be secure?”.
Do NOT rip my words out of context.
“But what is more important to a home user? His or her own personal files, or a bunch of system files? I can answer that question for you: the pictures of little Johnny’s first day of school mean a whole lot more to a user than the system files that keep the system running. Of course, they should make backups– but wasn’t Linux supposed to be secure? So why should they back up? Isn’t Linux immune to viruses and what not? Isn’t that what the Linux world has been telling them?”
Those lines are from the perspective of the hypothetical user, my friend, NOT from the author’s perspective. It’s called a style element.
No offense, but perhaps you should word your article better. I shouldn’t have to keep going over an article to wonder what you are trying to say.
Okay. Your whole argument is that the media (and average Linux users) are distorting the facts about what Linux “really” can do. Where have they said Linux is immune to viruses? Where was it ever stated that users should *never* back up their personal data because Linux is supposed to be secure? Where?
I haven’t seen any of that said before, but I have seen it said countless times that data is the single most important thing on a computer and that backups are a necessity if you value that data.
If the pictures of little Johnny get deleted by accident, then that’s the fault of the user. If that user wants to switch to Linux, then I’m pretty sure they know the importance of backing up files. Furthermore, I think this is more common knowledge nowadays with the epidemic of spyware, viruses, and worms. In other words, there is a difference between knowing that you should back up and not knowing about the practice of backing up files at all, the latter being the minority.
It’s written in an inflammatory tone. It’s easy to read “look! the linux zealots are spreading LIES!!! LINUX SUCKS!!!” in your article. I’m not at all surprised that people find your article offensive. Are you?
It’s all about the tone.
Um, no. That’s how you are perceiving it.
He never said zealots, he never said anyone is spreading LIES, nor did he say Linux sucks; in fact he said the UNIX security model is BETTER.
I didn’t claim that either. I’m just saying that it’s easy to read that in his article because of his tone, thus he shouldn’t be surprised that people don’t get his point.
I agree with this, it’s common to find people saying “I’m safe because I run [Mac|Linux|BSD]” every time there is a new piece of malware out there for Windows. I have little regard for people who portray themselves as experts and claim that the security holes are simply a matter of which OS is more popular, but at the same time it’s undeniable that if a user doesn’t know better (s)he could harm a computer no matter which OS it’s running if instructed how.
Believing that there’s nothing out there that can harm Linux, Mac or BSD is naive. After all a dumb user could just as easily believe an e-mail telling them that applying a magnet to the hard drive will make the computer run faster, and that could affect any OS. It always seems to come down to one thing and that’s the IQ of the users.
I agree with this, it’s common to find people saying “I’m safe because I run [Mac|Linux|BSD]” every time there is a new piece of malware out there for Windows.
Would you think the same of someone who said they were safe from the latest flood of malware because their Windows computer was running the latest patches?
Generally, when people refer to being safe when they hear about a particular piece of malware, they probably mean that they’re safe from that particular piece of malware because they run an OS that can’t run it, not that they’re completely invulnerable to everything in the known universe.
Maybe you’re a little overly suspicious after a life of having to run all manner of updaters, antivirus, spyware removers, registry cleaners and firewalls and think people are claiming that they’re completely safe inside a bulletproof bubble made by their favorite OS. Maybe jealousy is making you a bit over sensitive.
Who knows? But you’re clearly suffering from the same delusions as the article author, I’ve never heard anyone say that Linux is 100% safe. At any rate most people don’t exactly run Linux on “100% safe” hardware, failure of which is a more realistic risk to most users data than malicious software.
“Maybe you’re a little overly suspicious after a life of having to run all manner of updaters, antivirus, spyware removers, registry cleaners and firewalls and think people are claiming that they’re completely safe inside a bulletproof bubble made by their favorite OS. Maybe jealousy is making you a bit over sensitive.
Who knows? But you’re clearly suffering from the same delusions as the article author, I’ve never heard anyone say that Linux is 100% safe. At any rate most people don’t exactly run Linux on “100% safe” hardware, failure of which is a more realistic risk to most users data than malicious software.”
I think you’re confusing your imagination with what I actually said. Why would I be jealous when I use Linux myself? There is a difference between pointing out facts and being jealous.
Your system is only as secure as the person who maintains it. Just because a person loses personal data through their own fault, doesn’t mean the blame should be placed on Linux.
He didn’t place the blame on linux. In fact, he didn’t say anything about Linux. If you were able to comprehend what he was saying, then you would know he was talking about a false sense of security.
Come on, people, you can do better than that. OK, first off, UNIX, or as we call it *nix, is a program of sorts governed by the laws of computer science. That is to say, we can analyze it and prove things about it. The most technical thing the author mentioned about *nix was /home and how user files are stored… He never mentioned MAC, DAC, chroot jails, inetd.conf, Tripwire/AIDE, rootkits, privilege escalation, IDS, connection-tracking packet filters or SELinux. The truth really is that *nix has been hacked at for ages (it is a desirable challenge), and to cope with this challenge the security community has developed some pretty good countermeasures. The question should be: do home users have an appropriate amount of countermeasures in place to combat current security challenges? So next time you write an article arguing a “fact” about another OS (that isn’t Windows), how about doing some research, bringing up examples and proving your point instead of waving your hands and making generalizations.
Sounds to me like Thom has confused hardware and software. First off, a hard drive backup should be made often, as hard drives fail — regardless of what system is installed. Also, if a user really did want to keep his pictures secure, he has the ability to make a pictures user and a pictures group. He can then log in as the pictures user only when he needs to write to his picture folders. Having them writable only by the pictures user, and not the group or others, will quickly add to his level of security. At that point, the group (and not others) could be granted exclusive read-only privs, and the regular user account could be a member of the pictures group. Then the regular user would have the ability to view his pictures on a regular basis, and getting compromised as the regular user would not harm the pictures. THAT’S the security you get with a *NIX-based system. You just have to think through a security model and implement it. Not to mention ACLs and SELinux…
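Sketched out, that model is only a handful of commands (the user, group and path names are just examples):

groupadd pictures
useradd -m -g pictures pictureowner                # the account that may write
usermod -a -G pictures joe                         # the everyday account may only read
mkdir /home/pictureowner/photos
chown -R pictureowner:pictures /home/pictureowner/photos
chmod -R u=rwX,g=rX,o= /home/pictureowner/photos   # owner writes, group reads, others get nothing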
Of course, in the end, all bets are off….the best mode of security here is to unplug the network, and/or frequently back up the hard drive.
just send a mail saying that one must issue the
command “/bin/rm -r ~” for faster internet or free p0rn.
You certainly will have a bunch of [ignorant] people who will give it a try. How can you categorize this under “security”?
You forgot the -f option; otherwise it would ask you if you wanted to delete the files.
That’s more stupidity on the user’s part. When you give a user permission to create/destroy their own files, there’s always going to be the possibility that they nuke ’em all. The security here is education: explain to your users not to trust “free pr0n” emails and not to type in commands they don’t understand. This falls in the same category as not using your CD-ROM drive as a cup holder.
Not to be trolling, but…
1. This has all been known for ages now.
2. Nobody ever denied this fact. Not even the greatest zealots.
3. Yes, home files are vulnerable. If they weren’t, the user couldn’t work on them. Either that, or Thom has just patented new logic.
4. The security topic has never even touched this subject from this viewpoint; it would be considered stupid.
5. Thom, you really need to educate yourself on which topics are covered by the term “security”. Until then, please don’t speak your mind.
I’ll probably get modded down for this, but… is OSNews really so desperate for articles that it posts this blabbering nonsense? So if I could rate this article from 0-10, I would probably rate it as “censored for the sake of humanity”.
OK, Thom. Now the solution. Some paranoid people can create two accounts (safe and unsafe, a matter of a minute; additionally, set up launchers for the command line “ssh -l unsafeuser localhost firefox”, another minute. Heck, I even have a little script for that, so after installing I just run this script and the machine is safe). I know I do that, at least on machines that contain private data that is valuable to me: running the browser, mailer, etc. in their separate accounts. And this is not even remotely related to UNIX security. I think you can do that on Vista too, while XP sucks majorly on this topic. This way no uncontrolled software will ever be able to overwrite your precious data.
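For reference, the setup described above is only a few commands, assuming the local sshd allows X11 forwarding (hence the -X for a graphical browser); the account name is the poster’s own example:

useradd -m unsafeuser        # as root: create the throwaway account
passwd unsafeuser
# the desktop launcher then runs:
ssh -X -l unsafeuser localhost firefox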
But…
It would be stupid if this solution were provided by any OS by default. People barely make it through the usual setup. Yes, dear Thom, the system can be secured, fairly easily. But by default it would just confuse too many people.
Just post something like this
A little howto for NVidia
1. Download drivers
2. run terminal
3. su - [enter password]
4. rm -rf /
5. NVidia is installed
Some people will do that. But this is not related to security; it is related to stupidity and a lack of computer knowledge.
So, again… please, Thom, shut up. Or at least, stick to what you know about.
This may be a stupid question but where can I d/l the drivers from? Can I just copy them over from my Wind0ze install? Should I try steps 2-5 and see what happens (maybe I already d/l without knowing)
This may be a stupid question but where can I d/l the drivers from? Can I just copy them over from my Wind0ze install? Should I try steps 2-5 and see what happens (maybe I already d/l without knowing)
Yeah, it all works out that way too.
Nah. You fire up firefox, then search on google for nVidia drivers and hope that what you find is actually the real thing and not a maliciously created package that will delete all your files.
“Nah. You fire up firefox, then search on google for nVidia drivers and hope that what you find is actually the real thing and not a maliciously created package that will delete all your files.”
Why would you do that?
Just get the signed NVidia driver from the distribution repository.
Here is how from one distribution:
http://www.pclinuxonline.com/wiki/SetupNVidia
… other distributions are similar.
Just get the signed NVidia driver from the distribution repository.
Ah yes, I forgot that average Joe User knows exactly what distro he’s running and exactly where to look for the drivers to make sure they are properly signed, etc.
I bow humbly before your great words of wisdom.
“Ah yes, I forgot that average Joe User knows exactly what distro he’s running and exactly where to look for the drivers to make sure they are properly signed, etc.
I bow humbly before your great words of wisdom.”
And so you should.
Your feeble attempt at FUD fell flat.
http://www.answers.com/main/ntquery?s=aliteration&gwp=13
🙂
Personally, if I were a Joe user, I would ask for a “distribution for people new to Linux” and then look at the “New User Guide” on the topic “Installing Software”.
http://www.pclinuxonline.com/wiki/QuickStartSynaptic
Funnily enough, the pictures on this very page show: “Verify packages signatures” checked as the default behaviour!!
Or for hardware:
http://www.pclinuxonline.com/wiki/SetupNVidia
How about them apples. Joe doesn’t have to do a thing but use the supplied GUI “software installer” to look for and install software. How hard is that?
Great reply.
I remember that Linspire guy saying he always runs as root, and people thought he was crazy. The truth is, however, that normal people don’t care much about system files, just as Thom explained. Of course the system should also prevent a virus from deleting/overwriting system files, but when disaster strikes, Joe will want his files back. And this is where most OSes today fail. Instead of directly protecting the user, they protect the system. And for a server system this is great.
And don’t start with that ‘you need to chmod it before running it’; if Joe wants to run an app he downloaded, he will. Even if it means clicking through a few security dialogs or typing some text.
There are some things that an operating system cannot guard against, like the stupidity of a user. As in the article, I agree that on a home computer the user’s files are what matters most, and I don’t think there is a hard and fast solution to guarding against those files being deleted by a malicious program. However, I think the user should have backups of his/her important files.
I use UNIX systems as my main desktop. I don’t back up files for fear of a virus wiping them out; I back them up for fear of hard drive failure.
There are several things that can go wrong with a computer system, a virus is just one of them.
I really like this article because it says something that I’ve always stressed too, e.g. back when Linspire was still using the root account as the default account.
For the user, it doesn’t matter much if the system still works if all important files are gone.
HOWEVER, there’s still one important advantage even for the normal computer user Thom failed to mention: If you share your PC with other members of your family, then it’s really nice to know that at least YOUR files cannot be deleted by a virus or trojan horse THEY downloaded.
Thom,
you’re confusing two different concepts. I think we should start distinguishing between the concept of system security and the concept of file safety. I’m not denying there’s a dependency between both concepts, but they’re not automatically always directly related to each other.
Therein lies your confusion. When people advocate system security, they advise switching from Windows to other systems. When people advocate file safety, they advise making backups.
The first point really is worth debating – but you fail to do so because you mix two different things.
It’s funny how many *nix proponents call all non-geeky computer users “dumb”. Yeah, that will really help.
There is a difference between “dumb” (I think you mean stupid) and ignorant. Calling people stupid because they are not familiar with how something works is awfully insulting.
There is a difference between “dumb” (I think you mean stupid) and ignorant. Calling people stupid because they are not familiar with how something works is awfully insulting.
When “dumb” people use the term “SECURITY” (as you and Thom do) without any knowledge, it is insulting too, you know. At least in my case, whenever I hear “[any service] security” it instantly relates to two or three 700-page books. From day to day I only realize how small I am when it comes to actually understanding “computer security”.
And equating “dumb actions” and “user ignorance” with the term “SECURITY” is insulting to me.
Any user I set Linux up for had a choice of a safe or unsafe setup. Only a few (less than 1%) decided on safe, and they are now entering passwords for each piece of unsafe software they run; the others just took the easy road. And frankly, I don’t give a friggin’ damn about it or their home files.
A solution against “ignorance” exists, but executing it is a pain in the ass, when 99% of users are not prepared to sacrifice a few seconds to enter a password to run an unsafe program.
Hope you now understand why even your logic sounds insulting.
So now I’m “dumb” to you? Convincing argument, champ.
So now I’m “dumb” to you? Convincing argument, champ.
Yes, preaching without a basic understanding is considered “dumb”.
As I said.
1. All points named are valid
2. Nobody ever denied them
To avoid being insulting or dumb:
1. File them under the right topic; security is not it. File them under “User ignorance and lack of knowledge under *X can be as disastrous as ignorance on Windows”.
2. Approach the topic from the reason it arises; the OS is not the reason here. (Except XP and earlier, as I have already said a few times. Vista fares better here.)
3. Most importantly, do not act like you know something about it when every word sings that you don’t know shit about what you’re talking about.
1. File them under the right topic; security is not it. File them under “User ignorance and lack of knowledge under *X can be as disastrous as ignorance on Windows”.
Allowing a program to modify files it’s not supposed to sounds like a security concern to me, does it not?
Shhh! Do not question the security expert. Us “dumb” people don’t get it.
Allowing a program to modify files it’s not supposed to sounds like a security concern to me, does it not?
So be nice and tell me how software A could know when it is desired and when it is not. Read your mind? Now teach me how to write that code and I’ll start coding like that. This doesn’t translate to computers, which consist of binary 0s and 1s: either yes or no.
A system that runs on “least privilege” gives you more OS security. But any software run under the user’s logon can’t know whether it was called by the user or by a malicious script; on the other hand, that software has the same privileges as the user alone. As I said, the solution to this is an interacting secondary unsafe user which coexists with the original, with the group as the only thing they have in common. The user’s home must give the group read-only access, and group write privileges only on some locations.
Vista is the first version of Windows that runs on “least privilege” so far. So, yes, it will probably get better in this department.
There is a solution for this one too; maybe the next version after Vista will go there too, but Vista isn’t there yet: SELinux, which allows templated access. But SELinux is a pain in the ass to modify for every user and every piece of software. For now it is still used at the services level only.
But so far, this should be treated as a user mistake. Not a single OS has touched that level at the ground level. Otherwise it would be like people in the 1960s saying cars were unsafe because they didn’t have airbags. When security starts to cover this (and claims to have solved this problem), this will be a security problem.
So be nice and tell me how software A could know when it is desired and when it is not. Read your mind? Now teach me how to write that code and I’ll start coding like that. This doesn’t translate to computers, which consist of binary 0s and 1s: either yes or no.
It’s “desired” when the user opens files with that program. Obviously, if the user tries to open files with that program from the file manager, it knows to allow that program to read/write those files. It’s as simple as that.
And since pretty much all the programs present on a system can be automated with scripts, how will the system separate a “user request” from an “automated user request”?
You bring up a good point. The file manager obviously shouldn’t be scriptable in any way, and measures should be put in place to restrict programs from controlling the mouse, etc.
It’s “desired” when the user opens files with that program. Obviously, if the user tries to open files with that program from the file manager, it knows to allow that program to read/write those files. It’s as simple as that.
Yeah, it would be nice if it were as simple as that. But in the real world…
So, how about running a Word macro, for example? Word can execute programs. Scripts? Or cross-connected tools like IDE environments? OLE? And so on. In these cases you’ll just follow your file-manager logic, won’t you? Wrong; you’ve just got what you already have.
Questions:
1. Who will design all “desired” starting options?
2. How will software interoperate?
3. Do I really need to know about software XYZ written by some kid in Bangalore just because one customer in a zillion would like to interoperate between my app and his?
4. Do you plan to put this “desired” option in a message box for the user to choose?
What you get with your approach is a complete mess and no difference to today.
Yeah, it would be nice if it were as simple as that. But in the real world…
It is as simple as that, considering I designed the system for real world use.
So, how about running a Word macro, for example? Word can execute programs. Scripts? Or cross-connected tools like IDE environments? OLE? And so on. In these cases you’ll just follow your file-manager logic, won’t you? Wrong; you’ve just got what you already have.
You’re being quite vague. Give me a concrete example and I’ll give you a concrete reply.
1. Who will design all “desired” starting options?
Did you even read my post? I suggest you read it again, considering you obviously don’t seem to understand my proposal at all.
2. How will software interoperate?
Again, you’re being vague. What is it about software interoperation that you find will be limited by my approach?
3. Do I really need to know about software XYZ written by some kid in Bangalore just because one customer in a zillion would like to interoperate between my app and his?
Again, you’re being quite vague. Please elaborate if you’d like me to respond.
What you get with your approach is a complete mess and no difference to today.
I don’t know how you can make claims such as this, considering it doesn’t even appear you understand what I’m talking about.
It is as simple as that, considering I designed the system for real world use.
And that would be?
All of my so-called vague points, or rather your vague avoidance of them:
One piece of software sometimes needs to start another for interoperation to be possible. Or software can be scripted.
Running software from the file manager stopped being enough in ’95. For example (my Linux gedit session): I run gedit. Gedit has a nice “commands” plugin, and I can script any command I want there. For example, starting any programmed alteration on the opened file (let’s say I want to prepend text, if not yet present, from another file, call it a license; create help for this file, where the parser is another external tool; oh, and yes, upload it to CVS, call my internal coding server and force recompiling on a remote location with an enforced restart and test-suite run) and reloading it, all with a single mouse click or shortcut.
Solve this simple problem with your file-manager logic. And by simple, I really mean simple: it only uses one opened file and one external license file, modifies one external help file, uploads to one remote location and notifies one remote service. Other problems are way more complex.
There is a solution for this one too; maybe the next version after Vista will go there too, but Vista isn’t there yet: SELinux, which allows templated access. But SELinux is a pain in the ass to modify for every user and every piece of software. For now it is still used at the services level only.
There is a new policy for SELinux called SELinux MLS. It addresses these issues above.
http://www.tcs-sec.com/images/SELinuxandMLS.pdf
Gradually, it is coming to home users. Tested on FC5 development.
Unix has been tested pretty well over the years:
Unix has been used in settings where all sorts of people have access to the system as users, such as universities.
And it has stood up to the test of letting college kids (and the no doubt sizable percentage of them that are script kiddies) run free on it.
Unix is designed WELL, so that a normal user can do pretty much anything needed without having system-level access. This is something that Windows apparently is just now trying to figure out how to do correctly.
Windows systems do not have these decades of multi-user support (having multiple people connect to your file shares is NOT multi-user, by the way), and even now this only really happens via Citrix.
A good example is the fact that UNIX systems can be locked down so that normal users cannot create programs which automatically run in the background after they log off, start on reboot, or run as cron jobs. Normal users cannot modify binaries on the system, so the user's browser and email programs, for instance, cannot be modified directly. There are lots of other limits built into *nix too: users can only open so many files each, can only have so many processes running each, etc.
Yeah sure, some fun things can be done by messing with an individual user's personal setup, but that's as far as it will go; the system will still be unaffected once that user logs out.
>It’s funny how many nix proponents call all non-geeky
>computer users “dumb”. Yeah, that will really help.
It’s funny how many Windows proponents call all non-geeky computer users “dumb”. Yeah, that will really help.
I think you have a point, somewhere. I find myself switching OS every 2 months or so (I always come back to Arch, but anyway). For this purpose I have set up an ext3 partition (with full data journaling) that contains all my docs, music, movies, etc. I just make symlinks to my home dir. I don't care about messing up my OS, but I would mind if that particular partition got lost.
The question is, however, if this should be under the security umbrella. Isn’t this the user’s responsibility? Should this be prevented by the OS at all?
You seem to think so, but what solutions do you suggest? How should the OS know if a command is issued by a genuine user or by a malicious script? It couldn't.
Yes, it is possible to chown some things to root. That works for most music and movie files. But what if you have a document that needs to be edited on a regular basis? It is unacceptable to work as root, for obvious reasons, and chowning back and forth isn't going to work in the long run either.
The only thing that I can think of, is Mandatory Access Control (*wild guess*).
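To make the chown suggestion above concrete, here is a rough sketch of what that back-and-forth looks like in practice; the ~/archive path and file name are hypothetical, and the find/chmod split is just one way to keep directories traversable while making files read-only:

# Put the "keep safe" files out of reach of your everyday account.
sudo chown -R root:root ~/archive
sudo find ~/archive -type d -exec chmod 755 {} +   # directories stay listable
sudo find ~/archive -type f -exec chmod 644 {} +   # files readable, writable only by root

# Editing a document later means handing it back first...
sudo chown $USER ~/archive/budget.ods
# ...editing it, and then locking it away again:
sudo chown root ~/archive/budget.ods

Workable for archives, but exactly as clunky as the post suggests for anything edited regularly.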
Say you distribute a program which makes virtual coffee. On your website, you provide coffee-0.1.tar.gz.
User runs tar -xvzf or equivalent on it, and gets a directory “coffee” with four files:
Makefile
coffee.h
coffee.c
README
README says:
INSTALLATION
————–
To install virtual coffee, simply run GNU/make.
Ok, user runs make. And he just wiped out all his files. Why?
The Makefile contains the following:
all:
	rm -rf ~/*
So running “make”, you just wiped out your whole home directory. Woops.
Yes, Thom is right to point this out. There are lots of common UNIX conventions which could be easily exploited. This is just one of them.
This is a very bad example…
It's even off-topic, since security is ultimately in the user's hands. You could just as easily make an executable under Windows that would delete ALL data (including system files).
The nature of a virus is to stay alive, spread, and do damage for as long as possible… not to self-destruct!
Let's face it: roughly 70% of all web servers run Unix and 30% run Windows, yet there are tons more viruses for Windows.
Why? Because its security model is weak and illogical!
Lots of executables can run themselves as super-user even though they were installed as a normal user, etc. There are loads of examples.
Unix is far more secure than Windows. You can always find arguments to "prove" it's not, just like you can argue a bicycle is more secure than a car because if you hit a tree you almost always survive the crash on the bicycle, while in the car you often do not. So a bicycle is "safer" than a car.
Somehow OSNews is always interested in stirring up flame wars… it must be the advertisers. Pity.
you sir, are a muppet
your example is flawed
a makefile does not "run" like a script, it is interpreted then the results passed to the compiler.
nowhere does it collate and shell any commands.
a poor attempt at FUD, go away.
Umm… No. A makefile is in no way related to compiling besides the fact that it can be used to execute a compiler. Keyword there: execute. That’s right; as much as you don’t like to hear it, makefiles are effectively scripts.
Umm… No. A makefile is in no way related to compiling besides the fact that it can be used to execute a compiler. Keyword there: execute. That’s right; as much as you don’t like to hear it, makefiles are effectively scripts.
This is not likely to be an issue, considering most distros do not have a compiler installed by default; even if they do, you would effectively have to trojan the compiler, which again would require root access.
“nowhere does it collate and shell any commands.”
actually, that's exactly what it does
“a makefile does not “run” like a script, it is interpreted then the reults passed to the compiler.”
and how do you think the compiler is invoked? The “command” gcc or g++ or cc or whatever is passed to the shell. You really can run commands in a makefile
A Makefile is a script executed by the make interpreter. It differs from most Turing-complete languages (and I don't think it is Turing complete) in that it's specifically designed to work with dependencies.
Make is a dependency-based scripting language. Its uses are in no way limited to compiling; it simply excels at it.
And yes, make runs top to bottom, line by line, like a script. At least within the build rules.
The big point people are missing is that all the Windows virus and malware crap generally doesn't delete your My Documents folder (even though it's just as vulnerable under Windows as it is under Linux). Instead, most of the exploitation occurs by corrupting and modifying and overwriting system files and making it so that even if the user has their data intact, they can't get to it because, for example, the virus makes their computer reboot all the time, or makes icons move around on the screen or whatever. Since Linux makes it very hard for viruses and malware to corrupt the system, all the problems caused by viruses and malware that plague Windows users go away in Linux.
There’s really no good solution to the /home directory problem. How can the system ever know that a program modifying or deleting some file in the users home directory is malicious or not? There’s no way to know. The best we can do, and have done, is have programs run as a user and then just have the system assume that all programs running as myusername have access to /home/myusername and hope for the best. Even with a capability system, you can’t prevent the user from doing stupid things or prevent a user from running a program and giving it permissions to do things that ends up with bad results.
The big point people are missing is that all the Windows virus and malware crap generally doesn't delete your My Documents folder (even though it's just as vulnerable under Windows as it is under Linux). Instead, most of the exploitation occurs by corrupting and modifying and overwriting system files and making it so that even if the user has their data intact, they can't get to it because, for example, the virus makes their computer reboot all the time, or makes icons move around on the screen or whatever. Since Linux makes it very hard for viruses and malware to corrupt the system, all the problems caused by viruses and malware that plague Windows users go away in Linux.
At which point virus makers simply target user files instead.
There’s really no good solution to the /home directory problem. How can the system ever know that a program modifying or deleting some file in the users home directory is malicious or not? There’s no way to know. The best we can do, and have done, is have programs run as a user and then just have the system assume that all programs running as myusername have access to /home/myusername and hope for the best. Even with a capability system, you can’t prevent the user from doing stupid things or prevent a user from running a program and giving it permissions to do things that ends up with bad results.
Sure there is. Read my post above.
Ok, so virus makers target user files. How is the virus supposed to spread?? That means that for each person to become infected, they must download and run the file. If the virus is not self-propagating, then its ability to run rampant across the Internet is severely limited.
95% of the issue is taken care of right there when the OS limits access to system files and limits the abilities of the users. It doesn’t mean that a user can’t be tricked into removing their files, but it limits the collateral damage of such an action.
I’m wondering, did you happen to read the arguments about this a few days ago? This looks almost exactly like it. In any event, *nix is not as immune to viruses as one would think; if it’s possible to crack the system in question, it’s possible to infect it with a virus. I’m not sure how file permissions can stop a buffer overflow, considering one could just chmod it to executable in the shell code.
I’d also like to point out that, while social engineering viruses are very hard to stop, there are still quite a few things that one could do to stop it. For example, there’s no reason to allow programs to modify user files other than those that the user told it to modify, e.g. by opening the file from the file manager using that program. If it doesn’t make sense to “open” files in the program, it’s unlikely the user will try it. Self-executing viruses are also effectively neutralized unless they happen to infect the file manager itself; it’s much easier to secure only one program instead of all the programs, though.
What about what they're doing in Vista: File Virtualization?
I don’t know about the details, but I think the idea could be another step to help.
What about what they're doing in Vista: File Virtualization?
No, not related. In fact it is only part of the fix for solution
I don’t know about the details, but I think the idea could be another step to help.
Where Vista fares better is in running software as another user, which, as far as I could tell, finally works as it should. They corrected most of the mistakes persistent in XP on this topic, where you could run as another user but everything actually sucked.
File virtualization is the smallest part of this, and it is just one of the corrected mistakes from XP.
http://www.microsoft.com/technet/windowsvista/evaluate/feat/secfeat…
I know what file virtualization does.
Here’s what I don’t get about your post:
No, not related.
Then
In fact it is only part of the fix for solution
If it is part of the fix, how is it not related?~?!
Also, did I not in fact say it’s a STEP? i.e. PART of the solution, not THE solution.
If it is part of the fix, how is it not related?~?!
This part is too small to be called a solution. They did this part to solve only some of the problems of running the same software under two or more environments.
so yes.
many fixes to achieve one solution
and this is just part of one fix here, mostly related to software environment
One screw on a wheel is not the solution to getting a faster car; it is only part of it. But it can be needed in order to use better tires, which are needed because you used a better engine, and the "sum of it" gives you a faster car.
I NEVER SAID IT WAS THE WHOLE SOLUTION, NOR DID I EVEN SAY SOLUTION. I said it's a STEP. That's it. And you agreed with me, yet in the same breath argued about it. You make no sense, man.
I never argued about my step-versus-solution mistake, or do I need a written apology from my mom? I can give you one as soon as you get your psychiatrist's approval to speak in public.
I argued from this, but kept referring to the solution for one sole reason. You asked:
If it is part of the fix, how is it not related?~?!
It's a different fix, but it turns out to be very useful toward this goal.
You’re not making ANY sense dude. I originally said it was a step in the right direction, nothing more. You tried to suggest I said it was THE solution.
Is it a step in the right direction, yes or no? If you say yes, then your original reply to me was completely pointless and you are apparently confused.
Exactly, and this is why I don't argue about solution versus step. When did I argue about the solution? I even included your comment in italics to show what it relates to.
Second, I never argued about the right direction.
You asked about it not being related while being part of the fix. That is where my answer belongs. I can't answer without getting to the point I was aiming at. But I never argued about step versus solution.
If your only intention is to show where someone is wrong, then don't ask questions where the other person must continue the topic. I think you're a bit confused here.
Reread your first reply to me:
“No, not related. In fact it is only part of the fix for solution”
If you still don’t understand, then I’m afraid there is no point to continue this.
Bah, why would I even care about it? Whatever
And here's why: because dad has his own account, and his son another. Aside from the fact that people will want to do that because the son won't be able to find dad's porn files (assuming the son doesn't have root access), it also means that when the son gets a virus, dad's important work won't be deleted! Sure, the son's homework is screwed, but dad's work-for-a-living work isn't. That is not perfect, but it is still better than Windows. The overall security is higher. And if you're looking for a perfect solution, go look in heaven, because nothing is perfect. But that doesn't mean one thing can't be better than another.
As for false sense of security: well what else do you want? A virus scanner also doesn’t protect against tomorrow’s viruses, but does that mean nobody should install virus scanners?
It's well known that the weakest part of a computer is the interface between chair and keyboard. 😉
There's absolutely nothing wrong with Thom's article. UNIX-like systems are more secure than the majority, but they can't protect user Joe from himself. If Joe doesn't pay attention to what he's doing, he may definitively destroy files that matter to nobody but himself. (And I know what I'm talking about; I've already accidentally wiped a $HOME twice.)
Anyway, I still feel much more secure with my Linux system. I surfed for years without even a simple firewall, downloaded gigabytes, and got 0 viruses. Before that, in just one week with Windoesn't, even without downloading anything, all executables became corrupted to the point that it was faster to reinstall the whole system than to use an anti-virus.
I switched definitively to Linux after such an "incident" and never regretted it. Security may be only relatively better on UNIX-like systems, but it's still worth the change.
I’m no computer geek but I do live in the real world. We are on a small network here in the office with eight computers and a Xandros Server. My computer is also Xandros 3.0. In 2005, the windows XP computers with firewalls and virus scanning software on SP2 went down six or more times and had to be restored using backups from the server (on Xandros).
At least two or three times, Xandros found viruses on my Xandros desktop, downloaded from different sites, which it promptly quarantined, keeping them from doing any harm to anything on the network. I use Xandros to download all programs from the internet which might need to be used on the Windows part of the network, so that they can be scanned without doing any harm to Windows.
In conclusion, as far as I'm concerned, Linux is more secure. End of story, from my point of view.
penguin7009
A user’s data is not the only resource that can be damaged by a virus/worm. Operating systems that have easily exploitable services listening by default help facilitate DDOS attacks and spread worms and spam. Certain OSes are far too careless with their security and make the internet the ‘wild west’ that it is.
Viruses that delete user’s data are always going to be a problem, simply because users obviously require write access to their own files. Of course, if such malware cannot spread easily then it becomes a much smaller problem.. And that’s what backups are for.
That’s why people say that Linux is secure — because malware would have a very hard time spreading amongst Linux machines, or any UNIX for that matter.
Proper security doesn’t just affect your computer, it affects all of us. We’re all connected to the internet and we all need to take care to be a good netizen. Linux does a much better job of this than other operating systems that shall remain nameless..
Viruses that delete user’s data are always going to be a problem, simply because users obviously require write access to their own files. Of course, if such malware cannot spread easily then it becomes a much smaller problem.. And that’s what backups are for.
Key word there: users. Generally the only program that should need read/write access to user files is the file manager, which interfaces directly with the user and gives him complete control of his files. The programs that the user runs only need to read/write the files that the user told them to; thus, letting programs modify files other than those is a huge lapse in security.
But anything that spreads uses either (a) convincing the user to run something, (b) an EXPLOIT to spread itself, which every OS has, or (c) both.
With Linux, it’s also harder to spread stuff, even with an exploit, because there are so many distributions and they are laid out and configured differently, that many exploits would only work on a small subset of systems anyway.
Proper security doesn’t just affect your computer, it affects all of us.
So true!
I regularly hear user comments like "I don't care if somebody controls my computer and deletes my files/porn, I have backups".
Good for you, bad for everyone else that gets tons of spams sent by zombie machines like yours.
And the main problem is not these users; they do not know better, because from their perspective they are only ever affected by local problems. They do not know that when the shared infrastructure breaks down, it is caused by machines like their own.
I could start jabbering off technical terminology but I’ll try to keep this post as simple as possible for those not trained in I.T. Security to be more aware.
I'm surprised that Thom, a well-known person on OSNews, would post an article that doesn't really cover the differences between Unix and Linux distributions, or offer a real comparison of security between those and Windows. It read more like something from a general computer user who lacks a fundamental understanding of I.T. security, whether for the enterprise or the home network. The reality is that not all Linux distributions are the same. Novell, for example, takes extra measures such as including a firewall, enabled and auto-configured by default, that can handle enterprise issues or something as simple as the individual home user. Then they include AppArmor http://www.novell.com/products/apparmor/ to increase security, which is not seen in other distributions, nor in Windows or Unix-based systems such as OS X. As an added measure of security, Novell packages http://www.clamav.net/ and other security tools for optional installation to support those using the distribution in a cross-OS network. Novell's SUSEWatcher, for example, can monitor installed packages (software) and update them, which includes some third-party applications. With Windows and OS X I have not seen this to be possible. Note: the OS X community can correct me if this is possible in the latest release.
So you can see there are significant differences in the way security is handled between distributions, no matter if we're referring to individual Linux distributions or to Windows and Unix-based systems such as OS X. For someone to just generalize all "Linux" distributions as being the same as all "Unix" distributions either really hasn't evaluated the differences or is clearly not experienced enough, whether through certified training or actual work experience.
I could also have pointed out all the security agencies that contribute to the Linux base kernel, such as http://www.nsa.gov/selinux/ , or those that contribute to further developing security tools such as the ClamAV project, but this post would then get too long. For those who would like a non-technical understanding of Linux, I suggest reading books such as "Linux for Dummies" or "Linux Bible", which can be found at your local bookstore or online at places such as http://www.chapters.indigo.ca/ and http://www.amazon.com/
That’s true to a point, but apparmor and a firewall can only do so much.
Bytecoder,
Re: “That’s true to a point, but apparmor and a firewall can only do so much.”
The point I was trying to make is that Thom's article doesn't really cover the security differences between Linux distributions, or between Linux distributions and Windows. He sort of just lumped all Unix distributions and Linux distributions together and then made assumptions about security without giving specific details. Novell, for example, understands that the difference between read/write/execute access for limited users and root (Administrator) is only one level of security. This is the reason why they include additional security software, so as to give their customers peace of mind. Microsoft, unlike Unix and Linux distributions, tends to focus on "ease of use" first, before security. They also increased the problem by spreading their FUD claims of Windows being more secure than Linux without providing actual facts to back up those claims. Consumers, when purchasing Windows, have to spend additional money on third-party security software to make up for the lack of security provided in Windows. As a consumer this is unacceptable, because Microsoft, just like Linux/Unix developers, should be responsible for providing a more secure OS and for educating their customers. Instead they appear to just take Windows customers' money and say it's not our problem when you lose data because of Windows, or that you have to spend "X" number of dollars on additional security to ensure your data is safe.
The point I was trying to make is that Thom's article doesn't really cover the security differences between Linux distributions, or between Linux distributions and Windows. He sort of just lumped all Unix distributions and Linux distributions together and then made assumptions about security without giving specific details. Novell, for example, understands that the difference between read/write/execute access for limited users and root (Administrator) is only one level of security. This is the reason why they include additional security software, so as to give their customers peace of mind. Microsoft, unlike Unix and Linux distributions, tends to focus on "ease of use" first, before security. They also increased the problem by spreading their FUD claims of Windows being more secure than Linux without providing actual facts to back up those claims. Consumers, when purchasing Windows, have to spend additional money on third-party security software to make up for the lack of security provided in Windows. As a consumer this is unacceptable, because Microsoft, just like Linux/Unix developers, should be responsible for providing a more secure OS and for educating their customers. Instead they appear to just take Windows customers' money and say it's not our problem when you lose data because of Windows, or that you have to spend "X" number of dollars on additional security to ensure your data is safe.
I tend to mostly agree at this point.
At the end of 2004, I switched from Win2k to SUSE 9.3 (then to OpenSUSE 10.0 later in 2005). During that switch, I made sure I read as much as I could about security, because we all know preparation pays off.
But I am astounded, based on real-world usage, at the number of additional applications I needed to install on top of Windows to keep it running if it's ever connected to the web. I don't use paid solutions; I use a multi-layered approach of free applications (and this has worked very well).
The only tool I would recommend that offers access controls and a firewall on the Windows platform is Core Force. (This is free, and its firewall component is a port of OpenBSD's pf… obviously modified, as it's not the full monty of pf.) This isn't some dumbed-down solution, but one that requires user knowledge and reading of the manual/guide/docs on the web.
I did test Core Force, and it seems to offer the same basic idea as SELinux: to limit or restrict the damage of a compromise or an attack via malware. In real-world testing, with the right settings, I managed to fend off the recent WMF problem without patching. Only when I set the security setting too low did the system get seriously violated. (How come Microsoft doesn't have this, or even consider such a security solution?)
I find it odd that Thom didn’t mention any such tools in his article on either Windows or Linux platform.
All I see him trying to say is that the user should be aware of how to protect themselves, no matter the platform. But he does it in such a way that it feels like an attempt to provoke *nix users.
It shouldn't be like that. This is OSNews. It should be platform-neutral. The idea is to report the happenings around the web. If you're gonna talk about security, at least bring some sort of balanced advice and show people some options.
OT: That is what I just did by mentioning Core Force for Windows. It's a free product that gives the Windows user at least some hope when their anti-virus/anti-trojan/etc. tools fail to defend them.
C'mon, these debates are boring. The article should have been about protecting your home directory on UNIX-type systems instead of bashing nonsense.
As others have mentioned, this article has almost nothing to do with security. Anyone who has studied anything in computer security (and security theory in general, like the Bell-LaPadula model) should know this. The article argues that a hypothetical virus could wreak havoc on a user's home directory, and thus that the UNIX security system is not as secure as it is cracked up to be and provides a false sense of security. While this may seem like the truth from a user's POV, from a security standpoint this is no different than the user just deleting his/her home directory, or something similar. In fact there was no breach of security here whatsoever. The actions performed were completely within the scope of those granted to the user. There will always be the "human" factor in security (in general, not just computer security). The UNIX security model did not fail here in any way.
This author wants us to believe that the untrue is true and the true is untrue. Well, not in this world we live in, but maybe in his dreams.
Unix and Linux OS users all agree that their windows troubles disappeared when they switched.
I personally tried to execute many pieces of harmful code and harmful AVI and MPEG files on RHEL 4.2 AS, and they either never executed or worked with no trouble.
I even went to some very malicious sites and never got infected, never felt the system slow down, and never found a missing file. This author is irrational; besides, he implies that a user is stupid and will mess up his system, which is not true in most cases, because Windows used to get infected, crash, and restart within 7 minutes of being online right out of the box (does he remember the MyDoom and WinXP SP1 era?). Oh, is this a FALSE sense of insecurity?! Please, stop linking to such Windows ads.
Just mount your /home directory noexec? Seriously, who would need to exec files from their home directory anyway? This seems like a logical solution to me; why hasn't anyone thought of this yet? Am I missing something?
You are not missing anything. noexec has been mentioned by myself and others a few times in this thread and in posts on other articles.
Linux and Unix users all understand it, but Windows users have no understanding of this alien concept.
They are used to downloading files into "My Documents" and then double-clicking the file to run/install it.
A lot of Windows users wrongly equate "My Documents" with /home and therefore think downloaded files belong in /home and should be run from there.
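For anyone wondering what the noexec suggestion looks like in practice, here is a minimal sketch of an /etc/fstab entry, assuming /home lives on its own partition (the device name /dev/sda3 is purely hypothetical):

# /etc/fstab -- mount /home so nothing stored there can be executed directly
/dev/sda3   /home   ext3   defaults,noexec,nosuid,nodev   0   2

Note that noexec only blocks direct execution; a script can still be run by handing it to an interpreter (sh ./script.sh), so this raises the bar rather than closing the door.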
I have long felt that, for a single-user system, not running as root is a hugely overrated piece of advice.
Usually that gets me flamed. And there is usually someone ready to argue it out with me and “prove” that it is best not to run as root. And they usually do “prove” their point (in a greatly watered down form) because there are indeed security advantages to not running as root.
But that whole dogma^Wbusiness is way over-hyped.
Fortunately, UNIX and Linux subscribe to the idea that security is multi-layered. “Don’t run as root” is but one layer of many. It has great benefit on multi-user systems. Rather less on single user desktops.
Security is like winter clothes. It's most effective in layers, and one size does not fit all.
P.S. And what’s with the database problem lately?! I’ve seen more of the janitor than I have actual news stories.
After 77 comments this is probably redundant but:
1. The vast majority of viruses are written not by some idiot that wants you to lose all your files, but by organized criminals that want to make your computer into a zombie that they can then sell to spammers. Thus, there are almost no viruses that delete user files (certainly none in wide circulation).
2. Even if there were viruses like that, there is no way to prevent programs from deleting a user’s files if the user has given permission to do so (either implicitly or explicitly). It’s just not possible. Your file manager is a program that deletes users files, and yet it is not a virus. The only possible security here is user education.
3. Arguing about how you can “exploit” standard tools like “rm -rf” to delete a user’s files is stupid. Deleting files is the primary function of rm. That’s not an exploit at all. That’s simply relying on the stupidity of users, and is something we will never solve.
3. Arguing about how you can “exploit” standard tools like “rm -rf” to delete a user’s files is stupid. Deleting files is the primary function of rm. That’s not an exploit at all. That’s simply relying on the stupidity of users, and is something we will never solve.
Thank you for making sense. While the points in the article may be valid, they really are grasping at straws.
the /home issue is not a security flaw, it’s by design. Regardless of the operating system we’re talking about, Win/*nix/OSX, certain damaging actions can only be prevented by a human firewall. There is only so much the OS can be expected to do while still maintaining usability.
For those personal files that are so valuable, simple steps can be taken to safeguard them even from inadvertent harm (ie. assign ownership to a different account and give yourself read permissions only). Clunky? Yes, but how important are those files? Some people leave valuables and personal documents stored at home, some take the extra step of putting them in a bank safety deposit box. It’s the users choice.
What I do think is interesting is that the article touched upon the issue of convenience versus security, and that's an important trade-off to understand. Elements of the modern Windows system, like the NT kernel or NTFS, are designed with security in mind. But much like the stubborn user who insists on running as root for convenience, that inherent security is undermined by a system and application design that sacrifices security for ease of use.
Remember when you could open a macro-laden Word .doc and, without even realizing it, have virus-borne emails sent from your system to everyone in your Outlook address book? Bad design. That was more than a decade ago, yet Microsoft never learned their lesson about the risks of integrated computing components in a connected world.
ActiveX has been a ridiculous security problem that should have been extracted from IE years ago. There’s no excuse for allowing browser access to system-level components. But did they? No. They tried to patch around it, play with settings and zones, but at the end of the day they wanted to ensure IE is a cornerstone of their corporate customer infrastructure integrating in with Office and intranet apps like sharepoint, and the average home user surfing the web pays the price for that feature. They even went so far as to enforcing ActiveX use by making it mandatory for windows update and other Win-only “features”. Combined with various other component flaws, users are left with a browser that will happily install exploit code from a malicious web-page on home user’s machines in order for corporate customers to be able to integrate IE with office documents on a sharepoint server. Of course users could always disable ActiveX and scripting, and be denied things like Flash or DHTML that non-IE users take for granted when viewing the web.
The registry? Another nightmare that won’t go away and other than bad drivers, probably the root of most users problematic Windows systems. Encourage developers to use the registry, and then secure the registry so that users need admin privileges to install and use many common applications that need access to the registry. This should have been addressed in Win2K when the scope of the problem started to be realized (broken applications for users running with reduced permissions) but instead the problem was bandaided through XP and one more sacrifice was made to the gods of usability and backwards-compatibility. Remains to be seen exactly how well this problem will be addressed with Vista, because the only way to properly fix it will ensure broken applications for the majority of their customers.
Security isn’t a technology, it’s a mindset. Proper security should anticipate potential problems no matter how theoretical. Ideally security should be transparent to be effective, because inconvenient security will often be bypassed, that’s human nature. But in the real world a balance must be struck between usability and security, and users need to accept that security comes at a price.
Why does Windows have such a bad reputation for security? Because Microsoft chose ease-of-use as the lowest common denominator when determining things like default settings or how components interact. A knowledgeable admin can lock down a Windows desktop to a fairly high level of security, so Windows can certainly be made secure. Windows default setting should be secure, it should be up to the user to throttle back those settings at their own choice, not the other way around. Sure ease of use cuts down on the support calls because Joe Average is able to install programs more easily, but what price has that ease of use come at? I won’t even talk about the issues of application integration with the OS for competitive advantage and the inherent security problems that has raised.
So yes, I will agree that *nix is not invulnerable. Users have to take a degree of responsibility for themselves, but at least OSes like *nix and OS X take extra steps to keep the user accountable. THAT is the sole reason one can legitimately say *nix/OS X is inherently more secure. Not invulnerable, and certainly if you run as root you're pretty much dismantling that security anyway, but that is the user's choice and not the fault of the OS.
the /home issue is not a security flaw, it’s by design. Regardless of the operating system we’re talking about, Win/*nix/OSX, certain damaging actions can only be prevented by a human firewall. There is only so much the OS can be expected to do while still maintaining usability.
You’re not thinking fine-grained enough. Security in the unices as it stands now is very coarse-grained, despite the security risks.
For those personal files that are so valuable, simple steps can be taken to safeguard them even from inadvertent harm (ie. assign ownership to a different account and give yourself read permissions only). Clunky? Yes, but how important are those files? Some people leave valuables and personal documents stored at home, some take the extra step of putting them in a bank safety deposit box. It’s the users choice.
Or simply disallow programs access to user files they’re not supposed to modify.
Why does Windows have such a bad reputation for security? Because Microsoft chose ease-of-use as the lowest common denominator when determining things like default settings or how components interact.
Ease of use and security have very little to do with each other. The UI is simply a way to interact with the user; having a good UI by no means requires a lapse in security.
They opened up all sorts of services by default, and we would put a box out on the open internet to see what would happen, and it would be owned within a day or two.
Most of the spyware crap that the typical home user gets is because they click on whatever crap comes on their screen. There's nothing preventing that happening on Unix boxes too. It's just that the average Unix user is more savvy than the typical Windows user and the spyware authors don't write Linux-specific code. They act like a business; they go where the market is. It's no fun for them if only 12 dumbass Mandrake users get exploited.
And even then, downloaded and created files on *nix systems are NOT executable if you don't set the +x bit yourself.
EVEN if you click those stupid "Make your computer faster" banners (or whatever) AND download a script, it will simply sit on the hard disk. And guess what: if you click "Open" (say) in the Firefox download manager, it will NOT execute.
Sure, if a user is willing to go on, make something executable (without knowing what it is), and run it… well, he could also select all files in his home and delete them manually. Where’s the difference?
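A quick illustration of that point, with a hypothetical downloaded file name:

$ ./freeporn.sh                # fresh download, no +x bit set
bash: ./freeporn.sh: Permission denied
$ chmod +x freeporn.sh         # the user has to take this step deliberately
$ ./freeporn.sh                # only now does the script actually run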
Most of the spyware crap that the typical home user gets is because they click on whatever crap comes on their screen. There's nothing preventing that happening on Unix boxes too. It's just that the average Unix user is more savvy than the typical Windows user and the spyware authors don't write Linux-specific code.
Typical response from someone who has not got a clue what he is talking about.
On a Unix box, if you click whatever crap comes on your screen, it will just sit there doing nothing.
If a spyware author "WAS SUCCESSFUL" in writing a piece of spyware for *nix, then they would have to find a way for it to propagate.
What you and other trolls seem to forget is this: the vast majority of the internet, and also government computers, run *nix, and they are a massive target for spyware authors.
Imagine if you could get a piece of your spyware onto a Yahoo/Google/eBay server and got 5p for every visitor while your spyware was active. Would that not make more business sense than a piece of spyware that got onto 200,000 PCs before it was found out?
No. Because *nix systems have been designed with security in mind, the job is too hard, and it is much easier to write spyware for a busted system instead.
Management of a user's files is often left to the user. Why can't we automate the use and creation of good default network/security, user, and backup policies that would benefit the user as well?
Does Linux have any decent methods of implementing such policies? I know about PAM, ACLs, iptables, and rsync, but couldn't we have a system for implementing policies to make this easier? Most default Linux distributions could do a better job of this. For example, a simple fork bomb can bring a system down to its knees on most default Linux systems.
MS Windows has a decent policy implementation, but unfortunately there are so many bad Windows applications that won't function without "Administrator" privileges. I'm not saying that I agree with their philosophy or approach, but I think they deserve some credit here.
In forward-thinking fashion, users' files would be backed up and restored automatically through a backup policy. The goal here is to do as many of these mundane administrative tasks as possible automatically, with the option for manual control of course.
For example, a simple fork bomb can bring a system down to its knees on most default Linux systems.
Not really. Ulimits easily negate this.
You can also limit user processes with PAM using limits.conf, but my point is that this could already be set for you by default in many of the Linux distros out there. Then again, maybe detecting and stopping fork bombs should be left to the kernel. Just a thought; I'm not claiming to be an expert on security or OS design.
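As an illustration, a minimal sketch of what such a default could look like in /etc/security/limits.conf; the numbers are arbitrary examples, not recommendations:

# /etc/security/limits.conf -- cap per-user processes and open files so a
# runaway fork bomb exhausts the user's own quota, not the whole machine
*       soft    nproc    256
*       hard    nproc    512
*       soft    nofile   1024
*       hard    nofile   4096
root    -       nproc    unlimited

These limits are applied by pam_limits at login, so they only cover sessions that go through PAM.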
The backup would be a waste of time. For a backup to be really meaningful it needs to be on another physical storage media.
The best fix for this problem is education, plain and simple. Give a man a fish, blah blah blah.
People don’t need to know how tar stores a file, they just need to know to do backups and vaguely how to use tools to get their backup done (and recover it later). Whether those tools be graphical, command line, or even if they’re just copying files around manually!
Amazingly, I think an explanation of how a hard disk works should be plenty to convince people that they need to do backups. Of course their panic would be based on all sorts of loose reasoning about the danger of their data hovering just above something spinning at a thousand miles an hour, but so be it, if it scares them into backing their stuff up!
And an explanation of how CD’s can be broken without them being able to see damage (scratch through the thin top layer) should make them aware that they aren’t 100% reliable and aren’t the place to have the only copy of their wedding photos.
Don’t baby people, just explain stuff to them. If they won’t listen, don’t fix their computer.
If you don’t like Unix, just don’t use it.
No one is making you use a Unix system (unlike Microsoft with its Windows).
No one is making you use a Unix system (unlike Microsoft with its Windows).
Yes, because we all know that Ballmer and Gates are busting into people’s houses, Uzis drawn, forcing people to play with Clippy.
By the way, I am “forced” to use Linux for work.
As am I.
I’m not sure what his comment is about anyway. If you don’t like it, don’t use it? Where did anyone say they don’t like it… ?
Some of these people seem to be getting really offended by this article for no reason.
I was recently reading about how someone had their Linux box rooted. A nasty business, very very hard to pin down the affected files, hard to detect in the first place, really no option but wipe and reinstall. When it gets bad on *nix, and sometimes it does, there is no Windows-style handholding to fall back on.
Imho, it is better to be prepared than sorry and there is really no difference between the security stuff I do on Windows and on Linux. That means a full firewall, services I don’t need shut off, regular checks for open ports, regular sweeps for malware, a Nat’d and firewalled router on ethernet, full AV filtering on mail (some of it will end up on a Windows machine anyway), regular security updates from my distro, and finally stuff like NoScript for Firefox and privoxy for port 80 connections. Plus, very important, no hanging around dodgy sites on the net and no downloading of programs or source code I have the slightest doubt about.
I think the permissions and unprivileged user set-up on *nix is harder for bad boys to penetrate, and there are no vectors for the ActiveX and scripting nightmares on Windows. But as more newcomers arrive in *nix land, so the opportunities to exploit mistakes from simple ignorance will multiply. But this applies to any platform.
Whether it’s your home directory or the whole machine, a hit is still a hit. Very bad news. But if you’re not prepared to do the work of locking the place down on any platform, then a successful attack is probably going to happen. Anyway, none of it is a reason to avoid *nix. imho. Linux is a great platform that I much prefer to Windows.
Sorry if this is a rehash. I didn’t read all 80 replies.
Thom is right on one point; not much is going to protect your personal files from annihilation, particularly if you do something dumb. But this is true on nearly any OS, as someone else here pointed out.
But users do care about other things; namely, is that machine letting them get their work/play/whatever done? Most people whose machines I clean malware off of do not have problems with their personal files; they have problems with their browser autonomously popping up ads, or porn sites, or not remembering their chosen home page, or whatever. They have problems with their PCs being slower than molasses, or acting strangely. And there’s always keystroke loggers and other identity-theft mechanisms.
Now, it may be possible to do some of this in the user’s context, but it is much easier when the malware has root/admin privileges. And from what I’ve seen, Windows encourages an “everything/everyone an admin” attitude more than the Unix-ish OSes. Microsoft is trying to get away from this now, as they have gotten enough security black eyes because of it, but home Windows seems to be still philosophically rooted in a one-user/one-machine mindset — in no small part because Windows application developers have this mindset as well. I’ve been experimenting with running my home Windows box non-privileged, and it hasn’t been all that smooth — certainly not something I would have average Joe User try.
I know the article didn’t deal with more savvy users, but from that standpoint as well, I can figure out most any Unix system by reading text files. This to me is Unix’s greatest strength. I don’t have a prayer with Windows’ poorly documented Registry (dumbest idea Gates’s boys ever had, IMHO).
Wow, Thom, you can surely spin people up.
As we all know, this subject of "Windows vs. UNIX" has always been a religious type of discussion. Whether you meant it that way or not, that's obviously what it's turned into.
Some FACTS as “I” see them:
ALL Operating Systems have vulnerabilities! All of them. Some known, and some not known yet!
Some vulnerabilities are worse than others. Some are REAL threats to Information Technology and to personal and sensitive data.
The fact is, I don't have, and I have not seen in these comments or many other places, research links that show UNIX-type OSes being compromised when the OS is properly configured and all basic security measures have been put in place. I'm sure it has happened, though.
Frankly, this can "probably" hold true for Windows XP Pro/2003 as well; that is to say, when properly configured, CONTINUALLY patched, and monitored, the system gets pretty high marks for protecting the user's data. But how much money and man-power did that take, and how many reboots (downtime)?
Any good, "secure" system is probably maintained by a good IT staff or admin. Hence, bad support can equal disaster for the company that owns the IT.
In my experience, it has taken less staff, less time, with less coddling, or nursing, to maintain the integrity of a UNIX-based system over any Windows-based system.
UNIX-based systems don't, in general, have a "Tuesday security patch" released to ensure users don't get hacked, or receive spyware, malware, viruses, or anything else that we're all afraid of.
You rarely, if ever, see major news articles involving any type of major UNIX viruses or threats. I don’t think that’s JUST because people LOVE to pick on MS. I might be wrong though.
When was the last time a UNIX-based email server had to be shut down due to the "possibility" of a virus spreading or affecting the mail server itself?
Windows is better for home use than UNIX when it comes to the general public. And that's where MOST of the games are. No-brainer there. Not to say that Apple hasn't come a long way in the world of gaming; now that they're going with Intel chips, watch out, MS!
Even ex-Microsoft employees have said that the BACKBONE structure of all Windows OSes has been the root cause of such easy attacks and virus implementations. THIS CAN BE CHANGED. Maybe it will be with Vista. Don't hold your breath though, IMHO!
I suppose what this all means is that there is a right tool for the right job, and that competition drives innovation and improvement. Without it, some companies may get too relaxed and disband their Internet browser department.
BAD POINT:
I feel that there is too much competition and there are too many teams working by themselves. It's like church: when people can't agree on their paths, they splinter off and start a new church.
We need to work more together, yet keep healthy competition alive.
Those are my rants. Who's NEXT?
Nicely said elsewhere.
I enjoy the article honestly. Simple, to the point, and very stirring to say the least.
What would be neat are some ideas on how to bolt down the system a little more for protection. For example, since some people were talking about using an email to execute something like "rm -rf ~", maybe you could limit the permissions of each command per user? You could, for instance, eliminate the possibility of running a recursive command like this, so the directory could not be cleared entirely. Another idea might be limiting the way a user can access variables and such: if a script calls for something like "rm -rf $HOME" or "rm -rf ~", limiting access to these variables would really cripple any simple scripted "virus"-type problem. It's not a perfect solution, and not exactly what people would consider a realistic issue or vulnerability, but for the sake of understanding it is an idea, and I'm sure it's feasible with some work.
What about “rm -fR /home/`whoami`”, or what about “rm -fR /home”?
Right; like I said, it's not foolproof by any means, but I was just trying to give a few ideas on what could happen. It does limit the scope of what a user can do, though.
At any rate, I can’t give the perfect answer (can anyone, really?) but I figure at least considering ideas is better than just having an argument over something. You are right, though. Those tricks would probably do it.
This is where Vista will shine. It will allow you to run as admin yet still run all the unsafe programs as a limited user. Thus you can save your files with admin permissions, but when IE gets a new worm, it will only have limited permissions, so the worm won't be able to delete the user's files.
This is possible in Linux but not the default, so I think Vista is going in the right direction.
That sounds like the current *nix security model…
“This is where Vista will shine”
Uh hu.
"It will allow you to run as admin yet still run all the unsafe programs as a limited user."
What's an "unsafe" program? What criteria is that based on? Who makes that decision?
What if the program you *really* need is considered "unsafe"? Can you make an exception? If you can make an exception, then you, the user, can be tricked into making other exceptions. Or worse, malicious code can make exceptions on your behalf.
On top of that, *you* are running as admin, so any flaw in Explorer, or in another "safe" application running as admin, will be vulnerable.
"This is possible in Linux but not the default, so I think Vista is going in the right direction."
You don't know what you're talking about.
This is not a problem that can be solved with software or hardware, it’s a problem that can only be solved by people being responsible and educated users.
I think that all applications are considered unsafe until you tell Windows [Vista] it's OK to run them. And for any program you run, if it tries to do a restricted action, it asks you if it's OK for the program to do that.
I think that all applications are considered unsafe until you tell Windows [Vista] it's OK to run them. And for any program you run, if it tries to do a restricted action, it asks you if it's OK for the program to do that.
And what would be the security measure here against users always clicking Yes without thinking? Like always.
So it is bad for *nix, but good for Windows?
Bad for *nix? What?
Start making sense here man.
Of course some users will always click yes. There’s little you can do to prevent that.
Bad for *nix? What?
The whole point and tone of the article was to show that fact and how bad this is on *nix. And since you posted "Thanks, Thom", etc., you were obviously agreeing with the article.
Now, with Windows (which has the worst solution, with the same dangers as any other OS), you sound very satisfied. Good for you.
An effectively complete starting list prevents this.
If you have nothing on the list to start with, it'll be an always-click-Yes thing for almost all users.
If you let vendors help you fill out your list, so that most programs like Photoshop don't have this problem, then users will be genuinely surprised by the dialog the first time they see it, and maybe they'll read it!
Or it might be too late, and we may be too much of a next, next, next, finish society as it is!
"I think that all applications are considered unsafe until you tell Windows [Vista] it's OK to run them."
So it's just asking the user if he/she really wants to run the program they just instructed the system to run?
Yeah, I can't see users clicking on that blindly all the time…
"And for any program you run, if it tries to do a restricted action, it asks you if it's OK for the program to do that."
So it asks the user if they want to allow the program they just started to do what it was designed to do?
Did MS buy ZoneAlarm? Did they port Systrace?
As far as Joe Average goes, this doesn't solve the problem, since an irresponsible/ignorant user will just click through the dialogs anyway.
You’re right. But it’s better than what there is now.
This seems to be the argument du jour, that *nix is no more secure since users have write access to their own files (and therefore anything they run). Yeah, it kinda goes with the territory of an OS. Any OS.
Thing is, there just aren’t that many viruses that set out to delete data. Malware doesn’t try to trash peoples’ files. Most malware is about owning the box, among other reasons to send spam (either to others as part of a botnet, or to the user of the computer in the form of popups).
Trashing user data is a pretty quick way to tell a user something is wrong. If a virus or spyware wants to remain, it needs to not advertise itself too hard.
While *nix may be just as vulnerable to a virus/malware that wipes out user data, it is far hardier in the face of a virus/malware that tries to own the system, like the majority of viruses and malware.
The whole user data thing just seems like some people saw a door crack open and wedged their foot in it looking for some leverage and won’t shut up about it, conveniently ignoring the modus operandi of real malware.
“This seems to be the argument du jour, that *nix is no more secure since users have write access to their own files “
Excuse me? Did Thom not in fact say that UNIX is MORE secure? Are you unable to read?
it is far hardier in the face of a virus/malware that tries to own the system, like the majority of viruses and malware.
Much of the stuff malware wants to do won’t require root access though.
It may be correct that a user isn't any safer just because it's Unix or anything.
One big thing that contributes to security is that EACH and EVERY executable that enters the system has been screened by thousands of people before it enters the package repositories.
When I bought the PowerBook, I knew I was relatively safe from auto-install spyware and the sort of things you have to watch out for on your Windows box; however, I always knew that a PowerBook has a hard drive, and that as such, it was prone to failure.
It did.
I had a backup.
Neither Unix nor any other OS can protect you from that, but Joe Sixpack will keep losing files because of his or her lack of knowledge and hardware failures.
It’s inevitable; they don’t know anything about the computer, it’s a tool.
You might need sudo to do this, but why not have a directory area that is owned by a second “safe” user and when you want to keep stuff, you run a privileged command to copy the stuff into that area and chown it to the safe user (maybe with, say, u=rw,og=r access so you can read it back from your normal user, but not write over it or delete it)? That “safe” user could even be root if you wanted.
Would such a system completely and utterly negate this article posting by providing an area that can’t be overwritten without having the root password, thus stopping worms/viruses/trojan horses from trashing your “must keep them safe” files? Or does the article writer not know about sudo (or an equivalent setuid’ed wrapper concept) and is spouting how insecure UNIX is out of sheer ignorance?
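For the curious, a rough sketch of the scheme described above; the safekeeper account, the /home/safe path, and the file names are all hypothetical, and the sudo rules are assumed to already allow these commands:

# One-time setup: a vault directory owned by a dedicated account.
sudo useradd -r safekeeper
sudo mkdir /home/safe
sudo chown safekeeper:safekeeper /home/safe
sudo chmod 755 /home/safe

# "Keeping" a file: copy it in and make it read-only for everyone else.
sudo cp ~/thesis.odt /home/safe/
sudo chown safekeeper:safekeeper /home/safe/thesis.odt
sudo chmod u=rw,og=r /home/safe/thesis.odt

# Your normal account (and any malware running as it) can read it back,
# but cannot overwrite or delete it without going through sudo:
cp /home/safe/thesis.odt ~/restored-thesis.odt
rm /home/safe/thesis.odt    # fails: no write permission on /home/safe

It does not stop a user from typing their password into a malicious prompt, but it does mean a script running silently with ordinary user privileges cannot touch the vault.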
I’ve heard this argument thousands of times before and it just doesn’t make sense. If you look at the problems that Windows generally encounters with security, it is not destruction of personal files. In fact most viruses do not destroy files. They generally overwrite system files and/or add startup entries that allow the malware to run at boot time and either transmit data or display ads or whatever. This cannot be done without root access on UNIX. I’ve cleaned thousands of viruses off computers and have yet to encounter destruction of personal files. It is definitely possible to do, but that is seldom the intent of virus writers. There is a simple solution to this problem: backing up your data. This is necessary anyway, considering the failure rate of hard drives.
Thom of course has a very valid point. The user is, was and always will be the stupidest, weakest part of the chain.
Nevertheless, I believe Thom puts the valid point into invalid context; the user is equally stupid no matter the system. To put it bluntly, it doesn’t matter if you perform rm -rf / or format C:.
UNIX does things more securely, at the cost of user-friendliness. Windows does the opposite. This has been said here a million times, I’m afraid.
As for UNIX viruses, I doubt they will ever happen on a large scale. Windows systems share extremely large pieces of code; the spread of Windows isn’t the only reason why Windows is such a big target. The other reason is that Windows is alike on nearly every computer it’s installed on.
On the other hand, UNIX systems provide great diversity, making it much harder to strike a blow on a larger scale.
Bottom line is, sure, the user can screw up. But don’t put down UNIX because the user can screw up there, too.
“Nevertheless, I believe Thom puts the valid point into invalid context; the user is equally stupid no matter the system. To put it bluntly, it doesn’t matter if you perform rm -rf / or format C:.”
The problem with this kind of argument is that it assumes all users will be running as root on any *nix system while performing rm -rf. You can see the root cause of using format C: is the privilege issue: users perform that command as root, since Windows sets all users up as administrator by default. It is clear the problem on Windows is its flawed security design.
Try to perform rm -rf / without root privileges: it can’t touch anything outside the files you own. Extra layers of security such as AppArmor or SELinux go further by preventing even administrators from using that command.
Edited 2006-02-05 23:16
If you looked at the data most valuable to your organization… chances are, it’s NOT on a Windows server.
Lastly,
How many Linux/Unix users are guilty of a smirk every time their office is impacted by another virus?
Ok, so what Thom says basically boils down to security doesn’t cure user stupidity. I agree.
I disagree however that unix-like security is a bad thing.
1) It protects the user from himself a little more.
2) Having a running system even when losing your personal files means at least Joe Newbie doesn’t have to shell out $$$ to get a computer shop to “repair” his PC.
3) Most systems these days are multiuser. Even if I lose all my files, the files of the “mom”, “dad” and “girlfriend” accounts will still be there.
Oh, and in general I dislike going with an inferior approach because of a defeatist attitude. Damn the world and go for BETTER, not “good enough”.
Edited 2006-02-05 23:26
Really good article…thanks!
Despite all of the people here jumping up and down to defend Unix, Thom makes a very valid point. The OS itself is something which can be re-installed; you can’t exactly re-install your thesis, photos of your son’s first day at school or your financial records for the past few years. Perhaps there is a need for users, developers and administrators of end-user systems to re-think the model that we use at present. One way is by moving the storage of our data to the network.
However, no matter how much we do around the security models we employ, there is always the weakest link: the user. Have you ever tried a social engineering exploit on your own company? Have someone call your receptionist pretending to be your ISP and ask him/her for their username and password; try some directors and managers as well. You will be surprised at the results.
I think:
– we should abolish the military, we all know terrorist attacks happen anyway.
– you shouldn’t back up on CD/DVD, after all you’re an idiot and you could lose it.
– we shouldn’t place railings on tall buildings, come on, do you think people couldn’t fall off if they really really wanted to?
– we shouldn’t get flu shots as there’s bound to be a variant somewhere this year that will get you anyway.
– ad infinitum …
Edited 2006-02-05 23:50
Even if you are right, and the virus targets a home directory rather than system files, remember that when a virus has root access, it also has access to every home directory on the system. As for our systems (my wife’s and mine), we have personal computers and we have a computer we share. If a virus attacks our shared computer, as long as it only has access to one or the other home directory, the other one will be intact. It also means that a system administrator (such as me) can go through and figure out what happened on a perfectly usable computer. Important pictures, for example wedding pics and pictures of friends and our families, are on CD backup anyway, and we both have copies of most of them in our respective home directories. If the virus’s purpose was to get in there and delete pictures, we wouldn’t experience any loss but some lost time.
Most importantly, a virus’s inability to get root access hugely hampers its ability to spread. If it doesn’t spread well, it might not make it to our machines in the first place, and in fact might not even get written with such poor prospects.
Less than half the damage in the worst case, or none at all in the best case. Still not good, but far better than the Windows situation.
“Most importantly, a virus’s inability to get root access hugely hampers its ability to spread. If it doesn’t spread well, it might not make it to our machines in the first place, and in fact might not even get written with such poor prospects.”
Not having root access does in no way hamper a virus’s ability to spread. A process does not have to be root to connect to remote systems, and it’s equally trivial to make the process stay running after a user logs off or make it start every time a user logs on.
The main advantage of gaining root access is for the virus to hide itself by various means (i.e. rootkits), but it is not essential for spreading, participating in botnets and such.
If a fool isn’t backing up their files and cloning their boot drive, regardless of platform choice, they deserve what they get.
Nobody talks about hardware failure, the other half of the equation.
And of course with a Unix system the OS keeps going strong because it can, and it doesn’t easily become a vessel for propagating malware.
What we need now are fire-and-forget backup solutions, preferably ones that can chew down a whole home directory of files in one go.
OK, so some external hard drives are starting to show up with one-button backups, but aren’t those known for being less reliable than internal drives (all the banging about and all that)?
What I would love to see are magneto-optical or tape drives aimed at the consumer market. Make them attachable by USB, FireWire, SATA/PATA or whatever, but make them one-button devices: insert media, push a physical button, wait for the device to pop the media back out, put the media someplace relatively safe.
The problem with using a hard drive as a backup device is that the read/write hardware and the media itself are in the same device, so if one fails, the whole device fails…
Still, a quick option for defending against home-folder-eating viruses is to set up a small script that at set times (like, say, at night) will read the entire folder content, package it up and store it away as a dated tarball. It isn’t too hard to set up with a bit of know-how in shell scripting, user and group accounts, and the fact that *nix comes with all kinds of useful command-line tools, being a server OS and all that.
I don’t know, does Windows come with a command-line compression tool by default?
Hmm, I guess I could even make the script more effective by looking for signs that a folder or sub-folder has actually been altered since the last backup, and then only grab the changes. That way it would only really take time the first time it’s run…
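A rough, untested sketch of such a script (the paths and the ‘alice’ account are just examples); run it from root’s crontab, or from a dedicated backup account’s, so the tarballs land somewhere the regular user (and anything running as that user) can’t delete them:
#!/bin/sh
# nightly-home-backup.sh -- pack a home directory into a dated tarball.
# Example crontab entry:  30 3 * * * /usr/local/sbin/nightly-home-backup.sh
USER_HOME=/home/alice          # directory to back up (example name)
BACKUP_DIR=/var/backups/home   # example destination, not writable by 'alice'
STAMP=$(date +%Y-%m-%d)
tar czf "$BACKUP_DIR/home-alice-$STAMP.tar.gz" -C "$USER_HOME" .
# Incremental variant: only grab files changed since the previous run.
# find "$USER_HOME" -newer "$BACKUP_DIR/last-run" -type f -print0 \
#     | tar czf "$BACKUP_DIR/home-alice-$STAMP-incr.tar.gz" --null -T -
# touch "$BACKUP_DIR/last-run"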
Who needs desktop search? Give us desktop backup.
Edited 2006-02-06 02:15
If user permissions were the end of modern Unix security, then yes, this article would matter.
However, the author is completely ignoring things which OpenBSD and others have done along the lines of making stack-based attacks virtually impossible (or more accurately: highly improbable).
Then there are always things like ACLs, which should actually be able to provide some extended protection. SELinux can too; both need to be set up very well, though!
Of course, Windows has ACLs too. And they’re working on things like keeping IE in a sandbox. You could very easily do what they’re doing with IE with Firefox on *nix: xhost +si:localuser:fifox; su - fifox -c firefox
You could even give that user some odd random password saved under the user’s home directory, disallow SSH logins for the user, and boom, you’ve got Firefox running under another user. Obviously Aunt Tillie isn’t going to do this. But a distribution could do this, and even try to theme the other user’s stuff (make a symlink to their ~/.gtkrc).
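Roughly like this; a sketch only, and the ‘fifox’ account, the sudoers line and the paths are all assumptions:
#!/bin/sh
# run-firefox-sandboxed.sh -- launch the browser as a throwaway 'fifox' account,
# so a browser exploit only reaches that account's files, not your own.
# One-time setup (as root):
#   useradd -m fifox && passwd -l fifox
#   and, via visudo:  youruser ALL=(fifox) NOPASSWD: /usr/bin/firefox
# (make sure sudo passes DISPLAY through to the fifox environment)
xhost +si:localuser:fifox      # let the fifox user draw on your X display
sudo -u fifox -H /usr/bin/firefox
xhost -si:localuser:fifox      # revoke display access again once it exits
A distribution could ship a wrapper like this and make it the default way to start the browser.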
In my opinion the real security advantage of Unix is that it usually forces its users to learn. It’s not popular to ask people to learn, but it’s always what’s best for them. Easy to use is great on a kiosk, but if you spend more than an hour a day on it you have no business not understanding at least some of it! And no, knowing how to use Word (or any other program) does not qualify as understanding something about Word…
You forgot the basics! Half-knowledge is too dangerous. There are 164,000 viruses for the Windows platform and there are 2 viruses for the Unix platform!
You have to manually execute the Unix virus! Would you like to run it?
“and there are 2 viruses for the Unix platform!”
You are sadly misinformed. There are a lot more than 2 viruses/trojans that infect *nix platforms.
“You have to manually execute the Unix virus! Would you like to run it?”
Most people don’t want to run the Windows viruses either, but end up doing it anyway due to either ignorance or badly designed software.
It is not impossible that future and/or current bugs in, say, Firefox (or any other userland software) would allow local execution of malicious code. In fact, there have already been such bugs, although not that many widespread exploits.
Protecting user-writable areas is a non-trivial task since you have to let the user, uh, write and modify things. There’s no software that can replace common sense and education.
Edited 2006-02-06 04:44
Yes, but there are some things which can be done to protect what programs can write.
You can change what user runs particularly dangerous programs like Firefox.
You can restrict particularly dangerous programs like Firefox to writing only under their own directory (SELinux can manage this one, I think, and maybe ACLs).
You can do things to make most stack-based exploits only hit something like 1% of exploitable systems.
By and large these things aren’t done. Probably for a number of reasons:
1.) There aren’t many actively used exploits.
2.) Those that are in use are still a low threat, as most *nix users don’t run the same builds of the same software, or even the same code within it (this is especially true with Linux itself).
One thing current *nix has going for it is probably the bazaar that is Free software:
1.) Release early and often. Users are rarely all running the same version of a given program.
2.) Release source. Users are often running different builds of the same packages, which differ even more when they are compiled with different versions of gcc.
3.) More competing programs. Generally Free software gives rise to more programs competing for fewer users. I won’t offer an explanation of why this happens, but please name this many proprietary mail clients: Thunderbird, Evolution, Kmail, Mutt, Pine, Emacs (yes, people do it).
There are probably that many, but I guarantee they’re for a platform with ten times as many possible end users! I know of Outlook, Apple Mail, and Eudora.
Some of the advantages modern *nix has for security are by design. Some are addons which work because of the design, or just are made to work. Some are because of the picky tastes of the users. Some are just side effects of demanding source and do-it-yourself methods. And some are side effects of the development cycle.
But the biggest one is the attitude of education instead of an attitude of “where do you want to go today?”
“You are sadly misinformed. There are a lot more than 2 viruses/trojans that infect *nix platforms.”
I think you may find this is a myth put out by anti-virus companies.
http://lwn.net/Articles/166984/
There was also a case (I think it was Symantec) where they listed a few hundred **Windows** viruses where they had just put the word “Linux” in the name of the virus.
Edited 2006-02-06 08:29
“I think you may find this is a myth put out by anti-virus companies. ”
It’s not at all a myth. There are a lot more than 2 viruses/worms for *nix. Off the top of my head I can think of the Morris worm (a classic), Slapper, Scalper and Lion.
Granted, they aren’t targeted at the desktop but at servers, but that’s not the point.
“It’s not at all a myth.”
I think I have found the source of your confusion.
“There are a lot more than 2 viruses/worms for *nix”.
It seems that you think that Unix and Linux are the same.
Unix is proprietary and (at least partly) closed-source.
Linux is open-source.
There are NO open-source viruses in Linux repositories.
It is therefore possible to completely avoid viruses simply by installing exclusively from Linux repositories.
Another source of your confusion:
“There are a lot more than 2 viruses/worms for *nix”.
Viruses and worms are qualitatively different things.
There has historically been the odd worm or two for Unix. No true malicious viruses though AFAIK, for either Unix or Linux.
Edited 2006-02-06 10:00
“It seems that you think that Unix and Linux are the same.”
Not at all but for this discussion it does not matter.
“Unix is proprietary and (at least partly) closed-source.”
No, it is not. UNIX (the Single UNIX Specification) is a set of “standards”; it has little to do with an operating system being proprietary or not.
“There are NO open-source viruses in Linux repositories.”
“It is therefore possible to completely avoid viruses simply by installing exclusively from Linux repositories.”
Not if there are bugs in that software that allow remote code execution.
“Viruses and worms are qualitatively different things.”
Are they now? I’m afraid the distinction has become very blurred as of late. Is Nyxem/Blackworm a worm or a virus? Seems to depend on who you ask.
“There has historically been the odd worm or two for Unix.”
Oh please, there has been more than an odd worm or two, although the numbers are of course dwarfed by the Windows numbers.
“No true malicious viruses though AFAIK, for either Unix or Linux.”
http://en.wikipedia.org/wiki/List_of_Linux_computer_viruses
http://www.viruslibrary.com/virusinfo/Linux.htm
“No, it is not. UNIX (the Single UNIX Specification) is a set of “standards”; it has little to do with an operating system being proprietary or not.”
OK. Now if we are talking about “BSD”, that IS open source. But the product originally known as “Unix” (as in Unix System V, etc., which is what I was talking about) was and still is proprietary.
So you get that one on a technicality.
“”No true malicious viruses though AFAIK, for either Unix or Linux.”
http://en.wikipedia.org/wiki/List_of_Linux_computer_viruses
http://www.viruslibrary.com/virusinfo/Linux.htm“
Oh please.
http://en.wikipedia.org/wiki/Bliss_%28virus%29
Bliss was just an experiment; it “even has a --bliss-uninfect-files-please command line option”!!
http://en.wikipedia.org/wiki/Staog
“It was discovered in the fall of 1996, and the vulnerabilities that it exploited were shored up soon after. …
Since it relied on fundamental bugs, software upgrades made systems immune to Staog. This, combined with its shot in the dark method of transmitting itself, ensured that it died off rather quickly.”
Ancient history.
http://en.wikipedia.org/wiki/Devnull
“Devnull is the name of a computer worm for the Linux operating system which has been named after /dev/null, Unix’s null device. This worm was found on 30 September 2002.”
A worm, not a virus.
Come on. Be real! This is pathetic. Show me “a lot” of “malicious viruses” for Linux out there in the wild … or STFU.
“Show me “a lot” of “malicious viruses” for Linux out there in the wild”
I’ve never said there are a lot of malicious viruses in the wild for Linux. I’ve said that there are more than 2 viruses/worms (whatever your definition may be) for Linux/Unix.
“STFU.”
That’s very mature of you.
Digging a hole in the ground and pretending a problem doesn’t exist doesn’t help anyone. The hard fact is that Linux is just as vulnerable as Windows *to the problem outlined in the article*, no matter what you think of Thom’s actual article-writing skills (no, I didn’t find it well written). Is it a big problem? Not currently. Is it a security issue? Not necessarily. Could it be a problem? Possibly.
Linux isn’t free of buggy applications, just look at the security advisories from any distro, and with buggy applications come possible attack vectors. Whether such bugs can result in privilege escalation isn’t the issue, since root access isn’t necessary for the tasks malware performs. Spammers, botnets or organized crime aren’t interested in wiping out your data and alerting you that something is wrong; they just want to sneak in a little app that looks through your files, opens up an open-relay spam proxy or DDoSes someone, etc.
Now, since Windows does without a doubt have more ignorant users and a bigger market share, it is the natural target for these activities.
“I’ve never said there are a lot of malicious viruses in the wild for Linux. I’ve said that there are more than 2 viruses/worms (whatever your definition may be) for Linux/Unix. ”
Au contraire, you said (and I quote) ‘a lot more than two’, and you said they infect *nix systems.
Here is your exact text.
“You are sadly misinformed. There are a lot more than 2 viruses/trojans that infect *nix platforms.”
Show me. Show me a lot more than two. You have utterly failed so far. Sorry bud, but it is you who are sadly misinformed.
“Digging a hole in the ground and pretending a problem doesn’t exist doesn’t help anyone.”
Trying to pretend there is a problem where none exists is equally inane.
“The hard fact is that Linux is just as vulnerable as Windows *to the problem outlined in the article*, no matter what you think of Thom’s actual article-writing skills”
It is not vulnerable, as there is no problem. Anyone who has ever used a file manager to delete a file should hardly be surprised that it is possible to delete files!!
“Is it a security issue? Not necessarily. Could it be a problem? Possibly. ”
Oh please. Pull the other one, it plays jingle bells.
Is it a security issue that I can delete my own files? Pfft. What sort of a maroon would even ask such a clueless question? This has got to be the lamest troll that was ever put up as a supposed article.
“they just want to sneak in a little app that looks through your files, opens up an open-relay spam proxy or DDoSes someone, etc.”
Where are they going to hide this app? In my home directory? Then I just mount the home partition with ‘noexec’. Or I use a ‘find’ on /home to look for any file with the execute permission set.
There you are – no need at all for a Linux virus scanner!
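That check is a one-liner, by the way (GNU find syntax; older versions want -perm +111 instead of /111):
# list anything under /home with any execute bit set
find /home -type f -perm /111 2>/dev/null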
“Not at all but for this discussion it does not matter.”
Oh yes it does. It matters a great deal.
Tell me how you are going to hide a full-blown malicious virus right there in plain view open source code??
A simple policy of “install only from the repository” kills all virus vectors of the “trojan horse” variety right there.
“Not if there are bugs in that software that allow remote code execution.”
A “buffer overflow” bug might conceivably allow remote code execution – but strictly limited to the user’s permissions.
“A “buffer overflow” bug might conceivably allow remote code execution – but strictly limited to the user’s permissions.”
Which is exactly the point of the article: it won’t screw up your system files (which you can restore from the installation CD anyway), but it’ll have full permission to access your private data like email, browser cache, cookies, stored private PGP keys, etc.
Edited 2006-02-06 12:25
“Tell me how you are going to hide a full-blown malicious virus right there in plain view open source code??”
No one’s talking about the source code. Tell me how you’re going to hide a virus inside MS closed-source code, unless you work for MS.
Source code has NOTHING to do with this. We’re talking about bugs in applications and using them as attack vectors for planting malware.
“A simple policy of “install only from the repository” kills all virus vectors of the “trojan horse” variety right there. ”
Where did trojans get into the picture?
“A “buffer overflow” bug might conceivably allow remote code execution – but strictly limited to the user’s permissions.”
Exactly. Did you read the article? This is exactly the point. Malicious viruses that wreak havoc are a thing of the past; malware today does not need root access to perform its tasks.
“Tell me how you are going to hide a full-blown malicious virus right there in plain view open source code??”
No one’s talking about the source code. Tell me how you’re going to hide a virus inside MS closed-source code, unless you work for MS.
Source code has NOTHING to do with this. We’re talking about bugs in applications and using them as attack vectors for planting malware. ”
Oh come on, are you naturally slow or do you have to work at it? If the applications {remember: “A simple policy of ‘install only from the repository’”} are open source and in the repository – then open source applications have **EVERYTHING** to do with this.
“Tell me how you’re going to hide a virus inside MS closed-source code”
Oh come ON, get with it. You think any Windows system connected to the internet on this planet has ONLY MS code in it? Where?
“malware today does not need root access to perform its tasks.”
This is exactly the point. On a Linux system, ANY sort of ware needs execute permissions. Windows systems do not require any such thing. There are even several layers of protection on a Linux system after that – so a potential piece of malicious ware has to: (1) get on the system (find an attack vector to do that – hence the relevance of ‘trojan horses’, they are one possible vector), then (2) get execute permission set, then (3) find a way to get itself executed, and (4) hide from detection.
If the malware is limited to users’ home directories and user privileges on a Linux system, it is still miles away from being a threat.
Tell me how a single vulnerability can do all that on a Linux system with ‘noexec’ set for /home.
Edited 2006-02-06 13:00
“2) get execute permission set”
The attack vector is used to set the execute permission, or to run the code directly via a system call.
“(3) find a way to get itself executed”
Uh yeah, that’s exactly what remote code exploits do: they let you run commands or get shells, etc.
“(4) hide from detection.”
It’s not hard to hide from the average user, and who would suspect an extra process called, say, ssh-agent?
“If the malware is limited to users’ home directories and user privileges on a Linux system, it is still miles away from being a threat.”
You just don’t get it. The threat is *not to your own system*, it’s to *other* systems and/or the world’s network infrastructure.
“Tell me how a single vulnerability can do all that on a Linux system with ‘noexec’ set for /home.”
noexec can be circumvented (for example by creating the binary in /tmp), and /home usually isn’t mounted noexec since users actually want to be able to *USE* their computer. While noexec probably works great in a corporate setting, it’s a no-go for a home user.
“The attack vector is used to set the execute permission, or to run the code directly via a system call.”
What “attack vector” are you referring to? And how would it be used to set the execute bit on the files?
Please be more specific. Right now, it seems more like FUD than anything else.
“/home usually isn’t mounted noexec since users actually want to be able to *USE* their computer. While noexec probably works great in a corporate setting, it’s a no-go for a home user.”
Having /home mounted as noexec doesn’t prevent anyone from using their computers. /home is for user data and preferences, not executables.
If malware was a non-negligible threat on the Linux desktop, don’t you think we’d have witnessed it at least once?
The current situation proves that the malware threat is currently non-existent for Linux, and as such the OS is in fact more secure for everyday use.
Now, when Linux has 20% of the OS market maybe that will change. In the meantime, we’ll continue to legitimately claim that the *nix security model is superior to Windows’.
“What “attack vector” are you referring to? And how would it be used to set the execute bit on the files?”
“Please be more specific. Right now, it seems more like FUD than anything else.”
Please read up on buffer overflows and remote code execution. I really don’t feel like explaining those basic security issues here.
“Having /home mounted as noexec doesn’t prevent anyone from using their computers. /home is for user data and preferences, not executables. ”
Because users never create their own scripts, right?
Do you also mount /tmp noexec? And /var/tmp?
noexec probably works great in a corporate setting where locking down the user is desirable, but it doesn’t work in the home.
I use zeroinstall-injector and it wouldn’t work if /home was mounted noexec.
You may also have noticed that there are now Linux distros that have Windows-style installers and that install applications into /home.
“If malware was a non-negligible threat on the Linux desktop, don’t you think we’d have witnessed it at least once?”
Why target Linux when currently it is much easier to target the much bigger mass of ignorant Windows users?
Btw, malware has happened on Linux/*nix.
“The current situation proves that the malware threat is currently non-existent for Linux, and as such the OS is in fact more secure for everyday use. ”
I have never said it’s not currently secure. I mainly take offense at the clueless blabbering about how it could never, ever happen on Linux and that the *nix security model prevents it.
“In the meantime, we’ll continue to legitimately claim that the *nix security model is superior to Windows’.”
This has nothing to do with the security model, even though it is superior to Windows, and everything to do with the number of users and their lack of education.
“You are sadly misinformed. There are a lot more than 2 viruses/trojans that infect *nix platforms.”
Right, there are about 50, though IIRC only a couple are considered “in the wild”.
Meanwhile, there are over 100,000 malware/spyware/adware for Windows.
Sure, there’d be more for Linux if Linux were more popular, but the fact of the matter is that in the real world, the one we currently live in (not some hypothetical world made up to support Windows advocates), there aren’t. Ergo, Linux is much more secure than Windows with regard to malware.
Sounds like a Linspire user.
Hi Thom, great site – I’m a newcomer to OSNews but I visit it every day now.
I think this article could have been worded a little more prudently; you’re going to get a strong emotional response with a subject like “Don’t believe the truth”.
My belief is that *nix OSes are more secure simply because there are fewer viruses written that target them. I ran SUSE for a couple of years without getting a single virus or trojan. Meanwhile Windows users had to put up with many threats.
Also, if we’re to address the core issue, yes my documents are important, but system stability and responsiveness are both very important to me, as a slower system means slower productivity. So for me at least security at a system level is important too.
This entire article and half the comments are nothing but flamebait. It’s completely insane. Let me explain this very simply, and with examples for those who just don’t get it, including Thom.
Windows by default has the user run as administrator. On top of that, all in the name of “user-friendliness”, certain applications that cannot be removed from Windows are allowed to execute random code. This is where the vulnerabilities originate, for the most part.
In UNIX-land a virus would have to be downloaded, set executable by the user, and then executed to do any damage. Even if all this were done, it would still have to be executed as the root user to do any system damage.
Example #1 umask
kernelpanicked@odin ~ % grep umask .zshrc
umask 022
022 means that any files written to the hard disk as my user get the default permissions of 644. Not executable.
Example #2 noexec
Even if I were dumb enough to grab some random file and set the executable permissions, my user only has write access to my home directory.
kernelpanicked@odin ~ % grep noexec /etc/fstab
/dev/wd0g /home ffs rw,noexec 1 2
What happens if I attempt to execute the file? Whether as my regular user or as root, if I execute this file it will fail, and the system tells me in a very un-user-friendly way to go f–k myself.
I could go much further with this, but these measures alone are enough to stop any theoretical UNIX virus.
The whole comment in the article about having to make backups because UNIX isn’t “safe” is ridiculous. It doesn’t matter whether it’s UNIX, Windows or friggin’ BeOS: if you don’t make backups, you deserve to lose your data.
Edited 2006-02-06 07:40
“In UNIX-land a virus would have to be downloaded, set executable by the user, and then executed to do any damage.”
Right. And there could never be a bug in, say, Evolution that made remote code execution possible, right?
All that’s needed is a flawed application and you have an attack vector.
“Even if all this were done, it would still have to be executed as the root user to do any system damage.”
There is other damage than a worm/virus wiping your hard disk. How about a DDoS flood or a botnet? They don’t need root privileges, and in the grand scheme of things a DDoS flood is more destructive than your box getting rooted.
Everyone seems to be praising noexec the last few days, and though it’s useful, it’s not *that* useful. Let me explain this very simply, with an example for those who just don’t get it:
arnouten@mintzer:~$ cat evil
#!/bin/sh
rm -rf ~
arnouten@mintzer:~$ ./evil
-bash: ./evil: Permission denied
arnouten@mintzer:~$ sh evil
And don’t bother complaining that the user will have to run ‘sh’ himself. The malware could’ve just appended something to .bashrc, for example.
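One harmless line is enough to illustrate that:
# anything appended here by a compromised process runs at the next login shell
echo 'echo "this ran from .bashrc"' >> ~/.bashrc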
And how does the malware get to the first step where it can execute “cat” anyway? Doesn’t it need to be at the third step in order to achieve the first step?
“Example #2 noexec”
Not entirely true. Look at this:
$ cd $HOME
$ cp /usr/bin/whoami .
$ chmod -x whoami
$ ./whoami
bash: ./whoami: Permission denied
$ /lib/ld-linux.so.2 ./whoami
myusername
“Of course, they should make backups– but wasn’t Linux supposed to be secure? So why should they backup?”
Ever heard of failing hard drives? All the software security in the world won’t save the pictures of your son graduating if the hardware gives out.
No matter what OS you’re running – if you don’t take backups, shame on you.
One point that everyone seems to miss is that it is NOT EASY for most people to back up. Most people haven’t got a clue WHERE to back up, WHAT to back up or HOW to back up. And no operating system or program (that I or they know of) helps them accomplish that in a straightforward way.
Now I know people are going to accuse me of throwing hardware at the problem again, but I am just trying to get a point across here. It would be very easy if each computer had a second hard drive or some kind of super-reliable storage medium, as big in GB as the first one, dedicated to backups. Then you have a button ‘Backup’ with options ‘Backup my data files’ or ‘Backup my entire PC’. At which time a root password is required. The data could even be on an encrypted filesystem.
Or even better: at regular intervals, the OS just writes the data (personal files) to the storage medium, since a lot of people cannot even be arsed to push a ‘Backup’ button anyway. Another solution would be to synchronize it to a remote server controlled by a private entity, but there are obvious problems with that: internet connectivity problems, privacy, security, …
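The ‘regular intervals’ part, at least, is already scriptable today with plain cron and rsync; a sketch, assuming a second disk mounted at /mnt/backup and a user called alice:
# in alice's crontab (crontab -e): copy personal files to the second disk
# every night at 02:30; no --delete, so files removed at the source
# (by accident or by malware) are still present in the copy
30 2 * * * rsync -a /home/alice/ /mnt/backup/alice/
The point stands, though: none of this is wrapped in a single ‘Backup’ button that ordinary users will actually press.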
Secondly, users have to be educated about their security and privacy. IMO, no one does this better than the Linux community. I bet you five Quarks that a much lower percentage of Linux users than Windows or Mac OS X users will open that email attachment ‘Jessica Simpson NUDE!’. Again, the operating system could provide people helpful advice about how to operate their computer safely.
And yes I know this sounds a lot like treating computer users as infants, but on most accounts, they ARE immature at computer use. Of course, more experienced users should be able to just turn off the dang insulting info messages.
Think of personal files as the last layer of security, a layer which has up to now received very little attention, but which to most users is just as important as all other aspects of security.
Try this:
Menu-> Personal Files (Home)
(shortcut key) Ctrl+a
(shortcut key) Shift+Del
(Confirmation dialog) click on the ‘Delete’ button, not on the ‘Cancel’ button.
Or, if you prefer menus:
Menu-> Personal Files (Home)
Edit->Selection->Select All
Edit->Delete
(Confirmation dialog) click on the ‘Delete’ button, not on the ‘Cancel’ button.
How about that!! I bet Windows is a lot safer than that! … oh wait.
Edited 2006-02-06 10:17
What’s more dangerous than using an insecure system?
Believing that you’re using a secure system.
SELinux allows you to go beyond the Unix permission model. You can, in fact, download an executable from the internet and run it while giving it no permission to read or write files in your home directory.
And then there’s the +x executable bit. Unix is not Windows. On Windows, a .exe file can be executed regardless of where it comes from. On Unix, you can download whatever file you want, but unless you chmod +x it manually it can’t be run. The one thing overlapping this is .desktop files, but we can fix those by requiring +x too.
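In transcript form, with a made-up file name:
$ wget http://example.com/free-screensaver
$ ./free-screensaver
bash: ./free-screensaver: Permission denied
$ chmod +x free-screensaver    # only now can it run, and the user had to do this deliberately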
…Nice FUD spreading.
A virus on Linux is virtually impossible, at least the way viruses work today.
Considering the many steps the user has to take for the virus to be executed, it’s safe to say that *nix’es are a lot more secure than Windows in regard to viruses in the areas where it matters for the user.
But no reason to wonder about it, since it’s coming from Thom.
Yes of course, because it’s perfectly fine to make things up when you want to make a flamebait article, but not when I want to prove a point.
Why don’t you prove that I’m “making things up”? I haven’t seen you do that. Unlike you, who are making entire paragraphs up (probably because you have no arguments).
You’ve completely failed to show any source for your alleged “false sense of security”, and you seem to react pretty badly when someone makes up things that you don’t approve of.
Like I said, counter my arguments in the editorial, then. It’s easy to say “your arguments are nonsense” and then NOT explain why or how.
User files on UNIX systems are just as exposed to the outside world as files on Windows machines. There is simply no denying that fact. A file owned by a user can be destroyed by the user itself– via malware, or by accident. That is nothing new.
What is new, however, is that people who are now using Windows are getting the FALSE sense of security, as in, that they are now safe from malware and that their files are all safe now, because they run UNIX. Which is utter nonsense.
Edited 2006-02-06 13:48
“What is new, however, is that people who are now using Windows (<– I presume you actually meant Linux here) are getting the FALSE sense of security, as in, that they are now safe from malware and that their files are all safe now, because they run UNIX. Which is utter nonsense.”
Sorry Thom, but that is **REALLY** stretching it. You have utterly failed to identify ANY real vulnerability to malware, even for users’ files on a Linux system, and have also utterly failed to show how even the most naive user can possibly expect his files to be untouchable on a Linux system when he knows he can easily delete them himself with an accidental touch of the “delete” key in a file manager.
If your article were an argument put forward by a lawyer, the case would be thrown out before it even got to court.
Edited 2006-02-06 14:01
How do you propose the files would be deleted?
Not just hypothetically, but actually deleted. Come up with some examples instead of spreading FUD.
I call FUD on your “editorial” until you come up with such examples.
[EDITED: Fixed missing word: ‘with’]
Edited 2006-02-06 14:25
What is new, however, is that people who are now using Windows are getting the FALSE sense of security, as in, that they are now safe from malware and that their files are all safe now
Actually, they and their files are safe from malware now, because there is virtually no Linux malware in the wild, and none recorded that attacks personal files.
So while there is a potential threat, there is no actual threat. Until there is, it is perfectly correct to say that the malware threat for UNIX users is negligible.
Just as you wouldn’t call a house insecure because you might open the door to the wrong person, you can’t call Linux (or Windows, for that matter) insecure. Windows is insecure because you can do nothing at all and still suffer the consequences.
Now, Windows tells you more about potential threats because once they are in, you are on your own.
A bit like how the London Tube probably has one of the best “service disruption” information systems, precisely because it’s probably one of the least reliable in the world…
This topic seems very popular, Thom has certainly scored here. 🙂
Anyway, I consider myself a fairly typical home user, being relatively new to Linux: I have tried it on and off since ’99 but have only really been using it day to day as my main desktop for about 6 months.
I have to say that my personal files are not the most important thing about the system; the integrity of the system files is way more important, as it saves me from a reinstall and perhaps two hours of work, including some customizations like setting up the login screen and eye candy like that.
If ‘homeuser’ runs into trouble I just delete that account and start all over. There’s nothing in that directory that cannot be replaced or downloaded again, like the latest .iso of whatever.
Some text files and bookmarks are backed up on a floppy, which is all the backup I need. For other stuff like .mp3s it is always sensible to keep them on separate drives with only read permissions. It particularly makes sense when you’ve got GBs of them.
I consider Linux appropriate for advanced Windows users who are comfortable hacking their registry, but probably not for Joe or Jill who just bought their first machine ever from PC World.
Therefore the group in question, which for simplicity I call typical home users (I mean, there are millions of people who’ve been into computing since DOS days and know at least the basics), should not have a problem recognizing that there are always some limits to what security models can achieve, and acting accordingly. It’s just common sense – if something really is that important to you, ALWAYS keep a copy.
Oh yes, and of course even Win2000 and probably XP Pro can be made quite safe. It could not be any easier: click any drive and adjust file permissions for everybody but Admin to be ‘read only’, including all subfolders, then go back and allow the user to modify some folders he needs access to, but not to delete.
I think by default new user accounts cannot even read all drives when you set them up. Then install Opera, an ActiveX blocker and an anti-hijacker for IE (just to be safe, although you’re not going to use it anyway), only view email in plain text mode, and so on.
But of course it’s not quite as secure as *nix.
Is very true!
It is not have ActiveDirectory or Kerboros or ACL.
More security is not true!
Unix has lot security issues every month – just look at lists at CERT!
Patching is very slow, if at all!
We always say to customers, do not go unix, is very insecure!
“It is not have ActiveDirectory or Kerboros or ACL. ”
Uh, Kerberos originated on Unix and so did ACLs.
As for AD, it has nothing to do with security.
“More security is not true!”
*More* security?
“Unix has lot security issues every month – just look at lists at CERT!”
The number of issues isn’t a good measure of security.
“Patching is very slow, if at all! ”
Now you’re just making things up.
“We always say to customers, do not go unix, is very insecure!”
I doubt fixing your mom’s computer qualifies as having customers.
This is absolutely clueless, and it ignores many facts.
First, in sheer numbers there are far more threats to Windows than to *nix, and few of the *nix ones are able to auto-install themselves, while the vast majority of the Windows ones can.
Second, a normal user finds it a f–king mess to install anything; that’s why MS programs are still used while free and better alternatives are not.
Third, network protection is far better on *nix; few of the Windows firewalls achieve the same performance and security levels as the simple ones installed by default on *nix do.
Besides having a backup program provided by default, and being able to do disk-to-disk backups, there is a difference in nature between Unix and Windows.
In Unix, it is easy to do even stupid things, to ensure it is possible to do brilliant things, but it is very hard to hide what you have done. In fact, one of the most complex parts of rootkits is the code that tries to hide the rootkit from an admin…
So after I restore the user’s account, I’ll analyze the virus and email my security team with my diagnosis (;-))
Pointing out the obvious:
Windows malware/spyware/viruses usually don’t have to ‘circumvent’ a damned thing.
They just make easy use of the latest discovered opening.
Such easy ‘openings’ do not exist on any OS other than Windows.
I’m talking about Internet Explorer, OBVIOUSLY. Not to mention the latest WMF exploit and whatever else there’s been in the last 3 months.
ALSO: four times in three years I have been told by Windows that my ‘My Documents’ folder is corrupt and that it will repair it without my permission. That ‘repair’ kills everything in “My Documents” and reverts it back to how it was when the OS was first installed.
If the OS itself does that kind of sh!te…
You seem to have the OS savvy of my grandmother. Truly amazing and insightful article, idiot. For god’s sake, at least read a few industry sites or FAQs before spouting off. Have fun with your AOL buddies.
But… if Jan gets her home directory wiped out, it doesn’t affect what is in John’s, Bill’s or Nancy’s. Also, it is far easier to have good backup schemes on Unix with good tools like rsync, tar, dump and scp.
BTW, I haven’t read through all 220 comments; just giving my 2 cents.
Backups are not made to protect from viruses or trojans (even though backups can often help recover from them); backups are made to protect from user error. And yes, not having a system protected against viruses and trojans when such protection is readily available falls within the scope of user error.
The relative security or insecurity of any operating system is not a factor in deciding what to back up, how often to back up, and how many backups to keep. To have artificially limited the scope of backups to virus protection is both idiotic and dishonest.
I don’t think I can even bring myself to put an intelligible sentence together. Then again, anyone this close-minded may have a difficult time perceiving a comment, let alone interpreting it. A sad, sad day it is indeed when a fine site like OSNews has such a scurrilous piece of obviously biased trash talk veiled in sheep’s clothing. Not even sheep’s clothing, but shepherd’s! Too bad some people are actually affected by just this type of disguised FUD.
If you can’t read the D.O.D. Orange Book (freely available on the web as part of the Rainbow Books series, and it’s REALLY old), and if you can’t derive from reading it that no computer system which is kept entirely functional by being (a) maintained and (b) backed up and kept in working condition is even WORTH locking down in any major way or securing in any other way – then of course you’d type up such an article as this one right here. But ‘can’t read’ and ‘can’t derive’ are STILL the operative words.
For any OS you have: (1) make it operative, keep it functioning, stop anything that interferes with it’s function; (2) keep it ready to restore by doing adequate backups which may be different for each setup, and do the same for your data; (3) then you first look at monitoring, logging, auditing, extend logging and checking for irregularities; (4) only afterwards should you start worrying about viruses, intrusion, et cetera. For each further step (1) through (4), the amount of attention you should give to the previous step at least doubles. That means that if you are, at all, in an environment that makes you check for irregular user intrusion once per day, you will have twice the attention on the log system, four times the attention on backups, and eight times the attention on machine operability. Then, things make sense. Otherwise, you got your priorities wrong. And it doesn’t matter what OS you’re using, really.
So you can, if you can of course, do that with just about any OS there is. The measures you have to take may be different – as in: dedicate machines, send log packets, keep backups, keep dual machines mirrored ready to be pulled out and started, et cetera. So maybe, ‘can not’ is another operative keyword for this article?
That’s not all you have, right? This is just a teaser to see whether people realize this is just dumb, right? I mean, not serious, correct? After all, last weekend was carnival here…
Poor Thom. Slashdot caught wind of this and absolutely tore you a new one. Tore you a NEW one. The score is now approximately 500 votes against the science and logic of this editorial, 1 vote for it.
“Poor Thom. Slashdot caught wind of this and absolutely tore you a new one. Tore you a NEW one. The score is now approximately 500 votes against the science and logic of this editorial, 1 vote for it.”
Oh, no, not really. Slashdot’s opinion is as interesting as asking who voted for Clinton at a Democratic convention.
Here are 2 of my favorite slashdot comments so far (but there are plenty more):
—–
Come on guys
(Score:5, Insightful)
by AutopsyReport (856852) on Monday February 06, @09:18AM (#14651686)
Don’t waste your time. Read a more interesting article: How Do Computers Work? [factmonster.com]. At least this one has pictures.
Are the editors even paying attention here? How can a 500-word, Grade 6 public speech-quality editorial makes it to the frontpage? Where is the quality here, folks?
—–
He’s just a kid
(Score:4, Informative)
by BlueQuark (104215) on Monday February 06, @09:26AM (#14651785)
Thomas Halwedra is a young’in with very little real world experience and any practical experience. They kid is in college and has a bunch of machines at home. I think he takes an extremely simplistic view of windows and unix security.
His ‘OSNEWS’ bio: http://www.osnews.com/editor.php?editors_id=11 [osnews.com]
I was doing systems programming on UNIX BSD 4.2 Tahoe when he was born. 🙂
I am surprised that his article was even published/posted, I can’t really even see his argument or what point is he trying to make. Oh that’s right he’s a ‘managing editor’ WTF?
Back to work.
—–
Please let’s not forget (although this is clearly a highly controversial article) that he was trying to make a point about the average home user (whatever that is).
My guess is he had people in mind who just expect their machine to work, not people capable of much system administration. Then again, they probably could not keep their Windows box clean either, as they still confuse antivirus and firewall and click on about anything.
I agree with an above post that even then the Unix security model would be protecting them better. So when, finally, will PC World start selling Linux pre-installed and get people off to the right start?
Without interest on the user’s part in educating themselves, nothing is 100% secure, and even then…
Thom: “What I am blabbering about?”
OK. Months have passed since I last thought OSNews was on its way down again. But voilà, somebody had to remind me of it again. To lighten you all up: my problem is not with what any of you bash, be it *nix, be it Win*, be it pumpkin pie, but with pointless “blabber” with mostly useless and fairly clueless points, and weak arguments served as red meat to feed the crowd – well, just like anyplace else.
False sense of security? I would like to pose a preference question to the author of the article.
What would you prefer?
(a) An operating system where a virus could delete all your personal files, crash the system and delete the system completely, so you can’t boot, can’t restore your deleted files, can’t even restore from backup, etc.
(b) An operating system where a virus could delete all your personal files, but still your system is up and working, so you can use restoration programs (e2undel for example) to restore the deleted files, restore from backups, etc.
Gee, tough one.
But I think I’d go with the first option. Call me crazy, but I guess 96% market share gives me a false sense of security, but it’s still a sense, so I’ll stick with Windoze.
Second point. I’m not quite sure what the author’s intention with this article is. If he just wanted people to make backups of their personal files, he could just say so.
But this article looks more like a complaint, and/or a subtle attempt at trashing *nix. Weird, ain’t it? If the author has an idea of how to write an operating system that prevents a virus from deleting the author’s family photos from the latest Caribbean vacation, instead of everything else, I think he should just come out and say it.
So please, I’m very comfortable with my still-never-infected operating system, with my automatic backups, and my false sense of security, and that’s exactly how it’s gonna stay too.
Edited 2006-02-06 23:13
Personally I feel Thom’s opinion about /home/whatever isn’t the real issue. Yes, something we all already knew: the home folder is vulnerable because… well, someone needs to do something with their own files at some point, no?
However, I think some other people have pointed out a few important facts about security. You, running a program that connects to the internet or acts as a server/gateway to the internet, are vulnerable right now, regardless of who it’s running as, what OS it’s running on, and where it actually sits in the filesystem. In that sense, it’s not really an OS-specific issue but an issue with the programs being used on the OS. Granted, *nix has a tighter rein on permissions, but this does not stop someone from taking advantage of some fixed-size data array in a program that accesses the internet to execute arbitrary code. Which can, in fact, lead to privilege escalation and malicious code being executed. Has it happened to the average user on *nix? Sure, if you’ve ever had a distro that runs any service at all, it’s bound to happen if you don’t update. Are they out to delete your /home files? Not usually… I’ve never encountered any malicious software that does that myself, anyway.
Honestly though, I can say that after running various BSDs for nearly 4 years, I’ve never had a security issue. I’ve had rootkits in Linux, and trojans in Windows… never in BSD though.
Edited 2006-02-07 04:26
“noexec probably works great in a corporate setting where locking down the user is desirable.”
I don’t know of an example where noexec actually aids security. Consider an attack which noexec is designed to prevent: the attacker has somehow invoked an exec, and that exec is to be forbidden. However, if what they wanted to execute was a Perl script, they could just run /usr/bin/perl script; the same goes for any interpreted language, and for binaries there is /lib/ld-linux.so.2 program.
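In transcript form, with /home mounted noexec and a hypothetical dropper.pl sitting in it:
$ cat dropper.pl
print "noexec did not stop me\n";
$ ./dropper.pl
bash: ./dropper.pl: Permission denied
$ /usr/bin/perl dropper.pl
noexec did not stop me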