Google employees have always had a remarkable amount of freedom when it comes to what operating system they wanted to run on company computers – Linux, Windows, Mac OS X, it was all fine. Since the China attacks, however, this has changed: Windows is no longer welcome on Google computers.
The Financial Times has talked to several Google employees who confirmed the change in policy. The cited reason is security concerns; not entirely surprising after it became known several Google computers were hacked earlier this year. Google claims the Chinese were behind the attack, even though the Chinese government obviously denies this.
“We’re not doing any more Windows. It is a security effort,” one Google employee told The Financial Times. “Many people have been moved away from [Windows] PCs, mostly towards Mac OS, following the China hacking attacks,” another employee said.
If you come to work at Google now, you’re given the choice of either Mac OS X or Linux – in order to use Windows, you’ll need CIO level clearance. “Linux is open source and we feel good about it,” one employee said, “Microsoft we don’t feel so good about.”
Despite the validity of the security concerns, there’s of course another reason this news has come out – and that’s a marketing one. Google is working on the Linux-based Chrome OS, so promoting internal use of Google products is high on the agenda. “A lot of it is an effort to run things on Google product. They want to run things on Chrome,” an employee said. “Before the security, there was a directive by the company to try to run things on Google products. It was a long time coming.”
IBM is also working hard to become independent of Microsoft Windows.
MS Office and MS Project have already been replaced.
A Lotus Notes client exists for Linux.
…
pica
Disclaimer: I do not work at IBM
Many people see MacOS X as a secure OS, while in fact it ain’t that much more secure than Windows. Applications like Safari and Quicktime prove that not all Apple programmers are good at secure programming.
You can look for yourself at Secunia: http://secunia.com/advisories/search/
Agreed. What Google _really_ should be banning from their premises is Flash and Reader.
Uh, what’s with the modding on the comment? Flash and Reader make up 80+% of exploits. Safari+Snow Leopard is no more secure than IE8+Win7. The single, most effective way to be safe from 0day attacks on the web is to block Flash/Reader or uninstall them–that’s the reality of the situation.
Google is deflecting the blame from itself whilst taking a pot shot at Microsoft. There are security flaws in old versions of Chrome too, so why is Windows at fault when Google didn’t keep the software up to date on that machine? IE6. Good grief.
The single most effective way is NoScript + Flashblock.
This way I can enable only the features I need, when I need them. I don’t see any flash ads ( and, with AdBlock, I see almost no ads ), but I can watch YouTube without issue.
Many times I have landed on a malicious web-site that needed javascript to launch an exploit, so disabling scripting is simply the easiest way.
Granted, you need to understand javascript and web-programming well enough to know when to enable it safely, but it is still the most effective way – on any OS.
–The loon
….or just don’t go to sites that you wouldn’t trust javascript to run on in the first place. What I don’t like about noscript is that the internet is effectively broken until you enable it manually, site by site.
Options, options, options. All you have to do is set NoScript to “temporarily allow top-level sites by default.” It’s the very first checkable option and on the very first tab of the extension’s options window. This will allow all sites you actually *visit* to run scripts, but no sites from other domains. You might have an occasional problem with some sites such as YouTube that require running scripts from other sites (ie. ytimg.com), but it’s pretty rare. Easy and effective… I set up my mom’s computer to work this way, and she has no problems; mine is fully manual, no script run without my temporary or permanent permission.
There are loads of situations where that will not work fine. And the thing is, something like NoScript gives you a false sense of security anyway. There are plenty of javascript exploits, but there have been attack vectors for virtually every piece of technology on the web. There have been loads of exploits related to images executing arbitrary code; do you disable all images by default? CSS expressions are another one, maybe whitelist all styles? And browser plugins are hands down the worst culprit of them all…
Being secure on the web means making smart choices about what websites you visit, not by arbitrarily blocking technology that makes the internet what it is.
Why would anyone install Adobe Reader on anything other than Windows????
OSX natively supports PDF; I mostly use Skim, which is an app that hosts the native PDF control.
Similarly on Linux, I use Okular which is again a native KDE app.
Does Reader offer anything other than an insecure, alien app???
There are three major reasons to use Acrobat Reader on non-Windows systems, including two valid ones.
1/Having a PDF reader embedded in Firefox. AFAIK, only Adobe does that on linux.
2/Having full support for new PDF features like digital signing.
3/”If I want to read a PDF, I’ve got to install this, like on Windows”
I know it’s not the same but… you can get embedded PDFs in Konqueror
I know. But last time I checked, Konqueror was still a bit too sluggish for everyday use, and had some issues with that Flash player plugin which I happen to use on a regular basis…
Moreover, Firefox has its extensions, plus that nifty address bar stolen from Opera that works so well…
Have you given a try to reKonq? I would be using it if I could find out why all my webkit based browsers crash a lot :'(
Actually, there is a new project on kde-apps that fixes the Firefox + Opera issue. It is a Kpart plugin.
http://kde-apps.org/content/show.php/KParts+Plugin?content=125066
I really see no reason to use Adobe Reader on Linux. It is sluggish and ugly, and I have not found any PDF I could not open with the free PDF readers. If I ever find one, it probably will not be worth my time: I see PDF as a method of distributing information while preserving the original layout, and any junk someone inserts to prevent it from being viewed defeats what it is worth to me.
This is the first “feature” of Adobe Reader that gets disabled on all our work computers, as this causes all kinds of confusion. “I used File -> Print and only got one page”, “I clicked the X to close the PDF and Firefox closed”, etc.
Personally, I’ve never understood the desire to “embed” applications within the web browser for displaying docs. IE and Office are even worse. If I click on a document link, I expect a separate app to open to view that document.
Only if those features are needed.
Perfect opportunity for user education.
I didn’t say that I encourage this, just that those are valid reasons for using Acrobat.
That’s your point of view. I happen to agree with you, but most Mac users, as an example, prefer a desktop metaphor where apps seemingly do not exist, making everything a seamless browsing experience.
“After all, what is the difference between opening an image and opening a PDF ? Those are both viewable files, they should be treated the same way.”
Agree. I just happened to use Windows recently at work because of those Acrobat features that KPDF did not provide on FreeBSD. So I can understand those who use Acrobat for compatibility reasons.
4. The documents require Acrobat 7 or newer
5. CAD Drawings are embedded
6. Javascript/SWF embedded content
7. Forms in FDF1.6 or greater, etc.
You only expanded 2/… I didn’t think that it needed several points, since those are all power user features that probably don’t get used more than twice a year on an average computer.
I’ve found that on some large PDFs, Acrobat Reader is more efficient and responsive than Okular. And it really is better than the open source PDF viewers available on the ancient RHEL 5.5 I have to use at work, at least in my experience.
I’ve found that on 99.9% of PDFs (made-up figure, but you get the gist), Okular loads far faster and is more efficient and responsive than Acrobat Reader on Windows.
In fact, this observation holds true for most of the applications I use on Kubuntu 10.4 at home, compared to the equivalent applications I use on Windows at work.
My four-year-old modest self-assembled-from-piece-parts ordered-over-the-web desktop self-administered Linux system, costing me about $300 all up (once-only charge, hardware and full set of desktop applications included), is far faster, more capable, more responsive and more secure than the newer-and-supposedly-more-powerful-hardware difficult-to-maintain-requiring-an-IT-department very-expensive-software-with-ongoing-license-costs Windows system I am forced to use at work.
Go figure.
Windows has alternative PDF readers with plugins too.
Adobe Reader is easy to get rid of; I haven’t used it for years because there are loads of much better PDF readers out there…
Flash, on the other hand, is very hard to get rid of, because there are no alternative Flash players out there that are usable yet.
Those technical dimwits at Google have been fooled by Apple marketing again
Why – in your opinion – would Google say it prefers MacOSX to Windows on security grounds if they are both equally insecure?
Whilst they may be equally insecure, there are far fewer instances of developed malware for the Apple operating system.
There is a large botnet of macs though – pirated versions of adobe software with trojans built-in caused a huge number of them.
1. Microsoft is really slow a lot of the time with creating fixes when problems are reported. Some say they are getting better at writing software. I’m not so sure. Every new release of Windows they say, we’ve rewritten large parts and checked large parts of the code base. And every single time some bug is found in DCOM or whatever, and it turns out they were vulnerable to that bug ever since the NT4 era all the way up to the latest version. The only difference is, in newer versions they added a firewall or a similar general-purpose layer. Sometimes it’s even on by default.
2. Apple on the other hand is slow with pushing security updates for things which they did not create themselves, like Java and open source libraries. We don’t know much about the rest, but it doesn’t seem to be much better.
3. There is a lot more malware out there for Windows.
4. Many malware writers were so annoyed by Vista as their primary desktop they switched to using a Mac and started writing malware for the Mac as well.
To be honest, I don’t know what is better.
These companies don’t seem to be interested in pushing out updates. It’s work they don’t want to do; they just do enough to not get a bad image.
But if you look at what Google is doing, you have to remember they are doing this because it fits their way of working. I wouldn’t be surprised if, in their environment, a Unix-based Mac is much easier to secure because it’s similar to Linux (I heard they use Ubuntu).
So it means less work for them.
…. and your evidence to support this claim is?
http://arstechnica.com/apple/news/2009/04/evidence-suggests-first-z…
Pirated CS4.
I did read about the botnets and the Adobe/iWork trojans.
http://www.crn.com/security/212902615
http://macdailynews.com/index.php/weblog/comments/20815/
You can read about it at the above link. Bottom line is that if you legally purchase your software, you won’t get these nasties… Also don’t just automatically type in your administrator password when asked for it by an installer or an app. Research it first.
That being said, I feel the Mac is more secure because of the lack of malware and a better overall security model vs XP. Vista and 7 take a bit from what OS X has had for a while, and that is a step in the right direction.
Still, a lot of the malware that gets installed on a system is due to plain old user error and saying YES to a dialog box that should not have been accepted. I have seen many fake A/V apps get installed this way, instantly disabling the Windows Security Center and any security app with it.
Security through obscurity is what comes to mind when talking about Mac OS. Mac was the first victim in Vancouver’s PWN2OWN, no?
after RTFA..
The reasoning is based on the number of exploits and the targeting of the platform by the bad guys, not really on security.
OS X, for now, is safer than Windows 7. It is NOT more secure.
Pardon me for being blunt, but: Prove it. Don’t just blindly spout something like that without backing it up. And before you call me a fanboy, I’m posting this from Windows 7 and I’d call you out just the same if you swapped the OS names in your statement.
Sorry, but I just like a little fact with my supposition, thank you.
So I’m certain that as soon as the Chinese hack into their Macs they’ll ban OSX too.
We’ll wait for the article…
It’s security by low market share: the vast majority of people still use Windows, so why would one write viruses that only target less than 20% of desktop computers?
Ok, using your link to Secunia:
Search = “MacOS” : Found: 43 Secunia Security Advisories
Search = “Windows” : Found: 1676 Secunia Security Advisories
“Google is working on the Linux-based Chrome OS, so promoting internal use of Google products is high on the agenda.”
Doubt this would extend to the developers though, as they would surely need access to software development tools, e.g. an IDE, etc.
Google uses Ubuntu extensively; there are excellent coding tools in Ubuntu.
Of course, wanting to use Google products will cause Google to increase the number and power of web applications; Chrome OS allows web apps to use client hardware, if I recall correctly.
Actually, no there aren’t.
As of six months ago, much of Android development required a version of Eclipse more recent than the latest copy in Ubuntu’s software repositories.
This meant that Ubuntu users had to manually download and install Eclipse – i.e. configure their coding tools from sources outside of Canonical’s distros.
Aside from that, Ubuntu isn’t really that much different from most other Linux distros in terms of development tools.
Wow, Ubuntu users having to do what all other people on all other OS’s have to do. Install apps manually. Man Ubuntu sucks. LOL!
You’ve missed the point of my post if that’s all you’ve taken from it.
I wonder why more people aren’t using rolling-release distros, where things like this never happen.
Because updates get little stress-testing in order to ship faster, a single update can break everything.
As an example, I stopped using Arch Linux because one day they pushed a major update of pacman, the package manager itself, into the “stable” repo with insufficient testing, and this totally broke package management on my system. I couldn’t install a single package, not even the pacman bugfix release that was delivered some weeks later.
Rolling release is not good for everyone, because it forces people to think and read changelogs before installing updates. Most people don’t want that, they just want to fix security holes, and new stuff will wait until they decide to try out the newest release of their distro of choice.
Well, it could be a “rolling release”. I mean, get regular app updates but not core system apps, like pacman.
Wouldn’t that be great?
Yes, that would be great.
But it will never happen in linux-land until a distro comes up with the concept of “base OS” and “3rd-party apps”, and packages them separately.
I don’t know. It could just be displacing the source of bugs: if OpenOffice crashes after an update, the average office suite user with a schedule will feel only slightly less angry than if it’s the kernel that crashes…
First off, before the flames start, I’d like to say that I love Linux, and use it everyday.
Software distribution is however SOOO FUNDAMENTALLY FLAWED in Linux. Why do I have to wait for the distro packagers to deliver an app? Why can’t I just grab a .app like on OSX and JUST RUN IT?????
The .app in OSX is really a directory structure, which houses the executable, any libraries, an icon, config files, etc…, similar to a Java JAR.
It would be so freaking simple to have Gnome or KDE support such a system.
I’ve “installed” a newer version of Eclipse in Ubuntu than what came with the distro, and compared with OSX, it’s a nightmare. First, stick it somewhere, like /opt, get all the permissions straight, then starts the nightmare of editing .desktop entries to give me a menu item, or a desktop icon. This is crazy.
A user should be able to grab a .app from anywhere, stick it anywhere on their file system, and run it, or drag it to the desktop or toolbar to automatically create a launcher for it.
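For what it’s worth, the mechanics are simple enough to sketch. Here is a rough, hypothetical Python illustration of an .app-style launcher, using the real OS X bundle layout (Contents/Info.plist naming a binary under Contents/MacOS); it is purely a sketch of the concept, not how Finder actually launches apps:

    #!/usr/bin/env python3
    # Toy .app-style launcher: read the bundle's metadata, find the
    # executable, and run it from wherever the bundle happens to sit.
    import os
    import plistlib
    import subprocess
    import sys

    def launch_bundle(bundle_path):
        info_path = os.path.join(bundle_path, "Contents", "Info.plist")
        with open(info_path, "rb") as f:
            info = plistlib.load(f)                # bundle metadata
        exe_name = info["CFBundleExecutable"]      # name of the binary to run
        exe_path = os.path.join(bundle_path, "Contents", "MacOS", exe_name)
        subprocess.Popen([exe_path])               # no install step, no registry

    if __name__ == "__main__":
        launch_bundle(sys.argv[1])                 # e.g. launch.py ~/Anywhere/Foo.app

Nothing in this requires the bundle to live in any particular place, which is exactly the grab-it-and-run behaviour described above.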
You can. It’s called precompiled software. It may or may not run, though. With Macs, you have a single platform target. With Linux, you have many targets. So you do one of two things. Compile everything you install from source, like Gentoo. Or you have packages. Even Windows has different packages. There are fewer options, but 32/64-bit and specific OS versions mean Windows devs have to compile multiple apps or just not support old OSs like Win2k and 98.
Besides the technical difficulty of creating one ‘package’ (.app in Mac-speak) that works everywhere in Linux, how do you do updates? What happens when one of those libraries in the .app has a security bug?
Let’s say a fictitious libPDF (for fun) turns out to have a bug that needs to be fixed. All the developers/vendors that use libPDF in their .app now have to update their version and deploy a new .app for people to use, plus notify their users that it’s available. Then, you have to trust that the users are actually paying attention and will download the new version of the .app. Repeat for each app that uses libPDF. Plus, since the libraries are in the .app, you may end up with 20 different copies of libPDF in .apps, many likely the same version (wasted disk).
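To make that fan-out concrete, here is a toy Python scan of the scenario; the library name and the /Applications path are made up for the example:

    #!/usr/bin/env python3
    # Count how many separate .app bundles ship their own copy of the
    # (fictitious) vulnerable libPDF. Each hit has to wait for its own
    # vendor to ship a whole new bundle; with a shared library there
    # would be exactly one file to patch.
    import os

    VULNERABLE = "libPDF.1.2.dylib"    # hypothetical bad version

    hits = []
    for root, dirs, files in os.walk("/Applications"):
        if VULNERABLE in files:
            hits.append(os.path.join(root, VULNERABLE))

    for path in hits:
        print("needs a vendor update:", path)
    print(len(hits), "independent copies to chase")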
The repository system isn’t perfect by any means, but .app isn’t either. Apple chose compatibility over disk space and ease of updates. It’s a design decision.
There are groups that have done similar things for Linux, I believe, just none of them have caught on with a major distro.
Most Mac apps handle this by asking you to update them when you start them up. It would be nicer to update them in one go, but the whole lib thing is not really an issue on OSX. They would rather just have you download the full updated app bundle than deal with updating one lib in the package. It sucks, but it’s the way it is. That’s why Mac application updates are so damn huge sometimes.
The repository system was designed when hard drive and memory space were at a premium. It’s better described as a legacy system.
Compared to the system in OSX it’s completely retarded.
I completely agree. This has always been an issue and is one of the major reasons why the “year of desktop linux” will never come… until it’s fixed, of course.
The average user, trying to get real work done, is not going to resort to obscure commands or upgrade their entire distro every time an application update comes out that they need.
This is a repo policy problem and not an issue with package managers or shared libraries. It’s entirely possible for 3rd parties to provide their own DEBs or RPMs (and many do, actually). The package manager can install dependencies as needed.
What would be nice, though, is if the package managers could subscribe to feeds from 3rd party sites to know when updates for those packages are available. The format could even be the same as for regular repos, but it would be transparent to the user. When ISV Program #37, whose DEB you downloaded from their site, has an update available, it shows up in system updates just like anything else (and in Synaptic, etc.). You download the update, it pulls in new dependency libraries/versions as needed, and the user basically just has to click a button. This is all possible and reasonable with a package management system. It’s just not implemented properly, thanks to the ridiculous upgrade policies of Ubuntu (and some other distros).
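Something along those lines would be easy to prototype. A rough Python sketch of the feed idea, with the URL, the JSON shape and the package name all invented for illustration:

    #!/usr/bin/env python3
    # Poll a vendor-published update feed and compare it against the
    # locally installed version. A real package manager would fold this
    # into its normal update check; every value here is a stand-in.
    import json
    import urllib.request

    FEED_URL = "https://example-isv.com/updates.json"   # hypothetical feed
    INSTALLED = {"isv-program-37": "1.0.3"}              # stub local package db

    with urllib.request.urlopen(FEED_URL) as resp:
        # expected shape: {"isv-program-37": {"version": "1.0.4", "deb": "https://..."}}
        feed = json.load(resp)

    for name, entry in feed.items():
        if name in INSTALLED and entry["version"] != INSTALLED[name]:
            print(f"{name}: {INSTALLED[name]} -> {entry['version']}, fetch {entry['deb']}")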
This has been a problem since long before Ubuntu appeared on the scene and, as far as I can tell, there’s never been any real attempt to improve it despite the fact that it’s been an obvious issue for numerous people.
At this point I doubt it will be fixed before the entire repository system goes the way of the dodo bird.
That could well be true, alas. Just as Windows security problems were more a result of the culture that arose around Windows 3.1 and Windows 95/98 (single user, no security model) than of weaknesses in NT/2000/XP, so too does it seem that there is just a “way things are done” with package managers and software installation on Linux, and that doesn’t seem to want to change. Like I said, there aren’t any technical reasons why the problems people run into couldn’t be easily solved. But the culture is still one of constant flux in libraries, only one version of a library/app installed at a time, etc. As long as that culture is still around, package managers will be crippled, as you and others have pointed out. I guess I’m still optimistic that someone will get it and push for change, but maybe I should just return to my normal cynicism.
For a number of years, I always kept one of the distros on a secondary computer and optimistically waited for this issue to be fixed… I gave up about 2 1/2 years ago. 😉
At one point, I even thought the people behind Ubuntu would push for this change given all their efforts in the area of ease of use, but it was not to be…
OS X simply does not have any package management. Stating that repositories are retarded in contrast to the utter lack of package management in OS X is itself a “retarded” statement.
Now one may argue about the relative ease of installation/uninstallation of apps in OS X, and there is much to be said for how OS X does this.
The reason why Linux does not offer something like what OS X does is really simple. In fact there have been umpteen failed attempts to implement something similar for Linux. That reason is: Linux, at least in terms of desktops/servers (not so much for the embedded space and tablets/smartphones etc.), is simply not possible without package management (apologies to the LFS guys/gals). Package management is a fundamental necessity due to the incredibly large number of diverse libraries and the resultant dependencies.
Linux is not like OS X or windows for that matter. For every one kind of library that exists for OS X there are 10 different ones for Linux. And then of course there are many, many concurrent versions of said libraries.
Now one could argue that there should not be so many different libraries or that there should only be one current version in use of any library. But that would not be Linux.
Personally I find having to hunt all over the web for an application and download it from some random website and install it is a horrible way of installing software. There are so many security issues involved with this method that no one who is remotely security conscious could truthfully advocate such a system.
Simply put, repositories and package management systems are the solution to managing the inherent complexity of application distribution in Linux. And frankly, when I use OS X I miss having access to the repositories that I take for granted, having been a Linux user for the last 15 years.
For all the apparent advantages of the relative simplicity of app install/uninstall on OS X, the fact that each app is an island unto itself and that there is no “system” is markedly deficient and lacking in my eyes.
The homogeneity of the Mac software ecosystem (all the way down to the iPad) obviates much of the need for such mature solutions as repositories and package management. But this is due to the relative lack of libraries and functionality provided within that software ecosystem, which on the whole is far, far less mature than what is offered under Linux.
Maybe one day, when OS X grows up, we will see package management on OS X too
Oh look another Linux status quo defender. How refreshing.
OSX doesn’t have package management because Apple engineers designed an application management system that works. They looked at the shared library system in traditional Unix and decided that it was not worth including.
No it isn’t, people like you are either unable or unwilling to think of better solutions.
Even if Linux had 1000x the amount of libraries available that still doesn’t require that applications share as many libraries as possible.
That functionality doesn’t require shared libraries. It can be done with binaries. See the portable apps suite as an example:
http://portableapps.com/news/2010-05-28_-_portableapps.com_platform…
A repository with statically linked libraries or verified binaries can have the same level of security.
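“Verified” can be as simple as checking a vendor-published checksum before the first run. A minimal Python sketch, with the hash value and file name as placeholders:

    #!/usr/bin/env python3
    # Verify a downloaded static binary against a checksum the vendor
    # published over a trusted channel. Placeholder values throughout.
    import hashlib

    EXPECTED_SHA256 = "0123456789abcdef..."   # value published by the vendor

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest()

    ok = sha256_of("portable-app.bin") == EXPECTED_SHA256
    print("checksum ok, safe to run" if ok else "tampered or corrupt, do not run")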
No, they exist as a way to preserve hard drive and memory space, a reason that is no longer valid. I can dig up an old IBM paper on Unix that states this as the main purpose of the shared library system if you would like.
So if OSX developers had more libraries to choose from then a shared library system would be needed? That doesn’t make any sense. More libraries for OSX would just result in more options for developers. There is no reason why the software management system would have to be changed.
OSX is certified Unix and users can install updates for programs without having to worry about a package manager breaking existing programs. They can also install updates directly from the developer instead of waiting for the update to trickle through the distro which is a security compromise. The shared library system is retarded in comparison. Users should not have to wait months to update a single program and a system update should not be able to break working applications.
Why then, has Apple chosen a (somewhat more restrictive) repository type system for the iphone and ipad?
Because all MacOS X developers create a package for MacOS X, while only a few Linux developers create a package for each distribution (because that’s a lot of sucky work). They just publish the source code, so you can compile (and package) it for your distribution yourself.
You can always download RPMs and DEBs from any website and install the software just like you would in Windows (albeit without an installer asking you thirty questions about where and how to install). You can also subscribe to repositories that have more up-to-date versions of software or beta/testing versions. And those will/can be auto-updated, so you don’t have to hunt around for each non-distro-release piece of software that you have (unlike Windows and Mac OS X, where every 3rd party program has its own wacko updater).
No it doesn’t ask you any questions, it will just go ahead and break a dependency without your permission.
Not these days. Usually, the installer uses Yum or apt as a backend, not the more primitive rpm or dpkg. So it will install the package, along with any dependencies.
So you are saying that Yum and Apt won’t break dependencies?
Besides the good reasons people repeat again and again, all about the diversity of libraries and file system structures in the Linux world (and the options that brings), I will list some for you:
* Sharing libraries – it makes a program start faster if some libraries are already loaded, and makes the app use less memory in that case – on Macs most developers target the base system libraries and dump whatever else they may need inside the app;
* Library updates mean that an app may benefit from fixes without being recompiled/reinstalled, i.e., they have a wide effect (on security also);
* Optimization for your architecture.
It is possible to create a static app for Linux (you will find some), but most people do not care that much, as they would lose the benefits cited above.
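If you want to see the sharing for yourself, a quick peek at a process’s memory map shows every .so mapped from a single on-disk file; run this from two different programs and you get the same paths, hence shared pages. (Linux-only sketch; /proc/self/maps is the kernel’s view of the process address space.)

    #!/usr/bin/env python3
    # List the shared libraries mapped into this very process.
    libs = set()
    with open("/proc/self/maps") as maps:
        for line in maps:
            parts = line.split()
            # the 6th field, when present, is the backing file's path
            if len(parts) >= 6 and ".so" in parts[-1]:
                libs.add(parts[-1])

    for lib in sorted(libs):
        print(lib)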
Funny how people complain about Firefox and OpenOffice loading slowly in Linux compared to Windows. Saving memory and HD space isn’t an issue when the typical laptop comes with a 320 GB hard drive and 2+ GB of RAM.
Or that update might break the app. Repositories have their own security problem which is that updates of programs are often delayed until dependency issues have been resolved.
Funny how Firefox in Wine loads faster than the native version. Same for OpenOffice.
Distros do not have software management systems designed around static apps, and you get to deal with all kinds of incompatibility issues which are normally ameliorated by releasing the source and letting hundreds of package maintainers handle the rest. That brings up another issue, which is that the repository system is a giant waste of labor.
Firefox and OO are not good examples because they are both bloated, inefficient piles of crap. Even on Windows, Firefox is slower to start than IE or Chrome. Watching strace I/O for Firefox makes me want to cry. It’s just plain inefficient at startup. It has nothing to do with shared libraries. Other programs load plenty fast on Linux. They also use considerably less memory than on Windows. The cost of shared libraries these days is pretty much nil. It’s been measured. It’s fast. Deal with it.
Oh, and OS X does use shared libraries for all system stuff, which is probably the bulk of library-age for any given program.
It’s an example of how there is more to load time than the shared library system. It isn’t as if common programs load faster in Linux.
Load times are also plenty fast in Windows and OSX.
The cost is significant for software developers that want to keep their programs outside the repository system. Deployment, porting and support costs are higher for ISVs due to the repository system. There is also a cost for users when the shared library system FAILS, as it regularly does, which can be seen by searching the Ubuntu help forums. Most users and developers would prefer Linux to have an OSX-style system.
Shared system libraries that are stable for the life of the OS. However programs in Windows and OSX don’t share third-party libraries that constantly change.
But I know I shouldn’t question the repository system when Linux is such a raging success on the desktop. Users love having to go to a forum to learn about a bunch of commands they need to enter so they can upgrade a browser.
http://blogs.computerworld.com/15443/talling_firefox_3_6_one_more_r…
No, it’s not. It’s an example of how Firefox is a bloated pile of crap. Other programs that use shared libraries don’t take forever to load. Your point is utterly invalid.
So you are saying that shared libraries are okay then? Both Windows and OS X use plenty of shared libraries. Even if a program ships with its own copies of libraries, they are still in DLLs and still behave like libraries that are actually shared in RAM by multiple programs. So they face the exact same performance problems you have on Linux (that is, none).
I think you are making it out to be a bigger problem than it is. I’ve rarely had shared library problems on Linux. Most libraries don’t make big changes from release to release. The few that do often allow multiple versions to be installed at the same time so that programs can use the correct version. After all, shared libraries on Linux are properly versioned, so you can install any version alongside any other version. Package managers stupidly don’t provide all versions necessary, but that’s actually easily fixed. It’s just a matter of packaging those other versions, or simply not removing them from the repo when the version is no longer the latest.
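The side-by-side point is easy to demonstrate: programs bind to a versioned soname, not a bare library name, so asking for a specific major version is explicit. A small ctypes illustration in Python (it assumes zlib is installed, which it is almost everywhere):

    #!/usr/bin/env python3
    # Load a shared library by its versioned soname. If a libz.so.2 ever
    # appeared, it could sit right next to libz.so.1, and old programs
    # would keep loading the version they were built against.
    import ctypes

    zlib = ctypes.CDLL("libz.so.1")        # ask for major version 1 explicitly
    zlib.zlibVersion.restype = ctypes.c_char_p
    print(zlib.zlibVersion().decode())     # e.g. "1.2.3"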
Most Linux programs don’t share third-party libraries either. Most programs use toolkit libs, sound libs, X libs, XML parsing libs, etc. All of these are system libs on Windows and OS X and they are effectively system libs on Linux, except that on Linux you have the option of not installing them. Regardless, Linux could simply adopt a policy of limiting API/ABI changes to these libraries, or having more of them have side-by-side version installations. Easily solved. And still, at the end of the day, if a program really needs to have some specific version and doesn’t trust the distro, it can always include it in its RPM/DEB package and just use its own copy instead of the system copy. That was never impossible and never will be.
Ubuntu made a stupid choice in leaving non-system software unable to be upgraded. Again, this isn’t a repo problem or a shared library problem. All of those can handle that, as shown by that thread. If you add a repo that does support advanced versions, you can easily install the newer versions of Firefox without problem. This is simply a policy problem, and one that, IMHO, Ubuntu needs to address. You are right, users should not have to enter cryptic commands to add another repo just so they can get an up-to-date version of a piece of software. The solution, however, is not to dump package managers, or get rid of shared libraries. It’s simply to provide those versions in the repo (or have Mozilla produce their own DEBs for updated versions).
It has more to do with the layer they put around them to isolate the host system, and the effort spent on optimizations for a specific platform, than anything else. It can be argued that the developers of the named programs put a lot more effort into making them run fast on Windows.
First, that is why repositories exist: to test before assembling and shipping. Second, in cases where an update is not possible because of dependency issues, if something is a security risk or has other issues it usually gets patched and then released. If you follow the bug reports you will find this is a common pattern.
Do you think that Wine does something magic about it? Perhaps you know whether some REAL Windows libraries are needed to run them, or whether what comes with Wine is enough. Anyway, already answered.
I was talking about a very specific need, which is far from common, i.e. the need to use an application that either is not in the distro repository or is old and lacks some needed features. The rest of your reasoning is pure nonsense.
It could also be that the Win32 library is more efficient, but whatever the reason it is clear that the shared library system does not automatically confer loading benefits. I also haven’t heard complaints about Windows or OSX loading programs slowly on a modern system. Thus the supposed speed benefit is highly questionable.
That’s still a security compromise because there is a delay from the point at which the developer patches the exploit to when the distro creates their own update. The program should be able to update itself on the day the exploit is discovered.
Are any of these statements false?
1. The repository system causes problems for proprietary developers that don’t exist in Windows and OSX.
2. The repository system causes problems for users that don’t exist in Windows and OSX.
3. Most users are dependent on at least one proprietary program.
The nonsense is that people like you are defending technology decisions that were made in an entirely different era. Linux is closer to a religious movement than an engineering project. Any sane team of OS engineers would ditch the old style shared library system where programs share as much as possible. It simply causes more problems than it solves.
It is possible for Windows to be more efficient. Windows 7 is a very nice system, I like to use it, but I fail to see how the “performance” issues you list are related to shared libraries, as Windows uses a particular form of them. Our arguments are based on theory or perception only; I would like to have a benchmark to back my guess but, unluckily, I do not have one. Probably the more programs you run in parallel that share important libraries, the more you will perceive the difference. Anyway, most users run at most 3 to 4 programs at the same time.
Well, on a good Linux distro it is really easy to update the whole OS; there is a tool for that. On Windows or Mac? You see, I happen to work with support for these 3 OSs, and it is way more common to find old software with known bugs on Windows and Mac, in my experience, of course.
None of them are entirely false or entirely true. But to conclude from them that shared libraries are evil – well, you are ignoring that Windows and OSX both have shared libraries too; it is just that Linux uses them more broadly. It is a decision I still prefer. Also, there is a well-known fact in the Linux world: developers leave the burden of packaging (making binaries of) their applications to Linux distro maintainers. But, as I said, you can find “static” programs: Firefox, Thunderbird, Opera, OpenOffice, Skype, Qt SDK, Eclipse, Netbeans and so on. It is just not the preferred method of distributing software on Linux.
Hum, perhaps you are assuming that I am a Linux “all the time” user? Well, I am not. As I said, I enjoy using Windows 7 and OSX (a bit less) too. Anyway, I still prefer the way Linux (and *BSD) solves the problem of system maintenance. And quite frankly, I think you are the one being a little “religious” here.
You should give PC-BSD a try. It uses its own PBI format for app installs, which is similar in concept to MacOS X .app bundles. Plus, it’s built on top of FreeBSD, so you don’t have to deal with Linux.
That is actually very easy.
You simply download a .deb file, click on it, and it installs in the KDE or GNOME desktops on a Debian-based system.
Because Ubuntu is far more extensively used than any other distro, you have sites like getdeb, which allow you to do just that.
Personally, I prefer adding third party PPA repos; I then get updates as the developers make them.
There is nothing stopping you from downloading a binary and running it, however it is this system that is flawed… A package manager is hugely superior, and this is why most linux users would never even consider downloading and running software by hand…
How do you ensure that your manually installed apps are up to date? Do you really want each app running its own update program in the background?
How do you ensure that the site you download from is legitimate? Sounds like extra work…
Similarly, even locating the app in the first place is extra unnecessary work…
A package manager is just better; Apple operates something similar with the iPhone App Store and users love it… The Linux system has all the benefits of the app store and none of the downsides.
Similar thing happens with ruby gems, specifically rails.
Ruby libs have their own package management system (basically a CPAN clone called rubygems) that does a good enough job. But Debian also tracks specific libs in its repo (one of which is rails), so if you do “apt-get install rubygems”, you end up with a modified version that spits out an error if you try to install a Debian-tracked library through gem.
The reason this is retarded is that certain packages (like rails) move WAY faster than the Debian repo maintainers. That means that if you install rails on Ubuntu the supported way, you end up with an obsolete version that can’t be upgraded until the next release. Replacing the crippled Ubuntu rubygems install with one off rubyforge is potentially a non-trivial job as well. Finally, if rubygems is an existing system that does a better job than apt for Ruby libraries, and is used exclusively by every rubyist, why the hell would you bother trying to fix something that isn’t broken?
I think in a general way, apt is great for app installs, but terrible for dev tools, or anything where staying remotely up to date with upstream is desirable.
It all depends on your usage requirements…
Business and other non technical users want stability, they want one version and to know it isn’t going to change significantly or break, but they do want to receive security updates and be able to install them easily. Ubuntu and RHEL are aimed at users like this.
Technical users want to be on the bleeding edge and have all the latest stuff, and are generally clued up enough to deal with compatibility problems that the latest untested software might bring. Gentoo and Arch are aimed at users like this…
It doesn’t depend at all on usage requirements. It has to do with pulling stuff out of a package system that is used by everyone into your own, not doing as good a job, and actively preventing people from using the one that is used by everyone. Not only that, but developers are, by definition, technical users, so dev tools should not be treated the same as other stuff.
I was referring to the last paragraph in the article, and thinking more along the lines that if Google were to make its employees eat their own dog food (e.g. using Chrome OS), then this might be somewhat difficult for their developers and probably not a necessary move.
? Eclipse on Linux is just fine, especially if they are coding for Java and Linux. I am sure some people will still be using Windows tools for coding on Windows. Then again, maybe they can port to Windows without using Visual Studio?
*Face Palm* Seriously?
How do you think the Linux kernel is written? Do you think they use an IDE? No, it’s called vi or emacs and gcc….
Seriously, that has to be the stupidest comment I have read. Why do you need an IDE to code? Windows is probably the _WORST_ platform to code on. The developers are probably already developing using Linux and are probably reading your reply and laughing and mocking you right now.
Vi, emacs and gcc would still be classed as development tools.
<sarcasm>
Or are you trying to imply that they would use vi / emacs and gcc via a web browser in ChromeOS?
</sarcasm>
What is an IDE? An integrated development environment (IDE).
So in your definition a text editor and compiler == integrated development environment ??
They are development tools yes, but an IDE ?
The company policy is Linux / Mac; where does ChromeOS come into this? Also, just FYI, ChromeOS is based off of Linux. How difficult do you think it would be for them to build a developer build of ChromeOS for internal development? Wow, seriously, I am not sure who to give the dumb ass of the year award to now.
I never mentioned IDE. I simply said they were development tools.
For someone who advocates vi, you sure keep harping on about IDEs.
And the original post said “development tools eg IDE”.
He didn’t (and nor I) state that developers had to use an IDE nor that IDEs were the only type of development tools available.
But then who needs to properly read what people post when you can start an egotistical flamewar?
I take it you didn’t bother reading any of the comments before kicking off? Otherwise you’d know who originally raised ChromeOS and why it was being discussed.
So I suggest you go back, read what people have *ACTUALLY* written, *THEN* comment.
ChromeOS may have a Linux kernel (or at least Linux-derived – I’m not sure how much custom code is in it), but that doesn’t mean it will have all the user-land tools and APIs needed to run their development tools.
In fact, I don’t know why I’m wasting my time with muppets like yourself when there’s much more mature people on here who also happened to know what they’re talking about.
To quote the original commenter:
Yes, they are indeed development tools. Are Vi / Emacs and GCC not development tools? So what is the parent poster implying? That you can only find development tools on Windows? That Windows would be the primary platform for development, or that IDEs are the only viable development tools? .. Please note:
they would surely need access to software development tools, e.g. an IDE, etc.
Then re-read my comment then come back to your own.
Now back to your points:
Look at the time of my post. Again, in relation to the main article, not to anyone’s comments: what does ChromeOS have to do with development tools?
Next, FYI, Linux is licensed under the GPL. If they develop “custom code”, that code falls under the GPL and gets released back into the Linux kernel. So it will be a 100% Linux kernel. I hope that also makes sense to you.
You get the dumb ass award right there well done..
You can take a base distribution, let’s say Debian, and then build your own new interface, but in the background still run a full development toolkit.
Also please understand, the development version and the developers version of ChromeOS will be completely different to the final released version of ChromeOS.
If you’re building for a platform you would build it on the platform, not on a completely separate platform, because you need to be able to compile the code and test the interface. It’s like logic is thrown out of the window. Both of you deserve the dumb ass of the year award. Seriously, congratulations, well done.
What?!!! We’re talking about ChromeOS’s viability for development, not Windows.
In fact, I don’t recall anyone in the conversation even mentioning Windows until now.
And as for the IDE comment, you’re the only one that keeps making that point. Nobody apart from you has implied, stated or even asked that question, and this is a point I’ve made several times now. So why keep reiterating something that never happened?
You’re a terrible liar.
Not only have you referenced other comments, you’ve quoted them and hit reply, so OSNews’s threaded view has grouped the conversations together.
You’re about 25% correct (and that’s being generous!): if Linux gets developed upon, the source code has to be made available due to the GPL licensing – that part is correct. Everything else you’ve stated from then on is quite wide of the mark:
Not all updates to a particular kernel project make it back to the kernel main branch. For example, much of the work Google has done on Android’s kernel is unlikely to find its way into the vanilla Linux kernel branch. Sometimes this is due to incompatibilities between the two designs, sometimes it’s because there’d be a duplication in functionality (different forks will have a different approach to the same problem) and sometimes it’s because there’s simply no need for a specialty piece of code to appear in the vanilla kernel.
Furthermore, because of this there is no one “the Linux kernel” as you put it. There will be several branches depending on the specific developments going on. You’ll also find that some distros have their own adaptations (e.g. Arch Linux) rather than using the vanilla kernel (as Slackware does).
However, the key issue with your whole argument is that you’re assuming ChromeOS is using standard GNU user-space tools. AFAIK it’s not. That’s not to say that development tools couldn’t be ported to run on ChromeOS, but doing so might not be straightforward.
So the question is this: if ChromeOS, in its current build, isn’t suitable for development, then why change it? After all, there are other OSs (and Linux distros) already capable, and ChromeOS wasn’t really developed with this type of usage in mind (it’s more designed as a web front end for low-end machines – essentially a bootable Chrome browser).
But of course you knew all this already, what with being the intellectual you are
ChromeOS isn’t a base distribution. You seem to have some idiotic opinion that ChromeOS is Debian/Red Hat/or whatever but with Chrome bolted on. It’s not. The changes in ChromeOS run deeper than whatever window manager and desktop environment is bolted onto Xorg.
I suggest you read up on it before engaging in a discussion (or at least be man enough to ask questions rather than making absurd arguments).
I doubt it will be “completely” different, but yes, there will be some changes and some might be significant. However that’s not a get out clause for your idiotic statements earlier.
errr, most of ChromeOS’s apps will be web apps, and you don’t need ChromeOS installed to test that as, well, pretty much any computer can run the Chrome web browser.
You keep talking about dumb ass awards, and yet you’re spouting off a whole load of rubbish about a platform you clearly know nothing about.
Oh, and let’s not forget that ChromeOS isn’t their only project: they have Chrome (for several platforms), Android, and numerous cloud products (from Google DNS to GMail).
From here on I’m writing you off as a troll (and judging by the number of times you’ve been marked down, I’d say others agree with me). So unless you’ve got anything intelligent to say, I have nothing further to say to you.
I have been thinking about your dilemma, and the dumb ass award of the year definitely belongs to you
No No .. You and the other guy both get to stand up and share it seriously..
How can you even think that there are no good development tools on Linux / Mac? Seriously? Have you ever used Linux? The whole OS can be turned into one massive integrated development environment if you want it to be.
Did you make that comment as a joke, or were you serious and just didn’t know any better?
I don’t actually recall saying there were no good development tools on Linux / Mac.
Yes I have used GNU/Linux.
Did you even read the article and my original comment?
I’m going to have to object on that one. Programming under Windows is by no means enjoyable, not in the least. Limited control of the system, especially diagnostics; most good free development tools are ported from the Unix world; major headaches with the whole /MT, /MD, etc. builds that can’t mix and match; subpar support for many, many source control systems; and the GUI easily becomes very unresponsive during performance testing. I cannot for the life of me understand why people keep repeating that Windows is so great for developers.
Maybe because we do use the right tools?
I develop for Windows and Unix systems, and while it is true that Windows out of the box does not support many of the nice tools Unix seems to have, there are tons of tools to install that ease a developer’s life.
True, many of them are commercial, but if you are developing for a living, what are a few euros if they make your life easier?
Somehow I feel you don’t have real Windows development experience.
When was the last time you developed in Windows? 1998?
If you are talking about doing development on Windows, chances are you are talking about using Visual Studio. If you are talking about doing Unix development on Windows, I understand how that could suck.
http://qt.nokia.com/products/developer-tools
http://www.youtube.com/watch?v=U7yje3D1UM4&feature=player_embedded
http://www.eclipse.org/
http://www.wingware.com/wingide
http://www.kdevelop.org/
http://www.codeblocks.org/
Integrating with the world’s premier compiler for multiple platforms:
http://gcc.gnu.org/
http://en.wikipedia.org/wiki/GNU_Compiler_Collection
Welcome to 2010.
His comment was in reference to how Chrome OS only provides a browser.
But thanks once again for providing links that are redundant when it comes to supporting your point.
Now here is a link for you:
http://dictionary.reference.com/browse/pithy
Welcome to 2010 – you are so funny
You really didn’t need to go to all that trouble in providing all those links, but hey I guess it is your free time and what you enjoy doing.
I am quite aware that GNU/Linux is well supported in terms of development tools, and that Chrome OS is based on the Linux kernel.
What I was getting at is this: if Google were to insist employees use Chrome OS, then this probably would not extend as far as the developers or support teams unless they modified a version of Chrome OS for this purpose.
Do you know what “cross-platform” actually means?
Development on GNU/Linux with its multitude of IDEs in conjunction with gcc can target a huge array of end platforms and architectures.
Even Chrome OS, even Chrome OS running on any of these architectures:
http://en.wikipedia.org/wiki/GNU_Compiler_Collection#Architectures
composing programs in any of these source languages:
http://en.wikipedia.org/wiki/GNU_Compiler_Collection#Languages
can be targeted by a developer running an IDE in conjunction with gcc.
I note with interest that even Google’s Go language is to be targeted.
You can run Chrome OS itself under a virtual machine.
The IDE and gcc can itself run on many platforms, including Mac OSX and GNU/Linux.
Will Google Embrace Microsoft? No.
Such an enlightened company. More should follow their example.
Wasn’t the attack brought against an XP machine running IE6?
Companies running this setup are hardly in a position to blame the software vendor for insecurities.
True, IE6 is known to be deficient, XP too. However, a number of critical vulnerabilities exist for OSX, too – Safari, Quicktime, iTunes, and so on. Many are cross-platform – Flash, Reader, Firefox, etc, etc. If it’s SECURITY you’re after, build a custom kernel and pre-build an image of the preferred OS with a set of approved applications and their respective versions. Build a local repository and have a security team review each update, remove portions of code that you don’t like, contribute back to the open source community, and you should be fine. Personally I’m not forced onto a specific OS for work, hence OpenSuse is my choice. Which, on the other hand, is a bad habit – you forget how to deal with viruses and malware on Windows installations.
Yeah, but no one is running OSX and Safari from 10 years ago.
If google had been running Win7 and IE8 then it’s very unlikely they’d have been compromised.
They’re trying to blame Microsoft for their own mistakes and promote themselves in the process.
So you are saying that you can’t get malware on Windows 7? LOL! From what I have seen you can get it just as easily as on XP. (We are testing Windows 7 now for the government agency I work for.)
The only way you are really safe on Windows 7 is going 64-bit. But as we have found out here, tons of custom apps built for XP won’t work on Windows 7 64-bit. So you are still stuck.
We have to make a HUGE investment to move those apps to the 64-bit version of Windows. A lot of people want to go to the 32-bit version. We are having a big fight over this now. 🙁
Considering your nick highlights you as being unbiased, and everything you said makes perfect technical sense, I’ll wholeheartedly agree with you.
It’s well established that Vista and 7 are less vulnerable to malware than XP. If you are getting malware in an office environment then the IT guys are doing something wrong.
64 bit is safer because of KPP but the 32 bit version can be plenty safe if locked down properly.
We have not implemented 7 in our office yet. We are just testing to see how the security profile of Windows 7 matches up against the usability we need for our users. Yes, we can lock it down rock solid, but then again, is it still usable? That is the dilemma we are working with. And then the 64-bit version!!! Blah!!
But yes, Windows 7 is more secure than XP! It had better be, with a 10-year gap between them!
But
You can remove iTunes (you don’t need it at work) and Quicktime. You would be using the Chrome browser, and Flash is supposed to be more secure with its built-in version that now comes with Chrome.
Plus, as people LOVE to state, Mac OS and Linux have NO market share and that is what makes them more secure! So they should be pretty safe, right?
Remove QuickTime? Since when?
The player, yes, but that’s like removing iexplore.exe. QuickTime is a core part of Mac OS X, you can’t remove it.
It’s more like removing mshtml.dll, iexplore is just the GUI. You can remove QuickTime.app, but the QT frameworks are a central part of the operating system that you cannot remove without breaking it.
And anyway, most people’s complaints about IE and QuickTime are about the UI, and not the actual engine under the hood (web developers excepted, of course).
As far as I have seen, most of the QT security issues are related to the player and the codecs, not the built-in framework. But I could be wrong.
You are right, though, it is a core part of the OS. But then if you have no vector to get to that core framework (i.e. not using something like the QuickTime player or something else that needs that framework) and there are no ports open to it, then how would you attack it? I guess you could do something that is triggered by the user?
You know what the desperate attempts to whip up anxiety about the claimed insecurity of MacOSX remind me of? A sad alcoholic pointing at the small glass of cold beer you sip after a day’s work and shouting “see, you’re as hooked as I am”.
For all the froth and splutter the simple fact remains – Macs do not get compromised or exploited in the real world. You can argue as much as you like about why, but that simple fact remains true.
That’s a dangerous assumption you are making there. Mac OS X boxes get owned all the time. The sneaky part is that the owners don’t realize it when that happens. Because it “can’t happen”.
Do you run without a firewall and with services enabled to the outside world? When was the last time you checked the system logs of your mac for anomalies?
Think about it…
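If you actually want to do that check, here is a minimal sketch of scanning a syslog-style log for failed-auth lines – the log path and the matched phrases are assumptions, so adjust them for your own box:

import re

LOG = "/var/log/system.log"  # assumed location; auth events may live elsewhere
PATTERN = re.compile(r"(failed|invalid|refused)", re.IGNORECASE)

# Collect every line that smells like a rejected login or connection.
with open(LOG, errors="replace") as f:
    hits = [line.rstrip() for line in f if PATTERN.search(line)]

print(f"{len(hits)} suspicious lines in {LOG}")
for line in hits[-10:]:  # show only the most recent few
    print(line)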
What utter bollocks – please contradict me by offering the slightest bit of evidence to support your absurd statement.
You clearly haven’t.
Well, the recent Pwn2Own contest being an example…
EDIT: Hint, look up Charles Miller and what he has been doing with fuzzing.
LOL, did you seriously believe Google's "security" line on this? It's all a publicity stunt, of course. I think the bigger problem is that a company which manages a large amount of customer data isn't taking security seriously. I think Google is trying to dodge a bullet here; soon someone might come along and say, "You guys have a lot of stuff but don't seem to use it yourselves – why should we?" I hardly think any Linux company uses Windows, either.
I bet this will be a slow process and most people will opt to shift to Mac OS X. It will just be the hardcore techies who show Linux any love.
I'd expect a lot of developers at Google would fit under that label. However, I'd also expect that for most developers Linux or OS X would be equally acceptable, unless they do something OS X-specific.
For me, at any rate, it would be the other way around. Unless I need something Linux-specific, I would rather use OS X, even though the majority of my tools run fine on both (gvim/bash/ruby).
Come on! Why did the OP get a -1 rating?
I was a member of staff during the 23rd Chaos Communication Congress (a hacker conference) in Berlin.
There, a talk had to be canceled because a techie could not connect his Linux box to the projector. And this is a common problem on Linux boxes that has been known, and left unfixed, for decades.
Two years ago at my job, I had to fall back to a paper presentation because my Linux notebook could not manage to connect to a projector, although I had tested it SEVERAL times before my presentation with the very same projector (and it worked – kind of).
So the conclusion can only be: Linux is not yet ready for the desktop/notebook market. Google Chrome OS could change that… but there is a long way to go.
My daughter called me the other day, and she asked if I could help out at her workplace (after-school care), because they were having a presentation that evening and could not get the projector to work with a laptop.
It turned out to be a Windows 7 computer that would not connect to the projector through a video splitter. Windows XP (on another laptop) would; Windows 7, no. They had a presentation program (for a video) that was only installed on the Windows 7 laptop. I had to take out the splitter, so that the presentation was visible on only one screen, not two as planned.
The quintessentially Windows experience (i.e. nightmare) involving binary-only executables, non-working drivers, dumbed-down to the point of lack of configurability, software which is licensed to run only on one particular machine, non-portability of applications, and application-specific file formats very nearly sunk the whole show, even though all of the hardware was in perfect working order.
There is a looooooong way to go for Windows.
Are you f*****g kidding me… who here has had an infected version of Windows where you, as the user, had complete control over the system and it still got infected with a virus? It doesn't happen…
No ifs, buts, or maybes… unless someone has purposefully tried to hack the system – and guess what, most systems, no matter what OS, will fall over time.
Please tell me if I am wrong. I doubt it. Make your response more than a one-level thread.
Marketing campaign, pure and simple…
I would call it marketing if they did not throw OS X into the mix. But throwing OS X into the mix and not going straight Linux suggests it's not 100% marketing.
No matter how you slice it, and as people always like to say: Mac and Linux have no market share, so no one attacks them, so they should be safer, right? And they would be using the Chrome browser on both, so that should make them even safer.
This isn't entirely true. Linux distributions have far more market penetration as servers than Windows does. Yet virus writers don't attack them? Why not? They are more likely to have fatter pipes and more hardware to drive their spam bots. It just so happens that they are more secure by their very nature, being modeled after Unix. If I were a virus/bot writer, I'd try to create one for Linux and get it out there. Think about the supercomputers doing crime for you.
The simple fact is that Windows is far easier to get into, and there are simply a lot more stupid people running it.
Windows can be secured as well; it's just not that way by default. Linux distributions are.
You'd think if Linux (and here I'm referring to anything using the Linux kernel) were so insecure, all the routers, firewalls, cell phones, etc. would have viruses.
You know, that is what I say, but everyone tells me it's because of market share (which is why I made my statement).
I know – almost every router and switch in the world, almost every webserver, almost all of Google's servers, Yahoo (who mostly use BSD), Amazon, Facebook, etc., etc. all run Linux or some Unix.
But you know, the market share argument seems to keep showing up. Crazy.
On top of that, you know that EVERY version of Linux is different. Everyone changes the kernel for their distro. Hard to write viruses for that, for sure. You are only going to hit a small group of users. As a matter of fact, it's hard to attack users of the same distro if they are using different versions of it. Ubuntu 9.10 is not the same as 10.04, that is for sure.
On the other hand, it's very easy to write very nasty, virulent code in Perl, Python, or even in shell scripts, which are not dependent on libraries or the kernel ABI.
Get a bad package with some nastiness in the pre- or post-install scripts and watch the chunks fly. And that even works across several unices.
No, *nixen are not immune. Watch those checksums.
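As a minimal sketch of what "watch those checksums" means in practice – the file name and expected digest below are invented; in reality you'd take the expected value from a signed checksum file published by your distro:

import hashlib

def sha256_of(path: str) -> str:
    # Hash the file in chunks so large packages don't blow up memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical package and digest, for illustration only.
expected = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
if sha256_of("some-package.tar.gz") != expected:
    raise SystemExit("Checksum mismatch: refusing to install")
print("Checksum OK")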
Well, the thing is that in the Windows world, it's very common to run everything as an Administrator user (root). In the Unix world, it isn't. That's a serious obstacle for a virus, but not an insurmountable one.
When a Windows machine is infected, the malware usually has all the rights it needs to cause serious damage to the system and hide itself. That doesn't work so well on a Unix box. Anomalies are easier to detect on Unix. That doesn't mean any Unix is immune to them; they are just not as easy to hide.
Unless something gets run as root, then you are SOL.
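A quick illustration of that obstacle (the target paths here are arbitrary examples): the same write succeeds in the user's home directory but is denied system-wide when you're not root:

import os

for target in (os.path.expanduser("~/.fake_payload"), "/etc/fake_payload.conf"):
    try:
        with open(target, "w") as f:
            f.write("# pretend payload\n")
        print(f"could write {target}")
        os.remove(target)  # clean up after the demonstration
    except PermissionError:
        print(f"denied: {target} (not running as root)")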
That isn’t accurate, actually. This is just one source (eweek, Oct ’07):
“In 2000, Windows comprised about half of the server operating system market, followed by Unix and Netware at about 17 percent each and Linux reaching towards 10 percent, she said, noting that today Windows owns about 70 percent, Linux about 20 percent, with Unix below 10 percent and Netware barely registering.”
No doubt Linux is doing well, and a jump from 10% to 20% in ~8 years was great progress, but even if that trend continued (which it may not, as the article in question is actually about Linux's rate of growth slowing), it wouldn't have "far more" market presence in the server arena than Windows.
I’d imagine Linux proved far more dominant against Unix and Netware, particularly as those companies suffered after the dotcom boom, and Linux took a large percentage of those users. Competition against Windows has been stiffer. As the pool of Unix/Netware shrank, the growth _rate_ in Linux was bound to slow, even if overall usage continued to go up. W2K8 was a solid release as well, which I could see stiffening competition even more.
telns… Servers for what?
For the web?
http://www.securityspace.com/s_survey/data/200905/index.html
Server       Servers      Share    Prev. servers   Share    Change
Apache       26,018,677   71.51%   26,313,623      71.59%   -0.08%
Microsoft     6,362,860   17.49%    6,381,149      17.36%   +0.13%
http://news.netcraft.com/archives/2010/05/14/may_2010_web_server_su…
Market Share for Top Servers Across the Million Busiest Sites, September 2008 – May 2010

Developer    April 2010   Percent   May 2010   Percent   Change
Apache          664,232    66.82%    664,186    66.82%    -0.00
Microsoft       168,829    16.98%    167,740    16.87%    -0.11
nginx            46,698     4.70%     48,598     4.89%    +0.19
Google           20,913     2.10%     19,367     1.95%    -0.16
Just feel free to look through those numbers, and bear in mind they are actually skewed in Microsoft's favour (the million busiest sites clearly indicate Apache's complete market dominance at 70%+).
Well, you're right and you're wrong at the same time. The first problem is, of course, that running Apache just means you are running Apache, not that you are running Linux. Most of those are likely running Linux, but you can't actually know, since Apache runs on Unix/BSD/Mac OS X/Windows, etc. as well.
There may also be selection bias problems related to any attempt to extrapolate overall OS usage patterns from a survey of the web servers used by the top N busiest sites. I’m not sure what kind of bias that would introduce: pro-Linux, anti-Linux, I’ve no idea for sure, but I don’t imagine it can be taken for granted. If anything, if I were pressed to guess, I think it might introduce a bit of pro-Unix bias.
And, last of all and most importantly, of course, there are more servers in the world than just web servers.
Some more numbers, this time quite recent (Q3-4 ’09):
*snip from http://www.zdnet.com/blog/microsoft/behind-the-idc-data-windows-sti… *
Here’s IDC’s OS share data break out.
OS        Q3 2009 units (share)   Q4 2009 units (share)
Windows     1,248,200 (73.9%)       1,434,225 (73.9%)
Unix           72,001 ( 4.3%)          84,851 ( 4.4%)
Linux         357,491 (21.2%)         412,041 (21.2%)
Total       1,688,859               1,941,966
*end snip*
Those figures won’t be perfect, because I don’t know if the study controlled for servers shipped sans OS (which a lot are). Units shipped sans OS are a question mark, not an unambiguous win for Linux. Every server I’ve ever bought or recommended was ordered without the OS, whether we planned to install Linux or Windows.
Google will still need Windows machines to develop the Windows port of the Chrome browser and other applications, and to test whether websites work correctly in IE, so I don't think they would ban all Windows machines even after the transition is completed. But I guess they might limit the network access of those machines. Now their in-house technical support team should have less of a burden.
http://en.wikipedia.org/wiki/VirtualBox#Licensing
http://suepke.eu/embedding-windows-in-linux-with-virtualbox/
http://suepke.eu/wp-content/uploads/2009/02/vbox.png
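For what it's worth, booting such a guest can be scripted with the stock VirtualBox CLI; a minimal sketch, assuming a VM named "WinXP-IE6" has already been created:

import subprocess

VM_NAME = "WinXP-IE6"  # hypothetical guest name

# 'VBoxManage startvm' boots an existing guest; '--type gui' opens the
# normal window instead of running it headless.
subprocess.run(["VBoxManage", "startvm", VM_NAME, "--type", "gui"], check=True)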
And MSIE runs in Wine too. No need to virtualize Windows.
Not only that, but IE6 is XP-only, so you need virtualization anyway, even if you are using Windows as a host.
…I don't know what is. LOL. You see, you really can compete with and endorse a competitor's product at the same time.
First of all, I am assuming that we are generally smart and tech-minded here. So please try to sound a little smarter, or at least pleasant/funny.
Google is a business, and it has made a business decision. It is not a hard one to make when you look at Windows and any possibility of return on investment/security. And I mean real security: the ability to retain a secure config in the office and in the airport or café. Windows is not that OS. But I am, for real, sick and tired of Apple hate. It is Yet Another Unix, and a fine one. I am not an Apple fanboy or some naïve newbie who doesn't understand my system, but if I were, that would be OK too. A user is valid whether he is in the server room or in the HR office. I would look foolish or retro if I hated on KDE or GNOME for the reasons that get tossed around for hating Apple.
No, Apple isn't perfect, but it doesn't come with a legacy of bugs and security holes like Microsoft's. And if you ran Unix all day as the admin account, you would get the same bugs and security holes. Mac OS is neither free nor open, but I can run F/OSS software very easily on a Mac. I have used both Linux and Mac OS for years, and the one thing I really would like from the F/OSS world is standards and ease of support. That is the nature of the business world. The average user does not want to be technical all of the time. I do not want to have to wonder whether this driver or that codec is free or stable.
Maybe Apple will become the next Microsoft. Maybe not. But Google will certainly become the next Apple. It is a legacy of success that I would want to emulate if I were there. And the reason is there, clear as day: if you leave Linux and Unix in the hands of a committee, you will have software that has no chance of making it in the big world.
Not certainly. Maybe. Possibly. Google has been effective in two areas (search and mobile OS), but only profitable in one (search). Android is a loss leader/defensive play for extending search to mobile platforms. Google has made a lot of noise with projects like Wave, Google Office, etc., but few of them have taken off. So, unless Google manages to significantly expand the search market or make Android profitable, it's unlikely that it will have the same revenue growth curve as Apple.
I thought the Google attack was targeted, and not random? If that is the case, security through obscurity only helps against script kiddies. Google should eat their own dog food or use BSD.
-Bounty
It was targeted, but it was leveraging an already-infected Windows box. It was actually a hell of an attack; I don't remember the specifics (I read about it a few months ago), but it ended up chaining multiple exploits to get in.
Let's not forget the big G is currently creating a competing product. I find it hard to take these comments at face value given the conflict of interest.
Users: “Google [Privacy] We don’t feel so good!”
Security on Windows is improving, and it has always been an uphill battle.
Talk of Google and security is pointless when they have no problem disregarding your privacy.