Based on what I’ve read here, Microsoft will be making it more difficult to install any application you want because of its “security” methods, methods aimed at securing Windows by closing a lot of holes. The question is how they are going to retain compatibility with people’s favorite programs. Are those programs going to stop working? Will they have to be upgraded first?
I see the above frustrating people. If they have to go through any hassles they might as well turn to something else like Linux or Macs.
I am really surprised that a company like MS, with a successful operating system line, considers its operating system a cornerstone of any strategy… duh.
More interesting is that they are focusing their next release on the server market, because that’s where MS is losing to GNU/Linux, and with their large installed client base they can neglect the client users (aka home users) and instead present “rich applications”. So cool. Whatever…
Let’s just wait and see and not jump to conclusions. The end of 2006 is still a good way off. Lots of stuff can happen between now and then.
Linux is great, but the usability is zip, zilch, nada. Try installing a scientific app in Linux. First, from the command line, unpack the gz file, then untar it into /usr/local/ with root privileges. Then chmod the installer file and run the install script. But wait, that does not necessarily mean you are done after running the install script. What? I thought I just installed it by running the install script. Well, theoretically you did, but it is not in the path, so you can’t go back to your home directory and call it and just have the app work. Frustrating.

I like the Windows way of things. The same app that I was talking about, I installed in Windows in like 5 seconds and was up and running in another 5. God. I know that Linux is secure and has the fewest bugs per line and all that doodah, but if you can’t make the user experience any better, so that instead of being able to run the apps they want trouble-free, people have to go about tweaking this config file and fighting this retarded dependency with version mismatches because you have the latest version and the app requires the older one… then you see why Windows is so damn popular. Yet Linux has an extremely robust codebase.

Tell you what, open source Linux community: make a Windows version of Linux, ya know? Point and click and no dependency headaches. Until you do, Windows will always be ahead in the desktop war. And whoever said Linux won the desktop war has got to be kidding, or I am really, really out of the OS loop.
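For readers who haven’t been through this dance, the sequence described above looks roughly like the sketch below. Everything in it (the app name, the install layout) is made up for illustration, and a throwaway directory stands in for /usr/local so it is safe to run anywhere:

```shell
# Sketch of the classic "tarball install" dance, under a temp prefix.
set -e
PREFIX=$(mktemp -d)                      # stands in for /usr/local
mkdir -p "$PREFIX/src" "$PREFIX/bin"

# Fake the upstream tarball: one executable script inside app-1.0/
mkdir -p "$PREFIX/app-1.0"
printf '#!/bin/sh\necho "app ran"\n' > "$PREFIX/app-1.0/app"
tar -czf "$PREFIX/app-1.0.tar.gz" -C "$PREFIX" app-1.0

# The classic sequence: gunzip+untar, chmod, copy into place
tar -xzf "$PREFIX/app-1.0.tar.gz" -C "$PREFIX/src"
chmod +x "$PREFIX/src/app-1.0/app"
cp "$PREFIX/src/app-1.0/app" "$PREFIX/bin/"

# The step the comment complains about: until the install dir is on
# PATH, the shell cannot find the newly installed program.
export PATH="$PREFIX/bin:$PATH"
app    # prints "app ran"
```

Real packages add a ./configure and make step, but the PATH complaint at the end is the same.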
Really, is it that simple? Only in the Debian install, I bet. So explain to me how I would be able to get a program like Caret using apt-get. I am posting from a Windows laptop right now, so I can’t try it.
Actually, a lot of distros can use apt-get, not just Debian. I use SUSE as my desktop at home and it runs apt just fine, even though YaST is more than capable. All you have to do is edit the sources list and add a repository.
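For the curious, “editing the sources list” means adding one line per repository to /etc/apt/sources.list; the entries below are a typical Debian-style sketch (mirror URLs and component names vary by distro and are only illustrative):

```
# /etc/apt/sources.list (illustrative entries, not recommendations)
deb http://ftp.debian.org/debian stable main contrib non-free
deb http://security.debian.org stable/updates main
```

After saving, an apt-get update refreshes the package index so the new repository’s packages become installable.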
How long does it take to update all your programs in Windows? You have to download individual patches for every program and then run a separate install for each. On my system it’s just apt-get update followed by apt-get upgrade.
Of course, the main problem is that there is no central install process that everyone obeys, which is both a good and a bad thing. But projects like autopackage are making progress in that area.
Really, is it that simple? Only in the Debian install, I bet. So explain to me how I would be able to get a program like Caret using apt-get. I am posting from a Windows laptop right now, so I can’t try it.
Well, most Linux distros with a modern package management system will allow single-command installs… sometimes from the command line, sometimes from a GUI. When the packages are available, it’s actually much simpler than Windows…
In Debian (and descendants), you can do it with apt-get/frontends.
In Slackware you can with swaret.
In Redhat (and descendants), you can do it with RPM/frontends.
In Arch, you can do it with pacman.
In Gentoo, you can do it with that emerge thingy.
This actually covers most distros/users out there. Most people are using Redhat/Debian or a relative or one of the others I mentioned.
>>Really, is it that simple? Only in the Debian install, I bet. So explain to me how I would be able to get a program like Caret using apt-get. I am posting from a Windows laptop right now, so I can’t try it.<<
I don’t know this application; it is not part of the Debian repositories, not even in non-free or contrib. Perhaps it is closed source, I don’t know. It doesn’t matter for the sake of argument.
That you can simply click on an installer and the software installs flawlessly is thanks to the efforts of the creator of the software. You can use InstallShield to build packages, or any other method (e.g. Wise), or brew your own, but ultimately it works because the software creator did it – not MS Windows or Microsoft.
You can do the same for GNU/Linux, and lots of applications do just this; probably the best known are StarOffice, VisualWorks, or any high-tier game like Doom 3.
You see, just like the driver issue, it is the vendors and creators who support MS Windows, not the other way around. Bad luck for GNU/Linux…
It all depends on who makes the software. For example, take Firebird. Download the bin file and run it. It is a nice graphical installation, just like in Windows. Point and click, choose your own path. If you are not root, install it in your own home directory. No registry or anything needed!
All you have to do is edit the sources list and add a repository.
Right, but the tricky part sometimes is trying to figure out which repository has the version of whatever application you’re looking for. Sometimes it is painless, sometimes it is not.
How long does it take to update all your programs in Windows? You have to download individual patches for every program and then run a separate install for each. On my system it’s just apt-get update followed by apt-get upgrade.
The nice thing about the ‘Windows way’ is that, assuming an app has no auto-update feature (which many do these days), I simply go to the vendor’s website (usually accessible via the Help menu) and I am pretty much guaranteed to be getting the latest stable version. I don’t have to worry about updating my whole system at once and getting a version of something that is several months out of date. Sorry, but power users like to have the latest and greatest, though not necessarily the bleeding-edge beta. With Windows, I have a choice. With apt, I type in apt-get install <software name>, and who the hell knows what I’m getting???
I don’t know this application; it is not part of the Debian repositories, not even in non-free or contrib. Perhaps it is closed source, I don’t know. It doesn’t matter for the sake of argument.
Sure it matters, because it illustrates the fundamental flaw of package managers like apt. If you can’t find what you’re looking for within the package manager, it makes the ‘just use apt’ mantra completely pointless. Sure, it’s great when it works, but if I have to go futzing around repositories for every 5th app I want to install (and, worse yet, not finding what I’m looking for in any of the repositories), its use is limited.
All you guys over here say that installing apps on Linux is a breeze. Yes, that’s true, but don’t forget that it requires an internet connection, and what’s more, it should be fast. You cannot get a CD-ROM full of apps like Amarok, say, Totem, and such and install them with a single click or a command without the net, right? Do you think everyone in this world has an internet connection? Really, it’s not that way.
Where I live, people still install Red Hat 7 or Mandrake 7.2 just because they cannot get the latest versions, and try to make their winmodems work with these outdated distros. Such a pain.
If the CD is set up as an apt-cdrom source, you can install programs off of it just as easily as from an internet apt repository. If programs are distributed correctly, they’re easy to install. Just like in Windows.
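A sketch of that setup, for reference: apt-cdrom registers a mounted disc as a package source, after which installs can pull from the CD like any other repository (the package name here is just an example):

```
apt-cdrom add              # scan the mounted CD and add it to the sources list
apt-get install amarok     # now installs from the disc, no network needed
```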
Even if Windows apps had auto-update (a lot still don’t), it’s still a matter of going to each app and updating it manually, and if it’s a “critical” app, following up with a reboot; then repeat.
I like apt-get because global upgrades are painless (at least for me), as is adding and removing packages. If you’re not a CLI kind of person, Synaptic provides a nice GUI interface. Like I mentioned previously, it’s not perfect, but projects like autopackage are trying to fill in the gaps for software that isn’t included in apt repositories.
Sorry, but power users like to have the latest and greatest, though not necessarily the bleeding-edge beta. With Windows, I have a choice. With apt, I type in apt-get install <software name>, and who the hell knows what I’m getting???
First of all, you don’t have to type anything. You can install programs in Synaptic. To have the “latest and greatest” applications, you need to run a development version of a distribution, but you have to be prepared for things to break. If you only want the latest versions of specific programs, you either download the individual package (but then you don’t get dependency checking) or use apt pinning. It’s something you have to learn, but isn’t that what being a power user is all about?
Is it to make networked business applications that run locally (i.e., fat clients) so easy to write and deploy that one would build one of those in .NET, instead of using a web-based system, which has the potential to work on Linux desktops?
You’re somewhat correct, but even if a CD were put together with all the needed dependencies, it wouldn’t run on Mandrake, Red Hat, Debian, etc. You see, they would all require different CDs. Now imagine what a pain this would be for magazine publishers trying to, say, distribute Amarok to ALL Linux users.
Don’t get me wrong, I am quite happy with my Debian box and Synaptic, but maybe in 10 years it will be a viable way to install apps not just in the EU or USA.
Even with apt-get you will have problems with dependencies. Example: I tried installing Firefox on RHEL 2.0. No can do – library conflict. However, some of the libraries that should be updated can’t be, because otherwise they would break the system. In Windows you have much better backwards compatibility.
If I want to uninstall an application, I can’t just go to my “add/remove programs” applet, highlight the app and click on “remove”.
Even when an installation is automated in Linux, it won’t create the “shortcuts” and icons in the “K” or GNOME menus.
The only company I saw taking care of some of the above-mentioned issues was Loki. They, I think, had these nailed down with their installer and uninstallers.
But they’re dead, and in spite of the OSS-ness of their software, no one picked up on their work.
Thanks for the replies, guys. I think it is a bit unfair of me to make comments like that considering I am looking at things from a Windows point of view, but you do have to admit that there are things in Windows that are quite well done, as there obviously are things in Linux that are well done. I just wish there was an amalgamation of the two. I think that would suit everyone just fine.
Sure, it’s hard to distribute a program on a CD with a magazine, but is that really necessary? Most dependencies and general-purpose programs come on the distribution CDs. For the few isolated cases where internet access isn’t possible and dependencies are needed, you can burn a program and all of its dependencies onto a CD. That’s essentially what Windows programs do. They include all the DLLs necessary and install them if they aren’t present.
Supporting many distributions would still be a problem, but that’s not really a big deal with open source programs. The developer can release the source and the community can make packages for each distro. With commercial apps, it’s a bit harder, but Skype managed. Basically, you support as many distros as you can, and allow others to install a statically compiled binary. The program will still work, but you won’t get to use the package management features. It’s not that big of a sacrifice.
Even if Windows apps had auto-update (a lot still don’t), it’s still a matter of going to each app and updating it manually, and if it’s a “critical” app, following up with a reboot; then repeat.
Right, except for two things:
1. Most apps these days don’t require reboots. I installed VS.NET 2003 last night and didn’t have to reboot afterwards.
2. Even though many apps need to be updated manually, you don’t need to do them all at once, so it’s not like you gotta sit there for hours every week updating all your apps.
Niran
First of all, you don’t have to type anything. You can install programs in Synaptic.
It’s not the typing that bothers me; it’s just that even when using Synaptic, if you’re using the same repositories as the command-line apt program, you’re still going to have the same issues of incomplete/outdated packages, so Synaptic really doesn’t solve anything.
To have the “latest and greatest” applications, you need to run a development version of a distribution, but you have to be prepared for things to break.
If I want the latest and greatest, why do I need to run a development version of a distro and expect things to break? Why not create a package system that doesn’t suck, so that I can use any distro I want and get the latest and greatest of what I want? Ahhh yes, I know… it’s free, so bitching is not allowed, I forgot. Just goes to show you that the package managers aren’t as badass as people say they are. Or at least they would be if somebody had the common sense to do it right. (Hint: each distro should not be rolling its own packages.)
What a healthy, resilient, long-term strategy: live your life to defeat threats from other companies. And especially threats from people who are *not* in other companies, but do computer work for its own sake.
I paid very large sums of cash (from a home-office computer user’s point of view) for XP Pro and Office Pro, and Microsoft has absolutely no interest (that I can tell) in supporting my old, reliable Panasonic dot-matrix printer for printing labels in Microsoft Access. There are bugs in the driver. BTW, the only reason I bought it at all was because I was asked to. I wouldn’t have for myself.
They only started talking seriously about file system security when they got lambasted for it in the press. They only started doing anything serious about it when the press wanted to know when they would stop talking it up and start acting.
Why in the world are people still spending so much of their hard-earned money on products from a company that sees everyone else as a threat, and has been convicted of felonious anti-trust practices?
Frankly, I think they should significantly lower the cost of their products, or provide full-range support for them including legacy devices. If nothing else, they should open-source the drivers they don’t want to support, so we can fix whatever is wrong with them.
The latter two issues you mention have been solved in Ubuntu. The problem was that most developers don’t include the .desktop file that is necessary to add things to the menu. All supported packages in Ubuntu have those added to them. The first problem I’ve never had, but I think the problem was that you were installing a Firefox package that wasn’t meant for your version of RHEL. A properly made package doesn’t update any base libraries.
@Darius
You completely ignored the other option I gave you for getting the latest versions of only specific packages: apt pinning. Just because you don’t understand how package management works doesn’t mean it’s being done wrong. You can do everything you’re asking about, but you just don’t want to take the time to learn. That’s the reason distros come preconfigured. If you want something beyond what they offer, you have to learn how to do it. Someone coming from the Linux world to Windows would be just as put off by the fact that they have to go to each application developer to get programs instead of downloading it from a centralized place. It’s just a different way of doing things. If you try it for a while, I don’t think you’ll be that put off by it.
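For concreteness, apt pinning lives in the apt preferences file; here is a minimal sketch (the package name and priority numbers are illustrative, not a recommendation) that keeps the system on stable while pulling one specific package from unstable:

```
Package: *
Pin: release a=stable
Pin-Priority: 900

Package: firefox
Pin: release a=unstable
Pin-Priority: 950
```

With both stable and unstable repositories in the sources list, the higher priority on the single package makes apt prefer its unstable version while everything else stays pinned to stable.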
True, I may not be updating them all at once, but it’s nice to just run one command and see if there are updates for every single one of my apps, as well as system-based upgrades.
You completely ignored the other option I gave you for getting the latest versions of only specific packages: apt pinning.
Apt pinning, from what I understand of it, still does not solve the underlying problem. Say you have 5 repositories set up in your preferences file, but the actual package you’re looking for is on a 6th repository somewhere and you have no idea where it is. What then?
Another thing: I know you can install from whatever repository you want, but is there a command that will go out and query all the repositories and report back which version of a package each repository has? And if there is, what do you do if all your repositories have a version that is out of date compared to the one you really want?
Anyway, with all this apt pinning and other BS, it kind of blows a hole through the myth that installing apps in Linux takes like 3 seconds. Yeah, sometimes it does, sometimes it doesn’t.
Heretic
True, I may not be updating them all at once, but it’s nice to just run one command and see if there are updates for every single one of my apps, as well as system-based upgrades.
Well, in your instance, I’m assuming you’re running some kind of command that’ll tell you whether your apps are up to date. But is the information really accurate? Are you really being told what the latest and greatest is, or only the latest and greatest of whatever happens to be in your apt source repositories? I mean, it’s better than nothing, but not exactly optimal.
I use the gwdg.de source for my repositories and they generally have the latest and the greatest; I usually get updates a day after they are released, and if I want to be on the bleeding edge I can always opt to download unstable packages. It’s definitely optimal, and it works great for software that you wouldn’t know is out of date unless you check the developer’s site (e.g. codecs). It’s a great way to keep your software up to date, as well as to install new programs.
Linux definitely offers a lot more choice than Windows. In Windows, if your executable doesn’t work, that’s it. In Linux you can get it from a repository, or try a custom install script, or compile it from source. It may be confusing to a regular Joe, but I wouldn’t give it up to go back to the “oh, it doesn’t work, oh well” option of Windows.
I use the gwdg.de source for my repositories and they generally have the latest and the greatest; I usually get updates a day after they are released, and if I want to be on the bleeding edge I can always opt to download unstable packages.
Sounds good, but are you able to query different repositories to see what each one of them has? For example (pulling a made-up command out of my ass):
apt-get query <package_name>
Would this return a list of all the repositories in your sources.list file (or whatever it’s called), along with whatever package version each repository has? If not, they should definitely add that capability. For a power user, that would pretty much make the click-n’-run thingies pointless. That is, ASSUMING I don’t have to look for three months to find a .deb file for some app that isn’t very popular.
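As it happens, apt already ships something close to this hypothetical command: apt-cache can report which versions of a package are visible from your configured sources (the exact output layout varies by apt version):

```
apt-cache policy <package_name>   # installed vs. candidate versions, per configured source
apt-cache search <keyword>        # search all configured repositories by name/description
```

It only sees repositories listed in sources.list, so it doesn’t help with the “6th repository you don’t know about” problem, but it does answer “which version would I get, and from where”.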
Linux definitely offers a lot more choice than Windows. In Windows, if your executable doesn’t work, that’s it. In Linux you can get it from a repository, or try a custom install script, or compile it from source. It may be confusing to a regular Joe, but I wouldn’t give it up to go back to the “oh, it doesn’t work, oh well” option of Windows.
Ummm, no. I generally tend to stick with apps that have proven to be feature-rich, stable, and reliable over time. Since my Windows box is running optimally all the time (remember, I’m no newbie), if an app crashes on startup, there’s usually something wrong with the app. If I look in the docs or the program’s FAQ and can’t figure out what’s going on, I generally conclude that it’s not worth the trouble. I’m not going to spend an eternity trying to get one app working in ANY OS when there are probably at least 3 dozen other apps that do the exact same thing.
That’s the reason why I don’t buy cheap & crappy hardware.
Linux will never be all things to all people. If, after considering all of the relevant factors, you determine that Windows meets your needs better than Linux, then please, by all means, use Windows. Personally, I’ve found that it is becoming increasingly rare that Windows is a better fit for my needs than Linux, but your needs are no doubt different from my own.
The shortcomings each of you has attributed to Linux seem to me to be primarily a consequence of your being mistaken on matters of fact. Other than that, you each list some complaints which amount to holding Linux at fault for failing to do things in the fashion you’re accustomed to Windows doing them.
In some cases Windows does things better than Linux. In some cases Linux does things better than Windows. In an awful lot of cases, Linux and Windows simply do things differently. If doing things just like they are done in Windows is a strong requirement of yours, then Linux will likely never satisfy you, and you should just stick with Windows.
Whatever you choose, good luck to you and happy computing.
”Based on what I’ve read here, Microsoft will be making it more difficult to install any application you want because of its “security” methods, methods aimed at securing Windows by closing a lot of holes. The question is how they are going to retain compatibility with people’s favorite programs. Are those programs going to stop working? Will they have to be upgraded first?
I see the above frustrating people. If they have to go through any hassles they might as well turn to something else like Linux or Macs.”
You will have NO problem running legacy apps on Windows Longhorn, at ALL. During several of their demos, Microsoft has shown older legacy apps running under Longhorn, including but not limited to VisiCalc. Users won’t have any problems with their legacy apps and incompatibilities. You can and will still be able to run untrusted apps on Longhorn.
I’m sorry, but if Caret is only available as source code for Linux (instead of RPMs or other installable binaries), then it’s not the distro maker’s or Linux developers’ fault; it’s entirely the responsibility of the Caret developers. There’s nothing preventing them from using a GUI installer (like CodeWeavers uses for CrossOver Office, or the OpenOffice installer, or Autopackage) or “commercial” RPMs that install the program but still require a serial key to use.
Meanwhile, the near totality of the software I use is available in the Mandrake repositories. And to those who asked how you know whether the version you install through a package manager is stable: stable and unstable packages are kept in different repositories, which you can easily set up separately, so you can choose which version of an app or library you want to install: older, latest stable, or bleeding-edge (and unstable).
Even with apt-get you will have problems with dependencies. Example: I tried installing Firefox on RHEL 2.0. No can do – library conflict.
Were you using the repository for your distro? Because that shouldn’t be a problem. I have no problems installing Firefox through rpmdrake (a urpmi frontend, similar to Synaptic). It automatically installs dependencies if needed (of course, I have to use the correct repository for my distribution).
However, some of the libraries that should be updated can’t be, because otherwise they would break the system. In Windows you have much better backwards compatibility.
This can easily be solved through static linking. I think that’s what it does if you install Firefox with the installer from mozilla.org.
If I want to uninstall an application, I can’t just go to my “add/remove programs” applet, highlight the app and click on “remove”.
Yes you can, with rpmdrake at least. You can install, update, and remove all apps through the GUI, and it will take care of dependencies for you.
Even when an installation is automated in Linux, it won’t create the “shortcuts” and icons in the “K” or GNOME menus.
Again, if you use the package manager, it should. Whenever I add an app through rpmdrake, the menu shortcut is created (it is also possible to do this with the Loki installer, as CodeWeavers has proved).
Darius is demanding to be catered to by developers the same way that they do for Windows, and he’s right; it’s not bloody likely. There aren’t repos for Windows because there don’t have to be. It’s up to individual application developers to come up with their own installers. It’s not Microsoft’s job to worry about you getting your apps; it’s the developer’s. With Linux, it’s the other way around. Because there are hundreds of distros with a dozen package managers, it’s not fair to expect a developer to build an installer, so the responsibility falls on the distro. If the app doesn’t rate, the user gets caught out. He’s not wrong. It’s a hole in the Linux system. People are working on it; let’s deal with it instead of denying it.
As an Ubuntu apt user, I have to chime in here and say that I think the Windows-install lovers are completely nuts. You guys really enjoy downloading 100+ software packages from all over the place and clicking accept, next, next, next, next over and over again? I suppose it’s OK if you only install software occasionally, but the effort involved in installing a Windows machine and then making it usable drives me crazy.
Please keep in mind that apt (and similar packaging systems) is meant to scale and fit many different circumstances and systems. Sure, finding a repository with a particular obscure .deb package can be tricky sometimes, but building such a package from source is usually not that problematic with deb-make. I think apt and kin really shine when it comes to system-wide upgrades and repetitive tasks (like upgrading many computers). Being able to do “apt-get upgrade” on ten computers over ssh from one machine has made my day many times, compared with running around downloading thousands of corresponding .exe files if the machines had been M$ boxes.
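That many-machine upgrade boils down to a small shell loop; the hostnames below are invented for illustration, and the actual ssh call is left commented out so the sketch runs anywhere:

```shell
# Hypothetical host list; on a real network each name would be ssh-reachable.
upgrade_all() {
    for host in "$@"; do
        echo "upgrading $host"
        # ssh "root@$host" 'apt-get update && apt-get -y upgrade'
    done
}

upgrade_all web1 web2 db1
```

In practice you would also want key-based ssh auth so the loop runs unattended.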
All you guys over here say that installing apps on Linux is a breeze. Yes, that’s true, but don’t forget that it requires an internet connection, and what’s more, it should be fast. You cannot get a CD-ROM full of apps like Amarok, say, Totem, and such and install them with a single click or a command without the net, right? Do you think everyone in this world has an internet connection? Really, it’s not that way.
If you only have a slow modem connection, you wouldn’t be any better off using Windows. An unpatched Windows system is compromised within minutes, even on a slow modem connection, and the service packs tend to be very large.
It will deliver some unique features, so I guess we will see new hardware too. We see a new video card from ATI and NVIDIA twice a year, so you can imagine a video card of 2006. According to some .ppts from PDC’03, you will get pure hardware acceleration for font rendering only with DX10.
For those who expect to run Avalon on XP: well, it will run, but you do not get the same behavior, because Longhorn brings some important tweaks to the kernel. And by the way, the whole Longhorn vs. XP story is more about the new shell experience. And I just can’t wait for WinFS.
Just want to say, for all the Mac and Linux guys: Longhorn is not only about Avalon as a new, hardware-accelerated way to build GUIs; it is about a new programming model. The same applies to the other WinFX stuff. This is what you will not see on Mac or Linux in the near future. What is more important: having some candy built-in features, or a powerful programming model that takes a fraction of the time to program the same application?
So be cool. There is no way Apple can be competitive with MS in the near future.
WinFX brings a new programming model. As part of it, there is a new application programming model. And it will make installation, updating, and other management much, much better than you can imagine today. I hope it will be so.
I do not understand. There are a lot of interesting, really interesting new features in Longhorn. Why do you people pick from the end of the list? As an example, what about a feature like Least-Privilege User Accounts? Isn’t that what all XP users want? And once again, the main feature is WinFX. It makes a Windows developer much more productive than a Linux developer. So Windows will see more cool apps.
And look at the coming versions of VS. It seems to me that VS is going to be number one (if not VS2005, then definitely Orcas). Heh. Maybe Miguel can do something.
“Based on what I’ve read here, Microsoft will be making it more difficult to install any application you want because of its “security” methods, methods aimed at securing Windows by closing a lot of holes. The question is how they are going to retain compatibility with people’s favorite programs. Are those programs going to stop working? Will they have to be upgraded first?
I see the above frustrating people. If they have to go through any hassles they might as well turn to something else like Linux or Macs.”
HAHA, this comment made me crack up.
So these users will be frustrated to upgrade to Longhorn, but they will NOT be frustrated with the thought of losing ALL their applications to move to linux or macs?
I wouldn’t call that Least-Privilege thing an interesting or new idea. It’s basically what everyone else has been doing since day 1 – it’s a massive potential security hole that they *might* be doing something to fix, at last.
But ultimately it will be of little good unless everything works with it – a number of games will only run as Administrator. If they don’t fix that in a seamless fashion, it won’t really matter whether the control panel works or not.
Does WinFX _really_ make developers more productive? What will it allow them to program that we haven’t seen already?
I’m a little jaded about all these so-called wonderful new APIs – I’m not sure you really see much benefit in terms of applications being fundamentally better than they have been. It may make them easier to develop, but they aren’t getting cheaper either…
I highly doubt it. That link you provided was one guy’s opinion saying that he thought Longhorn might do that – I can’t see Microsoft actually giving in and letting users interoperate with Linux, their no. 1 adversary.
I’m dual-booting between Windows 2003 Server and Debian.
apt-get doesn’t always work with Debian. Just today, when I installed “lame”, I had to use the tarball.
One thing I’ve always disliked about Windows is that I have to reboot every time I sneeze – even on the servers. A software install on Windows almost always means a reboot.
I think the worst part of installing apps in Linux is that there are a few dozen “standard” ways to do it. It’s insane: apt-get, rpm, urpmi, swaret, on and on. I really think this needs to be somewhat standardized.
Somebody mentioned how easy it is to remove software on Windows. In theory it may be easy, but in practice it can be nearly impossible. Often, software you remove isn’t really removed; it’s still on your disk and in your registry. Some apps don’t show up in the add/remove section. Some apps absolutely will not let you remove them: Symantec firewall.
I think the worst part of installing apps in Linux is that there are a few dozen “standard” ways to do it. It’s insane: apt-get, rpm, urpmi, swaret, on and on. I really think this needs to be somewhat standardized.
Personally, I blame the software developer if the package does not install properly (this is in regard to binary packages). If they did the same thing to you in Windows, would you blame Windows? Obviously not.
Windows developers realised a long time ago that they needed to bundle the libraries their programs need into a nice, neat installation package. BTW, Windows does not have a package installer that any developer can use. They actually have to go out and buy their own package installer. And they buy it because it is part of how you develop for Windows.
When you write software for Linux you need to do more work becuase it is less clear what the installation routine should be. But it is not the fault of Linux if a developer does not understand how to package for Linux. It is the developers fault. Maybe they need a Linux Install 101 course.
Be real: with more than 95% of the desktop operating system market, and with that share growing from year to year, you really have to come up with a really usable desktop, and with applications, to overcome this. Until there is something consistent in the Linux desktop arena, an API approved by all as good enough to develop on, you won’t get mass desktop adoption for Linux. It can be done, but at this point in time it requires a great effort, albeit at a fraction of the price.
(And from a vendor that is known to send virus-infected binaries to innocent consumers, nonetheless. And never mind the NGSCB dongle taking yet more of users’ freedom away still.)
If I want the latest and greatest, why do I need to run a development version of a distro and expect things to break? Why not create a package system that doesn’t suck, so that I can use any distro I want and get the latest and greatest of what I want? Ahhh yes, I know… it’s free so bitching is not allowed, I forgot. Just goes to show you that the package managers aren’t as badass as people say they are. Or at least they would be if somebody had the common sense to do it right. (Hint: each distro should not be rolling its own packages.)
Darius, you are being pretty unfair to the packaging process on GNU/Linux. So you want to run the latest and greatest on GNU/Linux? Well, that comes down to running development packages, because that’s the latest and greatest on GNU/Linux. By the way, development packages are mostly found in the development version of a distro.
By implication, you are comparing beta software on GNU/Linux to released software on Windows. Guess what: on Windows you are never running the latest and greatest as an end user. If you are not a super-NDA’d beta tester, you can only run a release version, and updating it means getting bug fixes, not a complete new version. On Windows a new version of a proprietary package means a new purchase. (Sometimes there is the occasional PUBLIC beta, which is not bleeding edge either.)
The problem with software on GNU/Linux is that people don’t want to wait a few days or weeks till the packagers put the new software up for download or in the repository. No, it has to be now, like commercial off-the-shelf software. Never mind that COTS software also has a packaging process which takes time; it’s just that an end user doesn’t see it. They get a shiny box with an “already stale” release NOW, and that seems immediate, but it isn’t.
Just because an end user IS in the loop with the development and packaging process on GNU/Linux doesn’t mean that the basic flow of creating and packaging the software is any different from that of Windows software. It takes time to get software ready for an end user. It would be nice if people wouldn’t blindly point to the false immediacy of proprietary software when they are criticizing the speed of software availability on GNU/Linux. Just because GNU/Linux allows you to come into the kitchen (which proprietary firms don’t) doesn’t mean you get to eat before dinner is ready.
I didn’t say Linux is the end-all, be-all OS for everyone; sure it has its problems, as does every OS. My point is that when it comes to software installs it’s not as problematic as most people think. If you’re not that technically proficient then maybe you should stick with Windows, but if you like customizing your OS to fit your needs and you don’t mind getting your hands dirty, Linux is definitely worth a look.
Sure, Windows is easy, but it won’t let me do all the things I want (customizing the kernel, changing desktop environments, removing any program I want, creating my own security solutions to vulnerabilities, etc.), and that price is a little too high just for “ease of use”.
The totally annoying “Windows has finished installing support for your Hardware. Please Reboot” (or something similar, so don’t nit-pick).
Why does it lose its drivers like this when the hardware has been stable for months, no USB or FireWire devices have been plugged in, and the system has been up for days? Auto Updates is totally disabled as well.
It makes me puke that a modern operating system can get away with this crap. Even Server 2003 does it.
Oh well, back to my trusty VMS server. At least here when a driver is loaded it stays loaded.
M$ cannot provide in this day and age a stable operating system that does not:
1) Nag you to death about everything under the sun it thinks you should be doing, rather than letting you get on with real work
2) Mess around with the drivers etc. without asking you
3) Prevent me from taking a complete backup copy of my OS disk in case of hardware fault, and then booting the copy on the same system.
These are, IMHO, pretty basic things, but all M$ seem to care about is making everything more complicated.
With Linux, I can choose how much “Stuff” gets installed and when it is installed it stays installed.
Did you install Matlab for Linux? I did. It is as easy as the Matlab for Windows installation.
You only have to run
/mnt/cdrom/setup (or install, I don’t remember the name)
and answer some questions in the graphical dialogs that appear.
And more: Matlab for Linux is MUCH better than the Windows version. It is more stable and it has better performance (except for that Java IDE; I use -nojvm to disable that junk).
Based on what I’ve read here, Microsoft will be making it more difficult to install any application you want, based on its “security” methods: methods aimed at securing Windows by closing a lot of holes. The question is how they are going to retain compatibility with people’s favorite programs. Are they going to stop working? Will they have to be upgraded first?
I can see the above frustrating people. If they have to go through any hassles, they might as well turn to something else like Linux or Macs.
I am really surprised that a company like MS, with a successful operating system line, considers its operating system a cornerstone of any strategy… duh.
More interesting is that they are focusing their next release on the server market, because that’s where MS is losing to GNU, and with their large installed client base they can neglect the client users (aka home users) and instead present “rich applications”. So cool. Whatever…
Let’s just wait and see and not jump to conclusions. The end of 2006 is still a good stretch of time away. Lots of stuff can happen till then.
Regards
Linux is great, but the usability is zip, zilch, nada. Try installing a scientific app in Linux. First, from the command line, unpack the gz file, then untar it into /usr/local/ with root privileges. Then chmod the installer file and run the install script. But wait, that does not necessarily mean you are done after running the install script. What? I thought I just installed it by running the install script. Well, theoretically you did, but it is not in the path, so you can’t go back to your home directory, call it, and just have the app work. Frustrating.
I like the Windows way of doing things. The same app I was talking about I installed in Windows in like 5 seconds and was up and running in another 5. God. I know that Linux is secure and has the fewest bugs per line and all that doodah, but if you can’t make the user experience any better, so that instead of being able to run the apps they want trouble-free people have to go about tweaking this config file and fighting this retarded dependency with version mismatches because you have the latest version and the app requires the older one… then you see why Windows is so damn popular. Yet Linux has an extremely robust codebase. Tell you what, open source Linux community: make a Windows version of Linux, ya know? Point and click and no dependency headaches. Until you guys do, well, Windows will always be ahead in the desktop war. And whoever said Linux won the desktop war has got to be kidding, or I am really, really out of the OS loop.
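To spell out the dance I’m complaining about, here is the whole thing as a script. Every path and name is made up for illustration, but this is the shape of it, including the PATH fix that no install script does for you:

```shell
#!/bin/sh
# Reproduce the "installed it but the shell can't find it" problem and the
# one-line fix. The prefix stands in for /usr/local/<someapp>; every name
# here is hypothetical.
set -e
prefix=$(mktemp -d)
mkdir -p "$prefix/bin"
printf '#!/bin/sh\necho it works\n' > "$prefix/bin/someapp"
chmod +x "$prefix/bin/someapp"            # the chmod step from the rant

# Installed, but not on $PATH, so calling it from $HOME fails:
command -v someapp >/dev/null 2>&1 || echo "someapp: command not found"

# The missing step (append the same line to ~/.profile to keep it):
export PATH="$PATH:$prefix/bin"
someapp                                   # prints: it works
```

The last two lines are the part the install script never does for you.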
apt-get install <software name>
3 seconds
Really, is it that simple? Only in the Debian install, I bet. So explain to me how I would be able to get a program like Caret using apt-get. I am posting from a Windows laptop right now so I can’t try it.
Actually, a lot of distros can use apt-get, not just Debian. I use SUSE as my desktop at home and it runs apt just fine, even though YaST is more than capable. All you have to do is edit the sources list and add a repository.
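For the curious, “edit the sources list” just means adding lines like these to /etc/apt/sources.list (the mirrors and components here are only an example):

```
# /etc/apt/sources.list -- example entries (mirror choice is illustrative)
deb http://ftp.debian.org/debian stable main contrib non-free
deb http://security.debian.org/ stable/updates main

# after editing, refresh the package index with: apt-get update
```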
How long does it take to update all your programs in Windows? You have to download individual patches for every program and then run a separate install for each. On my system it’s just apt-get update && apt-get upgrade.
Of course, the main problem is that there is no central install process that everyone obeys, which is both a good and a bad thing. But projects like autopackage are making progress in that area.
Really, is it that simple? Only in the Debian install, I bet. So explain to me how I would be able to get a program like Caret using apt-get. I am posting from a Windows laptop right now so I can’t try it.
Well, most Linux distros with a modern package management system will allow single-command installs… sometimes from the command line, sometimes from a GUI. When the packages are available, it’s actually much simpler than Windows…
In Debian (and descendants), you can do it with apt-get/frontends.
In Slackware you can with swaret.
In Redhat (and descendants), you can do it with RPM/frontends.
In Arch, you can do it with pacman.
In Gentoo, you can do it with that emerge thingy.
This actually covers most distros/users out there. Most people are using Redhat/Debian or a relative or one of the others I mentioned.
>>Really, is it that simple? Only in the Debian install, I bet. So explain to me how I would be able to get a program like Caret using apt-get. I am posting from a Windows laptop right now so I can’t try it.<<
I don’t know this application; it is not part of the Debian repositories, not even non-free or contrib. Perhaps it is closed source, I don’t know. It doesn’t matter for the sake of argument.
That you can simply click on an installer and the software installs flawlessly is thanks to the efforts of the creator of the software. You can use InstallShield to build packages, or any other method (e.g. Wise), or brew your own, but ultimately it is because the software creator did it, not MS Windows or Microsoft.
You can do the same for GNU/Linux, and lots of applications do just this; probably the best known are StarOffice, VisualWorks, or any high-tier game like Doom 3.
You see, just like the driver issue, it is that the vendors and creators support MS Windows, not the other way around. Bad luck for GNU/Linux…
It all depends on who makes the software. For example, take Firebird: download the bin file and run it. It is a nice graphical installation, just like in Windows. Point and click, choose your own path. If you are not root, then install it in your own home directory. No registry or anything needed!
All you have to do is edit the sources list and add a repository.
Right, but the tricky part sometimes is trying to figure out which repository has the version of whatever application you’re looking for. Sometimes it is painless, sometimes it is not.
How long does it take to update all your programs in Windows? You have to download individual patches for every program and then run a separate install for each. On my system it’s just apt-get update && apt-get upgrade.
The nice thing about the ‘Windows way’ is that assuming an app has no auto-update feature (which many do these days), I simply go to the vendor’s website (usually accessible via the Help menu) and I am pretty much guaranteed to be getting the latest stable version. I don’t have to worry about updating my whole system at once and getting a version of something that is several months out of date. Sorry, but power users like to have the latest and greatest, though not necessarily the bleeding edge beta. With Windows, I have a choice. With apt, I type in apt-get install <software name>, and who the hell knows what I’m getting???
I don’t know this application; it is not part of the Debian repositories, not even non-free or contrib. Perhaps it is closed source, I don’t know. It doesn’t matter for the sake of argument.
Sure it matters, because it illustrates the fundamental flaw of package managers like apt. If you can’t find what you’re looking for within the package manager, it makes the ‘just use apt’ mantra completely pointless. Sure, it’s great when it works, but if I have to go futzing around repositories for every 5th app I want to install (and worse yet, not finding what I’m looking for in any of the repositories), its use is limited.
All you guys over here say that installing apps on Linux is a breeze. Yes, that’s true, but don’t forget that it requires an internet connection, and what’s more, it should be fast. You cannot get a CD-ROM full of apps like amarok, say, totem and such, and install them with a single click or a command without the net, right? Why do you think everyone in this world has an internet connection? Really, it’s not that way.
Where I live people still install Redhat 7 or Mandrake 7.2 just because they cannot get the latest versions, and they try to make their winmodems work with these outdated distros. Such a pain.
If the CD is set up as an apt-cdrom source, you can install programs off of it just as easily as an internet apt repository. If programs are distributed correctly, they’re easy to install. Just like in Windows.
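In case anyone wants to try it, the whole disc routine is only a few commands (run as root; the package name is just an example):

```
# Register the CD as a package source, then install from it:
#   apt-cdrom add            # scans the disc and adds a cdrom: source line
#   apt-get update
#   apt-get install amarok   # comes off the disc if the package is on it
```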
Even if Windows apps had auto-update (a lot still don’t), it’s still a matter of going to each app and updating it manually, and if it’s a “critical” app, that’s followed by a reboot, and repeat.
I like apt-get because global upgrades are painless (at least for me), as is adding and removing packages. If you’re not a CLI kind of person, Synaptic provides a nice GUI interface. Like I mentioned previously, it’s not perfect, but projects like autopackage are trying to fill in the gaps for software that is not included in apt repositories.
Sorry, but power users like to have the latest and greatest, though not necessarily the bleeding edge beta. With Windows, I have a choice. With apt, I type in apt-get <software name>, and who the hell knows what I’m getting???
First of all, you don’t have to type anything. You can install programs in Synaptic. To have the “latest and greatest” applications, you need to run a development version of a distribution, but you have to be prepared for things to break. If you only want the latest versions of specific programs, you either download the individual package (but then you don’t get dependency checking) or use apt pinning. It’s something you have to learn, but isn’t that what being a power user is all about?
Nope, it won’t! I cannot get the needed dependencies without an internet connection. The last time I tried, it couldn’t.
Is it to make writing networked business applications that run locally (i.e., fat clients) so easy to write and deploy that one would make one of those in .NET, instead of using a web-based system, which has the potential to work on Linux desktops?
It could work..
You’re somewhat correct, but even if a CD were assembled with all the needed dependencies, it wouldn’t run on Mandrake, Redhat, Debian, etc. You see, they would all require different CDs. Now imagine what a pain it would be for magazine publishers to, say, distribute Amarok to ALL Linux users.
Don’t get me wrong, I am quite happy with my Debian box and Synaptic, but maybe in 10 years it will be a valuable way to install apps, not just in the EU or USA.
Why? Three reasons:
Even with apt-get you will have problems with dependencies. Example: I tried installing Firefox on RHEL 2.0. No can do – library conflict. However, some of the libraries that should be updated can’t be, because otherwise they would break the system. In Windows you have much better backwards compatibility.
If I want to uninstall an application, I can’t just go to my “add/remove programs” applet, highlight the app and click on “remove”.
Even when an installation is automated, in Linux, it won’t create the “shortcuts” and icons in the “K” or Gnome menus.
The only company I saw taking care of some of the above-mentioned issues was Loki. They, I think, had these nailed down with their installer and deinstallers.
But they’re dead, and in spite of the OSS-ness of their software, no one picked up on their work.
Thanks for the replies, guys. I think it is a bit unfair of me to make comments like that considering I am looking at things from a Windows point of view, but you do have to admit that there are things in Windows that are quite well done, as there obviously are things in Linux that are well done. I just wish there was an amalgamation of the two. I think that would suit everyone just perfectly.
Sure, it’s hard to distribute a program on a CD with a magazine, but is that really necessary? Most dependencies and general-purpose programs come on the distribution CDs. For the few isolated cases where internet access isn’t possible and dependencies are needed, you can burn a program and all of its dependencies onto a CD. That’s essentially what Windows programs do. They include all the DLLs necessary and install them if they aren’t present.
Supporting many distributions would still be a problem, but that’s not really a big deal with open source programs. The developer can release the source and the community can make packages for each distro. With commercial apps, it’s a bit harder, but Skype managed. Basically, you support as many distros as you can, and allow others to install a statically compiled binary. The program will still work, but you won’t get to use the package management features. It’s not that big of a sacrifice.
Heretic
Even if Windows apps had auto-update (a lot still don’t), it’s still a matter of going to each app and updating it manually, and if it’s a “critical” app, that’s followed by a reboot, and repeat.
Right, except for two things:
1. Most apps these days don’t require reboots. I installed VS.NET 2003 last night and didn’t have to reboot afterwards.
2. Even though many apps need to be updated manually, you don’t need to do them all at once, so it’s not like you gotta sit there for hours every week updating all your apps
Niran
First of all, you don’t have to type anything. You can install programs in Synaptic.
It’s not the typing that bothers me; it’s just that even when using Synaptic, if you’re using the same repositories as the command-line apt program, you’re still going to have the same issues of incomplete/outdated packages, so Synaptic really doesn’t solve anything.
To have the “latest and greatest” applications, you need to run a development version of a distribution, but you have to be prepared for things to break.
If I want the latest and greatest, why do I need to run a development version of a distro and expect things to break? Why not create a package system that doesn’t suck, so that I can use any distro I want and get the latest and greatest of what I want? Ahhh yes, I know… it’s free so bitching is not allowed, I forgot. Just goes to show you that the package managers aren’t as badass as people say they are. Or at least they would be if somebody had the common sense to do it right. (Hint: each distro should not be rolling its own packages.)
What a healthy, resilient, long-term strategy: live your life to defeat threats from other companies. And especially threats from people who are *not* in other companies, but do computer work for its own sake.
I paid very large sums of cash (from a home-office computer user’s point of view) for XP Pro and Office Pro, and Microsoft has absolutely no interest (that I can tell) in supporting my old, reliable Panasonic dot-matrix printer for printing labels in Microsoft Access. There are bugs in the driver. BTW, the only reason I bought it at all was because I was asked to. I wouldn’t have for myself.
They only started talking seriously about file system security when they got lambasted for it in the press. They only started doing anything serious about it when the press wanted to know when they would stop talking it up instead of acting.
Why in the world are people still spending so much of their hard-earned money on products from a company that sees everyone else as a threat, and has been convicted of felonious anti-trust practices?
Frankly, I think they should significantly lower the cost of their products, or provide full-range support for them including legacy devices. If nothing else, they should open-source the drivers they don’t want to support, so we can fix whatever is wrong with them.
@Anonymous
The latter two issues you mention have been solved in Ubuntu. The problem was that most developers don’t include the .desktop file that is necessary to add things to the menu. All supported packages in Ubuntu have those added to them. The first problem I’ve never had, but I think the problem was that you were installing a Firefox package that wasn’t meant for your version of RHEL. A properly made package doesn’t update any base libraries.
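For anyone wondering, the .desktop file in question is tiny; a minimal sketch, with the app name and path made up:

```
# /usr/share/applications/someapp.desktop (hypothetical application)
[Desktop Entry]
Type=Application
Name=SomeApp
Comment=An example menu entry
Exec=someapp
Icon=someapp
Categories=Utility;
```

Drop a file like that in with the package’s other files and the menu entry shows up in both KDE and GNOME.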
@Darius
You completely ignored the other option I gave you for getting the latest versions of only specific packages: apt pinning. Just because you don’t understand how package management works doesn’t mean it’s being done wrong. You can do everything you’re asking about, but you just don’t want to take the time to learn. That’s the reason distros come preconfigured. If you want something beyond what they offer, you have to learn how to do it. Someone coming from the Linux world to Windows would be just as put off by the fact that they have to go to each application developer to get programs instead of downloading it from a centralized place. It’s just a different way of doing things. If you try it for a while, I don’t think you’ll be that put off by it.
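Since apt pinning keeps coming up: it amounts to a few stanzas in /etc/apt/preferences. A sketch, with the package name only as an example:

```
Explanation: track unstable for this one package
Package: amarok
Pin: release a=unstable
Pin-Priority: 800

Explanation: keep everything else on stable
Package: *
Pin: release a=stable
Pin-Priority: 700
```

With priorities like these, apt will pull amarok from unstable while leaving the rest of the system on stable.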
“Nope, it won’t! I cannot get the needed dependencies without an internet connection. The last time I tried, it couldn’t.”
That’s odd. You cannot get automatic updates for Windows without an Internet connection either.
True, I may not be updating them all at once, but it’s nice to just run one command and see if there are updates for every single one of my apps, as well as system-based upgrades.
Niran
You completely ignored the other option I gave you for getting the latest versions of only specific packages: apt pinning.
Apt pinning, from what I understand of it, still does not solve the underlying problem. Say you have 5 repositories set up in your preferences file, but the actual package you’re looking for is on a 6th repository somewhere and you have no idea where it is?
Another thing: I know you can install from whatever repository you want, but do they have a command that will go out and query all the repositories and report back which version of a package each repository has? And if they do, what do you do if all your repositories have an out-of-date version compared to the one you really want?
Anyway, with all this apt pinning and other BS, it kind of blows a hole through the myth that installing apps in Linux takes like 3 seconds. Yeah, sometimes it does, sometimes it doesn’t.
Heretic
True, I may not be updating them all at once, but it’s nice to just run one command and see if there are updates for every single one of my apps, as well as system-based upgrades.
Well, in your instance, I’m assuming you’re running some kind of command that’ll tell you whether your apps are up to date. But is the information really accurate? Are you really being told what the latest and greatest is, or only the latest and greatest of whatever happens to be in your apt source repositories? I mean, it’s better than nothing, but not exactly optimal.
I use the gwdg.de source for my repositories and they generally have the latest and the greatest; I usually get updates a day after they are released, and if I want to be on the bleeding edge I can always opt to download unstable packages. It’s definitely optimal, and it works great with software that you wouldn’t know is out of date unless you check the developer’s site (e.g. codecs). It’s a great way to keep your software up to date as well as to install new programs.
Linux definitely offers a lot more choice than Windows. In Windows, if your executable doesn’t work, that’s it. In Linux you can get it from a repository, or try a custom install script, or compile it from source. It may be confusing to a regular joe, but I wouldn’t give it up to go back to the “oh, it doesn’t work, oh well” option of Windows.
I use the gwdg.de source for my repositories and they generally have the latest and the greatest; I usually get updates a day after they are released, and if I want to be on the bleeding edge I can always opt to download unstable packages.
Sounds good, but are you able to query different repositories to see what each one of them has? For example (pulling some made up command out of my ass):
apt-get query <package_name>
Would return a list of all the repositories in your sources.list file (or whatever it’s called), along with the package version each repository has. If not, they should definitely add that capability. For a power user, the lack of it makes the click-n’-run thingies pretty pointless. That is, ASSUMING I don’t have to look for three months to find a .deb file for some app that isn’t very popular.
Linux definitely offers a lot more choice than Windows. In Windows, if your executable doesn’t work, that’s it. In Linux you can get it from a repository, or try a custom install script, or compile it from source. It may be confusing to a regular joe, but I wouldn’t give it up to go back to the “oh, it doesn’t work, oh well” option of Windows.
Ummm, no. I generally tend to stick with apps that have proven to be feature-rich, stable, and reliable over time. Since my Windows box is running optimally all the time (remember, I’m no newbie), if an app crashes on startup, there’s usually something wrong with the app. If I look in the docs or the program’s FAQ and can’t figure out what’s going on, I generally tend to think that it’s not worth the trouble. I’m not going to spend an eternity trying to get one app working in ANY OS when there’s probably at least 3 dozen other apps that do the exact same thing.
That’s the reason why I don’t buy cheap & crappy hardware.
Linux will never be all things to all people. If, after considering all of the relevant factors, you determine that Windows meets your needs better than Linux, than please by all means use Windows. Personally, I’ve found that it is becoming increasingly rare that Windows is a better fit for my needs than Linux, but your needs are no doubt different than my own.
The shortcomings each of you has attributed to Linux seem to me primarily to be a consequence of your being mistaken on matters of fact. Beyond that, you each list some complaints which amount to holding Linux at fault for failing to do things in the fashion you’re accustomed to Windows doing them.
In some cases Windows does things better than Linux. In some cases Linux does things better than Windows. In an awful lot of cases Linux and Windows simply do things differently. If doing things just like they are done in Windows is a strong requirement of yours, then Linux will likely never satisfy you and you should just stick with Windows.
Whatever you choose, good luck to you and happy computing.
“Based on what I’ve read here, Microsoft will be making it more difficult to install any application you want, based on its “security” methods: methods aimed at securing Windows by closing a lot of holes. The question is how they are going to retain compatibility with people’s favorite programs. Are they going to stop working? Will they have to be upgraded first?
I can see the above frustrating people. If they have to go through any hassles, they might as well turn to something else like Linux or Macs.”
You will have NO problem running legacy apps on Windows Longhorn, at ALL. During several of their demos Microsoft has shown older legacy apps running under Longhorn, including but not limited to VisiCalc. Users won’t have any problems with their legacy apps and incompatibilities. You can and will still be able to run untrusted apps on Longhorn.
are you able to query different repositories to see what each one of them has?
Yep.
apt-get query <package_name>
Close. It’s actually: apt-get search <package_name OR package_description>
Amen.
I’m sorry, but if Caret is only available as source code for Linux (instead of RPMs or other installable binaries), then it’s not the distro maker’s or Linux developers’ fault; it’s entirely the responsibility of the Caret developers. There’s nothing preventing them from using a GUI installer (like CodeWeavers uses for CrossOver Office, or the OpenOffice installer, or Autopackage) or “commercial” RPMs that install the program but still require a serial key to use.
Meanwhile, nearly all of the software I use is available in the Mandrake repositories. And to those who asked how you know whether the version you install through a package manager is stable: stable and unstable packages are kept in different repositories, which you can easily set up separately, so you can choose which version of an app or library you want to install: older, latest stable, or bleeding edge (and unstable).
Even with apt-get you will have problems with dependencies. Example: I tried installing Firefox on RHEL 2.0. No can do – library conflict.
Were you using the repository for your distro? Because that shouldn’t be a problem. I have no problems installing firefox through rpmdrake (urpmi frontend, similar to synaptic). It automatically installs dependencies if needed (of course, I have to use the correct repository for my distribution).
However, some of the libraries that should be updated can’t be, because otherwise they would break the system. In Windows you have much better backwards compatibility.
This can easily be solved through static linking. I think that’s what it does if you install Firefox with the installer from mozilla.org.
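An easy way to check which kind of build you got (the binary name is just an example):

```
# ldd lists a binary's shared-library dependencies:
#   ldd ./firefox-bin
# A dynamically linked build prints a list of libraries; a statically
# linked one just reports "not a dynamic executable".
```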
If I want to uninstall an application, I can’t just go to my “add/remove programs” applet, highlight the app and click on “remove”.
Yes you can, with rpmdrake at least. You can install, update and remove all apps through the GUI, and it will take care of dependencies for you.
Even when an installation is automated, in Linux, it won’t create the “shortcuts” and icons in the “K” or Gnome menus.
Again, if you use the package manager it should. Whenever I add an app through rpmdrake, the menu shortcut will be created (it is also possible to do this with the Loki installer, as Codeweavers has proved).
Darius is demanding to be catered to by developers the same way they do for Windows, and he’s right; it’s not bloody likely. There aren’t repos for Windows because there don’t have to be: it’s up to individual application developers to come up with their own installer. It’s not Microsoft’s job to worry about you getting your apps; it’s the developer’s. With Linux, it’s the other way around. Because there are hundreds of distros with a dozen package managers, it’s not fair to expect a developer to build an installer, so the responsibility falls on the distro. If the app doesn’t rate, the user gets caught out. He’s not wrong. It’s a hole in the Linux system. People are working on it; let’s deal instead of deny.
As an Ubuntu apt user I have to chime in here and say that I think the Windows install lovers are completely nuts. You guys really enjoy downloading 100+ software packages from all over the place and clicking Accept, Next, Next, Next over and over again? I suppose it’s OK if you only install software occasionally, but the effort involved in installing a Windows machine and then making it usable drives me crazy.
Michael
Please keep in mind that apt (and similar packaging systems) is meant to scale and fit many different circumstances and systems. Sure, finding a repository with a particular obscure .deb package can be tricky sometimes, but building such a package from source is usually not that problematic with deb-make. I think apt and kin really shine when it comes to system-wide upgrades and repetitive tasks (like upgrading many computers). Being able to do “apt-get upgrade” on ten computers over ssh from one computer has made my day many times, compared with running around and downloading thousands of corresponding .exe files if the machines had been M$s.
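For the curious, that routine is nothing more than a small loop. A sketch with made-up hostnames; it only prints what it would do unless you flip DO_IT to yes:

```shell
#!/bin/sh
# Mass-upgrade a list of Debian machines over ssh (hostnames are made up).
# Dry run by default: set DO_IT=yes to actually connect.
upgrade_all() {
    for host in "$@"; do
        cmd='apt-get update && apt-get -y upgrade'
        if [ "${DO_IT:-no}" = "yes" ]; then
            ssh "root@$host" "$cmd"
        else
            echo "would run on $host: $cmd"
        fi
    done
}

upgrade_all web1 web2 db1
```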
Over and out.
Wasn’t it said one time (by Microsoft themselves I think), that Longhorn will be able to run Linux software?
http://www.vnunet.com/comment/1156794
All you guys over here say that installing apps on Linux is a breeze. Yes, that’s true, but don’t forget that it requires an internet connection, and a fast one at that. You cannot get a CD-ROM full of apps like Amarok or Totem and install them with a single click or a single command without the net, right? Do you think everyone in this world has an internet connection? Really, it’s not that way.
If you only have a slow modem connection you wouldn’t be any better off using Windows. An unpatched Windows system is compromised within minutes even over a slow modem connection, and the service packs tend to be very large.
Close. It’s actually: apt-get search package_name OR package_description
Close. It’s actually: apt-cache search package_name or description
Sorry, I’m just nitpicking
Victor.
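For anyone following the nitpick above, the corrected flow on a Debian-style system is a cache search followed by an install. The helper below just echoes the two commands as a dry run (the package names are only examples), which makes the division of labour explicit: apt-cache searches, apt-get installs:

```shell
#!/bin/sh
# Echo the search-then-install sequence a Debian/Ubuntu user would run.
# Dry-run form: pipe the output to sh (as root) to actually execute it.
search_then_install() {
    echo "apt-cache search $1"    # search package names and descriptions
    echo "apt-get install $2"     # install the package the search found
}

search_then_install "audio player" amarok
```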
Longhorn will deliver some unique features, so I guess we will see new hardware too. We get a new video card from ATI and NVIDIA twice a year, so you can imagine what a video card of 2006 will be capable of. According to some .ppts from PDC ’03, you will get pure hardware acceleration of font rendering only with DX10.
For those who expect to run Avalon on XP: well, it will run, but you will not get the same behavior, because Longhorn brings some important tweaks in the kernel. And by the way, the whole Longhorn vs. XP story is more about the new shell experience. And I just can’t wait for WinFS.
Just want to say, for all the Mac and Linux guys: Longhorn is not only about Avalon as a new, hardware-accelerated way to build GUIs; it is about a new programming model. The same applies to the rest of the WinFX stuff. This is what you will not see on Mac or Linux in the near future. What is more important: having some candy built-in features, or a powerful programming model that takes a fraction of the time to program the same application?
So be cool. There is no way Apple can be competitive with MS in the near future.
WinFX brings a new programming model, and as part of it there is a new application model. It will make installation, updating and other management much, much better than you can imagine today. I hope it will be so.
I do not understand. There are a lot of interesting, really interesting new features in Longhorn. Why do you people pick from the end of the list? As an example, what about a feature like Least-Privilege User Accounts? Isn’t that what all XP users want? And once again, the main feature is WinFX. It makes a Windows developer much more productive than a Linux developer, so Windows will see more cool apps.
And look at the coming versions of VS. It seems to me that VS is going to be number one (if not VS 2005, then definitely Orcas). Heh. Maybe Miguel can do something
“Based on what I’ve read here. Microsoft will be making it more difficult to install any application that you want to install based on it’s “Security” methods. Methods aimed at securing Windows by closing a lot of holes. The question is how they are going to retain compatibility with people’s favorite programs. Are they going to stop working? Are they going to have to upgrade first?
I see the above frustrating people. If they have to go through any hassles they might as well turn to something else like Linux or Macs.”
HAHA, this comment cracked me up.
So these users will be frustrated upgrading to Longhorn, but they will NOT be frustrated by the thought of losing ALL their applications if they move to Linux or Macs?
I wouldn’t call that Least-Privilege thing an interesting or new idea. It’s basically what everyone else has been doing since day 1 – it’s a massive potential security hole that they *might* be doing something to fix, at last.
But ultimately it will be of little good unless everything works with it – a number of games will only run as an Administrator. If they don’t fix that in a seamless fashion, it won’t really matter if the control panel works or not.
Does WinFX _really_ make developers more productive? What will it allow them to program that we haven’t seen already?
I’m a little jaded about all these so-called wonderful new APIs – I’m not sure you really see much benefit in terms of applications being fundamentally better than they have been. The new APIs may make apps easier to develop, but apps aren’t getting cheaper either…
Wasn’t it said one time (by Microsoft themselves I think), that Longhorn will be able to run Linux software?
http://www.vnunet.com/comment/1156794
I highly doubt it. That link you provide was one guy’s opinion saying that he thought that Longhorn might do that – I can’t see Microsoft actually giving in and letting users interoperate with Linux, their no.1 adversary.
From my experience.
I’m dual-booting between Windows Server 2003 and Debian.
apt-get doesn’t always work with Debian. Just today, when I installed “lame”, I had to use the tarball.
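The tarball fallback mentioned above is usually the classic unpack-configure-make dance. The sketch below creates a throwaway stand-in tarball in /tmp so the unpack step actually runs; “foo-1.0” is a made-up name, and the configure/make steps are left as comments because the dummy tree has no build system:

```shell
#!/bin/sh
set -e
# Create a stand-in source tarball so the steps below are runnable.
mkdir -p /tmp/tarball-demo/foo-1.0
echo "placeholder source" > /tmp/tarball-demo/foo-1.0/README
tar -czf /tmp/tarball-demo/foo-1.0.tar.gz -C /tmp/tarball-demo foo-1.0

# The classic manual-install sequence:
cd /tmp/tarball-demo
tar xzf foo-1.0.tar.gz             # 1. unpack the source
cd foo-1.0                         # 2. enter the source tree
# ./configure --prefix=/usr/local  # 3. for autotools-based projects
# make && sudo make install        # 4. build and install (invisible to apt)
```

The downside is exactly the complaint in the thread: anything installed this way is invisible to the package manager, so upgrades and removal become manual too.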
One thing I’ve always disliked about Windows is that I have to reboot every time I sneeze – even on the servers. A software install on Windows almost always means a reboot.
I think the worst part of installing apps on Linux is that there are a few dozen “standard” ways to do it. It’s insane: apt-get, rpm, urpmi, swaret, on and on. I really think this needs to be somewhat standardized.
Somebody mentioned how easy it is to remove software on Windows. In theory it may be easy, but in practice it can be nearly impossible. Often software you remove isn’t really removed; it’s still on your disk and in your registry. Some apps don’t show up in the Add/Remove section. Some apps absolutely will not let you remove them: Symantec firewall.
I think the worst part of installing apps on Linux is that there are a few dozen “standard” ways to do it. It’s insane: apt-get, rpm, urpmi, swaret, on and on. I really think this needs to be somewhat standardized.
http://smartpm.org/
The biggest problem with Linux is applications – or rather, applications that actually work without crashing and without searching user forums for solutions.
Windows has Office; in big corporations and the enterprise, Windows is the desktop and Office is the suite used for the ‘standard’ business desktop.
Linux downfalls:
—————-
Linux distros come and go faster than a cheetah chasing its meal.
Linux lacks serious apps for specialized applications.
Linux needs wireless drivers that work, and more multimedia applications.
I have been running Red Hat since the 6.0 days, on and off through 9.0, but now I am running Fedora/Red Hat Enterprise RHEL 3.0 and it is really nice.
I have all of my multimedia applications set up with all of the codecs installed, so now I am completely ALL Linux at home. But at work, Windows is the only solution because of Windows-only applications.
I like Linux better than Windows; it is running on my laptop right now.
Cheers.
Yeah, apt-get wifi driver, if only!
apt-get new soundcard driver, if only!
apt-get new vid card driver, if only!
apt-get 5 button mouse driver that works, if only!
apt-get sata raid driver, if only!
apt-get virtualdub for linux, if only!
apt-get avisynth for linux, if only!
apt-get industry accepted cad program for linux, if only!
apt-get fix for Gnome’s stupid file selector, if only!
apt-get drag-and-drop menus for Gnome, if only!
apt-get latest greatest games for your kids, if only!
apt-get and I could go on and on and on and on!
I absolutely love linux, but it’s got a hell of a long way to go imho!
Personally I blame the software developer if the package does not install properly (this is in regard to binary packages). If they did the same thing to you in Windows would you blame Windows? Obviously not.
Windows developers realised a long time ago that they needed to bundle the libraries their programs need into a nice, neat installation package. BTW, Windows does not ship a package installer that any developer can use. Developers actually have to go out and buy their own package installer, and they buy it because it is part of how you develop for Windows.
When you write software for Linux you need to do more work, because it is less clear what the installation routine should be. But it is not the fault of Linux if a developer does not understand how to package for Linux; it is the developer’s fault. Maybe they need a Linux Install 101 course.
More research is required but it seems that there are quite a few different package installer solutions out there.
http://autopackage.org/
http://www.bitrock.com/
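Whatever installer framework a developer picks, a “Linux Install 101” script can at least fail early with a clear message instead of dying halfway through. A minimal sketch of a pre-flight dependency check (the function name and the tools checked are purely illustrative):

```shell
#!/bin/sh
# Pre-flight dependency check: verify each required command exists on PATH
# before the installer copies anything onto the system.
require() {
    command -v "$1" >/dev/null 2>&1 || {
        echo "missing dependency: $1" >&2
        return 1
    }
}

require tar && require gzip && echo "all dependencies satisfied"
```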
Be real: with more than 95% of the desktop operating system market, and market share growing from year to year, you really have to come up with a really usable desktop, and with applications, to overcome this. Until then, nothing consistent is going to happen in the Linux desktop arena. Without an API agreed by all to be good enough to develop on, you won’t get mass desktop adoption of Linux. It can be done, but at this point in time it requires a great effort, and at a fraction of the price.
Yes. Who in their right mind would agree to licenses like these:
http://proprietary.clendons.co.nz/licenses/eula/
(And from a vendor who is known to send virus-infected binaries to innocent consumers, no less. And never mind the NGSCB dongle taking yet more of users’ freedom away.)
If I want the latest and greatest, why do I need to run a development version of a distro and expect things to break? Why not create a package system that doesn’t suck, so that I can use any distro I want and get the latest and greatest of what I want? Ahhh yes, I know … it’s free, so bitching is not allowed, I forgot. Just goes to show you that the package managers aren’t as badass as people say they are. Or at least they would be if somebody had the common sense to do it right. (Hint: each distro should not be rolling its own packages.)
Darius, you are being pretty unfair to the packaging process on GNU/Linux. So you want to run the latest and greatest on GNU/Linux? Well, that comes down to running development packages, because that is the latest and greatest on GNU/Linux. By the way, development packages are mostly found in the development version of a distro.
By implication, you are comparing beta software on GNU/Linux to released software on Windows. Guess what, on Windows you are never running the latest and greatest as an enduser. If you are not a super-NDA-ed beta-tester, you can only run a release version and updating it means getting bugfixes, not a complete new version. On windows a new version of a proprietary package means a new purchase. (Sometimes there is the occasional PUBLIC beta, which is not bleeding edge either).
The problem with software on GNU/Linux is that people don’t want to wait a few days or weeks until the packagers put the new software up for download or in the repository. No, it has to be now, like commercial off-the-shelf software. Never mind that COTS software also goes through a packaging process that takes time; it’s just that an end-user doesn’t see it. They get a shiny box with an “already stale” release NOW, and that seems immediate, but it isn’t.
Just because an end-user IS in the loop with the development and packaging process on GNU/Linux doesn’t mean that the basic flow of creating and packaging the software is any different from that of Windows software. It takes time to get software ready for an end-user. It would be nice if people wouldn’t blindly point to the false immediacy of proprietary software when they are criticizing the speed of software availability on GNU/Linux. Just because GNU/Linux lets you into the kitchen (which proprietary firms don’t) doesn’t mean you get to eat before dinner is ready.
I didn’t say Linux is the end-all, be-all OS for everyone; sure, it has its problems, as does every OS. My point is that when it comes to software installs it’s not as problematic as most people think. If you’re not that technically proficient then maybe you should stick with Windows, but if you like customizing your OS to fit your needs and you don’t mind getting your hands dirty, Linux is definitely worth a look.
Sure, Windows is easy, but it won’t let me do all the things I want (customizing the kernel, changing desktop environments, removing any program I want, creating my own security solutions to vulnerabilities, etc.), and that price is a little too high just for “ease of use”.
//but the effort involved in installing a windows machine and then making it useable drives me crazy.//
Yah, it’s SO much easier to install an nVidia driver in Ubuntu than it is in Windows XP.
Close. It’s actually: apt-cache search package_name or description
Mea Culpa!
And it’s not the first time I’ve typed that either ;-P
The totally annoying “Windows has finished installing support for your hardware. Please reboot” (or something similar, so don’t nitpick).
Why does it lose its drivers like this when the hardware has been stable for months, no USB or FireWire devices have been plugged in, and the system has been up for days? Auto Updates is totally disabled as well.
It makes me puke that a modern operating system can get away with this crap. Even Server 2003 does it.
Oh well, back to my trusty VMS server. At least there, when a driver is loaded, it stays loaded.
If M$ cannot provide, in this day and age, a stable operating system that does not:
1) Nag you to death about everything under the sun it thinks you should be doing, rather than letting you get on with real work
2) Mess around with the drivers etc. without asking you
3) Refuse to let me take a complete backup copy of my OS disk in case of a hardware fault and then boot the copy on the same system
These are, IMHO, pretty basic things, but all M$ seems to care about is making everything more complicated.
With Linux, I can choose how much “Stuff” gets installed and when it is installed it stays installed.
Just my 0.02Tenge worth.
Did you install Matlab for Linux? I did. It is as easy as the Matlab for Windows installation.
You only have to run
/mnt/cdrom/setup (or install, I don’t remember the name)
and answer some questions in the graphical dialogs that appear.
And more: Matlab for Linux is MUCH better than the Windows version. It is more stable and has better performance (except for the Java IDE; I use -nojvm to disable that junk).
Windows for scientific use is a bad joke!