All of us who use computers create a problem we rarely consider: how do we dispose of them? This is no small concern. Estimates put the number of personal computers in use worldwide today at about one billion, and the average lifespan of a personal computer is only two to five years. We can expect a tidal wave of computers ready for disposal shortly, and this number will only increase. And as if that isn't challenge enough, several hundred million computers are already out of service, sitting in attics, basements, and garages, awaiting disposal.
According to the Environmental Protection Agency, most computer and electronics waste in the United States is not disposed of properly. Some 85% goes into landfills or is incinerated. Much is exported to China, where it is "recycled" without regard to the environment or labor safety.
A problem of this scale can only be addressed by a set of coherent solutions. This article discusses one piece of the puzzle. If we simply keep computers in service for their natural lifespans — rather than for the artificially shortened lifespans promoted by some vendors — we reduce the rate at which we must dispose of them.
Better use of existing computers also conserves natural resources. Making a single new computer requires:
- A half ton of fossil fuels
- One and a half tons of water
- 48 pounds of chemicals
That’s why your laptop is the most expensive electronic item per cubic inch in your house.
Finally, better use of existing computers makes it possible to get technology into the hands of the nearly one in four Americans who do not own a personal computer. Not everyone can afford a new computer.
Vendors don't like this thinking. Many consider it heresy. The goal of any business is to produce and sell as much as it can as fast as it can. This is laudable. But this drive has been mutated into
planned obsolescence, a system that encourages — even forces — premature computer disposal.
The computer industry wants you to view your computer as a
disposable consumer device.
But why toss out a perfectly good computer if it still meets your needs? And why should the computer industry dictate when your computer is obsolete?
How Planned Obsolescence Works
I collect computers donated to charity for refurbishing or recycling. A quarter of the computers I receive are still perfectly usable. They merely have software problems that result from Windows deterioration.
Here are a few examples:
Hardware | Operating System | Problem
P-III 733 MHz | Windows ME | Disk consumed by Windows System Restore
P-III 933 MHz | Windows XP | Disk consumed by Windows Update
P-IV 2.8 GHz | Windows XP | Slow due to malware
P-III 1.2 GHz | Windows XP | Slow due to unneeded services, startups, and crapware
P-III 1 GHz | Windows XP | Damaged OS, user unable to fix, unreadable recovery CD
P-IV 2.4 GHz | Windows XP | Damaged OS, user unable to fix, vendor recovery software with OS image partition did not work for unidentified reasons
This is perfectly good computer hardware consigned to premature obsolescence. Windows needs to be tuned up, just like an automobile. Few users know this, so many discard good hardware. (This guide tells how to tune up Windows systems to keep them in service.)
Microsoft bases its business on planned obsolescence. It leverages its monopoly to enforce it. Its goal is a win-win-win ecosystem where:
- Consumers win as Windows and Office improvements provide better and more functional computing
- Hardware manufacturers win as consumers must buy new hardware to obtain these improvements
- Microsoft wins as consumers buy new copies of Windows and Office along with their new computers
Microsoft often achieves this trifecta marketing ideal. But whether it does or not — and whether or not consumers wish to participate — Microsoft leads the computer industry in building an infrastructure of planned obsolescence. Here are just a few of the techniques employed:
- The Registry and product activation technologies lock a Windows install to a particular computer, and further, to the specific hardware configuration of that computer when the OS is installed. This restricts in-place hardware upgrades. It also ensures you cannot move a Windows boot drive and still use the product. These artificial constraints limit product flexibility and lifespan to drive sales. In contrast, Linux allows in-place hardware upgrades without constraint. You can move a boot drive to either drive position on the install computer or to a different computer and still boot the system.
- The Registry locks an installed application to a particular copy of Windows (and its bound hardware). It is Microsoft's control point. It stops you from moving software to a different folder, disk letter, physical drive, or computer without special tools. Open systems are more flexible. For example, you can often install individual products across duplicate systems merely by copying the product directories to the new computer(s).
- Microsoft and other software vendors tie together upgrades of unrelated software products to force users along a path of unnecessary upgrades. This is why you find you must install a new release of Internet Explorer, for example, when you are installing a completely unrelated product. These "co-upgrades" are marketing driven.
- Microsoft's changes to long-standing file formats for Microsoft Office files enhance its market control and force upgrades. Some will find it easier to upgrade Office than to deal with the complexities of working with and supporting two different families of file formats, especially in companies that use multiple versions of Office.
- Windows licensing restricts user rights and software transferability in consumer EULAs. Software Assurance contracts similarly force corporate upgrades. Companies pay for software upgrades whether or not they choose to roll them out. Nor are they guaranteed that Microsoft will release an upgrade during the contract term.
- Many PCs today ship without an operating system CD. Instead they have a hidden disk partition with a backup image of the operating system, or a recovery CD. I don't know whether this "innovation" is due to Microsoft's efforts or to those of the major computer manufacturers. What I do know is that it limits the lifespan of the computer to the lifespan of its most vulnerable hardware component, the hard disk. Disk failure forces the user to buy a new retail copy of Windows, which probably costs more than the computer is worth at the time of the failure.
- Hardware vendors eagerly support planned obsolescence by proprietarizing laptop and printer power adapters, ink cartridges, and even laptop optical drives with oddball form factors or connectors.
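The directory-copy flexibility mentioned in the list above can be sketched in a few shell commands. This is only an illustration: the "application" here is a made-up stand-in, and real software with external dependencies may need more than a copy.

```shell
# Create a stand-in "application": a directory containing a runnable script.
mkdir -p /tmp/opt/myapp
printf '#!/bin/sh\necho "myapp ok"\n' > /tmp/opt/myapp/run.sh
chmod +x /tmp/opt/myapp/run.sh

# "Installing" it elsewhere is just an attribute-preserving copy;
# across two machines this cp would typically be an scp or rsync instead.
cp -a /tmp/opt/myapp /tmp/opt/myapp-copy

# The copy runs as-is: no registry entries, no activation, no installer.
/tmp/opt/myapp-copy/run.sh
```

On a Windows system the equivalent move usually fails without special tools, because the Registry entries the application depends on stay behind on the original machine.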
Finally, Windows' famous vulnerability to malware presents an insurmountable problem when it comes to keeping mature machines in service. Microsoft has terminated updates for many Windows XP and earlier systems. This effectively kills those computers, because Windows requires security fixes. Moreover, the anti-malware overhead required to protect Windows computers compromises performance on processors operating at less than about a gigahertz.
Can’t these working computers still be useful?
How Linux Helps
Mature hardware in good working condition can still be useful. What is needed is software that does not demand undue resources or limit flexibility through artificial constraints.
This chart tells the story. While Windows requires more hardware resources in every release, current versions of Linux need much less:
Resources | Windows XP | Vista | Windows 7 | Ubuntu 10 | Puppy 4
Processor | P-III | P-IV | P-IV | P-III | P-II
Memory | 128/512 MB | 1/2 GB | 1/4 GB | 256/512 MB | 128/256 MB
Disk | 5 GB | 40 GB | 20 GB | 5 GB | 1 GB
Cost | $199–299 | $239–399 | $199–319 | $0 | $0
Locks to hardware | Yes | Yes | Yes | No | No
Sources: websites for Microsoft, Ubuntu, and Puppy Linux, plus web articles and personal experience. The chart is simplified; details have been omitted for clarity. Microsoft offers many Windows editions; this chart addresses the most common. Microsoft prices are for full versions. In the Memory column, the first number for each system is generally considered the minimal realistic memory, while the second is the memory recommended for best performance. The Processor column lists the minimal processor for reasonable performance.
Microsoft’s goal is for consumers to purchase new computers — with new versions of Windows and Office — on the schedule it dictates. Revenue targets drive this schedule. In contrast, functional requirements drive Linux. Not only does Linux require fewer resources for equivalent functionality and performance, it does not suffer the overhead of Windows’ required anti-malware software.
The chart shows that the current version of the popular Ubuntu Linux, 10.04, requires computing resources on par with what Windows XP required when it was introduced nine years ago.
Puppy Linux is one of several major Linux distributions specifically designed for mature hardware. It renders even machines in the 10-to-13-year-old range serviceable. Among the many tricks it employs to ensure decent performance is running the entire OS from memory on any machine having at least 256 MB. Puppy bundles a full range of applications and runs them on old equipment while remaining reasonably easy to use.
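When triaging donated machines for a run-from-memory distribution like Puppy, the roughly 256 MB threshold mentioned above is easy to check from any Linux live CD. This is an illustrative sketch; the function takes the meminfo file path as a parameter so it can be pointed at any file:

```shell
# Return success (exit 0) if the machine described by a meminfo-style
# file has at least 256 MB of RAM, enough for Puppy to run entirely
# from memory; return failure otherwise.
can_run_from_ram() {
    mem_kb=$(awk '/^MemTotal/ {print $2}' "$1")   # MemTotal is reported in kB
    [ "$mem_kb" -ge 262144 ]                      # 256 MB = 262144 kB
}

# Typical use on a live Linux system:
#   can_run_from_ram /proc/meminfo && echo "Puppy can run from RAM"
```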
If Microsoft software drives obsolescence to promote sales, Linux presents a sound alternative. It is competitive, supported software with a full range of free applications. The open source mindset offers an alternative to the disposable-device mentality that litters our landfills and pollutes the environment. Linux and open source keep older computers in service as long as their hardware still works and they support useful work.
Free Geek’s Reuse Model
The computer refurbishing and recycling organization Free Geek Chicago embodies this alternative mindset. Free Geek is a group of loosely associated 501(c)(3) non-profits in about a dozen cities. Free Geek shows how Linux can be used to extend computer life while getting computers to those who might not otherwise have them. About one in four Americans do not own a personal computer.
This chart illustrates Free Geek’s reuse model:
Computers roughly ten years old or less are refurbished and reused. Older computers are de-manufactured into their constituent parts. Reusable parts are then used to build up refurbished computers, or are put directly back into the community. Parts that cannot be reused are segregated by material and environmentally recycled.
This model ensures that every part that possibly can be reused is, while only broken or technologically obsolete parts are environmentally recycled.
The model supports education and spreads computer access. Anyone is welcome to come to Free Geek and earn a free computer by participating in the organization. Volunteers normally start out in Teardown, which gives them a hands-on opportunity to see how computer components are connected and assembled. Eventually newcomers know enough to assemble their own computer, and they graduate to refurbishing and building computers.
This reuse model presents compelling advantages:
Open source software is central to Free Geek’s approach. Not only is it free — important to a non-profit with limited resources — it’s free of licensing restrictions and the headaches that go along with that. And it includes thousands of free applications beyond the full set bundled with the operating system itself. What could be better?
Free Geek typically installs Xubuntu Linux, a version of the popular Ubuntu system. This gives users all the advantages of Ubuntu: its gigantic repository of free applications, its huge support community, and its vast array of free educational and tutorial resources. Xubuntu requires less memory than Ubuntu due to its lightweight graphical user interface.
How Old Can You Reuse?
If you’re reading this article on a state-of-the-art dual-core computer, you might wonder: of what possible use could old P-IV’s and P-III’s be? The first chart in this article is, after all, representative of the kinds of donations we receive.
The answer lies at the intersection of user requirements and machine capabilities. Any machine above 1 GHz running Linux plays video and Flash fine and handles social networking. Any P-III or better offers word processing, spreadsheets, presentation graphics, web surfing, email, audio, text editing, chat and IM, image scanning and management, and more.
With the right software, a single-core P-IV or P-III with adequate memory still supports the tasks most people perform today. I researched and wrote this article on a P-III running Ubuntu and Puppy, which I regularly use at a relative’s house. It is her only computer.
Perhaps the best way to answer whether these computers are useful is to tell the story of a single mother of two who asked me for a computer. She was out of work and searching for a job. I felt bad that I had nothing to give her at that moment but a 400 MHz P-II running Puppy Linux from its 256 MB of memory. Never was anyone so grateful for such a small gift! Now she could respond quickly to email from potential employers without taking the bus to the library every day. She could also research job leads at night, when it was convenient for her. The old machine I gave her improved her life.
What You Can Do
If you have an unused P-IV or P-III, donate it to charity rather than letting it age into obsolescence in your attic or basement. Make sure you donate it to a refurbisher rather than to a recycler. Refurbishers reuse older equipment if possible, only recycling what they cannot reuse. A recycler simply destroys your old computer in an environmentally responsible manner. The components (metals, plastics, glass, etc.) are segregated and melted down for their material value. Most "vendor takeback" programs recycle rather than refurbish; they can't afford the labor cost to refurbish. Your goal should be to "reuse, then recycle."
When you look for a refurbisher, keep in mind the lax US laws regarding electronics waste. Some companies that say they reuse or recycle your old equipment are actually
fake recyclers. They tell you they will recycle your donation, then export it to countries like China, where it is de-manufactured under unsafe conditions and without any regard to the environment. What these companies do is not “recycling” in any normal sense of the word.
Programs like 60 Minutes, PBS Frontline, and BBC World News have exposed this scandal. Your tip-off to a fake recycler is that they accept CRT display monitors and printers without any fee. These items can rarely be reused and cost money to recycle properly.
Ask any refurbisher how old a computer they can reuse versus what they recycle. Free Geek reuses computers up to ten years old. The "secret sauce" is Linux. Most Windows refurbishers only reuse about five years back. Find your closest Free Geek affiliate here.
Social Impact
Linux is not only green in that it saves money; it's also environmentally green. Open source software extends the useful life of computers and reduces e-waste. It provides a crucial alternative to Microsoft's planned obsolescence business model. Coupled with a good reuse model like Free Geek's, Linux enables mature computers to support education and computer access for those who need it.
Linux and open source software have become popular for their flexibility, low costs, and utility. How many of us consider their beneficial social impact?
– – – – – – – – – – – – – – – – – – – – – –
Howard Fosdick is an independent consultant who specializes in databases and operating systems. He’s been active in computer reuse and recycling as a hobby for the past fifteen years.
Resources
Free Geek Chicago | Reuse and recycling
Free Geek | Reuse and recycling
Electronics Take Back Coalition | E-waste information
Basel Action Network | Fights fake recycling
Earth911 | E-waste information
U.S. Environmental Protection Agency | The EPA on "ecycling"
Microsoft's Refurbishing Programs | Refurbishing using Windows
Addendum: Microsoft’s Refurbisher Programs
This article focuses on Microsoft’s business model, which attempts to sell more computers by encouraging planned obsolescence. The goal is not to denigrate Microsoft but rather to explain how its business model shortens computer lifespans. Microsoft has a unique role in the industry due to its operating system monopoly, and many hardware and software manufacturers follow Microsoft’s lead to their mutual benefit. This negatively impacts the environment, efficient resource usage, and consumers’ wallets.
While driving planned obsolescence in the computer industry, Microsoft also supports computer reuse through its Microsoft Authorized Refurbisher (MAR) and Registered Refurbisher programs. These programs offer reduced-fee Windows XP and Office licenses to organizations that conform to Microsoft's program requirements. The emphasis has been on reusing computers less than five years old.
Member organizations have done a world of good in reuse and recycling.
At the same time, the programs help Microsoft fulfill the “Prime Directive” of every monopoly — maintain monopoly status in all market segments.
Even fairly recent computers are made obsolete by the lack of drivers unless you want to install an old version of Windows on them.
Linux (particularly Mint or Ubuntu) is well qualified for computer use. Older chipsets, such as my integrated 855 (yes, 10.04 still has a problem with mine), give the user a complete 3D desktop with Compiz Fusion, and with the overall improvement in speed and visual quality with the fonts, it's like getting a new computer.
My Dell Latitude X300 is barely qualified to run Windows Vista (and the graphic drivers had glitches and ran in 2D). Under 7 NO graphics support is offered. Under Ubuntu 9.10 I have a lovely desktop that can do anything a Mac can graphically.
I’ve installed several Internet Cafe machines here in Seattle with the new Ubuntu 10.04, and with a small tweak so folks can use the Guest account from the login window, traces of previous users are eliminated with a simple Log Out/Log in..
For even older computers I’m eyeing Haiku. My brother-in-law gave me an eee-PC, the kind without a hard drive, and although I haven’t removed the windows yet, a solid state drive is VERY qualified to run Haiku, because, unlike Windows or Linux, there is not the constant writing to the hard drive. Haiku applications run completely from ram once they are in. (I’m not using firefox..)
Waiting for Arora and the Wi-Fi to be more complete.
This would allow even a very slow 400 MHz PC laptop with 256K to 512K of ram (pretty much useless for anything but Windows 9X) to be used as a very fast and glitch-free computing environment.
Oh yes. One last thing.
Regarding the small screen of the 7 1/2 inch eeePC screens (800×480 I believe). The interface from the original BeOS works beautifully on such a screen!
No netbook remix needed..
Indeed. I’ve got a pile of various Dell/HP/Compaq/Generic PII/III boxes that I’ve tested Haiku on. Most of them run it reasonably well, although the biggest problem is usually lack of accelerated graphics driver support still. On older, slow machines, the VESA driver supplied with Haiku isn’t so nice, but it does work.
Are you still talking about Haiku here?
The native Webkit browser is called WebPositive – and it’s quite complete already. The R1/Alpha2 comes with it. Wifi is still a bit unfinished. I can use it on my Acer Aspire One as long as my router is unsecured (even WEP gives me issues, though it’s supposed to be supported)
I assume you meant 256MB to 512MB. That’s still suitable for running Win XP on, BTW. Haiku runs OK with 128mb in my experience, although that’s pushing the lower limits and you’ll end up using swap.
Older laptops have finicky hardware, funny video chips, funky audio chips, etc. PCMCIA support in Haiku is still non-existent as well, making it difficult to add ethernet/wifi cards to these older machines in the future, as many of them often only had a single USB 1.1 port.
Oops. Sorry. I *DID* forget about WebPositive!
A very, very good application.
The main problem with using Haiku on the web will probably be Flash, but we do have Steve Jobs and the iPad to thank for the assist here.
..and yes. I did mean 256M to 512M (not K).
I’d be interested to know what tweak you are using to achieve this?
The "finite number of writes" issue is a non-issue for current-generation drives.
The chips are rated at roughly 100k writes, although this is a bottom limit for QC. Most blocks are good for 1-2M. Aggressive block management will ensure the vast majority of long-lived blocks are used, while the earlier bad blocks are cycled out.
Simple math will confirm this:
2M writes × 64 GB drive × 8 bits/byte ≈ 1.024 × 10^18 cycled bits.
1.024 × 10^18 bits / 6 Gbps SATA III ≈ 1,975 solid days of 100% SATA III write utilization.
Even if these numbers are overestimates, the sheer quantity of data which must be written implies a long lifetime for a SSD for anyone who isn’t writing custom software to blow it up, then running that software for a very long time. That sort of repeated mechanical abuse of a magnetic drive isn’t good, either.
Magnetic drives are prone to mechanical failure, with these failures spread across their whole useful lifetime. SSDs do not spontaneously fail at even close to the same rate, with the hypothetical write problems occurring, predictably, after years of use.
The only reason not to buy a SSD is cost.
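The arithmetic above is easy to check. This sketch simply restates the commenter's assumed figures (2M write cycles per block, a 64 GB drive, a 6 Gbps SATA III link); it is a back-of-envelope bound, not a real endurance model:

```python
# Back-of-envelope SSD endurance bound, using the assumed figures above.
WRITE_CYCLES = 2_000_000              # writes each block is assumed to survive
DRIVE_BYTES = 64 * 10**9              # 64 GB drive
LINK_BITS_PER_SEC = 6 * 10**9         # SATA III line rate, 6 Gbps

total_bits = WRITE_CYCLES * DRIVE_BYTES * 8     # every bit, cycled to exhaustion
seconds = total_bits / LINK_BITS_PER_SEC        # at 100% write utilization
days = seconds / 86_400

print(f"{days:.1f} days of nonstop full-rate writing")   # about 1975 days
```

Over five years of writing at full line rate before hitting the rated limit, which supports the comment's conclusion that ordinary use won't get anywhere near it.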
All nice and theoretical…
However the reason we get New Computers isn’t as much because the OS is out of date, but because we want a new computer that can do new and better things…
Sure Linux runs on Old PCs but it doesn’t mean I want to run Linux on an Old PC. And browse the web like it was 1999. And where a lot of the changes now take more CPU.
Personally, I usually get a new computer because I want a faster computer which can do more. I'm constantly pushing my computer. I put my own computers together and use Linux almost exclusively, so the whole planned obsolescence thing doesn't apply to me at all, but what I have is never as fast as I'd like, and I definitely wouldn't want a computer coming out of this sort of program.
However, my older computers are frequently of little use to me, and they could be of use to someone else through a refurbisher program, so it could be of some value to find such a program and get them into the hands of someone who would actually get some value out of them.
Just because you and I may have no interest in using older computers doesn't mean that no one else will.
Actually it’s not the OS which allows you to do new and better things. It’s the applications.
If the application is not available under your older OS you need to update your OS.
Now. Newer OS’s do offer newer services, which is why new and better applications are being written for them. Mac OSX comes to mind.
However, an interesting side note is that in the Windows world, the newer OS always uses more resources to do the same amount of work.
When MS came out with Windows XP, Intel did an evaluation and found that you would need a 200 MHz increase in processor speed to run applications at the same speed as under Windows 2000.
When MS came out with Vista, Intel’s analysis showed that a 20% increase in processor speed was necessary to offset the speed loss in upgrading to Vista from XP.
Intel decided to skip that upgrade and waited till Windows 7, not because it was faster.. (it’s apparently not except in booting) but because their hardware costs had time to amortize and justify the replacement of their computers.
Interestingly, over on the OS X side, the operating system only had a reduction in speed when Apple went from OS X 10.4 to 10.5, which it regained in 10.6; but of course, with 10.6 you now have to have an Intel processor.
Interesting note:
http://itmanagement.earthweb.com/osrc/article.php/3863646/Mozilla-D…
One thing I did not hear in the Mozilla thread discussing this problem was what happened to your old Mac. Over and over it was said to just get a new Mac but the old ones don’t simply evaporate.. They get handed down.
The embodied energy in an electronic device is simply fantastic just in terms of oil and water.
It doesn’t behoove us to simply throw the things away.
And there is also the issue of peripherals. Many, many printers, scanners, and video cards no longer work when you upgrade from XP. Most of them work perfectly in Linux.
And lastly, I’m browsing like it’s 2010 in Ubuntu on a 2003 business class Dell which is pretty much only fit for XP. And I don’t have the administration hassles that Windows entails.
For connection to the net and grabbing stuff, it’s perfect. For real work, I’ll stick to my brand new G5..
;^}=
“Member organizations have done a world of good in reuse and recycling. At the same time, the programs help Microsoft fulfill the “Prime Directive” of every monopoly — maintain monopoly status in all market segments.”
*cough* this isn’t slashdot *cough*
I love how the article puts the upfront pricing of Windows yet curiously omits the time expenses of Linux.
If the author wants to be taken seriously, he shouldn't take sides.
Well, this situation is a bit different from putting together or buying a new computer. The whole point is that you can get some more life out of older computers using Linux and that there are people out there who could readily use such older computers even if you can’t or don’t want to. If you stick to Windows, that doesn’t work anywhere near as well.
The author could have done it without being so anti-MS, I suppose, but arguably, the nature of the subject is somewhat anti-MS to begin with.
There are also time expenses involved in installing Windows, even if you make a custom image with service packs added; the common programs you want to run…
Yes, I can believe Linux will take more time to get working properly, particularly if you have to do obscure console commands to get hardware working properly, or it has a really troublesome setup; just don’t try to tell me Windows will require no time whatsoever.
On old hardware, I generally find it faster to install Linux than to install Windows. A Linux distro would very likely support the hardware out of the box, while I’d have to hunt down drivers for Windows.
Just recently I had to redo a completely hosed Windows installation on my girlfriend’s old (circa 2004) laptop. But first I had to recover some files from the hard drive. Popped in a CD with a recent release of Ubuntu, and what do you know. All the hardware works out of the box. Sound, wireless, the works. I recovered the files, then proceeded to install XP. Once done, I popped in the CD containing the Windows drivers. Had to manually install ten or eleven different drivers, rebooting after each one. But at least I had the CD and didn’t have to go hunting for drivers over the net.
While the above is just one example, it has been typical of my experience in these situations.
+1 and it applies to new hardware too.
Just the other day I installed Windows 7 on my Dell Inspiron 9100 laptop. After the install, I had to download and install OpenOffice & Firefox. I had to hunt down wifi drivers (which I couldn’t find), sound drivers, video drivers (which didn’t work – no 3D effects), bluetooth drivers etc.. All before I could get Windows 7 kind-of usable.
I then wiped the drive and installed Ubuntu 10.04 on that same Dell laptop – everything worked out of the box and only with one reboot. There was no need to download any drivers. Ubuntu took a fraction of the time and effort to get working compared to Windows 7. Needless to say, Ubuntu is still running on my laptop, and now as the sole operating system.
My Inspiron 9100 is starting to show its age looks-wise (7+ years old already), but with Ubuntu 10.04 it feels like a brand new system to me.
Sorry, but no.
It all comes down to your level of experience. I am a SysAdmin, with plenty of experience with both, and you could say much more on Windows, and I can tell you this:
I can get a full working computer with Ubuntu, all set up the way I like it, with all the applications that I need, MUCH MUCH faster than with Windows XP.
And that’s without cloning the hard drive, which works perfectly 99% of the time with Linux/Ubuntu with different hardware, but works 0% of the time with Windows XP with different hardware.
In other words, I have never experienced the frequently heard myth: “What you save in money with Linux, you pay in time.” In fact, it’s the other way around, I save time AND money with Linux.
And freedom, of course, can’t forget the freedom… I like being able to give Ubuntu CDs to my friends, the way it’s meant to be: you share good things with your friends.
2 * 1/100 of your currency here.
Because when something goes wrong in Linux it can take a lot longer to fix.
Never had XP require me to track down drivers and then compile them for a common wireless card.
Never had XP break working hardware with a system update.
Until a Linux distro can be relied upon to update the system without breaking applications or hardware it isn’t ready for the typical consumer.
I have said before that crunchbang is a great distro for old computers but I really think it is a bad idea to put Linux on charity computers if there is an XP key available.
Never so in my experience. I put Linux on machines that have parts that support Linux, just as you would install Windows only on machines that support it. For example, you wouldn’t even try to put Windows XP on an ARM smartbook. So, treat Linux the same, and simply don’t install Linux on a machine with closed-Windows-driver-only hardware, and you are good to go.
Neither have I for Linux on a machine with hardware that could run Linux from the outset.
A refurbishment effort such as the one being discussed in this topic would simply toss hardware such as closed-XP-driver-only wireless cards, because such hardware is no longer supported. By anything. One would get a replacement wireless card as a reclaimable spare part from a machine that was otherwise going to be scrapped.
Likewise with Linux, for a machine that supported it.
So, it was ready about six years ago then (for hardware that supported Linux)?
Very, very bad idea to put XP on machines meant for ordinary consumers.
XP is no longer supported by Microsoft, so vulnerabilities will no longer be fixed. Even in the past when they were supposed to be being fixed:
http://gorumors.com/crunchies/malware-infection-rate-worldwide/
http://www.sunbeltsoftware.com/About/Security-News/?title=Researche…
Oh my. These showstopper security issues are in addition to the "planned obsolescence" and slowdown-over-time issues of XP surrounding the registry.
It isn’t about me or you, it’s about the typical consumer. Should people that get these machines be expected to buy Linux compatible hardware?
This article is about putting Linux on random machines. But more importantly they are installing a variant of Ubuntu which in the past has broken Dell machines that were pre-installed with Linux. For a consumer OS to break a working wireless card or dump the user to a command line after an update is unacceptable.
XP is supported to at least 2014 and I have no doubt they will continue to provide security updates past that date.
1999 called, they want their Microsoft propaganda lines back.
Oh, you mean you were not joking?
Slashdot called and they wanted their meme back.
Seriously though, which of his points do you consider a “joke”?
All of it, as with all his/her posts. nt_jerkface is a troll and is what his/her name suggests. That's about all. The same could be said about ronaldst, who is probably the strongest MS fanboy at OSN. Neither poster has any credibility whatsoever.
nt_jerkface never fails to attack GNU/Linux (and occasionally also *BSD) whenever it is beneficial for MS. The same goes for ronaldst.
People’s objections get you mightily upset, so you need to attack them? Time to break away from your PC for a while.
I still stand by my original post. Especially the points about taking sides.
Lack of credibility combined with dishonesty makes me mightily “upset”. When the most ardent MS-shill talks about neutrality, I know something is wrong.
I’m not attacking anyone. I’m merely pointing out your hypocrisy.
Otherwise you wouldn’t spew the classic Microsoft lies about (GNU/)Linux being particularly expensive in time. Time is not necessarily expensive (in money), nor does GNU/Linux necessarily require any particular amount of time. Usually GNU/Linux requires next to no time compared with Windows (any version).
Lack of credibility? You’re not attacking anyone? From what I have read, you’re the only one who got visibly upset and started panicking.
My original post still stands. *wink*
No, the name is a joke about people who can’t handle criticism of Linux and resort to calling people names like “M$ $hill” even when that criticism has nothing to do with MS.
You’re lying through your teeth. I’ve said nothing but praise for the BSD teams. Unlike Linux revolutionaries they aren’t trying to push a half-baked system onto naive consumers. They also aren’t following the Don Quixote of the internet who makes ridiculous demands like refusing to do an interview unless Linux is referred to as GNU/Linux. Obviously a lot of his followers have no problem meeting his demands.
My position is that Linux is fine for servers and embedded devices but it isn’t ready to be a desktop replacement for Windows or OSX.
If you want an example as to why Linux is not ready for the desktop, then look here:
http://ubuntuforums.org/showthread.php?t=1369856
Those kinds of bugs can also be found with Windows Update. Windows Update regularly breaks Windows, forcing you to disable security features temporarily, manually install vulnerability fixes, and then re-enable the security features. For instance, Windows Update fails to update Windows Server 2008 after you have installed the Danish MUI; you have to manually install several updates before Windows Update can carry on on its own. So Windows isn’t ready for the desktop either.
Nor is OSX. The desktop-ready OS is non-existent.
In regard to Linux vs. GNU/Linux. I use the former when I primarily refer to the kernel, and the latter when I have a need to remove ambiguity between Linux (the kernel) and Linux (as in GNU/Linux distributions). Otherwise I just use Linux about the kernel and the distributions. I try to use GNU/Linux about the latter as it is technically more correct. But of course, one might argue it ought to be GNU/Linux/X/GTK+/QT/Tk/Gecko/WebKit/blahblahblah :p
I’ve never had Windows break a single piece of working hardware. I can list over a dozen wireless devices that were broken by the Ubuntu 9.04 upgrade. There have been numerous occasions where an Ubuntu upgrade broke a video driver and resulted in a boot to command prompt. How is that comparable to anything that Windows update has done? If something goes wrong with a Windows driver update then the system will rollback to the previous driver. In Linux if something goes wrong the device is left dead.
We’re talking about consumer operating systems, and though I have seen a few issues with Windows Update, mostly during the XP SP2 upgrade, they were nothing in comparison to ANY minor Ubuntu upgrade. Windows isn’t a perfect system, but it can be trusted to handle updates for the typical consumer. The same cannot be said about Linux, which is the main reason why I do not consider it to be a viable desktop alternative. OSX is the only alternative for the typical consumer.
That’s not technical correctness, it’s just the opinion of Stallman. There are more lines of code in KDE than in the GNU userland, so if anything there should be a /KDE suffix on any KDE-based distro. Stallman just wants everyone to say GNU/Linux to plug his movement. Luckily most people don’t go along with it, since the revised name looks and sounds awful.
Well, we obviously have very different experiences. I haven’t had issues updating Ubuntu 9.x and 10.x – but I’ve had many (minor) issues with Windows Update – and of course the big problem with the Danish keymap in Windows not properly supporting the Danish alphabet. A huge issue in these Unicode days.
The GNU userland is essential for KDE, while the opposite is not true. So GNU/Linux is not an unreasonable demand from Stallman. He’s a tad too eccentric for me, but he does have a good point now and then. And you probably cannot have visions (whether they are good or bad) without going too far now and then.
Shall we agree to disagree?
Windows Update does not regularly break Windows, and hasn’t for a very, very long time (a decade). Time to update your resume and take Windows off it, if it’s on there. If it isn’t, then good.
The only thing that may cause problems is drivers from windows update, and they are only updated if the USER CHOOSES to. They are entirely voluntary.
And when something goes wrong with XP it can take a lot longer to fix, depending on what the problem is.
I’ve never had to do that in Linux.
By this metric Windows isn’t ready for the typical consumer. I’ve seen numerous reports of people having their Windows boxes hosed by various updates.
Atheros cards required this for some time, after madwifi stopped being supported and the kernel became incompatible. Many distros had the drivers in their repos, but by kernel 2.6.28 they would typically load and not work, even though they built fine (or some common network setting would make them not work, like using WEP or WPA, or bridging interfaces). It took a very long time for the in-kernel drivers to properly support the B/G chips. I took the opportunity to move to a Ralink, but since many notebooks had Atheros integrated, it was not a fun time for many people trying to use up-to-date software.
HAL and such would often require a newer kernel, but wireless only worked with older ones; the newer HAL, or newer dependencies that themselves needed a newer HAL (or directly needed a newer kernel, for API reasons), left you able to use only older software. Meanwhile, you might want or need software that is decidedly newer, and start having problems, requiring a great deal of time manually resolving dependencies and possibly doing static builds.
Lesson learned: if it’s not in mainline, it’s not really supported.
That is very rare. Usually the fix for the problem is known, and it takes virtually no time to apply. And competent people know how to apply the fix before the problem bites them.
Naah, but you have to track down binary blob drivers and install them for common hardware. Or, in the case of Windows Server 2008: replace buggy Microsoft drivers by “upgrading” to an older version of the same driver, but this time from a third party (true for my NIC – the MS driver kept giving me BSODs – solution: said “upgrade”).
I’ve tried that. Also with Win2K3 Server. Reminds me of Windows Update fucking up when installing updates to Visual Studio 2005 and 2008. Takes forever to work around – even with known solutions (which often don’t work due to the closed nature of Windows).
In that case Windows isn’t ready for the desktop. Besides that: Windows isn’t ready for the desktop anyway. Even with Win7 and a 20-megabit connection it takes several hours to download all the fixes – not to mention installing them and restarting several times. In that same time I’ve long since finished installing XYZ binary Linux distro incl. OpenOffice, KDE, GNOME and whatever. Only source-based distributions still take longer than installing Windows.
Oh yeah. Windows doesn’t fully support Danish. The keymap is buggy and, for instance, doesn’t support Ǿ / ǿ. Linux does.
I’d recommend GNU/Linux or *BSD anytime over XP if security and ease of use are important. You cannot get both with Windows XP or any other pre-UAC Windows.
Our company manages 3000 desktops or so, only HP hardware. The support department had a lot of fun when we rolled out XP SP3: some machines bluescreened on every boot. Good thing we use fully automated RIS and software deployment, so the users could actually recover by themselves. Btw, if Dell ships Ubuntu preinstalled, they’ll have to follow proposed-updates and take note when something breaks. Using Linux doesn’t free you from QA responsibility!
That was caused by a rootkit, and is well known. Google is your friend.
We did of course use Google. We know the rootkit issue. But in this case it was a driver; I don’t remember which.
That’s a matter of opinion. Linux hardly ever breaks. With Windows I often get the following (and this happens very frequently in our office with others too): Windows worked fine at 5pm; switch off the PC and go home. The next day when you switch it on, there is a problem with Windows and it can’t boot, or doesn’t connect to the network any more, etc. WTF! Linux never does this.
That might have been Linux some 10 years ago. I haven’t had to compile a single driver myself in years (6+ years now that I have been using Ubuntu).
I have never experienced this, and I maintain 5 systems between work and home.
In other words, I have never experienced the frequently heard myth: “What you save in money with Linux, you pay in time.” In fact, it’s the other way around, I save time AND money with Linux.
I can second that. Just yesterday I had to do a Windows 2000 installation. Getting Windows installed itself was a relative breeze, but installing the drivers and some basic applications afterwards was extremely time consuming.
Finding the drivers, getting them from the net, putting them on a thumb-drive (2000 didn’t have the right ethernet driver OOTB), copying them over and then install, reboot, install, reboot, install, reboot. Then trying to install some apps only to find you need a newer Internet Explorer and a newer MSI installer first. Quite frustrating.
With Linux I can just pop in the Live CD, click install, answer a few questions, and then rest assured a basic, preconfigured and usable desktop is copied to the hard drive. During the installation I can just go about using the live desktop to get some stuff done. There might be some post-install handholding necessary with some hardware, but that is mostly a quick affair.
Plus, I can install a large batch of applications with one simple command and be certain they will be fetched, installed and made ready to use from the menu, without any further interaction from me. A huge timesaver.
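The one-command batch install described above can be sketched for an apt-based distro such as Xubuntu. The package names here are illustrative examples, not a recommendation, and the command is echoed as a dry run so it can be reviewed before being run for real with root privileges.

```shell
# Hypothetical package list for a refurbished desktop; adjust to taste.
PACKAGES="openoffice.org gimp vlc firefox"

# Dry run: print the command that would fetch, install and register
# everything (menu entries included) in a single transaction.
# Remove the leading "echo" to perform the actual install.
echo "sudo apt-get install -y $PACKAGES"
```

On an RPM-based distro the equivalent would be a single `yum install` call; the point is the same either way: one transaction resolves dependencies, downloads and installs the whole batch.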
To all the people who think: “Linux is free if your time is worthless”. All the benefits of Linux will only materialize if you are familiar with the system. If you never take the time to get to know the system (or even resist it), then everything will be alien and time consuming. So yes, the learning curve is an upfront investment in time, but don’t add that time to the time it takes to work a Linux system in day to day use. You never add learning time to the regular time you need to administer Windows, yet you did make that upfront investment when you started using Windows.
That comparison sounds a bit unfair: you compare a 10-year-old release of Windows with a modern Linux distro. Modern releases of Windows have improved a bit in terms of OOB hardware support too, even though things like graphics and sound support can still be lacking. And around 2000, desktop Linux was just horrible in that area.
Modern releases of Windows won’t work at all with hardware that was out of production by the time Vista came out. Old XP drivers won’t install on modern releases of Windows, and new Vista/Win7 drivers typically weren’t written for out-of-production hardware.
Modern releases of Windows won’t work at all with 10-years old hardware that originally ran Windows 2000. There are, however, modern releases of some Linux distributions, such as Puppy Linux, MEPIS AntiX and Damn Small Linux, that are designed to do so.
http://www.damnsmalllinux.org/
http://antix.mepis.org/index.php/Main_Page
http://puppylinux.org/main/index.php?file=Overview%20and%20…
There is actually quite a bit of support out there for people who want to run on older hardware:
http://polishlinux.org/choose/linux-on-old-hardware/
http://www.linux.com/news/hardware/desktops/8248-linux-distros-for-…
http://www.linux.com/archive/articles/52134
http://blogs.computerworld.com/how_slow_can_linux_go
Apparently, if you have the RAM, you can even get KDE4 to run on 9-year-old hardware:
http://www.linuxforums.org/articles/a-pclos-kde4-setup-on-old-hardw…
Oh, that’s a bunch of BS. Were Pentium IIIs, i810s, S3 SuperSavage video chips, SiS 700-series chipsets, VIA KT2-6 series chipsets, and nVidia GeForces still in production then? Most of them weren’t. Yet Windows 7 works OOB with most of them. Ironically, it’s XP- and Vista-era hardware that’s tough, because MS didn’t get to bundle the drivers, but the makers stopped support.
Categorically untrue. If you have such a system, and it can handle 1GB of RAM, it should work fine. You will have to verify support for everything, as you may have something like a video card that isn’t properly supported, but PCI and AGP video cards that will work fine are widely available (get a used one for AGP 1x/2x, or 2x/4x in some cases).
Now, 15 years old, I’ll grant all day long. But 10? Support for common hardware from 10 years ago is actually quite good, much to my surprise.
I have some PCI cards that don’t work with XP, Vista or 7, but work with Ubuntu Linux.
I’ve got cards that work in Windows but not Linux. That’s an easy game to play. It’s not that there aren’t significant sample sizes, but that it’s not a good general statement, and, IMO, MS did a very good job of supporting what they could, including hardware much older than most people would think of using a modern OS with. Just like Linux, or a 7 upgrade, you’ll need to go check and see about your specific hardware, but tons of decade-old hardware is just peachy OOB with 7.
Windows today still has a hopelessly archaic installation procedure. It still takes longer to install than GNU/Linux and it still takes many hours to fix issues with Windows after initial installation (vulnerabilities, missing drivers, finding buggy drivers included with Windows and replacing those with less buggy drivers) – not to mention: installing actually useful applications by visiting 23454564567234567894567934806749867 different websites and downloading a variety of applications (sometimes in x86 and x86-64 versions at the same time for the sake of compatibility).
Not to mention the many reboots during installation of security updates and such.
You’ve obviously never installed Windows 7 or Vista. The auto-installation of drivers is excellent.
Anyways, most users would still prefer having to go to a couple of websites or insert a CD over not having a driver at all, or having to set up ndiswrapper, which is still the case in Linux for many wireless devices.
Consumers would also prefer a couple of restarts after an install to Linux, where an update can break working hardware:
http://www.qc4blog.com/?p=857
Linux isn’t fail-safe enough for consumer use. Wishful thinking won’t make it so.
Actually I’ve installed both. And I’m running Windows Server 2008 at home. Before that I ran Windows Server 2003. Before that Win2K Pro. Before that NT 4 Server. Before that Win98. And before that Win95.
The only windows versions for x86/x86-64 I haven’t tried installing are Windows 2.x and Windows ME.
When it works, auto-installation of drivers in Windows works great. The same can be said about auto-detection of devices in Linux (the kernel).
You keep referring to issues with Ubuntu. Most of the driver issues seen when upgrading from one version of Ubuntu to a newer one are unknown when using source-based distributions, or binary distributions with rolling updates. Fedora, Ubuntu and similar distributions have a tendency to break stuff, which is why I prefer source-based distributions.
That said, Windows is very fragile too – even more so than Linux (kernel and distributions). Try switching the motherboard, and Windows gives you a BSOD. The typical Linux distribution will simply load different modules than usual (this may not work so well with source-based distributions unless you’ve compiled the required modules).
The drivers from Microsoft shipped with Vista, Windows Server 2008 (Vista SP1-based) and Windows 7 (and its server equivalent, Win2K8 R2) for the Realtek RTL8168/8111 NIC are very buggy and consistently give BSODs under heavy network load.
And again: the keymap in Windows doesn’t fully support the Danish alphabet, nor are the localized versions particularly well translated. So no, Windows is not desktop ready.
The problem with Linux is that when auto-detection doesn’t work, you can run into problems that require command-line wrangling or device replacement. With Windows you just have to visit the manufacturer’s website, download an executable, and go click-click-click, reboot. I wouldn’t describe the Windows system as perfect, but it offers better compliance and is more reliable when it comes to updates.
I am focusing on Ubuntu because Free Geek uses Xubuntu, and because Ubuntu is pushed as the best distro for new users.
Though you probably think I take garbage bags of money from Microsoft and choke puppies as a hobby, I would actually like to be able to recommend a Linux distro (or any free OS, for that matter) to people who need one. However, there isn’t one that I trust enough for the typical user. By your own admission Ubuntu breaks stuff, and I haven’t seen a rolling-release distro that is designed for novice users.
Windows has its flaws, but it is far more fail-safe than any Linux distro for the typical consumer. Linux is fine for tinkerers or sysadmins, but not as a mainstream replacement for Windows or OSX. I support the efforts of Free Geek, because a computer with Linux is certainly better than no computer, but the update issue needs to be addressed before I will put it on computers that I give to charity.
Utter drivel. Windows is a massive fail for typical consumers, with up to half of Windows machines in use compromised by malware.
Meanwhile, Xubuntu 10.04 is an LTS release. If a Xubuntu 10.04 LiveCD boots and everything works, as it should for most machines up to about 10 years old, you can then wipe the hard drive of an older machine, install Xubuntu 10.04 to it, and everything a typical user needs will be installed, work correctly, and be supported with auto-updates from Ubuntu for the three-year desktop LTS support period.
What time expense of Linux?
Seriously, what time expense?
Recently, I had occasion to restore Windows 7 on a Toshiba laptop that had been dropped, trashing the hard disk. A new hard disk was required, plus restoring Windows 7 from the three recovery DVDs. Yes, three.
To get this machine up and running, with a set of all drivers and desktop applications, took four days of my spare time. Four days.
I also had occasion to install Kubuntu 10.04 on a netbook, just two weeks later. I installed it from a single LiveCD (yes, one CD). Total time to set up the equivalent functionality? Thirty minutes. Printer, scanner and wireless setup included.
Seriously, what time expense of Linux? Where do you get this rubbish from, really?
Or we could cut the guy some slack. He is doing charity after all.
Agreed. The rhetoric seems childish (or at least one-sided), but no more than say PETA’s when you understand the motivations.
They aim to do something good.
I found the article very inspiring; it almost makes me want to go and start my own refurbishing center right away! I do think the disadvantages of Windows were a bit overstated – a cleanly installed XP with lightweight, free antivirus and spyware software generally runs almost as fast as Linux on P4s with 512MB RAM, and in my experience transplanting the hard drive or doing in-place upgrades has zero ill effects as long as you’re not using an OEM license that’s bound to a specific machine. But there’s no question that Linux offers a much more cost-effective solution that runs on a much broader range of old hardware, and there is definitely something to be said for the fact that every imaginable application is available free of cost. Congratulations on your work; it’s great to know about this possibility to make changes in people’s lives and save the planet all at once! Awesome stuff.
Windows 2000 and XP run better with less RAM than current Linux distributions. After all, these OSes were built for those 10-year-old computers.
Yes, I’m looking at you, resource hogs: GNOME, KDE, Mono, X.org.
I couldn’t get current Linux distros to run on several older machines: a Dell Latitude D800 laptop with a GeForce 4 has broken video drivers. Ubuntu LTS 8.04 is the most recent distro that supports it.
An iMac G5 wouldn’t stop running its fans at full speed on any PPC distro I tried, so it’s back to Mac OS 10.4. An old IBM laptop with only 40 MB RAM couldn’t boot from the CD drive. Lots of juggling of install floppies later: no PCMCIA support, so no networking, and X barely ran.
Linux is fine for machines with 512 MB RAM or more. But I would only use it if the original OS is no longer updated or no Windows license is available.
On older Macs, I find BSD a better fit… OpenBSD will tell you what is supported and what isn’t.
GeForce 4 uses the Nvidia chipsets NV17, NV18, NV19, NV25 and NV28.
http://en.wikipedia.org/wiki/GeForce_4_Series
The nouveau open source driver should support this for 2D just fine:
http://nouveau.freedesktop.org/wiki/FeatureMatrix
3D support is still work in progress, but you don’t need that in order to run a desktop.
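To see whether nouveau (rather than the legacy nv driver, or nothing at all) is actually driving such a card, one can check which kernel driver is bound to the VGA device. This is a hedged sketch: on the real machine you would feed the output of `lspci -nnk` into the filter; here a plausible sample of that output is stubbed in so the parsing step is reproducible.

```shell
# Sample of what `lspci -nnk` might print for a GeForce 4 MX; on the
# machine in question, replace this stub with the real command output.
sample_lspci='01:00.0 VGA compatible controller [0300]: nVidia Corporation NV17 [GeForce4 MX 440] [10de:0171]
	Kernel driver in use: nouveau
	Kernel modules: nouveau, nvidiafb'

# Extract the driver the kernel has actually bound to the device.
printf '%s\n' "$sample_lspci" | awk -F': ' '/Kernel driver in use/ {print $2}'
# prints: nouveau
```

If the “Kernel driver in use” line is missing entirely, no driver claimed the device, which usually means X is falling back to an unaccelerated VESA mode.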
If you have that much RAM, you can easily run a full-fledged Linux desktop such as KDE 4 or GNOME with only a 500MHz class machine. A lightweight desktop, such as Openbox, Fluxbox, IceWM or JWM would run like a flash.
If you were to run a Windows OS that such a machine could support, on the Internet, it would either be bogged down to uselessness with running anti-malware, or it would be burdened with malware to the point of unusability within a week. Take your choice.
Lightweight Linux distributions are far, far more suitable for low-resource machines.
Here is a YouTube link showing MEPIS AntiX 8.5 (released April 2010) running very happily on a Pentium II.
http://www.youtube.com/watch?v=gGe0KmtT9y8&feature=related
The entire install CD is only 480M.
In theory, a Debian-based distro like this can run on as low-spec a machine as a 486 with only 64M RAM, but that would be a bit painful.
That’s a huge exaggeration. As I said, a free, lightweight antivirus like Avast! (which you can configure to auto-update everything without any prompting or notifications) and a free, lightweight anti-spyware package like Microsoft’s own Windows Defender, plus the built-in Windows Firewall, are more than enough protection and hardly take any resources. Unfortunately the big two “security” packages have indeed turned into big, bloated, parasitical monsters – especially McAfee, which I would never wish upon my worst enemies. But they are hardly representative of what it is possible to get by with if you don’t buy into the marketing machine.
That said, these days I would never install any version of Windows below XP on any machine, and Windows XP on anything below 1GHz is a stretch, so for a 500MHz Pentium II Linux is definitely the best choice IMHO.
I attempted to install Microsoft Security Essentials on my mom’s XP computer–a 1.5GHz P4 processor with 256 megs of RAM. Let me just say, it didn’t take long before I received complaints that it was running slow… and damn, was it! It turned out the program contains a process (which it inherited from its days known as Windows Defender) which was guzzling memory and eating swap like nobody’s business. Uninstalled it, and BOOM! No more slowdown. From unusable to decent in one reboot. I don’t think he was exaggerating at all. IIRC, the swap file + full memory equaled maybe around 350 megs, so chances are 384MB would (possibly?) have been enough. That is, until you fire up Firefox or something similar… then you’re up to needing 512MB.
Bottom line, when working with small amounts of memory especially, you do NOT want large background processes running. Unfortunately, this includes AV; no better way to guzzle resources and slow an older machine down to a crawl. IMO, ALL antivirus software blows… and it’s just not worth it to sift through a bunch of garbage alternatives when they’re all about the same anyway, just some shittier and/or more expensive than others. Not to mention, what they do best is give the user a false sense of security. I wouldn’t trust any of them any more than the malware they are advertised to “protect” users from.
Lesson learned: If you want to run MSE in WinXP, definitely make sure you have at least 512 megs of RAM. Otherwise, choose which is important to run: antivirus software that is likely to not really help you like it claims and will hog resources, or your everyday programs that you actually use. I’ll take my everyday programs, thank you. And I’d recommend everyone else to use some common sense (don’t do anything stupid) when on their computers and do the same… but that’ll never happen. People are dumb.
It’s not that the graphics aren’t working at all. But there are lots of artifacts. I don’t know where the bug is, if in the drivers, X, or somewhere else. I can only tell you this: everything works fine up to Ubuntu 8.04 with the nv and with the closed drivers. Later versions have regressions. Either the full resolution isn’t available or there are artifacts.
It might be the high-resolution display that laptop has; 1920 x 1200 is a bit uncommon. I remember that I had to edit xorg.conf in earlier Linux installations to make such resolutions available.
And yes, I need 3D support. Because I program OpenGL stuff and want to play a game once in a while.
Don’t give me that YouDonTNeedThat bullshit I get too often from linux advocates like you. It’s the standard excuse for breakage and missing features. I’m also really tired of the LinuxRunsBetterOnOldHardware and LinuxSupportsEverythingOutOfTheBox hogwash. It wasn’t true 5 years ago, and it’s not today.
WTF?
This thread is about refurbishing older computers to recycle to people who would otherwise not get one. It isn’t about your problem with nVidia not supporting you and your older GeForce 4. In the context of this thread, most people would not need 3D support.
However, even then, despite nvidia’s refusal to provide programming specifications for its chips, 3D support is becoming available for older nvidia chips from the nouveau project. Because of nvidia’s recalcitrance, this is a bit of a slow process. You won’t be getting any support from nvidia, that is for sure… so why bark at the one set of people, the open source developers of the nouveau driver, who are trying to help you?
Sheesh!
If you want to play 3D games on Linux right now on your older computer, find a computer recycler shop somewhere, lay out $30 or so, and pick up an older ATI card compatible with the graphics slot.
http://www.x.org/wiki/RadeonFeature
R300, R400 & R500 are mostly green. R600 & R700 are nearly there as well.
http://www.x.org/wiki/RadeonProgram
There is still work to go, but the specs are there and it will happen.
Well, the graphics are broken for 2D and 3D!
I think this is very relevant to this discussion. In my experience, running Linux on older hardware isn’t as trouble-free as you and others suggest. To support my point I provided just a few examples of my own.
I’ve been running various flavors of Linux for years now, on old and new hardware. Both had about the same number of problems, though of different kinds.
Linux isn’t a magic bullet for using older computers. It has its drawbacks.
If we are trading stories: some years ago, I had a Toshiba Tecra that originally ran Windows 98 and was clocked at about 500MHz or so; I think it had 128MB RAM. It had an issue with the sound card, and it took a lot of coaxing to get ALSA to run. Eventually, that machine met an untimely end at the hands of airport baggage handlers.
Every single machine I have had since then, for many years now, has run perfectly with Linux.
I noticed that two of your three examples are laptops, and I think this brings up a good point: old laptops often have very proprietary power-management systems that Linux doesn’t know what to do with. For example, I had a 2002-vintage Pentium M Toshiba that had hardly any options directly available from the BIOS; instead, all of the hardware config was done via proprietary Toshiba Windows XP-compatible tools. Use Linux and you can pretty much kiss any form of power management goodbye. This kind of thing was standard practice back then, before ACPI was standardized across the board. So for these kinds of old laptops, of which there are many, I really do think XP is the best solution.
Did you try Lucid Puppy 5 (based on Ubuntu)?
I installed it on a laptop with 192MB of RAM and it runs great!
Besides, why do I need a very old computer in order to justify Puppy?
My FatDog64 Puppy (64bit Puppy) runs great on my AMD Turion64 laptop – and it’s the fastest Linux distro I’ve ever tested (Linux Mint 64 is dog slow compared to it…)
Forgot to mention the exact version numbers of the Puppy versions mentioned:
32bit: Lucid Puppy 5.0.1 (Ubuntu)
64bit: FatDog64 RC3
jello
The real problem is not Microsoft and companies like them. Sure, they want you to buy new products, but that is natural.
The real problem is people. They seem to have no concept of reuse or fixing things. I know of at least two people who got a virus on their computer (or their computer got “slow”), so they threw away their computer. (I would have loved to have taken them!)
Probably the craziest story is somebody who got a virus that supposedly took pictures through their webcam – so they threw away the computer and the webcam, and bought replacements for both.
Personally, in 15 years I have had 3 computers. The first one is now quite old, but I was able to save a few parts (and someone took the rest for free; I have no idea what they did with it), and I still regularly use the other two.
Last year, two friends and I did this. We are Portuguese MSc CS students, and we have two courses where we have to acquire soft skills somehow. We thought about this and we did it. We got 5 donated old computers, cleaned them, checked their hardware, installed Ubuntu Linux, and offered them to a care center for people of all ages with motor disabilities. Of those 5 PCs we delivered only 3, since 2 of them had something like 400/500MHz CPUs, which is getting really old, so they just served as donors. If I remember correctly, the 3 we delivered were a 2.4GHz Celeron (512MB RAM), a 1.4GHz P4 (384MB RAM) and a 933MHz P3 (384MB RAM). All of them offered very good performance on “normal” tasks like browsing the web, card games, the OpenOffice suite, etc.! Flash videos were a mess on the 933MHz CPU, but the other 2 handled them just fine (SD, of course). The director and everyone at the center were very, very thankful.
I’m a full-time Linux user; I haven’t touched a Windows machine for about 5 years. So if you want to see me as a fanboy, go ahead… but the funny thing about this story is that I talked personally with the person responsible for the Microsoft Imagine Cup in Portugal during the roadshow at my university, and I told him about our idea, since the Imagine Cup sounded great for it (“Imagine a world where software can save lives! Blah blah blah!”). When he heard about open source… forget it. So don’t forget, kids: save the world… but only with Microsoft software!
The Free Geek Recycling Model appears to be a sound foundation for the future.
There appears to be one arrow missing – one for “vintage computer systems”, whether they are a quarter century old or simply unique. Forcing anything below a PIII processor down the path to the teardown room may lead to the drastic disappearance of the early examples.
It is unlikely that one may walk into a Free Geek location with a Cray or a PDP-10 in hand. However, what about a MicroVAX, a NeXTstation, or an Apple Cube? Something to think about?
…it just reminded me about my ancient Gateway (which according to records was shipped in May 2001). It has a pathetic 256MB of PC800 RDRAM, which I’ve been wanting to upgrade to something more usable for years now. The problem is that RDRAM has always been freakishly expensive. I just checked online again and it seems the RAM is somewhat cheaper… I’m really considering doing the upgrade, possibly maxing it out at 2GB. I’ve kept it going strong for almost a decade, with no hardware failures and several performance upgrades and additional PCI cards… I’d love to see it running happily for another 5-10 years (at least). It’s still on 24/7 (unless it storms) and used every day. It’s served me well, and I don’t know how I could ever get rid of it, even with a more powerful system. I’ve become attached to it. Maybe I’ll be able to get that RAM for its tenth birthday.
That said, in my opinion 256MB is just NOT ENOUGH any more. Even for those distros that *can* run in 256MB, I’d still rather have 384MB; most “light” distros seem to just barely get by and don’t have very comfortable GUIs. They’re usable, but IMO not good for everyday operating systems. Those KDE3 and Gnome-based distros are typically best at 512MB RAM, though some are quite impressive with 256. KDE4… hah… better have about a gig minimum (and I’m not 100% happy with its performance with that on a 64-bit system).
Similarly, a PIII is the minimum processor I’d use; a PII is just too slow for me. It was back in the late ’90s, and it still is now. If you want no GUI or anything fancy, it’d be alright though.
Honestly, I’m not sure I’d recommend something less than a PIII with at least 512MB RAM to anyone else either, unless their use is seriously limited. Of course, I wouldn’t have to… because they seem perfectly happy getting “cheap” new computers with more power/memory/storage than they’d ever know what to do with every time their old one takes a shit (which is usually something dumb like Windows product activation, battery that costs more to replace than it’s worth, malware, etc… just as the article mentions).
I’ve given up talking to them about computer problems, because they ALWAYS decide to just get a new machine anyway. Even if I recommend against it. Somehow it’s already better, as if a 150GB hard drive only 40% full and 2GB of memory, which they don’t even use half of, is not enough.
Hmm… I hate to talk to myself (or reply, LOL…) but I just remembered Xfce. For some reason, even though it was one of the first Linux desktops I used for an extended period of time, I forgot all about it. It does run quite well with 256 megs of RAM… but still, I’d say 384 would be better anyway, just to be safe. I know I used to run into frequent swapping in Zenwalk with 256 megs in Xfce… all it takes for any desktop (or hell, even a simple window manager like Openbox) to swap (i.e. run low on RAM) is a single instance of Firefox. Add a couple more programs to the mix and you’re sure to experience some noticeable (and eventually frequent) slowdown.
Yeah, this is the real problem. You can get an OS to boot in 256MB of RAM, but that says nothing about the applications. I have found Chrome on XP to be tolerable as long as there is at least 384MB, but 512MB should really be considered the minimum. Slow hard drives can also make the swapping a lot worse. A lot of those old laptops have anemic 4200rpm drives that get crushed by swapping memory-demanding programs like browsers. XP runs just fine on 512MB as long as you have a decent hard drive. I was shocked by how well XP ran on a P3 500MHz machine that had a 7200rpm drive.
You should try LXDE; I’ve used it with 128MB of RAM and it runs Debian with reasonable performance.
Computer refurbishing is a wonderful service that needs to exist in every city.
There are many groups not affiliated with Free Geek that do refurbishing and load free software. The most comprehensive list I know of is at http://growingupfree.org/wiki/index.php/Orgs_In_USA . If you know of a group that’s not listed there, add it!
There is also a mailing list devoted to the topic at http://groups.google.com/group/open-community-technology
This article would have been better without the anti-MS scaremongering.
Incorrect. To be honest, I’m not entirely sure what this means. Maybe the author is referring to those apps (mostly games) that will not work without the correct keys in the registry? That has nothing to do with MS or the registry though; it’s all about copy protection.
Really? I rarely see this; the only one I can think of is Adobe. Some applications do require an updated version of some other app, you know?
Wow. Older versions of a certain application will not open files made with newer versions. Sometimes file formats change between versions. News at 11. Or not.
I don’t know what you’re trying to say here. That I can’t legally transfer the OEM Windows that came with one machine to another? Since the new computer most likely came with Windows, this shouldn’t be much of a problem.
In practice this is the time you call your buddy/son/nephew who has a “Windows installer”.
Uhm no, that is actually what the recovery CD is for. RECOVERING. Amazing, eh?
I guess they’re trying, perhaps, but in practice this is rarely a problem. Any PC repair shop worth their salt will be able to get the right adaptor or form factor.
I didn’t know there are standard ink cartridges, btw. Oh wait, there aren’t.
I would have hoped that the goal was to actually help the environment and help people get computers, regardless of the OS installed or the OS they want/need.
Not incorrect. During a refurbishment (combining two dead machines to make one new, fully functional one), just try moving a Windows application from one hard drive to another, as you might do with a hardware card. In most cases, it can’t be done.
And the copy protection most decidedly gets in the way of a refurbishment program such as this. Using Linux, one can run such a refurbishment program and never run into a copy protection issue. Saves vast amounts of money.
Sarcasm aside, this is most decidedly an issue with refurbishment. Open formats on Linux, such as ODF, alleviate such issues.
Nope. This is more an issue with desktop applications … Windows desktop applications. Can’t be moved between machines.
What part of “the machines these days don’t come with recovery CDs” did you fail to grasp?
In order to do all of the things they need to, any of Windows, Mac OS X, or Linux is perfectly adequate for the vast majority of people who would want a refurbished machine. Of those three options, Linux is by far the best choice for actually being able to do the refurbishment.
Are you kidding? With some exceptions this works just fine. And I hope you’re not trying to argue that this is easier on Linux because it sure isn’t, especially not if the machines are using different distros.
The part where he didn’t say that. What he said was:
Sorry, I’m getting a little tetchy I suppose. There seem to be a lot of quibblers trying to spread wrong information of late, and flinging copious insults at anyone who defends Linux, so it doesn’t help for me to give wrong impressions back again.
Correcting myself:
– Some Windows applications use cryptic keys within the registry as a kind of copy protection so that they can’t easily be moved between disks. Hence the reason for being for sites like this one:
http://portableapps.com/
http://portableapps.com/about/what_is_a_portable_app
– Some recent Windows machines come without recovery media other than their own hard drive. If the hard drive fails, the machine is “bricked”. Hence, pleas for help like these:
http://en.kioskea.net/forum/affich-19557-no-recovery-disk
http://laptopforums.toshiba.com/t5/System-Recovery-and-Recovery/x50…
PortableApps does not exist because some applications use cryptic keys; it exists because registry keys are not saved to portable media (by design). The point of PortableApps (which are great, btw) is that you bring your settings with you on a flash disk instead of keeping them in the registry.
In my experience those machines ask you to create a recovery CD the first time you boot, and they also include an application to create a recovery CD at any time (like your linked Toshiba). I don’t know who Iqon is, but maybe they’re just a shoddy dealer. There will always be those, and it’s not part of any large MS conspiracy to sell more PCs.
The detailed glossies that come with such a Toshiba machine do indeed advise you to create recovery media. It isn’t a CD that is required, it is three double-layer DVDs these days. This is clearly a cost-cutting measure by Toshiba, and indeed it has very little to do with Microsoft.
Nevertheless, people being what they are, we still end up with people with a bricked machine because they had dropped it, in so doing they had scratched the hard disk, and they hadn’t made the recovery media.
I think most serious vendors (and I know Dell does this) will send you a recovery CD if you request one and can prove your ownership (last part may not be needed, can’t really recall right now).
It is indeed a cost-cutting measure, and I imagine quite a bit of money is saved by not sending 3 DVDs (only 2 with Dell) with every unit.
When my nephew dropped his Toshiba laptop just three months after he had bought it, Toshiba gave him the run-around. After keeping him on hold for half an hour, then talking with him for just 10 minutes and disbelieving him (he is a teenager), they wanted $35 just to stay on the phone. He never did get any recovery disks from them, but fortunately the store where he bought the machine did eventually help him.
Needless to say, I would never recommend a Toshiba machine to anyone nowadays.
But what is the net gain here if you plan on formatting and installing Linux?
I think putting OpenOffice on these computers is a good idea but that doesn’t require Linux.
The net gain is that computers that have been discarded as being non-functional, due to registry clog and being overrun by malware, can be restored to excellent working order with current applications (and hence data formats) by installing Linux.
No, my point is that being unable to transfer a Windows application on one of these old computers really doesn’t matter, since they should be formatted anyway.
They could just as easily be restored to working order with a fresh XP, and as far as the environment is concerned, whether you use XP or Linux does not matter.
True enough.
In reality, by the time a machine has got to over five years old, it is very difficult and time-consuming, and sometimes expensive, to get it installed with a “fresh XP”.
In many cases, OEM disks (which need to be the disks for that particular machine) are not available. Even if they are, after installing XP afresh (which takes ages), it takes an enormous time (and considerable Internet bandwidth) to get it updated & protected against malware.
Even having done all that, one still has to work more to get a reasonable set of applications installed. This all adds up to a very significant time effort if one has a batch of machines to get through.
OTOH, if one uses a single Linux LiveCD, with a bit of luck one can restore an entire batch of machines in thirty minutes each. With five copies of that same LiveCD, one person could probably even set five installations going simultaneously, theoretically bringing the time down to six minutes per machine per installer person.
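To make the arithmetic explicit, here is the throughput claim as a trivial sketch (the 30-minute figure and the five parallel installs are my own rough assumptions from above, not measurements):

```python
# Batch-refurbishing throughput, per the rough figures above.
minutes_per_machine = 30      # one LiveCD install, start to finish (assumed)
parallel_installs = 5         # five copies of the same LiveCD (assumed)

# With one person tending five simultaneous installs, the effective
# wall-clock cost per machine drops proportionally.
effective_minutes = minutes_per_machine / parallel_installs
print(effective_minutes, "minutes per machine per installer")   # → 6.0
```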
That is very hard to beat using XP, from a sheer practicality point of view. We are talking about “smart re-use”, after all.
Just copy the registry entries when you copy the program directory. It’s not hard. It’s only slightly more annoying than digging through /etc or /usr/share or anywhere else to get the config.
Oh, and the Dells I bought 2 years ago for my home, and the 10 I just ordered for work, came with full Windows DVDs, so I’m not sure what you mean about no recovery disks.
It isn’t hard to copy registry entries.
The hard bit is to identify that a registry entry named {0134f6gb2981349} or whatever contains a key that encrypts the disk ID of the machine where a given copy of Word was installed, and that copy of Word won’t run without that registry entry being present and matching the disk ID of the machine that Word now finds itself on.
Or something like that.
The majority of applications do not use such cryptic keys, they store their data in the standard HKCU/Software location.
Yet some applications, such as Microsoft Office, do exactly that.
AFAIK, Toshiba laptops sold since 2007 come with no recovery media. There is a read-only recovery partition on the machine’s hard disk, but if it is the hard disk that has failed, and a replacement hard disk is purchased and installed, then there is no longer any copy of the OS to recover the machine.
This may well also be the case with other OEMs, but the only one I have had direct experience with is Toshiba.
http://www.justanswer.com/questions/3bckt-i-have-a-toshiba-satellit…
The “procedure” only works if the original hard disk is still working.
Could be that Dell is the only one. I hope not.
I can’t remember which brands, but I’m certain there are others besides Toshiba that stopped supplying recovery CDs.
Some of them have a “recovery CD creator” program that you have to manually run after turning on the machine… You pop a blank disc in the drive and it burns it. I’ve noticed very few average users ever do this, and it makes it a royal PITA later when they call you for help :/
And yet, they always seem to have their useless manuals/driver/utility CDs in the drawer next to their machine.
The author of this article is a hypocrite. How can you promote open source when it’s plainly obvious the article was authored in Microsoft Word?
Check out the Recycle graphic. Word Art Style 28 and Illustration Shapes ‘Block Arrows’.
Also, if that mother of two could afford a computer, how does she afford to pay Internet bills? Is she leeching off someone’s Wi-Fi?
It’s obvious you cannot resist the good life.
What reason do you have for thinking that the author of this piece on OSNews was one and the same person who created the graphic?
It looks as if it was lifted from a Microsoft PowerPoint deck for Microsoft salesman training, circa 1998 or something.
What has that got to do with anything? Linux is still more suited to refurbishment, even if the computer is not to be connected to the Internet.
I’m totally an MS-hater, but when it comes to e-waste the real bad guys are the chip companies.
I don’t think that different CPUs differ that much in manufacturing cost.
In a world without profit, we would see only one or two CPUs, built at the maximum possible level of technology. It goes without saying that the high-end CPU becomes obsolete later.
The Windows OEM license model, which ties the license to the hardware, should simply be illegal, since it is really a license for MS to steal. It has some implications for e-waste, but it’s not the main issue.
Chip companies never throw anything away. Have a wafer of chips that can’t hit 3GHz? Sell them as 2.8GHz. Have a batch of quad-cores that can’t run all 4 cores reliably? Disable 1 and you have a tri-core. Disable 2 for a dual-core. Disable 3 and you have a Celeron or Sempron.
They have to recoup R&D, get people buying the chips, and try to get enough money to feed everyone. Even without insane board member compensation, that’s not cheap.
Also, they do use the highest level of technology. Intel and AMD, even after spinning off GloFo, are at the forefront of chip-making technology, and you are getting the fruits of that in their released chips. Same with TSMC, save for being more oriented towards lower costs.
If they did not sell different speed grades, they would throw more things away. That’s bad. While, after the first few runs, most chips can hit higher speeds, and are sold lower to keep the higher-speed prices high, it’s still not a case where 100% hit the higher speeds, with no defects.
Without profit, what we might be able to get, in my view, is fair pricing that includes cleaning up the waste produced in manufacturing and distributing the units being sold, since it would be less advantageous for industries to push that cost off to the future, and for governments to allow them to.
In practice, I found that I had less trouble getting Windows 2k drivers for old machines than finding a Linux distribution that works.
For example, my old ThinkPad 365XD: 40MB RAM (not enough for the alternative Ubuntu installer), Pentium 120, Trident graphics. The driver for Trident was never ported from XFree86 3.x to 4.x, so I’m stuck with 3.x. However, once I go back to a distribution old enough to have it, I find that it does not support WPA2 for PCMCIA WLAN adapters (which requires a fairly recent kernel). In addition, it needs a disk manager so that the 20GB hard drive plays well with a BIOS that refuses to even POST with a drive larger than 2GB. Unfortunately, I haven’t seen a single distro yet that would cooperate – they either get stuck at a boot screen or insist that the drive is only 2GB in size.
Windows 2000 runs a bit slow, but good enough to play mp3 radio streams over WIFI.
So, before you waste too much time trying to get old hardware to cooperate with Linux, give Windows a try. You might be surprised.
This reminds me of a small Sony VAIO laptop I inherited. Really sleek machine, but quite under-powered. Similar specs: P1 166MHz, 32MB RAM. I insisted on using it, so I installed Arch on it, as it’s my personal favourite.
I now use it as my main radio, mp3 streamed over wifi/wpa2. When it comes out of sleep, it automatically starts playing the stream.
I’m quite happy to be able to use it, as it’s still a perfectly working machine. It does choke on browsing the web, though. Gmail is impossible to read, even the basic HTML version under Opera. But it has no problem playing the stream.
I have many old computers around that are P2, P3s and Pentiums, and I do not think they’re bad to use.
Often they also run the latest things around, just not those which use a lot of CPU time and RAM.
It’s normal that a distro like Ubuntu won’t install on anything that’s not at least a P3 full of RAM. It’s already bloated on newer machines, and I don’t know why one would expect it to install on something 13 years old.
That said, if one gets a Debian or Slackware system for example, and then strips it down (one can also remove the sysinit system for example), Linux can run also in 16 megs or less.
Once I custom-made a uClibc distro for my 386SX 40MHz with 4 megs of RAM (which is now probably dead), and Linux ran quite well. I even played module files and CDs on it.
PIIs and PIIIs are quite fast, and they will even run the latest Firefox and Opera.
Pentium 4s are very fast at almost everything and I don’t know why people feel that now they’re slow or “ancient”.
My brother is, right now, using a Linux computer with a Pentium MMX 233MHz processor and 32MB of RAM as his main machine. And it doesn’t feel bad.
As long as a computer runs, it’s always useful.
This article gets to me, as I am one of the users who used to run PIII/P4 machines with open source until they died completely, but I can see one small hole in this kind of thinking: environment and power savings.
Of course we save our environment by not throwing more junk into it that early, but we are also using hardware that is not as power-saving as the newest models.
That’s the only issue I can come up with when I think about this topic.
http://www.groklaw.net/article.php?story=20100615162529651
It is not exactly on topic, but it is tangentially related.
It helps the environment, and helps education for underprivileged kids. A win-win IMO.
http://www.businesswire.com/portal/site/home/permalink/?ndmViewId=n…
The new model XO has:
Bad luck Intel. Bad luck Microsoft. Great news for the Uruguayan kids. Good for the environment, compared with more conventional technology such as Atom netbooks.
Win win win win.
That is indeed cool but where are you getting this stuff about it being more environmentally friendly than a netbook? Netbooks are low-power too. The only time the word “environment” comes up in that article is the “Gnome Desktop Environment”.
How exactly is that “green”? Ever thought about the poor energy efficiency of that old crap? My old Pentium 4 desktop needs more energy than my current i7, wasting a lot of electricity and heating up the room. Having to compensate for either of those is in no way green.
This is not a certainty if you take recycling costs into consideration. If I remember correctly, a study once claimed that the first Jeep model was greener than a current Toyota Prius because:
1/ It lasts much longer, due to the lack of on-board electronics and other modern gadgetry.
2/ Its building process costs much less energy and involves far fewer CO2 emissions.
3/ It does not include a very large battery or other electronic components whose recycling is a nightmare.
4/ Basically, if you want to recycle it, you take the seats, plastic pipes, and battery off the vehicle and melt the rest. Taking a Prius apart is far from being that easy.
5/ Compare all of that to the fact that the Prius consumes, say, 20% less fuel per 100km?
The recycling cost and lifetime issue is just as valid for electronic circuitry, which has pretty high recycling costs. As an example, if I quickly make up some numbers, consider two chips which have the same making/recycling cost, 300kJ. The old one consumes 10kJ/year, the new one consumes 8kJ/year. Keeping the old one for 10 years costs 400kJ. Keeping the old one for 5 years, then buying a new one and keeping it for 5 years, costs 350+340 = 690kJ. As long as recycling costs remain this high, the “always upgrade” attitude is generally not a good solution.
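For anyone who wants to check those made-up numbers, here they are as a sketch (again: illustrative figures, not measured data):

```python
# Two hypothetical chips with identical making/recycling cost.
make_cost_kj = 300.0     # kJ, making + recycling (assumed)
old_kj_per_year = 10.0   # kJ/year consumed by the old chip (assumed)
new_kj_per_year = 8.0    # kJ/year consumed by the new chip (assumed)

# Option A: keep the old chip for the full 10 years.
keep_old = make_cost_kj + 10 * old_kj_per_year
# Option B: run the old chip 5 years, then replace it and run the new one 5 years.
upgrade = (make_cost_kj + 5 * old_kj_per_year) + (make_cost_kj + 5 * new_kj_per_year)

print(keep_old, upgrade)   # → 400.0 690.0
```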
Thanks for this.
He says how it is green right in the article: The new PC’s production requires a half ton of fossil fuels. It would take that old PC around a full year running 24/7, or around three years of normal use, before it would equate to the fossil fuels needed simply to *produce* that new PC. And that’s not even counting energy and environmental impact of the water and chemicals needed for the new PC.
An old p4 desktop with a CRT monitor can use twice the power of 2 new laptops.
In a warm climate like Arizona a dozen old computers in an office will significantly increase cooling costs and quickly negate any manufacturing savings.
OK.
1) Don’t use a CRT.
2) If you compare old laptops to new laptops, I believe the energy savings are much less if any, and heat production is a non-issue.
3) Even if you do compare a P4 desktop with CRT to a new laptop, it would take at least two years of both computers being turned on 24/7 before the total energy costs of the laptop (including production) meet up with those of the old PC. That means probably around five years of actual usage until the energy consumption of both equals out–by which time you’ll probably be wanting to get a new laptop once again….
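As a hedged sketch of that break-even reasoning (every number here is an assumption I’m making up for illustration, including the embodied-energy figure):

```python
# When does a new laptop's production energy pay for itself in reduced draw?
embodied_kwh = 4000.0   # assumed production energy of a new laptop
old_watts = 250.0       # P4 desktop + CRT, running (assumed)
new_watts = 30.0        # new laptop, running (assumed)

extra_kw = (old_watts - new_watts) / 1000.0   # extra draw of the old setup
breakeven_hours = embodied_kwh / extra_kw
breakeven_years = breakeven_hours / (24 * 365)
print(round(breakeven_years, 1), "years of 24/7 operation")   # → 2.1
```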
I doubt this is true for all desktops, but even if it were true there are plenty of situations where old PCs make sense. Keep in mind that if a person can’t afford a new computer there’s a good chance they can’t afford air conditioning either.
Based on what?
Energy costs are relative.
Cooling costs are relative.
Laptop prices are relative.
Manufacturing costs can only be estimated.
And most importantly:
Productivity gains are not being taken into account. A small gain in productivity can easily offset the price of the new laptop.
Green math is fuzzy math, much like any other politically motivated math.
The focus should be on improving technologies because devices like CRT monitors fail eventually anyways.
Based on the amount of fossil fuel burned.
It’s clear to me that this does not translate directly *to anyone* as a monetary savings. However, the question was posed “how is it green”? and that’s the answer.
An early Prescott, with a high-power CRT, could easily use 3x the power of current desktops, and at least double that of current notebooks.
Maybe we should just send all the old PCs to the Eskimos?
Wow, claiming that your i7 is “greener” than your old Pentium 4 is like claiming the latest Porsche is greener than your previous one.
Seriously, you’re doing it wrong. Instead of identifying that the latest processor does more stuff in less time with less power, you need to start thinking about different ways to get your work done.
As a start, try a downgrade to an Atom – then start thinking about maybe converting to an ARM device that uses a few Watts of power… Otherwise, you’re not really making a dent, just like upgrading from a 32mpg car to a 34mpg car yields no actual efficiency gains and in fact wastes more energy than it saves through the needless production of an entire new vehicle, and costly destruction of a perfectly good one.
Switch to an ARM device? Come on be realistic at least. And for a lot of people an Atom isn’t enough power.
The author’s point is entirely valid. Old desktops are inefficient especially if the monitor is taken into account. A lot of those old p4 cpus were built on 130nm dies and don’t have the power saving technologies in a modern cpu like the core i3. They’re not only hot but also require noisy fans which has been shown to reduce productivity.
Environmental nitpicking is lame, especially when you are nitpicking someone that is using less energy than before.
Exactly… just like a bicycle isn’t enough transportation for a lot of people.
I couldn’t care less about environmental nitpicking, myself – which is why I find it so crass when people who have no clue what they’re talking about claim they’re being “green” because they threw away their old computer and got a new one.
Throwing away and buying new rarely uses less energy in the process.
Linux is normally worse at using the energy-saving features of the hardware. Especially on older hardware, ACPI support is hit or miss. Generally, Windows could use less energy, especially if you can use standby instead of leaving the computer running all day.
Just turn it off! On a desktop computer it consumes almost infinitely less energy than standby (standby energy consumption is more than 50% of power-on consumption, vs. a power-off consumption of 2W or less, and even less if you unplug the wire; see http://www.sequoyahcomputer.com/Analysis/PCpoweruse/pcpowerconsumpt… ).
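To put rough yearly numbers on those three states (the wattages are assumptions in line with the figures above):

```python
# Annual energy use for a desktop left on, in S1-style standby, or off.
hours_per_year = 24 * 365
states = {"on": 100.0, "standby": 55.0, "off": 2.0}   # watts (assumed)

for label, watts in states.items():
    kwh = watts * hours_per_year / 1000.0
    print(label, round(kwh, 1), "kWh/year")
# → on 876.0 / standby 481.8 / off 17.5 kWh/year
```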
Standby is a hack which appeared because Windows’ startup times were growing more and more unacceptable. If you target power saving, standby is *not* the way to go. Hibernate/suspend to disk is the sole sensible alternative to power off in this area (because components are really unpowered), and even it shares some serious technical drawbacks with standby.
PS: While you’re looking at that page, let’s compare numbers…
C2D under load : 176W
Athlon XP under load : 190W
C2D idle : 100W
Athlon XP idle : 148W
As you can see, we have not exactly seen a major improvement in power consumption for average desktop PCs. Unless you’re ready to endure Atom performance, there’s only a ~30% improvement.
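The “~30%” figure comes straight from the idle numbers quoted above:

```python
# Idle power draw from the comparison above.
athlon_idle_w = 148.0   # Athlon XP system, idle (figure quoted above)
c2d_idle_w = 100.0      # C2D system, idle (figure quoted above)

improvement = (athlon_idle_w - c2d_idle_w) / athlon_idle_w
print(round(improvement * 100), "% lower idle draw")   # → 32, i.e. roughly 30%
```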
Moreover, this guy’s dirty old laptop consumes 14W to ~30W of power. I bet newer computers actually do far worse than that, considering that in a few years, laptop power supplies went from ~40W max to ~80-120W max…
Recycling older computers actually sounds quite sensible when you have those numbers in mind. Only the screen issue is questionable (if I remember correctly, LCD screens are less energy-efficient than CRT screens if they last less than 5 years, because of their higher building/recycling cost. It’s hence right to ditch a CRT screen and replace it with an LCD after 5 years, but not before).
Oh come on, that was made in 2007 and he’s comparing gaming PCs. Try comparing a 45nm C2D and an early P4.
CRT monitors not only take more power but also increase cooling costs. You’d also be crazy to hold onto a CRT for any “green” reason when LCD monitors are much better for the eyes. Give your eyes a break and go plant some trees if you have eco-guilt. LCD monitors have been shown to reduce eye strain and fatigue.
/begin rant
As an aside, I stuck with my own CRT as long as possible for eye-strain reasons ^^ For a long time, LCDs made my eyes hurt much more than a CRT with an 85Hz+ refresh rate, due to their excessive brightness and gamma (which you couldn’t set all the way down to pure black, due to the use of a stupid neon backlight), and due to their tendency to “vibrate” and become unreadable when scrolling. Add in crappy viewing angles, black that’s far from being black, a philosophical opposition to subtractive color synthesis (waste of space, waste of energy), poor color rendering, and a resolution too high for pixel-based OSs… I was really hoping that OLED would be ready by the time my CRT wore out.
Sadly, I had to ditch it in meantime, for arm strain reasons. When you’re forced to sit in a fashion that’s not parallel to your desk due to it not being wide enough to store a keyboard *and* a CRT screen, problems ensue…
/end rant
Check out different displays. That reads like you hate TNs with analog inputs more than you hate LCDs.
Indeed. It was extreme on early LCD displays, but newer ones are much better, even though they still hurt in my opinion (my room uses a 60W light bulb, and the screen is much, much brighter, even when set to minimal brightness, at which point it is just unreadable. My CRT could be set up to work in such average light conditions, but desktop LCDs require more backlighting. Laptop screens are easier on the eyes in terms of brightness, at the expense of worse performance in other areas…)
You really should find places where you can get a good look at them. I kind of lucked out, getting some older NEC PVA and S-IPS monitors to get used to. When I started working in front of the cheapest ones that OEMs would put their logos on, you’d better believe I researched, checked them out in person, and ended up spending a decent bit more than the minimum for the type and size. I’m not arguing that the ones you’ve seen look bad – many do – I’m arguing that not all of them do, and that the technology and common awareness of it are not yet at a point where you can easily be guaranteed that a given monitor, sight-unseen, will suit you.
LCD monitors made to move stock by being the cheapest are currently worse than cheap CRTs used to be, IMO, at least in terms of image quality.
You can pay too much, but you do get what you pay for. TNs tend to be the cheapest, have the fastest natural response, tend to be very good for clear text, and have the narrowest viewing angles and color ranges. But even then, a cheap TN will often have bad backlight bleed, possibly a ‘cold’ look to the backlight, and likely a preset gamma with horrible light colors or horrible dark colors (you pretty much have to choose one or the other with TNs). MVAs can look fuzzy, and I find the ones with ‘overdrive’-type features for faster response times nauseating. PVA is like MVA on steroids (generally better in every way, but with a similar ‘feel’ when looking at it). e-IPS is cheaper than S-IPS, which may not look enough better to justify its higher cost. S-IPS is generally godly, but has visible grid lines, which is not so good for text.
Well, since my next computer, which I’ll pick up at the post office on Monday, is a laptop (an Asus N61JV), I’m more or less screwed if the screen is of poor quality.
As far as defects are concerned, knowing its 17-inch N71JV cousin and supposing that they used the 16-inch version of the same screen technology, I know that it probably has quite narrow viewing angles (common on laptops; I just don’t know why, since cellphones manage to get the screen right), and that contrast is average. For the rest, it seems to be an average-to-good laptop screen, but I don’t know which technology is being used.
Edited 2010-06-19 21:52 UTC
A huge amount of energy is wasted because people keep their computers on almost all day. They don’t know when they will need them, and keep them on just for convenience: to check their mail, look something up, or read the news.
Turning the computer on and off every time you need it for 30 seconds to a minute is cumbersome and wasteful.
This is the scenario where standby shows its strength. With working standby, the computer can go to sleep after a certain amount of inactivity, but is available again within a few seconds when needed.
This can save a lot of energy.
80+ watts in standby? Er…I think somebody was using S1. S3 is where it’s at, is what you should use, and is what most OEM PCs have been coming set up with for the last several years.
http://www.silentpcreview.com/article869-page5.html
Many boards will be higher than 7W, but almost all should be in the <20W range.
I got four old machines.
One is a dual P3 650. On the “vintage” side, I use it to play Windows 98 games and to run BeOS R5 and Zeta 1.2. On the “get something newer on it” side, I am unsatisfied with Linux, as it runs too slow for my tastes; Haiku does a much better job.
The second is a Thinkpad 365XD from 1996. Still runs the factory installation of Windows 95, and I see no reason to attempt to upgrade it. Just playing old DOS games with it. Or powering LEGO robots at shows 🙂
The third is a Quadra 950 running System 7. A wonderful machine, but I don’t know what to do with it other than playing a bit and it takes a lot of room, so it’s parked at a retrocomputing club.
The fourth is a Commodore 64, running BASIC V2 (I wonder how many people know it’s from Microsoft). Actually I don’t use it at all; running VICE is much easier. But I can say I’m a proud owner of a working C64!
I’m considering getting a Mac Mini G4, to have a second Mac and a PowerPC machine.
This is a notable point since in a few years we could all be installing Haiku on old computers, even in some cases over Linux installs.
Reminds me of a Chinese proverb: when one door closes, another one opens.
In most cases, the problem with “old” computers is not obsolescence. It’s that they tend to become noisy, dusty, rusty, and increasingly prone to failures, and if something breaks, it’s a nightmare to find a replacement if you don’t know someone with a bunch of spare parts. So that’s why most people just buy a new machine, also taking the price into account.
For a company, it’s just nonsense to attempt to repair an old, generic office machine, as the price of repairs easily approaches the price of a new one.
Often, they just rent the machines, so they can have them serviced (read: replaced) on the same day.
An older 17″ CRT monitor uses about twice the energy of a new laptop and also increases cooling costs.
I’m all for computer recycling and in fact donated a monitor the other day but let’s not forget that a lot of advances have been made in the area of power efficiency, especially when it comes to monitors.
I’ve been struggling to donate my old PC. I refurbished it with a new power supply and Xubuntu, but no established charity wants anything older than 5 years. It’s still a very decent word processing / web surfing machine.
Any ideas for doing this in the DC area are welcome.
Offer it on craigslist.
1. Registry and such: it’s your choice to use such applications. Many are fine for jumping hardware.
2. Activation: has anyone had problems with Windows and hardware upgrades/replacements? It’s disconcerting to need to call and ask permission to get the OS back, but I’ve yet to have them say no.
3. The hardware requirements chart. The greatest problem with a PIII and Windows 7 is that the whole PC is so often worth less than the required RAM. I’ve used Windows 7 w/ 1.5GB RAM, on slightly faster PCs, and once you turn off a few services (like indexing and such), it runs happily.
For instance, my notebook, with a P3 1.13, SuperSavage, and 384MB RAM, ran Windows 7 better than XP, until I tried to do some multitasking or open too many FF windows. Getting it to 1GB RAM costs $100-150, depending on the phase of the moon and the mood of the DRAM oligopoly.
With desktops, it can still be $50-100, if you don’t have some around. This problem is about the same for Windows as it is for Linux; with the caveat that you have at least three pretty good low-RAM Linuxes to choose from (DeLi, Puppy, DSL), and several experimental ones (Tinycore, for instance).
But, seriously: 5GB HDD?! That’s…almost nothing. I’d have had to regularly uninstall and reinstall games to keep up with something that little. Actually, I had such problems with a 20GB around that time.
I shudder to think of how many K7S5As are being sent to landfills, these days, when they could make good servers/appliances, or excellent hand-me-down PCs for people that just need a computer they can hook DSL or cable up to.
Ask at your local Salvation Army, Good Will, soup kitchens, etc., before you throw away several-year-old PCs.
One thing to be wary of, though, is limping along systems that are too old, like PIIIs and even some P4s, in terms of their power requirements. Dumping them in a landfill may in fact be better for the environment.
NOT that many places just dump computer parts into landfills anymore; my local dump has a dropoff for computer parts to go through several stages of recycling first (the SECC steel from computer cases is big bucks now)… Though you’ll always have your nutjobs bitching about how plastic doesn’t biodegrade; neither does rock. If you think the chemicals coming out of the plastic as it sits in the ground are going to be any worse than the radon and heavy metals already present…
But in terms of power requirements: I was recycling my old A64 4000+ for use as my media center PC, but it was sucking 160 watts at idle and around 240 watts under load between the old 160 gig hard drive, the power-hungry CPU, and the 7600GT video card I had sitting in it. The power supply and mainboard had both been iffy for a while, so it became time to trash it.
I’ve since replaced it with an Acer Revo; for less than $200 I got a machine that does the same job but consumes 8 watts at idle and 14 watts under load. Given that I tend to leave the machine running, that ended up shaving around 120kWh a month off my bill. Between stranded charges, taxes, etc., I’m stuck paying 17 cents per kWh (despite the alleged base rate of 7 cents/kWh), so that’s $20/mo off my bill; in ten months the thing will have paid for itself financially…
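The payback arithmetic in that comment can be checked with a quick sketch. The wattages and the 17¢/kWh effective rate are the comment’s figures; the always-idle duty cycle and the 730-hour month are simplifying assumptions:

```python
# Rough payback estimate for replacing an always-on media PC with a
# low-power nettop. Wattages and $/kWh come from the comment above;
# the 24/7-at-idle duty cycle is an assumed simplification.

HOURS_PER_MONTH = 730          # average hours in a month
PRICE_PER_KWH = 0.17           # effective $/kWh after fees and taxes
NETTOP_PRICE = 200             # dollars

old_idle_w = 160               # old Athlon 64 box, idle
new_idle_w = 8                 # Acer Revo, idle

saved_kwh = (old_idle_w - new_idle_w) * HOURS_PER_MONTH / 1000
saved_dollars = saved_kwh * PRICE_PER_KWH

print(f"~{saved_kwh:.0f} kWh/month saved, ~${saved_dollars:.0f}/month")
print(f"payback on a ${NETTOP_PRICE} nettop: "
      f"~{NETTOP_PRICE / saved_dollars:.0f} months")
```

This comes out to roughly the 120 kWh/month and ten-month payback claimed, within rounding.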
But seriously, what’s worse for the environment: dumping some six-year-old system in the landfill, or the carbon footprint of my consuming 120kWh/mo I don’t have to?
Now think about that with something like a P3… That 1.6GHz Atom backed by an Ion in the Revo will blow most any P3 configuration you can come up with out of the water on performance… If power-wise it pays for itself in ten months, why would you even bother, from a ‘green’ or financial standpoint, limping that old dog along? Sometimes you have to take it round back o’ the woodshed and put it down like Old Yeller.
If the system in question is less powerful than an Intel Atom, do the world a favor: trash it and go buy a nettop. Acer Aspire Revo, Asus Eee Box, MSI WindBox, etc. Within a year it will have saved you money, and you’re doing more for the environment.
It’s like the people schlepping along CRTs not for the picture quality, but because they don’t want to put them in a landfill. Given that a modern 24″ widescreen sucks around 35 watts and one of the old 21″ IBM CRTs sucks around 140 watts, you’re polluting more by keeping it running than you would be by throwing it in the landfill.
Even on laptops, which are ‘low power’ machines: I have an old Dell 8200 1GHz P3 I keep around as a backup in case something goes wrong with my primary laptop. At idle it sucks down 40 watts, under load it sucks 65 watts, and when charging it consumes a whopping 90 watts. My primary laptop, an HP NC8000, sucks comparable power…
But if I was looking to actually run that Dell as my primary, I’d probably just get an el-cheapo modern laptop instead, since with all the obsession with battery life, even your crappy Sempron, Celeron 900, or A64 TF-20 pulls less than 10 watts at idle (with the screen dimmed), 30 watts under load, and 60 watts when charging. It might take two years to break even financially on that, but it’s a hell of a lot greener, since people have their panties in a knot over the mere notion of nuclear power, solar and wind are naive pipe dreams, and I don’t live anywhere near a hydroelectric station (and your leftist moonbats are banning the building of new ones), meaning the majority of electricity I receive comes from coal and oil.
In a lot of ways it’s just the NIMBY attitude in action. “I don’t want that in my local landfill.” Oh, so you want it to cost you more AND end up at your local power station instead? Kind of like hybrid vehicles, where you’re not so much using less energy or polluting less as just shifting the damage from one place to another, and, thanks to losses in transmission and conversion, using MORE energy.
See the Prius, which over its service life has a larger carbon footprint than a Hummer (no joke); its emissions are just produced somewhere else, the pinnacle of “Not In My BackYard”.
— edit — Oh, BTW, watch out for the card stacking and lies your eco-nuts will try to feed you… You really think that the majority of that “1.5 tons of water” doesn’t end up back in the environment after treatment, and within a year has gone through the full natural cycle? That companies don’t gather and re-use leftover chemicals to build the next machine (chemicals cost money)?
It’s treading into the paranoid “chemicals are evil” nonsense, where you can trick your typical pseudo-science green nut with the DHMO trick. That pages like that takeback coalition site are filled with outright lies (how do you get four pounds of lead from a CRT when 28 pounds of a 30-pound monitor is glass and steel? That’s 4-8 OUNCES, not pounds) just shows they are packing you full of sand, or omitting facts to support their agenda. BTW, lead comes out of the ground, and since I’m in New England, if we have a rockslide it’s right there along with the zinc ore.
An agenda that’s actually somewhat questionable, when most of their advice and information has the opposite impact of what they are saying… Figure in the numbers that don’t even make any sense…
Edited 2010-06-17 06:03 UTC
That’s a tough question to answer. Quite frankly, we don’t have the means to precisely measure the environmental impact of producing the electronics, nor the impact of generating your power. CO2 is an easy metric to understand and use, but it is anything but complete, and many technologies that cause little CO2 emission can spread other harmful pollutants, whose damage takes ages to measure properly.
Who knows if your Atom box is saving the environment, relative to keeping your old box? However, the argument is not without merit.
Your Revo might very well be greener over its lifetime. But even if it is, will the options that people actually use be? How many Atom machines for normal home desktop use have you seen in stores, or advertised to normal people? I have seen zero. It’s generally going to be a new normal desktop PC, with a late-model Core 2, i3, i5, Athlon II, or Phenom II, hideous blue LEDs, and a bunch of crapware. Of those, the i3 will use the least power and the Phenom II the most, but how much that saves in the long run is up for debate.
Compared to those, a decent several-year-old PC, with a little sprucing up from donated parts, and a shiny new OS load, saves the costs of creating a new one, with basically no sacrifices for most people, who don’t even need the power of single-core non-HT Atoms, but don’t know enough to make an informed decision one way or another (except that the charities have PCs for free, or very cheap).
Beyond that, since there is a clear productivity improvement from using newer, better hardware for people who actually stress it, there is no lack of demand for new hardware. So there is also no lack of supply of older hardware that might be good enough for someone who doesn’t know or care about the cool/nerdy/inefficient/high-tech things you are trying to use your computer for (I’d normally love to bash Flash here, but the recent performance improvements genuinely have made it tolerable on P3 and Athlon boxes).
There’s no silver bullet here, but I think recycling PCs is better than throwing them out, for your average Facebook, YouTube, webmail, and job-website user, who would otherwise get a crappy new PC that isn’t a true power miser.
If what you say is true, then the “unless there’s an order of magnitude improvement” part of my post is invalid.
However, again, if making/recycling costs 300 kJ, an older device which eats 10 kJ/year and lasts 10 years costs 400 kJ in total, while replacing it after 5 years with something which eats 1 kJ/year costs 655 kJ.
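Those toy numbers can be checked directly (all figures are the comment's hypotheticals, not real measurements):

```python
# Toy lifecycle-energy comparison using the comment's hypothetical
# figures: 300 kJ to make (or recycle) a device, an old device that
# eats 10 kJ/year, and a replacement that eats 1 kJ/year.

make_cost = 300  # kJ to manufacture/recycle one device

# Option A: keep the old device (10 kJ/year) for the full 10 years.
keep_old = make_cost + 10 * 10

# Option B: replace it after 5 years with a second device (1 kJ/year)
# used for the remaining 5 years.
replace_early = (make_cost + 5 * 10) + (make_cost + 5 * 1)

print(f"keep old: {keep_old} kJ, replace early: {replace_early} kJ")
```

With these assumptions the early replacement loses badly; the manufacturing cost dominates unless the running-cost gap is far larger.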
I think the issue is non-trivial, and requires extensive studies by the manufacturers. Though they do a lot of nonsense in other areas, in my opinion Apple is the most advanced at showing how things should be done here: going into detail about what is done, how much energy is consumed, in which part of the product cycle it is consumed, and taunting competitors who don’t do the same.
http://www.apple.com/environment/
A law should be passed to make sure that all manufacturers do that. That law should in particular specify what information must be given on such a page, to prevent marketing-oriented environmental pages made only of positive elements that favor the brand.
As an example of a case where precise legislation would help: on Apple’s website, when I read that the glass of the screens is arsenic-free, I can’t help but wonder, “And what about the backlight diodes? GaAs and InP are quite common in the optoelectronics industry, and I think there’s much more arsenic in an average diode array than in dull glass. It’s true that silicon-based LEDs (whose feasibility has recently been demonstrated by Intel) consume tremendous amounts of energy compared to those based on direct-gap semiconductors, which forces people to compensate with more power-efficient software, but they have the obvious advantage of much reduced toxicity.”
Edited 2010-06-17 08:29 UTC
This is an interesting question. This is how I generally try to answer it:
If I buy something for $200, I can’t imagine that it uses more than $50 of oil. If you compare the amount of CO2 that that much oil emits when burned to how much your local power plants would emit generating the power you would save with the new device over a few years (which depends on the percentage of your power that comes from fossil fuels; here it’s about 35%), you can figure out which one has the smaller CO2 footprint. So in your example of the device paying for itself in a year, you’re probably saving CO2.
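That back-of-envelope comparison can be sketched in code. All the constants here (the oil price, the CO2 released per gallon, the grid's CO2 intensity) are my own rough, assumed figures, not data from the comment or any authority:

```python
# Back-of-envelope CO2 comparison: embodied fossil fuel in a new
# device vs. the generation emissions avoided by its power savings.
# ALL constants are rough assumed figures for illustration only.

OIL_PRICE_PER_GALLON = 2.5      # $/gallon (assumed)
CO2_PER_GALLON_KG = 10.0        # ~kg CO2 from burning a gallon of fuel oil
GRID_CO2_PER_KWH_KG = 0.5       # ~kg CO2 per kWh, fossil-heavy grid (assumed)

def embodied_co2(oil_dollars):
    """CO2 from the fossil fuel embodied in manufacturing."""
    return oil_dollars / OIL_PRICE_PER_GALLON * CO2_PER_GALLON_KG

def operating_co2_saved(kwh_saved):
    """CO2 avoided by the electricity the new device saves."""
    return kwh_saved * GRID_CO2_PER_KWH_KG

# $50 of oil embodied in the new device vs. ~1300 kWh saved over
# a few years of use:
print(embodied_co2(50), operating_co2_saved(1300))
print(embodied_co2(50) < operating_co2_saved(1300))
```

Under these made-up numbers the savings win, but as the comment says, the result swings with the local grid mix, and none of this covers pollutants other than CO2.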
However, all this does not take into account any other pollutants.
That trick is so much fun! I think it helps that people tend to associate it with carbon monoxide, which actually is bad.
I agree. The biggest example of this is the anti-nuclear people who would rather see us continue to burn coal and natural gas. They point and yell “wind turbines, solar panels!”, ignoring the fact that nuclear power is the only CO2-free base-load power source besides hydro power (which is limited).
EXACTLY my point about the logic disconnect: much of this ‘think green’ rhetoric not only fails to take it into account, but amounts to little more than pulling numbers out of their backsides, and lying through card stacking or just plain assertion.
… and if you have more than one brain cell to rub together, the lies become obvious. Lies like:
Let’s assume that’s an imperial ton, and to be really conservative we’ll use #2 heating oil as our standard (one of the most economical fossil fuels by volume, skewing the numbers in THEIR favor)… At 7.2 pounds per gallon at 70F, that’s 138.9 gallons; at $2.529/gallon right now from the place down the road, that’s $351 in fuel alone.
A volume discount and paying supplier cost would certainly decrease that, if you’re talking about the fossil fuels consumed to make the electricity used in manufacturing, but by the time you figure in energy lost in conversion, transmission costs, holding costs, etc., you’re right back at the same number or higher.
I very much doubt that the $500 Staples/Walmart special with display has over two thirds of its cost tied up in a half ton of fossil fuel.
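The commenter's arithmetic checks out; as a sketch (the weight, density, and quoted per-gallon price are the comment's figures):

```python
# Sanity check of the "half ton of fossil fuels per computer" claim,
# following the comment above: price a half short ton of #2 heating
# oil at the quoted local retail price.

POUNDS_PER_TON = 2000           # imperial (short) ton
LB_PER_GALLON = 7.2             # #2 heating oil at ~70F
PRICE_PER_GALLON = 2.529        # $/gallon, quoted local price

gallons = (POUNDS_PER_TON / 2) / LB_PER_GALLON
retail_cost = gallons * PRICE_PER_GALLON

print(f"{gallons:.1f} gallons, ~${retail_cost:.0f} at retail")
```

Roughly 139 gallons and $351, matching the figure in the comment; the point being that this is implausibly large a share of a $500 PC's price.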
If you take the time to rub those brain cells together, the numbers you’ll see in articles like this one, and on sensationalist eco-nut websites, make about as much sense as the average article from the Weekly World News.
Yes, there’s a Virus that makes computers EXPLODE!!!
Edited 2010-06-18 15:22 UTC
Except, of course, that people are charged horrendous fees for heating oil because they can afford it and can’t do without it (in a capitalist system, prices are not calculated based on the value of the product but based on how much the consumer is ready to pay for it).
Do you really believe that medicines based on ultra-common organic chemicals cost anywhere near 60€ for a few pills? Even taking salaries into account, that number is just insane. Again: a small fridge costs 300€…
Yeah, you’d think it would be a red flag when they provide an unsourced number that just happens to round nicely, like “half a ton”.
Back in my day, Linux evangelism at least had the virtue of being short.
Is there something wrong with OSNews? I did not see any update since yesterday!!!
Did Thom already step down?
“your laptop is the most expensive electronic item per cubic inch in your house.”
Take out the “electronic” part and my wife wins.