Your computer is an important energy consumer in your home. Can you save
energy when using it? This article offers a few tips.
1. Buy a power meter (wattmeter)
You only know how much electricity you use if you have a way to measure
it. And only if you measure it can you minimize its use. A power meter
(wattmeter) gives you this ability. You can buy one for only $25 US at Amazon.
A popular inexpensive power meter
You plug the meter into the wall socket, then plug the device you
want to measure into the meter. To measure how much electricity all
your computer equipment uses, plug it all into a power
strip, and plug the power strip into the meter.
If you’re like me, when you get your new power meter you’ll run
around the house and measure the power consumption of everything in
sight. You’ll quickly find ways to minimize your use of electricity,
powering off some appliances and using others less. With your computer you’ll discover:
How much electricity your system uses depends on the device type, the power
consumption of the processor and motherboard, the number and type of
memory sticks, how many disk drives, what peripherals you have, etc.
Your computer uses less electricity at idle than when in use, and heavy
use drives the power consumption higher than light use. Heavy
disk I/O and maximum CPU utilization use more power, for example.
2. Use a laptop instead of a desktop. Or a tablet or smartphone.
It’s tough to generalize about computer power
consumption because it varies so widely. (That’s why you bought the
power meter!) Most desktop computer systems consume between 70 and 200
watts while in use, while laptops often burn between 15 and 60 watts. Both figures include typical LCD displays. View this power consumption chart to see how much electricity some typical systems use. This chart lists power consumption for displays alone.
A laptop in use almost always uses less electricity than a desktop in
use, often by a significant ratio. If you’re buying a computer and
you’re into saving
electricity, go with a laptop. I’ll show you a formula below so you can
easily determine the energy cost savings you get with a laptop. (We’re
only talking ongoing energy consumption here. We’re ignoring factors like
whether laptops last as long as desktops, how much energy goes into
their production, etc.)
If they meet your needs, consider moving to even more energy-stingy devices,
like tablets or smartphones. (Here are example power specs for the iPad and iPhone.) From least to greatest energy consumed, the device type hierarchy is: smartphone -> tablet -> laptop -> desktop.
3. Use a flat panel display instead of a CRT monitor
Those big, bulky old CRT monitors use several times the power of
space-saving, power-sipping flat panel displays. CRTs often consume
between 60 and 110 watts, while most flat panels are down in the 15 to
60 watt range. No contest!
The energy savings might even pay for a flat panel to
replace an old CRT. You can calculate this yourself. Here’s how.
My old Dell D1028L 17″ CRT used about 100 watts of power, while my beautiful replacement Acer P205H 20″ LCD/TFT display uses only 28 watts (and only 0.75 watts at Standby!).
Say I use the computer for 4 hours per day every day. Here’s the electricity calculation for these two displays:
CRT: 100 watts * 4 hours/day * 365 days = 146,000 watt-hours per year
LCD: 28 watts * 4 hours/day * 365 days = 40,880 watt-hours per year
Electric bills measure your usage in kilowatt-hours (kWh), where one
kilowatt-hour is 1,000 watt-hours. So the comparison is 146 kWh versus 40.88 kWh per year.
Your electric bill shows what your power company charges per kWh. Mine
happens to be 15 cents per kWh. Just multiply it out:
CRT: 146 kWh * $0.15 = $21.90 per year
LCD: 40.88 kWh * $0.15 = $6.13 per year
So I save $21.90 – $6.13 or $15.77 a year by using the LCD instead of
the CRT. If the Acer costs me about $120, I can divide $120 by $15.77 to
determine that it will pay for itself in under 8 years solely from
electric bill savings.
The financial experts out there will point out I’m ignoring amortization,
inflation, and lord knows what else, but this simple calculation
suffices for most of us. Use it to compare
electrical costs:
(watts * hours of use / 1000) * charge per kilowatt-hour = total electricity cost
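To make the arithmetic concrete, here is the formula as a short Python sketch. The wattages, hours, and the 15-cents-per-kWh rate are just the assumptions from my display example above; substitute your own meter readings and your own utility's rate.

    def annual_cost(watts, hours_per_day, dollars_per_kwh):
        # (watts * hours of use / 1000) * charge per kilowatt-hour
        kwh_per_year = watts * hours_per_day * 365 / 1000.0
        return kwh_per_year * dollars_per_kwh

    RATE = 0.15  # my utility's charge per kWh; check your own bill

    crt = annual_cost(100, 4, RATE)  # old 17" CRT: 146 kWh, $21.90/year
    lcd = annual_cost(28, 4, RATE)   # new 20" LCD: 40.88 kWh, $6.13/year
    print("Annual savings: $%.2f" % (crt - lcd))        # $15.77
    print("Payback: %.1f years" % (120 / (crt - lcd)))  # about 7.6 years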
Laptop versus Desktop — Redo
Let’s use the formula to compare laptop to desktop energy costs.
Say I want to replace a Dell OptiPlex GX260 Pentium 4 and its 17″
Dell LCD display from 2003 with an Apple MacBook Pro with
15″ integral display from 2010. The Dell desktop and its
display together consume 104 to 162 watts during moderate use. I’ll
assume 135 watts here. The MacBook clocks in at 55 to 58 watts for
moderate use. I’ll assume 55 watts.
We’ll again assume 4 hours of moderate use per day, or 1,460 hours
annually. At 135 watts, the Dell desktop consumes roughly 197.1 kWh
per year. At 55 watts, the laptop consumes around 80.3 kWh per year.
Assuming 15 cents per kWh, that prices the desktop’s annual electrical
bill at about $29.56 and the laptop’s at $12.05. The difference is
$17.51. I can use this number to help me decide whether buying
the laptop is
worth it.
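Plugging the assumed wattages into the annual_cost sketch above reproduces these figures:

    desktop = annual_cost(135, 4, RATE)  # 197.1 kWh, about $29.56/year
    laptop = annual_cost(55, 4, RATE)    # 80.3 kWh, about $12.05/year
    print("Laptop saves $%.2f per year" % (desktop - laptop))
    # prints $17.52; the $17.51 above differs by a penny of rounding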
I’ve made many assumptions here but the approach is valid. Use it
to compare electrical costs. Remember that we’re excluding other
factors like the energy used during manufacture, longevity (MTBF),
disposal costs, etc.
4. Set your Power Management options
Measurements with your new power meter quickly expose a power
consumption hierarchy. These are your computer’s power usage levels,
from greatest power consumption to least:
Use -> Idle -> Standby ->
Hibernate or Turned Off
A heavily-used computer sucks more power than one at idle, while idling requires more power than standing by or hibernating.
There’s lots of confusion between the terms standby, sleep, and hibernate.
For good reason. How they work varies by operating system and by computer
manufacturer! There is no industry-standard meaning for these terms.
I’ll use the U.S. Department of Energy definitions. These recognize two distinct states: hibernate and standby.
Hibernate means the computer
saves its state, then turns itself off. It will reload its state when
you turn it on again. Resuming from hibernate is faster than a regular boot.
The computer uses essentially zero power during hibernation, the same as when it’s turned off.
Standby means the computer still uses power but at a reduced level.
Non-essential components are powered down. The computer can “wake”
faster than from hibernate mode, but uses some minimal power to enable
this. This chart summarizes:
                       | Standby                    | Hibernate                              | Off
Energy Use:            | 1 to 5 watts               | 0 watts                                | 0 watts
Time to Become Usable: | quickest (under 5 seconds) | intermediate (30 seconds to 2 minutes) | longest (2 or 3 minutes)
U.S. DOE definitions for Standby and Hibernate. Figures are approximate.
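If you want to estimate what a daily routine costs across these states, here is a minimal Python sketch. The per-state wattages are placeholder guesses for illustration, not measurements; replace them with readings from your own power meter.

    # Assumed draw in watts for each state; measure your own machine.
    STATE_WATTS = {"in use": 120, "idle": 60, "standby": 3, "off": 0}

    def annual_kwh(hours_per_day):
        # hours_per_day maps each state to hours spent in it daily
        return sum(STATE_WATTS[state] * hours * 365 / 1000.0
                   for state, hours in hours_per_day.items())

    left_idling = annual_kwh({"in use": 4, "idle": 20})      # ~613 kWh/year
    uses_standby = annual_kwh({"in use": 4, "standby": 20})  # ~197 kWh/year
    print("Standby saves %.0f kWh per year" % (left_idling - uses_standby))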
One quick word about screen savers. Many don’t
save energy. That’s not their purpose. They may even use more
electricity if they’ve got color graphics moving all around. (Pick the
“Blank screen” as the most energy efficient screen saver.) To save
energy
either turn off your display or set its power management options.
To set your computer’s Power Management options, in Windows you can right-click on the desktop,
select Properties, then choose the Screen Saver tab. Here you set the screen saver attributes. Click on the Power button to set the power management options. In Ubuntu go to System -> Preferences -> Screensaver. You can click on the Power Management button from there.
Typical Linux Screen Saver and Power Management Panels (from openSUSE 11.4)
Always test your power management settings. It’s not unusual to find quirks in how they work (or
don’t). This is doubly true if the operating system you’re running was
not manufacturer-installed.
Beyond OS-based power management, you can also install an intelligent power calibration tool like Granola. This free Windows & Linux product does dynamic voltage and frequency scaling (DVFS) for your system’s CPUs. MiserWare’s web site claims Granola can reduce your computer’s
electrical use by 15% to 35%. Monitor with your power meter over a
period of time to see what results you get.
How much electricity can you save by proper power management? It depends on how you use your computer. This study
cranks real numbers for office workers and finds that reducing annual
energy consumption by 80% is not unusual. In other words, going from
ignorance of power management to active power management is a huge win.
5. Turn off your computer when you’re done for the day
From a power consumption standpoint, a computer turned OFF always uses less power than one that is still ON.
Some claim that the energy required to start up a computer is greater
than that consumed by leaving the computer on overnight. Not
true. The power drawn during start-up is about the same as during
maximum regular use (“heavy use”), and it lasts only moments. Measure it
with your power meter, or view the proof in this chart. The extra
energy consumed at start-up is nowhere near the energy used by keeping
the computer on all the time, even if the computer is in standby, which draws just a few watts.
This research study gives you the numbers on this and concludes that “… ensuring computers are turned off at night dramatically reduces their energy consumption.”
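A back-of-the-envelope sketch shows why the start-up claim fails. These figures are assumptions for illustration (a two-minute boot at heavy-use wattage versus ten hours left on at idle overnight), not numbers from the studies cited above.

    BOOT_WATTS = 150       # assume boot draws heavy-use power
    BOOT_MINUTES = 2       # assume a two-minute start-up
    IDLE_WATTS = 60        # assume this idle draw if left on
    OVERNIGHT_HOURS = 10

    boot_wh = BOOT_WATTS * BOOT_MINUTES / 60.0   # 5 watt-hours
    overnight_wh = IDLE_WATTS * OVERNIGHT_HOURS  # 600 watt-hours
    print("Boot: %.0f Wh vs. left on overnight: %.0f Wh"
          % (boot_wh, overnight_wh))

Even with generous assumptions, start-up costs two orders of magnitude less energy than leaving the machine on overnight.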
Some claim that starting a computer “stresses it” due to a “power
surge” and makes it more likely to fail sooner. They say you should
leave your computer on overnight rather than switching it off and on.
Majority opinion among those who seriously research the issue is that “Hard
drives and other components are now better built, so wear and tear through
daily powering on and off of desktop computers is no longer a consideration.”
The U.S. Department of Energy has
it right for most of us: “Most computers reach the end of their ‘useful’ life due to advances in
technology long before the effects of being switched on and off multiple
times have a negative impact on their service life. The less time a PC
is on, the longer it will ‘last.’ PCs also produce heat, so turning
them off reduces building cooling loads.”
Of course, if you need your computer on at night to perform work,
that’s a whole different story. Some people perform updates or run
batch processes overnight. Just set your power management options so
that they take effect after the work completes.
6. Use an Ink Jet printer and turn it on only when printing
Printer
energy consumption varies widely. You’ll have to use your power meter to
see how much electricity yours draws when printing and when turned on
but idle.
From least to greatest energy consumed, printer technologies order like this: dot matrix -> ink jet -> laser.
Since dot matrix printers are largely
obsolete due to their lesser print quality, this makes the ink jet the best
energy-conscious choice for most home users. An active laser printer uses several times more electricity than your computer!
Most home users only print occasionally. You can save
electricity by turning on the printer only when you intend to print,
then turning it off immediately afterwards. Don’t leave it on all the
time. Some people waste electricity because they
put their printer on the same power strip as their computer so it’s on
whenever their computer is on. Energy savings will be dramatic for
laser printers, which consume hundreds of watts when active and tens of
watts even
in standby mode.
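Since the pages you print cost the same energy either way, the waste is the standby draw. Here is a quick sketch with an assumed standby figure; laser standby consumption varies widely, so measure your own printer.

    STANDBY_WATTS = 15  # assumed laser standby draw; measure yours
    # Leaving the printer on all year versus switching it off when done:
    wasted_kwh = STANDBY_WATTS * 24 * 365 / 1000.0
    print("Left on all year, standby alone burns %.0f kWh" % wasted_kwh)
    # about 131 kWh, or roughly $20 per year at 15 cents per kWh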
7. Buy Energy Star equipment
Energy Star (or ES) is a joint program
run by the U.S. Department of Energy and the Environmental Protection
Agency. Computer equipment that meets ES energy efficiency standards displays the Energy Star logo:
The Energy Star Logo
Many countries beyond the U.S. have adopted the ES system
and at least 40,000 appliances conform to it. Energy Star specs cover
all kinds of computer components including system units, power
supplies, batteries, displays, printers, and more. What’s great about
ES is
that the program is strictly voluntary. It does not impose
government regulation. Yet it saves billions of dollars per year in
energy costs while minimizing our carbon footprint. The EPA estimates
ES saved $14 billion in energy costs in 2006 alone.
The first ES specification for computer equipment was developed in
1992. The current computer specification is version 5.0 from 2009. The good news is that the ES specs keep moving
forward and incrementally improving. The bad news is that the changes
make it hard to compare power efficiency between systems released under
different ES standards. ES is a great
tool for buying new energy efficient hardware. It isn’t so useful when
comparing hardware across time.
How much will you save by buying Energy Star products? This
article calculates a total lifecycle
savings for a computer system of $161, or 18% of the initial purchase
price. Even if that’s on the high side, it pays to buy ES!
8. Keep your computer in service longer
In my articles on
computer refurbishing I advocate keeping computers in
service until either technological obsolescence or their natural end of
life. Some readers have responded that this is a bad idea because older
computers consume more electricity than newer ones. Actually, if you’re
talking consumer systems produced over the past decade, there has been
only a minor reduction in operational energy consumption in spite of the ES energy
efficiency increases. You can’t take a random new computer and assume it uses
less electricity than one from 2005 or even 2000. The big change for
consumer systems has
come in
displays. Not only do LCDs use way less energy
than CRTs, but newer flat panel displays use less than older ones.
The key reason to keep consumer computers in service longer is environmental. It costs both natural resources and energy to make a computer. Lots. If your current machine
still does the job, replacing it with a new computer
of the same type (desktop or laptop) unnecessarily consumes resources.
According to a United Nations University study, the natural resources that
go into making a computer are about 10 times its weight in fossil
fuels and chemicals, versus only 1 to 2 times for a
car or refrigerator. This is why your little laptop is so resource-intensive to produce compared to larger appliances in your house.
Then there’s the huge problem
of proper disposal of all the toxins computers contain when they’re no longer usable.
What about energy? This research project found it takes about 1,778 kWh of electricity to produce a desktop
computer and monitor as of 2004. This is as much energy as the typical
household uses
in two months. It’s enough energy to keep your 100 watt desktop in
active use for 17,780 hours. That’s over 12 years of daily use at 4
hours per day. Given
a typical lifespan
of 3 to 5 years, this means that the energy cost to produce a computer
greatly exceeds the energy it consumes during its operational
lifespan.
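The arithmetic behind that conclusion, as a quick sketch. The 1,778 kWh figure comes from the study cited above; the 100-watt draw and 4 hours per day are this article's running assumptions.

    MANUFACTURE_KWH = 1778  # cited figure for a desktop + monitor (2004)
    DESKTOP_WATTS = 100
    HOURS_PER_DAY = 4

    hours_of_use = MANUFACTURE_KWH * 1000 / DESKTOP_WATTS  # 17,780 hours
    years = hours_of_use / (HOURS_PER_DAY * 365)           # ~12.2 years
    print("Manufacturing energy = %.1f years of daily use" % years)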
This article underscores the point. Based on research recently published in the Journal of Cleaner Production, it concludes that “Computer factories eat way more energy than the devices they build.” The study found that 70% of the energy a typical
laptop will consume during its life span is used in manufacturing the
computer!
If you buy two computers of the same type
(desktop
or laptop) over a decade, you use less total energy than if you had
bought three in
that
same time span. The best way to reduce your computer
energy use is to buy fewer computers. This is why research from sources as diverse as Fujitsu and the Gartner Group advocates device longevity.
What if you replace a device of one type with a device of another
type that consumes less electricity?
For
example, say you replace your old desktop with a new laptop or a
tablet. Or
say you replace your laptop with a smartphone. Now you have a much more
complicated comparison. I’ve shown in this article how to compute ongoing
electrical costs. You can use the simple formula to predict and compare your electrical bills for different devices.
However, we
haven’t compared the energy required to manufacture different kinds
of devices. Nor have we explored how recycling costs might differ for
device types. And these are just two of the factors you’d have to
measure to make an accurate judgement.
For now let’s stop at one useful conclusion. So long
as it still does
the job you need done, it is cheaper to keep your current computer
in
service than it is to replace it with a similar new one.
This is true both from the standpoint of the total energy cost (energy
cost to manufacture plus ongoing power consumption) and from the
standpoint of natural resource consumption.
Refurbishing and reusing computers in the 3 to 10 year old range is
without question good for the environment.
– – – – – – – – – – – – – – – – – – – – – –
Howard Fosdick (President, FCI) is an independent consultant who
supports
databases and operating systems. His hobby is refurbishing computers as
a form of social work and environmental contribution. Read his other articles here or email him
at contactfci at the domain
name of sbcglobal (period) net.
great article, but there are more severe problems you can find with a wattmeter (not a voltmeter, as the article originally said)
and sometimes there are simple means to solve them
for instance:
adding extra insulation to a boiler can reduce power consumption significantly
True, and use night-store rates so that you’re not heating up water during the day when the peak power prices are high; that combined with good insulation around the boiler has helped me save a few dollars each month.
So these rates actually work? I see why electricity companies make electricity cheaper at night, but I’ve always wondered if it was an effective way to make people turn power-hungry devices on at night.
Anyway, if I may add my own computer-unrelated advice… During the winter, make sure that heaters only heat when you are at home, and heat up less during the night. One actually sleeps better if the house is at 15-17°C, although you get cold mornings as a counterpart.
Because the peak power usage is during the day, in much the same way as with public transportation: during the peak time you have all the generation happening, but at night you have the generators standing idle, which is money wasted, so to encourage power usage at night the price is cheaper. In terms of night store rates, most of the power companies in NZ give you two options: a flat rate which is the same price regardless of the time, and night store.
Interestingly enough, there are quite a number of companies that operate their power-hungry equipment at night to save money. IIRC Comalco in NZ (aluminium smelter) do a lot of their stuff at the ungodly hours of the night.
Yup, as said previously, I know that. I was only wondering if it was effective, that is if many people actually modified their habits to use electricity during the night. Especially when some companies don’t make having two different rates for electricity price the default, and charge a little per month for it.
Besides, isn’t the situation reversed in winter, when people turn on home electric heating at night? (Whereas large offices tend to prefer fossil fuels for heating.)
We have a washing machine with a timer function, so it starts washing at around 03:00 am.
Used to have our water heated at night (3 to 6 kWh). However we now use a combi boiler which overall costs less to run (we switched due to tank and boiler failure).
Howard,
Great article as usual.
Sorry if i repeat anything you already said below.
I sometimes build custom gaming pc’s on order and can share a few pointers.
1. Check if your computer’s PSU (power supply unit) is 80 Plus certified.
Many PC manufacturers use generic PSUs that don’t qualify for 80 Plus certification.
Check, or let someone qualified check it for you, and if it’s not 80 Plus certified it’s a good idea to replace it. 80 Plus certified PSUs are not even that expensive. It might also make your PC run quieter.
There are three 80 Plus certification levels: Bronze, Silver and Gold.
2. LED screens use less energy than LCDs.
Rather use an LED screen.
3. When adding a second 3.5″ SATA HDD, go for the “green” ones.
For instance the Western Digital Caviar Green
or the Seagate Barracuda Green.
These drives switch off when inactive.
4. The motherboard’s firmware sometimes installs a control centre with tweaking options like “eco mode”, among others.
5. AMD Cool’n’Quiet, activated via your BIOS and then from the desktop, can save a lot of power. See below.
http://www.amd.com/us/products/technologies/cool-n-quiet/Pages/cool…
6. Hardware aids for power usage.
ZALMAN ZM-PCM1
http://www.zalman.com/ENG/product/Product_Read.asp?Idx=417
ZALMAN ZM-MFC3
http://www.zalman.com/ENG/product/Product_Read.asp?Idx=341
With this product you can conveniently switch off disks in a multi-disk configuration.
LIAN-LI BZ-H06
http://www.lian-li.com.tw/v2/en/product/product06.php?pr_index=487&…
7. Even some RAM comes in eco form nowadays.
Kingston Hyper-X LoVo ( Green )
http://www.ec.kingston.com/ecom/configurator_new/PartsInfo.asp?root…
And lastly:
Our PCs are often overkill for what we use them for.
If you are only going to use spreadsheets and read email you won’t need to buy a high-spec PC.
Some new all-in-one motherboard/CPU units have come out with lower power consumption but still respectable speed and other bells and whistles.
For instance, the AMD E-350 Brazos uses much less energy but still has high-def playback, HDMI out, SATA 6G, USB 3 and many other features.
SAPPHIRE E350M1 PURE FUSION
http://www.sapphiretech.com/presentation/product/product_index.aspx…
Replacing an ordinary PSU with an energy-efficient one dropped the noise level and almost halved the power usage
(now less than 40 watts idle) on one machine. The largest power saving is a Mac mini (usually below 20 watts, 13 watts when idle) but it takes a long time to recoup the Apple tax
Excellent points. Thanks for adding this useful information.
My apologies to everyone for inexplicably renaming the wattmeter/multimeter a voltmeter. Glad to see most folks overlooked this goofy error and focused on the useful points in the article.
— Howard Fosdick
In printer manuals I’ve seen, they recommend switching off the wireless capability when not in use. Apparently this saves a bit of power.
Ink cartridges don’t last nearly as long as toner cartridges (but, admittedly, have less plastic). In low-volume printing you’ll end up replacing ink far more often (in terms of page count) due to it drying up, and you’ll probably end up replacing printers far more often too, due to ink being more expensive than the printer.
So, overall cradle-to-grave energy consumption of a laser may be better.
Laser printers use more power, though. Still, I’m not sure which has the lowest cost of operation once you factor in the cartridges.
In a ‘low usage’ scenario, what matters most is the lifetime of the printer itself: I’ve had my laser printer for 10 years and I haven’t had to replace the cartridge yet.
In the same time, my brother had to buy two inkjet printers because the first one failed after a few years, plus several ink cartridges because they failed too (not enough usage).
Plus my year-old laser actually shuts itself off after 5 minutes of inactivity. It doesn’t “sleep”, it shuts off. It can’t be woken up by software; a hardware button must be pressed to get it to print. I usually leave it unplugged just in case. I understand the energy implications, but inkjet is crap.
I don’t often print, but when I do print, I want it to look good.
I had an old laser printer that went into a deep-sleep mode that also required a hardware button to be pressed to wake it up. It was faster than waking up from a power cycle, and it only consumed about half a watt while asleep if I remember correctly.
These days — especially for us alternative OS users — I would suggest a Brother laser printer for several reasons. They tend to be energy efficient, and they have the cheapest toner cartridges by far (they don’t chip their cartridges so you can get generics and refill kits super cheap). They pretty much all use Postscript and PCL, and they are among the least expensive laser printers out there. No, I don’t work for them, but I’ve had a lot of experience with them at my part time job and I absolutely love them!
Here’s a neat trick with Brother printer cartridges: As I said above they don’t use a chip to tell the printer when the cartridge is empty, rather they use a gear on the side of the cartridge. Once the gear has rotated 180 degrees from printing all those pages, the printer senses it and tells you to replace it. You can buy a toner refill kit for less than $10, pop off the plastic cap on the side of the cartridge and refill it, then take a Phillips screwdriver and remove the gear cover. Turn the gear back to the original position and replace the cover. You now have, according to the printer, a new cartridge.
You can even do this with the starter cartridge, the only difference is the starter doesn’t have all the gears necessary to reset, so you have to buy a $5 gear kit for it. Once you’ve installed the kit, it becomes a standard cartridge and holds just as much toner as a retail unit.
You’re the most interesting printer user in the world!
I recently replaced a 140W 20″ CRT with a 14W 20″ Acer LED ($100), and since that monitor is used 16 hrs a day at a 15¢/kWh rate, the LED pays for itself after a year or so. The CRT though just won’t go and die yet. The CRT still looks way better for square TV/video, but far worse for everything else.
My oldish 24″ LCD though is closer to 80W and is toasty to sit in front of. Most plain LCDs are not as frugal as they could be, typically 60W and up for 24″ and bigger screens. Some of those 27″ and bigger Hanspree and HP models are well over 100W.
My next 24″ will also be a 1920×1200 LCD from Lenovo that uses a half-power CC tube at 35W. I wonder why more CC tubes don’t use that design since it compares well with LED on power.
Still, LED back lighting really seems to be smacking LCDs now on power, and the price difference is becoming minimal and could easily cover itself in short order. Also LED makes for much slimmer, lighter panels (and wobblier too).
Also on ATX power supplies: most PCs use <80% efficient PSUs and the power factor is usually bad too. Compare the VA value against the wattage value; VA is often 20% higher. Since most budget PCs can use a motherboard with built-in graphics, they should be able to run on 60W or so, especially if a 2.5″ HD is used. That means the 300W PSU could easily be replaced by a micro PSU that fits entirely in the 24-pin molex connector. These fanless Minibox supplies can deliver from 60W-150W, saving space, noise and power, and are >95% efficient. They do use an external 12V DC adapter though. Using one will limit expansion options.
For most of my PCs, I also switched to refurbished 2.5″ HDs, leaving the 3.5″ HDs for mass storage used only when needed. Saves power, space and noise. Microcenter often has these for $15-$20 at 40-60GB size.
You know that marketers are taking over our world when even somebody aware of “LED back lighting” (of… LCD panels) contrasts that with… LCDs ;(
Generally, we had a nice scam going around in retail, for some time. LED-backlit LCDs were probably less expensive to make from the very start of their mass-production. And, regarding labels, imagine the mess when actual (O)LED monitors will really show up en masse… (kinda like the fad of stereoscopy trying to take over the label “3D” – while there are IMHO much nicer approaches)
Oh, and only edge-lit panels seem to be much slimmer. The ones with a “matrix” of dozen+ LEDs at the back of LCD panel don’t appear that different (they do make inexpensive panels much nicer when it comes to contrast, blacks, etc.)
“LED-backlit LCDs were probably less expensive to make from the very start of their mass-production”
That is probably overstating the situation, but today the crossover point has more or less arrived for the sweet spot around 20″ to 23″. Today it seems CC tubes are finally going away for the commodity market.
For the higher end at the 24″ to 30″ panels the price difference is way more marked, perhaps the engineering is still tougher for large area edge lighting, and you also get into TN vs IPS issues and color gamut and professional use issues.
Maybe, maybe not. The screens are generally almost identical, except for the type of edge-placed back-lights*, which should be at least very comparable in cost (especially since CCFLs need a relatively fancy power source); LED probably has lower future costs of disposal (which nowadays often need to be factored in) and greater reliability (fewer warranty returns).
Additionally, you don’t need to search very far to find plenty of examples of human irrationality, especially when it comes to purchasing dynamics. I see no particular reason why it wouldn’t be the case in this field. Or consider what was in the interest of non-consumers in the equation: clearing their warehouses and supply chains.
* And even for notably different screens, those with a LED array behind the panel, something happened around half a year ago (in my part of the woods) which made them well entrenched in the segment of… perhaps not the cheapest possible choices, but very much “a fairly typical, average-priced TV” (with large, ~40″ LCD panels; yes, in a bit different league, always topping out at 1080p for one; but I see no clear reason for back-light engineering to be much different in smaller, higher-DPI panels)
I just very recently bought myself a new laptop and noticed that it actually has two graphics cards: a low-power, low-performance one and a high-power, high-performance one. I can switch between them manually or the system can do it based on my power-profiles, though it doesn’t switch the high-power one on when I e.g. start a game. I s’spose it’s a shortcoming of Windows and the drivers.
Anyway, it made me wonder when PC manufacturers will actually start doing the same thing on desktop PCs. It would make sense; even if it adds $15 to the cost up front, it’ll pay for itself pretty quickly for most people.
EDIT: As a side-note I’ve gotten the impression that switching graphics card on-the-fly is STILL not possible under Linux. I would think that such an ability would be very high on the to-do list, after all it does save power quite a bit, but I can’t recall having seen anyone even thinking of working on that. Has this changed yet, does anyone know?
Have switchable graphics on my PC too, and I can attest that support in the Linux world is still in its early stages. Last time I tried, a few months ago, the computer locked up.
All I want myself is something that switches off the NVidia GPU and never turns it on again, since I don’t use GPU-intensive software on Linux. Wonder if that’s possible already…
There is, with the ATI proprietary driver (PowerXpress). Unfortunately you need to unlink/relink the ATI OGL library every time you switch.
Also there is some open source stuff that works with the radeon and intel GPUs (I’m not sure it works with nouveau).
None of them are dynamic. You need to specifically switch and restart X (sometimes even restart the PC). Alas, the rather crufty graphics stack prevents dynamic switching…
That’s what I thought, and that simply is not an acceptable solution. I hope someone gets around to doing it properly, though.
That also means that I won’t be using Linux on my laptop.
My PC consumes 20W when used normally.
AMD Sempron 3400+, RAM: 1.5G, 2 hard drives, GeForce 7300 SE.
It runs GNOME with metacity (no OpenGL effects). The screensaver is a black screen (no OpenGL).
It tops at 80W when playing 3D games.
The screen consumes 25W: a 19″ iiyama.
It beats your MacBook. My guess is that Macs are bad at energy saving because they use hardware-accelerated effects to display the desktop. Is that right?
One thing I have noticed is that the computer consumes energy when doing computing. These days, the graphics card is more powerful than the processor. It’s the unit that consumes the most power; it dwarfs all the rest. Getting a green hard drive is useless when you waste 80% of your energy in the graphics card. The hard drive’s consumption is not significant.
My computer consumes 1 to 2W when switched off!
I bought a plug switch and turn it off so it now consumes 0W when switched off.
Some people think there is a spike when the computer is turned on. There is none. If it takes 30 sec to shut down and 2 minutes to boot and you leave for 5 minutes, you will save energy if you switch it off. It’s nothing like a diesel motor, not to mention that what people think about modern diesel motors is also false. Switching your motor off is also worth it.
If you are using a laptop as a desktop replacement, as mentioned in the article, and it is plugged into the wall, please, pretty please, remove the battery! You don’t use it and you are charging it. There is an insane loss of energy to store it, then your battery will lose that energy over time as heat when the computer is turned off. Moreover your battery will live longer if you don’t waste your limited cycles.
Most desktop PSUs have hard switches at the back. That should take care of the stand-by power consumption.
I would hope for average battery-charging software / logic to be better; to not recharge obtrusively when not really needed… (though this might involve a setting or two; still, probably less “taxing” than something as scary as manipulating a battery, for most users)
Overall, having a built-in UPS is too nice to abandon (and again, in a “desktop replacement” normally working off the mains, the battery doesn’t need to be obtrusively topped-up at every opportunity)
PS. Funny thing with diesels: they actually scale their fuel usage down exceptionally well when only a fraction of their total power output is required, or when idle. Yeah, people could stop being almost prejudiced about them.
Well, it does not matter how clever the software is. Charging the battery costs energy, always. If you use it, when traveling for instance, you need it. But if your laptop is always plugged in, just remove it. The ratio of energy stored to energy consumed is insanely low. And you store it for nothing; it will dissipate as heat, and not so slowly.
But that’s the thing, “smart” software can be set in such scenario to essentially not use the battery
Yes, when kept in the warmer environment of a laptop it will dissipate slightly faster, and overall be recharged more, but nothing too dramatic. Especially versus the perks of a built-in UPS (and I write this especially from the point of view of the “average user”, who seems to have a hard time protecting against sudden power loss (saving data, for example), and we are supposed to “get their data back” after the fact…)
The problem is that it is a trade-off between saving electricity and having equipment that lasts longer. Laptops tend to have a relatively short lifetime, and when something (screen, keyboard…) stops functioning, it is not easy to replace that part, especially after the end of the warranty. So people tend to buy a whole new laptop. Building and disposing of electronic devices has great environmental costs (consider how much pure water it takes to make a processor).
A CRT screen is by far less convenient than a flat panel, but the latter is not nice at all when it comes to dismantling.
It’s not much of a trade-off in practice – at worst, you just use it until it fails. I think most units don’t fail throughout their life, and people still get new ones (too often because “old computer started to be broken & too slow” aka just filled with crapware*)
I disagree: my 10-year-old desktop PC is still going quite well (except for a blue screen from time to time, don’t know if this is a HW or SW issue) whereas I know someone who had a laptop which failed just after the end of the warranty.
So? “I have” / “I know someone” is not only a poor input vs. overall trends (how people don’t seem to hold on to working machines often enough), you also essentially reaffirm my very own assertion that, yes, it’s not something you can plan around much (please, please, drop the confirmation bias of “they don’t build them like they used to” – we just remember positive specimens better, the few still working doubly so), so at worst, you just use it until it fails.
No, he’s right. 1 out of 15 laptops fails in the first year, 1 out of 5 laptops fails before its 2nd birthday and 1 out of 3 laptops fails within 3 years. And if you have an HP laptop, you have 1 chance out of 4 that it will fail in 2 years. It’s not a myth. Laptops have a very poor life expectancy. I can confirm this with my experience as well.
Again, it doesn’t change how they are too often replaced before hitting their limits (likewise for desktops of course; NVM how comparative data between the two classes are still missing, or how laptops are also a bit more likely to endure physical hardships, that’s not the point)
It’s not much of a trade-off when, too often, people aren’t very willing to hold on even to working equipment.
When a laptop fails, it means it is used. It does not fail in the trashcan or in the recycling factory.
Which, again, doesn’t exclude other (and IMHO way too widespread of course) scenarios, practices.
[citation needed]
Any one person’s experience is not statistically relevant. You cannot confirm anything with your experience, even if you were in an industry where you worked with a large number of laptops.
Your experience can lead to a more scientific inquiry to investigate any claim arising from personal experience.
I realise I’m being a bit pedantic, but too many people day in and day out spout similar things about how what they experience, like, or prefer is somehow universal, because they are special or something. And, yes, I’d count myself among those. I try not to draw too many conclusions from my own experience, but occasionally it’s interesting to see some raw unscientific observations as well. Just keep in mind that it’s not necessarily the case. Computers, people, things, life, existence, the universe, are all complex things with lots of small interacting parts. It’s really easy to screw up our understanding of everything. Brilliant people do that every day.
Here is one citation: http://www.squaretrade.com/htm/pdf/SquareTrade_laptop_reliability_1109.pdf
There have been several studies about laptop failure rates. It is confirmed by my experience, but I based the numbers I cited on real studies over hundreds of thousands of laptops. Google laptop failure rate for other citations if you need. They all end up with pretty similar numbers.
Thanks for the citation. It makes everything easier to examine.
For instance: the three-year failure rate is not 1/3, it’s 1/5. 1 in 10 people accidentally break their laptop.
Also, there is a bit of potential selection bias going into the study. It looks at laptops covered by a third-party warranty. So, more accurately, it shows that 1/5 of laptops owned by people who think their laptop may fail, fail within three years. Those who don’t pay for the warranty may take better care of their laptops (i.e. not storing it in a hot car, not leaving it on 24/7, not using it to level out the kitchen table, not running the battery till death for the fun of it, using an additional cooling pad etc).
Again, I really don’t have a dog in this fight; were we saying that laptops break more often or something? I’m just complaining about people throwing around anecdotes and not understanding statistics. It’s of minor importance in this case, but we suck just as much when it is important… like economics, politics, and health issues.
For LCD screens, you can make a screensaver that shows white instead of black. LCDs work in such a way that power is needed to make black; white is the natural state of the screen, so it only uses power for the backlight. And if you turn off the screen’s light, then it saves power completely. Try white with a wattmeter.
With plasma and CRT screens black is the natural state, so showing black should save power. But those screens naturally eat more of it anyway.
What about LED-backlit LCDs? Aren’t they smart enough to selectively turn off some of the white LEDs when displaying black?
Ah, can’t wait until we have OLED and transflective screens everywhere
I believe that most LED backlights don’t actually have a lot of LEDs behind the screen (known as “full array”) that can be turned on and off, but rather have LED emitters at the edges of the screen that uniformly illuminate the area behind the LCD panel.
There is a technology known as “local dimming” that works like you describe, but it’s only in high end TVs at this point. I don’t know if anyone is making computer monitors with local dimming.
I think you are quite wrong there.
It is a fact that the LCD screen pixel structure is only about 5% efficient at letting light through when fully on; that means 95% of the light from the source, whether CC tube or LED, is blocked and therefore turned into heat. If the screen is black it would just be 99.9% instead. To save power means controlling the light source power.
There was/is a company called Unipixel that had an LCD alternative claiming a 60% efficient light cell based on a neat MEMS opto-capacitive structure. They even licensed it to Samsung in 2009; not a squeak since, though. When driven by side-lit LEDs, its power use would be far below today’s LED LCD panels, which are mostly already good enough.
A voltmeter would tell you that your computer is fed with 230V AC all the time, unless you have some really bad power grid around. I suggest using a wattmeter instead, although it’s open to discussion.
EDIT : Ooops, someone said it before me.
EDIT 2 : Excellent research, by the way. You’ve found quite a lot of references to prove your point !
By the looks of the power socket, I guess that would be 110V AC …and I didn’t even need a voltmeter to assess it, remotely!
Too bad, it makes the article a bit of a mixed bag. On one hand, as you say, nicely sourced. But the watts/volts mix-up (especially since the latter is essentially constant in such scenarios) casts a shadow… No, really, with such a basic mistake repeated throughout, it unnecessarily makes the whole article suspect from the start; about the writing process, the author behind it, how much of it can be depended upon, etc.
(don’t get me wrong, obviously quite a lot, it has useful practical guidelines; but BTW, I can’t help but notice how immense portion of it is what I would hope to be common sense knowledge :/ – even bordering on “perpetuum mobile doesn’t exist” – which, sadly, probably isn’t common sense knowledge, hence the usefulness… though I’m not sure if OSNews is the best channel)
PS. And getting another wattmeter is not strictly required when… virtually all dwellings already have a very precise central one. Considering how rarely we would fiddle with this, disconnecting every other electrical energy sink for the time of PC experiments is fairly trivial; you can temporarily move the PC closer to it, too. Absolutely basic arithmetic will take us most of the way to per-socket one; especially since for calculations we see averaging & “assuming” anyway (of course that assumes somebody can notice, connect the dots between energy usage of each device vs. overall bill; but it’s also required with portable wattmeter)
Heck, promoting such portable wattmeters when there’s already a very good one (and not really very inconvenient) in every house, is itself a bit of a waste of resources, energy
Ah, you’re right, I totally forgot that not every part of the world uses my fellow voltage standards
Though about wattmeters, it’s my turn to accuse you of being region-specific ^^ In France at least (should check in Sweden while I’m here), old-fashioned mechanical watt counters with a spinning thingie (1 turn = X kWh) are still common, and using them for wattage calculation is quite cumbersome and imprecise.
Such meters are also still standard at my place; I don’t really see them as “quite cumbersome and imprecise” (vs. cheap portable meter), especially since for calculations we see averaging & “assuming” anyway (overall, total usage over time is what we need to have in mind, if energy conservation is to become a routine)
And might I add that, in your unsound accusations, you actually gave me an opportunity to berate you so much more strongly!
I’ve never even been anywhere close to a 110V place, I just remember they exist.
Are there any 110-volt zealots or 230-volt fanboys? I want to see their flamewars.
Anyway, anybody dumb enough to live in a 110-volt area deserves to have their equipment break earlier and their computers run slower.
230 volt is the true path to enlightenment.
Ontopic: I always thought it stupid that I had to cool down my PC with a cooler while at the same time heating my room. Maybe there is a way to connect a CPU to a radiator.
I dream of a building where there is a huge central heat pump. It cools down all fridges, computers, and freezers and on the other end distributes heat to whatever needs it. In every room, you find faucets for hot and cold fluid that you can use to heat and cool down things as needed.
(Flamewar contribution: 110V AC and 230V AC are both lame, DC current makes the chances of surviving an electric shock much higher* and is the One True Way
* An irrelevant side-effect of DC being slightly reduced power grid, generator, and transformer efficiency)
Wow. OSNews readers can come up with a flamewar about anything! 🙂
Back in the very earliest days of electricity production, I’m sure there must have been heated debates over AC vs DC and 110-220-240 Volts and so on. The players would have included Edison, Tesla and other notable figures.
“Ontopic: I always thought it stupid that I had to cool down my pc with a cooler while at the same time heating my room. Maybe there is a way to connect a cpu with a radiator.”
I had been thinking the same thing too, at least when I was using a D805 + 2 large CRTs that typically used 400W continuous. How to get the heat away from me and towards a more useful purpose? Anyway, it’s all moot now.
Power saving is all right but I am really not enjoying this cold summer … hence I leave my computer ON to contribute to global warming. It’s a small contribution but it’s all I can afford.
So where in the world is summer too cold?
In the US we have probably had enough of the heat dome effect, although in Mass we are not so affected.
You want to start a flame war on global warming, seriously?!
I often joke about it… getting warmer does not motivate me to save power, quite the opposite. I do not believe global warming is mainly caused by man, and whatever it is mainly caused by (the sun, for example), I would seriously enjoy more heat from it. Writing from Central Europe.
It also depends where the power comes from … cutting on other harmful gases is OK, but CO2 is not the one we should avoid!
co2 is one to be avoided, but not the only one
but like with most big tasks:
you have to make the first step
and every little bit counts
so don’t be lazy and turn off all devices you don’t need
instead of a bigger airconditioner beef up the insulation of your house
put some solar collectors on your roof because the sun shines for free
there is so much you can do today…
CO2 is healthy for vegetation; all the plants rely on it, so what the hell? I am all for saving the planet, not polluting the environment, not using rivers for waste dumps, saving endangered species etc… but CO2 is just silly. Even if global warming were harmful (I don’t know), CO2 is a very minor greenhouse-effect gas, and human-produced CO2 is even less significant.
well, thanks to humans the amount of co2 in the air has increased by 20% over the last 50 years
and that’s a hell of a lot of co2
and to quote wikipedia:
Thanks for bringing up other effects of CO2 increase. I do not believe it contributes to the global temperature in a serious manner (quite the opposite: temperature increase increases natural CO2 production) but it might affect life in other ways, possibly harmful, as you have pointed out.
http://dangerousintersection.org/wp-content/uploads/2006/09/CO2-Tem… This is often pointed out as proof of the CO2/temperature relationship. Careful examination shows that it’s the CO2 that lags behind temperature.
You are spewing out the usual climate-denial junk we expect to see from the wattsupwiththat crowd, as well as the various conservative think tanks funded by ExxonMobil, the Koch brothers or the tobacco industry at the Heartland Institute. Are you a stooge for those interests, or can you actually think for yourself?
If what you say is true, write a paper on all your beliefs and get it published in a peer reviewed journal. You will find your points have all been debunked.
It is one thing to be a conservative/republican, it is quite another to buy into the idea that conservatives automatically have to work for free for Koch industries to promote coal and fossil fuel burning.
It is a simple fact that humans have increased CO2 levels from around 250 to 460 ppm since the industrial revolution started; nature cannot do that so fast. Since China and India have joined the energy party, we are likely headed to 700 ppm in the next century with no stopping. The last time CO2 levels were this high due to nature was eons ago, when life and flora were very different.
And yes, CO2 is good for vegetation up to a point. It is also a trace gas, and physics says CO2 is a warming gas, although water vapor and methane are much worse. The oxygen, nitrogen and argon that make up the bulk of the atmosphere are not warming gases, so indeed the remaining gases can and do make a huge difference.
The CO2 normally cycles through the system over very long periods of time; about the same amount goes into the atmosphere as is taken out by natural processes. The human load is just a small push of a few percent, but it is always adding, so the CO2 level drifts upwards. That is simple integration math.
I could go on but you could learn more from your own research.
Personally I think the only way to a future of guilt-free, plentiful energy for all nations is nuclear power from thorium LFTRs, despite all the green loons’ anger at nuclear. Nuclear energy is millions of times more energy-dense than solar in any form, and there is enough Th for the entire planet to live well for thousands of years until fusion works. It can even help rid the world of the nuclear waste from weapons and regular nuclear plants.
google kirk sorensen thorium energy
thorium is definitely better than uranium
but fusion is no solution
it only helps big power companies to keep their monopolies
in the end we have to switch to some green energy, be it solar, wind, water, geothermal or whatever
all we need is a good and cheap way to store energy for several days
You obviously don’t know any chemistry, because the threat of acidification is exactly zero. It is literally impossible:
– pH is logarithmic. In other words a pH of 6 is 10x as acidic as a pH of 7.
– Carbonic acid (dissolved CO2) is a weak acid. A 30% increase in dissolved CO2 has negligible impact on the pH of seawater.
– pH is highly temperature sensitive. The pH of seawater changes far more due to temperature changes than due to atmospheric CO2 levels.
– seawater cannot become acidified (or significantly less basic) by CO2 because it is heavily buffered by dissolved salts.
– Ocean acidification has never occurred even with CO2 concentrations 20x as high as present.
On the desktop, linux consumes 100 times less energy than windows and 5 times less than MacOS.
Gentoo consumes 1000 times less energy than Ubuntu.
I’d just LOVE to hear the maths behind that claim. I mean, if a Windows PC consumes e.g. 300W while powered on, it would only consume 3W when running generic Linux, and 0.003W when running Gentoo…
Or in other words: you have no idea what you’re saying.
Sarcasm failure detected.
Well, the math is really simple.
A linux desktop consumes roughly the same amount of energy as a windows desktop. There are 100 times more windows desktops than linux desktops. Therefore windows consumes 100 times more power than linux on the desktop.
The same math applies when comparing gentoo to ubuntu.
I thought you would get it.
I guess 650 kWh for 2 months would be more accurate for me.. and that includes heating and cooking.
When someone has to get new electrical wiring in the house, I can give you one piece of advice: make sure you can switch off most power groups/outlets from the main board. In my case it will take less than 10 years before I’ve recouped the initial cost, even when I only take the kitchen into account.
I do plan to build a new desktop, but that one has to be power efficient and should last at least 6-7 years. Its main purpose will be running more demanding programs like editing photos, running virtual machines, etc. When I want to surf, chat or mail I will still use a cheap laptop.
Like others have said, what you want is a wattmeter, not a voltmeter, no question about that.
The device you bought, judging by the name, is a wattmeter, so you got the device right.
Another thing is that any device will use power when turned off _unless_ there is a mechanical switch to completely turn it off; this also applies to hibernation mode.
When it comes to real devices the DOE definition doesn’t mean much; how much each device uses when (soft) turned off cannot be predicted. It can be less than 1 watt or reach several watts; you need to measure to find out how much each device uses.
Wow. I nearly cried. It’s difficult enough to explain the concept of energy and power to people without confusing them with unit abuse.
Energy is measured in Joules. The rate you use energy is power, and is measured in Joules per second, or watts. People don’t like multiplying the wattage of their equipment by the number of seconds it’s on (to give Joules) as the numbers are big, so they multiply the wattage by the number of hours and divide by 1,000, to give kilowatt-hours. That’s not kilowatts per hour, that’s kilowatt-hours (like pound-feet, or Newton-metres). One kilowatt-hour is 3.6 Megajoules.
Your energy company is only concerned with how much energy you have used (in Joules, or kilowatt-hours), not the rate at which you use it (in watts).
Don’t feel too bad. Even eon (the large UK utility company) make mistakes. See the Scroby Sands Wind Farm display in Great Yarmouth, England talking about the number of homes per year that 60 Megawatts can supply.
That meter is a multimeter. It shows more than volts.
LCDs can use as much as a CRT!!! Some LCDs use almost 200 watts. Look for LED LCDs that use no more than 30 watts active.
I monitor my energy use. Most of the older wall warts use 3 watts all the time. If you have ten in your home, that is 30 watts all the time. That’s like putting a 700 watt bulb on for an hour each day. Who would do that?
Put every item in your system on a power strip. When you are not at the computer, turn it all off using the power strip switch. Power strips also help protect against voltage spikes and in some cases lightning.
“LCD’s can use as much as a CRT!!! Some LCD’s use almost 200watts”
While shopping for a new large panel, I couldn’t help noticing that too. The Hanspree 28″ panel that gives us 16:10 1920×1200 for $250 was pretty appealing, except that it sucks down 110W or so. Two of those would kill desk space as well. The Apple 30″ IPS panel also uses 180W IIRC, but the price is beyond my budget and the resolution is too high for older eyes. I think the IPS panels also need more source light since the IPS switch is less efficient. LG makes some 22″-23″ models with the option of IPS vs TN for a very small difference in $ and watts.
Having said that, you have to use a watt meter to see if the specs are true or just overstated. When I bought my Panasonic plasma TV the specs suggested upwards of 300W in use. I checked it in the store with a watt meter (the first time anyone ever did that) and it was half that, and at home it was around 100W.
Also, something most people are not considering is the VA rating rather than the watt level. Most appliances use more VA than watts; we pay for watts in kWh charges, but the utility must produce the VA, about 20% more. That means they have to balance the phase by over-producing power.
LED night lights and a lot of DC-powered devices use really crappy AC-DC circuitry that uses far more power than the DC rating suggests. Set your meter to VA to see the difference. The industry really needs to push harder for a power factor of 1 so VA equals watts. That requires better quality switchers; some PC PSUs do have power factor correction in them, most don’t.
When I search “voltmeter” and “wattmeter” at Amazon I get about the same list of devices in the results. Most of them are multimeters with the ability to provide several different kinds of output. You saved yourself from confusing everyone by including the picture. BTW, the Kill-A-Watt you show provides several kinds of output (as is typical) and will calculate the kilowatt-hours for you without the need for manually cranking through your formula.
How to Save Energy When Using Your Computer
————————————————————-
1. Lay down and type with one hand.
2. Never, ever read the linked articles.
3. Use a Tandy Model 100 laptop, dialed in with a 300 baud modem to your ISP, surfing the web with Lynx.
4. Do not turn your computer on, just stare at the pretty reflection in the monitor.
5. Use binoculars to watch your neighbor surf the web (or so you say).
6. Go wireless – just imagine what the web pages would look like in your mind.
7. Pick up a land-line phone and imitate a modem. If you’re good, you can connect.
8. e = mc^2 – not sure how this helps, thought I would throw it out there.
9. Stop frowning when Thom posts another Software Patent article. EVERYONE knows you use less muscles when you smile :}
10. Don’t think too hard about what you post!
Somewhat off topic: I have recently gotten really sick of seeing the usual black PC cases all around my house, when in reality they were all mostly empty. They were all made from cheap upgrade parts though, so the cases were just getting recycled as the insides got ever sparser. Mostly Sempron systems needing 60W or so. They all had noisy fixed-speed PSU fans and stock CPU coolers, and were dust bunny collectors.
At about 16″ x 18″ x 8″ the volume is unsightly in regular rooms, so I built my own wooden cases using spare floor laminate and some skilled use of the table saw. These are about 10″ by 9.5″ by 6″ and are exactly 1/4 of the volume of the metal beasts. They are just big enough to throw in most cheap mobos with a stripped-down PSU and a 2.5″ HD. The stock coolers got replaced by $10 heat pipe tall stack coolers, which allows for very slow quiet fans. A second fan cools the PSU; both fans are on a regulator set near minimum. They still need more detail work and I doubt the FCC would be pleased.
This still is not satisfactory because even these boxes are mostly empty, although stuffed with excess PSU wiring. What I have in mind for the next phase is to mount the mobo directly on the back of the LCD panel with VESA mount bolts and use a miniBox-type PSU switcher set into the 24-pin connector. The tall heat pipe coolers that sit on the CPU would stick out. It would make the LCD/PC look more like an old TV with the backwards pyramid.
What I wish for now is a way to retain the tall heat pipe stack technology but flatten it over the mobo surface, so the whole thing can be packaged in a slim, book-like package of about 10″ square and maybe 2″ thick. I would like to make or buy a heat pipe cooler integrated with a flat-plate ribbed heat sink. There is still the issue of video cables; can you even get a VGA/DVI cable of 1 foot? All of this is an effort to make the PCs disappear behind the display and also get rid of the wiring tangle.
Of course I could just buy all-in-one PCs or laptops or iPads, but that wouldn’t be any fun and those have other serious issues. All of those have displays that are way too small and use laptop technology.
It’s just a hobby though, looking for more is less.
If you have a website, link to this article.
Every computer user should read it.
If everyone followed its recommendations we would save a lot of energy (at little cost).
I watched a show about CO2.
They said an average of a few hundred thousand years was a number like 225 and the highest recorded (if you believe their assumptions) was like 380.
Today’s readings are like 590 and moving up. I can’t imagine how that can be good.
buy a Mac
jobs is the king of planned obsolescence
buy a rotten apple if you enjoy growing piles of rubbish in africa
Your articles, while based on interesting concepts (reusing old machines, extracting every ounce of usefulness out of computer hardware, electronics disposal, energy saving, etc.), always seem to have “WTF?!” moments that make me question what the hell you’re talking about.
I could probably question other things in this article (I stopped reading them in their entirety a while ago), but I’ll just say this: I would use a pen and notebook paper before I’d ever use or recommend anyone use an inkjet printer again. They are garbage; laser is the way to go.
Toner doesn’t need to be “cleaned” after a week of no use/printing, requiring a new set of cartridges every month because almost all of the ink is lost while cleaning. Laser just works with minimal problems, and if you rarely print anyway, it’s not like you’ll be using loads of energy. And if you’re afraid of using too much energy, there’s always the “On/Off” switch.
Simply put, inkjet is a horrible recommendation, even for those people who for whatever reason are paranoid about their energy use; what they “save” in electricity will be eclipsed several times over by the cost of regularly buying replacement cartridges. It’s 2011, laser printer prices have come way down; how could anyone even consider recommending inkjet?