Best Buy advertisement, or really crappy reporting?



  • http://www.miamiherald.com/living/home/story/748603.html

    To summarize the highlights: "The average American household wastes about $1,000 each year in electricity, according to the International Energy Agency. Vampire power -- the standby energy consumed by household gadgets and appliances -- is the culprit."

    The figure they give, at the 8 cents per kilowatt-hour most of us pay, works out to a waste of 12,500 kWh per year, which in turn is an excess power draw of about 1,400 watts at all times (worked through at the end of this post). The article gets even more ludicrous:

    Not only does it claim that 85% of what I pay on my power bill each year is waste, it pins all of that waste on standby lights, with this idiotic comment:

    "That's because most gadgets have small ''standby'' lights (typically green, yellow or red) that continue to burn power even when we hit the ''off'' switch. In fact, a microwave oven may use more power when turned off than when cooking."

    Yes. They actually said that last sentence. They then discuss the national campaign Best Buy has launched to educate us in this... I smell a paid-off reporter, and either a complicit editor -- or a very stupid one.
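
    For reference, here is that back-of-the-envelope calculation worked through (a minimal Python sketch; the $0.08/kWh rate is the one assumed above, not a universal figure):

        # Sanity-check the article's $1,000/year "vampire power" claim.
        CLAIMED_WASTE_DOLLARS = 1000.00   # per year, per the article
        PRICE_PER_KWH = 0.08              # dollars/kWh, rate assumed above
        HOURS_PER_YEAR = 365 * 24

        wasted_kwh = CLAIMED_WASTE_DOLLARS / PRICE_PER_KWH     # 12,500 kWh/year
        constant_draw_w = wasted_kwh * 1000 / HOURS_PER_YEAR   # ~1,427 W

        print(f"Implied waste: {wasted_kwh:,.0f} kWh/year")
        print(f"Equivalent constant draw: {constant_draw_w:,.0f} W")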



  • It does sound like crappy reporting, clearly just quoting and not actually researched. The bit about using more power while on standby than in operation is usually out of context and makes engineers go "WTF"... it should always be qualified with "over the course of a year": they estimate most of these devices are on standby 80-90% of the time, and many older devices use a surprising amount of power while on standby (7 watts on my old TV!). This won't be a problem for long, though; some manufacturers are now starting to use 'zero power standby' techniques.



  • @versatilia said:

    It does sound like crappy reporting, clearly just quoting and not actually researched. The bit about using more power while on standby than in operation is usually out of context, and makes engineers go "WTF"... it should always be qualified with "over the course of a year" - they estimate most of these devices are on standby 80-90% of the time, and many older devices use a surprising amount of power while on standby (7 watts on my old TV!). This won't be a problem for long though, some manufacturers are now starting to use 'zero power standby' techniques.

    I just boggled thinking of how many LEDs it'd take to reach the 1,400 watt draw I calculated their "average waste" at, after the explanation that it was the power lights. If I'd been drinking at the time, my keyboard would have been covered in coffee or soda...



  • @Wolftaur said:

    @versatilia said:
    It does sound like crappy reporting, clearly just quoting and not actually researched. The bit about using more power while on standby than in operation is usually out of context, and makes engineers go "WTF"... it should always be qualified with "over the course of a year" - they estimate most of these devices are on standby 80-90% of the time, and many older devices use a surprising amount of power while on standby (7 watts on my old TV!). This won't be a problem for long though, some manufacturers are now starting to use 'zero power standby' techniques.

    I just boggled thinking of how many LEDs it'd take to reach the 1,400 watt draw I calculated their "average waste" at, after the explanation that it was the power lights. If I'd been drinking at the time, my keyboard would have been covered in coffee or soda...

    An indicator running at 35 mA from a 5 V rail draws about 0.175 W (LED plus dropping resistor); call it 0.2 W per LED once you include supply losses. So that makes 7,000 LEDs.
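
    As a quick check of that count (a minimal Python sketch; the 0.2 W-per-indicator figure is the rounded estimate above):

        # How many indicator LEDs would a constant 1,400 W draw imply?
        TOTAL_DRAW_W = 1400    # constant draw implied by the article (see earlier post)
        WATTS_PER_LED = 0.2    # ~0.175 W per indicator, rounded up for supply losses

        print(f"LEDs needed: {TOTAL_DRAW_W / WATTS_PER_LED:,.0f}")  # 7,000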


  • A quick walk through my house yields 22 illuminated light switches, 6 assorted night-lights, 15 assorted led clocks, 8 power-strips, 13 power bricks, modems, routers, and even my wireless headset stand has 2 blinky lights. Not to mention all the electronic junk to which all these are attached.

    Makes you wonder....



  •  @snoofle said:

    A quick walk through my house yields 22 illuminated light switches, 6 assorted night-lights, 15 assorted led clocks, 8 power-strips, 13 power bricks, modems, routers, and even my wireless headset stand has 2 blinky lights. Not to mention all the electronic junk to which all these are attached.

    Makes you wonder....

    All it takes at my house is a glance at my room with the lights off. It's like a Christmas tree.

    LEDs... gotta love 'em. 



  • On its own, standby power does add up to a non-negligible number.  But I recently read a discussion of this that pointed out that, in pure energy terms, the energy consumed by leaving a TV on standby all year is less than that consumed by idling your car for 30 seconds.  Of course there are a hundred factors that reduce the validity of this comparison, but it does provide a perspective.

    And I think this may be the more pertinent point:

    @snoofle said:

    Not to mention all the electronic junk to which all these are attached.




  • Actually, having taken a good look at many of my gadgets that have LEDs, it seems that most of them are useless. Honestly, who needs to see a red light on a power strip? If it's on, the attached device will work. If it's off, the switch will be either in the off position or the reset will be tripped. The LED is pointless. Also, all LCD clocks should, by law and punishable by death, have a turn-me-off option, if only so I don't need to reset them after every power blip.

    It's only a few mW each, but multiplied by billions...



  • @snoofle said:

    Also, all lcd clocks should, by law and punishable by death, have a turn-me-off option, if only so I don't need to reset them after every power blip.
    They have: [picture of a hammer]



  • This sounds like one of those news articles that public relations departments donate to local newspapers and TV stations that don't have enough staff to write their own.  Moreover, it sounds like the writer is not only dishonest but has no grasp of basic engineering.

    If the $1000 per year figure has any basis in reality then it probably includes industrial energy waste.  So if you take the total electricity consumption in the US, subtract the amount you would need if every device were 100% efficient, and divide by the number of households then maybe you would get $1000.

    Most people probably do think that it's the LED indicator sucking the power.  The misunderstood bit of knowledge probably went like this: "Many electronic devices draw electricity even when they are turned off. You know that such a device is still using electricity because the lights are on."  Without a sense for how little electricity the LED itself uses, they jump to the conclusion that the LED is the culprit rather than understanding that it's merely a sign that far bigger but invisible drains are active inside the device.


  • There are really two sides to this story.

    For one, I don't get why so many things use so much power when they're switched off. Like the 7 watts versatilia mentions; that just doesn't make sense.

    But then the $1000 figure is probably way off and the power is certainly not used by the 'small ''standby'' lights (typically green, yellow or red) that continue to burn power'. Those LEDs don't use anywhere near 5+ watts of power.



  • How about making faster start up times for operating systems so people don't mind turning computers off?

     

    This reminds me of that program to conserve electricity on your computer... all it does is set the timers for sleep mode (a nicer interface than Windows gives, but mostly bullshit anyway, and a way to harvest your email address for spammers).

     



  • @Hatshepsut said:

    On its own, standby power does add up to a non-negligible number.  But I recently read a discussion of this that pointed out that, in pure energy terms, the energy consumed by leaving a TV on standby all year is less than that consumed by idling your car for 30 seconds.  Of course there are a hundred factors that reduce the validity of this comparison, but it does provide a perspective.

    And I think this may be the more pertinent point:

    @snoofle said:

    Not to mention all the electronic junk to which all these are attached.


     

    They solved that problem: Hybrid/electric cars don't idle.

     

    NEXT...



  • @astonerbum said:

    How about making faster start up times for operating systems so people don't mind turning computers off?
     

    I really hate to be "this guy"... but Windows is the only OS I've ever had startup-time problems with, unless a system was -really- screwed (and that includes Mac, Linux and BSD) -- and I don't think startup time is a major concern of Microsoft's. After all, (in my limited experience) Windows also sucks more energy out of the battery, uses more system resources, and deteriorates your hardware faster, so environmentally, startup times aren't exactly the biggest problem.

    Besides which, even on systems where startup times are negligible, people still leave their computers running constantly. What you're observing is a combination of a poor OS choice and idiotic/wasteful behavior. 

    (BTW, I'm not entirely a "fanboi". I own a windows box too, along with 2 virtual installs)



  •  In fact, a microwave oven may use more power when turned off than when cooking

    Standby power (on average, according to a Korean report): 2.77W

    Operating power (my microwave): 1100W

    Standby: 2.77 x 24 x 356 = 23KWh

    Using the microwave on average for 3.6 minutes per day: 1100 x 0.06 x 356 = 23KWh

    It's not a stretch that someone uses their microwave on average less than four minutes a day.
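
    The break-even point falls out of the same figures (a minimal Python sketch; the 2.77 W standby draw is the report figure quoted above, and 1100 W is the poster's own microwave, not a general spec):

        # At what daily cooking time does cooking energy equal standby energy?
        STANDBY_W = 2.77   # average standby draw, per the report cited above
        COOKING_W = 1100   # cooking draw of the poster's microwave

        standby_wh_per_day = STANDBY_W * 24                      # ~66.5 Wh/day
        breakeven_minutes = standby_wh_per_day / COOKING_W * 60  # ~3.6 min/day

        print(f"Standby energy: {standby_wh_per_day:.1f} Wh/day")
        print(f"Break-even cooking time: {breakeven_minutes:.1f} minutes/day")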



  •  The 40% of your power being wasted is funny, too.  Let's say an average house uses 1,000 kWh of electricity a month (not unheard of).  So, according to them, 400 kWh is wasted to standby.  This translates to, in a 30-day month, 555 watts of wasted electricity at all times.  A small space heater uses that much power!  At, say, 3 watts per appliance on standby, we're talking 185 appliances here.  :D
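
    The same arithmetic, sketched out in Python (the 1,000 kWh/month, 40%, and 3 W-per-appliance figures are the assumptions above):

        # If 40% of a 1,000 kWh month were standby waste, what would that look like?
        MONTHLY_KWH = 1000
        WASTE_FRACTION = 0.40
        HOURS_PER_MONTH = 30 * 24
        WATTS_PER_APPLIANCE = 3   # assumed standby draw per device

        wasted_kwh = MONTHLY_KWH * WASTE_FRACTION               # 400 kWh
        constant_draw_w = wasted_kwh * 1000 / HOURS_PER_MONTH   # ~556 W
        appliances = constant_draw_w / WATTS_PER_APPLIANCE      # ~185

        print(f"Constant wasted draw: {constant_draw_w:.0f} W")
        print(f"Appliances at 3 W each: {appliances:.0f}")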



  • @leblonk said:

     In fact, a microwave oven may use more power when turned off than when cooking

    Standby power (on average, according to a Korean report): 2.77W

    Operating power (my microwave): 1100W

    Standby: 2.77 x 24 x 356 = 23KWh

    Using the microwave on average for 3.6 minutes per day: 1100 x 0.06 x 356 = 23KWh

    It's not a stretch that someone uses their microwave on average less than four minutes a day.

     

     

    Well, you have part of the solution right there in your post... remove 9 days from the calendar, and that's 9 days less standby power!



  • @Wolftaur said:

    http://www.miamiherald.com/living/home/story/748603.html

    To summarize the highlights: "The average American household wastes about $1,000 each year in electricity, according to the International Energy Agency. Vampire power -- the standby energy consumed by household gadgets and appliances -- is the culprit."

    The figure they give, at the 8 cents per kilowatt-hour most of us pay, works out to a waste of 12,500 kWh per year, which in turn is an excess power draw of about 1,400 watts at all times. The article gets even more ludicrous:

    Not only does it claim that 85% of what I pay on my power bill each year is waste, it pins all of that waste on standby lights, with this idiotic comment:

    "That's because most gadgets have small ''standby'' lights (typically green, yellow or red) that continue to burn power even when we hit the ''off'' switch. In fact, a microwave oven may use more power when turned off than when cooking."

    Yes. They actually said that last sentence. They then discuss the national campaign Best Buy has launched to educate us in this... I smell a paid-off reporter, and either a complicit editor -- or a very stupid one.

     

    My household only spends about $100 a month on electricity, and I don't think my area has particularly cheap energy.  That number alone doesn't pass the smell test, because apparently I would be only spending $17 a month if I unplugged my microwave and wall warts.
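
    The arithmetic behind that smell test (a minimal Python sketch using the $100/month bill mentioned above):

        # If $1,000/year were pure standby waste, what would be left of a $100/month bill?
        MONTHLY_BILL = 100.00            # dollars, the poster's actual bill
        CLAIMED_ANNUAL_WASTE = 1000.00   # dollars, per the article

        remaining = MONTHLY_BILL - CLAIMED_ANNUAL_WASTE / 12
        print(f"Bill left after unplugging everything: ${remaining:.2f}/month")  # ~$17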



  • @Daid said:

    @snoofle said:
    Also, all lcd clocks should, by law and punishable by death, have a turn-me-off option, if only so I don't need to reset them after every power blip.
    They have:

    [picture of a hammer]

    I suspect snoofle was hoping for a feature where the LCD clock could then be easily turned back on, without requiring the aid of glue, an electrician, or replacement parts. I understand this was not explicitly stated.



  • @operagost said:

    My household only spends about $100 a month on electricity, and I don't think my area has particularly cheap energy.  That number alone doesn't pass the smell test, because apparently I would be only spending $17 a month if I unplugged my microwave and wall warts.

    I got bored and e-mailed the paper with a list of my problems with the article, including the actual math. The editor responded with an apology and a statement that the article is NOT up to their standards and will be dealt with. Maybe the follow-up will be less... idiotic.

    I agree that many idle devices could be more efficient, but... If you assume the average device wastes 20 watts in standby (which is actually rather high) you still need 70 such devices to equal the 1,400W their numbers work out to. And when I measured everything in my house and my parents' house because we were disputing how much of the electric bill I had to pay, nothing drew more than about 10W in standby, except for their big-screen TV, which drew a whopping 15W. (And that was about 6 years ago. I imagine things got more efficient, not less efficient. At least, I hope so!)
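
    For reference, the same figure worked out (a minimal Python sketch; the 20 W standby estimate is the deliberately generous one above):

        # How many always-on devices at a generous 20 W each reach the implied 1,400 W?
        IMPLIED_DRAW_W = 1400        # from the article's numbers, as computed earlier
        STANDBY_W_PER_DEVICE = 20    # deliberately high standby estimate

        print(f"Devices needed: {IMPLIED_DRAW_W / STANDBY_W_PER_DEVICE:.0f}")  # 70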



  • @operagost said:

    My household only spends about $100 a month on electricity, and I don't think my area has particularly cheap energy.  That number alone doesn't pass the smell test, because apparently I would be only spending $17 a month if I unplugged my microwave and wall warts.

    I've always wondered about these multi-hundred-dollar electric bills I keep hearing about. The monthly bill for my apartment is around $35, and that's with an electric water heater, electric stove, electric baseboard heating, and eight computers.



  • @mann_jess said:

    I really hate to be "this guy"... but Windows ... deteriorates your hardware faster

     

    WTF?

    Microwaves use more power when turned off, and windows deteriorates solid-state electronics?



  • @ActionMan said:

    @mann_jess said:

    I really hate to be "this guy"... but Windows ... deteriorates your hardware faster

     

    WTF?

    Microwaves use more power when turned off, and windows deteriorates solid-state electronics?

    That was actually once true, but that was long, long ago. Windows didn't halt the processor in its idle loop, once upon a time, so the CPU was always running: when there was nothing to do, it just spun in a do-nothing loop, cycling constantly. Most of the UNIX implementations on x86, even in the very early days, would execute a halt instruction instead, knowing the timer interrupt (or any other device interrupt) would wake the CPU back up.

    But Windows has been using a halt instruction instead of an idle loop since battery-operated laptops started becoming popular, as a halted CPU uses less power than an actively-cycling one. So that statement hasn't had any basis in fact for well over a decade. One could technically argue that Windows is less efficient and therefore can't idle as often, but... well, even with Windows, the average system spends a LOT of its time idle -- and we have better heatsinks and fans now than we did back then, so full load doesn't take nearly the toll on a processor's longevity that it once did.
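
    As a user-space analogy of that difference (a minimal Python sketch, not the actual OS idle code, which runs in the kernel and issues the halt instruction): a busy loop keeps the CPU fully occupied the whole time it is "idle", while a blocking wait lets the OS halt or downclock it.

        # Contrast a busy-wait "idle loop" with a blocking wait, by CPU time consumed.
        # This only illustrates the idle-loop vs. halt distinction described above.
        import time

        def busy_idle(seconds):
            """Spin doing nothing useful; the CPU stays fully loaded."""
            end = time.monotonic() + seconds
            while time.monotonic() < end:
                pass

        def blocking_idle(seconds):
            """Sleep; the OS is free to halt the CPU while we wait."""
            time.sleep(seconds)

        for idle in (busy_idle, blocking_idle):
            start = time.process_time()
            idle(2.0)
            print(f"{idle.__name__}: {time.process_time() - start:.2f} s of CPU time")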



  • @astonerbum said:

    @Hatshepsut said:

    ... the energy consumed by leaving a TV on standby all year is less than that consumed by idling your car for 30 seconds.

     

    They solved that problem: Hybrid/electric cars don't idle.

     

    NEXT...

     

     

    Cars that most people actually own do.

     

    PREV...

     



  • @Wolftaur said:

    Microwaves use more power when turned off, and windows deteriorates solid-state electronics?

    Well, re-reading my post, I wasn't specific enough; I haven't had this problem with desktops as much, but I find laptops with Windows tend to overheat very quickly (as a result of using more resources), which oftentimes causes the internals to melt. In my experience, laptops that used to have overheating problems run beautifully once switched to Linux.

    More resource usage => more product usage => faster wear => faster breakage



  • It's just a pity about the environmental damage, energy usage and carbon emissions caused by the production of hybrids. Then there are issues related to the disposal of the batteries once they reach the end of their lifetime.

    http://www.thetorquereport.com/2007/03/toyotas_prius_is_less_efficien.html#more

     



  • @Wolftaur said:

    The figure they give, at the 8 cents per kilowatt-hour most of us pay

    @operagost said:

    My household only spends about $100 a month on electricity, and I don't think my area has particularly cheap energy. 
     

    You still use a lot of energy compared to my house (three people): our electricity bill is around $100 (AUD) a month, at just over 14c/kWh (AUD).

    We are pretty good about not leaving things on standby though. We do have electric hot water and stove/oven.

    My old house had gas hot water and a gas stove/oven, and the electricity bills were over $130/month (plus ~$50/month for gas!). Back then we were not good at turning things off properly, plus we were home more often. :)

