Posts by Lynn

1) Message boards : Number crunching : The Cost of Power? (Message 37357)
Posted 4 Mar 2007 by Lynn
Post:
the P4 Celerons are probably the worst CPU for BOINC in terms of processing per watt - the C2x CPUs are the current best bang per Watt I believe ...


Yah, actually, just this morning I was checking prices on Newegg for a dual-core that's slightly less than leading edge. An older AMD dual-core FX with 1GB DDR667 and a uATX mobo is only $239 total.

At work I have 2 x A64 2.0GHz machines doing Rosetta when not running regression tests, and they each slightly top one of the Core 2 Duo's 1.8GHz cores in the benchmarks. So I am thinking to split the difference a bit and go with a slightly less state-of-the-art upgrade.
2) Message boards : Number crunching : The Cost of Power? (Message 37352)
Posted 4 Mar 2007 by Lynn
Post:
Summary: Some musings comparing the work accomplished by my computer to my personal out-of-pocket costs for the electricity to feed it 24 hours a day - something I dare say very few home computer users look at.

Last month I upgraded a Celeron D 2.53GHz media server to a Core 2 Duo 1.8GHz. When not using the server for "media" (watching DVDs or recording broadcast TV), I run BOINC/Rosetta distributed science jobs on it. Since the Celeron D was still functional, I moved it to an old chassis and updated the power supply - both systems now have efficient, after-market power supplies. So I now have 2 systems running 24 hours a day.

As a hobby (and part of what my mother would call our inherited Scots blood), I enjoy using an AC power meter to evaluate the cost of running appliances. My meter is from http://www.brandelectronics.com/ and it has revealed some interesting facts, such as that my Cox digital cable box consumes 24 watts when powered "ON" ... and 23 watts when turned "OFF" :-) Gives the concept of "off" a whole new meaning.

Obviously, the Core 2 Duo - running 2 jobs at once - contributes more credits to BOINC projects than the Celeron D. But I was interested in comparing what I gain against the monthly cost of running my now-unnecessary Celeron D.

Computer Summary:

Core 2 Duo: 1.8GHz, 1GB DDR2-800 RAM, 320GB SATA drive, nVidia 7100 (fanless), 400W power supply
* Rosetta benchmarks: fp=1744, int=3656 (dual core, so effectively maybe fp=3488, int=7312)
* When idle: CPU temp = 70°F, AC power usage = 105 watts
* When both cores at 100%: CPU temp = 100°F, AC power usage = 129 watts

Celeron D: 2.53GHz, 512MB PC2100 RAM, 30GB PATA drive, nVidia 6300 (fanless), 350W power supply (it had 1GB RAM, but 1 of the 2 sticks went bad)
* Rosetta benchmarks: fp=764, int=1677
* When idle: CPU temp = 100°F, AC power usage = 98 watts
* When the sole CPU is at 100%: CPU temp = 125°F, AC power usage = 134 watts

I was at first pretty shocked that the Core 2 Duo - even with both cores at 100% - used less total wattage than the Celeron D, especially since every time you pick up a computer magazine there are dire warnings about needing a 600W, 800W, or even 1000W supply in a "modern" computer. By the way, a good AC power meter also tracks maximum power, which in my case turns out to be 140 to 150 watts max when either the Core 2 Duo or the Celeron system first boots up.

Since both systems eat about the same power, rounding the draw to 130 watts burned 24 hours per day works out to $7.50 to $13.00 per month. That range reflects my Minnesota electricity charge of about $0.08 per kWh and my California charge of about $0.14 per kWh, respectively. I wonder how many people realize they pay that much per month to run their computer 24 hours a day? Over a year that totals $90 to $160 per computer - and this is JUST the computer. I'm not including the wattage used by monitors, printers, Ethernet switches, or the DSL/cable router hardware. Plus, with the computers running in a cool Minnesota basement, I don't have to include the extra air-conditioning load they'd create in a hot climate like my Southern California home. (I am an engineer splitting time between both states.)
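For anyone who wants to check the math with their own numbers, here is a quick back-of-the-envelope sketch in Python. The 130 watts and the $0.08 / $0.14 per kWh rates are just my figures from above; plug in your own.

```python
# Rough monthly and yearly electricity cost for a box running 24 hours a day.
# The 130 W draw and the two kWh rates are the figures quoted in the post.
WATTS = 130
HOURS_PER_MONTH = 24 * 30

kwh_per_month = WATTS / 1000 * HOURS_PER_MONTH   # about 93.6 kWh

for rate in (0.08, 0.14):   # $/kWh: Minnesota vs. California
    monthly = kwh_per_month * rate
    print(f"${rate:.2f}/kWh -> ${monthly:.2f}/month, ${monthly * 12:.0f}/year")
```

That prints roughly $7.49/month ($90/year) at the Minnesota rate and $13.10/month ($157/year) at the California rate, which is where the ranges above come from.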

So now for the true "musing" - if I average the last 10 Rosetta jobs handled by each computer:
* Core 2 Duo: average 10594 seconds and 36.87 credits granted per job
* Celeron D: average 10406 seconds and 22.75 credits granted per job

However, since I'm looking at where my $7.50 (or $13.00) per month goes, I have to remember that the Core 2 Duo runs 2 jobs at once for the same wattage, so really one could say I am "paid" an average of 73.74 BOINC credits for each pair of roughly 10,600-second jobs the Core 2 Duo completes. So the Core 2 Duo gives me over 3 times the BOINC credits for the roughly $100 spent each year on electricity to feed my hungry computer. Of course, even if the CPU throttled back to idle I'd still be paying about $80 per year to run the computer 24 hours a day, so Rosetta really only 'costs' me the extra $20 per year.
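If you want to turn those averages into a rough credits-per-dollar figure, here is the same sort of sketch. The job averages are the ones listed above, and the ~$100/year power cost is my rounded figure from the previous calculation.

```python
# Credits per year and per electricity dollar, using the averages quoted above.
SECONDS_PER_YEAR = 365 * 24 * 3600
POWER_COST_PER_YEAR = 100.0   # rough annual electricity cost from the earlier estimate

machines = {
    # name: (seconds per job, credits per job, jobs run at once)
    "Core 2 Duo": (10594, 36.87, 2),
    "Celeron D":  (10406, 22.75, 1),
}

for name, (secs, credits, concurrent) in machines.items():
    credits_per_year = SECONDS_PER_YEAR / secs * concurrent * credits
    print(f"{name}: {credits_per_year:,.0f} credits/year, "
          f"{credits_per_year / POWER_COST_PER_YEAR:,.0f} credits per dollar of power")
```

That works out to roughly 219,500 credits/year for the Core 2 Duo versus about 68,900 for the Celeron D - a ratio of about 3.2 to 1 on the same electricity bill.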

So should I still run the Celeron D? Should I upgrade it to something closer to the Core 2 Duo? The upgrade cost me close to $450 once you count the CPU, the new motherboard, and the new DDR2 RAM. It's an interesting question without a simple answer ... yes, running the old Celeron D doesn't cost me anything more from a buying-hardware standpoint ... but I am paying good money out of my pocket for the power.
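One way to put rough numbers on it, without pretending this settles anything: amortize the $450 upgrade over an assumed service life and compare credits per total dollar spent. The 3-year amortization period and the credits-per-year figures (carried over from the sketch above) are purely my own assumptions for illustration.

```python
# Hypothetical total-cost framing: keep running the old Celeron D vs. the
# Core 2 Duo upgrade. The 3-year amortization is an assumption, not a fact
# from the post; the credits/year numbers come from the previous sketch.
UPGRADE_COST = 450.0
AMORTIZE_YEARS = 3
POWER_COST_PER_YEAR = 100.0   # both boxes draw roughly the same ~130 W

celeron_cost_per_year = POWER_COST_PER_YEAR                       # hardware already paid for
c2d_cost_per_year = POWER_COST_PER_YEAR + UPGRADE_COST / AMORTIZE_YEARS

celeron_credits_per_year = 68_900    # approx., from the earlier averages
c2d_credits_per_year = 219_500

print(f"Celeron D:  ${celeron_cost_per_year:.0f}/yr -> "
      f"{celeron_credits_per_year / celeron_cost_per_year:,.0f} credits per dollar")
print(f"Core 2 Duo: ${c2d_cost_per_year:.0f}/yr -> "
      f"{c2d_credits_per_year / c2d_cost_per_year:,.0f} credits per dollar")
```

Under those assumptions the Core 2 Duo still comes out ahead per dollar even with the hardware cost folded in, but the gap narrows a lot compared to looking at electricity alone.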

So what is the real cost of power?
3) Message boards : Number crunching : Problems with Rosetta version 5.43 (Message 35339)
Posted 22 Jan 2007 by Lynn
Post:
Are you all using Windows or Linux? I've had no failures on the 3 Windows XP Pro systems I am running, but I finally had to detach from Rosetta on *ALL* of my Linux systems (I have two running Ubuntu 6.06 and one running Ubuntu 6.10) because I was seeing nearly 90% failure rates, and these systems were offering Rosetta 50% of their time. I didn't mind the tasks that ran 10-15 seconds before failing, but one of these systems is a new P4 dual-core and I had WUs hogging it for 6 to 8 hours (or 4 days for one WU!) before they failed. Better that I donate that CPU power to another project that produces something useful.

Here is the dual-core's result page: 398561

- Lynn





