New RAM demanding WUs



Max DesGeorges

Joined: 1 Oct 05
Posts: 35
Credit: 942,527
RAC: 0
Message 65598 - Posted: 20 Mar 2010, 16:16:41 UTC

In this period Rosetta is working hard on protein interactions that require lots of RAM, 200-300 MB on average. Is the amount of RAM used a key factor in the analysis of complex interactions? If the RAM used by a WU were, let's say, 1 GB, would it give better/more precise/faster results?

I'm asking because an idea came to my mind a few days ago…
What do the project developers think about having some "CPU intensive / RAM demanding" WUs, requiring lots of RAM (at least 600 MB, for example)?
Users would choose in the Rosetta preferences whether to receive this type of WU ("allow RAM demanding WUs on your computer"), i.e. tasks that require a lot of RAM. In this way Rosetta would have a group of high-performance computers on which it could run precise simulations of big interactions. At the same time, users who today sometimes receive problematic monster WUs (up to 800 MB of RAM) would be spared such a load.
In my idea, another characteristic of these special WUs would be an increased default runtime, for example 12 hours, so that the load on the Rosetta servers would be partially reduced. LINK

What do the Rosetta developers think about that?
I'm sure there are a lot of hardware enthusiasts around, ready and happy to receive these monster WUs, and a lot of users who really want to give the best they can to contribute to a scientific project like Rosetta@home.
What do other users think?

I hope I had a good idea. :)

ID: 65598
Mod.Sense
Volunteer moderator

Joined: 22 Aug 06
Posts: 4018
Credit: 0
RAC: 0
Message 65599 - Posted: 20 Mar 2010, 20:57:32 UTC

Your question seems to be "if tasks were created that assumed more memory were available, would they perform better, or produce better results?".

In a word, no, I don't think so. The primary factors in how much memory is required are the size of the protein being studied and the methods of study being used. When large proteins, or methods known to require more memory, are used, the tasks are flagged so that they are only sent to machines with more than the minimum memory requirement.

So, there already are two types of tasks. And the server automatically sends tasks appropriate for your machine.
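
For illustration only (this is not taken from the actual project or BOINC server code): a minimal sketch of the kind of check a scheduler could make when deciding whether a memory-heavy task fits a given host. The names and numbers below are made up for the example; the rule shown is a simplification.

# Illustrative sketch only, not actual BOINC server code: decide whether a
# task's estimated memory need fits on a given host. "task_mem_bound_bytes"
# plays the role of a workunit's memory bound; "allowed_fraction" is the
# share of RAM the user lets BOINC use.

def host_can_take(task_mem_bound_bytes, host_ram_bytes, allowed_fraction):
    """Return True if the host exposes enough usable RAM for the task."""
    usable = host_ram_bytes * allowed_fraction
    return task_mem_bound_bytes <= usable

# Example: a 600 MB task fits an 8 GB host that allows BOINC 50% of RAM,
# but not a 1 GB host with the same setting.
print(host_can_take(600e6, 8e9, 0.5))   # True
print(host_can_take(600e6, 1e9, 0.5))   # False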

Your suggestion that the tasks have a longer default runtime is a good one, but it makes everything harder to explain to people. And if people don't understand things, then you get lots of false problem reports. So, instead, these tasks follow the same rules and respect the user's runtime preference.
Rosetta Moderator: Mod.Sense
ID: 65599
[FVG] akd

Joined: 31 May 08
Posts: 2
Credit: 4,302,121
RAC: 592
Message 65617 - Posted: 22 Mar 2010, 10:02:33 UTC

I was thinking along the same lines as Manuel, too.

Thanks for your reply, Mod.Sense; I understand the problems you describe and your reasoning. I just wanted to give my two cents by saying that I would be willing to receive long computations and/or very RAM-demanding WUs, since my machines could handle them with no problems. That's all, but if you would like to exploit enthusiasts' machines even more (and you don't have to explain anything extra to them: they are already enthusiasts and usually have powerful machines), this could be a nice idea.

PrimeGrid, for example, already has very long WUs (I mean even 500-700 estimated hours on a Phenom II CPU); it would be great to have a huge simulation running on our CPUs... :-)
ID: 65617
Max DesGeorges

Joined: 1 Oct 05
Posts: 35
Credit: 942,527
RAC: 0
Message 65628 - Posted: 23 Mar 2010, 23:44:35 UTC - in response to Message 65617.  

Let me put the question more simply:
would Rosetta benefit from having a batch of WUs that use, say, 1 GB of memory and have a default runtime of 10 hours?
ID: 65628
Mod.Sense
Volunteer moderator

Joined: 22 Aug 06
Posts: 4018
Credit: 0
RAC: 0
Message 65632 - Posted: 24 Mar 2010, 13:46:41 UTC

Not really, because the individual models do not take 10 hours. They attempt to yield the best solution they can find, regardless of your runtime preference. This is part of why individual models take a variable amount of time to run. So that just becomes a question of runtime preference, where longer runtimes mean you hit the server less to keep a machine busy. So longer is better, if your machine runs enough to complete work within the deadlines etc.
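
To put rough numbers on the "hit the server less" point (the figures below are hypothetical, just to show the arithmetic): for a fixed number of cores running around the clock, the number of completed tasks per day, and hence roughly the number of server contacts, scales inversely with the runtime preference.

# Hypothetical arithmetic only: tasks completed (and reported) per day on a
# fully loaded host, as a function of the target runtime preference.
# Real behaviour also depends on cache settings and deadlines.

def tasks_per_day(cores, hours_running_per_day, target_runtime_hours):
    return cores * hours_running_per_day / target_runtime_hours

for pref in (3, 6, 12, 24):
    n = tasks_per_day(8, 24, pref)
    print(f"{pref:2d} h runtime preference -> ~{n:.0f} tasks/day on an 8-core host")

# Prints 64, 32, 16 and 8: the same computing time, packaged into fewer,
# longer tasks, means fewer round trips to the project server.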

The memory just helps the machine run the work with minimal swapping. This helps the tasks keep running and using CPU time, rather than waiting for information to be swapped in before processing can continue.

It is sort of like asking a person if they could run further if we provided them more food. Without food, you can't run at all (for very long). But once you have enough, more doesn't help.
Rosetta Moderator: Mod.Sense
ID: 65632
Max DesGeorges

Joined: 1 Oct 05
Posts: 35
Credit: 942,527
RAC: 0
Message 65633 - Posted: 24 Mar 2010, 16:00:14 UTC - in response to Message 65632.  

Now I see the limits of my idea.

Thank you for the answer Mod.Sense.

ID: 65633
Cruncher Pete

Joined: 3 Sep 07
Posts: 11
Credit: 11,907,591
RAC: 0
Message 65644 - Posted: 27 Mar 2010, 9:38:41 UTC - in response to Message 65599.  
Last modified: 27 Mar 2010, 9:45:43 UTC

Your question seems to be "if tasks were created that assumed more memory were available, would they perform better, or produce better results?".

In a word, no, I don't think so. The primary factors in how much memory is required are the size of the protein being studied and the methods of study being used. When large proteins, or methods known to require more memory, are used, the tasks are flagged so that they are only sent to machines with more than the minimum memory requirement.

So, there already are two types of tasks. And the server automatically sends tasks appropriate for your machine.

I am a bit confused here. I sense that there is something wrong in relation to the memory requirement when my 8-core Intel i790 with 6 GB of memory is not sufficient. I am seeing just one core being utilized, with a message that the rest are waiting for memory. I have no problems using any other projects with all 8 cores in use. Unless there is something I can do at my end, I will have to give Rosetta a miss and crunch other projects. All my settings appear to be OK, since I have no problems with other projects.
Your suggestion that the tasks have a longer default runtime is a good one, but it makes everything harder to explain to people. And if people don't understand things, then you get lots of false problem reports. So, instead, these tasks follow the same rules and respect the user's runtime preference.
ID: 65644
joseps

Joined: 25 Jun 06
Posts: 72
Credit: 8,173,820
RAC: 0
Message 65648 - Posted: 27 Mar 2010, 14:45:34 UTC

I think that Windows on a 32-bit machine can only recognize a maximum of 3.0 GB of RAM, and I think Rosetta WUs do not support 64-bit machines. From what I read in this thread, it answers the questions about RAM memory allocation I just posted in Questions and Answers.
joseps :)
I turned off my 5 computers when I went on vacation. When I returned today, I could not upload work. I need work units to keep my computers running.
joseps
ID: 65648
Mod.Sense
Volunteer moderator

Joined: 22 Aug 06
Posts: 4018
Credit: 0
RAC: 0
Message 65650 - Posted: 27 Mar 2010, 16:13:30 UTC

I am a bit confused here. I sense that there is something wrong in relation to the memory requirement when my 8-core Intel i790 with 6 GB of memory is not sufficient. I am seeing just one core being utilized, with a message that the rest are waiting for memory. I have no problems using any other projects with all 8 cores in use. Unless there is something I can do at my end, I will have to give Rosetta a miss and crunch other projects. All my settings appear to be OK, since I have no problems with other projects.


Cruncher Pete, when you say your settings appear ok... does that mean that BOINC is allowed to use a significant portion of your machine's memory? Both when active and when idle?

Are you perhaps looking at the settings on the website? BOINC will actually use the local preferences for the machine if any are changed from the website.
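
For anyone who wants to inspect the local values directly, here is a small sketch under a few assumptions: a standard BOINC installation, the usual global_prefs.xml element names, and an example data-directory path that varies by platform (local changes made in the Manager live in global_prefs_override.xml instead).

# Minimal sketch: read the "use at most X% of memory" settings from a BOINC
# preferences file. The element names ram_max_used_busy_pct (while the
# computer is in use) and ram_max_used_idle_pct (while idle) are the ones
# BOINC writes to global_prefs.xml; the path below is only an example.

import xml.etree.ElementTree as ET

def memory_prefs(prefs_path):
    root = ET.parse(prefs_path).getroot()
    busy = root.findtext(".//ram_max_used_busy_pct")
    idle = root.findtext(".//ram_max_used_idle_pct")
    return busy, idle

busy, idle = memory_prefs("/var/lib/boinc-client/global_prefs.xml")  # example path
print(f"RAM allowed while in use: {busy}%  |  while idle: {idle}%")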

Even with your current settings, you can run Rosetta (which has high memory requirements compared to other projects) along with other projects that use less memory, and a mixture of tasks from the various projects should result.

Do you keep tasks in memory when suspended? If so, you could have a look at how much memory the tasks were trying to use at the time they were suspended, and that would give more information to help assess whether the amount of memory used is normal or not.
Rosetta Moderator: Mod.Sense
ID: 65650
Cruncher Pete

Joined: 3 Sep 07
Posts: 11
Credit: 11,907,591
RAC: 0
Message 65654 - Posted: 27 Mar 2010, 22:26:55 UTC - in response to Message 65650.  
Last modified: 27 Mar 2010, 22:33:17 UTC

I am a bit confused here. I sense that there is something wrong in relation to the memory requirement when my 8-core Intel i790 with 6 GB of memory is not sufficient. I am seeing just one core being utilized, with a message that the rest are waiting for memory. I have no problems using any other projects with all 8 cores in use. Unless there is something I can do at my end, I will have to give Rosetta a miss and crunch other projects. All my settings appear to be OK, since I have no problems with other projects.


Cruncher Pete, when you say your settings appear ok... does that mean that BOINC is allowed to use a significant portion of your machine's memory? Both when active and when idle?

Are you perhaps looking at the settings on the website? BOINC will actually use the local preferences for the machine if any are changed from the website.

Even with your current settings, you can run Rosetta (which has high memory requirements compared to other projects) along with other projects that use less memory, and a mixture of tasks from the various projects should result.

Do you keep tasks in memory when suspended? If so, you could have a look at how much memory the tasks were trying to use at the time they were suspended, and that would give more information to help assess whether the amount of memory used is normal or not.


Thanks for the reply. Using Win Xp32. Machine is set to 100% when idle and 90% when in use. Using 85% of swap space. Tasks kept in memory. Plenty of HD space.

As the problem appeared on more than one machine (I am using four i970s) and everything is now running OK, I presume its intermittent nature suggests it might have been a stray WU. If it happens again, I will check just how much memory it is using.
ID: 65654
Mad_Max

Joined: 31 Dec 09
Posts: 207
Credit: 23,377,132
RAC: 11,305
Message 65670 - Posted: 29 Mar 2010, 0:23:33 UTC - in response to Message 65648.  

I think that Windows on a 32-bit machine can only recognize a maximum of 3.0 GB of RAM, and I think Rosetta WUs do not support 64-bit machines. From what I read in this thread, it answers the questions about RAM memory allocation I just posted in Questions and Answers.
joseps :)

It is 3.5 GB on 32-bit XP.
And Rosetta does have support for 64-bit machines and OSes (except on Mac).
Look at this link: https://boinc.bakerlab.org/rosetta/apps.php
The Windows and Linux platforms have both 32-bit and 64-bit versions.
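
As a rough back-of-the-envelope check on why 32-bit Windows tops out below 4 GB (the exact figure depends on the chipset and installed devices, so the reserved amount below is only a typical range, not a measured value):

# Rough arithmetic behind the ~3-3.5 GB figure on 32-bit Windows without PAE:
# a 32-bit address space is 4 GiB, and the chipset reserves part of it for
# memory-mapped devices (graphics, PCI, firmware).

address_space_gib = 2 ** 32 / 2 ** 30   # 4.0 GiB addressable in total
for reserved_gib in (0.5, 1.0):         # typical device reservations
    usable = address_space_gib - reserved_gib
    print(f"{reserved_gib} GiB reserved for devices -> about {usable:.1f} GiB of RAM visible")
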
ID: 65670
Chilean

Joined: 16 Oct 05
Posts: 711
Credit: 26,694,507
RAC: 0
Message 65683 - Posted: 31 Mar 2010, 0:49:49 UTC - in response to Message 65670.  

I think that Windows on a 32-bit machine can only recognize a maximum of 3.0 GB of RAM, and I think Rosetta WUs do not support 64-bit machines. From what I read in this thread, it answers the questions about RAM memory allocation I just posted in Questions and Answers.
joseps :)

It is 3.5 GB on 32-bit XP.
And Rosetta does have support for 64-bit machines and OSes (except on Mac).
Look at this link: https://boinc.bakerlab.org/rosetta/apps.php
The Windows and Linux platforms have both 32-bit and 64-bit versions.


The "support" is simply a wrapper. I don't think it can take advantage of the increased memory allocation. Then again I really doubt any WU could consume over 1GB of RAM, and if it did, whether it could bring any advantage.
ID: 65683
Sid Celery

Joined: 11 Feb 08
Posts: 1990
Credit: 38,522,839
RAC: 15,277
Message 65794 - Posted: 21 Apr 2010, 15:42:17 UTC - in response to Message 65599.  

The primary factors in how much memory is required are the size of the protein being studied and the methods of study being used. When large proteins, or methods known to require more memory, are used, the tasks are flagged so that they are only sent to machines with more than the minimum memory requirement.

I didn't realise this until you wrote it here. One of my team-mates had limited RAM (and a dodgy power supply, both now solved), which caused a lot of WUs to crash if they ran for long enough.

Most of my WUs consume 250-350 MB, but today I have one that consumes 750 MB! The benefits of having 8 GB of RAM here. Glad to be of service ;)
ID: 65794
