The Audacious Project



Admin
Project administrator

Joined: 1 Jul 05
Posts: 5144
Credit: 0
RAC: 0
Message 90915 - Posted: 16 Jul 2019, 22:18:18 UTC



As you may have heard, the Institute for Protein Design was recently selected as part of The Audacious Project. This large-scale philanthropic collaboration, which is the successor to the TED Prize, surfaces and funds projects with the potential to change the world.

As a result, we are expanding our Seattle-based team of scientists and engineers who will work together to advance Rosetta, our software for protein design and structure prediction. The funding will also allow us to invest in the equipment, supplies and lab space needed to design and test millions of synthetic proteins.

What challenges will we be tackling? Watch my TED talk to find out.

All of this work — like everything we do — will depend on you, the participants in Rosetta@home. Whether it’s creating custom nanomaterials or safer cancer therapies, we rely on the Rosetta@home distributed computing platform. We cannot thank you enough for taking the time to be a part of this exciting research, and we hope you tell at least one friend that they too can play a role in the protein design revolution just by running Rosetta@home.

Thank you,

David Baker
Director, Institute for Protein Design
Jim1348

Joined: 19 Jan 06
Posts: 881
Credit: 52,257,545
RAC: 0
Message 90916 - Posted: 16 Jul 2019, 22:58:19 UTC - in response to Message 90915.  

There is motivation, and then there is motivation. That is good enough for me. Thanks.
Filip Falta

Joined: 14 Apr 12
Posts: 1
Credit: 1,456,352
RAC: 36
Message 90930 - Posted: 23 Jul 2019, 10:38:59 UTC

Great news. Can we expect a GPU app in the future with the increased funding?
[VENETO] boboviz

Joined: 1 Dec 05
Posts: 2002
Credit: 9,790,281
RAC: 4,437
Message 90935 - Posted: 24 Jul 2019, 10:17:17 UTC - in response to Message 90930.  

Great news. Can we expect a GPU app in the future with the increased funding?

Oh, no, please.
Not another thread about GPUs.
The admins are NOT interested in GPGPU.
Greg_BE

Joined: 30 May 06
Posts: 5691
Credit: 5,859,226
RAC: 0
Message 90956 - Posted: 28 Jul 2019, 15:53:58 UTC - in response to Message 90930.  

Great news. Can we expect a GPU app in the future with the increased funding?

Their official answer to this question at Rosettacommons.org: We've looked at GPU acceleration for Rosetta in the past, and the conclusion by the people who were playing around with it was that, given how Rosetta does things, there isn't much benefit to GPU acceleration for most of the normal Rosetta protocols. As such, most of Rosetta does not take advantage of the GPU. That said, there are a few sub-protocols which have the facility for GPU acceleration, but unfortunately enabling this requires a special compilation, so most people don't bother.
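
To make the "given how Rosetta does things" part concrete, here is a toy sketch (illustrative Python only, not actual Rosetta code; all names are made up): Rosetta's sampling is largely Metropolis-style Monte Carlo, where each move starts from the outcome of the previous one, so a single trajectory is mostly serial and branchy, which is a poor match for the thousands of lock-step threads a GPU wants. Rosetta@home gets its parallelism instead from running many independent trajectories, one per CPU core.

```python
import math
import random

def metropolis_trajectory(score, perturb, start, steps=10_000, kT=1.0):
    """One Monte Carlo trajectory (toy sketch, not Rosetta code).

    Each step mutates the result of the previous step and accepts or
    rejects it, so the loop is inherently serial and branchy: a poor fit
    for massively parallel GPU threads.
    """
    current = start
    current_score = score(current)
    for _ in range(steps):
        candidate = perturb(current)        # small, data-dependent move
        candidate_score = score(candidate)
        delta = candidate_score - current_score
        # Metropolis criterion: always accept downhill moves, sometimes uphill ones.
        if delta <= 0 or random.random() < math.exp(-delta / kT):
            current, current_score = candidate, candidate_score
    return current, current_score

if __name__ == "__main__":
    # Toy 1-D example: wander toward the minimum of x**2 starting from x = 10.
    final, final_score = metropolis_trajectory(
        score=lambda x: x * x,
        perturb=lambda x: x + random.uniform(-0.5, 0.5),
        start=10.0,
    )
    print(final, final_score)
```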
[VENETO] boboviz

Joined: 1 Dec 05
Posts: 2002
Credit: 9,790,281
RAC: 4,437
Message 90958 - Posted: 29 Jul 2019, 13:04:37 UTC - in response to Message 90956.  

Great news. Can we expect a GPU app in the future with the increased funding?

Their official answer to this question at Rosettacommons.org: We've looked at GPU acceleration for Rosetta in the past, and the conclusion by the people who were playing around with it was that, given how Rosetta does things, there isn't much benefit to GPU acceleration for most of the normal Rosetta protocols. As such, most of Rosetta does not take advantage of the GPU. That said, there are a few sub-protocols which have the facility for GPU acceleration, but unfortunately enabling this requires a special compilation, so most people don't bother.

That answer is two years old, and "in the past" means what? One year earlier? Two? Five? We don't know.
Maybe the GPU computing available today (hardware and software) is far more powerful than it was seven years ago, when those tests were done.
But we don't care about that.
What we care about are CPU optimizations.
