Message boards : Number crunching : new version of BOINC + CUDA support
Orgil Joined: 11 Dec 05 Posts: 82 Credit: 169,751 RAC: 0
Looks like someone out there is making forward strides in the GPU computation field while in this corner we are sleeping: http://www.tgdaily.com/content/view/41352/140/
Paul D. Buck Joined: 17 Sep 05 Posts: 815 Credit: 1,812,737 RAC: 0
There are no applications yet for the ATI series of cards. All current projects are targeting the Nvidia series right now. Work on an ATI application is being done by a volunteer at Milky Way, but it is in an alpha state and only runs on Win64 ... These are early days, guys and girls ... it will be a little while before we really get going ...
dcdc Joined: 3 Nov 05 Posts: 1832 Credit: 119,860,059 RAC: 4,566
As Paul says, it's early days. CUDA might not last very long - OpenCL, and possibly whatever Larrabee runs, could be much better investments than CUDA... The Inquirer suggests that the next-gen consoles will be powered by Intel Larrabee (Sony) and ATI (Xbox), and probably ATI for the Wii too, so maybe Nvidia isn't the way to go...
Paul D. Buck Joined: 17 Sep 05 Posts: 815 Credit: 1,812,737 RAC: 0
"As Paul says, it's early days. CUDA might not last very long - OpenCL and possibly whatever Larrabee runs could potentially be much better investments than CUDA..." True, but at the moment it is really the only game in town for BOINC. Folding has apps that will run on ATI, and quite a few people run both at the same time ... Open-something-or-other is supposed to replace CUDA and the older ATI API so that there is a common basis, which should increase penetration and decrease the effort required. I have pretty well filled out my dance card with Nvidia at the moment and likely will not invest too much more until I see which way the wind blows ... though if I do build another system this summer I will likely get another pair of GTX 295s if the Milky Way application is not yet running ... The use of Cell processors is still not well implemented yet ... SaH is looking to do that too ... The only bad news there is that it still requires installing Linux, where we might be better off if we could make it more like a game cartridge ... broader audience ...
The_Bad_Penguin Joined: 5 Jun 06 Posts: 2751 Credit: 4,271,025 RAC: 0
For those who want to read the article dcdc was referring to: Intel will design PlayStation 4 GPU
AMD_is_logical Joined: 20 Dec 05 Posts: 299 Credit: 31,460,681 RAC: 0
Sony denies this rumor: http://www.t3.com/news/sony-denies-ps4-intel-gpu-rumour?=38046&cid=OTC-RSS&attr=T3-Main-RSS
The_Bad_Penguin Joined: 5 Jun 06 Posts: 2751 Credit: 4,271,025 RAC: 0
Interesting. Given the number of times, imho, that the Inquirer has been correct (many), and the number of times that Sony "denials" have been correct (few), my money is on believing the Inquirer over Sony. YMMV. What is the typical life cycle for a generation of gaming console? Wouldn't ~2012 be about the right time for the next generation? If so, wouldn't they have to begin planning now?
mikey Joined: 5 Jan 06 Posts: 1895 Credit: 9,217,610 RAC: 1,154
The problem right now is that Nvidia uses CUDA while two other groups are developing a different "standard": Microsoft is one of those; Open MM is the other. I got my Maximum PC magazine yesterday, and in it they call for Nvidia to dump CUDA and go with one of the other "standards". Nvidia wrote CUDA and is the only one that can use it, forcing projects like Einstein to either code to two different standards or wait for Nvidia to come around to one of the others. I am guessing they are going to wait and see; that way even more video cards would be supported instead of JUST Nvidia ones.
Orgil Joined: 11 Dec 05 Posts: 82 Credit: 169,751 RAC: 0
But as I understand it, CUDA is the pioneering technology that offers an option for offloading CPU tasks, plus it has now achieved the Tesla thing, exceeding 1 TFLOP of computing power in a simple card-like gadget. So supposedly CUDA has a more plausible future than ATI and Intel. I guess most BOINC DC projects are now doing their math research for a possible shift to CUDA.
Paul D. Buck Joined: 17 Sep 05 Posts: 815 Credit: 1,812,737 RAC: 0
Yes, well, OpenCL is coming and all parties look to be supporting it, as this quote (from that article) says: "AMD has decided to support OpenCL (and DirectX 11) instead of the now deprecated Close to Metal in its Stream framework.[5][6] RapidMind announced their adoption of OpenCL underneath their development platform, in order to support GPUs from multiple vendors with one interface.[7] Nvidia announced on December 9, 2008 to add full support for the OpenCL 1.0 specification to its GPU Computing Toolkit.[8]" At least for the next couple of years I would expect that the way to get the best performance will still be to use the vendor-specific extensions, but with time, maybe, this time, we shall see something different... At any rate, for BOINC, at this moment, Nvidia is the only game in town ... 6 months from now, who knows? For this generation of GPUs, Nvidia has the edge in single precision while ATI has it in double precision ... at least that is what I have been told ... :)
mikey Joined: 5 Jan 06 Posts: 1895 Credit: 9,217,610 RAC: 1,154
You can crunch with ATI cards too, but only at Folding@home and ONLY with their non-BOINC application.
The_Bad_Penguin Joined: 5 Jun 06 Posts: 2751 Credit: 4,271,025 RAC: 0
Re the rumors and denials (discussed above) of an Intel Larrabee being included in the eventual Sony PS4, there also seem to be rumors of it potentially being paired with a Cell2 CPU, whatever that is/will be (FUD alert: 2 x PPE & 32 SPEs). But if a potential Cell2 is anything like a PowerXCell 8i, and it is mated with a Larrabee that turns out to deliver everything Intel has promised, those will certainly be interesting times indeed...
Tino Joined: 2 May 07 Posts: 1 Credit: 3,877,033 RAC: 0
Hi to all, I have an Nvidia CUDA video card installed. How can I see when the GPU is working?
The_Bad_Penguin Joined: 5 Jun 06 Posts: 2751 Credit: 4,271,025 RAC: 0
Hey Tino, the GPU does NOT work with Rosetta@home. There are some (very few) BOINC projects that will work with an Nvidia CUDA GPU: gpugrid.net comes to mind. Or else you can join Folding@home...
Paul D. Buck Joined: 17 Sep 05 Posts: 815 Credit: 1,812,737 RAC: 0
When you start BOINC Manager, in the Messages tab you should see a note that it has detected a GPU ... There are only 3 projects in BOINC that use GPUs at this time: SaH, SaH Beta, and GPU Grid ... For ATI there is an ALPHA test application for the 38xx and 48xx cards. Not all video cards are acceptable to all projects that use them... typically only the higher-end cards work ... see the project boards for details. As time goes on we should see more projects come online with GPU applications, and possibly some of the lower-end cards will become usable for other projects. As an example, cards that are not usable on GPU Grid can be used on SaH ... Last notes: you have to have newer video drivers installed to use the GPU, and a newer version of BOINC ... for 32-bit Windows the "best" known driver right now is 181.22 and the best BOINC Manager version is 6.5.0 ...
todd Joined: 12 Oct 06 Posts: 2 Credit: 1,215,594 RAC: 0
"I just noticed that there is a new version of BOINC (6.4.5) and it is now advertising that applications can support CUDA if they so desire. I for one would love to see Rosetta support CUDA." I second that vote. CUDA has been out for quite a while and brings a tremendous amount of computing power to the table. I don't understand what the delay is; it's not rocket science. As a developer, it's not that much of a challenge to port between hardware and different operating systems. I've done it many times and lived to tell about it. So tell us, Rosetta staff, what's the hold-up?
mxplm Joined: 12 Sep 09 Posts: 4 Credit: 234,332 RAC: 0
"As a developer, it's not that much of a challenge to port between hardware and different operating systems." I think this is not just a matter of porting an existing application to a new architecture, like a different type of CPU. To use a video card effectively, one has to rewrite a great deal of the software and use a completely different style of source code, as the sketch below illustrates.
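To make that concrete, here is a minimal, generic CUDA sketch (not Rosetta code; the kernel and its names are invented purely for illustration) of how even a simple array loop has to be restructured for the GPU:

```cuda
// CPU style: one thread walks the whole array.
//   for (int i = 0; i < n; ++i) out[i] = a[i] + b[i];

// GPU style (CUDA): thousands of threads each handle one element.
__global__ void add_kernel(const float* a, const float* b, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element index
    if (i < n)
        out[i] = a[i] + b[i];
}

// Launched with enough 256-thread blocks to cover all n elements:
//   add_kernel<<<(n + 255) / 256, 256>>>(d_a, d_b, d_out, n);
```

The arrays also have to be copied into the card's own memory first, which is part of why existing CPU code cannot simply be recompiled for a GPU.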
dcdc Joined: 3 Nov 05 Posts: 1832 Credit: 119,860,059 RAC: 4,566
I don't believe CUDA supports C++ yet, for a start. See: https://boinc.bakerlab.org/rosetta/forum_thread.php?id=4100&nowrap=true#52901 and (especially David E K's response halfway down): https://boinc.bakerlab.org/rosetta/forum_thread.php?id=5023&nowrap=true#62931 It's not a case of porting/recompiling - it's a case of rewriting, and even minirosetta is many hundreds of thousands of lines of code that would all need testing and validating. Assuming GPGPU is suitable and the latency between CPU and GPU doesn't have a negative effect on the processing (see the sketch below), I think there's a reasonable probability of it happening at some point, seeing as GPGPU is maturing reasonably quickly, but it's never going to be a quick process for R@H. It's not rocket science, but it's probably as complicated ;)
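To illustrate the CPU-GPU latency point dcdc raises, here is another generic CUDA sketch (again a made-up kernel, nothing from minirosetta): every offloaded step pays for copies across the PCIe bus in both directions, so the GPU only wins when the arithmetic per element outweighs the transfer time.

```cuda
#include <cuda_runtime.h>

// Stand-in kernel: squares each element (illustrative only).
__global__ void square_kernel(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * in[i];
}

void offload_step(const float* host_in, float* host_out, int n)
{
    float *d_in = nullptr, *d_out = nullptr;
    size_t bytes = (size_t)n * sizeof(float);

    cudaMalloc((void**)&d_in, bytes);
    cudaMalloc((void**)&d_out, bytes);

    cudaMemcpy(d_in, host_in, bytes, cudaMemcpyHostToDevice);    // CPU -> GPU copy
    square_kernel<<<(n + 255) / 256, 256>>>(d_in, d_out, n);     // compute on the card
    cudaMemcpy(host_out, d_out, bytes, cudaMemcpyDeviceToHost);  // GPU -> CPU copy

    cudaFree(d_in);
    cudaFree(d_out);
}
```

If an application needs the results back on the CPU after every small step, those two copies can easily eat whatever time the kernel saved.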
robertmiles Joined: 16 Jun 08 Posts: 1234 Credit: 14,338,560 RAC: 1,227
SOME Nvidia cards are about to start supporting code written in C++, OpenCL, OpenGL, or a few other computer languages, but Nvidia hasn't made it clear yet whether that includes any cards made with the chips they have been selling in the past.
NVIDIA's Next Generation CUDA(TM) Compute Architecture: Fermi - http://www.nvidia.com/content/PDF/fermi_white_papers/NVIDIAFermiArchitectureWhitepaper.pdf
New FERMI GPU, 4x more cores, more memory - https://boinc.bakerlab.org/rosetta/forum_thread.php?id=5094
Nvidia GT300 - http://www.gpugrid.net/forum_thread.php?id=1406
The main problem with ATI cards at present is that ATI is far behind at providing the compilers needed to convert software written to run on CPUs so that it can run on an ATI GPU without a total rewrite in a different computer language. Another problem with converting minirosetta to run on any type of GPU card is the amount of memory it requires - to get a full speedup, the GPU card would need approximately 512 MB times the number of GPU cores as its total graphics memory (a few hundred times 512 MB for high-end cards). It could, however, limit the number of GPU cores it uses to the number it can find GPU memory for, and at least run on a GPU card, sometimes with some speedup (a rough sketch of that check follows below).
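For what it's worth, the standard CUDA device query makes that memory arithmetic easy to check on a given card; a small sketch (the ~512 MB per concurrent model is the figure assumed in the post above, not a measured value):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    // Assumed working-set size per concurrent model, taken from the post above.
    const size_t kBytesPerModel = 512ULL * 1024 * 1024;

    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        std::printf("No CUDA device found.\n");
        return 1;
    }

    std::printf("%s: %zu MB of graphics memory, %d multiprocessors\n",
                prop.name, prop.totalGlobalMem / (1024 * 1024),
                prop.multiProcessorCount);
    std::printf("Room for roughly %zu concurrent 512 MB models.\n",
                prop.totalGlobalMem / kBytesPerModel);
    return 0;
}
```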