Message boards : Number crunching : New Docker image with Virtualbox and Nvidia GPU passthrough support
3k7cGiWzDHhDmVziFjKz4UkRa1sm Joined: 21 Feb 11 Posts: 4 Credit: 6,609,198 RAC: 0
Hey R@H fam! I've been playing around with Docker and have created a Docker build file that passes through VirtualBox and an Nvidia GPU from the host to the container. My PCs (host: Ubuntu 20.04) running this have been successfully completing the python vbox project WUs along with GPUGrid acemd CUDA WUs, so at some level this does work. I would love any suggestions for improving it from anyone here! https://github.com/WeLiveInSpace/docker-boinc-vbox-nv_gpu
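For reference, a quick way to check that the Nvidia side of the passthrough is working before starting BOINC (a minimal sketch; the CUDA image tag is only an example, use one that matches your host driver):

```
# Verify the container can see the host GPU via nvidia-container-toolkit.
# The CUDA image tag is illustrative; pick one compatible with your driver.
docker run --rm --gpus all nvidia/cuda:11.4.3-base-ubuntu20.04 nvidia-smi
```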
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
> Hey R@H fam!

Wait, wait. Are you saying you're crunching python WUs ON the GPU and not on the CPU? Results?
bozz4science Joined: 2 May 20 Posts: 7 Credit: 228,784 RAC: 0
Sounds very exciting! I didn't know you could pass a VirtualBox-based, CPU-intended WU through to an NVIDIA GPU using a Docker image. I will take a look later this week and test this on my system! Is there a way to limit tasks to the python tasks only, for testing? P.S. This Docker image could potentially also run on Windows, right?
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
> P.S. This Docker image could potentially also run on Windows, right?

Yes. Windows 11, for example, fully supports Docker and also GPU access from the virtual machine (CUDA on WSL, AMD on WSL, etc.).
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
> P.S. This Docker image could potentially also run on Windows, right?

My fault. Windows 10 (from the 1809 release) also supports Docker and "GPU acceleration" on WSL2.
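For reference, a quick sanity check under WSL2 (an assumption on my part, not something tested in this thread): with Nvidia's WSL-enabled Windows driver installed, the GPU should already be visible from inside the Linux distro.

```
# Run from inside the WSL2 distro; assumes a WSL-enabled Nvidia driver on the Windows host.
nvidia-smi    # should list the host GPU if GPU paravirtualization is working
```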
3k7cGiWzDHhDmVziFjKz4UkRa1sm Joined: 21 Feb 11 Posts: 4 Credit: 6,609,198 RAC: 0
Good question! Unfortunately nothing that exciting. Containers are a great way to rapidly deploy BOINC with all needed dependencies out to your systems, though by default they have no access to low-level devices on the host. This makes containerizing BOINC a bit tricky: you want to A) standardize the BOINC rollout without B) artificially limiting the types of WUs you can complete. This is a Docker image build that lets you run a containerized BOINC instance without limiting the WUs you are able to complete just because BOINC runs inside said container.

I threw a README in the git repo, but essentially the setup process for a fresh Debian system is:
1. Install base Nvidia drivers on the host
2. Reboot
3. Install nvidia-container-toolkit, virtualbox-dkms, virtualbox
4. Reboot
5. Clone the repo, cd into it, run docker build
6. Run the image (example run commands are in the git repo; see the sketch below)

You can seamlessly migrate over from a bare-metal BOINC install by modifying the run command (step 6). Requirements for migration (unsure about Windows; this again may only apply to Linux):
1. Suspend current WUs and completely stop the existing BOINC client (e.g. systemctl stop boinc-client).
2. In the run command:
   a) point the volume (-v /path/to...) to your existing BOINC data directory;
   b) (optional) modify the hostname to match your existing BOINC hostname. This may not work, as it will conflict with the host. Regardless, I've found there's some unique identifier in the BOINC data directory, as projects carried over my host's average and total credit even though the hostname changed.

Once the build is vetted to work on other systems (I don't know about Windows and what exactly it allows you to pass through), I'll create docker.io images so you don't need to build the image yourself and can instead pull directly from docker.io. :)

I also have Docker build files for AMD's ROCm and the generic Intel OpenCL runtimes. Both work and have completed WUs on WCG. Still working on getting repos set up for them.
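To make the steps above concrete, here is roughly what steps 3-6 plus the migration mount can look like on a Debian/Ubuntu host. This is only a sketch: the image name, the device list, the host and container data paths, and the hostname are my assumptions, and the repo's README has the authoritative build and run commands.

```
# Step 3: host packages (nvidia-container-toolkit needs Nvidia's apt repository added first)
sudo apt install nvidia-container-toolkit virtualbox-dkms virtualbox
sudo reboot   # step 4

# Step 5: clone and build (the image tag is arbitrary)
git clone https://github.com/WeLiveInSpace/docker-boinc-vbox-nv_gpu
cd docker-boinc-vbox-nv_gpu
docker build -t boinc-vbox-nvgpu .

# Step 6: run. --gpus exposes the Nvidia card via nvidia-container-toolkit,
# --device hands the host's VirtualBox kernel driver to the container, and -v
# mounts a BOINC data directory. For a migration, point -v at your existing
# data directory and optionally set --hostname to your old BOINC hostname.
docker run -d --name boinc \
  --gpus all \
  --device /dev/vboxdrv \
  -v /var/lib/boinc-client:/var/lib/boinc \
  --hostname my-boinc-host \
  boinc-vbox-nvgpu
```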
3k7cGiWzDHhDmVziFjKz4UkRa1sm Joined: 21 Feb 11 Posts: 4 Credit: 6,609,198 RAC: 0
> P.S. This Docker image could potentially also run on Windows, right?

I didn't know this! Interesting... My build likely will not work in Windows Docker containers, as Microsoft put some dependencies into the prebuilt image they are using; hence the "The container base image must be mcr.microsoft.com/windows:1809 or newer." requirement. I don't have much experience with Windows Docker deploys, but I can look into creating a custom build to suit that use case!
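For context, a Windows-container variant would be a different build entirely; a hypothetical (untested, not from the repo) Dockerfile would have to start from that Microsoft base image:

```
# Hypothetical, untested sketch of what a Windows-container build would start from;
# per Microsoft, the container base image must be mcr.microsoft.com/windows:1809 or newer.
FROM mcr.microsoft.com/windows:1809
# BOINC and VirtualBox would have to be installed from their Windows installers here,
# rather than the Linux packages the existing build file uses.
```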
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
> I also have Docker build files for AMD's ROCm and the generic Intel OpenCL runtimes. Both work and have completed WUs on WCG. Still working on getting repos set up for them.

> I don't have much experience with Windows Docker deploys, but I can look into creating a custom build to suit that use case!

You're doing GREAT work!! I hope the project admins will use these ideas/code to make better use of our PCs (e.g. GPU support, better work distribution, etc.).

P.S. If you need help, there is some MS documentation about Docker on Windows, like this. There is also documentation on Docker's site, like this and this.
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
Interesting news: WSL will be published on the Microsoft Store and no longer shipped as part of Windows.

> There are two big reasons to be excited for this change: You can get access to WSL features faster, and you don't have to worry about changing your Windows version when getting the latest WSL updates. This change moves those binaries from being part of the Windows image to instead being part of an application that you install from the Store. This decouples WSL from your Windows version, allowing you to update through the Microsoft Store instead. So now, once new features like GUI app support, GPU compute, and Linux file system drive mounting are developed, tested, and ready for a release, you will get access to them right away on your machine without needing to update your entire Windows OS or go to Windows Insider preview builds.
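In practice this means the Store-delivered WSL can be installed and refreshed from an elevated prompt without waiting on a Windows feature update (a sketch, assuming a reasonably recent Windows 10/11 build):

```
# Install WSL (on recent builds this pulls the Store-delivered package); needs admin.
wsl --install
# Later, fetch the newest WSL release from the Microsoft Store rather than Windows Update.
wsl --update
```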
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
It seems that Nvidia GPUs are widely used with Rosetta code in the Baker Lab: Design of proteins. Maybe, one day, on our home computers...
Greg_BE Joined: 30 May 06 Posts: 5691 Credit: 5,859,226 RAC: 0
> It seems that Nvidia GPUs are widely used with Rosetta code in the Baker Lab: Design of proteins.

Hah! IF we are lucky... in the next decade. The paper quotes a run time of 90 minutes on an $880 GPU. So scale that down to our level: at least double, if not triple, that time. And a 225 W power draw... damn! I would have to get a separate power supply just for that beast. It scores 38th out of all GPUs and my 1070 scores 64th. The Baker lab uses nice stuff internally, but has yet to adapt it for BOINC GPU work. GPU support has been asked for over and over again, and the response is always the same or it's ignored.
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
> The paper quotes a run time of 90 minutes on an $880 GPU. So scale that down to our level: at least double, if not triple, that time.

So, what's the problem? I'm crunching Folding@home on my entry-level GPU with WUs longer than 24 hours. And, in the BOINC world, there are a lot of people with high-end GPUs.

> GPU support has been asked for over and over again, and the response is always the same or it's ignored.

That's for sure.

P.S. Don't forget that a GPU client would attract a lot of volunteers.
mikey Joined: 5 Jan 06 Posts: 1895 Credit: 9,159,764 RAC: 3,981
> The paper quotes a run time of 90 minutes on an $880 GPU. So scale that down to our level: at least double, if not triple, that time.

Not if a task takes 3 or 4 days to run, though. Yes, most of us run tasks non-stop, but our GPUs do get a slight break between tasks even if we are running multiple tasks at once; running the same task non-stop for 3 or 4 days could burn out a GPU if the settings aren't right.
Greg_BE Joined: 30 May 06 Posts: 5691 Credit: 5,859,226 RAC: 0
> The paper quotes a run time of 90 minutes on an $880 GPU. So scale that down to our level: at least double, if not triple, that time.

But the Baker lab does not give us the super high-end stuff; we get lower- to mid-level work. They have a neural network for deep learning and all that fancy hardware to work on the seriously crazy stuff. FAH barely uses my 1050 and 1070, but I have yet to find a good science GPU project. I might just have to reattach to PrimeGrid or something to get things going. As for this workaround, I have a week off in two weeks or so, and I can take the time then to read through this more deeply, step by step. Right now it's browse, comment, and move on.
[VENETO] boboviz Joined: 1 Dec 05 Posts: 1994 Credit: 9,582,381 RAC: 7,850
> Not if a task takes 3 or 4 days to run, though. Yes, most of us run tasks non-stop, but our GPUs do get a slight break between tasks even if we are running multiple tasks at once; running the same task non-stop for 3 or 4 days could burn out a GPU if the settings aren't right.

The 2080 used in the paper produces approximately 9,000 GFLOPS in single precision (I think they use single precision) and crunches a WU in 90 minutes. A 1060 (an entry-level GPU from 2016) produces approximately 4,000 GFLOPS in SP, so a WU would take about 200 minutes to finish. I think that's acceptable. 3-4 days would be an old integrated Intel GPU... But this is only speculation, because we know nothing about this app or the possibility of running it under BOINC.
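The 200-minute figure is just linear scaling by peak single-precision throughput, a rough rule of thumb that assumes the workload is compute-bound:

$$ t_{1060} \approx 90\ \text{min} \times \frac{9000\ \text{GFLOPS (RTX 2080, SP)}}{4000\ \text{GFLOPS (GTX 1060, SP)}} \approx 200\ \text{min} $$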
mikey Joined: 5 Jan 06 Posts: 1895 Credit: 9,159,764 RAC: 3,981
Try Einstein@Home: they have both "Gravitational Wave search O3 All-Sky" and "Gamma-ray pulsar binary search #1 (GPU)" tasks right now, with the second one giving 3 times the credit of the first.
Greg_BE Joined: 30 May 06 Posts: 5691 Credit: 5,859,226 RAC: 0
I see what the problem is... a certain power supply vendor's software that I was using for temperature monitoring and a few other things is way off on its values. I had a look at MSI Afterburner and it's showing 98% usage, which makes sense because my temps are high enough and I can hear my fans blowing.
mikey Joined: 5 Jan 06 Posts: 1895 Credit: 9,159,764 RAC: 3,981
Yup, that does make sense.