Tesla GPUs need a vGPU license to pass through to Docker #18
Comments
Hmm, I don't know of any particular license requirement between the P40s and WSL, but I'm not a Windows user, so maybe someone else can clarify. I think Windows 11 should have more integrated support for WSL.
According to the documentation, "Tesla GPUs aren't supported yet", so maybe it's not exactly a licensing issue after all. That said, from what I've read, the Tesla P40s can't be put into WDDM mode without a license (I tried), which is unfortunately required for WSL. https://docs.nvidia.com/cuda/wsl-user-guide/index.html
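For anyone else checking their own cards: a quick way to see which driver model each GPU is in is `nvidia-smi`'s query interface (the `driver_model` fields are Windows-only; on Linux they report N/A). This is just a diagnostic sketch, guarded so it degrades gracefully on machines without the NVIDIA driver installed:

```shell
# Query which driver model each NVIDIA GPU is using (WDDM vs TCC).
# WSL GPU passthrough requires WDDM; Tesla cards typically default to TCC.
if command -v nvidia-smi >/dev/null 2>&1; then
  DRIVER_MODEL=$(nvidia-smi --query-gpu=name,driver_model.current,driver_model.pending --format=csv)
else
  # No NVIDIA driver on this machine; nothing to query.
  DRIVER_MODEL="nvidia-smi not found"
fi
echo "$DRIVER_MODEL"
```

If the card shows `TCC` and `nvidia-smi -dm 0` (switch to WDDM) is rejected, that would be consistent with the licensing restriction described above.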
Any chance you can run
Wasn't this removed relatively recently in the newer drivers? I remember being able to run 2 VMs on my RTX 2080.
I have the very same setup and would like to know as much as possible. I wish there was a way to get in contact.
Hi, I recently got two Tesla P40 GPUs which I was hoping to use with this. From my understanding, the Tesla P40s need a vGPU license in order to pass through via WSL. I'm also using my Tesla cards locally for other applications, and basically use this machine as a graphics/machine-learning server running Windows 11, so I don't really want to install Linux on the PC itself.
Do you see any easy way to run this without Docker? Hopefully I'm wrong about the licensing. I tried to export the container and run the scripts locally, but I honestly don't know what I'm doing with that and didn't make much progress.
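For the "export the container and run the scripts locally" part, the usual approach is to create a stopped container from the image and dump its filesystem with `docker export`, then look for the entrypoint scripts in the extracted tree. A guarded sketch (the image name here is a placeholder, not this project's actual image):

```shell
# Hypothetical image name -- substitute the image this repo actually publishes.
IMAGE=yourname/your-image
if command -v docker >/dev/null 2>&1 && docker image inspect "$IMAGE" >/dev/null 2>&1; then
  CID=$(docker create "$IMAGE")        # create a stopped container from the image
  docker export "$CID" -o rootfs.tar   # dump its whole filesystem to a tarball
  docker rm "$CID" >/dev/null
  mkdir -p rootfs && tar -xf rootfs.tar -C rootfs
  STATUS="exported to ./rootfs"
else
  STATUS="docker or image unavailable"
fi
echo "$STATUS"
```

Note that `docker export` flattens the image to a plain filesystem; the entrypoint and environment variables are not preserved, so you'd also want `docker image inspect "$IMAGE"` to see which script the container runs on start.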