
Tesla GPUs need a vGPU license to pass through to Docker #18

Open
angelmankel opened this issue Oct 24, 2022 · 7 comments
Labels
help wanted Extra attention is needed

Comments

@angelmankel

Hi, I recently got two Tesla P40 GPUs which I was hoping to use with this. From my understanding, the Tesla P40s need a vGPU license in order to pass through via WSL. I also use my Tesla cards locally for other applications, and this machine basically serves as a graphics/machine-learning server running Windows 11, so I don't really want to install Linux on the PC itself.

Do you see any easy way to run this without Docker? Hopefully I'm wrong about the licensing. I tried to export the container and run the scripts locally, but I honestly don't know what I'm doing with that and didn't make much progress.

@NickLucche NickLucche added the help wanted Extra attention is needed label Oct 25, 2022
@NickLucche
Owner

Hmm, I don't know of any particular licensing requirement between the P40s and WSL, but I am not a Windows user, so maybe someone else can clarify this. I think Windows 11 should have more integrated support for WSL.
Can you run commands like nvidia-smi from a console or PowerShell?
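For anyone following along, a rough way to check this from the Windows side is to query the driver model with nvidia-smi. This is a sketch, not a guaranteed fix: the `driver_model.current` query field and the `-dm`/`-fdm` switches exist on Windows builds of nvidia-smi, but whether a Tesla P40 will actually accept WDDM mode without a vGPU license is exactly what's in question here.

```shell
# In PowerShell on the Windows host: list each GPU and its current
# driver model. WSL GPU paravirtualization needs WDDM; Tesla cards
# commonly default to TCC, which WSL cannot use.
nvidia-smi --query-gpu=name,driver_model.current --format=csv

# Attempt to switch GPU index 0 to WDDM (run as Administrator, then
# reboot). On a P40 this may be rejected without a vGPU license.
nvidia-smi -i 0 -dm WDDM

# Inside a WSL shell afterwards: check whether the GPU is visible.
nvidia-smi
```

If the first command already reports `WDDM`, the driver model isn't the blocker and the problem is likely elsewhere in the WSL/Docker stack.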

@angelmankel
Author

According to the documentation, "Tesla GPUs aren't supported yet", so maybe it's not exactly a licensing issue after all. But from what I read, the Tesla P40s can't be put into WDDM mode without a license (I tried), which is unfortunately required for WSL. https://docs.nvidia.com/cuda/wsl-user-guide/index.html
[screenshot of the documentation]

When I run nvidia-smi in PowerShell, this is what I get:
[screenshot]

If I run nvidia-smi in WSL, this is what I get:
[screenshot]

@NickLucche
Owner

Any chance you can run docker run.. from PowerShell? I'm not aware of the changes introduced in Windows 11, sorry. I'll mark this "help wanted".
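A minimal sanity check along these lines, assuming Docker Desktop with the WSL2 backend: run a CUDA base image and see whether nvidia-smi inside the container can see the card. The image tag below is only an example; pick a CUDA version compatible with your host driver.

```shell
# From PowerShell: pass all GPUs into a throwaway container and run
# nvidia-smi inside it. If the GPU is reachable through WSL, this
# prints the usual device table; otherwise it errors out.
docker run --rm --gpus all nvidia/cuda:11.8.0-base-ubuntu22.04 nvidia-smi
```

If this fails while nvidia-smi works on the host, the break is between Windows and the WSL/Docker layer rather than in this project.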

@huotarih

[Screenshot, 2022-11-16 at 22:55:53]

Tesla T4 works fine.

@Anonym0us33

> According to the documentation, "Tesla GPUs aren't supported yet" […]

[screenshot]
Similar problem, different error: Tesla K80 and RTX 3070.

@vleeuwenmenno

Wasn't this restriction removed relatively recently in the newer drivers? I remember being able to run two VMs on my RTX 2080.

@helyxzion50943

I have the very same setup and would like to know as much as possible. I wish there were a way to get in contact.
