Not work well on windows #66

Closed
cuikai-ai opened this issue Sep 10, 2024 · 6 comments

Comments

@cuikai-ai

[screenshot: pip reports a dependency conflict during installation]

To fix this you could try to:

  1. loosen the range of package versions you've specified
  2. remove package versions to allow pip to attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
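
As a rough sketch of pip's first suggestion (loosening the version range), assuming the conflict stems from an exact pin such as deepsearch-glm==0.2.3; the specifier below is illustrative, not taken from this issue:

```
# requirements.txt (hypothetical sketch)
# An exact pin can make resolution impossible when combined with other pins:
#   deepsearch-glm==0.2.3
# A looser specifier gives pip's resolver room to pick a compatible release:
deepsearch-glm>=0.2.3,<0.3
```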

@cuikai-ai
Author

After installing deepsearch-glm==0.2.3.

@PeterStaar-IBM
Contributor

@cuikai-ai Yes, our next (dev) step is to support Windows. We need the latest deepsearch-glm for this, but we are on it. We would really appreciate it if you could give us feedback to verify that everything works as expected!

@sdspieg

sdspieg commented Sep 11, 2024

It works fine on Windows under WSL for me... It also uses my GPU (albeit not very efficiently yet)...
[screenshot: GPU utilization]
I guess I'll have to see if I can optimize the batch size and maybe also play with concurrent.futures. Any recommendations for that?
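
A minimal sketch of the concurrent.futures idea, assuming one conversion call per file (convert_one below is a hypothetical placeholder, not an actual API); this parallelizes work across documents on the CPU side and will not by itself raise GPU utilization:

```python
from concurrent.futures import ProcessPoolExecutor, as_completed
from pathlib import Path

def convert_one(path: Path) -> str:
    """Hypothetical placeholder: call the actual document conversion here."""
    ...

def convert_all(paths: list[Path], workers: int = 4) -> dict[Path, str]:
    """Convert several documents in parallel worker processes."""
    results: dict[Path, str] = {}
    # Each worker process loads its own copy of any models, so RAM grows with `workers`.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(convert_one, p): p for p in paths}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results

if __name__ == "__main__":  # guard required for multiprocessing on Windows
    print(convert_all(sorted(Path("docs").glob("*.pdf"))))
```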

@maxmnemonic
Contributor

maxmnemonic commented Sep 16, 2024

It works fine on Windows under WSL for me... It also uses my GPU (albeit not very efficiently yet)... I guess I'll have to see if I can optimize the batch size and maybe also play with concurrent.futures. Any recommendations for that?

The underlying models use the GPU, and there is currently a big benefit to using a GPU, albeit not as big as it could be because of relatively low utilization. We are looking into ways of improving utilization, but it's not very straightforward.

You can play with the general batching and concurrency settings here: settings.py to make better use of your CPU and RAM, but I doubt you could load the GPU more efficiently this way. Please let us know if this influences your GPU utilization as well.
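
As a sketch of what tuning those settings could look like, assuming the linked settings.py exposes a module-level settings object; the import path and field names below (settings.perf.doc_batch_size and friends) are assumptions inferred from that file, not verified here:

```python
# Hypothetical sketch – adjust the import path and field names to the actual settings.py.
from docling.datamodel.settings import settings  # assumed location of the settings object

# Larger batches and higher concurrency trade CPU/RAM for throughput;
# as noted above, they are unlikely to change GPU utilization much.
settings.perf.doc_batch_size = 4         # documents per batch (assumed field name)
settings.perf.doc_batch_concurrency = 2  # documents converted in parallel (assumed field name)
settings.perf.page_batch_size = 8        # pages fed to the models per batch (assumed field name)
```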

@dolfim-ibm
Contributor

This is scheduled to be completed soon. See #104

@dolfim-ibm
Contributor

@cuikai-ai @sdspieg Windows support is now available.
