Memory requirements of the inference #10

Open

martinfleis opened this issue Mar 1, 2024 · 0 comments
Inference on larger areas may need more memory than the free Azure runner and Wasm (a strict 2 GB limit) can provide.

We see failures of this sort when running accessibility:

numpy.core._exceptions._ArrayMemoryError: Unable to allocate 509. MiB for an array with shape (10721, 6229) and data type float64

I could cast the values to float32, saving some memory and hoping it is enough, but we will hit the same issue again on any larger area. Refactoring to avoid allocating this array would mean a complete rewrite of accessibility and, most likely, a significant drop in performance, leading to minutes-long wait times for inference.
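For context, the reported 509 MiB matches the size of a float64 array of that shape, and casting to float32 would halve it. A quick sketch of the arithmetic (the shape is taken from the error message above; the cast itself is just `astype`):

```python
import numpy as np

# Shape from the error message: (10721, 6229), dtype float64.
shape = (10721, 6229)
n_elements = shape[0] * shape[1]

mib_f64 = n_elements * np.dtype(np.float64).itemsize / 2**20
mib_f32 = n_elements * np.dtype(np.float32).itemsize / 2**20

print(f"float64: {mib_f64:.0f} MiB")  # ~509 MiB, matching the traceback
print(f"float32: {mib_f32:.0f} MiB")  # ~255 MiB, half the footprint

# The cast itself (on a small dummy array here) is a one-liner,
# though it temporarily holds both copies in memory:
arr = np.zeros((10, 10), dtype=np.float64)
arr32 = arr.astype(np.float32)
```

Note that `astype` allocates a new array, so the peak memory during the cast is 1.5× the float64 size; casting at creation time (e.g. `dtype=np.float32` in the constructor) avoids that spike.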

The solution for Azure is easy: we just bump the instance memory (if we have the credits to do that). For Wasm, I am not sure a solution exists if we want inference that is both fast and highly memory-efficient.

1 participant