Sparsity support in pytensor #1127
Comments
Is this specifically about implementing a sparse solve?
The original issue seems to be just `solve`; I imagine that's good enough to start.
So for `solve` it's easy enough to wrap the existing scipy implementation. For the numba backend we need to write our own overrides, as I did elsewhere. Finally, for the Torch backend I honestly have no idea. It looks like torch has sparse support as well as an spsolve implementation, so it might be straightforward?
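For concreteness, here is a rough sketch of how a sparse solve Op could wrap `scipy.sparse.linalg.spsolve` for the default backend. `SparseSolve` and its details are hypothetical, not the actual PyTensor implementation; a real Op would also need gradients, rewrites, and numba/Torch dispatch on top of this.

```python
# Minimal sketch only: `SparseSolve` is a hypothetical Op name, not PyTensor's
# actual implementation. It wraps scipy.sparse.linalg.spsolve in perform().
import numpy as np
import scipy.sparse as sps
from scipy.sparse.linalg import spsolve

import pytensor.sparse as psp
import pytensor.tensor as pt
from pytensor.graph.basic import Apply
from pytensor.graph.op import Op


class SparseSolve(Op):
    """Solve A @ x = b where A is a sparse (CSR/CSC) matrix and b is dense."""

    __props__ = ()

    def make_node(self, A, b):
        A = psp.as_sparse_variable(A)   # sparse matrix input
        b = pt.as_tensor_variable(b)    # dense right-hand side
        # Assumption: the solution x has the same dense type/shape as b
        return Apply(self, [A, b], [b.type()])

    def perform(self, node, inputs, output_storage):
        A_val, b_val = inputs
        # spsolve prefers CSC input; with a dense b it returns a dense ndarray
        x = spsolve(sps.csc_matrix(A_val), b_val)
        output_storage[0][0] = np.asarray(x, dtype=node.outputs[0].dtype)


sparse_solve = SparseSolve()
```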
Great, thanks for the plan! Out of curiosity, what else could we do? I'm not aware of what general sparsity support would look like beyond making sure things can use CSR and CSC tensors, which I think pytensor already has.
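For reference, the existing CSR/CSC support looks roughly like this. This is a sketch assuming the Theano-era `pytensor.sparse` API (`csr_matrix`, `sparse.dot`) is still exposed under those names; treat the names as assumptions rather than verified against the current docs.

```python
# Rough sketch of the sparse support pytensor already ships (ported from
# Theano's sparse module); constructor/function names are assumptions.
import numpy as np
import scipy.sparse as sps

import pytensor
import pytensor.sparse as psp
import pytensor.tensor as pt

A = psp.csr_matrix(name="A", dtype="float64")   # symbolic sparse CSR matrix
x = pt.matrix("x", dtype="float64")             # dense operand

y = psp.dot(A, x)                               # sparse @ dense -> dense output

f = pytensor.function([A, x], y)

A_val = sps.random(5, 5, density=0.2, format="csr")
x_val = np.random.default_rng(0).normal(size=(5, 1))
print(f(A_val, x_val))
```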
Some thoughts:
Description
I'm investigating implementing ALS in pytensor, which is usually implemented with sparsity constructs (see implicit for reference). I quickly looked around and saw this older thread where someone asked for sparsity support. @jessegrabowski gave a first-pass answer, but mentioned that the support is subpar. Opening this issue to track any enhancements we could bring.
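For context on why ALS leans on sparse constructs: each factor update only touches the nonzero entries of one row of the interaction matrix and then solves a small dense system. The sketch below is a simplified explicit-feedback variant in plain NumPy/SciPy (the implicit library uses a confidence-weighted version); all names and sizes are illustrative.

```python
# Simplified explicit-feedback ALS user update (illustration only; `implicit`
# uses a confidence-weighted variant). Shows where the sparse row access and
# the small per-user linear solves come from.
import numpy as np
import scipy.sparse as sps

rng = np.random.default_rng(0)
n_users, n_items, k = 100, 500, 8
R = sps.random(n_users, n_items, density=0.01, format="csr")  # sparse interactions
Y = rng.normal(scale=0.1, size=(n_items, k))                  # item factors
reg = 0.1                                                     # arbitrary regularization


def update_user(u):
    """Solve (Y_u^T Y_u + reg*I) x_u = Y_u^T r_u over the items user u interacted with."""
    row = R[u]                              # CSR row: nonzero structure is cheap to get
    rated, r_u = row.indices, row.data      # item ids and values of the nonzeros
    if rated.size == 0:
        return np.zeros(k)
    Y_u = Y[rated]                          # (n_rated, k) dense slice
    return np.linalg.solve(Y_u.T @ Y_u + reg * np.eye(k), Y_u.T @ r_u)


X = np.vstack([update_user(u) for u in range(n_users)])  # updated user factors
```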