
Implement nlinalg Ops in PyTorch #920

Merged: 5 commits into pymc-devs:main from implement_nlinalg_pytorch on Jul 26, 2024

Conversation

@twaclaw (Contributor) commented Jul 10, 2024

Description

Implemented Ops (a sketch of the dispatch pattern follows the list):

  • SVD
  • Det
  • SLogDet
  • Eig
  • Eigh
  • KroneckerProduct
  • MatrixInverse
  • MatrixPinv
  • QRFull
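For context, the PyTorch backend registers each Op through the `pytorch_funcify` singledispatch registry in `pytensor.link.pytorch.dispatch`. Below is a minimal sketch of the pattern for SVD; it illustrates the approach rather than the merged code, and assumes the `full_matrices`/`compute_uv` properties exposed by the `SVD` Op:

```python
import torch

from pytensor.link.pytorch.dispatch.basic import pytorch_funcify
from pytensor.tensor.nlinalg import SVD


@pytorch_funcify.register(SVD)
def pytorch_funcify_SVD(op, **kwargs):
    # Read the Op's parameters once, at dispatch time.
    full_matrices = op.full_matrices
    compute_uv = op.compute_uv

    def svd(x):
        if compute_uv:
            # torch.linalg.svd returns (U, S, Vh), matching np.linalg.svd
            return torch.linalg.svd(x, full_matrices=full_matrices)
        # Singular values only
        return torch.linalg.svdvals(x)

    return svd
```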

Related Issue

Checklist

Type of change

  • New feature / enhancement
  • Bug fix
  • Documentation
  • Maintenance
  • Other (please specify):

codecov bot commented Jul 10, 2024

Codecov Report

Attention: Patch coverage is 75.80645% with 15 lines in your changes missing coverage. Please review.

Project coverage is 81.60%. Comparing base (d455460) to head (2d061a5).
Report is 83 commits behind head on main.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| pytensor/link/pytorch/dispatch/nlinalg.py | 75.40% | 15 Missing ⚠️ |
Additional details and impacted files

@@            Coverage Diff             @@
##             main     #920      +/-   ##
==========================================
- Coverage   81.60%   81.60%   -0.01%     
==========================================
  Files         178      179       +1     
  Lines       47209    47271      +62     
  Branches    11478    11481       +3     
==========================================
+ Hits        38527    38574      +47     
- Misses       6496     6511      +15     
  Partials     2186     2186              
| Files with missing lines | Coverage Δ |
|---|---|
| pytensor/link/pytorch/dispatch/__init__.py | 100.00% <100.00%> (ø) |
| pytensor/link/pytorch/dispatch/nlinalg.py | 75.40% <75.40%> (ø) |

@ricardoV94 (Member) commented:
Looks great, but I think a few of those Ops are already being addressed in other PRs.

@twaclaw (Contributor, Author) commented Jul 10, 2024

> Looks great, but I think a few of those Ops are already being addressed in other PRs.

I see. I mixed the linalg and math Ops because they were in the same file. Sorry about that. What should I do? Should I keep only the linalg Ops in this PR?

@twaclaw (Contributor, Author) commented Jul 10, 2024

@ricardoV94, what about Max? It is not listed in the description of #821.

@ricardoV94 (Member) commented:

> @ricardoV94, what about Max? It is not listed in the description of #821.

It should have been handled as part of CAReduce, unless we missed it.
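(For illustration: under a CAReduce-style dispatch, Max would typically lower to a torch reduction. A hedged sketch following the same registration pattern as above; the `axis` attribute comes from CAReduce, and the actual merged code may differ:)

```python
import torch

from pytensor.link.pytorch.dispatch.basic import pytorch_funcify
from pytensor.tensor.math import Max


@pytorch_funcify.register(Max)
def pytorch_funcify_Max(op, **kwargs):
    axis = op.axis  # CAReduce stores the reduced axes as a tuple, or None

    def max_fn(x):
        if axis is None:
            # Reduce over all axes to a scalar
            return torch.max(x)
        # torch.amax accepts a tuple of dims, unlike torch.max
        return torch.amax(x, dim=axis)

    return max_fn
```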

@ricardoV94 added the enhancement (New feature or request), linalg (Linear algebra), and torch (PyTorch backend) labels on Jul 12, 2024
@twaclaw force-pushed the implement_nlinalg_pytorch branch from a334c6a to 2854c37 on July 12, 2024, 21:05
@ricardoV94 (Member) commented:

There are some conflicts, and a small comment on the test. Otherwise looks good to me.

twaclaw added 5 commits July 23, 2024 21:39
Implemented Ops:
- Argmax
- Max
- Dot
- SVD
- Det
- SLogDet
- Eig
- Eigh
- KroneckerProduct
- MatrixInverse
- MatrixPinv
- QRFull
Replaced instances using Blockwise with the Op constructor (illustrated below).
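That last commit message refers to how the test graphs are built. A rough illustration of the difference (the exact test change is an assumption; the imports are real PyTensor names):

```python
import pytensor.tensor as pt
from pytensor.tensor.blockwise import Blockwise
from pytensor.tensor.nlinalg import MatrixInverse

x = pt.matrix("x")

# Through a Blockwise wrapper, as the user-facing helpers build it:
inv_blockwise = Blockwise(MatrixInverse())(x)

# Through the core Op constructor directly, so the PyTorch dispatch
# registered for MatrixInverse is exercised without the wrapper:
inv_core = MatrixInverse()(x)
```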
@twaclaw force-pushed the implement_nlinalg_pytorch branch from 425330f to 2d061a5 on July 23, 2024, 19:44
@ricardoV94 merged commit 58fec45 into pymc-devs:main on Jul 26, 2024
58 of 59 checks passed
@ricardoV94 changed the title from "Implemented nlinalg in PyTorch" to "Implement nlinalg in PyTorch" on Jul 26, 2024
Ch0ronomato pushed a commit to Ch0ronomato/pytensor that referenced this pull request Aug 15, 2024
@ricardoV94 changed the title from "Implement nlinalg in PyTorch" to "Implement nlinalg Ops in PyTorch" on Aug 21, 2024
Labels
enhancement (New feature or request), linalg (Linear algebra), torch (PyTorch backend)