Add Op (Slice - complex) | feat torchlib #2089
Conversation
Codecov Report
Attention: Patch coverage is …

Additional details and impacted files

```
@@           Coverage Diff           @@
##             main    #2089   +/-   ##
=======================================
  Coverage   72.25%   72.25%
=======================================
  Files         217      217
  Lines       29138    29143      +5
  Branches     3462     3463      +1
=======================================
+ Hits        21053    21058      +5
  Misses       6954     6954
  Partials     1131     1131
```

☔ View full report in Codecov by Sentry.
Do you think negative axes need to be handled like Xavier suggested? Tests didn't error, which was a little unexpected.
In this example:

```python
import torch

class ComplexSliceModel(torch.nn.Module):
    def forward(self, x):
        # Convert input to a complex tensor
        x_complex = x.to(torch.complex64)
        # Apply a slice operation on the complex tensor
        return x_complex[:, :2]

model = ComplexSliceModel()
dummy_input = torch.randn(3, 4)

# Verify the model works as expected
print("Model output:", model(dummy_input))

# This call fails due to the slice op on a complex tensor.
torch.onnx.export(model, dummy_input, "complex_slice.onnx", dynamo=True)
```
It works because the ExportedProgram uses two slice ops to slice the real and imaginary parts respectively. I think that means the dim is respected?
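To make that concrete, here is a small sketch of the idea (not the torchlib code), assuming complex tensors are laid out the way `torch.view_as_real` exposes them, with a trailing size-2 dimension holding the real and imaginary parts:

```python
import torch

x = torch.randn(3, 4, dtype=torch.complex64)
real_view = torch.view_as_real(x)  # shape (3, 4, 2): trailing (real, imag) pair

# Slicing the complex tensor on dim 1 ...
sliced = x[:, :2]  # shape (3, 2)

# ... matches slicing the real view on the same dim, because a
# non-negative dim never touches the trailing (real, imag) dimension.
assert torch.equal(real_view[:, :2], torch.view_as_real(sliced))
```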
What happens if you did […]? Granted, this may be unlikely.
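For illustration (a hypothetical case, since the exact snippet from the question isn't shown), a negative dim is where the trailing real/imag dimension shifts the index and the adjustment matters:

```python
import torch

x = torch.randn(3, 4, dtype=torch.complex64)
real_view = torch.view_as_real(x)  # shape (3, 4, 2)

# Hypothetical negative-dim slice: dim -1 on the complex tensor ...
sliced = x[..., :2]

# ... corresponds to dim -2 on the real view; slicing the real view
# at dim -1 would instead cut into the (real, imag) pair.
assert torch.equal(real_view[:, :2, :], torch.view_as_real(sliced))
```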
Added the dim adjustment.
```python
if dim < 0:
    # Account for the complex dimension in ONNX
    dim = dim - 1
```
I think this could be a little confusing for users when they examine the graph. What do you think about doing `dim += len(self.shape) - 1` instead?
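For what it's worth, a quick sketch suggesting the two adjustments address the same axis, with the normalized form giving a non-negative index in the graph; `real_rank` here is a hypothetical name for the rank of the real-valued representation (complex rank + 1):

```python
def adjust_keep_negative(dim: int) -> int:
    # The merged approach: stay negative, shift past the (real, imag) dim.
    return dim - 1

def adjust_normalize(dim: int, real_rank: int) -> int:
    # The suggested approach (dim += len(self.shape) - 1): a
    # non-negative index into the real-valued representation.
    return dim + real_rank - 1

real_rank = 3  # e.g. a complex (3, 4) tensor viewed as real (3, 4, 2)
for dim in (-1, -2):
    # Modulo the real rank, both expressions pick the same axis.
    assert adjust_keep_negative(dim) % real_rank == adjust_normalize(dim, real_rank)
```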
Fix pytorch/pytorch#147896