[TorchToLinalg] Support lowering AtenMaxUnpool2dOp for linalg backend #4265

Open
wants to merge 1 commit into main

Conversation

vinitdeodhar
Contributor

This change adds support for lowering AtenMaxUnpool2dOp to the linalg backend. The existing lowering for AtenMaxUnpool3dOp is refactored so that the 2d and 3d variants share common utilities.

Unlike max_unpool3d, max_unpool2d does not accept stride and padding arguments in the torch op registry. The default values of stride and padding follow the documented behavior of torch.nn.MaxUnpool2d:
https://docs.pytorch.org/docs/stable/generated/torch.nn.MaxUnpool2d.html
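For reference, the semantics being lowered can be sketched in plain Python (a hypothetical illustration of what max_unpool2d computes, not the torch-mlir lowering itself): each pooled value is scattered back to the position recorded by max_pool2d's returned indices, and every other output position is zero.

```python
def max_unpool2d(values, indices, out_h, out_w):
    """Sketch of max_unpool2d semantics for a single channel.

    values/indices: 2D lists of shape (h_in, w_in); each index is a flat
    offset into the (out_h x out_w) output plane, as produced by
    max_pool2d(..., return_indices=True).
    """
    # Output starts as all zeros; only the argmax positions get filled.
    out = [[0.0] * out_w for _ in range(out_h)]
    for row_v, row_i in zip(values, indices):
        for v, idx in zip(row_v, row_i):
            out[idx // out_w][idx % out_w] = v
    return out

# 2x2 pooled result of a 4x4 input with kernel_size=2
# (stride defaults to kernel_size, padding to 0).
vals = [[6.0, 8.0], [14.0, 16.0]]
idxs = [[5, 7], [13, 15]]
unpooled = max_unpool2d(vals, idxs, 4, 4)
```

Here `unpooled` is a 4x4 plane that is zero everywhere except at flat positions 5, 7, 13, and 15, which hold the pooled maxima.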

Closes #4255

@vinitdeodhar
Contributor Author

@vivekkhandelwal1, @zjgarvey, @sahas3, could you please review?

Successfully merging this pull request may close these issues.

[TorchToLinalg] Add support for AtenMaxUnpool2dOp and conversion to linalg dialect