2301 fold pad into conv #2363


Open

Johansmm wants to merge 3 commits into main from 2301-fold-pad-into-conv

Conversation

@Johansmm (Contributor) commented Jun 4, 2025

Fuses Pad nodes into the following nodes (Conv, ConvInteger) (#2301).

Also converts the 'auto_pad' attribute into a list of explicit pads.
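For context, the fusion is only sound when the Pad can be expressed through Conv's own 'pads' attribute: constant mode, a padding value of zero, and no padding on the batch or channel axes. A minimal standalone sketch of that pre-condition check (the function name and plain-value signature are illustrative, not the PR's API):

```python
def can_fold_pad_into_conv(
    mode: str, constant_value: float, nonspatial_pads: list[int]
) -> bool:
    # Conv can only express zero padding on spatial axes, so the Pad must
    # (1) pad with a constant, (2) pad with the value 0, and
    # (3) leave the batch and channel axes untouched.
    return (
        mode == "constant"
        and constant_value == 0
        and all(p == 0 for p in nonspatial_pads)
    )

assert can_fold_pad_into_conv("constant", 0.0, [0, 0, 0, 0])
assert not can_fold_pad_into_conv("edge", 0.0, [0, 0, 0, 0])
```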
@Johansmm force-pushed the 2301-fold-pad-into-conv branch from 4b9b69b to 19b0418 on June 4, 2025 at 20:35
@Johansmm Johansmm requested a review from justinchuby June 4, 2025 22:09
Comment on lines +4 to +6
- Pad ∘ Conv -> Conv
- Pad ∘ ConvInteger -> ConvInteger

Collaborator:

Is this Conv ∘ Pad and ConvInteger ∘ Pad?

)
elif constant_value.const_value.numpy().item() != 0:
    return check_result.fail(f"{constant_value.name} must be equal to 0.")
axes = list(range(x_rank))

Check warning (Code scanning / CodeQL)

Variable defined multiple times: This assignment to 'axes' is unnecessary as it is redefined before this value is used.
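A typical fix is to assign 'axes' once per path rather than giving it a dead default; a hypothetical sketch, assuming the redefinition comes from Pad's optional 'axes' input:

```python
def resolve_pad_axes(axes_input, x_rank: int) -> list[int]:
    """Return explicit axes for Pad, defaulting to all axes when absent."""
    if axes_input is None:
        return list(range(x_rank))  # the old default, now on its own branch
    return axes_input.const_value.numpy().tolist()
```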
"""Replaces ``Pad(ConvInteger(x))`` with ``ConvInteger(x)``."""

def __init__(self, as_function: bool = False):
    super(FusePadConv, self).__init__(name="FusePadConvInteger", as_function=as_function)

Check failure (Code scanning / CodeQL)

First argument to super() is not enclosing class: First argument to super() should be FusePadConvInteger.
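The zero-argument form of super() sidesteps the copy-paste hazard CodeQL is flagging; a minimal sketch, with a stand-in base class since the PR's actual parent class is not shown here:

```python
class FusePadConvBase:  # stand-in for the PR's actual base class
    def __init__(self, name: str, as_function: bool = False):
        self.name = name
        self.as_function = as_function


class FusePadConvInteger(FusePadConvBase):
    def __init__(self, as_function: bool = False):
        # Zero-argument super() always binds to the enclosing class, so a
        # stale ``super(FusePadConv, self)`` reference cannot recur.
        super().__init__(name="FusePadConvInteger", as_function=as_function)


assert FusePadConvInteger().name == "FusePadConvInteger"
```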
return new_pads


def read_conv_attributes(ir_conv: ir.Node) -> dict[str, typing.Sequence[int] | str]:


codecov bot commented Jun 4, 2025

Codecov Report

Attention: Patch coverage is 94.92188% with 13 lines in your changes missing coverage. Please review.

Project coverage is 70.42%. Comparing base (99323bf) to head (19b0418).

Files with missing lines                         Patch %   Lines
onnxscript/rewriter/fuse_pad_into_conv.py        95.07%    4 Missing and 3 partials ⚠️
onnxscript/rewriter/fuse_pad_into_conv_test.py   94.73%    2 Missing and 4 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2363      +/-   ##
==========================================
+ Coverage   70.15%   70.42%   +0.26%     
==========================================
  Files         196      198       +2     
  Lines       24687    24943     +256     
  Branches     2645     2683      +38     
==========================================
+ Hits        17320    17566     +246     
- Misses       6454     6459       +5     
- Partials      913      918       +5     


Comment on lines +37 to +40
if (kernel_shape := ir_conv.attributes.get("kernel_shape", None)) is not None:
    attributes["kernel_shape"] = kernel_shape.as_ints()
else:
    attributes["kernel_shape"] = ir_conv.inputs[1].shape[2:]
Collaborator:

Suggested change
- if (kernel_shape := ir_conv.attributes.get("kernel_shape", None)) is not None:
-     attributes["kernel_shape"] = kernel_shape.as_ints()
- else:
-     attributes["kernel_shape"] = ir_conv.inputs[1].shape[2:]
+ attributes["kernel_shape"] = ir_conv.attributes.get_ints("kernel_shape", ir_conv.inputs[1].shape[2:])

Comment on lines +37 to +40
if (kernel_shape := ir_conv.attributes.get("kernel_shape", None)) is not None:
    attributes["kernel_shape"] = kernel_shape.as_ints()
else:
    attributes["kernel_shape"] = ir_conv.inputs[1].shape[2:]
Collaborator:

What if the shape of ir_conv.inputs[1] is not known? Just making sure it is checked
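One way to make that check explicit, sketched against plain data rather than the ir API (the helper name and shape representation are assumptions):

```python
def spatial_kernel_shape(weight_shape):
    """Return the kernel's spatial dims, or None when not statically known."""
    if weight_shape is None:  # the weight input carries no shape information
        return None
    spatial = list(weight_shape)[2:]
    # Symbolic or unknown dims will not be plain ints; signal the caller to
    # fail the pattern check instead of crashing on them.
    if not all(isinstance(d, int) for d in spatial):
        return None
    return spatial

assert spatial_kernel_shape([8, 3, 3, 3]) == [3, 3]
assert spatial_kernel_shape([8, 3, None, 3]) is None
assert spatial_kernel_shape(None) is None
```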

conv_attr: typing.Mapping[str, ir.Attr] = cnode.attributes.copy()
if "pads" in conv_attr:
    new_pads = [x + y for x, y in zip(conv_attr["pads"].as_ints(), new_pads)]
conv_attr["pads"] = ir.convenience.convert_attribute("pads", new_pads)
@justinchuby (Collaborator) commented Jun 4, 2025:

Suggested change
conv_attr["pads"] = ir.convenience.convert_attribute("pads", new_pads)
conv_attr.add(ir.AttrInt64s("pads", new_pads))

new_pads = pad_pads[2:x_rank] + pad_pads[x_rank + 2 :]

# Replace conv pads = new + old
conv_attr: typing.Mapping[str, ir.Attr] = cnode.attributes.copy()
Collaborator:

Suggested change
- conv_attr: typing.Mapping[str, ir.Attr] = cnode.attributes.copy()
+ conv_attr = cnode.attributes.copy()
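For reference, the slicing in the quoted snippet drops the batch and channel entries from Pad's per-axis pads (ONNX Pad stores all the begin values followed by all the end values), leaving exactly the spatial layout Conv expects. A worked example for a rank-4 NCHW input:

```python
# Pad pads for rank 4: [N_beg, C_beg, H_beg, W_beg, N_end, C_end, H_end, W_end]
pad_pads = [0, 0, 1, 2, 0, 0, 3, 4]
x_rank = 4
# Conv pads cover spatial axes only: [H_beg, W_beg, H_end, W_end]
new_pads = pad_pads[2:x_rank] + pad_pads[x_rank + 2 :]
assert new_pads == [1, 2, 3, 4]
```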

Comment on lines +64 to +65
pnode = pad.producer()
cnode = conv.producer()
Collaborator:

Suggested change
- pnode = pad.producer()
- cnode = conv.producer()
+ pad_node = pad.producer()
+ conv_node = conv.producer()

avoid abbreviations unless well known

opset_imports=opset_imports,
name="model",
),
ir_version=9,
Collaborator:

Suggested change
- ir_version=9,
+ ir_version=10,

nit: prefer 10 but doesn't matter too much

conv_attributes: typing.Mapping[str, ir.Attr] | None = None,
opset_imports: typing.Mapping[str, int] = {"": 20},
) -> ir.Model:
tape = ir.tape.Tape()
Collaborator:

This is a great example of using the tape module, thanks!
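For readers new to it, the tape records each emitted node and its outputs, so tests can build small graphs imperatively. A rough sketch of the idiom, assuming the ir.tape.Tape.op(op_type, inputs) helper and the nodes property from the onnxscript IR; treat the exact signatures as assumptions to verify against the library:

```python
import onnxscript.ir as ir

tape = ir.tape.Tape()
x = ir.Value(name="x")      # graph input; shape/type elided for brevity
pads = ir.Value(name="pads")
w = ir.Value(name="w")

# Each op() call records a node on the tape and returns its output value.
padded = tape.op("Pad", inputs=[x, pads])
y = tape.op("Conv", inputs=[padded, w])

print([node.op_type for node in tape.nodes])  # ['Pad', 'Conv']
```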

pad_inputs: typing.Sequence[ir.TensorProtocol | ir.Value | None],
pad_attributes: typing.Mapping[str, ir.Attr] | None = None,
conv_attributes: typing.Mapping[str, ir.Attr] | None = None,
opset_imports: typing.Mapping[str, int] = {"": 20},
Collaborator:

Using dictionaries as default values is not recommended. Since this field is not changed by callers, I suggest we set it as a constant inside the method
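The usual shape of that fix, as a hypothetical standalone example (the constant and function names are illustrative):

```python
import typing

# Module-level constant; never mutated, so sharing it across calls is safe.
_DEFAULT_OPSET_IMPORTS: typing.Mapping[str, int] = {"": 20}


def build_model(opset_imports: typing.Mapping[str, int] | None = None) -> dict[str, int]:
    # Use a None sentinel instead of a mutable dict default.
    if opset_imports is None:
        opset_imports = _DEFAULT_OPSET_IMPORTS
    return dict(opset_imports)


assert build_model() == {"": 20}
assert build_model({"": 21, "custom": 1}) == {"": 21, "custom": 1}
```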

self,
op_type: str,
input_shape: ir.Shape,
weight_shape: typing.Sequence[int],
Collaborator:

Suggested change
- weight_shape: typing.Sequence[int],
+ weight_shape: Sequence[int],

prefer importing the classes from the typing module: https://google.github.io/styleguide/pyguide.html#2241-exemptions


Successfully merging this pull request may close issue #2301.

2 participants