Minor: Error handling and minor fixes for ShuffleCodec #49

Open · wants to merge 1 commit into main
Conversation

ccciudatu

The following changes are required to allow the ShuffleCodec to be used alongside other extension codecs, but this patch makes sense regardless of whether additional extensions are supported:

  • Replace unreachable!() with error results, so that fallback extension codecs for custom physical plans can be added instead of panicking (see the sketch after this list)
  • Leverage the built-in recursive plan serialization instead of redundantly serializing the child plan inside the shuffle writer
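
A minimal sketch of the first bullet, assuming a PhysicalExtensionCodec implementation shaped like datafusion-proto's trait. The type name ShuffleCodecSketch is illustrative rather than the actual type touched by this PR, and the exact trait signature may vary across datafusion-proto versions:

```rust
use std::sync::Arc;

use datafusion::error::{DataFusionError, Result};
use datafusion::execution::FunctionRegistry;
use datafusion::physical_plan::ExecutionPlan;
use datafusion_proto::physical_plan::PhysicalExtensionCodec;

// Illustrative stand-in for the real ShuffleCodec.
#[derive(Debug)]
struct ShuffleCodecSketch;

impl PhysicalExtensionCodec for ShuffleCodecSketch {
    fn try_decode(
        &self,
        buf: &[u8],
        _inputs: &[Arc<dyn ExecutionPlan>],
        _registry: &dyn FunctionRegistry,
    ) -> Result<Arc<dyn ExecutionPlan>> {
        // Previously an unrecognised node would hit `unreachable!()` and panic.
        // Returning an Err instead lets a composing codec fall back to another extension.
        Err(DataFusionError::Internal(format!(
            "ShuffleCodec cannot decode this extension node ({} bytes)",
            buf.len()
        )))
    }

    fn try_encode(&self, _node: Arc<dyn ExecutionPlan>, _buf: &mut Vec<u8>) -> Result<()> {
        // Same on the encode side: an unknown custom physical plan yields an error,
        // not a panic, so other registered codecs get a chance to handle it.
        Err(DataFusionError::Internal(
            "ShuffleCodec cannot encode this custom physical plan".to_string(),
        ))
    }
}
```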

// No need to redundantly serialize the child plan, as input plan(s) are recursively
// serialized by PhysicalPlanNode and will be available as `inputs` in `try_decode`.
// TODO: remove this field from the proto definition?
plan: None,
ccciudatu (Author)

@austin362667 is this assessment correct, or is there a reason I missed for serialising the child plan as part of the ShuffleWriterExecNode proto instead of relying on the built-in recursive input plan serialisation?

As a side note, this change is needed so that we don't assume the ShuffleCodec (i.e. self in the try_from_physical_plan(writer.plan.clone(), self) call) is the only extension codec needed to serialise the whole plan; it allows additional extensions to be registered without requiring this particular codec to be aware of them.
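
As a hypothetical illustration of that composition, the sketch below defines a fallback codec that delegates to several PhysicalExtensionCodecs in order and succeeds only if one of them does, which is exactly what replacing unreachable!() with error results enables. The FallbackCodec name is made up for this example; recent datafusion-proto releases ship a similar ComposedPhysicalExtensionCodec, if that is available in the version in use.

```rust
use std::sync::Arc;

use datafusion::error::{DataFusionError, Result};
use datafusion::execution::FunctionRegistry;
use datafusion::physical_plan::ExecutionPlan;
use datafusion_proto::physical_plan::PhysicalExtensionCodec;

/// Hypothetical composing codec: tries each registered codec in turn and moves
/// on when one returns an error (which is why `unreachable!()` had to go).
#[derive(Debug)]
struct FallbackCodec {
    codecs: Vec<Arc<dyn PhysicalExtensionCodec>>,
}

impl PhysicalExtensionCodec for FallbackCodec {
    fn try_decode(
        &self,
        buf: &[u8],
        inputs: &[Arc<dyn ExecutionPlan>],
        registry: &dyn FunctionRegistry,
    ) -> Result<Arc<dyn ExecutionPlan>> {
        for codec in &self.codecs {
            if let Ok(plan) = codec.try_decode(buf, inputs, registry) {
                return Ok(plan);
            }
        }
        Err(DataFusionError::Internal(
            "no registered extension codec could decode this node".to_string(),
        ))
    }

    fn try_encode(&self, node: Arc<dyn ExecutionPlan>, buf: &mut Vec<u8>) -> Result<()> {
        for codec in &self.codecs {
            // Encode into a scratch buffer so a failed attempt leaves `buf` untouched.
            let mut scratch = Vec::new();
            if codec.try_encode(Arc::clone(&node), &mut scratch).is_ok() {
                buf.extend_from_slice(&scratch);
                return Ok(());
            }
        }
        Err(DataFusionError::Internal(
            "no registered extension codec could encode this node".to_string(),
        ))
    }
}
```

The whole plan would then be serialised via the same call quoted above, e.g. PhysicalPlanNode::try_from_physical_plan(plan, &fallback_codec), and the ShuffleCodec never needs to know which other extension codecs are registered.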
