[WIP] Interoperability with other Python binding frameworks #5800
See also wjakob/nanobind#1140, the same feature for nanobind.
pymetabind is a proposed standard that lets Python <-> C-ish binding frameworks find and work with each other's types. For example, assuming versions of both pybind11 and nanobind that have adopted this standard, it would allow a pybind11-bound function to accept a parameter whose type is bound using nanobind, or to return such a type, or vice versa. Interoperability between different ABI versions or different domains of the same framework is supported on the same terms as interoperability between different frameworks. Compared to the `_pybind11_conduit_v1_` API, this one also supports implicit conversions and to-Python conversions, and should have significantly less overhead.

The essence of this technique has been in production use by my employer for a couple of years now, enabling a large amount of pybind11 binding code to be ported to nanobind one compilation unit at a time. Almost everything that works natively works across framework boundaries too, at only a minor performance cost. Inheritance relationships and relinquishment (from-Python conversion of `unique_ptr<T>`) don't work cross-framework, and some of the more subtle corners of `shared_ptr` support probably don't transfer over: if you try to get a `shared_ptr<T>` from a foreign `T`, you'll either get a new `shared_ptr` control block whose deleter drops a Python object reference, or a new reference to the `enable_shared_from_this` `shared_ptr` if `T` has one of those.

This PR adds pybind11 support for exposing pybind11 types to pymetabind for other frameworks to use ("exporting") and for using other frameworks' types that they have exposed to pymetabind ("importing"). Types bound by a framework other than the specific internals version of pybind11 that an extension module links against are called "foreign" to that module. This PR does not introduce an ABI break, but there are some ways it could be simplified and sped up if/when you're willing to take one. I know I just missed 3.0, so I'm guessing that won't be for a while. :-)
One notable impact of this PR is that pybind11 now generates copy and move constructors for every type eagerly, instead of generating them lazily when an instance of that type is first returned from a bound function. This is needed because we might be asked to copy or move an instance of that type via the foreign binding framework even though it is never returned from a pybind11-bound function. The new constructors can break code that previously worked: if a type has no copy constructor, and none of its immediate members are non-copyable, but some of their subobjects are non-copyable, it will look copyable to the std type traits, yet actually generating the copy constructor produces an error. Similarly, you can now get "definition of implicit copy constructor is deprecated because it has a user-declared destructor" warnings on types that previously didn't produce them. I'm not overly worried about this, since it's no worse than what would already happen when returning even a `const Foo&`, but we could consider a `#define` that suppresses the constructor generation if you think this is too much of an upgrade hazard.

Current status: nominally code complete and existing tests pass, but I haven't added interop-specific tests or public-facing docs yet.
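The "looks copyable to the type traits, but generating the copy constructor errors" hazard can be reproduced with standard containers alone. `std::vector` declares its copy constructor unconditionally, so a type holding `std::vector<std::unique_ptr<int>>` reports as copy-constructible, yet instantiating the implicit copy constructor is a hard compile error (`Holder` here is an illustrative type, not from the PR):

```cpp
#include <memory>
#include <type_traits>
#include <vector>

// No user-declared copy constructor; the only member *declares* a copy
// constructor (vector does so unconditionally), so the type traits report
// Holder as copyable even though its element type is move-only.
struct Holder {
    std::vector<std::unique_ptr<int>> items;
};

static_assert(std::is_copy_constructible_v<Holder>,
              "the trait says 'copyable'...");

// ...but this would fail to compile once the copy constructor is actually
// instantiated, e.g. by pybind11's eager constructor generation:
//     Holder a;
//     Holder b = a;  // error: call to deleted unique_ptr copy constructor
```

This is exactly why eager generation can newly break code that compiled before: the lazy scheme only instantiated the constructor for types actually returned by value from bound functions.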
Performance: due to the inability to modify `internals` and `type_info` without an ABI break, once pybind11 knows about any foreign types, all failed casts incur an additional map lookup. Overhead should still be low for people who don't use the interoperability features at all. I haven't measured it.
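The shape of that extra cost can be sketched as a second hash-map probe on the miss path. The registries and names below are hypothetical simplifications, not the actual pybind11 data structures:

```cpp
#include <typeindex>
#include <typeinfo>
#include <unordered_map>

// Hypothetical stand-ins: the existing internals registry of locally bound
// types, and the new side map of types imported from foreign frameworks.
std::unordered_map<std::type_index, const char *> local_types;
std::unordered_map<std::type_index, const char *> foreign_types;

// Simplified cast lookup: once any foreign types are registered, every
// local-registry miss pays for one additional probe before failing.
const char *find_binding(std::type_index t) {
    if (auto it = local_types.find(t); it != local_types.end())
        return it->second;                     // fast path, unchanged cost
    if (auto it = foreign_types.find(t); it != foreign_types.end())
        return it->second;                     // the additional map lookup
    return nullptr;                            // cast genuinely fails
}
```

Successful casts of locally bound types keep their old cost; only misses (including genuinely failing casts) pay for the second lookup.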
Things that need to happen before this can be released:
- [ ] add unit tests
- [ ] add user-facing documentation
- [ ] test correctness of nanobind/pybind11 interop
- [ ] test performance
- [ ] solicit feedback from maintainers of other binding libraries
- [ ] release pymetabind v1.0, incorporating said feedback
📚 Documentation preview 📚: https://pybind11--5800.org.readthedocs.build/