pybind/pybind11

How to expose a custom type derived from a pure Python type

Opened this issue · 8 comments

Issue description

My goal is to create a wrapper for Python's built-in file type.
To do that, I created a C++ class (with no base class) that I want to expose as a subclass of file, which I cannot do since file itself isn't exported by pybind11.

Is there a way to support this pattern with this library? (I couldn't find this workflow in the docs.)

Inheriting directly from Python classes like that isn't feasible: method availability isn't determined until runtime in Python, but needs to be known during compilation in C++.

What you'll need to do instead is write a C++ class that provides the methods you want and maps them onto Python function calls. So, for example, this class could store a py::object file_; containing the Python file object, then have a std::string readline() { return file_.attr("readline")().cast<std::string>(); } and similar methods for whatever other parts of the API you want to expose.
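Here's a minimal, self-contained sketch of that pattern (the names FileWrapper and filewrap are made up for illustration, and only a few methods are forwarded):

    #include <pybind11/pybind11.h>
    #include <string>
    #include <utility>

    namespace py = pybind11;

    // A C++ facade over a Python file object: it doesn't inherit from the
    // Python type, it just stores it and forwards calls.
    class FileWrapper {
    public:
        explicit FileWrapper(py::object file) : file_(std::move(file)) {}

        std::string readline() {
            return file_.attr("readline")().cast<std::string>();
        }

        void write(const std::string &data) { file_.attr("write")(data); }

        void close() { file_.attr("close")(); }

    private:
        py::object file_;  // the underlying Python file object
    };

    PYBIND11_MODULE(filewrap, m) {
        py::class_<FileWrapper>(m, "FileWrapper")
            .def(py::init<py::object>())
            .def("readline", &FileWrapper::readline)
            .def("write", &FileWrapper::write)
            .def("close", &FileWrapper::close);
    }

From Python, filewrap.FileWrapper(open("data.txt")) then behaves as a thin proxy for whichever methods you chose to forward.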

Thanks for the clarification.
I was really hoping that I could do something with the get_(local/global)_type_info() methods to wrap the original file with pybind11's type_info and enable this scenario.

Is there a way to write a short Python "bridge" so that isinstance(cpp_instance, python_type) would return True in this case? I have implemented all of the methods of the base class, but my understanding is that isinstance would still fail.

Have a look at __instancecheck__, an 'operator overload' which can be implemented on a Python metaclass.

For example, I've recently found out that io.IOBase is not in the MRO of the file object you get back from calling open(...) (though _io._IOBase is), yet isinstance against io.IOBase still returns True, so I'm assuming this kind of 'interception' is also used there. That might be what you're looking for?
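To sketch what that interception could look like from the pybind11 side (FileWrapper and the module name bridge are hypothetical): io.IOBase's metaclass is abc.ABCMeta, whose __instancecheck__ consults classes registered via register(), so registering the bound class makes isinstance pass without any real inheritance:

    #include <pybind11/pybind11.h>

    namespace py = pybind11;

    class FileWrapper { /* forwards to a stored py::object, as above */ };

    PYBIND11_MODULE(bridge, m) {
        auto cls = py::class_<FileWrapper>(m, "FileWrapper");

        // After this call, isinstance(obj, io.IOBase) reports True for
        // FileWrapper instances even though io.IOBase never appears in
        // FileWrapper's MRO -- ABCMeta.__instancecheck__ does the work.
        py::module_::import("io").attr("IOBase").attr("register")(cls);
    }

Writing your own metaclass with __instancecheck__ gives the same effect when there is no existing ABC to register with.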

Awesome thanks!

I believe I may have run into a situation where I now need this too: I am tinkering with making a pybind11 object a custom user type in NumPy, since operator== and dense referencing using buffers / Eigen::Ref<> don't really map onto sparse object matrices (see RobotLocomotion/drake#8116).

In looking at the NumPy example code, it looks like the object has to inherit from PyGenericArrType_Type / numpy.generic for certain things to work (I think dtype inference with np.array(...), and possibly ensuring that stuff like np.zeros doesn't puke?).

My current thoughts on working around this are to (a) write a wrapper / shim class - blech - or (b) just avoid trying to shove pybind11 into numpy, though this would complicate C++ interfaces for reference semantics (but should be handled well by using a type_caster).
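For reference, the type_caster route in option (b) has roughly this shape; MySymbol is a stand-in type for illustration, and a real caster aiming at reference semantics would need extra care around ownership (this one copies by value):

    #include <pybind11/pybind11.h>
    #include <string>

    namespace py = pybind11;

    // Stand-in C++ type converted by value at the language boundary.
    struct MySymbol { std::string name; };

    namespace pybind11 { namespace detail {

    template <> struct type_caster<MySymbol> {
    public:
        // const_name is the pybind11 2.9+ spelling (older releases use _).
        PYBIND11_TYPE_CASTER(MySymbol, const_name("MySymbol"));

        // Python -> C++: accept a str and copy its contents in.
        bool load(handle src, bool /*convert*/) {
            if (!py::isinstance<py::str>(src))
                return false;
            value.name = py::cast<std::string>(src);
            return true;
        }

        // C++ -> Python: hand back a new str.
        static handle cast(const MySymbol &src, return_value_policy, handle) {
            return py::str(src.name).release();
        }
    };

    }}  // namespace pybind11::detail

A caster like this converts on every boundary crossing, which sidesteps NumPy entirely, but it also means the Python side never holds a reference to the C++ object.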

If you can easily extend both classes in your Python code, I'd strongly suggest doing that. The bindings can't extend anything they are not aware of at build time.

Thanks! Unfortunately, it was rather painful (or rather, confusing) to try to make a separate np.generic-derived class and have it somehow tied to a bound NumPy class (I've already done some proxying for const-ness, which is not exactly a behavior I want to promote even more).

In the end, I just ended up abusing py::class_ with a different instance registry than pybind11's, which let me do the dirty things I wanted. It kinda sucks, but it was a ton easier than trying to fully debug inheritance at this point:
https://github.com/EricCousineau-TRI/repro/blob/c6345c87b18a8a40dd9ebd5931fafef6c7f71af0/python/pybind11/dtype_stuff/dtype_class.h#L246