Rename repository
wjakob opened this issue · 13 comments
Hi Sylvain,
Dean Moldovan contributed a new repository named cmake_example. I was considering renaming it to something more consistent, e.g. pip_example or python_example. Any thoughts on this?
Thanks,
Wenzel
No problem renaming it to python_example (since we also provide a conda recipe). Quick question: in which case do you think it is preferable to use a cmake example?
Big complex projects that are mainly developed on the C++ side (i.e. where the pybind11 plugin just constitutes a small "bonus" component that may or may not even be turned on)
Then would it make sense for that small "bonus" pybind11-based plugin to be a separate thing?
An interesting example for a python_example-style thing would be a pip-installable project where the build_ext step uses CMake.
I don't want to get into philosophical discussions about when it makes sense to use CMake to compile pybind11 projects, vs pip :). There is clearly demand to do this, and it's convenient in C++11 codebases that all use CMake by default.
After patches by Dean Moldovan, the necessary bits could not be more succinct: https://github.com/pybind/cmake_example/blob/master/CMakeLists.txt
Hi Sylvain,
Just to chime in here, the cmake_example is already pip-installable (see the setup.py file there, which extends build_ext). It can also build wheels and upload to PyPI as expected, although right now some manual listing of the source files is required. But I hope to improve that and also provide more documentation for the whole thing.
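For readers who haven't opened the repository, the general shape of that setup.py integration is roughly the following. This is a minimal sketch only: the class names (CMakeExtension, CMakeBuild) and the cmake flags are illustrative, not necessarily the exact ones used in cmake_example.

```python
# Sketch: driving CMake from setuptools' build_ext, in the spirit of
# pybind/cmake_example. Class names and cmake arguments are illustrative.
import os
import subprocess

from setuptools import Extension
from setuptools.command.build_ext import build_ext


class CMakeExtension(Extension):
    """An Extension whose sources are built by CMake, not by setuptools."""

    def __init__(self, name, sourcedir=""):
        # Pass no sources so setuptools does not try to compile anything itself.
        Extension.__init__(self, name, sources=[])
        self.sourcedir = os.path.abspath(sourcedir)


class CMakeBuild(build_ext):
    """Configure and build each CMakeExtension by invoking cmake."""

    def build_extension(self, ext):
        # Tell CMake to drop the compiled module where setuptools expects it.
        extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))
        cmake_args = ["-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=" + extdir]
        os.makedirs(self.build_temp, exist_ok=True)
        subprocess.check_call(["cmake", ext.sourcedir] + cmake_args,
                              cwd=self.build_temp)
        subprocess.check_call(["cmake", "--build", "."], cwd=self.build_temp)
```

A setup.py would then wire this in with something like setup(ext_modules=[CMakeExtension("example")], cmdclass={"build_ext": CMakeBuild}), so pip drives CMake transparently.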
@dean0x7d thanks!
I was looking at the cmake example, it looks really cool.
Looking at the source for cmake_example, I would recommend not using a submodule for pybind11 but rather having it as a pip dependency.
The reason is that if you have other modules that are built with a pip-installed (and different) version, you might end up with binary-incompatible packages (for example, if your extension also exposes headers that include pybind11).
By the way, I tried using the cmake example and there might be a couple of things to fix (Unknown CMake command "pybind11_add_module").
You need to call git submodule update --init --recursive. I am greatly in favor of the recursive git submodule approach rather than the pip-installed pybind11 version. A particular project may depend on a specific version of pybind11 (incompatible changes DO happen, for instance in the upcoming release).
The issue with ABI incompatibility among different pybind11 releases was solved a few versions ago by namespacing everything.
Having had to deal with a large project based on submodules, I think that submodules are a good fit for a dependency used by only one project, but they end up being extremely inadequate for things that may be used by multiple packages.
Relying on a package manager (like conda) and correct version numbering that accounts for ABI compatibility is much more manageable, especially for big complex projects.
If you need a pre-release version, you can always refer to a [pre/alpha/beta/dev] prerelease.
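Concretely, pip's version specifiers can opt into pre-releases explicitly. A requirements sketch (the version numbers below are purely illustrative, not actual pybind11 releases):

```
# requirements.txt -- version numbers are hypothetical examples
pybind11==2.0.0.dev1        # pin an exact dev prerelease

# or, on the command line, allow pre-releases within a range:
#   pip install --pre "pybind11>=2.0,<2.1"
```

Pip ignores pre-releases by default, so a project only picks one up when it is named exactly or when --pre is passed.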
The issue with ABI incompatibility among different pybind11 releases was solved a few versions ago by namespacing everything.
What do you mean?
Let's not derail this into a discussion about software engineering :). There are some projects that use this git submodule mechanism (e.g. all of my C++ projects with bindings), and it's a reasonable thing to do in this case (though it may not be in other situations).
Regarding namespacing: if you have two extension modules (e.g. libA and libB) that are compiled with different and incompatible versions of pybind11, the global Python "capsules" that are used to store pybind11's internal state are decoupled, which means that they won't corrupt each other. So it's no problem for these to coexist in the same interpreter.
Of course, if libA references types defined by libB then you'll still get into trouble. (TypeErrors rather than segfaults though). In that case, I find it reasonable to expect that the user would compile libA and libB with the same version of pybind11.
Of course, if libA references types defined by libB then you'll still get into trouble. (TypeErrors rather than segfaults though). In that case, I find it reasonable to expect that the user would compile libA and libB with the same version of pybind11.
Yes, that is the case we are in. Multiple extension modules that:
- expose python bindings at runtime,
- expose headers and import each other at the C level,
- have build-time (inequality) requirements on the pybind11 version.
So I prefer the solution where everyone uses the same version of pybind11, provided by the package manager.
I am not too concerned about pre-release versions, since there seem to be regular releases of the package.
I was actually thinking of having both the submodule and the package dependency as possibilities. I believe it could be made pretty seamless and automatically prevent disaster cases, e.g. using one pybind11 version in a submodule for development but then distributing with a different version as a pip dependency.
It could be that I'm thinking about it too naively right now, so I'll get back to you when I work out something more concrete.
In any case, I would not want to lose the submodule approach because it allows CMake to work independently, which is very important for the workflow of C++-centric projects. So for right now I'll focus on improving the submodule-based system: the setup.py integration needs to be made a little more robust.