Question on building a hybrid C/C++ extension.
mreineck opened this issue · 13 comments
While trying to use pybind11 for interfacing a hybrid C/C++ codebase, following your example project, I encountered the following problem:
If the source files are a mix of C and C++ files, some compilers (most notably clang) complain and stop if the "-std=c++14" flag is supplied when compiling a C file.
This is of course completely reasonable, but even after extensive searching on the web I did not find a way to specify separate C and C++ compilation flags to setuptools (analogous to the CFLAGS and CXXFLAGS used by autotools).
Do you happen to know a trick for achieving this? If so, I think it would be great if it could be added to your test project to illustrate how it is done!
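For readers landing here: one workaround that keeps a single extension module is to subclass build_ext and strip C++-only flags whenever a plain C source is being compiled. The sketch below is illustrative only and uses hypothetical module/source names; it hooks the private _compile method of the Unix compiler class in distutils, so it will not work with MSVC and may break with future setuptools versions.

```python
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext


class BuildExtPerLanguage(build_ext):
    """Drop C++-only flags (e.g. -std=c++14) when compiling .c files."""

    def build_extensions(self):
        # The Unix compiler calls self.compiler._compile() once per source
        # file, so wrapping it lets us filter flags per file. This is a
        # private distutils hook and does not exist for MSVC.
        original_compile = self.compiler._compile

        def compile_with_filtered_flags(obj, src, ext, cc_args,
                                        extra_postargs, pp_opts):
            flags = list(extra_postargs)
            if src.endswith(".c"):
                # Clang aborts on '-std=c++14' for C sources, so remove it.
                flags = [f for f in flags if not f.startswith("-std=c++")]
            original_compile(obj, src, ext, cc_args, flags, pp_opts)

        self.compiler._compile = compile_with_filtered_flags
        super().build_extensions()


# Hypothetical mixed C/C++ extension.
ext = Extension(
    "hybrid._impl",
    sources=["src/core.c", "src/bindings.cpp"],
    extra_compile_args=["-std=c++14"],
)

setup(
    name="hybrid",
    ext_modules=[ext],
    cmdclass={"build_ext": BuildExtPerLanguage},
)
```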
I do not know if it is possible for your use case, but you could try moving the C code into its own extension module. You would then have a C extension module and a C++ extension module that links to the C module, and you can use separate flags for each module.
In my project, I use multiple extension modules for other reasons, but also require linking between them: https://github.com/blue-yonder/turbodbc/blob/master/setup.py#L116 (get_extension_modules() function).
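A trimmed-down illustration of that layout (hypothetical names, not the actual turbodbc setup.py): each Extension gets its own extra_compile_args, so the C sources never see a C++ standard flag. How the C++ module actually links against the C one is the fiddly part and is handled in the turbodbc setup.py linked above.

```python
from setuptools import setup, Extension
import pybind11

# Plain C "library" extension: no Python bindings, C-only flags.
c_core = Extension(
    "mypkg._c_core",
    sources=["src/core.c"],
    include_dirs=["src"],
    extra_compile_args=["-std=c99"],
)

# pybind11 bindings: C++-only flags, picks up the C headers from src/.
cpp_bindings = Extension(
    "mypkg._bindings",
    sources=["src/bindings.cpp"],
    include_dirs=[pybind11.get_include(), "src"],
    language="c++",
    extra_compile_args=["-std=c++14"],
)

setup(name="mypkg", ext_modules=[c_core, cpp_bindings])
```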
I'm not sure whether that works in my case. The C library does not provide any Python interface; it is needed exclusively by the C++ code, so I would basically have to "misuse" the setup.py machinery to build/install a package with no Python interface at all. The likely sticking point is the C header files, which the C++ library has to locate somehow.
I'll give it a try!
My project has a libturbodbc "extension" that does not expose any Python stuff at all. The header files are easily found, since you have to put them in the package anyway. I think you will be fine!
Thanks a lot! Looking at your setup.py has helped me write my own version, which works almost perfectly now.
The only remaining problem is that pybind11.get_include() reports incorrect paths ... I installed it via 'pip install --user', but it gives me a path of "/usr/local/include/python2.7", where there are no headers, of course. But that's a different issue...
Use pybind11.get_include(True) to get the --user install path.
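For context, in the pybind11 versions current at the time of this thread the boolean argument switched between the prefix and the --user header locations; newer releases return the correct path without it. A minimal, hypothetical setup.py fragment:

```python
import pybind11

# Search both possible header locations (older pybind11 API; the
# user argument was later deprecated).
include_dirs = [
    pybind11.get_include(),      # prefix / virtualenv install
    pybind11.get_include(True),  # pip install --user location
]
```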
@mreineck I should point out that user installs are generally evil, especially for Python packages, Jupyter extensions, and headers!
The reason is that any Python installation will pick these up with higher precedence than the ones in its own prefix.
I should point out that user installs are generally evil
Less evil than sudo installs...
Well, the right way to do things is a prefix install...
I see that I'll need a lot more time before I grasp all the intricacies of Python packages ... but for the moment I think I have all the ingredients I need! Thank you all for the help!
A short off-topic question concerning the --user install comment by @SylvainCorlay:
I'm usually compiling/installing my custom pybind11 packages on a Sun Grid Engine compute cluster, where Python is installed at root level with several specifically compiled libraries (e.g. LBLAS), so I'm not going to install a new Python environment. I found installing via --user extremely useful: I obviously don't have root permission, I only need to install a package once and it works on every node (user directories are shared between compute nodes), and I can still use the packages installed at root level.
Is there an alternative, more canonical way to do what I need to do there? Otherwise I'd say --user is pretty useful in this case.
User installs only start being an issue when you use environments other than the root one, so you should be fine.
As soon as you have more than one environment, it starts being nearly impossible to keep a consistent state with user installs.
I think the final issue was fixed around pybind11 2.5.
The problem still exists with pybind11 2.6.2. Mixing C and C++ still fails when using clang:
error: invalid argument '-std=c++17' not allowed with 'C'