Silencing stdout/stderr by default makes some errors extremely hard to debug
exhuma opened this issue · 7 comments
Using pep517 to extract the project version number resulted in the following error:
```
Traceback (most recent call last):
  File "/usr/local/bin/get-version", line 7, in <module>
    meta = pep517.meta.load(".")
  File "/usr/local/lib/python3.9/site-packages/pep517/meta.py", line 71, in load
    path = Path(build_as_zip(builder))
  File "/usr/local/lib/python3.9/site-packages/pep517/meta.py", line 58, in build_as_zip
    builder(dest=out_dir)
  File "/usr/local/lib/python3.9/site-packages/pep517/meta.py", line 53, in build
    _prep_meta(hooks, env, dest)
  File "/usr/local/lib/python3.9/site-packages/pep517/meta.py", line 28, in _prep_meta
    reqs = hooks.get_requires_for_build_wheel({})
  File "/usr/local/lib/python3.9/site-packages/pep517/wrappers.py", line 179, in get_requires_for_build_wheel
    return self._call_hook('get_requires_for_build_wheel', {
  File "/usr/local/lib/python3.9/site-packages/pep517/wrappers.py", line 329, in _call_hook
    self._subprocess_runner(
  File "/usr/local/lib/python3.9/site-packages/pep517/wrappers.py", line 76, in quiet_subprocess_runner
    check_output(cmd, cwd=cwd, env=env, stderr=STDOUT)
  File "/usr/local/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/local/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['/usr/local/bin/python3', '/usr/local/lib/python3.9/site-packages/pep517/in_process/_in_process.py', 'get_requires_for_build_wheel', '/tmp/tmp5zbzjtns']' returned non-zero exit status 1.
```
Note that the line numbers may not be representative, as I had to resort to some fairly ugly print-debugging with a pinch of pdb.
After a lengthy session I found quiet_subprocess_runner, which completely suppresses any output. Wrapping that call in a try/except and printing the exception contents showed me this message:
b'0\n1\n2\nrunning egg_info\nerror: [Errno 13] Permission denied\n'
That message is pretty clear and helped me fix the issue in my environment.
But this should not have taken me over an hour to fix. At least emit a debug log entry or something with the error to make it easier to detect. My first attempt was actually to set the root logger level to DEBUG, but I couldn't see anything there either.
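For reference, the try/except wrapper mentioned above looked roughly like this (a sketch, not the exact code I ran):

```python
import subprocess
import pep517.meta

try:
    meta = pep517.meta.load(".")
except subprocess.CalledProcessError as exc:
    # check_output() stores the captured stdout/stderr on the exception,
    # so this is the only place the backend's error message survives.
    print(exc.output.decode(errors="replace"))
    raise
```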
The pep517.meta module is deprecated, along with everything that uses the envbuild module. This package has never been a good way to build packages; it was only implemented as a proof of concept. The build package provides an alternative: build.util.project_wheel_metadata().
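A minimal usage sketch, assuming the build package is installed (the returned object exposes the core metadata fields):

```python
from build.util import project_wheel_metadata

# Invokes the project's build backend and returns its core metadata.
meta = project_wheel_metadata(".")
print(meta["Name"])
print(meta["Version"])
```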
Thanks for this. This works for me. It is a bit surprising to see it construct a MIME message instance; I assume this is for PEP 566. As it contains all the values I need, I can work with this.
Yup, the canonical format for Python packaging metadata is based on email headers. When it was first defined in 2001, JSON either didn't exist or was something new and unfamiliar. And there's little benefit in changing the format, because tools still have to work with all the existing packages.
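For illustration only (hand-written sample data, not any tool's API), that format is parsed with the standard library's email header machinery:

```python
from email.parser import HeaderParser

# A tiny, made-up example of the core metadata format.
raw = "Metadata-Version: 2.1\nName: example\nVersion: 1.2.3\n"
msg = HeaderParser().parsestr(raw)
print(msg["Name"], msg["Version"])  # -> example 1.2.3
```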
I have a question.
I'm building almost all of my rpm packages with a pep517-based build procedure using the build module. Whenever a module uses sphinx to generate documentation, I also try to build that documentation as a man page.
Looking at this, I think it should be possible to use build.util.project_wheel_metadata() to extract the project version and other bits in a sphinx conf.py file.
Q: May I ask for opinions on using that as a kind of generic/template method? 🤔
I'm asking because I see all kinds of methods used to extract, for example, the current version: from using the pkg_resources or importlib-metadata modules, to extracting that bit from a version.py file (and many more variations, sometimes with really strange approaches).
The issue with pkg_resources or importlib-metadata is that those modules need the generated .dist-info or .egg-info metadata files to extract that information. But with, for example, whey, poetry, hatchling and a few other pep517 backends, those files are not generated during the build execution as they are in the case of setuptools, which forces you to use for example tox or virtualenv to be able to use sphinx without having the modules installed.
I haven't tested build.util.project_wheel_metadata() yet, but at first glance it seems that it doesn't need the actual metadata files to be present, so it looks like a perfect candidate for a generic way of extracting the version and other module metadata. Am I right?
I would really appreciate any comments on the above .. and sorry for the slightly off-topic question.
I think from what you describe, build.util.project_wheel_metadata() should be useful.
importlib.metadata (the new way), importlib-metadata (the same thing backported to older Python versions) and pkg_resources (the older way) are all meant to extract metadata about installed packages. With setuptools it's easy to generate the metadata and then fudge things a bit so those tools see it like an installed package, but the standard way to use them is to install the package somewhere first.
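For instance (a small sketch; "build" here is just an example of an already-installed distribution):

```python
from importlib.metadata import metadata, version

# Both calls only work for a package that is already installed.
print(version("build"))            # just the version string
print(metadata("build")["Name"])   # full core metadata, email-header style
```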
build.util.project_wheel_metadata() is meant to get the metadata from a package's source tree (a git checkout, or an unpacked sdist). This involves invoking the backend in a subprocess (and possibly downloading and installing the backend), so it's a bit slower, but it's roughly the standards-based equivalent of what I imagine you were doing with setuptools to get metadata before.
As you've noticed, what projects do in both their Sphinx conf.py and their own module code to get the version number is entirely up to them. The Sphinx config might assume that the package is already properly installed, or that it's importable, or that it's in a git checkout, or anything. That's unfortunately going to be a pain for downstream maintainers wanting to build the docs.
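That said, a hedged sketch of a conf.py that pulls the metadata from the source tree might look like this (the docs/ layout one directory below the repository root is an assumption, not something prescribed here):

```python
# docs/conf.py (sketch): derive name and version from the source tree via the backend.
from pathlib import Path
from build.util import project_wheel_metadata

_src = Path(__file__).resolve().parent.parent   # assumed: repo root with pyproject.toml
_meta = project_wheel_metadata(str(_src))

project = _meta["Name"]
release = _meta["Version"]
version = ".".join(release.split(".")[:2])      # short X.Y form
```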
While a bit off topic, the following might help someone coming across this via google-searches:
I've been using yq (which has TOML support) pretty successfully lately for automation, especially since most of the interesting project metadata is now well defined via PEP 621 and is therefore independent of the build backend.
With that, extracting the version number is as simple as tomlq -r .project.version pyproject.toml
Don't forget that the spec allows for version (along with most other fields) to be dynamic, so reading it directly from pyproject.toml won't work for every package. But obviously where it does work, it's likely to be quicker than calling a build backend to find the version number.
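One way to guard against that (a sketch using the stdlib tomllib, Python 3.11+; tomli is the usual substitute on older versions) is to check whether version is declared dynamic before trusting the direct read:

```python
import tomllib

with open("pyproject.toml", "rb") as f:
    project = tomllib.load(f).get("project", {})

if "version" in project.get("dynamic", []):
    # Fall back to asking the backend, e.g. build.util.project_wheel_metadata(".")
    print("version is dynamic; query the build backend instead")
else:
    print(project["version"])
```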