pypi/warehouse

Cannot upload with external dependency due to "Invalid value for requires_dist"

Closed this issue · 6 comments

What's the problem this feature will solve?

I cannot upload a fully functioning wheel to PyPI:

HTTPError: 400 Bad Request from https://upload.pypi.org/legacy/
Invalid value for requires_dist. Error: Can't have direct dependency: 'dictdiffer @ git+git://github.com/mafrosis/dictdiffer@merge-ignore'
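
For illustration, the requirement in the error is what PEP 508 calls a direct reference, and it typically ends up in the wheel's metadata from a declaration along these lines (a minimal sketch, assuming a setuptools setup.py; the package name and version are placeholders):

from setuptools import setup

setup(
    name="my-package",   # placeholder name for the package being uploaded
    version="0.1.0",     # placeholder version
    install_requires=[
        # PEP 508 direct reference ("name @ URL"), copied from the error above;
        # setuptools writes it verbatim into the wheel's Requires-Dist metadata,
        # which the PyPI upload endpoint then rejects.
        "dictdiffer @ git+git://github.com/mafrosis/dictdiffer@merge-ignore",
    ],
)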

Describe the solution you'd like

Allow packages to be uploaded when they reference an externally hosted wheel dependency.

Additional context

Apologies if this is a repost of an old issue (such as #7136).

My use-case is when a dependency of my package has a bug which I need to fix, but its maintainers are unable to release the patch to PyPI. This effectively blocks new releases of my package.

A simple solution is to temporarily depend on a binary distributed on a platform other than PyPI (e.g. GitHub).

In this specific instance, I packaged a wheel on my GitHub fork and updated my dependencies to point at this new binary target (instead of just pointing at a branch of the source repo).

At this point, everything works - distutils is happy and pip can install. I just can't share the work with anyone via PyPI.
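
As a sketch of that wheel-based variant (the release URL and wheel filename below are hypothetical, not the actual fork's assets), the only change from the sketch after the error above is the target of the direct reference:

from setuptools import setup

setup(
    name="my-package",   # placeholder name
    version="0.1.1",     # placeholder version
    install_requires=[
        # Direct reference to a built wheel hosted outside PyPI; pip can
        # install this locally, but PyPI still refuses the upload because the
        # resulting Requires-Dist entry contains a URL.
        "dictdiffer @ https://github.com/mafrosis/dictdiffer/releases/"
        "download/v0.8.1/dictdiffer-0.8.1-py2.py3-none-any.whl",
    ],
)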

di commented

Thanks for filing the issue, but this is indeed a duplicate of #7136 and we don't have any plans to change the behavior here.

If the project in question is having issues releasing to PyPI, feel free to have them file an issue and we can attempt to help.

Thanks for coming back to me quickly with an answer.

For future readers, I'd like to note that I do think this is a real issue. Playing devil's advocate for a moment:

Imagine that a dependency of my package has a critical security flaw which needs to be patched immediately, but its CI is broken and its maintainer is on holiday. Where does that leave all the packages that need to ship a patch?

To be clear, I'm not saying someone needs to fix this - but I do think this restriction prevents a certain class of very important fixes from being released to the community.

di commented

Consider what would happen if you were able to do that: the release would be forever tied to your direct dependency; you would need to maintain the git repo indefinitely, because if it went away that version would break; and the release wouldn't benefit once the affected/forked project made a proper new release, as it would keep using the direct dependency.

While in this specific case the existing behavior would mean a potentially slower "fix" for your project, in the long term it results in a much less broken, fractured & forked ecosystem.

I understand, and respect that it's not my call :)

To answer your question - I expect to "yank" that release once the fix makes it upstream, so for some short, indeterminate amount of time the "hacked" release would continue to work, but it would likely break in the future. It should be up to my project to determine how short that "short time" is for our ecosystem and users.

You can fork the project on PyPI: push your own version of the package as a new package that you control, ideally with a long, explicit name and a very clear note in the short and long descriptions that this is a temporary fork, and have your downstream package reference your fork instead of the real one. It's the equivalent of your git solution, but without git.

That's absolutely right. I'm not sure, though, that this is a better result than simply allowing packages to depend on binaries hosted on GitHub.

https://pypi.org/project/dictdiffer-jira-offline-fork/
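
For future readers, the fork-on-PyPI approach from the comment above looks roughly like this in the downstream package (a sketch; the fork name comes from the link above, while the package name and version are placeholders):

from setuptools import setup

setup(
    name="my-package",   # placeholder name
    version="0.1.2",     # placeholder version
    install_requires=[
        # Temporary, explicitly named fork published on PyPI instead of a
        # direct URL, so the upload is accepted. Swap back to "dictdiffer"
        # once the fix is released upstream.
        "dictdiffer-jira-offline-fork",
    ],
)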