pypa/pip

Deprecate call to `setup.py install` when building a wheel failed for source distributions without pyproject.toml

sbidoul opened this issue · 107 comments

This is one of a series of deprecations meant to collect feedback on the ultimate goal of always installing source distributions by first building a wheel and then installing from it.

This specific issue is about the case when pip calls setup.py bdist_wheel and that fails.
In that case, pip currently displays the build error and continues installation by attempting a setup.py install.

In a future version, pip will not attempt setup.py install in that case, and will instead fail the installation right away.

Towards #8102
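The behaviour change described above can be sketched roughly as follows. This is illustrative shell only, not pip's actual code; `build_wheel` and `legacy_install` are hypothetical stand-ins for the real steps:

```shell
# Rough sketch of the deprecation; names are made up for illustration.
build_wheel() { return 1; }        # simulate "setup.py bdist_wheel" failing
legacy_install() { echo "Running setup.py install ..."; }

install_pip_20() {                 # current behaviour: warn, then fall back
    if build_wheel; then
        echo "Installing from built wheel"
    else
        echo "DEPRECATION: falling back to legacy 'setup.py install'"
        legacy_install
    fi
}

install_future() {                 # planned behaviour: fail right away
    if build_wheel; then
        echo "Installing from built wheel"
    else
        echo "ERROR: Failed building wheel" >&2
        return 1
    fi
}
```

The practical consequence for package authors is that a wheel build failure will become fatal, so fixing the `bdist_wheel` step (or adopting pyproject.toml) is the forward-compatible path.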

py -3.9 -m pip install -U pillow

fails; the output includes these messages:

  ERROR: Failed building wheel for pillow
  Running setup.py clean for pillow
Failed to build pillow
DEPRECATION: Could not build wheels for pillow which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
Installing collected packages: pillow
    Running setup.py install for pillow ... error
    ERROR: Command errored out with exit status 1:

using:

> py -3.9 -m pip --version
pip 20.2 from C:\Users\Doug\AppData\Local\Programs\Python\Python39\lib\site-packages\pip (python 3.9)
> py -3.9
Python 3.9.0b5 (tags/v3.9.0b5:8ad7d50, Jul 20 2020, 18:35:09) [MSC v.1924 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>>

On Windows 10 Pro 1909, fully patched as of 2020-07-30.

This has been reported as python-pillow/Pillow#4827

@djhenderson Your issue is not related. The setup.py install fallback deprecation simply shows a warning, and would not affect the error you see.

Thanks for clearly providing all the relevant information here @djhenderson. I think the issue tracker for Pillow (where you've filed an issue) is the correct place to get help with your issue. :)

How do you do this on a Mac?

I'm getting the following error.
Failed to build pyaudio
DEPRECATION: Could not build wheels for pyaudio which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at #8368.

I was surprised to see so many link-backs for a simple deprecation message, and even downvotes. And then I realised each of the linked issues simply contains an error message of someone failing to build an arbitrary package from source with pip install (which would contain a link to this issue), and people don’t realise the deprecation message has nothing to do with it.

Maybe we should only show the message if the setup.py install command succeeds, instead of unconditionally. This would spare us a lot of confused users blaming this issue for their problems.

@uranusjr I realised the same yesterday. With the current deprecation message we risk losing useful reports in the noise. See #8752 for a possible better approach.
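The suggestion above (only show the notice when the fallback actually succeeds) can be sketched like this. This is a hypothetical sketch, not the actual pip change; `legacy_install` is a stand-in, and `LEGACY_RC` just simulates its outcome:

```shell
# Illustrative only: emit the deprecation notice only when the legacy
# fallback succeeds, so genuine build failures stay visible.
legacy_install() { return "${LEGACY_RC:-0}"; }

fallback_with_notice() {
    if legacy_install; then
        echo "DEPRECATION: wheel build failed but 'setup.py install' succeeded"
    else
        echo "ERROR: both the wheel build and 'setup.py install' failed" >&2
        return 1
    fi
}
```

With this shape, a user whose build is simply broken sees only the build error, while a user silently relying on the legacy path still gets warned.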

@googleworm if the installation of pyaudio succeeded after displaying the deprecation warning, then please provide the detailed log here. If it ended with `Running setup.py install for pyaudio ... error` followed by a compilation error log, then your problem is not related to this issue and you should consult the pyaudio installation instructions for help.

@uranusjr @sbidoul I don't know if this is helpful, but for some reason, I received a link to this issue when my machine failed to build wheels for installing the regex package. I doubt this issue has anything to do with it, does it?

I am fairly new to Git and GitHub, so kindly excuse me if my comment is not very helpful. Thank you.


Here is the complete error log I received:

(base) Aditya's-MacBook-Pro: aditya$ pip install regex
Collecting regex
  Using cached regex-2020.7.14.tar.gz (690 kB)
Building wheels for collected packages: regex
  Building wheel for regex (setup.py) ... error
  ERROR: Command errored out with exit status 1:
   command: /Users/apple/opt/anaconda3/envs/test1/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/setup.py'"'"'; __file__='"'"'/private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-wheel-n2v32c8x
       cwd: /private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/
  Complete output (17 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.macosx-10.9-x86_64-3.8
  creating build/lib.macosx-10.9-x86_64-3.8/regex
  copying regex_3/__init__.py -> build/lib.macosx-10.9-x86_64-3.8/regex
  copying regex_3/regex.py -> build/lib.macosx-10.9-x86_64-3.8/regex
  copying regex_3/_regex_core.py -> build/lib.macosx-10.9-x86_64-3.8/regex
  copying regex_3/test_regex.py -> build/lib.macosx-10.9-x86_64-3.8/regex
  running build_ext
  building 'regex._regex' extension
  creating build/temp.macosx-10.9-x86_64-3.8
  creating build/temp.macosx-10.9-x86_64-3.8/regex_3
  gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/apple/opt/anaconda3/envs/test1/include -arch x86_64 -I/Users/apple/opt/anaconda3/envs/test1/include -arch x86_64 -I/Users/apple/opt/anaconda3/envs/test1/include/python3.8 -c regex_3/_regex.c -o build/temp.macosx-10.9-x86_64-3.8/regex_3/_regex.o
  xcrun: error: invalid active developer path (/Library/Developer/CommandLineTools), missing xcrun at: /Library/Developer/CommandLineTools/usr/bin/xcrun
  error: command 'gcc' failed with exit status 1
  ----------------------------------------
  ERROR: Failed building wheel for regex
  Running setup.py clean for regex
Failed to build regex
DEPRECATION: Could not build wheels for regex which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
Installing collected packages: regex
    Running setup.py install for regex ... error
    ERROR: Command errored out with exit status 1:
     command: /Users/apple/opt/anaconda3/envs/test1/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/setup.py'"'"'; __file__='"'"'/private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-record-fm1zm8k0/install-record.txt --single-version-externally-managed --compile --install-headers /Users/apple/opt/anaconda3/envs/test1/include/python3.8/regex
         cwd: /private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/
    Complete output (17 lines):
    running install
    running build
    running build_py
    creating build
    creating build/lib.macosx-10.9-x86_64-3.8
    creating build/lib.macosx-10.9-x86_64-3.8/regex
    copying regex_3/__init__.py -> build/lib.macosx-10.9-x86_64-3.8/regex
    copying regex_3/regex.py -> build/lib.macosx-10.9-x86_64-3.8/regex
    copying regex_3/_regex_core.py -> build/lib.macosx-10.9-x86_64-3.8/regex
    copying regex_3/test_regex.py -> build/lib.macosx-10.9-x86_64-3.8/regex
    running build_ext
    building 'regex._regex' extension
    creating build/temp.macosx-10.9-x86_64-3.8
    creating build/temp.macosx-10.9-x86_64-3.8/regex_3
    gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/apple/opt/anaconda3/envs/test1/include -arch x86_64 -I/Users/apple/opt/anaconda3/envs/test1/include -arch x86_64 -I/Users/apple/opt/anaconda3/envs/test1/include/python3.8 -c regex_3/_regex.c -o build/temp.macosx-10.9-x86_64-3.8/regex_3/_regex.o
    xcrun: error: invalid active developer path (/Library/Developer/CommandLineTools), missing xcrun at: /Library/Developer/CommandLineTools/usr/bin/xcrun
    error: command 'gcc' failed with exit status 1
    ----------------------------------------
ERROR: Command errored out with exit status 1: /Users/apple/opt/anaconda3/envs/test1/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/setup.py'"'"'; __file__='"'"'/private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-install-3ss2v4zg/regex/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /private/var/folders/4c/_r8vg69s4bb61j6syx4sdj900000gn/T/pip-record-fm1zm8k0/install-record.txt --single-version-externally-managed --compile --install-headers /Users/apple/opt/anaconda3/envs/test1/include/python3.8/regex Check the logs for full command output.

Also, please let me know if you could possibly help me with this issue. Thank you.

@adityagarg7 as you can see, the compilation error is the same for `Building wheel for regex` and `Running setup.py install for regex`, so you need to investigate your setup for compiling Python modules with C code. So if you don't mind I'll mark your comments as off-topic to keep this issue focused. That said, your confusion is understandable and we are working to clarify the deprecation warning.
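For what it's worth, the `xcrun: error: invalid active developer path` line in the log above usually means the Xcode Command Line Tools are missing on macOS. A commonly suggested fix, sketched with a guard so it only runs where it applies (`needs_clt` is a helper name made up for this sketch):

```shell
# Decide whether the Command Line Tools install should be attempted,
# based on the OS name and whether xcode-select reports a developer dir.
needs_clt() {
    [ "$1" = "Darwin" ] && ! xcode-select -p >/dev/null 2>&1
}

# Harmless on non-macOS systems: the guard simply skips the install.
if needs_clt "$(uname)"; then
    xcode-select --install    # installs the Xcode Command Line Tools
fi
```

After the tools are installed, the `gcc`/`xcrun` invocation in the build log should find a valid developer path.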

Windows 10, the same error with pysqlcipher3

$ pip install pysqlcipher3
Collecting pysqlcipher3
  Using cached pysqlcipher3-1.0.3.tar.gz (100 kB)
Building wheels for collected packages: pysqlcipher3
  Building wheel for pysqlcipher3 (setup.py): started
  Building wheel for pysqlcipher3 (setup.py): finished with status 'done'
  WARNING: Legacy build of wheel for 'pysqlcipher3' created no files.
  Command arguments: 'c:\sites\sylwester.tech\api\venv\scripts\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-r8ub57sq\\pysqlcipher3\\setup.py'"'"'; __file__='"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-r8ub57sq\\pysqlcipher3\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-wheel-1m4gkydz'
  Command output: [use --verbose to show]
  Running setup.py clean for pysqlcipher3
Failed to build pysqlcipher3
DEPRECATION: Could not build wheels for pysqlcipher3 which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
Installing collected packages: pysqlcipher3
    Running setup.py install for pysqlcipher3: started
    Running setup.py install for pysqlcipher3: finished with status 'done'
Successfully installed pysqlcipher3
(venv)

@sylwesterdigital We need to understand why setup.py bdist_wheel for this package does not generate a wheel. Could you try again with --verbose, and also run pip list after the installation command?

$ pip install --verbose pysqlcipher3

Using pip 20.2.2 from c:\users\flaboy\appdata\local\programs\python\python38\lib\site-packages\pip (python 3.8)
Non-user install because site-packages writeable
Created temporary directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-ephem-wheel-cache-ikdirft5
Created temporary directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-req-tracker-21e96s63
Initialized build tracking at C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-req-tracker-21e96s63
Created build tracker: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-req-tracker-21e96s63
Entered build tracker: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-req-tracker-21e96s63
Created temporary directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-install-enu1m14b
1 location(s) to search for versions of pysqlcipher3:
* https://pypi.org/simple/pysqlcipher3/
Fetching project page and analyzing links: https://pypi.org/simple/pysqlcipher3/
Getting page https://pypi.org/simple/pysqlcipher3/
Found index url https://pypi.org/simple
Looking up "https://pypi.org/simple/pysqlcipher3/" in the cache
Request header has "max_age" as 0, cache bypassed
Starting new HTTPS connection (1): pypi.org:443
https://pypi.org:443 "GET /simple/pysqlcipher3/ HTTP/1.1" 304 0
  Found link https://files.pythonhosted.org/packages/a4/06/1d56bdec3129eff6dd54323d249784ccd90ce03c8cae7870d45e434bae77/pysqlcipher3-1.0.3.tar.gz#sha256=694e5bbb6ece8a064bd55f261e54b9ffbb3af1784afdc4dce4948a0251a8a430 (from https://pypi.org/simple/pysqlcipher3/), version: 1.0.3
Given no hashes to check 1 links for project 'pysqlcipher3': discarding no candidates
Using version 1.0.3 (newest of versions: 1.0.3)
Collecting pysqlcipher3
  Created temporary directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-unpack-4ms5c69f
  Looking up "https://files.pythonhosted.org/packages/a4/06/1d56bdec3129eff6dd54323d249784ccd90ce03c8cae7870d45e434bae77/pysqlcipher3-1.0.3.tar.gz" in the cache
  Current age based on date: 20598953
  Ignoring unknown cache-control directive: immutable
  Freshness lifetime from max-age: 365000000
  The response is "fresh", returning cached response
  365000000 > 20598953
  Using cached pysqlcipher3-1.0.3.tar.gz (100 kB)
  Added pysqlcipher3 from https://files.pythonhosted.org/packages/a4/06/1d56bdec3129eff6dd54323d249784ccd90ce03c8cae7870d45e434bae77/pysqlcipher3-1.0.3.tar.gz#sha256=694e5bbb6ece8a064bd55f261e54b9ffbb3af1784afdc4dce4948a0251a8a430 to build tracker 'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-req-tracker-21e96s63'
    Running setup.py (path:C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-install-enu1m14b\pysqlcipher3\setup.py) egg_info for package pysqlcipher3
    Created temporary directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m
    Running command python setup.py egg_info
    running egg_info
    creating C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m\pysqlcipher3.egg-info
    writing C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m\pysqlcipher3.egg-info\PKG-INFO
    writing dependency_links to C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m\pysqlcipher3.egg-info\dependency_links.txt
    writing top-level names to C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m\pysqlcipher3.egg-info\top_level.txt
    writing manifest file 'C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m\pysqlcipher3.egg-info\SOURCES.txt'
    reading manifest file 'C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m\pysqlcipher3.egg-info\SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    warning: no previously-included files matching '*~' found anywhere in distribution
    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    writing manifest file 'C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-pip-egg-info-9ai39d2m\pysqlcipher3.egg-info\SOURCES.txt'
  Source in c:\users\public\documents\wondershare\creatortemp\pip-install-enu1m14b\pysqlcipher3 has version 1.0.3, which satisfies requirement pysqlcipher3 from https://files.pythonhosted.org/packages/a4/06/1d56bdec3129eff6dd54323d249784ccd90ce03c8cae7870d45e434bae77/pysqlcipher3-1.0.3.tar.gz#sha256=694e5bbb6ece8a064bd55f261e54b9ffbb3af1784afdc4dce4948a0251a8a430
  Removed pysqlcipher3 from https://files.pythonhosted.org/packages/a4/06/1d56bdec3129eff6dd54323d249784ccd90ce03c8cae7870d45e434bae77/pysqlcipher3-1.0.3.tar.gz#sha256=694e5bbb6ece8a064bd55f261e54b9ffbb3af1784afdc4dce4948a0251a8a430 from build tracker 'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-req-tracker-21e96s63'
Building wheels for collected packages: pysqlcipher3
  Created temporary directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-wheel-3r_61g6i
  Building wheel for pysqlcipher3 (setup.py): started
  Destination directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-wheel-3r_61g6i
  Running command 'c:\users\flaboy\appdata\local\programs\python\python38\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"'; __file__='"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-wheel-3r_61g6i'
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib.win-amd64-3.8
  creating build\lib.win-amd64-3.8\pysqlcipher3
  copying lib\dbapi2.py -> build\lib.win-amd64-3.8\pysqlcipher3
  copying lib\dump.py -> build\lib.win-amd64-3.8\pysqlcipher3
  copying lib\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3
  creating build\lib.win-amd64-3.8\pysqlcipher3\test
  copying lib\test\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3\test
  creating build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\dbapi.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\dump.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\factory.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\hooks.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\regression.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\sqlcipher.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\transactions.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\types.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\userfunctions.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  running build_ext
  Builds a C extension linking against libsqlcipher library
  building 'pysqlcipher3._sqlite3' extension
  creating build\temp.win-amd64-3.8
  creating build\temp.win-amd64-3.8\Release
  creating build\temp.win-amd64-3.8\Release\src
  creating build\temp.win-amd64-3.8\Release\src\python3
  C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\bin\HostX86\x64\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -DMODULE_NAME=\"pysqlcipher3.dbapi2\" -Ic:\users\flaboy\appdata\local\programs\python\python38\include -Ic:\users\flaboy\appdata\local\programs\python\python38\include "-IC:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\ATLMFC\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\include" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\cppwinrt" /Tcsrc\python3\module.c /Fobuild\temp.win-amd64-3.8\Release\src\python3\module.obj
  module.c
  C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-install-enu1m14b\pysqlcipher3\src\python3\connection.h(33): fatal error C1083: Cannot open include file: 'sqlcipher/sqlite3.h': No such file or directory
  error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\Community\\VC\\Tools\\MSVC\\14.27.29110\\bin\\HostX86\\x64\\cl.exe' failed with exit status 2
  WARNING: Legacy build of wheel for 'pysqlcipher3' created no files.
  Command arguments: 'c:\users\flaboy\appdata\local\programs\python\python38\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"'; __file__='"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-wheel-3r_61g6i'
  Command output:
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib.win-amd64-3.8
  creating build\lib.win-amd64-3.8\pysqlcipher3
  copying lib\dbapi2.py -> build\lib.win-amd64-3.8\pysqlcipher3
  copying lib\dump.py -> build\lib.win-amd64-3.8\pysqlcipher3
  copying lib\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3
  creating build\lib.win-amd64-3.8\pysqlcipher3\test
  copying lib\test\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3\test
  creating build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\dbapi.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\dump.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\factory.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\hooks.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\regression.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\sqlcipher.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\transactions.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\types.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\userfunctions.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  copying lib\test\python3\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
  running build_ext
  Builds a C extension linking against libsqlcipher library
  building 'pysqlcipher3._sqlite3' extension
  creating build\temp.win-amd64-3.8
  creating build\temp.win-amd64-3.8\Release
  creating build\temp.win-amd64-3.8\Release\src
  creating build\temp.win-amd64-3.8\Release\src\python3
  C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\bin\HostX86\x64\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -DMODULE_NAME=\"pysqlcipher3.dbapi2\" -Ic:\users\flaboy\appdata\local\programs\python\python38\include -Ic:\users\flaboy\appdata\local\programs\python\python38\include "-IC:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\ATLMFC\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\include" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\cppwinrt" /Tcsrc\python3\module.c /Fobuild\temp.win-amd64-3.8\Release\src\python3\module.obj
  module.c
  C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-install-enu1m14b\pysqlcipher3\src\python3\connection.h(33): fatal error C1083: Cannot open include file: 'sqlcipher/sqlite3.h': No such file or directory
  error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\Community\\VC\\Tools\\MSVC\\14.27.29110\\bin\\HostX86\\x64\\cl.exe' failed with exit status 2
  ----------------------------------------
  Building wheel for pysqlcipher3 (setup.py): finished with status 'done'
  Running setup.py clean for pysqlcipher3
  Running command 'c:\users\flaboy\appdata\local\programs\python\python38\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"'; __file__='"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' clean --all
  running clean
  removing 'build\temp.win-amd64-3.8' (and everything under it)
  removing 'build\lib.win-amd64-3.8' (and everything under it)
  'build\bdist.win-amd64' does not exist -- can't clean it
  'build\scripts-3.8' does not exist -- can't clean it
  removing 'build'
Failed to build pysqlcipher3
DEPRECATION: Could not build wheels for pysqlcipher3 which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
Installing collected packages: pysqlcipher3
  Created temporary directory: C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-record-8pg_shup
    Running setup.py install for pysqlcipher3: started
    Running command 'c:\users\flaboy\appdata\local\programs\python\python38\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"'; __file__='"'"'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-install-enu1m14b\\pysqlcipher3\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-record-8pg_shup\install-record.txt' --single-version-externally-managed --compile --install-headers 'c:\users\flaboy\appdata\local\programs\python\python38\Include\pysqlcipher3'
    running install
    running build
    running build_py
    creating build
    creating build\lib.win-amd64-3.8
    creating build\lib.win-amd64-3.8\pysqlcipher3
    copying lib\dbapi2.py -> build\lib.win-amd64-3.8\pysqlcipher3
    copying lib\dump.py -> build\lib.win-amd64-3.8\pysqlcipher3
    copying lib\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3
    creating build\lib.win-amd64-3.8\pysqlcipher3\test
    copying lib\test\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3\test
    creating build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\dbapi.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\dump.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\factory.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\hooks.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\regression.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\sqlcipher.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\transactions.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\types.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\userfunctions.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    copying lib\test\python3\__init__.py -> build\lib.win-amd64-3.8\pysqlcipher3\test\python3
    running build_ext
    Builds a C extension linking against libsqlcipher library
    building 'pysqlcipher3._sqlite3' extension
    creating build\temp.win-amd64-3.8
    creating build\temp.win-amd64-3.8\Release
    creating build\temp.win-amd64-3.8\Release\src
    creating build\temp.win-amd64-3.8\Release\src\python3
    C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\bin\HostX86\x64\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -DMODULE_NAME=\"pysqlcipher3.dbapi2\" -Ic:\users\flaboy\appdata\local\programs\python\python38\include -Ic:\users\flaboy\appdata\local\programs\python\python38\include "-IC:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\ATLMFC\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.27.29110\include" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\cppwinrt" /Tcsrc\python3\module.c /Fobuild\temp.win-amd64-3.8\Release\src\python3\module.obj
    module.c
    C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-install-enu1m14b\pysqlcipher3\src\python3\connection.h(33): fatal error C1083: Cannot open include file: 'sqlcipher/sqlite3.h': No such file or directory
    error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\Community\\VC\\Tools\\MSVC\\14.27.29110\\bin\\HostX86\\x64\\cl.exe' failed with exit status 2
    Running setup.py install for pysqlcipher3: finished with status 'done'
  Record file C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-record-8pg_shup\install-record.txt not found
Successfully installed pysqlcipher3
Removed build tracker: 'C:\\Users\\Public\\Documents\\Wondershare\\CreatorTemp\\pip-req-tracker-21e96s63'
(flask)

$ pip list

Package        Version
-------------- ---------
atomicwrites   1.4.0
attrs          19.3.0
colorama       0.4.3
iniconfig      1.0.1
more-itertools 8.4.0
packaging      20.4
pip            20.2.2
pluggy         0.13.1
py             1.9.0
pyparsing      2.4.7
pytest         6.0.1
setuptools     41.2.0
six            1.15.0
toml           0.10.1
wheel          0.35.0
youtube-dl     2020.3.24
(flask)

@sylwesterdigital thanks. The relevant error is

    C:\Users\Public\Documents\Wondershare\CreatorTemp\pip-install-enu1m14b\pysqlcipher3\src\python3\connection.h(33): fatal error C1083: Cannot open include file: 'sqlcipher/sqlite3.h': No such file or directory
    error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\Community\\VC\\Tools\\MSVC\\14.27.29110\\bin\\HostX86\\x64\\cl.exe' failed with exit status 2

So the sqlcipher dependency is missing.

We can also see that pysqlcipher3 is not installed despite setup.py install reporting success.

So I suspect there is a bug in pysqlcipher3's setup.py that causes it to report success while it is actually failing.
I'd suggest reporting the issue with that project, as there is nothing we can do on pip side.
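As a general diagnostic for this kind of situation, you can confirm whether a package that pip reported as "Successfully installed" actually landed in the environment by trying to import it (`check_installed` is just an illustrative helper, not a pip feature):

```shell
# Try importing the module the package should provide; an "installed"
# package whose build left no files behind will fail this import.
check_installed() {
    python3 -c "import $1" 2>/dev/null \
        && echo "$1: importable" \
        || echo "$1: missing"
}

check_installed pysqlcipher3
```

In the log above this would report `pysqlcipher3: missing`, matching the `Record file ... not found` warning and the `created no files` message.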

Thanks. I will try to raise this issue here: https://github.com/sqlcipher/sqlcipher/issues

Please help me.

I downloaded Build Tools 2015.
I downloaded setuptools.
I downloaded wheels.
I update pip.

pip version 20.2.2 Python version 3.8.3

I am trying to install the vkbottle library.

Pastebin ---> https://pastebin.com/bRDfpudV

# pip3 install vkbottle
Requirement already satisfied: vkbottle in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages\vkbottle-2.7.8-py3.8.egg (2.7.8)
Requirement already satisfied: aiohttp in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from vkbottle) (3.6.2)
Requirement already satisfied: contextvars in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages\contextvars-2.4-py3.8.egg (from vkbottle) (2.4)
Requirement already satisfied: pydantic in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from vkbottle) (1.6.1)
Requirement already satisfied: vbml in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from vkbottle) (0.3)
Requirement already satisfied: watchgod in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from vkbottle) (0.6)
Requirement already satisfied: async-timeout<4.0,>=3.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from aiohttp->vkbottle) (3.0.1)
Requirement already satisfied: yarl<2.0,>=1.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from aiohttp->vkbottle) (1.5.1)
Requirement already satisfied: attrs>=17.3.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from aiohttp->vkbottle) (19.3.0)
Requirement already satisfied: chardet<4.0,>=2.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from aiohttp->vkbottle) (3.0.4)
Requirement already satisfied: multidict<5.0,>=4.5 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from aiohttp->vkbottle) (4.7.6)
Collecting immutables>=0.9
  Using cached immutables-0.14.tar.gz (42 kB)
Requirement already satisfied: poetry in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from vbml->vkbottle) (1.0.10)
Requirement already satisfied: idna>=2.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from yarl<2.0,>=1.0->aiohttp->vkbottle) (2.10)
Requirement already satisfied: keyring<21.0.0,>=20.0.1; python_version >= "3.5" and python_version < "4.0" in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (20.0.1)
Requirement already satisfied: pexpect<5.0.0,>=4.7.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (4.8.0)
Requirement already satisfied: pyparsing<3.0,>=2.2 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (2.4.7)
Requirement already satisfied: pkginfo<2.0,>=1.4 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (1.5.0.1)
Requirement already satisfied: pyrsistent<0.15.0,>=0.14.2 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (0.14.11)
Requirement already satisfied: html5lib<2.0,>=1.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (1.1)
Requirement already satisfied: tomlkit<0.6.0,>=0.5.11 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (0.5.11)
Requirement already satisfied: shellingham<2.0,>=1.1 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (1.3.2)
Requirement already satisfied: cleo<0.8.0,>=0.7.6 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (0.7.6)
Requirement already satisfied: clikit<0.5.0,>=0.4.2 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (0.4.3)
Requirement already satisfied: jsonschema<4.0,>=3.1 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (3.2.0)
Requirement already satisfied: requests<3.0,>=2.18 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (2.24.0)
Requirement already satisfied: requests-toolbelt<0.9.0,>=0.8.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (0.8.0)
Requirement already satisfied: cachecontrol[filecache]<0.13.0,>=0.12.4 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (0.12.6)
Requirement already satisfied: cachy<0.4.0,>=0.3.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from poetry->vbml->vkbottle) (0.3.0)
Requirement already satisfied: pywin32-ctypes!=0.1.0,!=0.1.1; sys_platform == "win32" in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from keyring<21.0.0,>=20.0.1; python_version >= "3.5" and python_version < "4.0"->poetry->vbml->vkbottle) (0.2.0)
Requirement already satisfied: ptyprocess>=0.5 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from pexpect<5.0.0,>=4.7.0->poetry->vbml->vkbottle) (0.6.0)
Requirement already satisfied: six in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from pyrsistent<0.15.0,>=0.14.2->poetry->vbml->vkbottle) (1.15.0)
Requirement already satisfied: webencodings in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from html5lib<2.0,>=1.0->poetry->vbml->vkbottle) (0.5.1)
Requirement already satisfied: pastel<0.3.0,>=0.2.0 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from clikit<0.5.0,>=0.4.2->poetry->vbml->vkbottle) (0.2.0)
Requirement already satisfied: pylev<2.0,>=1.3 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from clikit<0.5.0,>=0.4.2->poetry->vbml->vkbottle) (1.3.0)
Requirement already satisfied: setuptools in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from jsonschema<4.0,>=3.1->poetry->vbml->vkbottle) (49.6.0)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from requests<3.0,>=2.18->poetry->vbml->vkbottle) (1.25.9)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from requests<3.0,>=2.18->poetry->vbml->vkbottle) (2020.6.20)
Requirement already satisfied: msgpack>=0.5.2 in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from cachecontrol[filecache]<0.13.0,>=0.12.4->poetry->vbml->vkbottle) (1.0.0)
Requirement already satisfied: lockfile>=0.9; extra == "filecache" in c:\users\admin\appdata\local\programs\python\python38-32\lib\site-packages (from cachecontrol[filecache]<0.13.0,>=0.12.4->poetry->vbml->vkbottle) (0.12.2)
Building wheels for collected packages: immutables
  Building wheel for immutables (setup.py) ... error
  ERROR: Command errored out with exit status 1:
   command: 'c:\users\admin\appdata\local\programs\python\python38-32\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\admin\\AppData\\Local\\Temp\\pip-install-069vr9ot\\immutables\\setup.py'"'"'; __file__='"'"'C:\\Users\\admin\\AppData\\Local\\Temp\\pip-install-069vr9ot\\immutables\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'C:\Users\admin\AppData\Local\Temp\pip-wheel-bk5jpx95'
       cwd: C:\Users\admin\AppData\Local\Temp\pip-install-069vr9ot\immutables\
  Complete output (30 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib.win32-3.8
  creating build\lib.win32-3.8\immutables
  copying immutables\map.py -> build\lib.win32-3.8\immutables
  copying immutables\_testutils.py -> build\lib.win32-3.8\immutables
  copying immutables\_version.py -> build\lib.win32-3.8\immutables
  copying immutables\__init__.py -> build\lib.win32-3.8\immutables
  running egg_info
  writing immutables.egg-info\PKG-INFO
  writing dependency_links to immutables.egg-info\dependency_links.txt
  writing top-level names to immutables.egg-info\top_level.txt
  reading manifest file 'immutables.egg-info\SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  writing manifest file 'immutables.egg-info\SOURCES.txt'
  copying immutables\_map.c -> build\lib.win32-3.8\immutables
  copying immutables\_map.h -> build\lib.win32-3.8\immutables
  copying immutables\_map.pyi -> build\lib.win32-3.8\immutables
  copying immutables\py.typed -> build\lib.win32-3.8\immutables
  running build_ext
  building 'immutables._map' extension
  creating build\temp.win32-3.8
  creating build\temp.win32-3.8\Release
  creating build\temp.win32-3.8\Release\immutables
  C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -DNDEBUG=1 -Ic:\users\admin\appdata\local\programs\python\python38-32\include -Ic:\users\admin\appdata\local\programs\python\python38-32\include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.10240.0\ucrt" /Tcimmutables/_map.c /Fobuild\temp.win32-3.8\Release\immutables/_map.obj -O2
  _map.c
  c:\users\admin\appdata\local\programs\python\python38-32\include\pyconfig.h(206): fatal error C1083: Cannot open include file: basetsd.h: No such file or directory
  error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\VC\\BIN\\cl.exe' failed with exit status 2
  ----------------------------------------
  ERROR: Failed building wheel for immutables
  Running setup.py clean for immutables
Failed to build immutables
DEPRECATION: Could not build wheels for immutables which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
Installing collected packages: immutables
    Running setup.py install for immutables ... error
    ERROR: Command errored out with exit status 1:
     command: 'c:\users\admin\appdata\local\programs\python\python38-32\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\admin\\AppData\\Local\\Temp\\pip-install-069vr9ot\\immutables\\setup.py'"'"'; __file__='"'"'C:\\Users\\admin\\AppData\\Local\\Temp\\pip-install-069vr9ot\\immutables\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'C:\Users\admin\AppData\Local\Temp\pip-record-_f7_ducl\install-record.txt' --single-version-externally-managed --compile --install-headers 'c:\users\admin\appdata\local\programs\python\python38-32\Include\immutables'
         cwd: C:\Users\admin\AppData\Local\Temp\pip-install-069vr9ot\immutables\
    Complete output (30 lines):
    running install
    running build
    running build_py
    creating build
    creating build\lib.win32-3.8
    creating build\lib.win32-3.8\immutables
    copying immutables\map.py -> build\lib.win32-3.8\immutables
    copying immutables\_testutils.py -> build\lib.win32-3.8\immutables
    copying immutables\_version.py -> build\lib.win32-3.8\immutables
    copying immutables\__init__.py -> build\lib.win32-3.8\immutables
    running egg_info
    writing immutables.egg-info\PKG-INFO
    writing dependency_links to immutables.egg-info\dependency_links.txt
    writing top-level names to immutables.egg-info\top_level.txt
    reading manifest file 'immutables.egg-info\SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    writing manifest file 'immutables.egg-info\SOURCES.txt'
    copying immutables\_map.c -> build\lib.win32-3.8\immutables
    copying immutables\_map.h -> build\lib.win32-3.8\immutables
    copying immutables\_map.pyi -> build\lib.win32-3.8\immutables
    copying immutables\py.typed -> build\lib.win32-3.8\immutables
    running build_ext
    building 'immutables._map' extension
    creating build\temp.win32-3.8
    creating build\temp.win32-3.8\Release
    creating build\temp.win32-3.8\Release\immutables
    C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -DNDEBUG=1 -Ic:\users\admin\appdata\local\programs\python\python38-32\include -Ic:\users\admin\appdata\local\programs\python\python38-32\include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.10240.0\ucrt" /Tcimmutables/_map.c /Fobuild\temp.win32-3.8\Release\immutables/_map.obj -O2
    _map.c
    c:\users\admin\appdata\local\programs\python\python38-32\include\pyconfig.h(206): fatal error C1083: Cannot open include file: basetsd.h: No such file or directory
    error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\VC\\BIN\\cl.exe' failed with exit status 2
    ----------------------------------------
ERROR: Command errored out with exit status 1: 'c:\users\admin\appdata\local\programs\python\python38-32\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\admin\\AppData\\Local\\Temp\\pip-install-069vr9ot\\immutables\\setup.py'"'"'; __file__='"'"'C:\\Users\\admin\\AppData\\Local\\Temp\\pip-install-069vr9ot\\immutables\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'C:\Users\admin\AppData\Local\Temp\pip-record-_f7_ducl\install-record.txt' --single-version-externally-managed --compile --install-headers 'c:\users\admin\appdata\local\programs\python\python38-32\Include\immutables' Check the logs for full command output.

@Andrexxelles the error you are facing is not related to this issue, nor to pip. The relevant part is basetsd.h: No such file or directory. So, if you don't mind, I'll go ahead and mark these two comments as off-topic, to keep the discussion on this issue easier to read.

I am getting the same error while installing mysqlclient.

DEPRECATION: Could not build wheels for mysqlclient which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above.

The wheel build output is:
Building wheel for mysql (setup.py) ... done
Created wheel for mysql: filename=mysql-0.0.2-py3-none-any.whl size=1252 sha256=ca801d4e9888754369abcdfb5791654e3a43361475a58726257b7954c7c4d735
Stored in directory: c:\users\admin\appdata\local\pip\cache\wheels\3e\4a\d0\506edab38d1bdf574b02c24805fcf7348a327297fcc285431d
Building wheel for client (setup.py) ... done
Created wheel for client: filename=client-0.0.1-py3-none-any.whl size=1441 sha256=cf97824f8b63db528014243e69082705c78c4397ca5891bfb2b7217a107e943c
Stored in directory: c:\users\admin\appdata\local\pip\cache\wheels\1e\d6\d3\2084ee3b4ac9adaab2bf853307b140575c5fe9160821ab8e07
Building wheel for mysqlclient (setup.py) ... error
ERROR: Command errored out with exit status 1:

@ritumishra9 Please post build error questions to pypa/packaging-problems instead.

Locking the thread to prevent more people asking unrelated questions here.

In version 21.0, pip will not attempt setup.py install in that case, and fail the installation right away.

I'm a little confused by this part of the plan and I can't find where this was discussed.

Right now, a whole class of packages use setup.py install to work around the install vs build dependency synchronization issues we have. I think until we address the PEP 517 quality-of-implementation issues in pip (config_settings, synchronized build + install dependencies in isolated environments, post-install steps for wheels etc), we shouldn't remove the setup.py install branch -- it'd be a very significant regression for those packages, since they'd be rendered no-longer-installable.

This seems to be removing a feature without a clear alternative for it, which can't end well. :)

  1. The 21.0 deadline was discussed in #8102. We didn't reach consensus on it, but we also didn't agree on an alternative. It does fit our normal deprecation policy.
  2. Do we have actual bug reports (preferably linked to this issue) that flag problems preventing people from addressing the deprecation? That's rhetorical, the answer is "no, we don't". So how do we ever know if we've done what's needed to action the deprecation?
  3. Issues with isolated environments should not be linked to this issue. We have --no-build-isolation to handle cases where build isolation doesn't work for a project. Before accepting a bug report as being a blocker for this deprecation, I'd want to know why --no-build-isolation isn't a workaround ("it's inconvenient" isn't sufficient IMO - deprecations are always inconvenient for someone).

I'm completely fine with fixing the various PEP 517 and build isolation quality-of-implementation issues. But problems with quality of implementation aren't the same as "unusable", and we need to push people to be clear when they mean "I am unable to build wheels at all" and when they mean "it's clumsy or inconvenient for me to set things up so that I can build wheels".

Having said all of the above, I don't personally intend to push for the removal of the setup.py install route. So if it's left to me, we won't remove it in 21.0 anyway. It was @sbidoul who was driving this, so he's probably the person who really needs to comment. (I would support him if he wants to aim for an aggressive timescale, though.)

BTW, one unfortunate result of including the issue link in the message is that the huge number of pingbacks here includes cases where people just paste pip output and github turns it into a link, when actually it's an unrelated issue or something the project is handling themselves. Let's not do that in future 🙂

The plan was indeed laid out in #8102, and #8102 (comment) specifically. Comments about it seemed positive back then.

I'm not particularly pushing for this, although I'm convinced setup.py install has to go away sooner or later. Heck, even the setuptools maintainers say people should use pip instead of setup.py install - but if that just means pip runs setup.py install on their behalf, what's the point? And we need data to shed light on the matter, otherwise we can only guess. The deprecation mechanism was considered a decent way to get such data.

This specific issue is for a very specific case: falling back to setup.py install when bdist_wheel fails. Even if we disable this behaviour, setup.py install stays, and can be activated by passing --install-options or --global-options.

It is unfortunate we have had so many backlinks from the error message, that was unintended, and it is now fixed in 20.2.3 where the deprecation happens in a much more focused manner.

So how do we get feedback about this specific deprecation (other than reading all the backlinks, most of them irrelevant)? Should we not unlock the issue to let people report cases where the fallback is deemed useful?

wlav commented

I have a case where the fallback is still necessary, mainly as a workaround for where pyproject.toml is insufficient: I'm unable to write an environment marker that picks up the PyPy version (that is, not the Python version it implements, e.g. 3.6, but the PyPy release, e.g. 7.3.0). This is probably also an issue for Jython.

Without being able to select package versions for the specific PyPy release in the pyproject.toml file for use in the build environment, I'm back to making that selection in the setup.py file, where I cannot control wheel build/install order, and thus to relying on the fallback. (After which all is good, for now.)

Is there a known solution to this? Or, if a new marker is needed, is there a process/place for proposing one?
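For context, the PyPy release described above can be detected at runtime (though not via an environment marker), which is why this kind of branching ends up in setup.py. A minimal sketch; the `some-build-dep` pins are made-up names for illustration:

```python
import sys


def pypy_release():
    """Return the PyPy release as a tuple, e.g. (7, 3, 0), or None on CPython.

    Environment markers can select on the Python version PyPy implements,
    but (as described above) not on the PyPy release itself, so the branch
    has to happen in Python code.
    """
    info = getattr(sys, "pypy_version_info", None)
    return tuple(info[:3]) if info is not None else None


# Choose a build-dependency pin per PyPy release (hypothetical package name).
release = pypy_release()
if release is not None and release < (7, 3, 0):
    build_deps = ["some-build-dep<5"]
else:
    build_deps = ["some-build-dep>=5"]
```

`build_deps` could then be fed to `setup(setup_requires=...)` or returned from a PEP 517 hook.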

You can still use setup.py with pyproject.toml, the two files are entirely orthogonal. Conditional build-time requirements can be calculated and specified with the setup_requires argument.

wlav commented

Sorry, not following. The reason I'm here is that the behavior of setup_requires is going to break in the future, same as the questions posed and discussion in the other threads. I was under the impression that the behavior of building and immediately installing for use by the next package in dependency order can be recovered by using pyproject.toml. The point being that the build-environment wheels are built and installed in the correct order. They are then reused from the cache by setup.py, so it doesn't matter that the setup_requires dependency order is no longer respected, as the reused wheels were correctly built. It's a bit of a hack, but yes, it can indeed be made to work for CPython.

But it cannot be made to work for PyPy. The difference is that for CPython the Python version and the interpreter version match, so the dependency versions can be correctly selected. Not so for PyPy, where the Python version and the interpreter version differ and only the former can be specified with a marker.

I just tried, and if I treat pyproject.toml orthogonally (I think) from setup.py, by not specifying the exact versions in the pyproject.toml file and leaving it to setup_requires, I simply get the latest versions in the build environment, followed by another build of the setup_requires versions by setup.py, in the wrong order and in an incorrect build environment. So, what am I missing? Thanks!

(Leaves the separate issue that pyproject.toml is less functional on non-CPython interpreters.)

Honestly, I’m not following your confusion either, since your description deviates so much from my understanding of the situation. What makes you think setup_requires is going to break in the future? It is not going away as far as I know; PEP 517 (the alternative to setup.py install) provides the build backend (setuptools in this case) a hook to specify dynamic build-time dependencies (get_requires_for_build_wheel), and setuptools uses setup(setup_requires=...) to expose that to the user. The PEP 517 build system would:

  1. Populate an environment with specifications from pyproject.toml
  2. Run setup.py in that environment to install dynamic requirements (i.e. setup_requires)
  3. Build the wheel

So you do not need to specify the build dependency in pyproject.toml if you have specified it in setup_requires. And there should be no “second build”, only one build that performs two setup steps. The pieces of your understanding and mine don’t fit together at all.

I probably should have asked this at the very beginning: what exactly are you trying to do? I am imagining something like:

import setuptools

if on_pypy_version_x():
    build_deps = ["build-dep<5"]
else:
    build_deps = ["build-dep>=5"]

setuptools.setup(
    ...,
    setup_requires=build_deps,
)

And this is most definitely supposed to work, both right now and after setup.py install is removed (it’s a bug in setuptools if it does not). And if that’s not what you mean, please provide more concrete description.

It's worth noting that setup_requires is a deprecated option in setuptools. The setuptools documentation includes the statement

Note This used to be accomplished with the setup_requires keyword but is now considered deprecated in favor of the PEP 517 style described above. To peek into how this legacy keyword is used, consult our guide on deprecated practice (WIP)

If the approach @uranusjr noted above isn't enough, then I'd hope that the setuptools documentation gives @wlav enough information to allow them to modify their build process so that it works as needed. If it doesn't, then that's probably something that they should raise on the setuptools issue tracker.

Also, I don't know if setuptools warns when setup_requires is used. If not, then maybe doing so would be worthwhile, but that's really down to whether setuptools users like @wlav feel that the deprecation isn't sufficiently well advertised already - and if that's the case then again I'd suggest flagging that on the setuptools tracker.

Ooh, I didn’t know setuptools actively deprecates setup_requires; I’ve thought they only re-purpose it for PEP 517. Thanks.

If setup_requires is to be avoided completely, PEP 517 in-tree backend would likely be the way to go. A project can define its own get_requires_for_build_wheel to specify build-time requirements dynamically (and proxy everything else to setuptools).

It might be useful if someone were to collect examples of "modern" replacements for older approaches that are now deprecated, or "out of favour" and potentially going to get deprecated at some point.

I don't know where such a document would be hosted, though, and it would be very reliant on people using such older techniques doing the research on how to update them and then contributing that learning back to the community. In practice, though, I don't know how likely that is to happen. Stack Overflow is an obvious possibility, but curating that to ensure that obsolete advice is de-prioritised in favour of up to date answers is likely to be more time consuming than many of the Python packaging maintainers can manage, so again community involvement is needed.

@wlav - what would it have taken for you to find out that setup_requires was deprecated, and that PEP 517 in-tree backends were a thing that you could consider to update your build process? I'm sure we haven't publicised in-tree backends well enough, but I don't know what we could do to improve the situation.

[Personally, I knew that setup_requires was deprecated, but I didn't know that setuptools exposed that data via the PEP 517 hook, and I hadn't made the link to in-tree backends as an approach to replace the functionality. So this stuff isn't easily accessible even to so-called packaging experts like myself 🙂]

wlav commented

A bit of a late reply, but it has been a full and hectic week... Yes, examples would be really nice indeed! But as-is, providing a custom in-tree builder that simply imports all from setuptools.build_meta and then adds the specific requirements for PyPy by shimming get_requires_for_build_wheel works well enough.

wlav commented

@pfmoore - Forgot to answer your question, but basically: I get bug reports. I think it's not easy to keep up with changes in build systems, but enough people do keep up and let me know of important ones through my issue tracker. Similarly, many build systems are compatible with setuptools to some extent, but only if the documented rules are followed. So, if my usage doesn't follow those to the letter, it will break one of the many other build systems out there, and someone will bug me.

As for learning about in-tree backends, I got that information right here. :)

Wheels do not currently support symlinks because not all platforms support symlinks. As part of an install process we add logic to setup.py to create the symlinks, and would rather not duplicate the objects because they're large. Forcing wheel creation would require another set of hacks to resolve the issue.

See (other) discussion about symlinks + wheels at #5919

I don't think this is the way to go.
It will be highly inconvenient if this stops working after I upgrade pip.

I have dockerfile with following line

pip install graphviz==0.14.2 numpy==1.16.2 scipy==1.2.1

The base image itself doesn't contain numpy and scipy; this is what I'm seeing when building my Dockerfile:

+ pip install 'graphviz==0.14.2' 'numpy==1.16.2' 'scipy==1.2.1'
Collecting graphviz==0.14.2
  Downloading graphviz-0.14.2-py2.py3-none-any.whl (18 kB)
Collecting numpy==1.16.2
  Downloading numpy-1.16.2.zip (5.1 MB)
Collecting scipy==1.2.1
  Downloading scipy-1.2.1.tar.gz (23.1 MB)
Building wheels for collected packages: numpy, scipy
  Building wheel for numpy (setup.py): started
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): finished with status 'done'
  Created wheel for numpy: filename=numpy-1.16.2-cp38-cp38-linux_x86_64.whl size=10612309 sha256=6ec34736dd5ce8cd69dfe17b51a6a498419c927ac77e212da6b8541a32a94273
  Stored in directory: /tmp/pip-ephem-wheel-cache-l8v8hrqu/wheels/c7/1a/26/d24add95f4ea99e790c3a8b339fb5b46b66f9849142d1b93c5
  Building wheel for scipy (setup.py): started
  Building wheel for scipy (setup.py): finished with status 'error'
  Running setup.py clean for scipy
  ERROR: Command errored out with exit status 1:
   command: /opt/anaconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py'"'"'; __file__='"'"'/tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-54nh3dnh
       cwd: /tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/
  Complete output (9 lines):
  /tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py:114: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    import imp
  Traceback (most recent call last):
    File "<string>", line 1, in <module>
    File "/tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py", line 492, in <module>
      setup_package()
    File "/tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py", line 468, in setup_package
      from numpy.distutils.core import setup
  ModuleNotFoundError: No module named 'numpy'
  ----------------------------------------
  ERROR: Failed building wheel for scipy
  ERROR: Command errored out with exit status 1:
   command: /opt/anaconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py'"'"'; __file__='"'"'/tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' clean --all
       cwd: /tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f
  Complete output (11 lines):
  /tmp/pip-install-bnmbadcn/scipy_27de2a367cd54e0bb00563ab2256b56f/setup.py:114: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    import imp

  `setup.py clean` is not supported, use one of the following instead:

    - `git clean -xdf` (cleans all files)
    - `git clean -Xdf` (cleans all versioned files, doesn't touch
                        files that aren't checked into the git repo)

  Add `--force` to your command to use it anyway if you must (unsupported).

  ----------------------------------------
  ERROR: Failed cleaning build dir for scipy
Successfully built numpy
Failed to build scipy
Installing collected packages: numpy, scipy, graphviz
    Running setup.py install for scipy: started
    Running setup.py install for scipy: still running...
    Running setup.py install for scipy: finished with status 'done'
  DEPRECATION: scipy was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
Successfully installed graphviz-0.14.2 numpy-1.16.2 scipy-1.2.1

@alex-ber thank you very much for the detailed report. We are very much looking for real-world situations where a wheel cannot be built while setup.py install works.

In your case what seems to happen is this. Scipy requires numpy as a build dependency. In the wheel building step, numpy is not installed, therefore it fails. Then pip falls back to setup.py install mode, and I think it succeeds because, by chance, numpy is installed before scipy's setup.py install runs.

Could you try installing in two steps to see if the error and the deprecation warning disappear?

  • pip install graphviz==0.14.2 numpy==1.16.2
  • pip install scipy==1.2.1

Alternatively, upgrading to a recent version of scipy that has a pyproject.toml that declares numpy as a build dependency should work too.
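For reference, declaring a build dependency like numpy is done with a `pyproject.toml` such as this (a minimal PEP 518 sketch; the exact pins are illustrative):

```toml
# Minimal PEP 518 build-system declaration (illustrative pins).
[build-system]
requires = ["setuptools>=40.8.0", "wheel", "numpy"]
build-backend = "setuptools.build_meta"
```

With this in place, pip installs numpy into an isolated build environment before running the wheel build, so `bdist_wheel` no longer fails with `ModuleNotFoundError`.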

@sbidoul I have split it into two lines and this works without the warning, using wheels. This is the output:

+ pip install 'graphviz==0.14.2' 'numpy==1.16.2'
Collecting graphviz==0.14.2
  Downloading graphviz-0.14.2-py2.py3-none-any.whl (18 kB)
Collecting numpy==1.16.2
  Downloading numpy-1.16.2.zip (5.1 MB)
Building wheels for collected packages: numpy
  Building wheel for numpy (setup.py): started
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): still running...
  Building wheel for numpy (setup.py): finished with status 'done'
  Created wheel for numpy: filename=numpy-1.16.2-cp38-cp38-linux_x86_64.whl size=10612313 sha256=136c953f8560dd57cdd651b2b9c8af3b3454d00098f8b26c3ba223626f32e82e
  Stored in directory: /tmp/pip-ephem-wheel-cache-79x7lajv/wheels/c7/1a/26/d24add95f4ea99e790c3a8b339fb5b46b66f9849142d1b93c5
Successfully built numpy
Installing collected packages: numpy, graphviz
Successfully installed graphviz-0.14.2 numpy-1.16.2
+ pip install 'scipy==1.2.1'
Collecting scipy==1.2.1
  Downloading scipy-1.2.1.tar.gz (23.1 MB)
Building wheels for collected packages: scipy
  Building wheel for scipy (setup.py): started
  Building wheel for scipy (setup.py): still running...
  Building wheel for scipy (setup.py): finished with status 'done'
  Created wheel for scipy: filename=scipy-1.2.1-cp38-cp38-linux_x86_64.whl size=55957602 sha256=84f7c97fc95b6893c1924f8d90851b69cc53de3ebb6d536d049b581cbb293532
  Stored in directory: /tmp/pip-ephem-wheel-cache-cr9csfcz/wheels/0a/51/83/29c07027bd3a7ad5cd422b3925ce6a1b8d8232f3beba759588
Successfully built scipy
Installing collected packages: scipy
Successfully installed scipy-1.2.1

Have the same problem when trying to install Python-Levenshtein
Does anyone have a solution to this?

I found that using py -m pip install -U --no-deps ... worked in a few cases. This is not ideal, of course. Otherwise, I believe the package author needs to rebuild the package and re-upload it to PyPI. When --no-deps works, I suspect it is a dependency that needs to be rebuilt.

@ddfcrystal the issue most probably lies in the setup.py of the package you try to install. Feel free however to post your installation log here and I'll take a look.

@djhenderson this issue is unrelated to --no-deps. What probably happens is that one of your dependencies has a buggy setup.py that triggers this warning.

A quick note. Pip 21.0 is due for release in about 2 weeks (see #9282). This item is on the 21.0 milestone, so it either needs to be implemented and merged to master before then, or it will miss the release (and I'll move it to the 21.1 milestone).

Removing this from 21.0 because of #9423.

I'm rather confused by this. When I install my library (mindsdb, for which I explicitly provide no wheel since it requires user-specific steps in setup.py for it to install properly) I get:

DEPRECATION: mindsdb was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.

but when updating to 21 the message just changes to:

  DEPRECATION: mindsdb was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.

and everything still works.

When is sdist going to be deprecated? Will it ever be?

I see no alternatives presented in here per se. Is every library maintainer supposed to build wheels for everything now? (Not an issue for me specifically, since I have access to resources to do so on dozens of different envs, but I assume for some maintainers this could prove difficult.)


I'm fairly sure I'm misunderstanding something here, corrections would be appreciated.

sdist is not going to be deprecated. Installing it without either having a pyproject.toml in the sdist or having wheel installed in the environment is. As a maintainer, you should seriously consider opting into PEP 517.

@George3d6 sorry for the confusion about the deprecation version.

At this stage we are trying to collect feedback on situations where pip running setup.py install would still be necessary, as opposed to first running setup.py bdist_wheel and then installing the resulting wheel.

When is sdist going to be deprecated? Will it ever be?

There is no plan to deprecate sdists. There are also no plans that I know of to force all maintainers to publish wheels. But the ecosystem is indeed moving to installing via first building a wheel and then installing it.

So in your case it would be interesting to understand why setup.py bdist_wheel would not work for you, and what would block you from moving to PEP 517 as @uranusjr just mentioned.

Ok, so, I think I was probably uninformed on the issue (and still might be).

Quick aside as to why (in case you care):

I think the main issue here might have been that I either didn't read the official docs properly at the time (and instead looked at resources online, which often seem outdated). I also took the example of pytorch, sklearn and numpy when I originally created my package, and none were using .tomls at the time (though I see two of them now are; might be worth pinging the devs from large projects like pytorch about this, since I assume many dummies like myself take after what they are doing without properly checking the docs :p).


So, just to confirm my understanding here: as long as I supply a .toml specifying the build tools required, I should still be able to distribute a package that installs on the user's machine instead of requiring me to precompile it? As in, everything will work just the same, plus I will get to choose the exact version of setuptools that will be used (which seems like a plus), and potentially various other build env dependencies.

If so, nothing would stop me from migrating to this new specification and I was just poorly informed until about now.


So in your case it would be interesting to understand why setup.py bdist_wheel would not work for you, and what would block you from moving to PEP 517 as @uranusjr just mentioned.

I assume by this you mean why I am not able to just publish wheels and still require setup.py to run on the user's machine? If so, the answer is that I need to be able to install dependencies dynamically based on the user's env: https://github.com/mindsdb/lightwood/blob/stable/setup.py#L31 (Mind you, this doesn't seem to work all the time, would be fixed by compiling wheels for every type of env we want to support, and is a very bad practice; we've more or less moved to custom installers for our project in order to avoid needing these kinds of hacks in the future and to remove them ASAP)
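For what it's worth, the standard alternative to selecting dependencies at install time in setup.py is PEP 508 environment markers, which pip evaluates on the target machine without running any project code. A hedged sketch (the package names and pins below are hypothetical, not lightwood's actual requirements, and direct-URL requirements cannot be uploaded to PyPI, so this may not cover every case):

```python
# Hypothetical sketch: declare platform-dependent requirements with
# PEP 508 environment markers instead of branching inside setup.py.
install_requires = [
    "torch>=1.0; platform_system != 'Windows'",       # hypothetical pin
    "torch-win-build; platform_system == 'Windows'",  # hypothetical name
]

# pip evaluates the marker after ';' against the installing environment,
# so the right requirement is selected even when installing from a wheel.
for requirement in install_requires:
    name, _, marker = requirement.partition(";")
    print(name.strip(), "->", marker.strip())
```

Because the selection happens at resolution time rather than inside setup.py, the same wheel works across platforms.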

... hopefully that sheds some light on the situation. Thanks a lot for the help thus far; as I said, I think this is an issue of me being misinformed, so thanks for taking the time to answer.

I assume by this you mean why I am not able to just publish wheels and still require setup.py to run on the user's machine?

No :) If you reached this issue, it should mean pip first tried setup.py bdist_wheel and that failed, and then fell back to doing setup.py install and that succeeded. What we are trying to understand is why bdist_wheel would fail while setup.py install succeeds. Normally the reason for the bdist_wheel failure should be in the pip install log.
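To illustrate the flow being deprecated, pip's current behavior for a setup.py-only sdist is roughly this (a toy sketch, not pip's actual code):

```python
# Toy sketch (not pip's actual implementation) of the legacy fallback:
# try to build a wheel first, and only fall back to `setup.py install`
# while the deprecation period lasts.
def install_sdist(build_wheel, legacy_install, allow_fallback=True):
    try:
        wheel = build_wheel()                  # runs `setup.py bdist_wheel`
    except Exception as err:
        if not allow_fallback:                 # future pip: fail right away
            raise
        print(f"Failed building wheel ({err}); falling back to setup.py install")
        return legacy_install()                # deprecated code path
    return f"installed from {wheel}"

def failing_build():
    # Stand-in for a bdist_wheel that fails, e.g. a missing build dependency.
    raise RuntimeError("No module named 'numpy'")

result = install_sdist(failing_build, lambda: "installed via setup.py install")
```

The deprecation discussed in this issue corresponds to flipping `allow_fallback` off: a failed wheel build would fail the installation immediately.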

So, just to confirm my understanding here: as long as I supply a .toml specifying the build tools required, I should still be able to distribute a package that installs on the user's machine instead of requiring me to precompile it? As in, everything will work just the same, plus I will get to choose the exact version of setuptools that will be used (which seems like a plus), and potentially various other build env dependencies.

Also, yes, this is what you should expect by opting into pyproject.toml. Tell us if things don’t work like this, because that’s a bug!

No :) If you reached this issue, it should mean pip first tried setup.py bdist_wheel and that failed, and then fell back to doing setup.py install and that succeeded. What we are trying to understand is why bdist_wheel would fail while setup.py install succeeds. Normally the reason for the bdist_wheel failure should

Hm, ok, this makes me a bit confused, since I didn't realize pip would even try to run setup.py bdist_wheel on the user's machine... I thought the logic was:

  1. Look for wheels on pypi
  2. If there are no wheels, install using setup.py install

So I guess I didn't realize pip would try to build and then install the whl on the user's machine.

Also, yes, this is what you should expect by opting into pyproject.toml. Tell us if things don’t work like this, because that’s a bug!

Alright, I expect this to work once I upgrade. I will figure this out today. Again, I think the issue here was on my end, ignoring the "right" way to go about packaging.

So, as far as I can tell, adding the .toml does break my build. Since it builds a wheel, it ignores some or all of the instructions I have for Windows that I linked here: https://github.com/mindsdb/lightwood/blob/stable/setup.py#L31 and this causes things to fail.

What I'm doing in the above is hacky, since pypi no longer allows installing 3rd party wheels, but it was also required at the time since torch was otherwise uninstallable for Windows, and it might still be required...

We actually switched from using purely pypi to using our own installers (that pip install and do some extra env setup), so this is fixable for us, but might not be for other projects (wouldn't have been for us e.g. 3 months ago when we still encouraged people to install our software via pip directly)

So can I still build an .egg package and deliver it, or not? I need a post-install process for device drivers (if not present) and other Windows software installers, in order to automate and resolve all dependencies for my Python package.

This change only affects source distributions with setup.py but not pyproject.toml. It has nothing to do with eggs.

@uranusjr - This is my case. My package stopped working with pip 21.x. What do I do...
Is this change because of some security issue?

This deprecation has not even been implemented yet. If your package already stopped working, it has nothing to do with this issue.

Hello. We are considering implementing a package installation using sdist only because we need some internal relative-path symlinks created at installation time and wheels cannot include symlinks. We have not found a way to have the wheel installation process include symlinks, but it was suggested that an sdist could create symlinks in the setup.py. The idea was: create a package with the standard files as a wheel, then create a dependent package sdist only that would create the needed symlinks upon installation.

If such a use-case is (or will soon be) deprecated, we will not pursue it. Any information or alternate suggestions would help.

Symbolic links in wheels is a topic that has come up a lot of times at discuss.python.org. I think it's more or less agreed that it's a valid use case, but the problem is how to specify and implement it correctly, since wheel is a cross-platform format and symbolic links are decidedly not cross-platform. I would encourage you to raise this issue on d.p.o again and drive that discussion to the end. It would be a tremendous help in driving this deprecation home as well.

Yes, I understand the cross-platform challenge with symlinks. That is why a sensible workaround is to implement them at installation time using setup.py. The requirement to create a wheel, though, makes even a source installation on a system need to be "cross-platform", which looks like a needless imposition since there is no implied requirement that the resulting build be installed on any other platform.

Am I understanding that there is no "pip install" option that does not involve creation of a wheel, and thus no option (even sdist) that would allow "pip install" to include some created symlinks?

The wheel format is cross-platform, but a singular wheel does not need to necessarily be cross-platform (per-platform support is why wheels exist in the first place). The problem is not that a wheel cannot be used to distribute packages with symlinks, but how it should be designed to do it.

"The wheel format is cross-platform, but a singular wheel does not neeed to necessarily be cross-platform" -- okay, but in practice, when building a wheel you are limited by the format right now. Unless there is a way to say "allow symlinks" for osx/linux wheel builds (aligns with your comment in 2019 discussion to allow it for *nix platform wheels).

There will be a way if someone designs one 🙂

I don't have the resources to fix the symlink problem. I am commenting here because the limitation imposed by deprecating "pip install" of an sdist running setup.py without creating a wheel creates a problem right now, because there is no other accepted symlink solution (in answer to the question above by @sbidoul: "it would be interesting to understand why setup.py bdist_wheel would not work for you, and what would block you from moving to PEP 517").

I'm using Docker with Alpine Linux on ARM64 with Anaconda3-2021.05, Python 3.8.8. I'm using pip==20.3.1.
I have following line in my Dockerfile:

pip install 'numba==0.53.1' 'llvmlite==0.36.0' 'sip==4.19.25' 'numpy==1.19.5'

That produces the following warning:

+ > pip install 'numba==0.53.1' 'llvmlite==0.36.0' 'sip==4.19.25' 'numpy==1.19.5'
Requirement already satisfied: llvmlite==0.36.0 in /opt/anaconda3/lib/python3.8/site-packages (0.36.0)
Requirement already satisfied: sip==4.19.25 in /opt/anaconda3/lib/python3.8/site-packages (4.19.25)
Collecting numba==0.53.1
  Downloading numba-0.53.1.tar.gz (2.2 MB)
Requirement already satisfied: llvmlite==0.36.0 in /opt/anaconda3/lib/python3.8/site-packages (0.36.0)
Requirement already satisfied: setuptools in /opt/anaconda3/lib/python3.8/site-packages (from numba==0.53.1) (51.0.0)
Collecting numpy==1.19.5
  Downloading numpy-1.19.5-cp38-cp38-manylinux2014_aarch64.whl (12.4 MB)
Building wheels for collected packages: numba
  Building wheel for numba (setup.py): started
  Building wheel for numba (setup.py): finished with status 'error'
  ERROR: Command errored out with exit status 1:
   command: /opt/anaconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py'"'"'; __file__='"'"'/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-gsv3ina6
       cwd: /tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/
  Complete output (7 lines):
  Traceback (most recent call last):
    File "<string>", line 1, in <module>
    File "/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py", line 422, in <module>
      metadata['ext_modules'] = get_ext_modules()
    File "/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py", line 148, in get_ext_modules
      import numpy.distutils.misc_util as np_misc
  ModuleNotFoundError: No module named 'numpy'
  ----------------------------------------
  ERROR: Failed building wheel for numba
  Running setup.py clean for numba
  ERROR: Command errored out with exit status 1:
   command: /opt/anaconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py'"'"'; __file__='"'"'/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' clean --all
       cwd: /tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1
  Complete output (7 lines):
  Traceback (most recent call last):
    File "<string>", line 1, in <module>
    File "/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py", line 422, in <module>
      metadata['ext_modules'] = get_ext_modules()
    File "/tmp/pip-install-jwlc7kja/numba_11d72221b9e14cc89b75278cf5dfedc1/setup.py", line 148, in get_ext_modules
      import numpy.distutils.misc_util as np_misc
  ModuleNotFoundError: No module named 'numpy'
  ----------------------------------------
  ERROR: Failed cleaning build dir for numba
Failed to build numba
Installing collected packages: numpy, numba
    Running setup.py install for numba: started
    Running setup.py install for numba: still running...
    Running setup.py install for numba: still running...
    Running setup.py install for numba: still running...
    Running setup.py install for numba: still running...
    Running setup.py install for numba: finished with status 'done'
  DEPRECATION: numba was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
Successfully installed numba-0.53.1 numpy-1.19.5

Previously, the following line looks like:

pip install 'numba==0.53.1' 'llvmlite==0.36.0' 'sip==4.19.25' 'numpy==1.16.2'

And it works as expected (numpy was built from source, which is expected behaviour, because this version is pretty old).
The newer version of numpy actually does have a wheel for arm64.
Numba is installed from source because it doesn't have a wheel for ARM64, which is ok.

What does “previously” mean?

I made a change in the Dockerfile. It used to be
pip install 'numba==0.53.1' 'llvmlite==0.36.0' 'sip==4.19.25' 'numpy==1.16.2'

and it worked without any warning, with the same pip version.

In that case, your issue is not related to this deprecation. You are seeing it only because pip fails to build a version from source; it did not attempt the build because it found a compatible prebuilt binary. This indicates you have a conflict in your dependencies. You can ignore the final warning message about setup.py install.

Please feel free to seek support on more user-oriented forums, such as StackOverflow and the online communities listed in https://www.python.org/community/

prophet

Please file an issue against this project, that they're requiring an install via setup.py install.

For avoiding this message on your end, you can manually install the build-time dependency of this (convertdate) before it similar to how you're installing Cython earlier.

So has anyone found a solution?

So has anyone found a solution?

@hbksilver The solution to which problem?

The message below brought me here, but I must admit I have no idea why I'm here...

WARNING: Built wheel for my_package_name is invalid: Metadata 1.2 mandates PEP 440 version, but 'dev' is not
Failed to build my_package_name
Installing collected packages: my_package_name
  Attempting uninstall: my_package_name
    Found existing installation: my_package_name dev
    Uninstalling my_package_name-dev:
      Successfully uninstalled my_package_name-dev
    Running setup.py install for my_package_name ... done
  DEPRECATION: my_package_name was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. A possible replacement is to fix the wheel build issue reported above. Discussion can be found at https://github.com/pypa/pip/issues/8368
Successfully installed my_package_name-dev

@drtoche the important part is WARNING: Built wheel for my_package_name is invalid: Metadata 1.2 mandates PEP 440 version, but 'dev' is not. Your package has a version number (dev) which is invalid, therefore pip rejects the wheel that was built from it and falls back to the legacy setup.py install method, which is less strict.

Setting a valid version number in your project should fix both the warning and the deprecation.

Thanks @sbidoul!

Hi maintainers !

I was wondering: is there a way to ask pip to be stricter and abort the installation in this case?

Your comment mentions:

In version 21.0, pip will not attempt setup.py install in that case, and fail the installation right away.

But with pip 21.3.1, I still am able to install local packages with invalid versions:

# in a package `mypkg`, with version in setup.py `tag-rc1-342-g3c3542fcb`
$ pip install .
Looking in indexes: <REDACTED>
Processing /src/python/mypkg
  Preparing metadata (setup.py) ... done
Building wheels for collected packages: mypkg
  Building wheel for mypkg (setup.py) ... done
  Created wheel for mypkg: filename=mypkg-tag_rc1_342_g3c3542fcb-py3-none-any.whl size=29290 sha256=026b83a9021a9d2ad51c0586b3f2e68966311a290ec4df8781249f8f3cad58e7
  Stored in directory: /tmp/pip-ephem-wheel-cache-nbejno8l/wheels/64/3e/4e/73312ba594e268dfa32dcdc32f5d399eee17eb8104bbe0c132
  WARNING: Built wheel for mypkg is invalid: Metadata 1.2 mandates PEP 440 version, but 'tag-rc1-342-g3c3542fcb' is not
Failed to build mypkg
Installing collected packages: mypkg
    Running setup.py install for mypkg ... done
  DEPRECATION: mypkg was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. A possible replacement is to fix the wheel build issue reported above. Discussion can be found at https://github.com/pypa/pip/issues/8368
Successfully installed mypkg-tag-rc1-342-g3c3542fcb

I had a look at this issue but may have failed to see the answer I'm looking for (sorry if this is the case).

One way to make pip “stricter” is --use-pep517. Stricter in quotes, because this is not exactly the same as disabling setup.py install; the option basically makes pip always pretend there is a pyproject.toml for each package (instead of using the globally installed setuptools), but that should introduce no functional differences for most packages. And since projects with pyproject.toml are never installed with setup.py install, the option kind of achieves what you want.

I am getting the below message when installing a python package with the custom version 1.45.0webrc1

  WARNING: Built wheel for pm-selenium is invalid: Metadata 1.2 mandates PEP 440 version, but '1.45.0webrc1' is not
    DEPRECATION: pm-selenium was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. A possible replacement is to fix the wheel build issue reported above. Discussion can be found at https://github.com/pypa/pip/issues/8368
  • Is there a way to silence these warnings?
  • Will this warning cause build failures in the future? Is the legacy setup.py going to get discontinued anytime soon?
  • Also, a follow-up question: will the above versioning cause build failures if a wheel doesn't get built?

because I really want to use custom release names 💯

is the legacy setup.py going to get discontinued anytime soon

Yes. I'm not sure without checking what the timescale is, but we will be removing this fallback.

Also, a follow-up question, will the above versioning cause build failures if a wheel doesn't get built

I'm not sure what you mean by this, what sort of build are you thinking of apart from building a wheel? All forms of direct build via setup.py have now been deprecated (by setuptools, not by pip) so at some point you'll have to move to a PEP 440 compliant version. And even if you don't get problems with build failures, you'll start hitting issues with other tools which only support PEP 440 style versions.

because i really want to use custom release names

Sorry, the standards don't support them, so you'll increasingly find that standard tools will reject them or fail.

@pfmoore thanks for the quick response confirming that legacy setup.py will be removed in the future; will resort to PEP 440 style versions for pre-releases like

X.YrcN  # Release Candidate
1.1.2rc1-1
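As a quick sanity check, a candidate version string can be matched against the normalized-form pattern from the PEP 440 spec. This simplified sketch omits local version segments (like +deadbeef) and non-normalized spellings that pip also accepts (e.g. 1.1.2rc1-1 normalizes to 1.1.2rc1.post1 but won't match here); the packaging library's Version class is the authoritative check:

```python
import re

# Normalized PEP 440 public version: [N!]N(.N)*[{a|b|rc}N][.postN][.devN]
# (local version segments such as "+deadbeef" are omitted for brevity).
PEP440_NORMALIZED = re.compile(
    r"^([1-9][0-9]*!)?"                       # optional epoch
    r"(0|[1-9][0-9]*)(\.(0|[1-9][0-9]*))*"    # release segment
    r"((a|b|rc)(0|[1-9][0-9]*))?"             # optional pre-release
    r"(\.post(0|[1-9][0-9]*))?"               # optional post-release
    r"(\.dev(0|[1-9][0-9]*))?$"               # optional dev release
)

def is_normalized_pep440(version: str) -> bool:
    return PEP440_NORMALIZED.match(version) is not None

print(is_normalized_pep440("1.1.2rc1"))      # True
print(is_normalized_pep440("1.45.0webrc1"))  # False: 'webrc1' is not a valid suffix
print(is_normalized_pep440("dev"))           # False: no release segment
```

Versions rejected here (and by packaging.version.Version) are the ones that trigger the "Metadata 1.2 mandates PEP 440 version" warning seen earlier in this thread.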

Seeking guidance: python 3.10.2, pyzmq 19.0.2 installation failure. Where is the problem?

Reach out to pyzmq for help.

Seeking guidance: python 3.10.2, pyzmq 19.0.2 installation failure. Where is the problem?

Building wheel for pyzmq (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [248 lines of output]
C:\Users\无双破军\AppData\Local\Temp\pip-req-build-5npw69cc\setup.py:1140: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
if V(Cython.version) < V(min_cython_version):
C:\Users\无双破军\AppData\Local\Temp\pip-req-build-5npw69cc\setup.py:1148: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
Warning: Couldn't find an acceptable libzmq on the system.
Warning: Failed to get cc._vcruntime via private API, not bundling CRT
AttributeError: 'MSVCCompiler' object has no attribute '_vcruntime_redist'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pyzmq
Running setup.py clean for pyzmq
Failed to build pyzmq
Installing collected packages: pyzmq
Running setup.py install for pyzmq ... done
DEPRECATION: pyzmq was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. A possible replacement is to fix the wheel build issue reported above.

I do experience issues on an Apple MacBook Pro (13-inch, M1, 2020) with macOS Monterey 12.3.1. To be specific: when calling pip install deidentify, I get ERROR: Failed building wheel for gensim. I don't get these issues on other machines and I think this fails specifically on the M1 architecture. Are wheel issues for this architecture being tracked, and if yes, where?

@osmalpkoras the build error for gensim was likely printed and you should raise it with that project.

I am unable to install mitreattack-python. It keeps saying

DEPRECATION: cairocffi was installed using the legacy 'setup.py install' method, because a wheel could not be built for it. A possible replacement is to fix the wheel build issue reported above. Discussion can be found at #8368

@IgorGanapolsky the build error for cairocffi was likely printed and you should raise it with that project.

KhomZ commented

is it not possible to use tensorflow.examples in colab???

is it not possible to use tensorflow.examples in colab???

That's a question for colab support, not here...

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [23 lines of output]
Partial import of sklearn during the build process.
Traceback (most recent call last):

Hi, while deploying the application on Heroku I am getting the following error regarding the wheel.

@KAJURAMBO you can probably reproduce this locally by running python -m pip install wheel then python setup.py bdist_wheel, and you'll need to find a way to fix that.

Got this issue while trying to build a Docker image using python:3.10-slim-bullseye; switching to python:3.10-slim-buster made everything work fine.

In #11456 we are targeting the removal of this setup.py install fallback to pip 23.1 (April 2023).

Moving forward with this will break stuff for me and create a bunch of work. I do like wheel builds, but I think the latest wheel build process is too opinionated, because it breaks with Metadata 1.2 mandates PEP 440 version, but '{git_hash}' is not.

Yes, I really do want git hashes, even though they don't sort with naive comparison operators. My use-case is that this makes it simple to enforce that python lib versions == github versions == docker tags, and they are all the same value. No, I don't want to change all my processes to ship PEP-440 compatible identifiers to ECR/docker-hub or add a bunch of git tag stuff to existing CI, etc. No, I don't want {semver}-{git_hash} or {cal_ver}-{git_hash}, because those won't work in URLs for github without us mandating additional steps for release-tagging. I don't specify dependencies with inequalities like pkg<=..., so this just isn't a use-case for me.

Due to overreach in the wheel-build process, I am forced to ask that you guys please not change this process. Sorry. I realize that PEP-440 is standards-track, but afaik it leaves open things like enforcement. Pip should only throw a fatal error for a "bad" version string if and only if the user is actually asking for something fancy using <= or ~=, etc. Otherwise we'll soon be in the odd situation where packages can be shipped to pypi successfully, but can't be installed from there.

Please do let me know if there's other places/issues where I need to go to argue this point.

(whelp, wall of text)

@mattvonrocketstein I understand and empathise that this change will require you to change your workflow and you aren’t willing to do so. We're aware that this will be a disruptive change for users such as yourself -- our goal with these warnings prior to enforcement is to allow for users (like yourself) who rely on setup.py install to come up with approaches to deal with the fact that this change is happening.

As a concrete suggestion, no one will force you to update to the latest pip the moment it is released. You have the option of sticking to an older version of pip -- the main consequence of that is that you won't be able to reach out to pip's maintainers for requesting support; or at least, need to reproduce the issue with the latest pip before asking for support. One way around that is to use Python + pip provided by enterprise Linux distributions -- those will include support and back-ported security fixes from their Python/security teams, while still staying on an older version of pip (at least, as long as their support cycle allows).


Allow me to elaborate on why you might feel like everyone else has already decided not to accommodate your use case, when advocating against the removal of setup.py install code paths to preserve support for installing packages with arbitrary version strings.

When you found out that your workflow was no longer supported when using wheels, and moved to the setup.py install workflow; you also moved to a significantly more problematic mechanism that has taken the largely-volunteer maintainers a lot more time to untangle + come around to deprecating, and you are now asking for them to not remove the more problematic mechanism — something that has been the direction that broader community efforts have been working toward for... well... many years. That’s not going to fly, no matter how passionately you advocate for it.

The only reason that non-PEP 440 style versions work for you today is because pip/setuptools etc have maintained a non-trivial amount of complexity to deal with them, and because they haven’t yet enforced consistency in that code path. setup.py install has been treated as undesirable for quite a while, among both maintainers and many users, due to the complexities it presents and the broken behaviours it has. Keeping it around just because removing it will break workflows that rely on the lack of enforcement of a basic need for pip -- a package manager needs to be able to understand and parse the version strings passed to it -- is not a viable option.

Note that non-PEP 440 versions with setup.py install present a warning from setuptools’ end that they are going to be removed (https://github.com/pypa/setuptools/blob/47cec1cd84def9d64f9251a403bb2305273efc9a/setuptools/dist.py#L544-L549) and from pip's end we have the intention documented as well (https://pip.pypa.io/en/stable/reference/build-system/setup-py/#direct-installation). This issue is about acting on what pip's documentation and setuptools' warnings already state.


You might be wondering what the "complexities that it presents and the broken behaviours it has" are. Here's a non-exhaustive list, that I can come up with in... let's time box this to 5 minutes.

  • Quirky version comparisons based on whether a version string is a PEP 440 string or not -- all legacy versions sort as "lower" than PEP 440 versions.
  • Increased user confusion due to setup.py install being executed after setup.py bdist_wheel fails.
  • Warnings from pip about build failures despite a successful installation (because setup.py bdist_wheel didn't work but setup.py install did), resulting in user confusion as well as making the underlying code much more complex.
  • It uses entry point script launchers generated by setuptools, which do not behave like the ones that pip generates.
  • It has no equivalent in a pyproject.toml-based build system, which means maintaining the legacy codepath forever, doubling the maintenance costs and complexities in pip and setuptools -- even though the pyproject.toml-based build system was explicitly designed to be compatible with the setup.py-based one, and the additional costs were deemed acceptable only for a multi-year transitory period.
  • A whole lot of logic and cognitive complexity in pip's and setuptools' codebases to ensure that we don't eagerly parse versions, or to otherwise allow non-PEP 440 versions to pass through -- which makes paying down technical debt within these codebases more complicated, which makes contributing to these projects more difficult, which makes it less likely that volunteers work on them or that new contributors get started.
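The version-handling quirk in the first bullet is easy to see with the third-party packaging library (a sketch; assumes packaging is installed, e.g. via pip install packaging -- pip itself vendors a copy for exactly this purpose):

```python
# How strict PEP 440 parsing treats arbitrary version strings.
from packaging.version import Version, InvalidVersion

# A bare git hash is not a valid PEP 440 version string...
try:
    Version("deadbeef")
except InvalidVersion:
    print("'deadbeef' is not a PEP 440 version")

# ...but PEP 440 does allow a "local version" segment, which can carry one:
v = Version("1.0+gdeadbeef")
print(v.local)              # gdeadbeef
print(v > Version("1.0"))   # True: a local version sorts above its base version
```

Incidentally, the local-version segment is one PEP 440-compatible way to keep a git hash attached to a version, which may be relevant to the workflow described earlier in the thread.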

Ok, time box over. I'll flesh those out into sentences now.

All this is to say, this change is a heavily-loaded one that has a lot more complexity to it than "oh, we don't like arbitrary versions", and disallowing arbitrary versions is an understood (and, depending on who you ask, appreciated) thing that this will trigger. We're aware of that, and that's why we're being considerate and slow in making this change (first warning for a period, before the eventual removal).


Outside of the coupling with setup.py install, looking only at your version-related usage -- I am not aware of any package management system that allows users to specify arbitrary strings as versions¹, and I don’t think Python's should block significant improvements to support use cases that rely on that working; especially if they only work within deprecated portions of Python's packaging systems today.

Footnotes

  1. This is a bit of a tangent -- if you know of a package management system like that, I’d really like to know, for the sake of curiosity if nothing else. Whether it would be a (weak) argument in favour of preserving such support would be a nice case study.

PyPI mandates PEP440 versions

@dstufft i assume you mean public pypi. but the situation has changed since 2014.. privately hosted pypi is a thing now whether we like it or not. so my point stands that moving forward with this means: pip will be unable to install things that are hosted on pypi.

if my private pypi is not enforcing pep-440 compliance and the public one is, i think that raises the basic question: where are the official helm-charts (or similar) for self-hosting pypi? and: is pip intended to work only with public-pypi? do other tools like artifactory enforce pep-440 today, and if so have they done that for long enough that we think it's ok? another basic question.. do you guys actually have telemetry that suggests that most usage of pip is for public pypi, or did you just assume that? what percentage is ok to break? i think you also need to consider that users of PaaS deployments like lambda, MWAA, etc often cannot use public-pypi, and may not have a lot of easy control over packaging details for their dependencies, or things like the version of pip they are using.

so i think this is bigger than just modernizing individual libraries to use the latest obscure combination of setup.cfg/setup.py/pyproject.toml. but on that subject.. where is the actual reference implementation that pip recommends for what modern packaging is supposed to look like? packaging needs are certainly myriad and diverse, but i think what people need are production-ready templates that cover real use-cases. whenever I look for this stuff I find only piecewise or toy examples, and then I'm back to hours of sifting out-dated advice/patterns on blogs and stack-overflow. especially if you're strictly mandating version-strings your way going forward, then i think these project templates need to directly demonstrate practical real-world release processes end-to-end. normally i'd say things like how to use versioneer/scm_tools or whatever is up to users, but if you insist on removing user choice that's problematic. versioneer in particular supports pep440 but it is only one option of many. the project boilerplate reference that pip really should be providing needs to pin things, be maintained alongside changing versions of pip, changing versions of versioneer, twine, etc. if pip doesn't want the responsibility of mandating the choice or demonstrating the usage of something like versioneer, then i just think pip needs to stay out of the business of mandating version-string particulars ¯_(ツ)_/¯ because otherwise there's just way too much friction in the package-bootstrap process. is there a reference like this that you want to point people at? are there plans to go the poetry/pipenv route where pip itself can at least pave a working project for me with current version of pip?

philosophically, what i see in #1894 is a lot of odd debate/machinations to solve problems that the pep itself is creating. i don't understand all the effort at gaming the standard to optimize for harassing the smallest number of maintainers/users about compatibility, when of course it's very straightforward to just not harass them. i realize that ship has sailed, for better or worse. but maybe we can get clarity on one point: does the PEP require/discuss enforcement or not? afaik, one way to read it is "if you don't comply with new standards, you won't get new features", and in that sense I support the PEP and for some projects would want the features it adds! it is a different matter though to intentionally break legacy projects and create the push-but-not-pull situation with pypi mentioned above. pip seems to be making a choice here that's not explicitly spelled out by the pep, and which is hostile to both users and maintainers.. there's simply no need to break compatibility to get the benefits of pep440 for use-cases that require it. at best it's quite confusing/inefficient to ship advice about standards for almost 10 years and then pivot to enforcement like that was the plan all along. at worst it's disingenuous. but i guess this is urgent now if the competition is more pep440 compliant than pip!

more practically, how long do we have before you force this breakage in new pips? if I pin to older version of pip, how long before that breaks? i think most maintainers/users are simply trying to get by without getting sucked into the great python packaging wars, and we are dealing with constant breakage and churn in the messy packaging/dependency world that we're stuck with. pep440 isn't really fixing that, it's making it worse.. and breakage/backwards incompatibility in packaging is almost as bad as if it were in the language itself. again this has been safe to ignore for almost a decade, if maintainers even knew about it, which of course tends to lend weight to the expectation that standards-compliance is advice that buys us new functionality rather than just the promise of pointless but necessary make-work to avoid breakage. i think there are also significant and common workflows where this will probably be the first time maintainers are hearing about pep440 (i'll add more details about that later in follow up). since pip is always giving advice to pip install --upgrade pip and because that's usually the best way to fix mysterious emergent new problems with breaking changes, i think you can expect that many are following this advice from CI/CD. that's ill-advised, because of course it was always a toss-up whether it would fix things or break things, but it looks like the pendulum is about to swing sharply towards the latter. so my bet is that many will only notice for the first time when their CI breaks or their deployments are unusable.

i assume you mean public pypi. #1894 (comment).. privately hosted pypi is a thing now whether we like it or not. so my point stands that moving forward with this means: pip will be unable to install things that are hosted on pypi.

I mean PyPI, not a PyPI compatible thing that isn't PyPI.

if my private pypi is not enforcing pep-440 compliance and the public one is, i think that raises the basic question: where are the official helm-charts (or similar) for self-hosting pypi? and: is pip intended to work only with public-pypi? do other tools like artifactory enforce pep-440 today, and if so have they done that for long enough that we think it's ok? another basic question.. do you guys actually have telemetry that suggests that most usage of pip is for public pypi, or did you just assume that? what percentage is ok to break? i think you also need to consider that users of PaaS deployments like lambda, MWAA, etc often cannot use public-pypi, and may not have a lot of easy control over packaging details for their dependencies, or things like the version of pip they are using.

There is only one valid way to specify a version in Python packaging-- that is PEP 440.

Some projects may choose to emit invalid metadata, and other projects may choose to make a best effort at interpreting invalid metadata, but at the end of the day projects that don't emit a PEP 440 version are producing invalid metadata.

There is of course a tension here where there were projects that were producing valid metadata, then a decade ago we declared those invalid after the fact. We attempted to minimize that, but ultimately there were problems with the "just support anything as a version number" approach that led us to standardize PEP 440.

If you disagree with the decision to standardize on PEP 440, that's your right, but that ship has sailed unless you want to write a PEP and gain support for a PEP to roll it back.

one way to read it is "if you don't comply with new standards, you won't get new features", and in that sense I support the PEP and for some projects would want the features it adds!

if I pin to older version of pip, how long before that breaks?

There seems to be some confusion in how pip functions here. If you do not upgrade your version of pip, you are isolated entirely from changes made in newer versions of pip. You can only possibly be affected by this change if someone is trying to use a newer version of pip to install your project that has invalid metadata.

Where this tends to get people is that it's not just the project author's version of pip, but all of their users' as well. With a private PyPI you tend to have more ability to mandate that you have to use a sufficiently old version of pip, with public PyPI you generally have a harder time doing that.

Based on this deprecation warning, it looks like you are planning to lead the PEP-440 enforcement charge from pip install. This is backwards, and makes no sense for the following reasons:

  • Leaving aside the issue of whether installers should refuse what registries have accepted, builders should not build what installers will refuse. For the project boilerplate I'm looking at, pip wheel . has been broken for a while probably, but that's not even the recommended approach. The newer approach recommended by docs uses python -mbuild, which complains about pep440 quietly at the top of some very noisy output, and then loudly declares victory in green "Success building.." text, and obligingly creates a wheel and a tarball. (There is no error and the warning is easy to miss, especially when run from CI or as just a piece of a larger automation task with tox, etc.) It's unclear why newer tooling like -mbuild should bring back a workflow older tooling like pip wheel is trying to deprecate, but the fact remains that supported/recommended workflows here lead to contradictory behaviour.

  • Leaving aside the issue of whether installers should refuse what registries have accepted, uploaders should not upload what registries will refuse. Since pypa maintains twine, and twine has had an open issue about pep-440 enforcement for some time, it seems like there is awareness of this problem. Twine should be the obvious way that PYPA can require something like "server-side" pep-440 enforcement, even when PYPA doesn't control the registry. Again, supported/recommended workflows here would lead to contradictory behaviour.

  • If you must break something, it's safest to break builds before breaking deployments. Every package legitimately uploaded to a package registry in the last couple of years is potentially a roll-back target for something. Unfortunately pip install has workflows that are related to both development and deployment. If I use pipenv for local development there's no warning and no error for pep440 enforcement. If I then use -mbuild and twine to ship packages, there's no enforcement. If pip install is used from PaaS, then I don't see warnings on headless servers, I just see successful deployments and I don't check logs. So if pip install breaks, I am hearing about changes first from my broken deployment/rollback. The net result here is that end-users or ops is hearing about breakage before developers/maintainers are. And again, contradictory.. PYPA has effectively blessed my package with three different interactions and suddenly choked on the 4th and most important interaction. When that happens, it's unlikely the person seeing that breakage can actually take steps to fix it.

Some projects may choose to emit invalid metadata, and other projects may choose to make a best effort at interpreting invalid metadata, but at the end of the day projects that don't emit a PEP 440 version are producing invalid metadata.

In retrospect, this reads like it might be trying to preemptively dismiss my legitimate confusion/concern with the stuff above, but I hope that's not the case. I don't think it's ok if PYPA's own toolchain is throwing internal consistency out the window.

I mean PyPI, not a PyPI compatible thing that isn't PyPI.

Well there's a lot to unpack here. Still not sure what you're implying since I thought pip was a package installer and not intended to be exactly/only a client for public-pypi. And this raises the questions of whether warehouse is PyPi enough, and whether it enforces pep-440? It seems that folks in this thread know that it does not. (This is confusing since I'm not sure how PyPi has enforcement-features that are missing from warehouse, if PyPi is using warehouse). Not that it matters much, because docs lead me to believe that Warehouse is not actually usable today for a private registry anyway.

So in terms of policy, it seems to me that exactly one the following things must be true:

  1. pip does not acknowledge private-registries as a valid use-case- in which case python effectively only supports OSS development.
  2. pip does acknowledge private-registries as a valid use-case, but only supports warehouse- in which case pip is blocked on requiring new metadata until warehouse is actually usable, and pep-440 aware, and ideally in common usage for some years
  3. pip is agnostic about which registry is being used- in which case pip is blocked on requiring new metadata until PYPA is enforcing metadata standards even across third-party registries using twine, ideally for some years.

There is only one valid way to specify a version in Python packaging-- that is PEP 440.

And one valid package registry also apparently 🙄 - all solutions are very tidy if we're ignoring the real world. But ok. One valid way, so I'll tell my friends, and you can tell yours. You can start with pipenv, -mbuild, twine, and warehouse. Also to the extent you're making a style-guide something that's enforced, it would be great if PYPA would provide a style-normalizer, but after checking into it a bit, hatch (still) isn't really there yet.

Look, it might seem like I'm arguing that the entire PYPA toolchain needs to be tightly coupled, but it's actually the opposite. I don't think changes in pip should be blocked on warehouse deployability or updates to twine. Before PEP-440, these things were loosely coupled. But what PEP-440 does is force tight-coupling on all tooling, or leave us with contradictions like builder-builds-what-installer-refuses. Sorry if that's a hassle, but this is the fall-out from responsibly enforcing your own mandates, and I think you guys asked for this?

IMHO, PEP-440 enforcement on public PyPi all by itself is not even close to hitting due-diligence requirements for the breaking changes you want to require in pip. Warnings that are easy to miss, or even errors which are easy to bypass by reaching for other supported tools in a complex and continuously churning kit also do not help to establish due-diligence. And not that I really expect this, but as a courtesy, it would be nice if the community had as long to deal with this sort of thing as PYPA gave its own insiders. If the twine issue had been addressed ~4 years ago when public PyPi changed, then that would be ~4 years of fewer broken rollback targets that private registries were hosting.

Some background.

PEP 440 has been the standard for version numbers since 2014, and PEP 345 (metadata 1.2) was updated to make PEP 440 versions mandatory in commit 848fcd1b3 (2016). So, for six years packages with non-PEP 440 versions have been invalid, according to accepted standards.

The PyPA is not a unified body, it's a loose community of independent projects, each of which governs itself. There is no central authority, and in particular no-one who can insist that "all PyPA projects" implement agreed standards on any given timeframe. And given that most, if not all, tools in the packaging ecosystem are developed by wholly-volunteer teams, often of one or two people only, any such imposition would be unenforceable anyway.

You are right, different tools are enforcing the standards at different rates, and often incredibly slowly. Some of that is lack of people willing to do the work, and some of it is deliberately trying to avoid breaking projects that haven't yet adapted to the standards. But it's been six years now, and we've probably gone long enough on the basis of letting people know about the new standards and expecting them to change over time. Most people seem to have done so, and any hold-outs probably (in my personal opinion) aren't going to change just because it's the standard.

So we're now at a point where individual projects have to make the hard choice to start enforcing standards and dropping support for projects that don't follow those standards. This is messy, because as you've correctly noted, tools make these decisions independently and there's no co-ordination. So yes, pip will no longer install projects that don't use PEP 440, when setuptools will still build such projects, and twine will allow them to be uploaded to indexes that don't enforce PEP 440. That's messy, and not ideal for the end users.

But short of someone funding a major project to enforce standards consistently across all projects (the exact opposite of what you want!) a messy and piecemeal enforcement is the best we can manage with the resources we have.

The good news, though, is that it's easy to deal with - just conform to the standards. If you don't want to, tools will gradually stop supporting you. But that's the consequence of your choice to ignore the direction Python packaging has been going in (with community support!) for 6+ years now. You can pin your environments to older versions of tools, as they start to enforce PEP 440. No-one is forced to update anything. Public services like PyPI may no longer support you, requiring you to find new hosting where you can choose the policies, but that's just like any other public service.

Frankly, we'd have moved to enforcement much faster if we hadn't been trying extremely hard (and at significant cost to project maintainers) to give people time to move to new standards. It's pretty disappointing and discouraging to find that even though we've made those efforts, people still feel that it's OK to argue that we haven't done sufficient due diligence, we didn't give the community the courtesy of sufficient lead time (is 6 years really not sufficient?), or that we're ignoring the real world.

I don't think there's anything more to say here. Your situation has been noted, but we're unlikely to change our plans because of it.

I'm not going to lock this conversation, as we do want to hear feedback from other people affected by this change, but it would be appreciated if you could avoid further diverting the discussion onto general questions about standards adoption - if you want to debate those, please have the conversation on a more appropriate forum like the packaging category on https://discuss.python.org/

Since you asked @pfmoore - I have only one comment to add to the discussion. I have followed this and many other discussions in pip and pypa rather closely over the last few years.

My 3 cents.

I hear over-and-over "we have too few people, all volunteers, we cannot deliver what you want, don't expect more from us". I personally know a number of people who would be willing to help and increase the capacity of the pip and pypa efforts (myself included). But they are put off by the attitude of maintainers, to put it bluntly.

I think a good solution would be to be open to other people joining the team, and to open up to new ideas and ways of thinking. I think there is very little effort from the pypa maintainers to create an open and welcoming environment that nurtures and welcomes "fresh blood" and increases the project's capacity.

Even if there are smart people who have different opinions, making them feel welcome with their ideas, letting them try things, and mentoring them in would be a great way to increase capacity. You can do it in a way that keeps the project in check without impacting overall quality, by putting safeguards in place and making sure the right governance is in place -- so that new people know how to approach proposals for changes, can engage in true discussions, and feel heard, rather than (similar to the comment above) receiving statements basically saying "we have no time, don't expect more from us" instead of "let's work together and invite others to volunteer to help us build a stronger, sustainable team".

I recently gave a talk about "growing your contributors base" at ApacheCon in New Orleans, and one of the points I had there for maintainers was "know when you need to grow and deliberately act in that direction - if you do not do it, it will not happen all by itself". Acting in this direction allowed us in Airflow to continuously grow not only the contributor base but also the committer base, without impacting the overall direction of the project (though with many ideas and modifications of the original vision that the "fresh people" brought with them). Which required, in many cases, deprioritising the ego of some maintainers in favour of prioritising improvements for the project.

Please don't take it personally; I think you should simply be more empathetic towards people who honestly and sincerely want to help, and have the capability and time to do so, but whom the attitude they see makes think "Meh, it's not worth fighting for, as I am continuously faced with dismissive, harsh and unfair treatment where my voice and feelings are not even seriously considered". This is the approach I am currently taking, at least after my past experiences. pip is super important for us and I try to keep up with what's going on in the project, but I dread the idea of making a proposal that goes against the current direction (or is not a priority for the current maintainers) - simply because of the attitude I expect in the responses.

I understand that you have too few maintainers and that it is all volunteered time. I really do. But I think you have run yourselves into a corner, and you should simply be more open to others who could help you grow the team - and deliberately act in this direction.

Those are my 3 cents. Knowing the past discussions, I think you might want to hide this as off-topic - but before doing so, think a bit about it please. It's actually very much "on-topic", because I think it explains the root cause of all the similar discussions you have (and there are far too many of those to ignore, IMHO).

I understand that you have too few maintainers and that it is all volunteered time. I really do. But I think you have run yourselves into a corner, and you should simply be more open to others who could help you grow the team - and deliberately act in this direction.

You may be right. But the problem is that we are not actually constrained on code contributions. We're constrained on people contributing in support of the project direction to discussions like the "enforcing PEP 440" one here, on people doing research to understand the impact of changes we're considering, and on people reviewing existing PRs (in particular, in terms of how well they fit with pip's overall direction - simple "this code does what it says" reviews are also useful, but not the bottleneck).

And outside of pip, we're constrained by the lack of community involvement in standards setting. The pip maintainers have stated, publicly and repeatedly, that we follow standards. So "why is pip no longer allowing non-PEP 440 versions?" shouldn't even be a discussion here. If people don't like the standards, and don't follow them, why not get more involved in the process and ensure that the standards work for you? Even a standard as old as PEP 440 can be changed if there's sufficient justification.

I don't know how to encourage people to contribute in these areas. When we disagree with proposals, that isn't "not being welcoming"; equally, saying "this should be added" without considering the wider picture isn't "contributing"¹.

Footnotes

  1. Obligatory Monty Python reference: "This isn't an argument, it's just contradiction." "No it isn't." "Yes it is!"

When we disagree with proposals, that isn't "not being welcoming", it's just as much the case that saying "this should be added" without considering the wider picture, isn't "contributing"¹

I think there is a big difference in saying "No, This is a bad idea. Go and find all the discussions about it yourself." vs. "Yes, we see your point, and it would be great if you can help to hash it out, happy to help to guide you and help you to understand the reasoning.".

The first one is basically "get lost, I have no time for you", the second is "Yeah we want to help others to grow to becoming contributors and we want to help them in becoming ones".

For example, the above comment might look like "Yeah, it would be fantastic if twine had the same checks; we have the same long-term vision, and maybe you can help there? Here is how you can get started, and we are happy to guide you".

On the point of "why pip": every other piece in the toolchain can basically treat version strings as opaque strings and still do its job. To oversimplify, twine is basically a well-crafted POST request, build is basically a glorified subprocess call, and setuptools is a tarfile+zip-with-a-different-extension generator. None of those pieces involves trying to parse the version.

Even if we started by adding validation there, those tools erroring out first is gonna result in user pushback along the lines of "hey, pip allows it, so why don't you!?".

That said, what @potiuk said is appropriate: it's a case of lack of availability of people, and the tools are generally moving in this direction.

"Yes, we see your point, and it would be great if you can help to hash it out, happy to help to guide you and help you to understand the reasoning.".

I literally did this in my first long comment -- minus tell me about your use case -- so I'm feeling a bit ignored right now but I digress. :)

if my private pypi is not enforcing pep-440 compliance and the public one is, i think that raises the basic question: where are the official helm-charts (or similar) for self-hosting pypi? and: is pip intended to work only with public-pypi? do other tools like artifactory enforce pep-440 today, and if so have they done that for long enough that we think it's ok? another basic question.. do you guys actually have telemetry that suggests that most usage of pip is for public pypi, or did you just assume that? what percentage is ok to break?

OK, even if I believe that you're trying to be helpful and make a point, or just gather information, this whole comment is a question soup. That's off-putting since it's using questions to make a point implicitly, without committing to a specific answer, while allowing you to use the fact that you never stated your position to push back or ask further questions to anything someone responds to this with. Besides, there are way too many questions -- answering questions typically takes more energy than asking the question.

Using a primarily-question line in discussions is typically a symptom of bad-faith arguing, since it puts all the onus on the party being questioned, with the person asking questions having the ability to push back to any point being made, by asking another leading question.


Finally, I'd like to not get into an extended discussion on change management here. If you want to have that discussion @potiuk, please file a new issue.