microsoft/ptvsd

Debug with specific script exec doesn't work since 2019.11.49689

milnick opened this issue · 1 comment

Hi,
I need to debug a Python script using the Spark Python interpreter.

My debug configuration contains the following settings:

"osx": {
    "pythonPath": "<pathToSpark>/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/bin/spark-submit"
}

This was working well, but it stopped working in version ms-python.python-2019.11.49689.

Symptoms:
It hangs for a few seconds and then stops without opening the Python Debug Console terminal.

Findings:

  1. If I switch pythonPath to the default Python installation path, it works fine.

  2. In the Python log I see the following:

<pathToSpark>/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/bin/spark-submit ~/.vscode/extensions/ms-python.python-2019.10.44104/pythonFiles/interpreterInfo.py

If I execute this command with spark-submit, the output contains plenty of rows, with one row containing info about the Python version:

{"versionInfo": [3, 7, 4, "final"], "sysPrefix": "/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7", "version": "3.7.4 (default, Sep  7 2019, 18:27:02) \n[Clang 10.0.1 (clang-1001.0.46.4)]", "is64Bit": true}

If I execute it with the default Python installation, there is only one row, containing the Python info:

{"versionInfo": [3, 7, 4, "final"], "sysPrefix": "/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7", "version": "3.7.4 (default, Sep  7 2019, 18:27:02) \n[Clang 10.0.1 (clang-1001.0.46.4)]", "is64Bit": true}

Conclusion
It seems to me that some logic that works with the Python interpreter's version info was added or changed, and this causes the problem with the debugger.

We expect whatever is used for "pythonPath" to be a valid Python interpreter that is 1) capable of handling the same switches as standard python or python3 binary from CPython, and 2) does not produce any extra output other than what the script itself generates (because some tools, including the debugger, use stdin/stdout to communicate with other processes).

Since spark-submit doesn't satisfy either requirement, it is not a supported binary for "pythonPath".