[BUG] Intellisense not recognising `spark` as a global
jhickson opened this issue · 3 comments
Describe the bug
If you type `spark` in a Python notebook, Pylance warns that it is undefined and IntelliSense does not work. The same is not true of the `dbutils` global, which is correctly detected.
To Reproduce
Steps to reproduce the behavior:
- Create an empty Python file
- Copy the example code from the Databricks docs
- You'll see `spark` underlined by Pylance, and autocompletion won't work when using it (see the sketch after this list)
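
For concreteness, a minimal repro along these lines (the table name is a placeholder; any code that touches `spark` triggers the warning):

```python
# Any reference to the `spark` global is flagged by Pylance as undefined
# ('"spark" is not defined'), and no completions are offered on it.
df = spark.read.table("samples.nyctaxi.trips")  # placeholder table name
df.show(5)

# By contrast, `dbutils` resolves and autocompletes correctly.
dbutils.fs.ls("/")
```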
Screenshots
In the screenshot below, note the underlining of `spark` but the correct highlighting of `dbutils`:
This is my `__builtins__.pyi` as added by the extension:
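
(Screenshot not reproduced here. At the time, the generated file typically contained just a wildcard import from the SDK runtime module, which would explain why `dbutils` resolves but `spark` does not; treat the exact contents as an assumption:)

```python
# Assumed contents of the extension-generated __builtins__.pyi.
# databricks.sdk.runtime re-exports dbutils, so Pylance sees it as a
# global; SDK versions of this era did not also re-export spark.
from databricks.sdk.runtime import *
```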
System information:
Version: 1.77.3 (user setup)
Commit: 704ed70d4fd1c6bd6342c436f1ede30d1cff4710
Date: 2023-04-12T09:16:02.548Z
Electron: 19.1.11
Chromium: 102.0.5005.196
Node.js: 16.14.2
V8: 10.2.154.26-electron.0
OS: Windows_NT x64 10.0.19045
Sandboxed: No
Databricks Extension Version: 0.3.10
Databricks Extension Logs
There's nothing useful in the logs as far as I can see; the only errors relate to not being able to load a `.env` file, which is expected as there's no such file.
@jhickson we are moving away from built-in type stubs. For now, you can add this to your `__builtins__.pyi` file to get autocompletion for globals again:

```python
from pyspark.sql import SparkSession

spark: SparkSession
```
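
For completeness, a fuller `__builtins__.pyi` that also declares `dbutils` might look like the sketch below; the `RemoteDbUtils` import path is an assumption based on the databricks-sdk package layout and may vary by version:

```python
# __builtins__.pyi -- a stub file read by Pylance; it is never executed.
from pyspark.sql import SparkSession
from databricks.sdk.dbutils import RemoteDbUtils  # assumed path; verify for your SDK version

# Top-level declarations in __builtins__.pyi are treated by Pylance as
# ambient globals, available in every file of the workspace.
spark: SparkSession
dbutils: RemoteDbUtils
```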
@kartikgupta-db that works, thanks. When do you think the full solution that doesn't rely on stubs will be implemented?
@jhickson, for the short term, we will automatically add the statements above to your `__builtins__.pyi` file. The full solution would involve changes to language servers, so it is unlikely to happen soon.