databricks/databricks-vscode

Recognize py file as a notebook and use azure cluster as a kernel

virtualdvid opened this issue · 6 comments

Describe the enhancement

It would be a great feature if VS Code could recognize Databricks notebook .py files as notebooks and let us select a Databricks cluster as the kernel.

Even though we can run the notebook as a workflow on Databricks, that runs the entire notebook. What if I only want to run one cell at a time?

How to reproduce

VS Code already recognizes `# COMMAND ----------` as a cell marker:

[screenshot]

but if we try to run (or debug) a cell, the Databricks cluster is not offered as a kernel:

[screenshot]

[screenshot]

Ideal behavior

Based on the first comment line `# Databricks notebook source`, open the file as a notebook and let me select the Databricks kernel:

[screenshot]

NOTE: Please add a label for feature suggestions, in addition to the one for bugs :)

Hi guys,

I found this documentation and have been trying to implement databricks-connect. So far so good, until I hit this error trying to get dbutils:

```
Exception has occurred: AttributeError
'NoneType' object has no attribute 'user_ns'
  File "/Users/xxxx/xxx/my_repo/poc.py", line 6, in <module>
    dbutils = DBUtils(spark)
AttributeError: 'NoneType' object has no attribute 'user_ns'
```

Here's the code:

```python
from databricks.connect import DatabricksSession
from pyspark.dbutils import DBUtils

spark = DatabricksSession.builder.getOrCreate()
dbutils = DBUtils(spark)  # error on this line
```
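For what it's worth, `pyspark.dbutils.DBUtils` reads the IPython shell's `user_ns`, and `get_ipython()` returns `None` in a plain `python` process, which matches the AttributeError above. A workaround sketch, assuming the `databricks-sdk` package is installed and auth is configured (the `get_dbutils` helper is mine, not an official API):

```python
def get_dbutils(spark=None):
    """Return a dbutils handle that works both on a cluster and locally.

    Inside a Databricks notebook an IPython shell exists, so DBUtils works.
    A plain local run has no IPython shell (get_ipython() is None), which is
    exactly the AttributeError above; fall back to the SDK's dbutils there.
    Imports are deferred so this module loads without either library present.
    """
    try:
        from pyspark.dbutils import DBUtils
        return DBUtils(spark)  # works when an IPython shell is available
    except Exception:
        from databricks.sdk import WorkspaceClient
        # WorkspaceClient picks up the same auth config as Databricks Connect.
        return WorkspaceClient().dbutils
```

Usage would be `dbutils = get_dbutils(spark)` in place of the failing line; whether `WorkspaceClient().dbutils` covers every dbutils call you need is worth checking against the SDK docs.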

If I comment that line out and try to load a list of dictionaries into a PySpark DataFrame, it fails with no meaningful logs:

```
Exception has occurred: AssertionError
exception: no description
  File "/Users/xxxx/xxx/my_repo/poc.py", line 21, in <module>
    df = spark.createDataFrame(my_list_of_dicts)
AssertionError:
```

This code works fine in a Databricks UI notebook:

```python
df = spark.createDataFrame(my_list_of_dicts)
df.limit(10)
```
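A bare AssertionError from `createDataFrame` over a list of dicts is often schema inference tripping up, or a version mismatch between `databricks-connect` and the cluster's DBR version. One thing worth trying is passing an explicit schema instead of letting Spark infer one. A sketch under assumed column names (the issue doesn't show `my_list_of_dicts`, and both helpers are hypothetical):

```python
def consistent_keys(records):
    """Preflight check: every record exposes the same keys, so one schema fits all."""
    return len({frozenset(r) for r in records}) <= 1


def to_spark_df(spark, records):
    """Build a DataFrame with an explicit schema rather than relying on inference.

    The pyspark import is deferred so the helper can be defined without Spark
    installed; "name" and "count" are assumed columns for illustration only.
    """
    from pyspark.sql.types import StructType, StructField, StringType, LongType

    schema = StructType([
        StructField("name", StringType(), True),
        StructField("count", LongType(), True),
    ])
    return spark.createDataFrame(records, schema=schema)
```

If an explicit schema still fails, comparing the `databricks-connect` package version against the cluster's runtime version is the next thing I'd check, since the docs require them to match.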

Environment:

vscode:
Version: 1.77.3 (Universal)
Commit: 704ed70d4fd1c6bd6342c436f1ede30d1cff4710
Date: 2023-04-12T09:19:37.325Z (2 wks ago)
Electron: 19.1.11
Chromium: 102.0.5005.196
Node.js: 16.14.2
V8: 10.2.154.26-electron.0
OS: Darwin arm64 22.3.0
Sandboxed: Yes

Databricks Cluster:
[screenshot of cluster configuration]

Databricks extension version: v0.3.11

Thumbs up for this feature request! 👍

I use .py scripts or .ipynb notebooks depending on the project, and for both approaches GitHub Copilot is amazing for accelerating code development.

It would be great to be able to open Databricks .py scripts in VS Code as notebooks, selecting a Databricks cluster as the compute environment. This would give us a GitHub Copilot-enabled Databricks environment, where we could develop notebooks together with Copilot.