The Astro Databricks Provider is an Apache Airflow provider created by Astronomer to run your Databricks notebooks as Databricks Workflows while maintaining Airflow as the authoring interface. When using the `DatabricksWorkflowTaskGroup` and `DatabricksNotebookOperator`, notebooks run as a Databricks Workflow, which can result in a 75% cost reduction ($0.40/DBU for all-purpose compute vs. $0.10/DBU for Jobs compute).

Prerequisites:
- Apache Airflow >= 2.2.4
- Python >= 3.7
- Databricks account
- Previously created Databricks Notebooks
Install the provider package:

```sh
pip install astro-provider-databricks
```
If you are using the Astro CLI:

1. Use pre-existing Databricks Notebooks, or create two simple ones. Their identifiers will be used in step (6). The original example DAG uses:
   - `Shared/Notebook_1`
   - `Shared/Notebook_2`

2. Generate a Databricks Personal Token. This will be used in step (5).

3. Create a new Astro CLI project (if you don't have one already):

   ```sh
   mkdir my_project && cd my_project
   astro dev init
   ```

4. Add the following to your `requirements.txt` file:

   ```
   astro-provider-databricks
   ```

5. Create a Databricks connection in Airflow. You can do this via the Airflow UI or the `airflow_settings.yaml` file by specifying the following fields:

   ```yaml
   connections:
     - conn_id: databricks_conn
       conn_type: databricks
       conn_login: <your email, e.g. julian@astronomer.io>
       conn_password: <your personal access token, e.g. dapi1234567890abcdef>
       conn_host: <your databricks host, e.g. https://dbc-9c390870-65ef.cloud.databricks.com>
   ```

6. Copy the example workflow into a file named `example_databricks.py` in your `dags` directory (a sketch of such a DAG is shown after this list).

7. Run the following command to start your Airflow environment:

   ```sh
   astro dev start
   ```

8. Open the Airflow UI at http://localhost:8080 and trigger the DAG. You can click on a task and, under the Details tab, select "See Databricks Job Run" to open the job in the Databricks UI.
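The full workflow referenced in step (6) is not reproduced here. Below is a minimal sketch of what `example_databricks.py` can look like, assuming the provider's top-level imports and the `databricks_conn` connection from step (5). The cluster specification, Spark version, node type, and DAG id are placeholders to adapt to your workspace; the complete example DAG can be downloaded from the provider repository (see the standalone Airflow instructions below).

```python
from datetime import datetime

from airflow.models.dag import DAG
from astro_databricks import DatabricksNotebookOperator, DatabricksWorkflowTaskGroup

# Placeholder job cluster spec: adjust the Spark version, node type, and
# cloud-specific attributes to match your Databricks workspace.
job_cluster_spec = [
    {
        "job_cluster_key": "astro_databricks",
        "new_cluster": {
            "spark_version": "11.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 1,
        },
    }
]

with DAG(
    dag_id="example_databricks",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # The task group becomes a single Databricks Workflow (Jobs compute)
    # that runs both notebooks on the shared job cluster.
    workflow = DatabricksWorkflowTaskGroup(
        group_id="example_databricks_workflow",
        databricks_conn_id="databricks_conn",
        job_clusters=job_cluster_spec,
    )
    with workflow:
        notebook_1 = DatabricksNotebookOperator(
            task_id="notebook_1",
            databricks_conn_id="databricks_conn",
            notebook_path="/Shared/Notebook_1",
            source="WORKSPACE",
            job_cluster_key="astro_databricks",
        )
        notebook_2 = DatabricksNotebookOperator(
            task_id="notebook_2",
            databricks_conn_id="databricks_conn",
            notebook_path="/Shared/Notebook_2",
            source="WORKSPACE",
            job_cluster_key="astro_databricks",
        )
        # Notebook_2 runs after Notebook_1 within the same Databricks Workflow.
        notebook_1 >> notebook_2
```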
If you are using a standalone Airflow installation:

1. Use pre-existing Databricks Notebooks, or create two simple ones. Their identifiers will be used in step (5). The original example DAG uses:
   - `Shared/Notebook_1`
   - `Shared/Notebook_2`

2. Generate a Databricks Personal Token. This will be used in step (4).

3. Ensure that your Airflow environment is set up correctly by running the following commands:

   ```sh
   export AIRFLOW_HOME=`pwd`
   airflow db init
   ```

4. Create a Databricks connection in Airflow. This can be done by running the following command, replacing the login, host, and password (the password is your personal access token):

   ```sh
   # If using Airflow 2.3 or higher:
   airflow connections add 'databricks_conn' \
       --conn-json '{
           "conn_type": "databricks",
           "login": "some.email@yourcompany.com",
           "host": "https://dbc-c9390870-65ef.cloud.databricks.com/",
           "password": "personal-access-token"
       }'

   # If using Airflow 2.2.4 or higher, but lower than 2.3:
   airflow connections add 'databricks_conn' \
       --conn-type 'databricks' \
       --conn-login 'some.email@yourcompany.com' \
       --conn-host 'https://dbc-9c390870-65ef.cloud.databricks.com/' \
       --conn-password 'personal-access-token'
   ```

5. Copy the example workflow (see the sketch above) into a file named `example_databricks_workflow.py` and add it to the `dags` directory of your Airflow project. Alternatively, you can download `example_databricks_workflow.py` directly:

   ```sh
   curl -O https://raw.githubusercontent.com/astronomer/astro-provider-databricks/main/example_dags/example_databricks_workflow.py
   ```

6. Run the example DAG:

   ```sh
   airflow dags test example_databricks_workflow `date -Iseconds`
   ```

   This will log, among other lines, the link to the Databricks Job Run URL:

   ```text
   [2023-03-13 15:27:09,934] {notebook.py:158} INFO - Check the job run in Databricks: https://dbc-c9390870-65ef.cloud.databricks.com/?o=4256138892007661#job/950578808520081/run/14940832
   ```
This will create a Databricks Workflow with two Notebook jobs. The workflow may take around two minutes to complete if the cluster is already up and running, or approximately five minutes otherwise, depending on your cluster initialisation time.
- `DatabricksWorkflowTaskGroup`: an Airflow task group that allows users to create a Databricks Workflow.
- `DatabricksNotebookOperator`: an Airflow operator which abstracts a pre-existing Databricks Notebook. It can be used independently to run the Notebook, or within a Databricks Workflow Task Group.
- `AstroDatabricksPlugin`: an Airflow plugin which is installed by default. It allows users to view a Databricks job from the Airflow UI and retry it in case of failure.
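For illustration, here is a minimal sketch of using `DatabricksNotebookOperator` on its own, outside a `DatabricksWorkflowTaskGroup`. It assumes the operator accepts an `existing_cluster_id` (with a `new_cluster` spec as the alternative) when it is not attached to a workflow task group; the DAG id, notebook path, and cluster ID are placeholders, and the connection is the `databricks_conn` created in the quickstart.

```python
from datetime import datetime

from airflow.models.dag import DAG
from astro_databricks import DatabricksNotebookOperator

with DAG(
    dag_id="example_single_notebook",  # placeholder DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Run one pre-existing notebook as a standalone Databricks job run.
    # `existing_cluster_id` is an assumed parameter name: it points the run
    # at an already-running cluster instead of a shared workflow job cluster.
    run_notebook = DatabricksNotebookOperator(
        task_id="run_notebook_1",
        databricks_conn_id="databricks_conn",
        notebook_path="/Shared/Notebook_1",
        source="WORKSPACE",
        existing_cluster_id="1234-567890-abcde123",  # placeholder cluster ID
    )
```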
The documentation is a work in progress; we aim to follow the Diátaxis system.
Astro Databricks follows semantic versioning for releases. Read the changelog to learn about the changes introduced in each version.
All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
Read the Contribution Guidelines for a detailed overview on how to contribute.
Contributors and maintainers should abide by the Contributor Code of Conduct.