Overview

Dynamic Airflow DAGs generated from Databricks Jobs, with repair run functionality.

Project Contents

  • dags:
    • databricks-jobs-repair.py - Single DAG that dynamically generates a task group mirroring a specified Databricks job (see the sketch after this list).
    • databricks-jobs-dynamic.py - Dynamic DAG file that generates a separate DAG for each job listed in include/databricks_jobs.json.
    • databricks-jobs-createjob.py - DAG that creates a Databricks job from a predefined job_config dict, then triggers and tracks it in Airflow.
  • Dockerfile: This file contains a versioned Astronomer Runtime Docker image that provides a differentiated Airflow experience. If you want to execute other commands or overrides at runtime, specify them here.
  • include: This folder contains additional files used by the DAGs and CI/CD:
    • databricks_jobs.json: File that end users maintain, listing each Databricks job name and job ID so that CI/CD can generate the task dependency information the DAGs use to mirror Databricks jobs.
    • task_dependencies.json: Generated by GitHub Actions during CI/CD from databricks_jobs.json.
  • .scripts: Scripts used by GitHub Actions as part of CI/CD:
    • dag_builder.py: Uses databricks_jobs.json to generate the task_dependencies.json file in the include folder.
  • packages.txt: Install OS-level packages needed for your project by adding them to this file. It is empty by default.
  • requirements.txt: Install Python packages needed for your project by adding them to this file. It is empty by default.
  • plugins: Custom and community plugins for your project. This project includes:
    • templates/databricks_plugin/repair.html: Template for the repair run webpage in Airflow, used by the databricksrun/repair view and endpoint.
    • extra_link.py: Plugin providing the Databricks integrations listed below (see the sketch after this list):
      • Databricks Task - Extra link for the databricks run page url for a given task.
      • Repair Run - Extra link to repair failed tasks in a databricks run.
      • DatabricksRun View - Flask view for the repair run functionality
      • Blueprint - Flask blueprint for plugin
  • airflow_settings.yaml: Use this local-only file to specify Airflow Connections, Variables, and Pools instead of entering them in the Airflow UI as you develop DAGs in this project.
  • .github/workflows:
    • main.yml - CI/CD workflow that fetches job task details from the Databricks API for the dynamic DAGs and deploys the project to Astro.
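
As a rough illustration of the dynamic-DAG approach above, the sketch below builds an Airflow task group that mirrors a Databricks job by reading include/task_dependencies.json. This is not the project's exact code: the JSON shape (each task key mapped to its upstream task keys), the EmptyOperator placeholders, and the DAG name are assumptions for illustration.

    # Hypothetical sketch: build an Airflow task group that mirrors a Databricks job.
    import json
    from pathlib import Path

    import pendulum
    from airflow.decorators import dag
    from airflow.operators.empty import EmptyOperator
    from airflow.utils.task_group import TaskGroup

    # Assumed layout: dags/ sits next to include/ in the project root.
    DEPS_FILE = Path(__file__).parents[1] / "include" / "task_dependencies.json"

    @dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1), catchup=False)
    def databricks_job_mirror():
        # Assumed shape: {"task_key": ["upstream_task_key", ...], ...}
        dependencies = json.loads(DEPS_FILE.read_text())

        with TaskGroup(group_id="databricks_job"):
            # One Airflow task per Databricks task key (placeholder operators here).
            tasks = {key: EmptyOperator(task_id=key) for key in dependencies}

            # Wire dependencies so the task group mirrors the Databricks job graph.
            for key, upstream_keys in dependencies.items():
                for upstream in upstream_keys:
                    tasks[upstream] >> tasks[key]

    databricks_job_mirror()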

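The plugin components can be combined roughly as in the sketch below, which pairs an operator extra link with a Flask blueprint that serves the repair template. Class names, the XCom key, and the operator the link attaches to are assumptions for illustration, and the repair view/endpoint itself is omitted for brevity.

    # Hypothetical sketch of an Airflow plugin combining an operator extra link
    # with a Flask blueprint for the repair page.
    from airflow.models.baseoperator import BaseOperatorLink
    from airflow.models.xcom import XCom
    from airflow.plugins_manager import AirflowPlugin
    from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator
    from flask import Blueprint

    class DatabricksRunLink(BaseOperatorLink):
        """Extra link that opens the Databricks run page for a task instance."""
        name = "Databricks Run"
        operators = [DatabricksRunNowOperator]

        def get_link(self, operator, *, ti_key):
            # Assumes the operator pushed the run page URL to XCom as "run_page_url".
            return XCom.get_value(ti_key=ti_key, key="run_page_url") or ""

    # Blueprint that serves templates/databricks_plugin/repair.html.
    repair_blueprint = Blueprint(
        "databricks_plugin",
        __name__,
        template_folder="templates",
    )

    class DatabricksExtraLinkPlugin(AirflowPlugin):
        name = "databricks_extra_link_plugin"
        operator_extra_links = [DatabricksRunLink()]
        flask_blueprints = [repair_blueprint]
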
Additional Requirements

Connections:

  • http_default - HTTP connection for the Databricks API
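
A minimal sketch of how this connection might be used to call the Databricks Jobs API; the helper name is hypothetical, the endpoint shown is the Jobs 2.1 get call, and authentication is assumed to be configured on the connection.

    # Hypothetical helper: fetch a job's task definitions through the http_default connection.
    from airflow.providers.http.hooks.http import HttpHook

    def get_job_tasks(job_id: int) -> list:
        hook = HttpHook(method="GET", http_conn_id="http_default")
        # For GET requests, HttpHook passes `data` as query parameters.
        response = hook.run(endpoint="api/2.1/jobs/get", data={"job_id": job_id})
        # The Jobs API returns the job settings, including its task list.
        return response.json().get("settings", {}).get("tasks", [])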

Deploy Your Project Locally

  1. Start Airflow on your local machine by running 'astro dev start'.

This command will spin up 3 Docker containers on your machine, each for a different Airflow component:

  • Postgres: Airflow's Metadata Database
  • Webserver: The Airflow component responsible for rendering the Airflow UI
  • Scheduler: The Airflow component responsible for monitoring and triggering tasks

  2. Verify that all 3 Docker containers were created by running 'docker ps'.

Note: Running 'astro dev start' will start your project with the Airflow Webserver exposed at port 8080 and Postgres exposed at port 5432. If you already have either of those ports allocated, you can either stop your existing Docker containers or change the port.

  3. Access the Airflow UI for your local Airflow project. To do so, go to http://localhost:8080/ and log in with 'admin' for both your Username and Password.

You should also be able to access your Postgres Database at 'localhost:5432/postgres'.

Deploy Your Project to Astronomer

If you have an Astronomer account, pushing code to a Deployment on Astronomer is simple. For deployment instructions, refer to the Astronomer documentation: https://docs.astronomer.io/cloud/deploy-code/

Contact

The Astronomer CLI is maintained with love by the Astronomer team. To report a bug or suggest a change, reach out to our support team: https://support.astronomer.io/