Airflow

Airflow basics and building a data pipeline using Google BigQuery.
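
As the description suggests, the notebooks and DAGs here build toward a BigQuery-backed pipeline. As a rough orientation only (not code from this repository), the heart of such a DAG might look like the following sketch; the DAG ID, task ID, query, and table names are all hypothetical, and on older Airflow 1.x releases the sql parameter of BigQueryOperator was still called bql:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator

# Hypothetical sketch of the kind of BigQuery task such a pipeline uses;
# dag_id, task_id, and the query/table names are illustrative only.
dag = DAG(
    dag_id="bigquery_example",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
)

aggregate = BigQueryOperator(
    task_id="aggregate_events",
    sql="SELECT COUNT(*) AS n FROM `project.dataset.events`",  # placeholder query
    destination_dataset_table="project.dataset.daily_counts",  # placeholder table
    write_disposition="WRITE_TRUNCATE",
    use_legacy_sql=False,
    dag=dag,
)

The actual DAGs can be run and tested inside the container as shown in the Usage section below.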

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

  • Docker and Docker Compose installed locally

Usage

Run the web service with Docker

docker-compose up -d

# Build the image
# docker-compose up -d --build

Then check http://localhost:8080/ to reach the Airflow web UI.

  • docker-compose logs - Display log output
  • docker-compose ps - List containers
  • docker-compose down - Stop and remove containers

Other commands

If you want to run Airflow sub-commands, you can do so like this:

  • docker-compose run --rm webserver airflow list_dags - List dags
  • docker-compose run --rm webserver airflow test [DAG_ID] [TASK_ID] [EXECUTION_DATE] - Test specific task

If you want to run or test a Python script, you can do so like this (a sample DAG file is sketched after this list):

  • docker-compose run --rm webserver python /usr/local/airflow/dags/[PYTHON-FILE].py - Test python script
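
For reference, a minimal DAG file that the commands above could exercise might look like this sketch (the file name, DAG ID example_dag, and task ID say_hello are hypothetical, not taken from this repository):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# Hypothetical example; dag_id and task_id below are illustrative only.
default_args = {
    "owner": "airflow",
    "start_date": datetime(2020, 1, 1),
}

dag = DAG(
    dag_id="example_dag",
    default_args=default_args,
    schedule_interval="@daily",
)

say_hello = BashOperator(
    task_id="say_hello",
    bash_command="echo hello",
    dag=dag,
)

Saved as /usr/local/airflow/dags/example_dag.py inside the container, it could then be tested with, for example, docker-compose run --rm webserver airflow test example_dag say_hello 2020-01-01.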

Connect to the database

If you want to use the Ad Hoc Query feature, make sure you've configured the connection: go to Admin -> Connections, edit "postgres_default", and set these values:

  • Host : postgres
  • Schema : airflow
  • Login : airflow
  • Password : airflow
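
Once the connection is set up, it can also be reached from task or script code. Here is a minimal sketch using Airflow's PostgresHook with the postgres_default connection (the query is just a placeholder):

from airflow.hooks.postgres_hook import PostgresHook

# Uses the "postgres_default" connection configured above;
# the query itself is an illustrative placeholder.
hook = PostgresHook(postgres_conn_id="postgres_default")
records = hook.get_records("SELECT 1;")
print(records)

Dropped into a file under /usr/local/airflow/dags/, this could be run with the docker-compose run --rm webserver python ... command shown earlier.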

Credits